# 3-line C prog hurts MS VC

do you trust your compiler? do you compile “secondhand” sources? yeah, we all do that. but did you know that evil sources can hurt your compiler and sometimes even abuse your host OS (if the compiler runs inside a guest VMware/Virtual PC environment)? no MS-specific extensions! just pure ANSI C!

well, try this…

#include <stdio.h>
#include <limits.h>
int hack[INT_MAX] = {1,2,3};
int main(){
printf("hello world!\n"); return hack[INT_MAX-1];
}

what happened? oh… trying to create a very big array (INT_MAX elements, about 2G ints, roughly 8 GB!), the compiler eats up all available memory and finally dies with an error. if you don’t have enough physical memory and the page file is not disabled, the OS thrashes the hdd for a long time, slowing down the whole system.

the same trick works with some assemblers – just define a huge array (TASM and MASM are affected).

I wonder: how many translators have this bug? please download the source code, try to compile it, and mail the result to info#re-lab.org or leave a comment here. please test every translator you have at hand, not only C/C++ but DELPHI/Pascal/ASM as well. thanks!


10 Comments

  1. Hi,

    Interesting idea ;> I’ve run a few tests, here are the results:

    MinGW gcc 3.4.5, “Linux” gcc 2.95 / 3.4 / 4.1, and “QNX” gcc 2.95.3 all give:
    c-DoS.c:32: error: size of array `hack' is too large

    MS C/C++ compiler from VS 2003 (ver 13.10.3077):
    c-DoS.c(32) : error C2148: total size of array must not exceed ffffffff bytes

    MS C/C++ compiler from VS 9.0 (ver 15.00.21022.08):
    c-DoS.c(32) : error C2148: total size of array must not exceed 0x7fffffff bytes

    Pelles ISO C Compiler, Version 3.00.10 (old):
    c-DoS.c(32): error #2133: Size of 'array of int' exceeds 2147483647 bytes.
    c-DoS.c(32): error #2139: Too many initializers.

    Logiciels/Informatique lcc-win32 version 3.8. Compilation date: Jul 25 2007 00:35:32:
    Error c-dos.c: 32 size of 'array of int' exceeds 2147483647 bytes
    Error c-dos.c: 32 too many initializers

    NASM version 0.98.39 compiled on Jan 16 2005:
    Started eating RAM; after about 30 seconds it stopped and exited with:
    nasm: fatal: out of memory

    Well, that would be it for things I’ve got here…

  2. thanks! it’s interesting! could you decrease the size of the array (say, INT_MAX/2 or INT_MAX/4) and recompile? just to find the size at which a compiler doesn’t give an immediate error but instead tries to allocate a huge memory block, causing a DoS?

  3. Very interesting idea. I tested it with LCC-Win32 (a free, lightweight compiler): it immediately errors out on INT_MAX (it knows that the resulting structure would be bigger than 2 GB) and exits quite fast with INT_MAX/4 (indicating that it couldn’t allocate enough memory).

    Another idea would be to test this on 64-bit hosts and/or with the executable targeting 64-bit (to avoid the compiler erroring out at the 2 GB limit).

  4. Sure,

    the Windows tests, except for Pelles, were run on a machine with 4 GB of memory (running Vista x64)
    Pelles was run on a machine with 1 GB of memory @ XP
    the Linux tests were run on a machine with 512 MB of memory @ Debian

    MinGW gcc 3.4.5:
    INT_MAX/2 still displays error
    INT_MAX/4 starts to eat some RAM (around 400 MB here), then starts writing an .o file, but stops after around 2 GB written with the error:
    m:\/ccCsaaaa.s: Assembler messages:
    m:\/ccCsaaaa.s:37: Fatal error: can’t close m:\/cc4Kaaaa.o: File truncated
    0x1fffffff is the biggest array size that does not produce an immediate error

    MS VS9 15.00.21022.08 C/C++ Compiler:
    0x1fffffff: eats around 1.9 GB of RAM for about 10 seconds, then exits with an error:
    c-DoS.c : fatal error C1002: compiler is out of heap space in pass 2
    Every higher size causes the error about array size from the previous post to be displayed.

    MS VS2k3 13.10.3077 C/C++ Compiler:
    0x1fffffff to 0x3fffffff: like the previous one, the error displayed:
    fatal error C1002: compiler is out of heap space in pass 2
    Every higher size causes the error about array size from the previous post to be displayed.

    “Linux” gcc 4.1:
    at 0x1fffffff: does something for some time (creates a 2GB file in /tmp, .o), and then it quits with the message:
    /usr/bin/ld:/tmp/ccQtdHzq.o: bfd_stat failed: Value too large for defined data type
    /usr/bin/ld: final link failed: Memory exhausted
    Every higher size causes the error about array size from the previous post to be displayed.

    “Linux” gcc 2.95:
    at 0x1fffffff: like gcc 4.1
    at 0x20000000 to 0x3fffffff: ehm.. just exits.. looks like a clean exit (exit_group(0)), no core dump, no errors.. no warnings.. huh
    Every higher size causes the error about array size from the previous post to be displayed.

    “Linux” gcc 3.4:
    like gcc 4.1

    Logiciels/Informatique lcc-win32 version 3.8. Compilation date: Jul 25 2007 00:35:32:
    at 0x1fffffff: “thinks” for about 5 seconds, then displays an error:
    Error c-dos.c: 36 compiler error in Unable to allocate -2147483636 bytes
    at 0x0fffffff: starts to eat up memory, eats about 1GB of it, thinks a while, and outputs a 1GB obj file without errors.
    at 0x1affffff: like with 0x1fffffff, but the size in the error is positive:
    Error c-dos.c: 37 compiler error in Unable to allocate 1811939340 bytes

    Pelles ISO C Compiler, Version 3.00.10 (old):
    at 0x1fffffff: thinks for some time, eats memory… still eating memory, the system started swapping stuff out, the page file grows and grows, over 2 GB now… CPU usage 100% due to the swap-outs… pagefile reached 2.5 GB, and the compiler issued an error message (but is still running, though memory usage is decreasing):
    *** No message for error 0x20008008 ***
    Still running… PF reached 2 GB, CPU usage at 30-50%… I’m going to get a sandwich… back… PF=1.5 GB, CPU still at the same usage level… PF dropped to 0.6 GB and the compiler finally stopped…

    Well, that’s that…
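    a quick check of the lcc numbers above, under my own hypothesis (not anything from the lcc sources) that lcc computes the block size as 4*N + 16 in a signed 32-bit int with two’s-complement wrap; both reported values then come out exactly:

    ```c
    #include <stdio.h>

    int main(void) {
        /* hypothesis: size = 4*N + 16, computed in a signed 32-bit int */
        int big   = (int)(4u * 0x1fffffffu + 16u); /* wraps negative     */
        int small = (int)(4u * 0x1affffffu + 16u); /* still positive     */
        printf("%d\n%d\n", big, small); /* -2147483636 and 1811939340,
                                           matching the two lcc errors   */
        return 0;
    }
    ```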

  5. How is this a bug, exactly? The compiler is doing exactly what the programmer intended it to. If anything, it’s a programmer error for not understanding the consequence of allocating an extremely large array.

  6. @Skeptic
    As I understand it, the point is that you do not always compile your own code (this is true especially on *nix systems). At some point there might be a bug where the compiler fails the allocation but assumes it was OK and starts filling in the data. This might result in a DoS, or in some cases even an arbitrary-data-written-to-invalid-memory-location kind of bug. Let’s pretend it does end up with something like that, and it allows code execution: that would mean a machine takeover just by compiling (not even executing!) some code.
    A DoS that would use up much memory and CPU resources may also be unwelcome in some situations.

  7. @Gynvael: … except that you just tested it with a bunch of compilers, and they all exited gracefully with an error message (the proper behavior).

  8. Tested Python, PHP, and FASM.

    FASM has a limit for array size around 3.95 MB; if the array is larger, it reports “Out of memory” error.

    Python hangs for a few seconds while trying to allocate the memory, then fails. The Python (2.x; note sys.maxint) code is:

    # If you are going to run this, please save your work first and close other applications
    import sys
    range(sys.maxint/16)

    PHP is protected with the memory_limit configuration setting, but it still hangs for a second if you try to allocate a larger array with array_fill.

  9. @Skeptic
    On that point you are correct, my friend ;>

  10. I compiled it with MSVS 2008.
    Result:
    error C2148: total size of array must not exceed 0x7fffffff bytes.
    warning C4307: '*' : integral constant overflow

Leave a comment

Comments are closed.