C has a built-in tendency to produce programs with buffer overflows. Most Unix variants and Unix programs are written in C; consequently, buffer overflows take the lion's share of software problems there, and the majority of security-violating bugs on Unix systems are due to them. Try finding a Y2K problem in Unix software: I bet that, measured against buffer overflows, the rate is about 1 in 100. Taking the severity of the problems into account, Y2K becomes even less important. Not to mention that buffer overflows can be very hard to find.

The cause of the problem is C's shallow notion of datatypes. Sure, it supports arrays, structs, unions, and enums, but they can always be bypassed through pointers that point directly into memory. And although C's type checking helps a little, there is no bounds checking on arrays: you can allocate a 3-item array 'foo' and then write to its 5th element without even getting a warning, as the snippet below illustrates.

This does make the language small and easy to implement, and add-on libraries can retrofit runtime checks against buffer overflows, but it would be better if the language itself were designed to support such checks, with disabling their overhead left as an optimization option.

In the early 1970s, when C was designed, the choice was defensible: CPU time and memory were expensive, and humans were considered cheap and reasonably flawless by comparison. Today, human error is probably the main bottleneck in software projects, so the lack of programmer protection in C must be considered a Very Bad Thing (TM).
