Broadly speaking, there are two types of denial of service attacks:
Destructive attacks damage or destroy resources so you can't use them. Examples range from causing a disk crash that halts your system to deleting critical commands such as cc and ls. Although many of these attacks require shell access to the system, there are also network-based denial of service attacks that are designed to crash servers.
Overload attacks swamp some system service or exhaust some resource (either deliberately by an attacker, or accidentally as the result of a user's mistake), thus preventing others from using that service. The simplest type of overload involves filling up a disk partition so users and system programs can't create new files. The "bacteria" discussed in Chapter 23 perform this kind of attack. A network-based overload attack could bombard a network server with so many requests that it is unable to service them, or it could flood an organization's Internet connection so that there would be no bandwidth remaining to send desired information.
Many denial of service incidents are the result of bugs or inadvertent emergent behavior, rather than an intentional malicious attack. For example:
A programmer may make a typographical error, such as typing x=0 instead of x==0, which causes a program to never terminate. Over time, more and more copies of the program are left running, ultimately causing the denial of service.
A web server may be correctly sized for its anticipated user base, but one day a link to the web site may be posted on a vastly more popular site, such as CNN or Slashdot. The smaller web server may have insufficient computing power or bandwidth to satisfy the sudden surge in requests. Alternatively, the sudden increase in traffic may cause the server to crash because of a latent configuration error that did not matter under low-load conditions.
There may be inadvertent sequencing or timing dependencies in a system that do not appear under normal operation, but suddenly manifest themselves in a manner that causes damage. For example, a Unix system might be configured with a script that runs every five minutes to perform housecleaning functions. As long as the script finishes in less than five minutes, no problem is evident. But if one day the script runs a bit behind schedule and requires seven minutes, a second instance of it will be started before the first finishes. This, in turn, might cause the computer to run slower, so that the second copy requires nine minutes to complete. If the slowdown stretches the script past ten minutes, three copies will be running simultaneously, and so on. Hours later the computer might crash because it has 30 or 50 copies of the CPU-intensive cleanup script running.
Modern Unix systems provide many mechanisms for protecting against denial of service problems. Most versions of Unix allow you to limit the maximum number of files or processes a user may have, the amount of disk space each user is allotted, and even the amount of CPU time each user process may consume. Network services can likewise be limited in CPU time and request rate. Nevertheless, many Unix systems in the field remain vulnerable to denial of service attacks because these protective measures are typically neither enabled nor properly configured.