In computer systems, Amdahl's Law shows how much performance increase one can expect from any kind of
enhancement: improved CPU architecture, optimising compiler technology, parallelism, higher-speed memory, or
anything else. The law itself is fairly intuitive, but its results are surprising and often very disappointing.
Simply stated, assuming one can test a computer system without the enhancement, where E is the fraction of the
original execution time that could be influenced by the improvement, and S is the expected speed-up given by the
improvement, the overall speed-up is given by:
overall speed-up = 1 / ((1 - E) + E/S)
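The formula is easy to translate into a few lines of code for experimenting with different values; the sketch below is illustrative only (the name amdahl_speedup is an assumed helper, not from any particular library):

    def amdahl_speedup(e, s):
        # e: fraction of the original execution time affected by the improvement
        # s: speed-up factor applied to that fraction
        return 1.0 / ((1.0 - e) + (e / s))
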
To give an example, suppose that 95% of our execution time is due to a program segment that can be improved by using
a smarter compiler that will give it a 100x speed-up. This seems like quite a generous and unlikely
situation, and one would expect a massive, near-100x overall speed-up. But substituting E = 0.95 and S = 100 gives a
speed-up of only about 17 times. With S = 10,000, we still only get a 20x boost! Ack!
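Evaluating the same numbers with the amdahl_speedup sketch above confirms the arithmetic:

    print(amdahl_speedup(0.95, 100))     # ~16.8, i.e. roughly a 17x overall speed-up
    print(amdahl_speedup(0.95, 10_000))  # ~20.0, barely better despite S = 10,000
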
Qualitatively, one might say that as we improve the performance of the original 95%, the unimproved 5% comes to
dominate the total execution time, and so limits the overall speed-up.