Much of the time, doubling the speed of your computer doubles the amount of data it can process in a given time interval. For instance, if a computer can crunch through 500 computations of a certain type in ten seconds, then a machine twice as fast should be able to crunch through 1000 computations of the same type in ten seconds.

Sadly, this is not always true. For NP problems with no known polynomial-time algorithm, the best-known algorithms take exponential time, so doubling the speed of the computer only lets you handle a fixed additional amount of input.

For example, suppose our 100 MHz machine can process 500 units of data in 10 seconds. A 200 MHz machine will not necessarily be able to handle 1000 units; it may only be able to handle 550. To handle 600, we would need a 400 MHz machine, and so on.
Units (per 10 s)    Speed
500                 100 MHz
550                 200 MHz
600                 400 MHz
650                 800 MHz
700                 1.6 GHz
750                 3.2 GHz
800                 6.4 GHz
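To make the pattern concrete, here is a minimal sketch in Python, assuming the toy model behind the table (every doubling of clock speed buys exactly 50 more units in the ten-second window). It inverts the relationship to compute the speed a given workload demands:

    import math

    # Toy model from the table (an assumption for illustration):
    # a 100 MHz machine handles 500 units in 10 seconds, and each
    # doubling of the clock speed adds only 50 more units.
    BASE_SPEED_MHZ = 100
    BASE_UNITS = 500
    UNITS_PER_DOUBLING = 50

    def units_handled(speed_mhz):
        """Units processed in 10 seconds at the given clock speed."""
        doublings = math.log2(speed_mhz / BASE_SPEED_MHZ)
        return BASE_UNITS + UNITS_PER_DOUBLING * doublings

    def speed_required(units):
        """Clock speed (in MHz) needed to process `units` in 10 seconds."""
        doublings = (units - BASE_UNITS) / UNITS_PER_DOUBLING
        return BASE_SPEED_MHZ * 2 ** doublings

    # Reproduces the table above: 500 -> 100 MHz, ..., 800 -> 6,400 MHz.
    for u in range(500, 850, 50):
        print(f"{u} units -> {speed_required(u):,.0f} MHz")

Each extra 50 units doubles the required speed, which is exactly the exponential blow-up the table shows.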


As you can see, solving a significantly large problem (one million units, for instance) is going to require a computer faster than the combined speeds of all computers in existence.
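A back-of-the-envelope check under the same assumed toy model shows why: the clock speed needed for one million units is a number with roughly six thousand decimal digits, while any estimate of the combined computing power on Earth falls short by thousands of orders of magnitude.

    import math

    # Same toy model as above: 50 extra units per doubling of speed.
    doublings = (1_000_000 - 500) / 50      # 19,990 doublings needed
    digits = doublings * math.log10(2)      # decimal digits of the multiplier
    print(f"required speed ~ 10^{digits:.0f} x 100 MHz")
    # -> required speed ~ 10^6018 x 100 MHz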

If P were equal to NP (though this is widely believed to be unlikely), that would mean there does in fact exist some algorithm able to solve the problem in polynomial time.
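For contrast, here is a sketch of what a polynomial-time algorithm would buy us, assuming purely for illustration that the running time grows like n squared: doubling the machine's speed then multiplies the feasible workload by about 1.41, a multiplicative gain rather than a fixed additive one.

    # Illustrative assumption: running time grows like n**2, so a
    # machine that is k times faster handles k**(1/2) times the workload.
    def poly_units(speed_mhz, base_speed=100.0, base_units=500.0, degree=2):
        """Workload feasible in 10 s if running time is O(n**degree)."""
        return base_units * (speed_mhz / base_speed) ** (1 / degree)

    for speed in (100, 200, 400, 800):
        print(f"{speed} MHz -> {poly_units(speed):,.0f} units")
    # 100 MHz -> 500, 200 MHz -> 707, 400 MHz -> 1,000, 800 MHz -> 1,414

Under this assumption, the 6.4 GHz machine from the table would handle 4,000 units instead of 800, and a million units would need a fast cluster rather than an impossible one.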