According to http://whatis.techtarget.com/definition/0,,sid9_gci527311,00.html:

"An order of magnitude is an exponential change of plus-or-minus 1 in the value of a quantity or unit. The term is generally used in conjunction with power-of-10 scientific notation."
For example, the Richter Scale is a base-10 logarithmic scale: an earthquake measuring 8 produces seismic waves with 10 times the amplitude of one measuring 7, so it can correctly be said that an 8 is an order of magnitude stronger than a 7. Another example of a logarithmic scale is the decibel, typically used to measure sound or radiant energy (e.g. radio waves). For more information on decibels, see http://arts.ucsc.edu/EMS/Music/tech_background/TE-06/teces_06.html.
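To make the idea concrete, here's a minimal sketch (the function name is my own, not from any source) of how many base-10 orders of magnitude separate two positive quantities: it's just the difference of their base-10 logarithms, which is why one step on a base-10 logarithmic scale equals one order of magnitude.

```python
import math

def orders_of_magnitude(smaller, larger):
    """How many base-10 orders of magnitude separate two positive quantities."""
    return math.log10(larger) - math.log10(smaller)

# One step on a base-10 logarithmic scale is one order of magnitude:
print(orders_of_magnitude(1.0, 10.0))    # 1.0
print(orders_of_magnitude(1.0, 1000.0))  # 3.0
```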

Although I've never heard it used this way among my colleagues, isogolem points out that computer geeks might use "order of magnitude" to mean a doubling of a binary number, because in binary a doubling is, technically, an exponential change.
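The binary interpretation can be sketched the same way (again, the function name is mine): in base 2, each doubling is one order of magnitude, which shows up as one extra bit in the binary representation.

```python
import math

def orders_of_magnitude_base2(smaller, larger):
    """How many base-2 orders of magnitude (doublings) separate two quantities."""
    return math.log2(larger) - math.log2(smaller)

print(orders_of_magnitude_base2(64, 128))  # 1.0  -- one doubling
print(orders_of_magnitude_base2(1, 1024))  # 10.0 -- ten doublings
print(bin(64), bin(128))                   # one doubling adds one bit
```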

I suppose that when used rhetorically rather than scientifically, "order of magnitude" can just mean a really big difference, without specific quantification. In everyday speech, however, and occasionally in writing, you'll encounter a common misuse of "order of magnitude" to mean a doubling of a specific quantity. It's common because it sounds more dramatic than "a doubling," a bit like saying something is 200% of something else. Unlike 200%, however, using "order of magnitude" to refer to a doubling of a quantity not expressed in binary is incorrect. In my opinion, when "order of magnitude" refers to a specific, quantitative difference, it must refer to an exponential difference; furthermore, unless otherwise specified, quantities are assumed to be expressed in base 10.
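A quick calculation drives home why the doubling usage is wrong in base 10: a doubling is 200% of the original, but it covers only about three-tenths of an order of magnitude, since log10(2) is roughly 0.301.

```python
import math

# A doubling in base 10 is log10(2) of an order of magnitude -- far
# short of the full factor of 10 the term implies.
print(round(math.log10(2), 3))  # 0.301
```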