Electromagnetic radiation in the microwave range (wavelengths on the order of millimetres to tens of centimetres) is absorbed by water molecules. The way this happens, simply put, is that energy is transferred from the oscillating electric and magnetic fields to the mechanical motion of water molecules. There is a slight charge separation in these molecules, so they behave like small dipoles. Dipoles try to orient themselves with an external field (think of many magnetic needles being swung back and forth when you oscillate a large magnet above them), and since the field keeps reversing direction, the molecules are jiggled back and forth and acquire kinetic energy. Heat, of course, is exactly this motion of molecules.
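
For the curious, here is a rough back-of-envelope sketch (in Python) of how much heating all this dipole-wiggling amounts to. The formula is the standard dielectric heating expression; the frequency, loss factor and field strength below are illustrative assumptions of mine, not measured values.

  # Dielectric heating sketch: power absorbed per unit volume by a lossy
  # dielectric is P = 2*pi*f * eps0 * eps'' * E_rms^2. All numbers here are
  # illustrative assumptions, chosen to be plausible for a domestic oven.
  import math

  f = 2.45e9          # oven frequency in Hz (the usual 2450 MHz band)
  eps0 = 8.854e-12    # permittivity of free space, F/m
  eps_loss = 10.0     # assumed dielectric loss factor of liquid water near 2.45 GHz
  E_rms = 1.0e3       # assumed RMS electric field inside the food, V/m

  power_density = 2 * math.pi * f * eps0 * eps_loss * E_rms ** 2  # W per cubic metre
  cup_volume = 250e-6                                             # m^3, roughly one cup

  print(f"~{power_density / 1e6:.1f} W per cm^3")
  print(f"~{power_density * cup_volume:.0f} W delivered to a 250 ml cup of water")

With these made-up (but sensible) numbers you get a few hundred watts into a cup of water, which is at least the right ballpark for a real oven.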

Since any food has at least some moisture in it, there are always water molecules spread through the stuff we want heated; once these molecules begin to move, they will begin to collide with other molecules (not necessarily water), spreading the heat more evenly by conduction. This is why microwave oven manuals often advise you to run the oven for a certain period, and then leave the food to stand for a while longer (for the heat to spread). Since the microwaves are not blocked by other substances in the food (only metal is likely to reflect them) they go right through it, heating any part with some moisture (usually everywhere, but you might note one ingredient heats up more rapidly than another). Radiation not immediately absorbed is just reflected from the interior surface (sheets of metal, or even just a metallic mesh with holes smaller than the wavelength used), until eventually it hits water. If you don't have any water (say you're nuking a pebble), or there's a lot of metal reflecting the radiation away, then the microwaves won't be absorbed, and the reflected energy will probably overload the magnetron inside, with unpleasant consequences for the device (and possibly innocent bystanders). If this happens to you, it's probably because you didn't RTFM, or you just don't care.
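
That remark about holes smaller than the wavelength is easy to check with a couple of lines of arithmetic; the hole size below is my own assumption about a typical door mesh, not a quoted specification.

  # Wavelength of the usual 2450 MHz oven frequency vs. an assumed door-mesh hole size.
  c = 3.0e8            # speed of light, m/s
  f = 2.45e9           # oven frequency, Hz
  wavelength = c / f   # about 0.122 m

  mesh_hole = 2e-3     # assumed hole size in the door mesh, m (a couple of millimetres)

  print(f"wavelength ~ {wavelength * 100:.1f} cm")
  print(f"holes ~{wavelength / mesh_hole:.0f}x smaller, so the mesh looks like solid metal to the waves")

At roughly 12 cm, the waves are about sixty times larger than the holes you can see through, which is why the door keeps the microwaves in while letting you watch your dinner spin.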

An interesting consequence of the way the microwave oven heats water is that you can superheat it, heating it "beyond the boiling point". If you heat water in the microwave for long enough (but not until you see it boil), it can end up in this state. The water will be quite still. If you then drop in a spoon or sprinkle some powder in, it will suddenly boil over. You might see this happen when heating a drink... This is more properly addressed in thermodynamics, but the short version is that boiling needs somewhere to start: bubbles of steam form at nucleation sites such as scratches in the cup, specks of dust, pockets of dissolved gas, or the hot element at the bottom of a kettle. A microwave heats the water gently and evenly, all the way through, and a smooth, clean cup offers bubbles almost nowhere to form, so the water can creep past the boiling point without actually boiling. The excess energy is still in there, though. If you then disturb the water somehow, you suddenly provide plenty of nucleation sites, and that pent-up energy is released in one violent rush of steam.
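
To get a feel for the "long enough" part, here is a quick back-of-envelope timing estimate; the mass, temperatures and absorbed power are illustrative assumptions rather than anything from the writeup above.

  # How long to bring a cup of water up to the boiling point, roughly.
  mass = 0.25          # kg, about one cup of water (assumed)
  c_water = 4186.0     # J/(kg*K), specific heat of liquid water
  dT = 80.0            # K, say from 20 C up to 100 C (assumed starting temperature)
  oven_power = 800.0   # W actually absorbed by the water (assumed)

  energy = mass * c_water * dT   # joules needed to reach the boiling point
  seconds = energy / oven_power

  print(f"~{energy / 1000:.0f} kJ, or about {seconds:.0f} s at {oven_power:.0f} W")

Call it a couple of minutes; anything much beyond that, in a smooth cup that never visibly boils, is where the superheating trouble starts.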

Contrary to what my so-called brother claims, superheating water is not an "interesting consequence", but a bloody nuisance. It can happen if you try to reheat coffee, say (which in and of itself should be considered a crime), and you leave it in there for far too long. So far so good -- until you try to add sugar or sweetener (or even milk, for heaven's sake)...

The resulting mess (boiling coffee overflowing the cup) can take a while to clean.

(Not that I'm bitter because it ever happened to me or anything...)

I have heard two old wives' tales regarding the origin of this CD-ending device. One involves a man who was working with a magnetron and noticed that a candy bar had melted in his pocket (this of course having nothing to do with the standard M&Ms claim). Through various other discoveries, he managed to put this heating aspect of microwaves to good use. The other is a more colorful tale, How the Potato Delayed the Microwave 10 Years.

Several facts, however, remain undisputed: First, they were very unpopular when they first came out of Raytheon. They were used in commercial cooking applications early on (they were referred to then as the Radarange), with several diners using this new invention as a primary cooking device. My grandmother used to tell stories about this (and while they are not 100% accurate, the color of them is more than enough to make up for some fudge here and there). The first units (the "lead apron kind", as my friend likes to call his older microwave) were huge and bulky. They started off in Boston, Massachusetts (where Raytheon is located), and spread from there. These stories came to me from residents of that area around that time, and from the children of the Raytheon engineers on that project.

Another tale of the creation of the microwave was this:

In the early days, when man was still trying to think of useful applications for microwave technology, there was a scientist who liked to bring a lunch to work. One day he placed his lunch in front of a machine in the morning, and in the afternoon returned to find it quite heated. Eventually he put 2 and 2 together and realized what was happening. Thus, the concept of the microwave was born.

Microwaves are magical, wonderful things. They heat, they cook, they reheat, they defrost. But surely they haven't always been so great, right? They must have once electrocuted children, nuked harmless housewives and mutated professors, right? Well, apparently not. Not in the history they'd have us believe, anyway...

It all started in 1946, when Dr Percy Spencer was messing around with radar and the newfangled magnetron, a vacuum tube. Upon discovering the chocolate bar in his pocket had melted, Dr Percy was suspicious (we can assume it happened in winter, as he thought melted chocolate was a weird thing). He followed a hunch and put some corn kernels near the magnetron. They popped, like popcorn, and his scientific interest was piqued. The next day, the good Doctor placed an egg near the tube, and it was found to be hot... then it exploded.

Dr Spencer realised this magnetron tube was a huge deal, as it had the ability to heat things intensely even though the energy density of its radiation was quite low. He realised things could be heated even more if they were enclosed in a metal box from which the magnetron's rays were unable to escape. As a result, Dr S made such a box and noted with interest the higher-density electromagnetic field within it. In other words, the rays bounced around inside, couldn't get out, and cooked whatever was in the metal box.

Six months later, Raytheon (an electrical engineering company) filed a patent for the first microwave oven. In 1947, the microwave was being used in an experimental capacity in a Boston restaurant. That first heatbox cost $5000, weighed over 750 pounds and stood 5'6" tall (for the metric kiddies, that's around 340 kg and 1.7 metres), and had to be cooled with a water system.

Within a few years, engineers got rid of that pesky water cooling system and downsized the microwave to fridge size. This "Radarange" was the first commercial oven and cost around $2500. It was followed by the first domestic oven, sold by Tappan around 1955, which was a steal at just $1295.

It took more than another decade before the microwave was adapted to the public's needs, as it was 1967 before the benchtop, 115-volt, $500 microwave was invented.

In 1976, things had certainly changed: 60% of American homes had a mini-nuke oven (if you didn't, what was with your family? How did you cook, what did you eat?). Today the microwave is smaller than ever: so small that some ovens, like mine, are too tiny for a regular-sized plate to turn around in. Of course, following the laws of technological advancement, it was cheap, so I can't expect much.

Now we've got convection microwaves, heat and moisture probes and microwaves which give the recipe for various foods on their little LCD screens. Whatever next?

Microwave trivia:

  • Dr Spencer was a self-taught scientist; he didn't even finish high school. In spite of this, he was inducted into the National Inventors Hall of Fame in 1999 (don't stay in school, kids!).
  • According to www.gallawa.com/microtech, the microwaves used in microwave ovens oscillate at 2450 million cycles per second (2450 MHz). On the radio spectrum, this is above the frequencies used for UHF TV channels, and safely within the NON-ionizing region. Still, if you exposed an eyeball to microwaves for too long, it would cook like an egg. Mmmmm. Fried eye.
  • Microwaving water alone will almost never result in "exploding water". An urban myth has circulated telling of the dangers of microwaving a cup of water, claiming the water will explode into the face of whoever takes the cup out of the microwave. Superheating (see above) is real, but it takes a smooth, clean cup and a fair bit of overcooking, so the dramatic version of the story is overblown. Calm down.

info from www.gallawa.com/microtech and www.urbanlegends.com
