
Moore's Law /morz law/ prov.

The observation that the logic density of silicon integrated circuits has closely followed the curve (bits per square inch) = 2^(t - 1962) where t is time in years; that is, the amount of information storable on a given amount of silicon has roughly doubled every year since the technology was invented. This relation, first uttered in 1964 by semiconductor engineer Gordon Moore (who co-founded Intel four years later) held until the late 1970s, at which point the doubling period slowed to 18 months. The doubling period remained at that value through time of writing (late 1999). Moore's Law is apparently self-fulfilling. The implication is that somebody, somewhere is going to be able to build a better chip than you if you rest on your laurels, so you'd better start pushing hard on the problem. See also Parkinson's Law of Data and Gates's Law.

--The Jargon File version 4.3.1, ed. ESR, autonoded by rescdsk.
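
(Not part of the Jargon File entry: as a rough illustration of the curve it quotes, here is a small Python sketch of the original yearly doubling and the later 18-month doubling. The 1978 "knee" year is my own stand-in for "the late 1970s" mentioned in the entry.)

    # A rough illustration only: the curve quoted above, in bits per square inch.
    # Original form: density(t) = 2 ** (t - 1962), with t in years and yearly doubling.
    # After the late 1970s the doubling period slowed to 18 months (1.5 years);
    # the 1978 "knee" year below is an assumption standing in for "the late 1970s".

    def density(t, knee=1978):
        """Bits per square inch: yearly doubling up to the knee, 18-month doubling after."""
        if t <= knee:
            return 2 ** (t - 1962)
        return 2 ** (knee - 1962) * 2 ** ((t - knee) / 1.5)

    for year in (1965, 1978, 1999):
        print(year, f"{density(year):.3g}")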

Moore's Law will soon reach the point where further advances are no longer financially beneficial, for the same reasons that keep most airlines from flying supersonic jets. It can be done, but the costs are astronomical. Other problems include hard physical limits: 10 GHz chips simply won't be able to move signals between the CPU and memory quickly enough, because the speed of light caps how far a signal can travel in a single clock cycle.
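
A quick back-of-envelope check on that physical limit (my own numbers, not from the writeup above; the 10 GHz clock is the figure mentioned there):

    # Rough illustration of the speed-of-light limit mentioned above.
    C = 3.0e8          # speed of light in m/s (vacuum)
    CLOCK_HZ = 10e9    # the hypothetical 10 GHz clock from the paragraph above

    cycle_time = 1.0 / CLOCK_HZ          # seconds per clock cycle: 1e-10 s
    distance_cm = C * cycle_time * 100   # distance light covers in one cycle, in cm

    print(f"One cycle at 10 GHz lasts {cycle_time * 1e12:.0f} ps")
    print(f"Light travels about {distance_cm:.0f} cm in that time")
    # Signals in copper or silicon move at roughly half that speed or less,
    # so a round trip to memory tens of centimetres away cannot fit in one cycle.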

Moore's Law will most likely taper off within the next century. If the exponential growth it describes (in processor speed and capacity) were to continue unchecked, then in roughly 600 years a single computer would require all the matter and energy in existence.
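
As a rough sanity check on that 600-year figure (my own arithmetic, assuming the 18-month doubling period quoted above):

    # Illustrative extrapolation only: how absurd 600 more years of doubling gets.
    from math import log10

    DOUBLING_PERIOD_YEARS = 1.5   # the 18-month doubling period
    YEARS = 600

    doublings = YEARS / DOUBLING_PERIOD_YEARS      # 400 doublings
    growth_orders_of_magnitude = doublings * log10(2)   # ~120 orders of magnitude

    print(f"{doublings:.0f} doublings, a factor of about 10^{growth_orders_of_magnitude:.0f}")
    # For comparison, the observable universe holds on the order of 10^80 atoms,
    # so long before 600 years the extrapolation runs out of matter to compute with.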

Moore's Law has been a fundamental fact of life for everyone even remotely involved in the electronics and computer industry ever since Gordon Moore first postulated it in 1965, and is a testament to the rapid pace of innovation in those industries. It now holds true almost half a generation after Moore said it would likely end, and hardware manufacturers like Intel are investing mightily to make sure it doesn't stop being true for as long as the laws of physics allow it. However, there is a dark side to this law of geometric growth in hardware innovation, something which many of the tech-cognoscenti tend not to think about. It's not completely a Good Thing(tm), or perhaps it might be better to say that too much of a Good Thing(tm) is a Bad Thing(tm).

Planned (?) Obsolescence

To begin with, try turning around the way Moore's Law is usually stated. It says that any hardware you have today will become obsolete in approximately 18 months. It means that the computer hardware market is not stable. The pressure of Moore's Law on the market is such that any hardware becomes next to impossible to maintain once it becomes obsolete. The only solution: the dreadful upgrade cycle. As of this writing it is almost impossible to obtain spare parts for systems that are even as recent as three years old. Anyone who has tried to buy such "obsolete" commodities as EDO RAM or non-AGP video cards these days can attest to this. Never mind those of us who don't have enough disposable income to support an upgrade every 18 months. Microsoft has been attempting to impose the same kind of upgrade cycle on software, and it is succeeding, largely because of these pressures.

Fast Growth, Fast Bloat

Moore's Law is also arguably the primary mover for the phenomenon of software bloat. It allows software makers such as Microsoft to become sloppy with their design decisions because they can look at the Law and say to themselves: "It may be slow now, but in 18 months computing power will double and then be capable of running this bloated crap I'm putting out without being as slow as a tired snail." There is no pressure for mass market software to be clean, efficient, and elegant because the developers tend to count on the hardware to take up the slack and put up with all the cruft they put in. Not even Linux is immune to this phenomenon, as anyone who has tried to install recent versions of Red Hat Linux knows all too well.

The computer game industry is where this disease can be seen more vividly than anywhere else. You see every major gaming company stumbling over itself to put out a 3D game of some sort, even if there is no good reason for the game to be 3D. It's like they're doing it just because they can, even when they probably shouldn't. Ultima IX and WarCraft III are the poster children of this phenomenon. The addition of 3D to Ultima IX only served to make it a less rich and less satisfying game than any of its predecessors. 3D added nothing to WarCraft III except making its system requirements shoot way, way up. Does anyone actually shift the 3D view around when they play the game? In my experience, it's nothing but a distraction.

Technophobic

Another problem with Moore's Law continuing to hold is that computer technology keeps getting more and more complex all the time. The pace of innovation is so rapid that no one, not even tech-savvy geekdom, much less ordinary people, can keep up with everything. There is no time for people to take stock of what they have, what it can do, and what it means to them, because as soon as they're halfway through absorbing what they have, something new comes along and renders it obsolete. This bewilderingly rapid pace of innovation is terrifying to most ordinary people, and is at the core of most technophobia.

Ubiquitous Computing When?

This exponential growth mandated by Moore's Law is also quite possibly the biggest stumbling block for the development of ubiquitous computing. If a computer, at the price point where it has been stuck for years, becomes scrap 18 months later, then there is no drive for computers to become ubiquitous, as nobody can afford the waste that comes from constant upgrading. Robert X. Cringely gives an example in an article from November of last year (http://www.pbs.org/cringely/pulpit/pulpit20011115.html):

For just one example, look at the role of computers in schools. A textbook has a useful life of 10 years, a desk lasts 25, but a personal computer is scrap in three years. Which of these things costs the most? No wonder we are unable to put computers to good use in schools: the economics simply don't work. But what if Moore's Law did fade from the land, and suddenly a PC could labor away for 25 years? Then every child would have a desk and a computer. There would simply be no argument about it. Perhaps the desk would BE the computer. And would technical innovation cease? No. Haiku is limited to 17 syllables, yet still there are poets.

If we didn't have to throw away what we have every year or two, then the economics of the situation would actually drive the ubiquity of computers.
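
To make the economics of Cringely's example concrete, here is a rough amortized-cost sketch; the lifetimes are from the quote above, but the prices are hypothetical round numbers supplied purely for illustration:

    # Rough amortized cost per year for the school example quoted above.
    # Lifetimes are from the quote; the prices are made-up round numbers.
    items = {
        # name: (hypothetical price in dollars, useful life in years)
        "textbook": (50, 10),
        "desk": (250, 25),
        "personal computer": (1000, 3),
    }

    for name, (price, life_years) in items.items():
        per_year = price / life_years
        print(f"{name:17s} ${price:5d} over {life_years:2d} years = ${per_year:6.2f}/year")
    # Even though the PC's sticker price is only a few times the desk's,
    # its short useful life makes it by far the most expensive item per year.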

E-Waste And Environmental Scrap

(new August 24, 2002) Far worse is the irreparable harm that this 18-month cycle of waste does to the environment, especially in the Third World. Moore's Law means the world is being filled with the carcasses of obsolete technology at an ever-increasing rate. That computer sitting on your desk that you're using to read this node, once it becomes scrap in 18 months' time, will probably wind up in a landfill somewhere, along with its lead, its chromium, and its non-biodegradable plastics that give off cancer-causing dioxins when burned, among other dangerous pollutants. Computer hardware is in general highly toxic, most especially CRT video displays, and more and more of it is being turned into scrap, thanks to the rapid rate of obsolescence inherent in Moore's Law. This "e-waste" problem could very quickly become a not insignificant part of the forces that are turning the planet into a toxic wasteland. This whole topic of computer-related environmental pollution practically begs for a whole other node, so I'll say no more about it here.

(Note: This new section was inspired by a new BBC News Photo Essay that appeared just today, and reminded me of a major consequence of Moore's Law that I had previously overlooked:

http://news.bbc.co.uk/hi/english/static/in_depth/world/2002/disposable_planet/waste/chinese_workshop/

Some highly disturbing pictures of toxic waste dumps composed of dead computer equipment in a highly polluted Chinese village.)

Conclusion

All in all, Moore's Law, if allowed to continue in this day and age, can be summarized in only one word: WASTE. You have monetary waste, software waste, wasted opportunities, and even environmental waste being caused by its continuing truth. So I say, enough with Moore's Law. It may have been useful in the infancy and growing years of the computer and semiconductor industries, but by now I think we're way past that point. I'm not some Luddite who wishes for innovation to end, just for it to slow down so that the market stabilizes, software developers get pressured to make efficient, clean software, and people can take stock of what they have in terms of technology. It may also help reduce the long-term risks of environmental pollution, and give time for researchers to find better ways to reduce, reuse, and recycle computer hardware. This will bring long-deserved and much-awaited maturity to the computer industry and the high-tech sector in general.

The downside to Moore's law is that we are getting substandard processor architecture.

Moore's Law says that computer performance doubles roughly every 18 months. Compounded, that works out to an increase of about 12% every 3 months. Assume that half of that increase comes from improvements in chip-level process technology (smaller features, faster wires, better VLSI design); the other half has to come from the architecture. So if you want to introduce a new, improved method to, say, predict branches, and it will improve processor performance by 5%, then you have a bit over 70 days not only to get it done, but to integrate it with the millions of other transistors already in the processor and get the technology out the door.
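
As a quick back-of-envelope check on those figures (my own sketch; the 18-month doubling, the 50/50 process/architecture split, and the 5% branch-prediction example are from the paragraph above):

    # Sketch of the quarterly treadmill implied by an 18-month doubling period.
    quarterly_growth = 2 ** (3 / 18) - 1        # ~12.2% total improvement per quarter
    architecture_share = quarterly_growth / 2   # assume half comes from architecture

    feature_gain = 0.05                         # a 5% branch-prediction improvement, say
    days_budget = 90 * feature_gain / architecture_share

    print(f"Needed per quarter: {quarterly_growth:.1%} total, "
          f"{architecture_share:.1%} from architecture")
    print(f"A {feature_gain:.0%} feature is worth about {days_budget:.0f} days of schedule")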

Otherwise it's simply not worth it. If you can't deliver your 6% or so of processor performance every three months, you might as well go home and cry, because the AMDs of the world are hot on your tail and they'll eat your pie.

It's often hard to resist designing the technically correct solution. There is a difference between the 'correct' way to do it and the 'quick and dirty' way. But the reality is that, on a schedule like that, the correct solution is often not the right solution.

And that is why we have "fill in your favorite technically flawed hardware/software".
