Here is my rant on the so-called "modularization" you speak of.

Let's look at the past 5 years of PC hardware and just how "upgradable" it truly is. 5 years ago I purchased my favorite computer, which I named "Banpei" after a robot that a female anime character built in a manga (specifically, Skuld from Ah! My Goddess). This machine was the following:
From there, three more hard drives of various sizes were added and the original was swapped out. Not to mention a PCI Monster 3D.

So what does all this have to do with modularity, upgradability, and replaceability? (Is that even real English? ;) ) Simple. How much of that machine is truly replaceable?

The CPU? Nope, Intel abandoned the Socket 7 platform for the Slot 1.

The RAM? Nope. SDRAM became the standard with the 430TX chipset (the last Socket 7 chipset), and everything from the 440LX onward used SDRAM.

The motherboard? Not really; the T2P4 was AT form factor, which has also fallen out of the limelight.

The video card? Sure, as long as I make sure not to pick up an AGP card. In fact, while I'm talking about differing bus types, I should note that the ISA Sound Blaster isn't an option anymore. My latest motherboard has zero slots of that type.

And, in fact, most of the things I talked about above are going out the door right now anyway. SDRAM is becoming DDR. Slot 1 has already given way to Socket 370. ATX is on its way out so that Pentium 4s can get all the power they need.

My point is that computer technology is moving fast. Really, really fast. An all-in-one machine is no more evil than a modular machine that will never get upgraded. After all, by the time things start to break, or the owner of the computer thinks to themselves "you know, this program runs really, really slow, maybe I should upgrade," it's already too late.

After all, how many friends of yours have come to you, with machine specs not unlike the ones I listed above, asking what they can upgrade? And how pleased are they when they hear that they're basically better off starting from scratch?