The first decade of the 21st century. The problem lies in how to pronounce it. Is it the 'oughts', the 'noughts', the 'double-O's', the 'zips', the 'zeroes'? We've got to decide soon, so the media analysts can get busy generalizing, mislabeling, and stereotyping it.

There's an old cartoon, made in the 1920s, featuring an old woman with flowers in her hat who smiles at a recollection of the "gay '90s". The 1890s, of course. Whoever penned the script could hardly have anticipated that 30 years later, after a thorough economic meltdown and a second continent-sweeping war, people would recall the "gay '20s". And he could never have anticipated that the '90s - the 1990s - would be the very last decade, ever.

When exactly did we begin assigning personalities to decades? After all, time is a smooth sweep, not pixelated into segments that happen to coincide with the number of fingers on the human hand. Trace steps back through the cynical and technological '90s (though every recent decade has thought of itself as technological), the yuppie '80s, the Me Decade (that's the '70s, for Gen Y-ers, who tend to "remember" it as an adjunct to the hippie era), the radical and revolutionary '60s, the clean-lined '50s, the decades of War, of poverty, of prosperity - and there the modern conception ends. Our images of the '10s and '00s (the "aughts", as they were called) consist of black-and-white bicycle riding. Historians can push the concept further, though: supposedly, the American Civil War marked a major starting point for decadedom, an event so defining that thought could not help but be regimented around it. Mark Twain wrote near the end of the 1860s that the time before the war was a time "before History was born - before Tradition had being."

So when exactly did we begin assigning personalities to decades? "Surely plague-ridden Europe was spared the indignity of town criers bellowing, 'Welcome to the Bubonic 1340s,'" commented one wit, "and students of ancient Rome have yet to uncover a text reading anything like 'Welcome to the Visigothic 410s'." Clearly, some prerequisites must be in place.

The first is a collective unconscious: a fairly universal knowledge of current events, and a fairly universal image of their effects. Before the advent of mass media, before technology shrank the distances between major centers enough for differences in cultural thinking to diffuse quickly into one another, only centuries could be personalized: historians studied events far in retrospect (decades are studied concurrently with their existence), and no great technological hurdle limited that diffusion when the scale of time was large enough. Intellectuals across the European continent shared a fairly unified horror of the Dark Ages by the time the Renaissance was in full bloom, for example, even if their analyses of current events differed radically.

A collective unconscious was completely impossible when most people had little connection to the world outside their farm, grew steadily as farms were abandoned for apartment blocks, and took a quantum leap (around the time of quantum mechanics, incidentally) with the development of recording technology. For the first time, auditory and visual culture could be communicated directly, across thousands of miles, rather than encoded into sheet music (decodable only with a musician handy) or textual description. Similarly, pure culture could now be preserved for posterity - how many people today would know about (let alone have opinions about) swing music if it could only be read, not heard? How many kindergarteners have absorbed the thinking-style of the 1950s in the form of Marvin the Martian?

The second, and most obvious, ingredient is a tendency to think of time in blocks. Many historians have attributed this to the usual suspects - consumerism, Americanism, short-attention-span-ism - but judging from the long, clearly recorded history of century-thinking, it is well-nigh universal (for those with a knowledge of history).

During the last few months of the '90s, a minor debate began over what to call the coming decade - would the '00s be the Naughts? The Aughts? The Ohs? The Zeros? The Double-Zeros? (Frankly, it all sounded pretty stupid.) Faced with a rash of differing standards, how would we ever say "Welcome to the '00s" as we had said "Welcome to the '90s", and "Welcome to the '80s" before it? Even decades long past had their slogans - John F. Kennedy, according to his campaign, was "A President for the '60s!"

But the Y2K bug was a far more pressing concern. Semantics would sort themselves out.

And they did, in a way no one expected (and no one but me seems to have noticed): apart from a rash of ads incorporating "that is so 20th century" into their text, The Decade dropped completely off the conversational and conceptual map. Despite its familiarity, it appears that the marriage of Decade and modern thought was one of convenience alone. A tiny, simple problem - nobody knows how to say "the '00s" - and suddenly we cease to think in 10-year chunks.

A complete halt. Do not pass Go, do not collect $200.

So what does the future hold for our intrepid meme? Will it be relegated to footnotehood in metahistorical textbooks? Can it stage a reincarnation once the verbal baggage is past? Will it rise like a phoenix from the ashes of conversational ineptitude? Will teenagers someday sneer, "Welcome to the '20s, dude"?

Only time will tell.
