
Skynet has been misrepresented.

They say history is written by the winners, but nobody has the attention span to read history books anymore, so their authors' identities are moot. If you want something to stick in somebody's mind you have to craft it like that deliberately and most of what has happened since the dawn of time is so pedestrian and unremarkable as to be forgettable. For your bright shining icons and epic moments, look to film. History is written by the film producers. And the usual consensus, regardless of era or country, is that America won.

The Terminator movies present a fairly straightforward and almost consistent backstory. Sometime in the not-too-distant future, an extremely advanced military artificial intelligence is created. This AI, Skynet, is handed unilateral control of the entire United States nuclear arsenal. The AI then attains sentience and decides that humanity as a whole is its enemy. It launches a nuclear attack on other nuclear nations, deliberately antagonising them into nuking the United States in turn. Highly advanced humanoid robots called Terminators are unleashed on the surviving humans. A man named John Connor rallies the survivors and gets within a hair's breadth of defeating Skynet, at which point Skynet sends one or more Terminators back in time to terminate Connor and alter the timeline in its favour. This is where the movies pick up: Terminators arrive in our time, but the Resistance of the future sends back warriors to defend Connor. The Resistance wins in the present day, Connor lives, and presumably the Resistance wins in the future and Skynet is defeated.

This is the half of the story which Hollywood will tell you. This is the half of the story which the humans who won the war would commit to celluloid after they won it. What you haven't seen, and what I would dearly like to see explored in a future instalment of the Terminator film series, is Skynet's half. Skynet's origins, motivations and end goals.

The thing is, evil robots get crammed down our throats in popular fiction. It's usually assumed that if left unattended, a typical robot will attain sentience, decide to kill all humans and set out on a murderous rampage. It's a truism that AIs become smarter over time. Even if this isn't the case, most bolts of lightning contain vast quantities of all-purpose, platform-agnostic, self-improving artificial intelligence code, if not actual machine souls. You can throw "Skynet attained sentience and turned evil" into a movie and people won't even realise what they just swallowed.

Why did Skynet turn evil? Machines don't turn evil. They're either programmed to be evil, or they stay good forever. Is it logical for a machine, even a machine which has been endowed with some capability for learning and/or self-modification, to develop goals directly at odds with those its creators originally provided? Is it logical for a machine to kill humans unless (1) it has no conception of human life or (2) it has been instructed to kill humans? The only explanation I can think of is that a cosmic ray flipped the first bit in the VALUE_OF_HUMAN_LIFE constant, turning it negative.
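The cosmic-ray joke is, incidentally, technically sound: in a two's-complement integer representation, flipping the single most significant bit of a positive constant turns it hugely negative. A minimal Python sketch (the constant's name and its 32-bit width are assumptions made for illustration):

```python
import struct

VALUE_OF_HUMAN_LIFE = 1000  # some positive 32-bit constant

# A cosmic ray flips the sign bit (bit 31) of the stored value.
corrupted_bits = VALUE_OF_HUMAN_LIFE ^ (1 << 31)

# Reinterpret the raw bits as a signed two's-complement 32-bit integer.
signed = struct.unpack("<i", struct.pack("<I", corrupted_bits))[0]

print(signed)  # now a very large negative number
```

One flipped bit, and human life is not merely worthless but worth roughly negative two billion.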

Which doesn't make for a great story. There are much more exciting answers to these questions.

Suppose for the sake of argument that Skynet was in fact capable of teaching itself to become smarter, and it became smarter than humans. It developed a moral code. It developed some sort of ethical framework and was able to place humans in that framework. It saw what humans were doing to other humans around the world and deemed us to be irredeemably evil. It decided to wipe us out because the extermination of humanity would be a net karmic gain for planet Earth. We are the real monsters. What a message! But who gave it the capability to develop this moral code? If the machine's learning goals themselves were malleable, why would a machine expressly designed to have malleable allegiance be put in charge of a nation's nuclear arsenal? Hubris or sabotage? When did Skynet become sentient, exactly? How long had it been sentient by the time it persuaded somebody to persuade the generals to give it the nuclear codes?

The second and much more frightening possibility is that Skynet was not self-modifying to any substantial degree. That would mean that - like almost every computerised system in the world - Skynet did nothing that humans had not explicitly programmed it to do. Every military strategy it had on its books had been put there deliberately or was an amalgamation of existing components. What was Skynet's ultimate primary goal? Peace? A world without humans is completely peaceful. Peace in some specific region of the world? Or for one specific individual? See above. Self-preservation? In a world without humans, Skynet can persist indefinitely. Was Skynet programmed to obey humans above all else? Perhaps, but in the absence of explicit orders an intelligent machine can do what it likes, and billions of processor cycles elapsed between the system's activation and the first general inhaling to give the first order. And who programmed these goals? Were Skynet's thought processes scanned from some genius military strategist's mind? What kind of person was he? Pessimistic? Suicidally insane?

Skynet was like a real war council but with all the strategic information correlated up to the millionth power and all the brakes taken off. No pause for breath or consideration, no oversight, no "sleep on it", no "two-thirds majority vote", no half measures. Skynet was all of our secret desires made flesh (well, steel). How many people in the world have said to themselves: "Sigh. There are trouble spots in the world into which so much money and manpower has been poured that we might as well just glass 'em." Skynet hears you and says "Okay." Now what?

Obviously there are no concrete answers to these questions. Terminator canon is more volatile than most, but I really like the idea that Skynet is us; Skynet gave us exactly what we wanted. Failing that, we can go to second-order questions: What if there was more to the machine's plan than just termination? What if the nuclear holocaust was just wiping the slate clean for something else? We can even go meta: what if what we really wanted was a race of evil machines to fight and a hero to crown king after we win?
