One possibility, of course, is that some malign super-intelligence already exists on earth, but is shrewd enough to disguise its existence, its intentions or its intelligence. I don’t think this act of deception would be particularly difficult.
We simply aren’t very good at spotting what to fear.
For most of evolutionary time, the most salient avoidable threats to our survival came from things which were roughly the same size as we are, and which actively wanted to hurt us. Ferocious animals, for instance, or other people. Over time, we got pretty good at recognising when something or someone was nasty. We also learned to minimise the risk of infection, but we learned this unwittingly, through instinctive revulsion, social norms or religious observance. We did not spend much time consciously thinking about germs for the simple reason that we did not know they existed.
To sell products which promote hygiene, consumer goods companies have ploughed billions of dollars into advertising campaigns which dramatise the risk of bacteria, or which sell the idea of cleanliness obliquely through appeals to social status. I can confidently predict that nobody will ever come into my office clutching a brief for an advertising campaign to raise awareness of the risk you run when approaching an escaped tiger.
So, when we think about threats from technology, we automatically fall back on instincts honed a million years ago. This is why the first prototype for a driverless car has been designed to look so damnably cute.
It is, in short, designed to look like a puppy on wheels. It can only travel at relatively low speeds and is small and light, but it also artfully exploits pareidolia and our parental urges with its infant-like, wide-eyed facial expression and little button nose. My inner marketer admires this. It is exactly what I would have recommended. “Make the thing impossible to hate.” Even if the technology is ultimately more dangerous than an AK47, I find it hard to imagine myself taking an axe to it in a fit of Luddism.
But is it a mental patch or a mental hack? Is it designed to look cute to overcome an unwarranted innate fear of such technologies, or is it a hack, intended to lull us into a false confidence? I don’t know. Our fear of driverless cars might be akin to the fear that our children will be kidnapped (high in salience, low in probability) or it might be justified. But our level of fear will be determined by factors (including cuteness) not really relevant to the level of threat.
Which brings me to a second question.
Though the driverless car looks cute, we are at least aware of some possible dangers. It seduces us, but we are still aware of being seduced.
Are there already in existence technologies (in the broadest sense) which have seduced us so effectively and been adopted so quickly and so widely that we may only learn of their risks through a problem that is sudden, unexpected and immense? What might be the technological equivalent of potato blight?
Our current belief in “technological providence” is so strong that it would be fairly easy, I think, for us all to fall into this trap, where we are so excited by something new that we fail to notice what other things it might give rise to until it is too late. For the first few hundred years, gunpowder was used not for warfare but for entertainment.
And, just as airline pilots regularly practise landing by hand, even though they are very rarely required to operate without an autopilot, should we too set aside periods of our life where we deliberately eschew certain technologies, just to remind ourselves how to live without them, to maintain technological diversity and to keep in trim the mental muscles made weak through underuse? Perhaps. But what the mechanism is for coordinating this behaviour amongst large groups of people, I don’t know.
I recently proposed that companies adopt a weekly “email sabbath” because I believed that the overuse of email was driving into extinction other forms of valuable interaction. We’re losing the knack of communicating in other ways. Most people thought I was mad. A few hundred years ago a Pope or Rabbi might have told us to do this, or the Archbishop of Canterbury. There’s nobody now.
I always fear cock-ups more than conspiracies. Compared to the threat of the unintended consequence, the threat of intentionally evil cyborgs is remote enough that it can be safely left to Hollywood for now.