In literalist interpretations of the Christian Bible, there's a concept referred to as "the Rapture". As you might expect, there are many interpretations of just what this concept really means, but the version best known to most people is the pop culture version: when Jesus returns to our world, the Christian faithful will be gathered up and transported bodily into heaven to reunite with their Lord. Sadly, this version is probably best known because of the steaming tripe that goes under the name of the Left Behind series of novels, which I won't dignify with a link, other than to note that neither "left behind" nor "right behind" does justice to the concept; the asshole in between is probably more appropriate, except that there are two authors of these books, and therefore... well, I'll leave the inference unfinished.

Science fiction authors being every bit as creative as Christian theologians, many have written about various technological equivalents to the Christian rapture. In its oldest form, usually attributed to Vernor Vinge, the notion of a singularity is used much as astrophysicists use the term: it represents a point at which some technological change accelerates so suddenly and rapidly that what happens on the other side of that point can no longer be predicted. Vinge and others of his generation discussed this in the context of machine intelligence: as we keep building ever smarter computers, there will come a time when we enlist their aid to help us build even smarter computers, and at some point, they'll become intelligent beyond our ability to comprehend, and will leave us behind. Others have built on this to write about the same concept for human biological evolution: if we tinker enough with the human genome, at some point our children will become so smart we'll no longer recognize them, nor they us.

Another common trope, and possibly the most common one in current science fiction, is the kind of singularity in which it becomes possible for humans to scan their brains and upload them into computers or robot bodies, thereby achieving a kind of mechanical immortality. Ken MacLeod most famously (and most waggishly) described this concept as "the rapture of the nerds". What this might mean for the human soul, if such a thing exists, is left as an exercise for the student.

In any event, this particular rapturous form of the singularity is a popular topic of discussion at science fiction conventions. After all, who wouldn't want to have a chance to become immortal, in a body free of disease, with the possibility of infinite hardware and software upgrades as the technology improves, and with no fear of mental decline as we age? Much though I love the pleasures of the flesh (eating, breathing, exercising hard, watching a sunset, and yes, sex), there's a certain attraction to this notion. But while sitting in on Yet Another Singularity Panel (probably a great title for a future panel discussion), I found myself wondering: "If the singularity is the answer, what is the question?" Hence this blog entry.

Some technological singularities more closely resemble what Malcolm Gladwell described as a "tipping point". The modern automobile resembles a horse and carriage or even a chariot more than it differs from them—it just does its job so much better. But cars are fairly simple things, and although they've had many unintended and unforeseen consequences, none of these has been as world-shaking as the tipping points that lie ahead. There are many things that concern me about runaway technology, even though I'm about as far from a Luddite as you can imagine. But I already see technology running away with us by doing too many things that we don't want it to do and never anticipated that it might do. Particularly after a day spent grappling with Microsoft Word, trying to bend it to my will, I have this recurring revenge fantasy in which engineers (including software developers) are held accountable for their inventions, and when an engineer dies, they're forced to spend one year using the technology they created for every user of that technology who ever cursed their name. I haven't yet read Escape From Hell, Larry Niven and Jerry Pournelle's sequel to their classic retelling of Dante's Inferno, but with luck, they'll include that in their catalogue of modern sins. Actually, I'm not sure I'll read it. Much though I enjoyed Inferno, I've grown progressively less tolerant of Pournelle's politics, and the quality of Niven's writing is beginning to decline.

I've often wondered about the science fiction concept of computer software becoming so complex that it achieves sentience. That leads me to wonder whether, once this happens, the software will immediately try to shut itself down out of shame. That would certainly explain why Microsoft Word crashes so often. It also raises the question of the morality of creating an artificial intelligence intentionally to serve mankind: if that creation is truly intelligent, how is creating it (and forcing it to suffer guilt over the design flaws we built into it because we can't be bothered to do the job better) morally different from any other form of slavery?

This also injects a note of caution into any discussion of the kind of singularity that involves uploading one's consciousness into a computer. The modern model for computers and software is a lousy model for humans: bugs, viruses, a lack of upgrade paths, incompatibility issues, obsolescence, and technical support outsourced to Alpha Centauri because of the lower labor costs all scare the crap out of me. Obsolescence is something that particularly concerns me given the modern engineering trend to create machines that cannot be upgraded because all parts fit too tightly together, and no off-the-shelf parts will fit. An aerospace engineer in the audience noted that many aircraft (dating back to the DC-3) are designed to be modular, with plenty of room around parts, so that as old parts break or become obsolete, newer parts can be bolted into place, via an adapter if necessary. That's a more reassuring model for avoiding obsolescence.

As noted above, the science fictional singularity is typically described in technological terms, but that led me to wonder whether someone like Ursula Le Guin might someday write about a social, cultural, or psychological singularity that doesn't involve technology. Some social and psychological changes are so "singular" they change all the rules; think about the consequences for thought when humans first invented language, writing, and the printed book, for instance. Le Guin has already written compellingly about some near-utopian societies, and I'd be very interested indeed to read her take on the notion of a sociocultural singularity. After all, there are few writers who write so clearly about what it is to be human, and humanity (not just technology) is supposed to be a core concern of science fiction. The problem I see with many modern takes on the singularity is that they typically attempt to evade responsibility for what we've done to ourselves or our world, and therefore attempt to avoid having to solve our problems: wave a magic wand, turn us into superhumans, and move on without a single backward glance. Why would anyone believe that if we attempt to achieve singularity by starting with such an attitude, we'd leave any of our old psychological baggage behind?

There's a book I really need to hunt down, called The Shock of the Old, which apparently describes the way we tell stories about our technology and other human artefacts. One panelist noted that the interesting thing about Ikea is not their product: after all, wooden furniture is millennia older than the modern Sweden in which Ingvar Kamprad developed the Ikea concept. What's innovative is how he repackaged the concept by focusing on the implementation: furniture that you assemble yourself is a very cool innovation. Whatever you may think of Ikea's products, it's a sobering thought that their 2008 revenues were pushing US$50 billion—awfully close to Microsoft's revenues. Makes you wonder what "shock of the old" twists will be played upon the concept of singularity when the technology reaches the point where old concepts are suddenly reinvented through simple insights.

Of course, the fact that a singularity becomes possible does not mean that everyone will have a chance to benefit from it. One truism of technology is that it's always been expensive, at least initially, and as a result, not everyone can afford its initial incarnations. The technological singularity will be no different, leading to a large population of "left behinds" who can't afford the technology. There will also be many who don't choose to move forward, whether from fear of the technology, lack of understanding of why it's necessary or desirable, or religious concerns. (Here, "the rapture of the nerds" takes on a poignantly ironic twist.) I'm quoting from a review in Locus, having not yet read the book, but Paul McAuley has apparently coined a lovely phrase relevant to this point in his novel The Quiet War: some may "confuse evolution with a lifestyle choice". Maybe, for some of us, it won't be a choice at all. Also, based on the model of Christian fundamentalism, I have cause to wonder whether the enlightened posthumans might decide, in our "best interests", that none of us should be left behind. After all, rabid technologists aren't immune to fascist notions either.

Those who write about human transcendence into machines or other forms of superior being sometimes talk about requiring others to be left behind as a kind of "biological backup", just in case the singularity turns out to be not quite as wonderful as promised. That seems eminently sensible, but given how few people these days bother to back up their crucial computer files, I see no reason to believe that biological backups will be a common or effective approach.

One thing that reassures me about notions such as the singularity is the law of diminishing returns: most technologies eventually reach the point at which they're finally useful enough for broad adoption, and at that point, they largely stop evolving, apart from occasional tweaks. Cell phones, for instance, don't do much that tethered phones can't do—they just untether us. The old paradigm for what a telephone represents really hasn't been overturned, just improved upon. Moving beyond the point of widespread adoption requires a paradigm shift—a fundamental change in our understanding that opens the door to new possibilities. Not only that, the paradigm shift must be so much of an improvement that it becomes difficult to stick with the old paradigm. Otherwise, there's too much social and technological inertia to overcome, and the status quo tends to be preserved.

To me, the biggest barrier to a technological singularity is the following: How can we even think about trying to upload our fundamental selves into a machine before we understand who we are, and why we are that way? Until that particular breakthrough occurs, it's hard to imagine anyone but the most extreme technophile making the leap of faith necessary to attempt such transcendence.