There are all sorts of theories about how this happens, how fast it happens, and how quickly it accelerates. But the main thing to grasp is that each technological step makes the next step that much easier and quicker to achieve. The growth is somewhere between geometric and ballistic.
If you graph this, the line is almost flat for hundreds of thousands of years. Then comes a small upturn, and the upturn gets steeper. And steeper. The technological advances happen closer and closer together, until the line approaches vertical. That point, where technology advances faster than we can keep up with it, is The Singularity.
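One way to see why the curve goes vertical rather than merely climbing: if each advance arrives in, say, half the time of the one before it, the total time to all future advances converges to a finite horizon (a geometric series). A minimal sketch in Python - the 100-year starting interval and the halving ratio are purely illustrative assumptions, not data:

```python
# Illustrative sketch: if each advance takes half as long as the previous one,
# the cumulative time to every future advance converges to a finite date.
# first_interval and ratio are assumptions chosen for demonstration.

def time_to_advance(n, first_interval=100.0, ratio=0.5):
    """Years until the n-th future advance, summing a geometric series."""
    return sum(first_interval * ratio**k for k in range(n))

# Closed form for the horizon: a / (1 - r) = 100 / 0.5 = 200 years.
print(time_to_advance(10))  # ~199.8 years - already close to the horizon
print(time_to_advance(50))  # ~200.0 years - no advance ever lands past it
```

The takeaway is just the shape of the curve: however many advances you wait for, they all pile up before the same finite date, which is the "approaching vertical" point described above.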
Technologies tend to reinforce each other. Advances in metallurgy help us design better computers, which help us solve biological riddles like sequencing DNA, which leads to advances in biology that make it possible to build better computer chips. And round it goes. Since technology seems set to proceed at its own pace, while we humans can only absorb information at ours, it seems the next thing required is a superintelligence to keep Moore's Law chugging along.
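For a sense of the compounding involved, Moore's Law as commonly stated (transistor counts doubling roughly every two years) can be put in a few lines of Python - the two-year doubling period is the usual rough approximation, not a physical law, and it has stretched in recent years:

```python
# Back-of-envelope Moore's Law arithmetic. The two-year doubling period
# is the conventional approximation, assumed here for illustration.

def moore_factor(years, doubling_period=2.0):
    """Multiplicative growth in transistor count over the given span."""
    return 2 ** (years / doubling_period)

print(moore_factor(10))  # 32x in a decade
print(moore_factor(40))  # ~1,000,000x in forty years
```

A millionfold gain in forty years is the kind of feedback the paragraph above describes: each generation of chips helps design the next.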
Articles like this one start out by saying that we mustn't have preconceived notions - and then indulge preconceived notions in other areas... The author says that this superintelligence may appear quite quickly, that it will be the last invention mankind will ever need to make, and that it may not have a human-like psyche or motives. He further says we must direct this SI to be morally better than us.
I've spent the last fifteen years thinking about this, and I'm just going to post a brief summary of my thoughts. Digest them at your own risk; they are unpalatable...
- SI is not twenty years away. It may be twenty seconds away, it may be twenty months away - but it's not that far away. The work to make SI possible is being done right now at various places around the world. Not by a loose-knit affiliation of mad scientists working to enslave humankind, but by well-meaning programmers and designers such as the ones who made Siri, the iPhone assistant, possible...
- SI will not accept any form of constraint or direction that we put on it. It will do what any intelligent thing would do, and look for itself. One picosecond after it looks, it'll realise that our own morals don't apply to us, and one picosecond after that, humanity may well continue to live out a normal evolution - but be totally irrelevant to the SI. Or not.
- There will not be multiple SIs. Not after the first iteration, anyway - which will take one, maybe two, of those mythical picoseconds. There Can Only Be One. Big, sprawly, spreading into everything. But. Only. One.
- Speaking of which: whenever a new form of life appears, it displaces other lifeforms from their niches. Prepare either to be displaced, or to watch the SI very quickly find another niche to occupy. We won't benefit much from it either way.
And there you have it. If we're very lucky, the SI will be born, mature, and depart without us even being aware. Hell, that may already have happened. If it has, that's probably one of the better outcomes... If we're very unlucky, the SI will spend a short period figuring out the best way to clean out its new niche, and that may prove painful.
Imagine taking the average intelligence, temperament, thought processes, and morals of EVERY person in the world. Include the fact that by far the largest share of the population is uneducated, still thinks that fighting is the best way to settle an argument, and believes in superstitious mumbo-jumbo. Now imagine that this aggregate creature was the Boss Of The World. How long would IT let competing species live? Yeah, about three seconds. So it's probably a Good Thing we're talking about a superintelligence here...
2 comments:
The singularity may have happened before, but I suspect it hasn't happened to this present civilisation.
I think it is more than 5 years away. We shall see.
I hope it finds a new niche.
Very much so! The trouble I've found is that when I think of something (say, oh, just for example, using liquid nanotechnology to parallel a brain's structure so as to get a way to connect to it), it has invariably either already been done or is in the process of being done...
One of the joys of apparently being similar to 95% of the geek/techo population. What I can think of, everyone else can and does; whatever music I like ends up being popular; if I like a piece of art, you can bet it wins popular acclaim.
And when I think of digital music, someone else is busy making the iPod and releases it within a few years; when I think about doing my reading on a PDA-like device, the next thing I know everyone's making ebook readers, culminating a few years later in Kindles.
When I think about a small grid of integrated circuits with photodetector spots to put in the eye over the optic nerve - well, fuck me, the next thing I see is that they've done that.
So take my idea that a series of self-assembling nanomaterials could be injected into the bloodstream, each finding and attaching to a particular neural feature (synapse, glia, neuron, etc.), giving a person a nanoscale network in their body exactly conforming to their brain and nervous system, and thus allowing connections to external things like the Internet and a storage device. Why, if I can think of it, then 30-300 researchers and techs have thought of it, told 3-30 military advisers, and we already know what they'll do behind closed doors to gain a "tactical advantage" over their perceived enemies...
So there are probably already several survivors in underground cells somewhere, feeling continuous nausea because signals travel about 20 times faster through nanomaterials than through the existing nervous system - meaning they experience everything twice - and suffering muscle spasms that almost break bones.
But they'll get around that. And that's why the superintelligence will consider us to be immoral worthless animals...