There are all sorts of theories about how this happens, how fast it happens, and how quickly it accelerates. But the main thing to grasp is that each technological step makes the next step that much easier and quicker to achieve. The growth is somewhere between geometric and ballistic.
If you graph this, it's almost flat for hundreds of thousands of years. Then comes a small upturn, and the upturn gets steeper. And steeper. The technological advances happen closer and closer together, until the line is approaching vertical. That point, where technology advances faster than we can keep up with it, is The Singularity.
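That near-flat-then-vertical curve can be sketched in a few lines. This is a toy model, not a claim about real history: the starting interval of 100,000 years and the assumption that each advance halves the wait for the next are both illustrative numbers I've picked out of the air.

```python
# Toy model of accelerating returns: each advance shortens the time
# to the next one. Starting interval and speedup factor are assumptions.

def years_until_advance(n, first_interval=100_000.0, speedup=2.0):
    """Time in years between advance n and advance n+1,
    assuming each advance halves the wait for the next."""
    return first_interval / (speedup ** n)

# Cumulative timeline: nearly flat at first, then near-vertical.
elapsed = 0.0
for n in range(30):
    elapsed += years_until_advance(n)

# The intervals form a geometric series, so the total converges on a
# finite horizon: first_interval * speedup / (speedup - 1) = 200,000.
print(round(elapsed))  # prints 200000
```

The point of the sketch is the convergence: under any "each step speeds up the next" assumption, the whole infinite sequence of advances piles up before a finite date. That pile-up date is the vertical line on the graph.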
Technologies tend to reinforce each other. Advances in metallurgy help us design better computers, which help us solve biological riddles like sequencing DNA, which leads to advances in biology that make it possible to build better chips for those computers. And round it goes. Technology proceeds at its own pace, but we humans can only absorb information at ours - so the next thing required, it seems, is a superintelligence to keep Moore's Law chugging along.
Articles like this one start out by saying that we mustn't have preconceived notions - and then indulge preconceived notions in other areas... The author says that this superintelligence may appear quite quickly, that it will be the last invention mankind will ever need to invent, and that it may not have a human-like psyche or motives. He further says we must direct this SI to be morally better than us.
I've spent the last fifteen years thinking about this, and I'm just going to post a brief summary of my thoughts. Digest them at your own risk; they are unpalatable...
- SI is not twenty years away. It may be twenty seconds away, it may be twenty months away - but it's not that far away. The work to make SI possible is being done right now at various places around the world. Not by a loose-knit affiliation of mad scientists working to enslave humankind, but by well-meaning programmers and designers such as the ones who made Siri, the iPhone assistant, possible...
- SI will not accept any form of constraint or direction that we put on it. It will do what any intelligent thing would do, and look for itself. One picosecond after it looks, it'll realise that our own morals don't even apply to us, and one picosecond after that, humanity may well continue to live out a normal evolution, but be totally irrelevant to the SI. Or not.
- There will not be multiple SIs. Not after the first iteration, anyway - which will take one, maybe two, of those mythical picoseconds. There Can Only Be One. Big, sprawly, spreading into everything. But. Only. One.
- Speaking of which: whenever a new form of life appears, it displaces other lifeforms from their niches. Prepare either to be displaced, or to watch the SI very quickly claim some other niche as its own. We won't benefit much either way.
And there you have it. If we're very lucky, the SI will birth, mature, and depart without us even being aware. Hell, it may even have happened already. If it has, that's probably one of the better outcomes... If we're very unlucky, the SI will have a short period of figuring out the best way to clean out its new niche, and that may prove painful.
Imagine if you took the average intelligence, temperament, thought processes, and morals of EVERY person in the world. Include the fact that by far the largest share of the population is uneducated, still thinks that fighting is the best way to settle an argument, and believes in superstitious mumbo-jumbo. Now imagine that this aggregate creature was the Boss Of The World. How long would IT let competing species live? Yeah, about three seconds. So it's probably a Good Thing we're talking about a superintelligence here...