Good/Bad AIs, accelerating returns and a lot of abundance

Several topics related to projected advanced technology are often analyzed in isolation: AI, accelerating technological returns, and abundance from technology and the resources of space.

There are various papers that discuss achieving abundance from advanced technology like molecular nanotechnology.

There is Ray Kurzweil's analysis that technology is providing accelerating returns.

There is also the concern about the need for friendly Artificial Intelligence (AI). This matters because the technological Singularity is mainly about the development of intelligences that are far greater than human and how that will cause an explosion of technological capability.

People can get a sense of the immense energy and material resources available in space from the Kardashev scale of civilizations.
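To make that scale concrete, here is a back-of-the-envelope sketch in Python. The physical constants are standard values; the Type I/II/III framing follows the common modern reading of the Kardashev scale (planet, star, galaxy), and the stars-per-galaxy count is a rough order-of-magnitude assumption of mine, not a figure from Kardashev:

```python
import math

SOLAR_LUMINOSITY_W = 3.828e26   # total power output of the Sun (IAU nominal value)
SOLAR_CONSTANT_W_M2 = 1361.0    # solar flux at Earth's distance, in W/m^2
EARTH_RADIUS_M = 6.371e6        # mean radius of Earth
STARS_PER_GALAXY = 1e11         # rough order-of-magnitude assumption

# Power intercepted by Earth's cross-section (roughly the Type I scale)
earth_intercept_w = SOLAR_CONSTANT_W_M2 * math.pi * EARTH_RADIUS_M**2

# Type II: the full output of one star; Type III: a whole galaxy of such stars
type2_w = SOLAR_LUMINOSITY_W
type3_w = SOLAR_LUMINOSITY_W * STARS_PER_GALAXY

print(f"Earth intercepts ~{earth_intercept_w:.2e} W")
print(f"Sun emits        ~{type2_w:.2e} W ({type2_w / earth_intercept_w:.1e}x Earth's share)")
print(f"Galaxy emits     ~{type3_w:.2e} W ({type3_w / earth_intercept_w:.1e}x Earth's share)")
```

Even a single star radiates roughly two billion times the power Earth intercepts from the Sun, and a galaxy multiplies that by another factor of about a hundred billion.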

People fear that an AI vastly more intelligent than people will rapidly become very powerful and dangerous to them. But if an AI is vastly superior in intelligence and able to rapidly develop and extend technological capability, then it should rapidly be able to tap the resources of space: trillions of times more than what is available on Earth. The AI can make itself mobile, leave, and do whatever it wants. I have difficulty seeing the motivation, good or bad, for the AI to decide to kill people on Earth. The AI can basically outclass any human who is not completely augmented. It would be like Bill Gates' parents being concerned that he might plot to kill them for his allowance. Even if the AI is very greedy or expansionist, what we have developed so far should be irrelevant to its aims. Maybe a bad AI won't help us out and will just leave. But why would it fumigate the old house on the way out?

The superior AI rapidly moves itself onto an entirely different level. Tiger Woods does not need to dominate the miniature golf courses.

There is also the discussion about whether or not to upgrade people. There is the concern that the non-upgraded, and therefore weaker, people would be at the mercy of those who upgrade. But the choice is not whether the non-upgraded will be killed; again, abundance and accelerating returns from technology mean that those who do not upgrade become irrelevant.

Accelerating returns mean that the 21st century will not see merely 100 years of progress; it will be more like 20,000 years of progress (at today's rate). 200 years of progress will be more than 4,000,000 years of progress (at today's rate).
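The arithmetic can be sketched with a toy model. The assumption here is mine: the rate of progress doubles each decade, a simplified reading of Kurzweil's Law of Accelerating Returns, which actually models paradigm-shift rates in more detail:

```python
def progress_equivalent_years(decades: int) -> float:
    """Progress accumulated over `decades` decades, measured in years of
    progress at today's rate, assuming the rate doubles each decade."""
    total = 0.0
    rate = 1.0  # today's rate of progress (1 "progress-year" per calendar year)
    for _ in range(decades):
        rate *= 2            # rate doubles with each new decade
        total += 10 * rate   # ten calendar years at that decade's rate
    return total

print(progress_equivalent_years(10))  # 100 calendar years
print(progress_equivalent_years(20))  # 200 calendar years
```

This toy model yields about 20,000 years-equivalent for the first century and about 21 million for two centuries, consistent with "20,000 years of progress" and "more than 4,000,000 years of progress."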

In about 20 years, those who have not upgraded will be like the Amish, a few hundred years behind in technology. They will be a quaint curiosity, barely connected to the advanced economy.

In about 100 years, they will be like cavemen, utterly removed from and unable to understand the advances being made.

In 200 years, they will be like chimpanzees. The choice not to adopt the best technology is like choosing not to evolve.

For those who choose to advance and become transhuman, being generous to those who did not becomes very easy with abundance and the resources of space. The cost shrinks to ever smaller fractions: initially like foreign aid (1-2%), then like setting aside nature preserves and reservations, then like setting up city zoos, then like keeping potted plants and ant colonies.

So one thing to remember is that abundant is really abundant, not just a little abundant.

And AIs and radically augmented people can move themselves onto an entirely different level of operation. Fear not the bad AI, but the meticulously cruel AI.

Don't upgrade, and you rapidly become irrelevant.
