Thursday, June 30, 2016

The Case Against Superintelligence

Limited Attention Span

The superintelligent machine is a flawed concept, in my opinion. It rests on the assumption that the IQ of a single intelligent system can increase indefinitely. There are two problems with this assumption. First, an intelligent system can only focus on one thing at a time, and this imposes a severe limitation on its capacity to learn. While it is possible to increase the learning speed of a machine by feeding it high-speed data such as videos, learning to interact with the real world still has to be done in real time.

The Tree of Knowledge

Second, knowledge is necessarily organized hierarchically in memory, as a tree with many branches. This is the only way to store a huge amount of information in a limited space and to organize it so that it is readily accessible within a context. This matters because it allows the system to infer analogies, an essential characteristic of intelligence. In other words, items that share a branch belong to the same category, and the activation of a low-level item brings up all the other related items (one thought brings up other thoughts). It is up to the attention mechanism to scan these family relations and pick one to focus on.
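
To make the idea concrete, here is a minimal Python sketch. It is only a toy illustration: the tree, the node names, and the random selection rule are made up for the example, not a description of any real system.

```python
import random

class Node:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)

def related_items(node):
    """Items that share a branch with 'node': its parent, its siblings,
    and its children -- the thoughts that one thought brings up."""
    related = []
    if node.parent is not None:
        related.append(node.parent)
        related.extend(c for c in node.parent.children if c is not node)
    related.extend(node.children)
    return related

def attend(node):
    """A toy attention mechanism: scan the awakened relations and focus on one."""
    candidates = related_items(node)
    return random.choice(candidates) if candidates else None

# A tiny, made-up tree of knowledge.
animals = Node("animals")
birds = Node("birds", animals)
mammals = Node("mammals", animals)
sparrow = Node("sparrow", birds)
eagle = Node("eagle", birds)
dog = Node("dog", mammals)

print([n.name for n in related_items(sparrow)])  # ['birds', 'eagle']
print(attend(sparrow).name)                      # e.g. 'eagle'
```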

The Superintelligent Society

The problem is that a superintelligence would have vast numbers of awakened relations (branches) to pick from, and scanning them would take an inordinate length of time. This is why it is best to specialize in an area of knowledge and rely on the expertise of others, which is exactly what humans do. In a sense, humanity is already a superintelligent system consisting of millions of interacting individuals specializing in various areas of knowledge. The system as a whole is much more intelligent than any individual can ever be, and the internet has accelerated communication between individuals, making our global superintelligence even more efficient.
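
As a rough back-of-the-envelope sketch of this bottleneck (the linear cost model and the numbers are assumptions for illustration, not measurements):

```python
def selection_time(awakened_relations, cost_per_relation=1.0):
    """If attention must scan every awakened relation before picking one,
    the time to choose a focus grows with the number of candidates."""
    return awakened_relations * cost_per_relation

# A hypothetical generalist holding every branch of knowledge wakes far more
# relations per thought than a specialist holding only its own subtree.
generalist = selection_time(awakened_relations=1_000_000)
specialist = selection_time(awakened_relations=1_000)
print(generalist / specialist)  # 1000.0 -- a thousand times longer per choice
```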

So I don't see a future dominated by one or a few superintelligent machines, but one in which many highly specialized artificial intelligences form a superintelligent society.

1 comment:

Rick Deckard said...

Our DNA breaks down when we have no contact with the outside; our system grinds to a halt. Studies of isolation are exquisite but rare. Objectivism is a failure. We cannot function without the masses. Free will is a necessary evil, or the other way around. The more of us the better.