
Hawking called artificial intelligence the greatest mistake of mankind

In a recent article, British physicist Stephen Hawking said that underestimating the threat posed by artificial intelligence could be the biggest mistake in the history of mankind.

The article was co-authored by Stuart Russell, a computer science professor at the University of California, and Max Tegmark and Frank Wilczek, physics professors at the Massachusetts Institute of Technology. It points to several recent achievements in artificial intelligence, including self-driving cars, the Siri voice assistant, and the supercomputer that defeated a human champion on the television quiz show Jeopardy!.

As Hawking told The Independent:
All these achievements pale in comparison to what awaits us in the coming decades. The successful creation of artificial intelligence would be the biggest event in the history of mankind. Unfortunately, it might also be the last, unless we learn how to avoid the risks.


The scientists warn that machines with superhuman intelligence may eventually begin to improve themselves, and nothing will be able to stop this process. This, in turn, would trigger the so-called technological singularity: a period of extremely rapid technological development.
The article notes that such technology could surpass individual humans and begin to manage financial markets, out-invent researchers, manipulate people, and develop weapons beyond our comprehension. While the short-term impact of artificial intelligence depends on who controls it, the long-term impact depends on whether it can be controlled at all.
It is difficult to say what consequences the creation of artificial intelligence may have for humanity. Hawking believes that few serious studies have been devoted to these questions outside non-profit organizations such as the Cambridge Centre for the Study of Existential Risk, the Future of Humanity Institute, the Machine Intelligence Research Institute, and the Future of Life Institute. According to him, each of us should ask what we can do now to avoid the worst-case scenario for the future.


Source: https://habr.com/ru/post/221933/
