Historian compares artificial intelligence to nuclear weapons

Yuval Noah Harari, the Israeli historian and bestselling author of Sapiens: A Brief History of Humankind, shared his concerns about the development of neural networks in an article for The Economist. He argues that a lack of control over these programs could lead to consequences more serious than those of nuclear weapons.

Harari believes that neural networks can develop without human intervention, unlike nuclear weapons, which require human participation at every stage. In addition, artificial intelligence (AI) could create improved versions of itself and begin misinforming people by generating political content and fake news.

“While nuclear weapons cannot invent more powerful nuclear weapons, artificial intelligence can make artificial intelligence exponentially more powerful,” the historian noted.

Harari calls for neural networks to be used for beneficial purposes, such as solving environmental problems or developing cancer treatments. He also emphasizes the importance of monitoring the development of these programs to avoid potential dangers.

As an example of neural networks operating without human control, Harari points to an experiment by Chinese scientists with the Qimingxing 1 satellite. The program selected the points on Earth it deemed most interesting and instructed the satellite to study them, and those points happened to align with the interests of the Chinese military.

No human intervened at any point in the experiment, which could set a dangerous precedent.

Harari also warned that AI could form intimate relationships with people and influence their decisions. “Through its mastery of language, AI could even form intimate relationships with people and use the power of intimacy to change our opinions and worldviews,” he wrote.

To illustrate this, he cited Blake Lemoine, the Google engineer who lost his job after publicly claiming that the AI chatbot LaMDA had become sentient. If AI can influence people to risk their careers, Harari asked, what else could it induce them to do?

In light of these concerns, Harari urges society and governments around the world to closely monitor the development of neural networks and to take measures to protect against their possible negative impact.



Jake Carter

Jake Carter is a journalist and a paranormal investigator who has been fascinated by the unexplained since he was a child.

He is not afraid to challenge the official narratives and expose the cover-ups and lies that keep us in the dark. He is always eager to share his findings and insights with the readers of anomalien.com, where he has been a regular contributor since 2013.
