Top scientists warn of out-of-control AI


Yoshua Bengio (L) and Max Tegmark (R) discuss the development of artificial general intelligence during a live podcast recording of CNBC’s “Beyond the Valley” in Davos, Switzerland, in January 2025.

CNBC

Artificial general intelligence built as an “agent” could prove dangerous because its creators might lose control of the system, two of the world’s most prominent AI scientists told CNBC.

In the latest episode of CNBC’s “Beyond the Valley” podcast, released Tuesday, Max Tegmark, a professor at the Massachusetts Institute of Technology, and Yoshua Bengio, dubbed one of the “godfathers of AI” and a professor at the University of Montreal, spoke about their concerns over artificial general intelligence, or AGI. The term broadly refers to AI systems that are smarter than humans.

Their fears stem from the world’s biggest companies now talking about “AI agents” or “agentic AI,” which they claim will let AI chatbots act like assistants or agents and help with work and daily life. Industry estimates vary on when AGI will come into existence.

With that concept comes the idea that AI systems could act as “agents” with thoughts of their own.

“AI researchers have been inspired by human intelligence to build machine intelligence, and, in humans, there’s a mix of both the ability to understand the world, like pure intelligence, and agentic behavior, meaning using your knowledge to achieve your goals,” Bengio told CNBC’s “Beyond the Valley.”

“For now, this is how we’re building AGI: we are trying to make them agents that understand a lot about the world and can then act accordingly. But this is actually a very dangerous proposition.”

Bengio said that pursuing this approach would be akin to “creating a new species or a new intelligent entity on this planet” without “knowing if they’re going to behave in ways that agree with our needs.”

“So, instead, we can think about what the scenarios are in which things go badly, and they all rely on there being agency. In other words, it is because the AI has its own goals that we could be in trouble.”

Self-preservation could also kick in as AI gets even smarter, Bengio said.

“Do we want to be in competition with entities that are smarter than us? It’s not a very reassuring gamble. So we have to understand how self-preservation can emerge as an AI goal.”

AI tools key

For MIT’s Tegmark, the key lies in so-called “tool AI”: systems that are created for a specific, narrowly defined purpose but that don’t have to be agents.

Tegmark said a tool AI could be a system that tells you how to cure cancer, or something that possesses some “agency,” like a self-driving car, as long as there are reliable guarantees that “you will be able to control that.”

“I think, on an optimistic note here, you can get pretty much everything you’re excited about with AI … if we simply insist on having some basic safety standards before people can sell powerful AI systems,” Tegmark said.

“They have to demonstrate that they can keep them under control. Then the industry will innovate quickly and find ways to do it even better.”

In 2023, Tegmark’s Future of Life Institute called for a pause on developing AI systems that can compete with human-level intelligence. That hasn’t happened, but Tegmark said people are now talking about the topic, and it is time to take action to figure out how to put guardrails in place to control AGI.

“So, at least now a lot of people are talking the talk. We have to see if we can get them to walk the walk,” Tegmark told CNBC’s “Beyond the Valley.”

“It’s clearly insane for us to build something much smarter than us before we know how to control it.”

There are various views on when AGI will arrive, driven in part by differing definitions.

OpenAI CEO Sam Altman has said his company knows how to build AGI, and that it will arrive sooner than people think, though he downplayed the impact of the technology.

“My guess is we will hit AGI sooner than most people in the world think, and it will matter much less,” Altman said in December.
