Artificial intelligence poses such a threat to society and humanity that A.I. labs should pause their work on advanced systems for at least six months. So states an open letter signed this week by tech luminaries, among them Apple cofounder Steve Wozniak and Tesla CEO Elon Musk.

But the CEO of Google, long among the companies with the most advanced A.I., won't commit to the idea.

Sundar Pichai addressed the open letter in an interview with the Hard Fork podcast published on Friday. The idea of companies collectively taking such an action, he believes, is problematic.

"I think in the actual specifics of it, it's not fully clear to me how you would do something like that today," he said. Asked why he couldn't simply email his engineers to pause their work, he responded: "But if others aren't doing that, so what does that mean? To me, at least, there is no way to do this effectively without getting governments involved. So I think there's a lot more thought that needs to go into it."

The open letter calls for all A.I. labs to pause not development in general, but specifically the training of systems more powerful than GPT-4. Microsoft-backed OpenAI released GPT-4 earlier this month as a successor to ChatGPT, the A.I. chatbot that took the world by storm after its release in late November. 

OpenAI itself stated last month: "At some point, it may be important to get independent review before starting to train future systems, and for the most advanced efforts to agree to limit the rate of growth of compute used for creating new models."

That "some point" is now, argues the open letter released this week. It warns about the risks posed by A.I. systems with "human-competitive intelligence" and asks:

"Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones? Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete and replace us? Should we risk loss of control of our civilization?"

Such decisions shouldn't fall to unelected tech leaders, the letter argues, and powerful A.I. systems should be developed "only once we are confident that their effects will be positive and their risks will be manageable."

Pichai acknowledged to Hard Fork the possibility of an A.I. system that can cause disinformation at scale. And in a hint of the malicious A.I. use that may follow, phone scammers are now using voice-cloning A.I. tools to make people believe their relatives urgently need money wired to them.

As for jobs that could be automated away, a University of Pennsylvania business professor last weekend described recently giving A.I. tools 30 minutes to work on a business project and called the results "superhuman."

Asked on the podcast if A.I. could lead to the destruction of humanity, Pichai responded, "There is a spectrum of possibilities, and what you're saying is in one of the possibility ranges."

The open letter warns about "an out-of-control race to develop and deploy ever more powerful digital minds that no one, not even their creators, can understand, predict, or reliably control." During the pause, it adds, A.I. labs and independent experts should "jointly develop and implement a set of shared safety protocols for advanced AI design and development that are rigorously audited and overseen by independent outside experts."

If they cannot quickly enact a six-month pause, it argues, governments should step in and institute a moratorium.

Pichai agreed with the need for regulation, if not a moratorium. "A.I. is too important an area not to regulate," he said. "It's also too important an area not to regulate well."

He said of the open letter: "I think the people behind it intended it as a conversation starter, so I think the spirit of it is good, but I think we need to take our time thinking about these things."

Meanwhile, Google, Microsoft, OpenAI, and others are racing ahead.

Fortune reached out to Google and OpenAI for comments but received no immediate replies. Microsoft declined to comment.

