OpenAI disbands AGI safety preparedness team again… “The world is not ready for AGI”


(Photo = Shutterstock)

The leader who was preparing for artificial general intelligence (AGI) safety at OpenAI has left the company, and his team has been disbanded along with him. He said bluntly that neither OpenAI nor any other company is ready for AGI.

Miles Brundage, OpenAI’s senior advisor responsible for AGI preparedness, announced his departure from the company through X (Twitter) on the 23rd (local time).

He said, “Restrictions on safety research at OpenAI have become too severe,” and announced that he would continue related research without restrictions at a non-profit organization.

In particular, he emphasized, “Neither OpenAI nor any other frontier lab is ready for AGI, and the world is not ready either.” He continued, “To be clear, I don’t think this statement will be controversial within OpenAI,” explaining, “Whether the company and the world will someday be ready is a separate question from whether they are ready now.”

This statement echoes the words of alignment team leader Jan Leike, who left OpenAI and joined Anthropic last May. He criticized the company at the time, saying, “Conducting crucial research has become increasingly difficult, with no computing support for months,” and adding, “Safety culture and processes have taken a backseat to shiny products.”

Brundage had been responsible for the company’s AI safety initiatives for the past six years. With his departure, OpenAI disbanded its AGI Readiness team. This is the second AGI-related safety team to be disbanded, following the Superalignment team in May.

Instead, OpenAI established a new committee responsible for safety reviews prior to launching new AI models, and last month spun it off into an independent board oversight body.

Analysis suggests that Brundage’s departure may also be related to OpenAI’s conversion into a for-profit company. One sign is his emphasis on the need for independent voices free from industry bias and conflicts of interest.

Nevertheless, despite this stance, Brundage noted that OpenAI offered to support his work with funding, API credits, and early model access, with no strings attached.

Reporter Lim Da-jun ydj@aitimes.com
