Meta: “Llama 4 Training Uses 10x More GPUs Than Llama 3.1”


(Photo = Shutterstock)

Meta announced that it will train its next-generation model, Llama 4, with 10 times more GPUs than Llama 3.1. That means it will build a cluster of roughly 160,000 GPUs, making it the world’s largest supercomputing infrastructure.

Reuters and The Verge reported on the 2nd (local time) that Meta announced its intention to expand its investment in artificial intelligence (AI) during its second-quarter earnings call.

For the quarter, Meta recorded revenue of $39 billion (roughly KRW 53.25 trillion), up 22% year-on-year on the back of increased advertising revenue.

Nonetheless, the market’s attention was focused on the rise in costs. When the company announced its first-quarter results in April, it reported a 27% increase in revenue compared with the same period last year, but its stock price fell by a whopping 19% after it said it would increase its investment in AI.

On this day, Meta also predicted that AI infrastructure costs would increase. In particular, it raised its maximum annual spending estimate to $40 billion (about KRW 54.6 trillion).

“It’s hard to predict how supercomputing and data centers will scale in the coming years,” CEO Mark Zuckerberg said. “But it’s better to take the risk and build capacity early than to wait too long.”

The conference call featured a deep dive into AI. First, the company has begun training Llama 4, which is expected to use 10 times as many GPUs as Llama 3.1.

Meta announced that Llama 3.1 was trained on a cluster of 16,384 Nvidia ‘H100’ GPUs.

So Llama 4 will use over 160,000 GPUs, more than the 100,000 GPUs that xAI currently has in its Memphis data center. CEO Elon Musk recently boasted that, by investing in 100,000 GPUs, he had “built the world’s largest supercomputing infrastructure.”

This shows why CEO Zuckerberg was confident that “the Llama 4 to be released next year will be the best model in the world.” In fact, Meta is said to expect to have 600,000 GPUs in stock by the end of the year.

Meta has recently been focusing all its capabilities on AI. Led by Chief Scientist Yann LeCun, it is developing a ‘physical AI’ model aimed at artificial general intelligence (AGI), which is expected to be released as a new large multimodal model (LMM) by the end of the year. It has also taken the drastic step of disbanding and reorganizing its Metaverse department to expand its line of smart glasses equipped with AI assistants.

Zuckerberg emphasized that this is not a simple investment, but an essential means of generating revenue in the future.

Zuckerberg said the Meta AI assistant will be the most widely used in the world before the end of the year, which will increase user engagement on Facebook, Instagram, and other platforms.

“The real money will come from business use cases, such as AI creating ads from scratch or companies running AI agents for customer service,” he explained.

Despite the announcement of such a large-scale investment plan, the market response was favorable: Meta’s stock price rose 7% in after-hours trading.

Meanwhile, Meta announced that as of June 30, it had 3.27 billion users across apps such as Facebook, Instagram, WhatsApp, and Threads, a 7% increase from the previous year.

Reporter Im Dae-jun ydj@aitimes.com
