OpenAI is reportedly set to begin producing its own AI chip next year. The move is intended to reduce its dependence on NVIDIA's chips as it builds large-scale data centers through the 'Stargate' project.
Citing sources on Monday (the 10th), Reuters reported that OpenAI will begin the 'tape-out' process, in which the completed chip design is finalized and sent to a foundry.
OpenAI plans to mass-produce its own AI chips by 2026 using TSMC's 3-nanometer process technology. A tape-out typically costs tens of millions of dollars and takes about six months before chips come back from the fab. If the chip does not work properly on the first tape-out, the errors must be fixed and the tape-out repeated.
If the first tape-out succeeds, OpenAI could begin large-scale testing of its own AI chip at the end of this year.
The chip features a systolic array architecture with the high-bandwidth memory (HBM) used in NVIDIA chips, along with extensive networking capabilities. It will be manufactured by Taiwan's TSMC on its 3-nanometer process.
Chip design is led by Richard Ho, head of hardware at OpenAI, in cooperation with Broadcom. The hardware team has doubled in size to 40 people in recent months. Ho joined OpenAI from Google a year ago, where he had led a custom AI chip project.
Industry estimates put the cost of designing a new chip at about $500 million, and the figure can double once the necessary software and peripheral systems are built.
In addition, companies such as Google and Amazon have put hundreds of engineers to work on such chip projects.
Despite the enormous cost, OpenAI's in-house chip effort is tied to the Stargate project announced last month. The project calls for building more than 10 data centers across the US, making it essential to secure the chips that will go into them.
NVIDIA holds more than 80% of the AI data center chip market.
However, to reduce their dependence on NVIDIA amid supply shortages and high prices, Google, Amazon, Microsoft (MS), and Meta moved early to develop their own chips.
Meanwhile, OpenAI's self-developed chip is capable of both training and running AI models, but the company is understood to be deploying it initially on a limited basis, mainly for model execution and inference.
Next-generation versions are likely to offer more powerful performance for model training.
By Park Chan, reporter (cpark@aitimes.com)