OpenAI shot to fame in 2023. Is it planning to make its own chips in 2024?

According to foreign media reports, OpenAI is concerned about the shortage of artificial intelligence chips and plans to set up a chip manufacturing venture. Its CEO, Sam Altman, is currently persuading potential investors to join the plan.




An ambitious plan for a network of semiconductor fabrication plants


Foreign media reports indicate that OpenAI CEO Sam Altman is in talks with investors around the world to raise billions of dollars to build a network of semiconductor fabrication plants. Negotiations are still at an early stage, and the full list of partners and investors participating in the project has not been determined. According to the reports, OpenAI hopes to reduce its dependence on US chipmaker Nvidia and diversify its supply.




According to insiders, Altman has held discussions with Abu Dhabi-based artificial intelligence company G42 and Japan's SoftBank Group, as well as with several investment firms in the Middle East. South Korea's Chosun Ilbo reported on the 21st that whether Samsung Electronics will join the planned network of fabs is a focus of industry attention. According to one source, US chipmaker Intel, Taiwan-based chip foundry TSMC, and South Korean chipmaker Samsung Electronics are all potential partners for OpenAI.




Altman had been working on this project before he was briefly ousted as OpenAI's CEO in November last year, and he quickly restarted it after returning. Two insiders said Altman has sounded out Microsoft on the plan and that Microsoft supports it.




Building a single state-of-the-art semiconductor fab can cost billions of dollars, and creating a network of fabs on this scale would take even longer and require far more funding. According to insiders, OpenAI hopes to raise $8 billion to $10 billion from G42; the status of those negotiations is currently unclear.




The AI wave is driving demand for high-end chips


Since OpenAI released the general-purpose AI model ChatGPT, interest from businesses and consumers in AI research and applications has skyrocketed, which in turn has driven demand for AI chips. According to people familiar with the matter, Altman believes this is an opportunity the AI industry cannot afford to miss, and that it must act now to ensure a sufficient supply of cutting-edge chips through the end of this decade. He has repeatedly said publicly that current chip supply is insufficient to meet OpenAI's AI research and development needs.




AI chips are currently in short supply, and on the supply side Nvidia, AMD, and Intel are the leading competitors.




According to TrendForce research, looking ahead to 2024 and the roadmaps of the major AI chip suppliers: NVIDIA's high-end, HBM-equipped AI chips in 2023 were the A100/A800 and H100/H800, and in 2024 the portfolio will be segmented into more finely differentiated models. In addition to the models above, NVIDIA will launch the H200 with six HBM3e stacks and the B100 with eight HBM3e stacks, and will pair its own Arm-based CPU with its GPUs to launch the GH200 and GB200.




Compared with the product plans of AMD and Intel over the same period, AMD's mainstream shipments in 2024 will be the MI300 series, which uses HBM3. The next-generation MI350 will move to HBM3e, with HBM validation expected to begin in the second half of 2024 and a more significant product ramp estimated for the first quarter of 2025.




On the Intel Habana side, the Gaudi 2 launched in the second half of 2022 uses six HBM2e stacks, and the new Gaudi 3, expected by mid-2024, is projected to stay on HBM2e while increasing the count to eight. TrendForce therefore expects NVIDIA to continue leading the AI chip race, with GPUs ahead on HBM specifications, product readiness, and timeline.
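
For quick reference, the supplier roadmap figures cited above can be gathered into a single structure. The sketch below only restates the TrendForce figures quoted in this article (product names, HBM generations, and stack counts); it is not an independent specification, and fields the article does not mention are omitted.

```python
# Sketch: the 2024 AI-accelerator roadmap figures quoted above (per TrendForce),
# collected into one structure for side-by-side comparison. Only values cited
# in this article are included.
roadmap_2024 = {
    "NVIDIA": [
        {"product": "A100/A800, H100/H800", "hbm": "HBM (2023 lineup)"},
        {"product": "H200", "hbm": "HBM3e", "stacks": 6},
        {"product": "B100", "hbm": "HBM3e", "stacks": 8},
        {"product": "GH200 / GB200", "hbm": "HBM3e",
         "note": "pairs NVIDIA's Arm-based CPU with its GPUs"},
    ],
    "AMD": [
        {"product": "MI300 series", "hbm": "HBM3",
         "note": "mainstream 2024 shipments"},
        {"product": "MI350", "hbm": "HBM3e",
         "note": "HBM validation from H2 2024, larger ramp expected Q1 2025"},
    ],
    "Intel (Habana)": [
        {"product": "Gaudi 2", "hbm": "HBM2e", "stacks": 6,
         "note": "launched H2 2022"},
        {"product": "Gaudi 3", "hbm": "HBM2e", "stacks": 8,
         "note": "expected by mid-2024"},
    ],
}

# Print a compact vendor-by-vendor summary.
for vendor, parts in roadmap_2024.items():
    print(vendor)
    for p in parts:
        stacks = f", {p['stacks']} stacks" if "stacks" in p else ""
        note = f" ({p['note']})" if "note" in p else ""
        print(f"  {p['product']}: {p['hbm']}{stacks}{note}")
```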




But even they cannot meet the industry's current demand for AI chips. According to CNBC on January 19, US technology company Meta is spending billions of dollars to buy Nvidia's high-end H100 chip, which is central to AI research and development. Meta CEO Mark Zuckerberg said the company's AI plans include building a "large-scale computing infrastructure" that will include 350,000 Nvidia H100 chips by the end of 2024. Industry insiders estimate the H100 at roughly $25,000 to $30,000 per unit, and it can exceed $40,000 on second-hand platforms. Even at the lower end of that range, the expenditure would approach $9 billion. Moreover, the supply of H100 chips is limited.
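
As a rough back-of-the-envelope check of the figures above, multiplying the reported 350,000 units by the quoted per-unit price range gives the order of magnitude of Meta's outlay. The prices are the estimates attributed to industry insiders, not confirmed purchase prices.

```python
# Back-of-the-envelope check of Meta's reported H100 outlay.
# 350,000 units and the $25,000-$30,000 per-unit range are the figures
# quoted above; actual purchase prices are not public.
units = 350_000
price_low, price_high = 25_000, 30_000

low_total = units * price_low                      # $8.75 billion
mid_total = units * (price_low + price_high) / 2   # about $9.6 billion
high_total = units * price_high                    # $10.5 billion

print(f"Low estimate:  ${low_total / 1e9:.2f} B")
print(f"Mid estimate:  ${mid_total / 1e9:.2f} B")
print(f"High estimate: ${high_total / 1e9:.2f} B")
```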

