Three AI startups announced new financing in June, all with participation from NVIDIA. Among them, Inflection AI became the world's third most valuable AI unicorn and announced it is building one of the world's most powerful AI supercomputers, equipped with 22,000 NVIDIA H100 GPUs. Industry insiders believe that while NVIDIA leads its competitors by roughly two years, its dominant position is not unassailable, and CUDA will not always be a moat.
The popularity of ChatGPT has touched off an AI race across the technology industry, from giants to startups. In this AI gold rush, NVIDIA, the supplier of the top AI chips, has emerged as the big winner, with Wall Street proclaiming that "NVIDIA is the sole arms dealer in the AI war."
And over the past month, NVIDIA has also been making waves in AI investing, leveraging its leading position in GPUs to cement the strongest alliance in the AI industry.
Two AI Unicorns Announce NVIDIA's Participation in a Single Day, Inflection AI's Valuation Rises to Third Highest in the World
In June of this year, NVIDIA participated in the funding rounds of three high-profile AI unicorns. First, on June 9, Cohere, a Canadian AI company developing ChatGPT-style large language models, announced the completion of a $270 million Series C round with participation from NVIDIA, Oracle, Salesforce, and others, bringing Cohere's valuation to approximately $2.2 billion.
The other two startups announced their large financing rounds on the same day last Thursday: Inflection AI, creator of the AI chatbot Pi, and Runway, an AI video creation company.
Inflection AI, co-founded and led by DeepMind co-founder Mustafa Suleyman, secured $1.3 billion in new financing, the fourth-largest AI funding round on record according to Crunchbase.
The lead investors in Inflection AI's new round are Microsoft, LinkedIn co-founder Reid Hoffman, Bill Gates, and former Google chairman Eric Schmidt, with NVIDIA the only new face among the investors.
After the financing, Inflection AI's valuation rose to approximately $4 billion, making it the third largest generative AI unicorn in the world, behind only OpenAI and Anthropic.
Runway completed a $141 million round of new financing, with new investors including Google, NVIDIA, and Salesforce. The round lifted Runway's valuation to approximately $1.5 billion, roughly triple its level of less than six months ago.
Inflection AI Integrates 22,000 NVIDIA H100s to Build One of the Most Powerful Supercomputers, Far Surpassing Meta's Cluster in Chip Count
Inflection AI recently launched its first proprietary language model, Inflection-1, saying it was trained on a very large dataset using thousands of NVIDIA H100s. The company claims it is the best model in its compute class, outperforming GPT-3.5, LLaMA, Chinchilla, PaLM-540B, and other large language models (LLMs) across a range of benchmark tests.
Last Thursday, Inflection AI also announced that it is working with NVIDIA to build one of the world's largest AI clusters: its supercomputer will be expanded to 22,000 NVIDIA H100 chips to support training and deployment of the next generation of large AI models. That chip count far surpasses the 16,000-A100 supercomputing cluster Meta announced in May of this year.
Besides NVIDIA, Inflection AI's other partner on this giant GPU cluster is the cloud service provider CoreWeave, which claims to offer computing power "80% cheaper than traditional cloud providers." NVIDIA had previously invested $100 million in CoreWeave, and in June it was reported that Microsoft had agreed to spend billions of dollars on CoreWeave's cloud computing infrastructure over the next few years.
In the latest round of MLPerf, the authoritative AI performance benchmark, a cluster of 3,584 H100 chips built by NVIDIA and CoreWeave completed the GPT-3 large language model training benchmark in less than 11 minutes.
NVIDIA Leads Its Competitors by Two Years, but Its Dominance Is Not Unassailable and CUDA Will Not Always Be a Moat
According to GPU market data from Jon Peddie Research, NVIDIA's PC GPU shipments reached 30.34 million units last year, nearly 4.5 times AMD's. As of the fourth quarter of last year, NVIDIA held an 84% share of the discrete GPU market, far ahead of its competitors.
NVIDIA's revenue for the first quarter of this year, released last month, exceeded expectations by a wide margin. Revenue from its AI chip business hit a record high, sustaining year-on-year growth of more than 10%. Its second-quarter revenue guidance caused a sensation: rather than declining for a fourth consecutive quarter as the market had expected, revenue is guided to grow by a whopping 33% year-on-year, underscoring the strong demand for AI chips.
After the financial report was released, Jay Goldberg, founder of the chip consulting firm D2D Advisory, said, "At present, the AI chip market still looks like a winner-take-all market, and NVIDIA is the winner."
When AMD released its most advanced AI chip, the MI300X, last month, analysts did not join the hype; they soberly pointed out that AMD still has a long way to go to challenge NVIDIA's lead in AI chips, and that this chip alone is not enough.
Karl Freund, founder and chief analyst of Cambrian-AI Research LLC, believes that beyond having the AI industry's largest ecosystem of software and researchers, NVIDIA holds no significant memory advantage or obvious cost advantage over the MI300X. AMD's critical challenge is that it lacks an equivalent of the H100's Transformer Engine, which accelerates Transformer models on NVIDIA GPUs and can roughly double performance on large language models (LLMs).
As a result, a model that takes a year to train on thousands of NVIDIA GPUs might take two to three years to train on AMD hardware, or require roughly three times as many GPUs to finish in the same amount of time.
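To make that scaling logic concrete, here is a rough back-of-the-envelope sketch; the compute budget, per-GPU throughput, and utilization figures below are illustrative assumptions chosen so the baseline lands near one year, not numbers from the article or from vendor benchmarks.

```python
# Back-of-the-envelope sketch: training time ~ compute budget / (GPU count x effective per-GPU throughput).
# All figures are illustrative assumptions, not measured vendor numbers.

def training_days(total_flops: float, num_gpus: int, flops_per_gpu: float, utilization: float) -> float:
    """Estimated wall-clock training time in days for a fixed compute budget."""
    seconds = total_flops / (num_gpus * flops_per_gpu * utilization)
    return seconds / 86_400


TOTAL_FLOPS = 2.5e25    # assumed compute budget for a large model
BASELINE_GPUS = 2_000   # "thousands of GPUs"
BASELINE_FLOPS = 1e15   # assumed effective per-GPU throughput (FLOP/s)
UTILIZATION = 0.4       # assumed sustained utilization

baseline = training_days(TOTAL_FLOPS, BASELINE_GPUS, BASELINE_FLOPS, UTILIZATION)
# Hardware without a Transformer Engine equivalent, assumed to deliver ~1/3 the effective throughput:
slower = training_days(TOTAL_FLOPS, BASELINE_GPUS, BASELINE_FLOPS / 3, UTILIZATION)
# The same slower hardware, but with three times as many GPUs:
scaled_out = training_days(TOTAL_FLOPS, 3 * BASELINE_GPUS, BASELINE_FLOPS / 3, UTILIZATION)

print(f"Baseline throughput:          {baseline:.0f} days")    # ~1 year
print(f"1/3 effective throughput:     {slower:.0f} days")      # ~3 years
print(f"1/3 throughput, 3x the GPUs:  {scaled_out:.0f} days")  # back to ~1 year
```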
After NVIDIA's financial report was released, commentators cited Nathan Benaich, a partner at the AI-focused venture firm Air Street Capital, who estimates that NVIDIA is two years ahead of its competitors. Assessing NVIDIA's success, he said:
NVIDIA saw the future before others did. They shifted their focus to making GPUs programmable, identified an opportunity, made a big bet, and consistently outperformed their competitors.
However, industry insiders also believe that Wall Street's enthusiasm for NVIDIA may be overly optimistic. Benaich pointed out that neither NVIDIA's hardware nor its software is yet beyond challenge.
Wall Street News has previously noted that NVIDIA has built a strong ecosystem moat around CUDA, its parallel computing platform and programming model, which raises the barriers its competitors must clear.
Stability AI, the company behind the popular AI image-generation model Stable Diffusion, recently echoed Benaich's assessment of NVIDIA's competitive position: "Next-generation chips from Google, Intel, and other companies are catching up, and with the standardization of software, even CUDA is no longer an impregnable fortress."