Nvidia's Unshakable AI Dominance: Why No Giant Can Topple It
Nvidia is renowned for its stellar performance in the AI chip sector. The company's core strength, however, lies in a business moat built from tightly integrated software and hardware, one that keeps customers loyal and competitors at bay.
Over the past two decades, Nvidia has meticulously crafted a "walled garden" in the tech world, akin to the ecosystem created by Apple. While Apple's ecosystem mainly targets consumers, Nvidia focuses on serving developers who use its chips to build AI systems and other software.
This closed system explains why Nvidia has maintained its dominant position in the AI market despite fierce competition from other chipmakers and tech giants like Google and Amazon. It's unlikely that Nvidia will lose significant market share in the coming years.
In the long run, the contest over Nvidia's dominance will likely hinge more on software than on circuit design. Competitors are racing to develop software that can bypass Nvidia's barriers.
CUDA: The Foundation of the Walled Garden
Understanding Nvidia's "walled garden" hinges on its CUDA software platform. Since its launch in 2007, CUDA has solved a problem others hadn't: how to run non-graphics software, such as encryption algorithms and cryptocurrency mining, on Nvidia's specialized chips, which were designed for compute-intensive applications like 3D graphics and video games.
CUDA supports a variety of computing tasks on these graphics processing units (GPUs) and allows AI software to run on Nvidia's chips. The explosive growth of AI software in recent years has elevated Nvidia to one of the world's most valuable companies.
Importantly, CUDA continues to evolve. Year after year, Nvidia releases specialized code libraries to meet the needs of software developers. These libraries enable tasks to run on Nvidia GPUs at speeds far beyond what traditional general-purpose processors, like those made by Intel and AMD, can achieve.
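To illustrate the kind of general-purpose GPU programming CUDA made possible, here is a minimal sketch (a standard introductory example, not drawn from Nvidia's libraries) of a CUDA kernel that adds two vectors in parallel, one element per GPU thread:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Kernel: each GPU thread computes one element of the output vector.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    // Unified (managed) memory is accessible from both CPU and GPU.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();  // wait for the GPU to finish

    printf("c[0] = %f\n", c[0]);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Nvidia's specialized libraries package far more sophisticated routines than this, but the programming model is the same: developers write code once against CUDA, and that code runs only on Nvidia GPUs, which is precisely what keeps them inside the walled garden.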
The Importance of Full-Stack Computing and Software Platforms
The significance of Nvidia's software platforms also explains why Nvidia has historically invested more in hiring software engineers than hardware engineers. CEO Jensen Huang recently emphasized the company's focus on "full-stack computing," which involves everything from chip-making to AI software development.
Whenever competitors announce AI chips meant to rival Nvidia's, they are effectively competing against a system that has been in use for over 15 years and has vast amounts of code written for it. That software is difficult to port to rival systems, and this is the real advantage Nvidia's software investment confers.
At its shareholders' meeting in June, Nvidia announced that CUDA now includes over 300 code libraries and 600 AI models, supporting 3,700 GPU-accelerated applications used by over five million developers across approximately 40,000 companies.
Market Predictions and Competitive Landscape
The vast size of the AI computing market has prompted multiple companies to join forces against Nvidia. Atif Malik, a semiconductor and networking equipment analyst at Citi Research, predicts that the AI-related chip market will reach $400 billion annually by 2027. In comparison, Nvidia's revenue for the fiscal year ending in January was about $61 billion.
Bill Pearson, Intel's vice president for AI for cloud customers, says much of the industry's collaboration focuses on developing open-source alternatives to CUDA. Intel engineers are contributing to two such projects, one of which involves companies including ARM, Google, Samsung, and Qualcomm. OpenAI, the company behind ChatGPT, is also working on its own open-source project.
Investors are flocking to startups working on CUDA alternatives, driven partly by the prospect that engineers across the world's tech giants could eventually make it possible for companies to use whichever chips they want, avoiding what some in the industry call the "CUDA tax."
Open-Source Alternatives and Industry Dynamics
In the AI chip sector, Nvidia retains a strong lead, but competition is intensifying. Startup Groq recently secured $640 million in funding at a $2.8 billion valuation to develop chips that can rival Nvidia's, a sign of the new energy and possibilities entering the industry.
It is not just startups: tech giants are also making moves. Google and Amazon are developing their own chips for AI training and deployment, and Microsoft announced in 2023 that it would join the effort. These moves challenge Nvidia's market position and spur industry innovation.
In this competition, AMD has emerged as one of the strongest challengers to Nvidia's AI chip dominance with its Instinct AI chip line. AMD Executive Vice President Andrew Dieckman states that although AMD's market share is still behind Nvidia, the company is heavily investing in software engineers to expand its software resources and narrow the gap. Last month, AMD announced a $665 million acquisition of Silo AI, further enhancing its AI development capabilities.
Two major Nvidia customers, Microsoft and Meta Platforms, have started purchasing AMD's AI chips, reflecting the market's demand for diverse suppliers and a desire for competition in high-end products.
Challenges and Opportunities for Nvidia
However, Nvidia's market barrier isn't impenetrable. Babak Pahlavan, CEO of startup NinjaTech AI, says he would have preferred to use Nvidia's hardware and software if costs had allowed. But because of shortages and the high price of Nvidia's H100 chips, NinjaTech AI turned to Amazon, which offers its own AI training chip, Trainium. After months of effort and collaboration, NinjaTech AI successfully trained its AI models on Trainium and in May launched AI "agents" that now have over one million monthly active users, all supported by models trained and run on Amazon chips.
This shift wasn't easy. Pahlavan admitted facing numerous challenges and errors along the way. Amazon Web Services Executive Gadi Hutt acknowledged early mistakes from both sides but stated they are now on track. Amazon's AI chip customer base is growing, including companies like Anthropic, Airbnb, Pinterest, and Snap. Although Amazon offers customers the option to use Nvidia chips, they are more expensive, and transitioning takes time.
NinjaTech AI's experience highlights one major reason why startups like it endure the extra effort and development time to build AI outside Nvidia's "walled garden": cost. Pahlavan says NinjaTech's cloud service bill at Amazon is about $250,000 a month to serve over a million users. If the same AI ran on Nvidia chips, it would cost between $750,000 and $1.2 million.
Nvidia's Response and Future Outlook
Facing these competitive pressures, Nvidia is acutely aware of the high costs associated with its chips. CEO Jensen Huang has pledged that the company's next generation of AI-focused chips will aim to reduce the costs of training AI on Nvidia's hardware.
Malik of Citi Research expects Nvidia to maintain a 90% market share in AI-related chipsets for the next two to three years. This suggests that despite competition, Nvidia's leading position remains solid.
In the foreseeable future, Nvidia's fate will depend on the kind of inertia that has historically kept many businesses and customers locked into various "walled gardens."