Nvidia’s $1 trillion inference chip opportunity

Top News

Reuters · 1d · on MSN
Nvidia bets on AI inference as chip revenue opportunity hits $1 trillion
By Stephen Nellis and Max A. Cherney SAN JOSE, California, March 16 (Reuters) - Nvidia said the revenue opportunity for its artificial intelligence chips may reach at least $1 trillion through 2027, as the company outlined a strategy to compete more aggressively in the fast-growing market for running AI systems in real time.

24/7 Wall St. · 1d
Nvidia’s $1 Trillion Inference Chip Opportunity: The Inflection Point Investors Were Waiting For?
2d
Nvidia expects to sell $1 trillion in AI chips through 2027 — and it's pushing further into inference
1h
Nvidia's Jensen Huang Just Made a Startling Prediction. Here's What it Means for Nvidia's Stock Price.
Investors' eyes are on Nvidia (NASDAQ: NVDA) this week.

3h
Nvidia Sees Sales Goal Topping $1 Trillion With New Markets
1d
Nvidia forecasts $1tn in AI-driven orders by 2027
2d · on MSN

What Is Inference? Explaining the Massive New Shift in AI Computing

The focus of artificial-intelligence spending has gone from training models to using them. Here’s how to understand the difference—and the implications.
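The shift this piece describes, from training models to using them, is the technical distinction underlying most of the coverage above. A minimal sketch (hypothetical, plain Python, not drawn from any of the articles) of the difference: training is an iterative loop that updates weights, while inference is a single forward pass with those weights frozen.

```python
# Hypothetical illustration of training vs. inference on a tiny linear model.

def forward(w, b, x):
    """Forward pass: the only computation inference needs."""
    return w * x + b

def train(data, lr=0.01, epochs=2000):
    """Training: repeated forward passes plus gradient updates."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            err = forward(w, b, x) - y
            # Gradient descent on squared error.
            w -= lr * err * x
            b -= lr * err
    return w, b

# Training phase: fit y = 2x + 1 from a few exact samples.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]
w, b = train(data)

# Inference phase: weights are frozen; serving is just forward passes.
print(round(forward(w, b, 4.0), 1))  # → 9.0
```

In production the asymmetry is what matters: training happens once on large accelerator clusters, while the same forward pass is served millions of times, which is why serving cost and latency now dominate the economics.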
2d · on MSN

The Artificial Intelligence (AI) Inference Market Could Reach $255 Billion by 2030. This Stock Is Best Positioned to Win.

More investors need to learn about ASML.
DIGITIMES · 34m

Nvidia adopts Groq to tackle AI inference and expand global reach

At its annual GTC conference in San Jose, Nvidia unveiled a major shift in its AI hardware strategy: integrating technology from AI chip startup Groq to address growing demand in AI inference, while simultaneously preparing new products for global markets.
2d

Nvidia GTC 2026: Jensen Huang’s Groq ‘Mellanox moment’ and the inference land grab

Ahead of Nvidia Corp.’s GTC 2026 this week, we reiterate our thesis that the center of gravity in artificial intelligence is shifting from “How fast can you train?” to “How well can you serve?” Training has ushered in the modern AI era.
1d · on MSN

How Nvidia’s inference bet at GTC poses a challenge and opportunity for China

Nvidia's Groq 3 LPU chip widens the AI gap with China but offers Chinese firms niche inference market opportunities, analysts say. The latest language processing chip was unveiled at the company's annual artificial intelligence conference.
2d

CEO Jensen Huang says 'the inflection point of inference has arrived'

Artificial intelligence has to "reason" and "think," meaning that "the inflection point of inference has arrived." "It's way past training now," he added. While Nvidia chips were once heavily used to train AI models, the workload is now shifting toward inference.
4d · on MSN

Amazon Announces Inference Chips Deal With Cerebras

Amazon Web Services says the partnership will allow it to offer lightning-fast inference computing.
Electronics For You · 17h

GPU Inference Stack Gets Boost

New cloud stack cuts AI inference cost, scales enterprise workloads. A new enterprise AI inference stack built on NVIDIA's Rubin platform is being rolled out by Vultr, aiming to cut inference costs and scale enterprise workloads.
The Economist · 5h

The next phase of artificial intelligence may require very different processors

Nvidia faces competition from startups developing specialised chips for AI inference as demand shifts from training large language models to running them efficiently.

Related topics

Nvidia
AI
Artificial intelligence
Jensen Huang
Amazon Web Services