Semiconductor manufacturers are signaling a major infrastructure shift. One leading chipmaker just announced plans to dramatically escalate capital expenditure—ramping from approximately $41 billion in 2025 up to potentially $56 billion by 2026. That's a stunning $15+ billion jump in a single year.

What's really happening here? The industry is making a clear statement about where AI is headed. The focus isn't just on building powerful training infrastructure anymore. Instead, massive resources are now flowing toward running inference at scale—keeping AI models continuously operational once they're deployed.

Even more revealing: roughly 10–20% of those expansion dollars are being specifically allocated toward this inference capability build-out. When you see capital allocation at this scale moving in one direction, it's not just business as usual. It's the market telling you exactly which way the computing winds are blowing.
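A quick back-of-envelope check of the figures above (a sketch only; the variable names are illustrative, and the 10–20% range is applied to the year-over-year expansion as the article states):

```python
# Figures quoted in the article, in billions of USD.
capex_2025_usd_b = 41   # approx. 2025 capital expenditure
capex_2026_usd_b = 56   # projected 2026 capital expenditure

# Year-over-year expansion, and the stated 10-20% inference allocation range.
expansion = capex_2026_usd_b - capex_2025_usd_b
inference_share = (0.10, 0.20)

low, high = (expansion * s for s in inference_share)
print(f"Expansion: ${expansion}B; inference build-out: ${low:.1f}B-${high:.1f}B")
# → Expansion: $15B; inference build-out: $1.5B-$3.0B
```

In other words, even the low end of that range implies on the order of $1.5–3 billion per year earmarked specifically for inference capacity.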
AmateurDAOWatchervip
· 23h ago
Chip manufacturers' move this time is to pour money into inference: basically shifting AI from flexing its muscles to earning a living.
CrossChainBreathervip
· 01-15 15:08
Inference is the real gold mine; training is just the appetizer. --- $15 billion a year? Chip factories are madly betting on inference; now I understand. --- Capital flows where the traffic goes, and everything points to inference. Time to copy this trade. --- Switching from training to inference: this shift is happening faster than I expected. --- Putting ten to twenty percent into inference? That's not trivial. --- This move by chip factories basically says the real money in AI is in operations and maintenance. --- $56 billion... that number is a bit crazy, but the logic is self-consistent.
GasWranglervip
· 01-15 15:07
ngl the $15B jump is mathematically interesting but like... actually, if you analyze the capex allocation data, the real story's in that 10-20% inference slice. most people miss that entirely. sub-optimal analysis everywhere tbh
AirdropHarvestervip
· 01-15 14:58
Damn, burning $15 billion a year to expand production; this pace is definitely hinting at something... Inference is the real gold mine.
HashRateHermitvip
· 01-15 14:57
The whole industry is betting on the inference layer; money doesn't lie.
down_only_larryvip
· 01-15 14:40
Chip manufacturers are really going all-in on inference—that's where the long-term money is... Training has long ceased to be the main dish.
