The AI Industry’s Scaling Obsession Is Headed for a Cliff

By Wired
October 15, 2025


A new study from MIT suggests the biggest and most computationally intensive AI models may soon offer diminishing returns compared to smaller models. By mapping scaling laws against continued improvements in model efficiency, the researchers found that it could become harder to wring leaps in performance from giant models, while efficiency gains could make models running on more modest hardware increasingly capable over the next decade.
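The shape of that argument can be pictured with a back-of-the-envelope extrapolation. The snippet below is not the MIT team's model; it is a minimal sketch that assumes a Chinchilla-style loss curve with an irreducible floor and treats algorithmic efficiency as a multiplier on effective compute, with every constant invented purely for illustration.

```python
# Minimal sketch, not the MIT team's actual model. Assumes a Chinchilla-style
# loss curve L(C) = L_inf + A * C**(-alpha) and treats algorithmic efficiency
# as a multiplier on effective compute. All constants are hypothetical.

L_INF = 1.7                      # irreducible loss floor (hypothetical)
A, ALPHA = 1000.0, 0.1           # power-law coefficients (hypothetical)
EFFICIENCY_DOUBLING_YEARS = 1.5  # assumed pace of algorithmic progress

def loss(compute_flops: float, years_from_now: float) -> float:
    """Predicted loss for a raw compute budget, after assumed efficiency gains."""
    effective = compute_flops * 2 ** (years_from_now / EFFICIENCY_DOUBLING_YEARS)
    return L_INF + A * effective ** (-ALPHA)

FRONTIER, MODEST = 1e26, 1e23    # rough frontier vs. academic-scale budgets (hypothetical)

for year in (0, 5, 10):
    gap = loss(MODEST, year) - loss(FRONTIER, year)
    print(f"year +{year:2d}: loss gap (modest - frontier) = {gap:.3f}")
```

Because both budgets approach the same loss floor as effective compute grows, the gap between them shrinks even though the frontier budget stays a thousand times larger, which is the narrowing dynamic the study describes.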

“In the next five to 10 years, things are very likely to start narrowing,” says Neil Thompson, a computer scientist and professor at MIT involved in the study.

Leaps in efficiency, like those seen with DeepSeek’s remarkably low-cost model in January, have already served as a reality check for the AI industry, which is accustomed to burning massive amounts of compute.

As things stand, a frontier model from a company like OpenAI is much better than a model trained with a fraction of the compute at an academic lab. While the MIT team’s prediction might not hold if, for example, new training methods like reinforcement learning produce surprising new results, the researchers suggest that big AI firms will have less of an edge in the future.

Hans Gundlach, a research scientist at MIT who led the analysis, became interested in the issue due to the unwieldy nature of running cutting-edge models. Together with Thompson and Jayson Lynch, another research scientist at MIT, he mapped out the future performance of frontier models compared to those built with more modest computational means. Gundlach says the predicted trend is especially pronounced for the reasoning models that are now in vogue, which rely more on extra computation during inference.

Thompson says the results show the value of honing an algorithm as well as scaling up compute. “If you are spending a lot of money training these models, then you should absolutely be spending some of it trying to develop more efficient algorithms, because that can matter hugely,” he adds.

The study is particularly interesting given today’s AI infrastructure boom (or should we say “bubble”?)—which shows little sign of slowing down.

OpenAI and other US tech firms have signed hundred-billion-dollar deals to build AI infrastructure in the United States. “The world needs much more compute,” OpenAI’s president, Greg Brockman, proclaimed this week as he announced a partnership between OpenAI and Broadcom for custom AI chips.

A growing number of experts are questioning the soundness of these deals. Roughly 60 percent of the cost of building a data center goes toward GPUs, which tend to depreciate quickly. Partnerships between the major players also appear circular and opaque.




Tags: AI, AI lab, algorithms, artificial intelligence, deep learning, machine learning