Google Just Found a Way to Make AI Run on 6x Less Memory, and Chip Stocks Are Panicking
Google's new TurboQuant algorithm could slash the amount of expensive memory chips AI needs to run, and Wall Street is already freaking out.
If you've been wondering why AI companies keep burning through billions of dollars on hardware, here's the short version: the chips that power AI need a ton of memory to work. Like, a ridiculous amount. And the companies that make those memory chips have been printing money because of it.
But Google just dropped a bombshell. Their new algorithm, called TurboQuant, can shrink the memory AI models need by more than six times. That means an AI system that used to need, say, $60,000 worth of memory chips might now only need $10,000 worth.
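The article doesn't describe how TurboQuant works internally, but the general trick behind memory savings like this is quantization: storing model weights in a handful of bits instead of full 32-bit floats. Here's a minimal sketch of generic 4-bit group quantization (this is an illustration of the idea, not Google's actual algorithm; the function names and group size are made up for the example):

```python
import numpy as np

def quantize_4bit(weights, group_size=64):
    """Illustrative 4-bit quantization: one fp16 scale per group of weights.
    NOT Google's TurboQuant -- a generic sketch of low-bit weight storage."""
    flat = weights.reshape(-1, group_size)
    # Map each group's values into the signed 4-bit range [-8, 7]
    scales = np.abs(flat).max(axis=1, keepdims=True) / 7.0
    q = np.clip(np.round(flat / scales), -8, 7).astype(np.int8)
    return q, scales.astype(np.float16)

def dequantize(q, scales):
    """Recover approximate fp32 weights from the quantized form."""
    return (q.astype(np.float32) * scales).reshape(-1)

w = np.random.randn(1024 * 64).astype(np.float32)
q, s = quantize_4bit(w)

original_bytes = w.size * 4                 # 4 bytes per fp32 weight
packed_bytes = w.size // 2 + s.size * 2     # two 4-bit weights per byte, plus fp16 scales
print(f"compression: {original_bytes / packed_bytes:.1f}x")  # roughly 7.5x
```

The compression ratio here lands in the same ballpark as the "more than six times" figure in the article: the model behaves nearly the same, but the memory bill shrinks dramatically, which is exactly why memory chipmakers are nervous.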
Naturally, stocks for memory chipmakers like Micron, SK Hynix, and Samsung all took a hit. Cloudflare's CEO even called it "Google's DeepSeek moment," comparing it to when the Chinese AI lab shocked everyone last year by building a powerful model on the cheap.
But here's the twist: some experts say this panic is overblown. Making AI more efficient doesn't mean people will use less of it. It means they'll use way more of it, just like how better gas mileage didn't make people drive less. The real winner might actually be companies building AI applications, not the ones selling the parts.
As reported by The Motley Fool.