Sir Mix-a-Lot made his mark on music history with “Baby Got Back.”
The 1992 hit shared his love for large derrieres. And it ultimately transformed the ideal female physique forevermore.
Big backsides are now the norm.
But it’s not just men and women who are seeking robust back ends… the world’s greatest technological advancement since the iPhone is focused on well-built back ends in 2025. And investors should be, too.
You see, artificial intelligence (AI) has a problem.
And providing the solution could be the largest profit-making opportunity for investors in the years ahead.
Not that long ago, AI required only a handful of chips.
A small cluster of semiconductors networked together to solve problems.
But as the demands for AI and its potential have increased, AI systems in turn have grown exponentially.
No longer are AI “brains” several processors networked together. They are now something far larger… and far more costly.
A Colossal Standard
Last September, Elon Musk unveiled what he called “the most powerful AI training system in the world.” His latest venture – xAI – built a supercluster of Nvidia (NVDA) H100 GPUs in Memphis, Tennessee.
The system, dubbed “Colossus,” is made up of 100,000 of Nvidia’s AI chips. And with each H100 GPU carrying a price tag of $30,000, that means Musk plunked down at least $3 billion on this project.
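Here’s a quick back-of-the-napkin check on that price tag, counting the chips alone (and ignoring the building, power, and networking gear):

```python
# Rough cost of Colossus' GPUs alone, using the figures above.
gpus = 100_000            # H100 chips in the cluster
price_per_gpu = 30_000    # approximate price per H100, in dollars

total_cost = gpus * price_per_gpu
print(f"${total_cost / 1e9:.0f} billion")   # -> $3 billion
```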
But xAI claims it is the fastest and most powerful AI training system in the world. And the hope is this will give Musk and his companies, like Tesla (TSLA) and X, a leg up. Colossus will work on everything from the free-speech chatbot Grok to full self-driving cars.
And of course, Grok is making headlines this week as Musk claims Grok 3 outperforms ChatGPT AND DeepSeek.
But most notable for the industry, Colossus is setting the standard for all future AI infrastructure.
And it kicks the arms race between the major AI names – such as Tesla, Alphabet (GOOGL), Alibaba (BABA), Amazon (AMZN), Meta Platforms (META), Microsoft (MSFT), and Oracle (ORCL) – into another gear.
This of course means big money for the right players who are in the right place at the right time… ready with the right solution.
The Countdown to 1 Million
AI “brains” are now entire buildings.
Data centers.
Factory-sized machines answering prompts and requests at blazing speeds.
And they’re only going to get bigger.
In fact, the outlook is for AI systems to grow to 500,000 processors… then one million… and then more than a million.
At the same time, the price tag to build these centers will skyrocket.
Consider this… a data center with more than one million processors will cost over $100 billion to build!
But the issue at hand is that it’s easier said than done to get 100,000 AI chips to work together on a single computing task. As Nvidia’s networking chief Gilad Shainer noted, “The basic computing element in AI isn’t a processor, but a data center.”
And AI data centers can’t operate without networking technology. This includes networking chips (some as advanced as AI chips), lasers, switches, and cables that integrate thousands – or tens of thousands – of AI processors into a single computer.
I don’t want investors to underestimate the performance and power of networking chips. Switch chips can now move 51 terabits of data per second. For perspective, in a single second one of these chips moves as much data as you’d stream by kicking back and watching videos on your phone for 200 days straight!
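For anyone who wants to sanity-check that comparison, here’s a rough sketch. The mobile-video bitrate is my own ballpark assumption (about 3 megabits per second), not a published spec:

```python
# How many days of phone-video streaming fit into one second of
# switch-chip throughput? The video bitrate is an assumed ballpark.
switch_terabits_per_sec = 51     # switch chip throughput, terabits per second
video_megabits_per_sec = 3       # assumed mobile streaming bitrate

bits_moved = switch_terabits_per_sec * 1e12              # bits moved in one second
seconds_of_video = bits_moved / (video_megabits_per_sec * 1e6)
print(f"{seconds_of_video / 86_400:.0f} days of video")  # -> roughly 200 days
```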
And this year, switching speeds will double.
The chips that allow AI processors to function as one are as impressive as – if not more impressive than – the AI chips themselves.
Currently, network chips account for roughly 5% to 10% of all AI semiconductor spending. But this is expected to grow to 15% to 20%.
That means, for a $100 billion data center with a million or more processors, we’re talking about a $15 billion to $20 billion opportunity.
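Here’s that napkin math spelled out, using the figures above:

```python
# Networking slice of a $100 billion hyperscale data center,
# using the projected 15%-20% share cited above.
data_center_cost = 100e9                  # dollars
share_low, share_high = 0.15, 0.20        # networking share of AI chip spend

low = data_center_cost * share_low
high = data_center_cost * share_high
print(f"${low / 1e9:.0f} billion to ${high / 1e9:.0f} billion")   # -> $15B to $20B
```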
And keep in mind, that’s merely one hyperscale data center.
In the first six months of 2024, the number of data centers under construction in North America rose 70%.
And in the third quarter of 2024, capital spending by hyperscalers surged 50% to an annualized rate of $200 billion. Plus, Goldman Sachs believes AI spending will jump another 35% to 40% this year.
We are at the dawn of a new era in technology.
AI is at the forefront and captures all the headlines.
But one of the fastest-growing opportunities going forward will be companies that specialize in ensuring AI runs smoothly and effectively – network technology specialists and companies working to improve the energy efficiency of AI data centers.
This is why, in a year of choppy AI earnings, American Superconductor Corp (AMSC) has rocketed higher, already gaining nearly 30% year-to-date.
AI will continue to dominate headlines in 2025. But the focus will be on back-end technologies. These are advancing at an accelerated rate to keep pace with those expensive AI processors.
Now, Sir Mix-a-Lot did get one thing wrong. Silicon parts aren’t just made for toys… they’re also made for chips.
The type of back-end chips our portfolios want to double up on this year.
Cosmo ain’t got nothing to do with my selection,
Matthew