Since I initially shorted Nvidia a few weeks ago, it has nosedived from its peak above $500 to the low $400s. Since then it has bounced back and forth between those two levels, basically directionless. Accordingly, I have either shorted it or gone long depending on its short-term TA setup, with quite a lot of success. I think NVDA is a great candidate for short-term trading, as it will probably stay stuck in this range for a long time. Actually, its long-term prospects are likely quite dire, and it may become the next CSCO or INTC: dead money for the foreseeable future. My friend forwarded me the following analysis, which presents a good reason why Nvidia's darling days are numbered!
**********************************
Playing With Fire
Nvidia had a monopoly on AI data center GPUs for most of 2023. So, it could name its price without worrying about the competition undercutting it with lower prices.
And yes, Nvidia sold a ton of its GPUs this year. It's on track to sell 550,000 of its H100 GPUs by the end of 2023.
That's worth more than $16.5 billion in chips, depending on how they're configured.
But instead of catering to the needs of the giant cloud providers, Nvidia chose a different path.
All year, Amazon, Microsoft, Meta, Oracle, Tesla, IBM, and Google wanted to buy as many Nvidia GPUs as the company would sell them.
As Tesla boss Elon Musk put it in July…
"We'll actually take Nvidia hardware as fast as Nvidia will deliver it to us."
But instead of maxing out these clients' order books, Nvidia allocated supply to CoreWeave, Lambda, and other smaller cloud-computing providers.
On the surface, this seems like a savvy strategy.
Nvidia reportedly got stock investments in each of those companies as part of the deal to supply them with its GPUs.
That's made Nvidia a part-owner in these companies.
And what's a good way to boost a startup cloud provider's valuation in a year when AI server demand is red-hot?
Allocate your GPUs to the startup instead of selling them to Amazon, Microsoft, and Google!
But Nvidia is playing with fire.
It boosted the valuation of its equity stakes in CoreWeave and Lambda. But the largest cloud providers aren't going to sit around and watch some startups gobble their market share without putting up a fight.
These big tech companies have the resources to build their own computer chips. And that's exactly what they are doing.
Faster and More Efficient Than a GPU
Since 2016, Google has been designing and using an AI chip it calls a Tensor Processing Unit (TPU).
TPUs are custom-built for a type of mathematical operation known as a tensor operation, which sits at the heart of machine learning workloads.
For those workloads, this specialization makes them faster and more efficient than general-purpose GPUs.
This means Google has little use for Nvidia GPUs. It can use its own TPUs instead.
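To see what a "tensor operation" actually is, here is a minimal NumPy sketch of the math a TPU is hard-wired to accelerate. The shapes and variable names are illustrative assumptions of mine, not Google's actual workload:

```python
import numpy as np

# A single dense neural-network layer boils down to one tensor operation:
# multiply a batch of inputs by a weight matrix, then add a bias vector.
batch = np.random.rand(32, 128)    # 32 samples, 128 features each (assumed sizes)
weights = np.random.rand(128, 64)  # layer mapping 128 features down to 64
bias = np.random.rand(64)

# This matrix multiply is the operation TPUs (and GPU "tensor cores")
# are specialized for; a model repeats it millions of times.
activations = batch @ weights + bias
print(activations.shape)  # (32, 64)
```

A GPU runs this through general-purpose parallel cores; a TPU dedicates its silicon almost entirely to exactly this multiply-accumulate pattern, which is where the speed and efficiency advantage comes from.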
And like Google, Amazon has been designing and using its own custom-designed chips for years.
Now, Microsoft is expected to throw its hat in the ring with a custom AI chip set to launch later this year.
Even Tesla – an electric car maker – has been forced to design its own chips.
Bottom line: Nvidia backed the wrong horses in the AI race.
Backing startup cloud providers with early access to its GPUs isn't a durable business model. Thanks to their scale, Amazon, Google, and Microsoft can offer steep discounts and parallel services to their cloud-computing clients that CoreWeave and Lambda can't match.
Worse, by throttling supply to the major players, Nvidia is further encouraging them to develop and advance their own silicon.
It's a big mistake.
Nvidia should cater to the large cloud providers. This would lock in Nvidia GPUs as the go-to device for AI applications for decades.
Now, Nvidia has a bunch of giant competitors that have tons of resources, customers, and a reason to not use Nvidia's products in the future.