🗞 Is NVIDIA Closer to Saturation Than We Think?
Scaling laws slowing down: trouble for Nvidia?
Happy Sunday!
Here’s what’s on the docket for this week’s newsletter:
👨‍💻 Is NVIDIA closer to saturation than we think?
💻️ Remember when CrowdStrike broke the internet… Well, customers don’t.
Let’s dive in!
Partner Spotlight: UncoverAlpha
Nvidia: Latest earnings and the LLM scaling laws slowdown
“We are really at the beginning of 2 fundamental shifts in computing that is really quite significant. The first is moving from coding that runs on CPUs to machine learning that creates neural networks that runs on GPUs. On the other hand, secondarily, I guess, is that on top of these systems, we’re going to be creating a new type of capability called AI.”
(Jensen Huang, Nvidia’s President and Chief Executive Officer)
Nvidia’s net income this quarter was higher than its entire revenue in the same quarter last year. Surprisingly for many, even though Blackwell is ramping up, Nvidia says demand for Hopper remains strong and is expected to continue into the coming quarters. Let’s take a look at what’s driving the sustained adoption:
1. Are cloud service providers still the main driver of Data Center revenue?
Cloud service providers (CSPs) like Amazon, Microsoft, Google, and Oracle are still the main drivers of Data Center revenue. Nvidia says approximately half of its Data Center sales came from CSPs this quarter, and that this business is growing at more than 2x year over year. In other words, Nvidia’s reliance on CSPs is increasing once again:
CSPs as % of Data Center revenue:
Q4 2023 – 50%+
Q1 2024 – mid-40%
Q2 2024 – 45%
Q3 2024 – 50%
The problem with growth becoming more dependent on CSPs is that, at some point, CSPs will run into limits on the yearly CapEx their investor bases will tolerate. How much more CapEx growth cloud-provider investors are willing to stomach is a complicated question, and it also depends on how much the AI workloads contribute to the top line. Last quarter we saw accelerated growth from all the CSPs, but the pace of that revenue growth and the trajectory of CapEx growth are so far on different curves.
2. Are LLM scaling laws slowing down?
This is a question Jensen couldn’t avoid; it was the first question on his earnings call, and it’s what the whole AI community has been asking for the last few weeks. Are scaling laws slowing down for LLMs? Jensen explained that LLMs now scale along three axes: pre-training, post-training with reinforcement learning, and inference-time scaling, which OpenAI’s o1 model introduced. To me, though, the fact that Jensen steered his answer toward post-training and inference as scaling paradigms suggests there is some truth to pre-training scaling laws slowing down.
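For readers new to the term, here is a minimal sketch of what “pre-training scaling laws” refers to, using the Chinchilla-style formulation from Hoffmann et al. (2022); the constants below are their fitted values, not claims from Nvidia’s call:

$$
L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
$$

Here $L$ is pre-training loss, $N$ is parameter count, $D$ is training tokens, $E$ is an irreducible loss floor, and the fitted exponents $\alpha$ and $\beta$ are both roughly 0.3. Because both terms are power laws, each doubling of compute buys a smaller absolute improvement in loss; the current debate is whether frontier models are now seeing gains below even this already-diminishing curve.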
It is essential to understand this shift, as it can signal significant changes for the industry. If pre-training no longer delivers big incremental improvements to LLMs and the focus shifts to post-training and inference scaling, then the gravity of investments might shift as well. Post-training scaling means improving the model with reinforcement learning via human or machine feedback. Here, the story is less about how many GPUs you have and more about the quality of your data. While some argue that synthetic data produced by older generations of LLMs can fill the gap, others don’t think synthetic data will be good enough for major improvements. A former Google DeepMind senior scientist said this about synthetic data:
“Now they’re all just doing synthetic data to fix things here and here, but scaling laws does not want synthetic data. It actually wants what’s called independent and identically distributed data, IID data. It wants new data. It doesn’t want to just get a reframing of the old data. I think in the LLM world there is a sense in which the next-generation models are not going to be much better than the previous models if you’re just talking about pure text LLMs.”
Regarding the inference scaling paradigm, the biggest question for Nvidia is its competitive position in that market. As I wrote when o1 came out, Nvidia is and will remain in the inference market. Still, its competitive position there differs from the pre-training market, where it has no real competition. Because inference involves more competition and less complexity, I do expect margins to reflect that over the longer term.
A comment from the COO of a big AI compute company:
“I think in the next 12 months, you will see that the people will be able to figure out inference jobs running on NVIDIA or AMD almost equally other than some few major exceptions, so that will catch up”
Get 25% Off FinChat 🎁
There are just 48 hours left on FinChat’s Black Friday/Cyber Monday sale.
New customers can get FinChat’s institutional-grade stock research platform for 25% off the first 12 months.
Featured Story
Remember when CrowdStrike broke the internet…
Roughly 4 months ago, on Friday, July 19th at around 4 AM UTC, cybersecurity company CrowdStrike pushed what should have been a routine product update.
Instead, it led to the largest global IT outage in history.
Businesses all over the world ground to a halt. Airlines, TV networks, banks, and countless others ceased operations and felt the repercussions for weeks afterward.
But, fast forward 4 months, and it appears customers have forgotten.
On Wednesday of this week, CrowdStrike reported better-than-expected sales for the third quarter and even raised its full-year financial targets.
Despite the incident prompting many customers to threaten to leave, or at least shop around for alternative providers, CrowdStrike still grew its annual recurring revenue by 4% over the previous quarter.
On the conference call following the report, CrowdStrike’s CEO stated, “Falcon customers are staying with CrowdStrike as their trusted cybersecurity platform of choice. Q3 gross retention was over 97%, down less than 0.5 percentage points.”
One-time incentive offers in exchange for long-term commitments were instrumental in navigating the crisis this quarter. While these longer-term commitments helped retain customers, they also produced the widest spread ever between CrowdStrike’s Remaining Performance Obligations (total future revenue commitments from customers) and current revenue growth.
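To see why that spread opens up, take some purely hypothetical numbers: if a customer accepts a one-time discount in exchange for a three-year, $3.6 million commitment, the full $3.6 million lands in RPO the day the contract is signed, while only about $300K ($3.6M spread over 12 quarters) shows up in each quarter’s recognized revenue. Sign enough deals like that and RPO growth accelerates even as reported revenue growth slows, which is exactly the widening gap here.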
Despite a slight drop in the stock following the report, shares are still up 41% year-to-date.
Meme of the week
For those in the US, it’s that time of the year again.
Thanksgiving dinner. A time to gather around the table with family and talk your book.
Here’s to giving your best stock pitch at the dinner table. 🥂