Discussion about this post

James Thompson

In your interview with Goldman Sachs, you made an observation about the flow of dollars in the AI universe: "One observation that a lot of people have made is, if a dollar comes in at the top, Nvidia keeps $1.20 today. So Nvidia is capturing a lot of the value in the supply chain today." I'm not following: how is NVDA capturing $1.20 out of every $1? Is there leverage? Is this based on an FV vs. NPV calculation? Thanks in advance.

Stefan Uzunov

Global gross wage costs (not including non-wage labor costs) are roughly $55T vs. global GDP of $110T. Let's assume that AI agents (which are not AGI) lead to some monetary gain that is a combination of a one-time productivity boost (PB) and a wage cost reduction (WCR). The total monetary gain would be $55T × WCR% + $110T × PB%. Obviously the best case is that all gain comes from PB: for example, a 5% boost on $110T is $5.5T of monetary gain with no change in wage costs. Let's take a rather arbitrary base case: 1% WCR and a one-time 3% PB on average. That would lead to a total monetary gain of $55T × 1% + $110T × 3%, roughly $3.9T annually. I assume no growth in wage costs or GDP; any growth would make the monetary gain even bigger. I assume a one-time PB, which is a worse case than a compounding PB and makes the monetary gain appear smaller. And I assume that not all jobs will be affected by AI, which is why I put only 3% PB in aggregate.
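The arithmetic above can be sketched in a few lines of Python (illustrative only; the $55T/$110T figures and the WCR/PB percentages are the commenter's assumptions, not forecasts):

```python
# Back-of-envelope estimate of annual monetary gain from AI agents.
GLOBAL_WAGES = 55e12   # global gross wage costs, USD (commenter's figure)
GLOBAL_GDP = 110e12    # global GDP, USD (commenter's figure)

def annual_gain(wage_cost_reduction: float, productivity_boost: float) -> float:
    """Total gain = wage savings on the wage base + productivity boost on GDP."""
    return GLOBAL_WAGES * wage_cost_reduction + GLOBAL_GDP * productivity_boost

# Best case: 5% productivity boost, no wage cost change -> $5.5T
best = annual_gain(0.0, 0.05)

# Base case: 1% WCR and 3% PB -> 0.55e12 + 3.3e12 = ~$3.9T
base = annual_gain(0.01, 0.03)

print(f"best case: ${best / 1e12:.2f}T, base case: ${base / 1e12:.2f}T")
```

Even this crude model shows the base case landing in the high single-digit percents of the $1T annual spend discussed below.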

This is rather simplified, maybe oversimplified, but it tries to illustrate that if AI agents are remotely useful and permeate the service economy, the annual monetary gains will be in the trillions. I fail to see how this is not a real possibility, and therefore how it does not justify even $1T of annual spending over the next 5-10 years.

