Comments on: Ongoing Saga: How Much Money Will Be Spent On AI Chips? https://www.nextplatform.com/2024/07/15/ongoing-saga-how-much-money-will-be-spent-on-ai-chips/ In-depth coverage of high-end computing at large enterprises, supercomputing centers, hyperscale data centers, and public clouds. Fri, 26 Jul 2024 16:04:40 +0000 hourly 1 https://wordpress.org/?v=6.7.1 By: emerth https://www.nextplatform.com/2024/07/15/ongoing-saga-how-much-money-will-be-spent-on-ai-chips/#comment-228031 Wed, 17 Jul 2024 13:58:34 +0000 https://www.nextplatform.com/?p=144408#comment-228031 In reply to Timothy Prickett Morgan.

AI in the LLM sense, at the current state of the art, is a data mining tool with a friendly front end. That’s my $0.02.

]]>
By: Mike Bruzzone https://www.nextplatform.com/2024/07/15/ongoing-saga-how-much-money-will-be-spent-on-ai-chips/#comment-227982 Tue, 16 Jul 2024 16:55:02 +0000 https://www.nextplatform.com/?p=144408#comment-227982 Right, the myth of 20 M servers; IDC says 12 M in 2023?

Intel DCG + NEX on channel AWP < COS / division revenue on a gross basis = 21,606,432 Xeon components

Intel DCG + NEX on channel AWP < COS < R&D < MG&A < restructuring on a net basis = 44,884,979 components

AMD Epyc net = 6,490,264 units; I've only scored AMD on a net basis.

Total servers of all types on a net basis: 72,941,675 components / 18 CPUs per rack = 4,052,315 racks, composed of 36,470,837 2P sleds.

On a gross basis, approximately 28,096,696 components (because I score AMD on a net basis regardless), which comes to 1,560,928 18-CPU racks, or 14,048,348 2P sleds.
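The rack and sled arithmetic above can be sanity-checked with a short script. This is only a sketch: the 18-CPUs-per-rack and 2-CPUs-per-sled figures are taken directly from the comment, and the helper name is mine.

```python
# Check the rack/sled arithmetic implied by the CPU component counts above.
# Assumes 18 CPUs per rack and 2 CPUs per 2P sled, per the comment.

def racks_and_sleds(components: int, cpus_per_rack: int = 18) -> tuple[int, int]:
    """Return (racks, 2P sleds) implied by a CPU component count."""
    racks = round(components / cpus_per_rack)  # rounded to the nearest whole rack
    sleds = components // 2                    # two CPUs per 2P sled
    return racks, sleds

print(racks_and_sleds(72_941_675))  # net basis   -> (4052315, 36470837)
print(racks_and_sleds(28_096_696))  # gross basis -> (1560928, 14048348)
```

Both results match the rack and sled counts stated in the comment.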

Then one has to ask why channel supply data has AMD and Intel head-to-head on current-generation share, and the answer is estimating on a net basis to keep track of bundle-deal sales-close incentives.

Mike Bruzzone, Camp Marketing

]]>
By: Mike Bruzzone https://www.nextplatform.com/2024/07/15/ongoing-saga-how-much-money-will-be-spent-on-ai-chips/#comment-227979 Tue, 16 Jul 2024 16:28:22 +0000 https://www.nextplatform.com/?p=144408#comment-227979 Bruzzone NVIDIA accelerator estimates, on channel supply and financial data:

A100 q1 2022 = 1,589,368 units
A100 q2 2022 = 1,348,612
A100 q3 2022 = 1,151,484
A100 q4 2022 = 948,259
H100 q1 2023 = 366,774
H800 q1 2023 = 103,449
H100 q2 2023 = 983,539
H800 q2 2023 = 245,885
H100 q3 2023 = 773,648
H800 q3 2023 = 190,355
L40_ q3 2023 = 296,223
H100 q4 2023 = 798,539
H800 q4 2023 = 196,479
L40_ q4 2023 = 305,753
GH 200 q1 2024 = 168,819
H100 q1 2024 = 685,033
H800 q1 2024 = 225,059
L40_ q1 2024 = 557,067

2022 = 5,037,724 units of A100
2023 = 4,260,644 units, broken out as:
H100 = 2,922,500
H800 = 736,168
L40_ = 601,976
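A quick cross-check of the quarterly figures against the stated annual totals (a sketch; figures transcribed from the list above, reading the H100 Q4 2023 entry as 798,539 and the L40 2023 total as 601,976, both of which the stated sums confirm):

```python
# Cross-check the 2023 annual totals against the quarterly unit figures above.
h100_2023 = [366_774, 983_539, 773_648, 798_539]
h800_2023 = [103_449, 245_885, 190_355, 196_479]
l40_2023 = [296_223, 305_753]  # only Q3 and Q4 are listed

assert sum(h100_2023) == 2_922_500
assert sum(h800_2023) == 736_168
assert sum(l40_2023) == 601_976
assert sum(h100_2023) + sum(h800_2023) + sum(l40_2023) == 4_260_644

# The 2022 A100 quarters sum to 5,037,723, one unit shy of the stated
# 5,037,724 -- presumably a rounding artifact in the source figures.
a100_2022 = [1_589_368, 1_348_612, 1_151_484, 948_259]
print(sum(a100_2022))  # 5037723
```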

I’m also working on consumer dGPU volume. Similar to the industry myth of 20 M so-said servers produced annually, which rests on the mass of 2017-through-2021 v2/v3/v4 volume pumped out as sales close [?] for their standard platform markets, the syndicate myth that conceals total Nvidia generation volume is about to be revealed.

Doing this exercise in due diligence, it has become apparent why Intel is continuing consumer dGPU field application validation, relying on Arc priced at cost. Monopoly substitutes aim for 20% of the market they enter because that’s what it takes to sustain a presence. Intel would not be interested in expending development resources on consumer dGPU when the second player's so-said annual volume averaged only 8.7 M units over the last 5 years.

Mike Bruzzone, Camp Marketing

]]>
By: Timothy Prickett Morgan https://www.nextplatform.com/2024/07/15/ongoing-saga-how-much-money-will-be-spent-on-ai-chips/#comment-227956 Tue, 16 Jul 2024 11:42:45 +0000 https://www.nextplatform.com/?p=144408#comment-227956 In reply to emerth.

It’s hard to say. But I do not believe the high cost of AI is sustainable as it is currently being done. The use cases are limited and the revenue streams are small. I do not use this stuff, and I do not envision I will. I like to do my own thinking. Not everyone feels this way, but these tools have to do a better job for a lot less money. If they do, then I can see a few hundred billion being spent a year on it, maybe even a trillion. But this is not like online shopping destroying small towns and malls — at least not yet. That seemed more inevitable back in the late 1990s, and it was.

]]>
By: emerth https://www.nextplatform.com/2024/07/15/ongoing-saga-how-much-money-will-be-spent-on-ai-chips/#comment-227916 Tue, 16 Jul 2024 02:44:07 +0000 https://www.nextplatform.com/?p=144408#comment-227916 In reply to Timothy Prickett Morgan.

That sounds lovely, Timothy, but what about the massive raspberry pending in the AI market? No prediction?

]]>
By: Timothy Prickett Morgan https://www.nextplatform.com/2024/07/15/ongoing-saga-how-much-money-will-be-spent-on-ai-chips/#comment-227913 Tue, 16 Jul 2024 01:59:20 +0000 https://www.nextplatform.com/?p=144408#comment-227913 In reply to John S.

I have similar instincts. And selfish ones, too. We have land, can grow food, hunt and fish. I have planted orchards and have a massive raspberry and blackberry garden. We have chickens and the neighbors have beef and a need for some hands to mend fences. That is not an accident, but it is also how I want to live out the final two quarters of my life. I will write as long as it makes money and does something useful, and when it doesn’t, I will turn to politics I guess….?

]]>
By: John S https://www.nextplatform.com/2024/07/15/ongoing-saga-how-much-money-will-be-spent-on-ai-chips/#comment-227865 Mon, 15 Jul 2024 21:09:16 +0000 https://www.nextplatform.com/?p=144408#comment-227865 So when is the inevitable correction going to happen on all this AI hype? I’m sure there will be some smart applications to come out of this at some point, but the hallucinations you get out of some models I’ve played with have been laughable. If you know anything at all about the subject in question, you’d be horrified. It _sounds_ good, but it’s just not based on reality.

So even though Gartner, IDC and others show numbers going up up up, we all know it’s going to crash back down at some point. It’s just a matter of when the correction happens and how hard. I lean towards more severe, but that’s the pessimist or curmudgeonly instincts kicking in.

]]>