AI platforms as they currently exist are also incredibly fungible. To the average end user there is minimal difference between ChatGPT, Claude, Grok, etc. To an enthusiast, the main difference is how much they censor, evade, or try to dodge controversial prompts (Grok is by far the least annoying on that front). Burning billions on the cutting edge doesn't buy you any lasting advantage over eleventh-hour entrants who spend a tenth as much to produce something 90% as good and sell it to their customers at half the price. And the amount already being invested implies capturing a substantial share of consumer spending at some point in the future just to break even (the current spend already works out to roughly $30 per person on Earth per year).
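As a rough sanity check on that figure, here's a back-of-envelope sketch. It assumes a world population of about 8 billion and takes the $30 per person per year number above at face value; everything else follows from arithmetic:

```python
# Back-of-envelope check: what annual revenue does $30/person/year imply?
# Assumptions: ~8 billion people; the $30/person/year figure quoted above.
world_population = 8_000_000_000
per_person_per_year = 30  # USD

required_annual_revenue = world_population * per_person_per_year
print(f"Implied break-even revenue: ${required_annual_revenue / 1e9:.0f}B per year")
# -> Implied break-even revenue: $240B per year
```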
Nvidia itself will probably be fine, though that depends on whether the crash just hits the cutting edge or is so severe that it becomes hard to even cover the cost of serving prompts.