This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
AGI Was Never Going To Kill Us Because Suicide Happens At The End of Doomscrolling
I'll go ahead and call this the peak of AI version one-dot-oh
The headline reads "OpenAI Is Preparing to Launch a Social App for AI-Generated Videos." People will, I guess, be able to share AI-generated videos with their friends (and who doesn't have THE ALGO as a friend). Awesome. This comes on the heels of the introduction of live ads within OpenAI's ChatGPT.
Some of us were waiting for The Matrix. I know I've always wanted to learn Kung Fu. Others of us were sharpening our pointy sticks so that when the paperclip machine came, we'd be ready. Most of us just want to look forward to spending a quiet evening with AI Waifu before we initiate her kink.exe module.
But we'll never get there. Because Silicon Valley just can't help itself. Hockey sticks and rocketships. Series E-F-G. If I can just get 5 million more Americans addicted to my app, I can buy a new yacht made completely out of bitcoin.
I am a daily "AI" user and I still have very high hopes. My current operating theory is that a combination of whatever the MCP protocol eventually settles into, plus agents trading some sort of crypto or stablecoin, will create a kind of autonomous, goal-seeking economy. It will be sandboxed, but with (semi) real money. I don't think we humans will use it to actually drive the global economy, but rather as a kind of just-over-the-horizon global prediction market. Think of it as a way to have seen 2008 coming in 2006. I was also looking forward to a team of maybe 10 people building a legit billion-dollar company, and that paving the way for groups of 3-5 friends running thousands of $10-$50 million companies. No more corporate grind if you're willing to take a little risk and team up with some people you work well with. No bullshit VC games, just ship the damn thing.
And I think these things are still possible, but I also now think the pure consumer backlash to this Silicon Valley lobotomy of AI could be very much Dot-Com-2-point-O. The normies at my watering hole are making jokes about AI slop. Instead of "lol I doomscrolled until 3 am again," people are swapping stories about popping in old DVDs so they can escape the ads and the subscription fatigue.
Culturally, this could be great. Maybe the damn kids will go outside and touch some grass. But in terms of advancing the frontier of human-digital knowledge, it seems like we're going to trade it in early, not even for unlimited weird porn, but for the pink-haired anime cat videos that my aunt likes.
What consumer backlash? For every reddit post about how AI is terrible, there are probably 100 people quietly enjoying ChatGPT and finding it convenient, 10 people gooning to physically impossible pornography or degen ERP, and 30 people enjoying the funny AI cat video that chops up and cooks other animals...
Many consumers say they hate Facebook ad-slop, Microsoft's persistent disregard for consent in Windows updates, Google spying on you and the crap Google algorithm, and TikTok brainrot short-form video.
But these companies are making huge amounts of money. Trump and Larry Ellison aren't trying to secure TikTok because short-form video is unpopular; quite the opposite, TikTok is making billions. It's high-status to say TikTok is slop. I think portrait video was a mistake, and I furiously swipe away YouTube Shorts whenever I see them... but it's clearly very popular.
If we just read what consumers say and what the media highlights, we'd assume Facebook was near bankruptcy. They're constantly getting fined and called into Congress; "delete facebook and hit the gym" is an ancient meme at this point; billions have been shoveled into VR with no returns; their Llama AI models have been shit; everyone thinks of it as a website for boomers; people blame them for everything from loneliness to anorexia to genocide in Myanmar... But no, Facebook is making gigantic profits, and those profits are rising fast. Money >>> talk. AI is paying off massively for Facebook in the unsexy ad algorithms that nobody talks about. They can easily pay for these huge capital investments; profits are up even as they spend more and more!
OpenAI is making 42% margins on inference; they want to grow the inference market, and this is a natural route to take. 42% margins with such a big free tier is insane. Research is the expensive part, not inference. AI research is clearly important: Facebook and TikTok prove there are fortunes to be made. LLMs and generative AI are also lucrative, though their R&D is resource-intensive compared to classic deep learning. But the promise of mechanizing intellectual labour is incredibly seductive, and the big players are not going to slow down here. The market for LLMs is awkward: they're so immensely powerful and valuable that furious competition is driving prices down, while the market is also still immature and yet to be developed, so revenue is starting off small (but growing very quickly).
Unlimited weird porn and anime cat videos are going to accelerate techno-capital, not slow things down.
You and I had a version of this conversation last month, I think. And I know I still think you're wrong.
First, what's your source for the 42% margin on inference? Second, forget about the source; I'll make the same point I did before: separating out margins across different segments of your business is a great way to obfuscate a failing business. This is the same as saying "Why am I broke? I never, ever break my budget for food! And don't worry about the $10k a month I'm spending at the casino." If research and model development cost more than your inference margins bring in (and your inference margins aren't somehow greater than 100%), you're losing money.
https://x.com/kimmonismus/status/1973120175470944615
Says it right here; the source is a paywalled article.
Revenue: $4.3B in H1 2025. Cost of revenue: $2.5B in H1 2025. Do the maths: margins are 42%.
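As a sanity check, the arithmetic from those two figures works out (a back-of-the-envelope sketch using only the tweet's numbers, not audited financials):

```python
# Gross margin on inference, per the H1 2025 figures quoted above.
revenue = 4.3e9          # $4.3B revenue
cost_of_revenue = 2.5e9  # $2.5B cost of revenue (server rental / inference compute)

gross_margin = (revenue - cost_of_revenue) / revenue
print(f"{gross_margin:.0%}")  # prints 42%
```

Note this is a gross margin on serving costs only; it says nothing about R&D spend, which is the crux of the disagreement.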
I never see a source for these claims that inference costs are higher than what is charged to customers, but people keep saying it, in spite of the fact that it violates basic rules of economics.
Companies are allowed to make losses investing in R&D for new products. This profit-brained beancounter mindset is why the West has been declining, in a nutshell. If you don't invest aggressively, how are you going to innovate? R&D and capital deepening is the source of prosperity.
It's very reasonable to expect there will be all kinds of lucrative offshoots from LLM research, just like how deep learning is staggeringly, ludicrously profitable, that's why these big companies are investing so much. The technology is fundamentally very promising and is worth investing in.
From your own link:
I think the 42% margin you're talking about is the server rental cost, which that Twitter post calls "cost of revenue"... but that's not how operating margin works.
I'm not talking about operating margin, I'm talking about inference margin, where the server rental is the cost of production.
The operating loss is due to research. Research is the basis of all modern technology, and companies should be doing more of it. It's inappropriate to compare it to casino spending, as in your comment above.