
Culture War Roundup for the week of September 23, 2024

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


OpenAI To Become a For-Profit Company

You'll notice that the link is to a Hacker News thread. I did that intentionally, because I think some of the points raised there get to issues deeper than "hurr durr, Elon got burnt" or whatever.

Some points to consider:

  1. It is hard not to see this as a deliberate business-model hack. Start as a research-oriented non-profit so you can more easily acquire data, perhaps investors / funders, and a more favorable public image. Sam Altman spent a bunch of time on Capitol Hill last year and seemed to move with greater ease because of the whole "benefit to humanity" angle. Then, once you have acquired a bunch of market share this way, flip the money switch on. Also, there are a bunch of tax incentives for non-profits that make it easier to run in the early startup phase.

  2. I think this can be seen as a milestone for VC hype. The trope for VC investors is that they see every investment as "changing the world," but it's mostly a weird status-signaling mechanism. In reality, they care about the money, but also care about looking like they're being altruistic or, at least, oriented towards vague concepts of "change for the better." OpenAI was literally pitched as addressing an existential question for humanity. I guess they fixed AI alignment in the past week or something and now it's time, again, to flip the money switch. How much of VC is now totally divorced from real business fundamentals and is only about weird idea trading? Sure, it's always been like that to some extent, but I feel like the whole VC ecosystem is turning into a battle of posts on the LessWrong forums.

  3. How much of this is FTX-style nonsense, but without the outright fraud? Altman gives me similar vibes as SBF, with a little less bad-hygiene-autism. He probably smells nice, but is still weird as fuck. We know he was fired and rehired at OpenAI. A bunch (all?) of the cofounders have jumped ship recently. I don't necessarily see Enron/FTX/Theranos levels of plain lying, but how much of this is a venture-funding house of cards that ends with a 99% loss and a partial IP sale to Google or something?

I posted this comment well over a year ago, and I think it holds up:

I am not a Musk fanboy, but I'll say this: Elon Musk very transparently cares about the survival of humanity as humanity, and it is deeply present, down to a biological drive to reproduce his own genes. Musk openly worries about things like dropping birth rates, while also personally spotlighting his own rabbit-like reproductive efforts. Musk is clearly a guy who wants and expects his own genes to spread, last, and thrive in future generations. This is a rising-tides approach for humans. Musk has also signaled clearly against unnatural life extension.

“I certainly would like to maintain health for a longer period of time,” Musk told Insider. “But I am not afraid of dying. I think it would come as a relief.”

and

"Increasing quality of life for the aged is important, but increased lifespan, especially if cognitive impairment is not addressed, is not good for civilization."

Now, there is plenty that I, as a conservative, Christian, and Luddite, would readily fault in Musk (e.g. his affairs and divorces). But from this perspective Musk certainly has large overlap with a traditionally "ordered" view of civilization and human flourishing.

Altman, on the other hand, has no children, and as a gay man, never will have children inside of a traditional framework (yes, I am aware many (all?) of Musk's own children were IVF. I am no Musk fanboy).

I certainly hope this is just my bias showing, but I have greater fear for Altman types running the show than for Musk types, because they are a few extra steps removed from a stake in future civilization. We know that Musk wants to preserve humanity for his children and his grandchildren. Can we be sure that's any more than an abstract good for Altman?

I'd rather put my faith in Musk's own "selfish" genes, at the cost of knowing most of my descendants will eventually be his too, than in a bachelor, not driven by fecund sexual biology, doing cool tech.

With every child Musk pops out, his genetic future is more tightly intermingled with the rest of humanity's.

...

In either case, I don't know about AI x-risk. I am much more worried about 2cimerafa's economic collapse risk. But in both scenarios I am increasingly of a perspective that I'll cheekily describe as "You shouldn't get to have a decision on AI development unless you have young children". You don't have enough stake.

I have a growing distrust of those of you without bio-children who are eager or indifferent to building a successor race or exalting yourselves through immortal transhumanist fancies.

"You shouldn't get to have a decision on AI development unless you have young children". You don't have enough stake.

That strikes me as a remarkably arbitrary line in the sand to draw (besides being conveniently self-serving) - you can apply this to literally anything that is not 100% one-sided harmless improvement.

You shouldn't get to have a decision in education policy unless you have young children. You don't have enough stake.

You shouldn't get to have a decision in gov't spending unless you have young children. You don't have enough stake.

You shouldn't get to have a vote in popular elections unless you have young children. You don't have enough stake.

What is the relation of child-having to being more spiritually grounded and invested in the long-term wellbeing of the human race (the human race, not just their children)? I'm perfectly interested in the human race's wellbeing as it is, and I've certainly met a lot of shitty parents in my life.

I hope this isn't too uncharitable, but your argument strikes me less as a noble God-given task for families to uphold, and more as a cope for those who have settled in life and (understandably) do not wish to rock the boat more than necessary. I'm glad for you, but this does nothing to convince me you and yours are the prime candidates to hold the reins of power, especially over AI, where the cope seems more transparent than usual. Enjoy your life and let me try to improve mine.

(Childless incel/acc here, to be clear.)

What is the relation of child-having to being more spiritually grounded and invested in the long-term wellbeing of the human race (the human race, not just their children)?

Note, of course, that parents can also fall into stupid mental traps and failure modes. The position is that they are just somewhat less prone to them, since being invested in abstractions is not the same as being invested in something concrete. High-minded ideals can lead one down ridiculous paths - see EA's concern for shrimp.

In my experience, such positions tend to themselves be cope: one finds excuses for being a selfish hedonist ("Oh, I'm not having kids for the environment" totally has nothing to do with being a perpetual adolescent who can barely take care of themselves and has no interest in the world at large). People of every stripe and position will find reasons to justify that their choices are Good and Right, and will work to reshape reality to ensure that.