Culture War Roundup for the week of November 13, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

OpenAI announces leadership transition

The board of directors of OpenAI, Inc., the 501(c)(3) that acts as the overall governing body for all OpenAI activities, today announced that Sam Altman will depart as CEO and leave the board of directors. Mira Murati, the company’s chief technology officer, will serve as interim CEO, effective immediately.

A member of OpenAI’s leadership team for five years, Mira has played a critical role in OpenAI’s evolution into a global AI leader. She brings a unique skill set, understanding of the company’s values, operations, and business, and already leads the company’s research, product, and safety functions. Given her long tenure and close engagement with all aspects of the company, including her experience in AI governance and policy, the board believes she is uniquely qualified for the role and anticipates a seamless transition while it conducts a formal search for a permanent CEO.

Mr. Altman’s departure follows a deliberative review process by the board, which concluded that he was not consistently candid in his communications with the board, hindering its ability to exercise its responsibilities. The board no longer has confidence in his ability to continue leading OpenAI.

In a statement, the board of directors said: “OpenAI was deliberately structured to advance our mission: to ensure that artificial general intelligence benefits all humanity. The board remains fully committed to serving this mission. We are grateful for Sam’s many contributions to the founding and growth of OpenAI. At the same time, we believe new leadership is necessary as we move forward. As the leader of the company’s research, product, and safety functions, Mira is exceptionally qualified to step into the role of interim CEO. We have the utmost confidence in her ability to lead OpenAI during this transition period.”

I posted this on Twitter, and someone speculated that it's because Altman paused subscriptions on Tuesday, but that alone would seem like a pretty inconsequential reason for such a major move.

Brian Armstrong just tweeted that $80 billion of company value has 'evaporated'. I'm thinking, 'what?' You realize there is still an immensely popular product with millions of users? A few departures, even of key people, does not change that. I don't get it... it's like social media compels otherwise rational, smart people to make incendiary statements for attention. It's not like creating 'the next OpenAI' will be easy; look how hard Google has struggled despite unlimited resources and PR. I don't know how much valuation has been lost, but if the company were public, this news would probably mean a 10-20% decline in share price on Monday. Bad, but not critical at all. I agree he's right about wokeness, but as we've seen with the huge success of Silicon Valley tech companies, wokeness is evidently not a hindrance to success, however much he may dislike it.

I also observed how important news and events always seem to land on Friday-Saturday, for example:

  • Gaza conflict

  • Starship launch

  • OpenAI board upheaval

  • Twitter advertisers defecting over an alleged antisemitic post by Elon

For some reason, this 24-hour window from Friday morning to Saturday morning always seems to pack in a lot of news.

Now that the dust seems to be settling, it looks like a coup by the more nonprofit-focused board members and executives against the guys like Sam and Microsoft who wanted to build a company with real shareholder value. The talent might be there, but there's no guarantee that the will to put out products to the marketplace is still there. Microsoft's equity might be worth pennies on the dollar if OpenAI leadership refuses to ship state-of-the-art technology from now on.

EDIT: If you want proof that Microsoft is scared shitless, take a gander at this breaking report: OpenAI board in discussions with Sam Altman to return as CEO.

Now that the dust seems to be settling, it looks like a coup by the more nonprofit-focused board members and executives against the guys like Sam and Microsoft who wanted to build a company with real shareholder value.

If true, then this sounds like the board doing its job. Even if the result is to kill OpenAI entirely, that would still be closer to the mission than what had been going on. That said, I'm still waiting to see what the real result will be.

Sounds like a value alignment problem. Could value alignment be a fundamentally intractable problem with intelligent actors?

Yeah. The issue with all alignment talk is that the sect of people who'd align the ASI almost certainly have a set of values that are every bit as opposed to mine as whatever random set of values an ASI would land on, if not more so. Sure, at least some of the unaligned AI values would involve paperclipping me and the universe, but even that's better than "keep everyone around, but use my unlimited power to align everyone with my values for eternity."

"Make a shit ton of money by building tools that people find highly useful and economically valuable" is at least a comprehensible value and something I can work with, since it allows space to other value systems.