Culture War Roundup for the week of December 11, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


People won’t accept rule by a techno-king and would kill someone who tried to force it. The average person thinks of Steve Jobs, the most widely respected tech figure, as a weird nerd who made their phone a bit better. People would turn on him the minute he tried to be more than a friendly salesman. And the average tech person is much closer to Zuckerberg than to Jobs in terms of likability and charisma, and look at how much the public respects big Z.

And no, money won’t implicitly do it. The impact that a trillionaire will have on my life is basically null except indirectly through whatever makes him a trillionaire (eg if Bezos becomes a trillionaire the biggest impact he’d have on anybody is the $1000 worth of value that Amazon brought to everyone’s life.) You can’t really convert money into power in a functioning society beyond the incredibly limited scope of making your own life comfier.

You can’t really convert money into power in a functioning society beyond the incredibly limited scope of making your own life comfier.

Bribery and influence? If money can't buy power, somebody should tell big corporations and they'd sack all their lobbyists. Wealth is power and always has been. The sinews of war are infinite money.

What the masses want or like is totally inconsequential. What matters is how many competent people are ready to be the ruling class of a techno-king and if they have means of displacing the current rulers.

I wager that number is big enough already given how popular e/acc is and how unpopular the managers have become.

But respect from "people" is the bottom tier predictor of political influence, much lower than how rich you are or how many men you command.

how popular e/acc has become

This is either a bad larp or you need to get out of your bubble. When you show anyone outside of a tiny navel-gazing tech bubble what e/acc is, it induces extreme cringe, and the libertarian mechs piloted by a squad of John Galts are at least a few decades away.

tiny navel-gazing tech bubble

Sadly, they're the ones busily instantiating the Machine God. I think that lets them punch a tad above their weight class.

Groups that believe they are summoning a god are usually called cults and are generally ignored or disliked. In fiction, when they're right and do manage to manifest a god, it rarely goes well for them.

Being liked or disliked isn't particularly relevant when nobody goes to the trouble of stopping them, at least not yet. I'm sure someone will eventually try.

In fiction, when they're right and do manage to manifest a god, it rarely goes well for them.

Insert standard response to "updating from fictional evidence".

I don't claim it's necessarily going to go well, just that I expect it to be more likely than not, but e/accs are the kind of idiots who don't mind being disassembled by ASI, and some, Beff included, would consider that to be a good thing. All we agree on is that creating such an entity is possible.

e/acc has hugely negative net favorables, similar to Satanism. In both cases, it's mostly internet edgelords who claim to want it.

Hotz, Buterin and Balaji, though certainly extremely online people, don't strike me as edgelords.

They're radicals. Which is a completely different thing.

As Dase covered in his event post, Buterin isn't e/acc, he's d/acc, a more sane approach that wants to maximize technological progress while minimizing niggling issues like AInotkilleveryone. I subscribe to the same.

I'm aware, I'm also more sympathetic to that approach. I'm just speaking broadly about these watered down versions of accelerationism finally filtering through to techies in a way that might result in a circulation of elites.

I have never heard any of these names before. Unless the Hotz you refer to is George Hotz, but I only know his name because of the PS3 reverse engineering suit.

George Hotz is indeed Geohot, the PS3 hacker, but he did quite a few other things after that, mostly self-driving and AI related. And he generally has a big platform for commenting on tech from the techie point of view.

Vitalik Buterin is an early bitcoiner best known for co-founding Ethereum and originating large parts of the smart contract ecosystem, and of the conceptual frame for blockchain as a technology in general. If you've ever heard of crypto, chances are the two names you recall are Satoshi's and his.

Balaji Srinivasan is the former CTO of Coinbase but is probably best known for his political activism and ideological work around the idea of a Network State, as theorized in his eponymous book building on the work of Albert Hirschman.

given how popular e/acc is

Wildly unpopular. An approval rating at or below the Lizardman's Constant.

how unpopular the managers have become

Unpopular for sure. But in the curious way that gets almost all of them reelected every election.

I thought the rest of my comment would be a big enough hint that the only measure of popularity that matters, in my view, is popularity among the 10% of the population that even has a shot at becoming elite, with a higher weight on the 1 to 2% that already are.

The Bolsheviks were lizardmen too, but power is not equally distributed in the population.

It matters a great deal that Musk, Buterin, Hotz and Thiel are talking publicly about accelerationism being something instrumental (as opposed to something only philosophy nerds know about) and coordinating along those lines.

It doesn't matter in the slightest that Joe Schmoe on the street doesn't even know what that means or is at best afraid of it. Much like it didn't matter in the slightest what Vasya Pupkin thought of Bolshevism.

The Bolsheviks were not Lizardman-tier popular. They got 23% of the vote in the Constituent Assembly elections, which were the closest approximation to a free election in post-Tsarist Russia (they happened after the October Revolution but before the Bolsheviks consolidated power outside Petrograd), including majorities among the voters they were targeting (the urban working class in Petrograd and Moscow, and the enlisted men in the armed forces), and had already demonstrated enough support in the military to defeat the Kornilov coup. It may not have mattered what Vasya Pupkin thought, but given how things played out it sure as hell mattered what Private Ivan Ivanovich thought.

e/acc and the Tech Right do not have the level of elite or popular support they need to take over through anything that looks like politics - at the point Hanania wrote the linked essay, the people he was lauding as the future of right-wing politics were backing DeSantis. In so far as Balaji Srinivasan had a political project it was based on crypto-enabled exit, and it has already failed - it is now very obvious that the world is a sufficiently dangerous place that you want to be under the protection of a US or Chinese client regime if you own anything worth stealing, particularly if you own anything the Feds or Chicoms would want to steal themselves. The Dominic Cummings project (given the people involved, probably supported in the background by Peter Thiel and Robert Mercer) to replace Trump with someone the Tech Right see as competent and biddable ended up not happening, and if it had happened it would have needed "orbital mind control lasers complete" persuasion tech to beat Trump in a Republican primary.

The only way an e/acc is becoming a technoking is if they build a superintelligent AI and use it to pwn their opponents. And then our technoking will be the man (or possibly transwoman, but definitely someone with a Y chromosome and penis) who is de facto goal-donor to the AI (who is probably a first-tier tech lead who can ignore the de jure top management of his organisation until the AI is turned on, and pwn them afterwards). The only gold-owners who are sufficiently close to top technical AI teams to be de facto goal-donors are Sam Altman and Demis Hassabis. If the AI-enabled technoking isn't one of those two then it is some wunderkind we haven't met yet.

The Bolsheviks were not Lizardman-tier popular. They got 23% of the vote in the Constituent Assembly elections

Isn't it a bit facile to measure it at the time they took power instead of, say, in 1903, when the term was coined and they were the minority in the RSDLP (despite being called the majority)? Or we could even start at the inception of the RSDLP, when Russia was still so agrarian that the industrial workers as a whole were lizardmen at a whopping 3% of the population. That seems like the more apt comparison with a movement for the technocapitalist class.

In so far as Balaji Srinivasan had a political project it was based on crypto-enabled exit, and it has already failed

[...]

The only way an e/acc is becoming a technoking is if they build a superintelligent AI and use it to pwn their opponents.

I suppose if we disagree about premises, we can only disagree about conclusions. I think this analysis is about as nonsensical as you probably think mine is.

I think Starlink is a million times more consequential than AI anything. I think superintelligence is a made-up, ill-defined fantasy. I think crypto-enabled exit is coming along just fine.

it is some wunderkind we haven't met yet

We can at least agree on that being the most likely scenario.

How much power does Hassabis actually have within Google?

I am arguing that bureaucratic power within Alphabet Inc. is less relevant than you might think - the question is, at the moment when (and if) Google DeepMind turns on a superintelligent AI, who is the person actually setting the direction of the team that determines what said AI does. And I think that person is likely to be Demis Hassabis, or someone even lower down the org chart. If Sundar Pichai or Larry Page goes to Demis and tells him to build an AI that is programmed to make Larry technoking of the galaxy, and Demis goes and builds an AI programmed to make Demis technoking instead, I don't see how Larry finds out what is going on soon enough to make a difference.

If Sundar Pichai or Larry Page goes to Demis and tells him to build an AI that is programmed to make Larry technoking of the galaxy, and Demis goes and builds an AI programmed to make Demis technoking instead, I don't see how Larry finds out what is going on soon enough to make a difference.

It depends on how malleable the AGI is, how many people have direct access to those levers, and how much auditing/permission is required to modify its goals.

Current RLHF takes time; you don't simply append "be helpful and harmless" to a system prompt and immediately finish your job. Even accounting for sweeping algorithmic changes and significantly faster turnover, I expect the people designing and implementing a superhuman AGI to have something in the way of checks and balances for making sure that updates pushed to production are approved by an oversight committee, not just one dude, no matter how senior.

Like, these people are not stupid; if I can predict the risk from the head scientist telling the AI to hand over the keys to the kingdom, so can they. Let alone all the genius-level AGIs or narrowly superhuman AIs implemented along the way.

There are potentially ways around this, like planting undetectable backdoors in a model's training data (detecting such backdoors is believed to be intractable without first making cryptography trivial), but these aren't possible to pull off without a great deal of premeditation, or without a risk of whistleblowing.

I put far more stock in a small group/cabal controlling an AGI than in any one person, unless the company is under private ownership by one person, like in the case of Musk. And Grok, bless its heart, isn't SOTA, nor do I put particularly high odds on xAI overtaking the incumbents at OAI/DM/Anthropic.

would kill

*would try

I wish them the best against someone who has an ASI at their beck and call.