Culture War Roundup for the week of January 13, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Vibe-Changes and the Still-Misunderstood Freedom of the Press

Some headlines have formed around Biden's Farewell Address and his invocation of the phrase "tech-industrial complex". From the speech:

It is also clear that American leadership in technology is an unparalleled source of innovation that can transform lives. We see the same dangers in the concentration of technology, power and wealth.

You know, in his farewell address, President Eisenhower spoke of the dangers of the military-industrial complex. He warned us then about, and I quote, “The potential for the disastrous rise of misplaced power.” Six days — six decades later, I’m equally concerned about the potential rise of a tech-industrial complex that could pose real dangers for our country as well.

For those of us who are old enough, this typifies a yuge vibe change from the "90s consensus" that tech is magic, that it's a democratizing force, that it can only do good in the world, and that the only thing that could possibly stop the progress of history toward utopia would be the government putting even one single iota of regulation on it. Of course, this is not a vibe change that happened overnight. A lot has happened over the years: an insane proliferation of technology and connected devices, colossal increases in the number of users and usage rates, and the displacement and reorientation of entire industries. With that came the shift from "Web 1.0" to "Web 2.0", and folks can debate whether "Web 3.0" has crashed and burned ten feet off the launchpad or whether it's still just slowly picking up steam. With the rise of bitcoin making it easy to cash out on internet crime, there are probably only a few ideological holdouts who still think that the internet cannot possibly be touched, or that code is law, or whatever.

So, glossing over mountains of events that have happened in the past 20+ years, what are the President's biggest concerns?

Americans are being buried under an avalanche of misinformation and disinformation enabling the abuse of power. The free press is crumbling. Editors are disappearing. Social media is giving up on fact-checking. The truth is smothered by lies told for power and for profit. We must hold the social platforms accountable to protect our children, our families and our very democracy from the abuse of power. Meanwhile, artificial intelligence is the most consequential technology of our time, perhaps of all time.

Nothing offers more profound possibilities and risks for our economy, and our security, our society. For humanity. Artificial intelligence even has the potential to help us answer my call to end cancer as we know it. But unless safeguards are in place, A.I. could spawn new threats to our rights, our way of life, to our privacy, how we work, and how we protect our nation. We must make sure A.I. is safe and trustworthy and good for all humankind.

In the age of A.I., it’s more important than ever that the people must govern. And as the Land of Liberty, America — not China — must lead the world in the development of A.I.

I'll start with AI, only to quickly drop it. No one here needs a retread of those debates, which are all too familiar. I'll only call attention to the same point as above - the vibe is completely opposed to the old hands-off, let-it-be-what-it-will-be, surely-it-will-be-a-democratizing-force attitude. Almost no one thinks that AI code is law - that if, say, a public university RLHF'ed its way into getting a bot to discriminate against white people or conservatives or whatever, then that's just how the world is, nothing can be done, hands off the tech. The AI doomers are only an extreme example of how completely antiquated the old view is.

Similarly, for the main event, the President is very concerned about the core function of "information technology", which is to convey information. Make no mistake, this is a broadside on the core conception of what this stuff does, and it cannot be easily excised in some way. It is an acknowledgement that there can be power in tech, and to many, where there is power, there is something to be seized.

One of those industries that holds significant power, and which has been disrupted and displaced several times in history, is the press. The press itself was a disruptive technology, significantly affecting the old ways of scrolls, papyrus, stone carvings, etc. We saw the rise of radio and television before the internet. With that, I would like to once again claim that this view of freedom of the press gets the history entirely backwards:

That is, for centuries after the printing press was created, governments around the world went to great lengths to control its use. Examples are found in Acemoğlu and Robinson. Private entities or companies would operate a printing press, and regular people could interact with these operators in a routine way: if they wanted to print up a pamphlet to hand out about their views, or a newspaper, or something, they would go to the printer, submit what they wanted to have printed, pay them however much money, then come back and receive their product after it was printed. Much the same as today, you could say that those private entities had some rights of their own to do business, and they might refuse to print something if they really disagreed with it (they didn't have to bake the cake or make the website; they could ban the local Alex Jones, or whatever analogy you want). So what did governments do? They pressured press operators to adopt criteria that the government found favorable. Maybe they'd even issue local monopolies and say that only so-and-so had the right to run a press in a particular area. Of course, the guys they picked always somehow knew what sets of views they needed to have (and which they needed to refuse to print) in order to keep their license and continue making bank.

As countries became more liberal democratic, they realized that this was a problem. Some countries kept the monopolies but passed pretty strict non-discrimination laws, saying that the printers had to just print whatever the customers wanted; no letting pro-monarchists print their pamphlets while rejecting revolutionary pamphlets. Others, like the US, passed freedom of the press provisions, simply saying that the government needed to stay TF away from press operators; no monopolies, no threats of shutting them down if they don't toe the party line, just leave Britney press operators alone. All of them - whoever wanted to could just buy a press and print.

So therein lies the contradiction. One cannot simply leave the entire internet alone; extorting someone via IP is not conceptually different from doing so by voice. But gobs and gobs of the internet's core purpose is to simply convey information, as one would have in the past by going to the local printer and then handing out pamphlets. It seems that people really want to break this centuries-old consensus, just as the 90s consensus has crumbled. What's messed up about it is that they want to break that consensus in the name of that consensus. It's as if, since no one seems to remember what a physical printing press is, you can just call whatever you want "the free press", and no one will bat an eye.

Is there a steelman? Possibly. The President talked about editing, facts, and lies. Perhaps one can just slightly tweak his speech to say, "Libel law is crumbling," and that fixes the glitch. Indeed, it would be conceptually coherent this way, but who's going to raise their hand to sign up for that? Anyone? Bueller? Bueller? The nightmare of trying to wade through concepts of "misinformation", "disinformation", "malinformation", etc. is too scary, and the well is too poisoned to have any hope of bipartisan agreement to bring libel law up to the task of the internet. In fact, even just this morning, I listened to the oral argument from a case that was in the Supreme Court on Tuesday, where the entire issue was the fine-grained distinction between a "false statement" and one that is "misleading but not false". These arguments happen, probably have to happen sometimes, but are, for the most part, relatively rare. Anyone who wants to Make Libel Law Great Again in order to "fix" the internet has a monumental task in front of them. I don't know how they'd do it. The only thing I know is that continuing to propagate the misinformation that this is about "the free press" is going to occasionally cause me to write a far-too-long, far-too-annoyed comment.

The AI doomers are only an extreme example of how completely antiquated the old view is.

Can you elaborate? What do you think the doomer position is?

Eliezer "Melt the GPUs and Nuke Conventionally Bomb Chinese Data Centers" Yudkowsky thinks AI will literally end life on earth.

Right. But what's the new consensus? That AIs will take all our jobs, including the police and military, and we'll all live happily off UBI while contributing nothing, and no one's ever going to take our stuff away?

Re-read the article. He did not advocate nuking rogue datacenters. He advocated conventional airstrikes, even if the datacenter was located inside a nuclear power like China and even if doing so therefore carried some risk of nuclear war. And even that was in the context of an international agreement to stop AI research, because there is no point in destroying Chinese datacenters if your own datacenters are charging full-steam ahead. It doesn't matter who builds the AI; we are all going to die.

In the spirit of accuracy I will edit my comment.

I would shorthand it as, "If we don't heavily regulate this tech, possibly to the point of completely smothering it in the cradle, it won't just be bad; it'll destroy us."

Got it. But what's the new, non-outdated consensus?