This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I'm pretty sure no one involved in the process actually said "Our goal is to make a bad film". I'm pretty sure a lot of people involved in the process were trying as hard as they possibly could to make a blockbuster. Maybe all of them. And again, they had orders of magnitude more technology than Walt Disney had, but the technology didn't actually solve the problem of making a good movie even a little bit.
Just so. Humans inevitably human, for good or ill. They'll human with sticks and rocks, and they'll human just as hard with nanocircuitry and orbital launch vehicles and nuclear fusion.
Are you familiar with Bostrom's Vulnerable World Hypothesis? If not, I'd recommend it. The standard assumption is that tech advancements proceed in a stable fashion, that the increase in individual/breaking power is balanced by an increase in communal/binding power. I don't think that assumption holds, not only for future tech but very likely for tech that already exists. What we have available to us at this moment is probably enough to crash society as we know it; all that is required is for the dice to come up snake-eyes. Adding more tech just means we roll more dice. Maybe, as you say, some future development jacks the binding power up and we get a stable dystopia, but honestly I'd prefer collapse.
You're correct that we bounced back from the Black Death and so on. But consider something like Bostrom's "easy nukes" example. There, the threat is baked into the tech itself. There's no practical way to defend against it, and no practical way to live with it. You can suppress the knowledge, likely at grievous cost, but the longer you keep it suppressed, the more likely it is that someone rediscovers it independently. Bostrom's example is, of course, a parable about AI, because he's a Rationalist and AI parables are what Rationalists do. It seems to me, though, that their Kurzweilian origins deny them the perspective needed to see the other ways the shining future might be derailed.