This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
There's been a fair amount of discussion of America's military aid to Ukraine, and no few condemnations of those of us who have opposed that aid. I am informed, in fact, that this forum is overrun with Russian Propaganda, such that some no longer wish to participate. This is lamentable if true, so I thought it might help to prompt some elaboration of the pro-Ukraine case.
For those who support aid to Ukraine, in whatever form: suppose that you personally are given complete control over the US government, and can set policy however you wish. What would your answers be to the following questions?
How much aid would you provide? Weapons? Money? No-Fly Zone? Air support? Troops on the ground? Nuclear umbrella? Something else?
What is the end-state your policy is aiming for? A ceasefire? Deter subsequent Russian invasion? Restoration of Ukraine's original borders? The Russian army destroyed? Putin deposed? Russia broken up? Something else?
Is there an end-state or a potential event in the war that you think would falsify your understanding of the war, and convince you that providing aid was a bad idea? Another way of putting it is, do you think your views on the Ukraine war are falsifiable, and if so, what evidence would be sufficient for you to consider it falsified?
...Reading comments from those arguing for Ukraine, I've noted from the start that many of the arguments presented in favor of aid appear to be mutually exclusive. In this most recent discussion, I've seen some people arguing that we should be sending in US or NATO troops, and others arguing that of course no US or NATO troops are needed and that sending them would be obviously crazy. This is a natural consequence of many people arguing many points of view in one forum, but it seems helpful for people to lay out their own views when possible; often, these positions are simply stated as though they should be obviously true.
If the doubt over Russian Propaganda is the basis of raising questions, then you have an odd way of going about clarifying the potentially lamentable situation.
For example, your first 8 questions are-
Set aside that many of these are badly structured questions in and of themselves. More importantly, what possible answer could any of these questions provide from a pro-Ukrainian supporter that would or would not illuminate the prevalence of Russian Propaganda in this forum?
If your theory is that the forumites answering are only saying [Big Amount] because of Russian Propaganda on this forum, as opposed to other sources of information or non-Russian propaganda, would [Smaller Amount] provide any meaningful contrast? If a pro-Ukrainian says 'no, I don't support a No-Fly Zone,' does that... mean anything about forum propaganda that a pro-Ukrainian saying 'yes, I do support a No-Fly Zone' would not?
If not, why are you asking the question as a means to gain elaboration about the thing that would be lamentable-if-true? Where is the light? A 'just asking questions' inverse gish gallop- demanding a host of extensive answers from low-effort questions- is certainly a technique, but it is a technique far better suited to generating heat, particularly given the ease with which anti-supporters can selectively pick at the answers they find most vulnerable while ignoring the rest, without having to justify (or provide) their own views on the same questions.
Moreover- and this is an even more basic structural question- why aim the question at the supporters of Ukraine aid, as opposed to the opponents of Ukrainian aid?
Again, if your stated basis of concern is to be believed, then the best way to evaluate the prevalence of Russian Propaganda on the forum is to start with the survey audience most likely to have adopted Russian propaganda. Then you could ask questions where you know what the Russian propaganda-line answer would be, and compare the answers that align with the Russian propaganda line against those that differ. This, in turn, would allow comparisons and contrasts, showing how much of the opposition to a thing is due to reasons aligned with propaganda versus reasons that are not.
This wouldn't be the end of a genuine search for truth, of course, as not all capital-P Propaganda is false. Sometimes Propaganda boosts rather than causes the narratives of the target audience. Independent convergence is a thing. But you would at least have chosen a more appropriate survey audience.
And this is without poisoning the well with a bad question like-
Yes, this is a well-poisoning question. We are both aware we are on a rationalist-adjacent forum where intellectual integrity is prioritized. Most of the audience has internalized the principle that good theories need to be falsifiable, because non-falsifiability is itself evidence that no amount of evidence or reason can reverse the premise, which is Bad in a community that prioritizes pursuing Truth and Reason. This is also a cornerstone of the scientific process, practically an axiom of rationalist discourse, and anyone reading this can recognize the follow-on insinuation: if someone says no, they are being unreasonable, since reason demands being open to being disproven (falsification), and if they are being unreasonable in this, they are being unreasonable in the rest as well.
What not everyone will recognize is that you are not only asking a leading question, or leading with a variation of hindsight bias (the mental error of looking back and thinking an eventual result was obvious all along), but insinuating an axiom outside of its appropriate paradigm. The Ukraine War is not a laboratory hypothesis for the scientific method. It is a strategic conflict between multiple actors, which means the paradigms of strategy and game theory apply.
And because fewer people are familiar with those paradigms than with rationalist-adjacent forum norms or maxims regarding the scientific method, they wouldn't recognize that the premise of the question doesn't make sense- or even that 'yes' is the irrational answer that should invite doubt. Not just because saying yes would reveal susceptibility to the hindsight-bias invitation- the validity/soundness of a decision rests on the information at hand at the time, not information unavailable to the decider- but on the matter of the strategic paradigm itself.
Outcomes do not falsify strategies.
Outcomes are results, but results are not a strategy, nor are strategies predictive devices in and of themselves. Strategies entail predictions, but equating the two is a fallacy of composition: believing that what is true of a part of a thing is true of the whole. Even ignoring that fallacy, believing that results falsify the process (the strategy) that leads to them is a first-order mistake. It is a common mistake, particularly among the sort of people who believe that a strategy that fails is axiomatically a falsified strategy, but it is a bad axiom. And as with bad axioms in any field, anyone whose theoretical understanding rests on them is building that understanding on poor foundations, whether they acknowledge it as an axiom or not.
This is much easier to see when politically loaded topics are substituted with less political ones, which can be done with some basic substitution to produce less politically contentious analogies that rest on the same argument structure and the same axiom of outcome-falsifies-strategy.
For example, this-
-could be represented for an axiom test as-
-because the axiom has to apply to all models for it to apply as an axiom. The Ukraine War, in turn, is a [high-stakes process], and events including but not limited to how the war ends are outcomes of that process.
However, the axiom-test is just as valid if applied to-
-because despite flipping the 'engage with high-stakes process' with a 'not engage in high-stakes process,' the axiom of outcome-falsifies-strategy does not depend on what the actual strategy is. That is why it is an axiom, as opposed to a special pleading rhetorical device / argument-as-soldier to advance the strategist's favored (or dis-favored) positions.
Now consider this in a less ideologically charged- or propagandized- process where a strategic paradigm applies... like gambling, which has substantial literature overlap with issues of strategic competition, risk management, and so on. The field of game theory was so named because of these meta-analytic commonalities, and it has been foundational to the study of international relations and conflict, including the Realist school of theorists.
Now just as we substituted [Ukraine War] for [high-stakes process] for the purpose of the axiom, we can substitute [high-stakes process] for something like...
Somehow, it is less pejorative if someone says 'no, there is no result from gambling with my life's savings that would falsify my understanding that gambling with my life's savings is a bad idea, and convince me that my strategy of not gambling with my life's savings was a bad idea.'
And that somehow is because people will reflexively drop both axioms- that outcomes falsify strategies, and that they must be willing to falsify their theories of strategy to be logical and reasonable- on topics of strategy or risk. Not least because they are not valid axioms on topics of strategy and risk, which is really not surprising to anyone with notable experience in games of strategy with uncontrollable factors, or in risk management.
As anyone with experience in games of cards or dice should know, even dominant strategies don't always win. Sometimes the other party has a better hand, and statistically improbable things happen all the time. Similarly, there are times when high-risk/high-reward strategies are the only plausible strategies for [victory], whether because you're making up a deficit or because playing safe is a guaranteed loss. There is no axiom-level 'your strategy is falsified if you lose' principle in play- there are only disputes about the sort of assumptions that should go into evaluating the tradeoffs. And, as with most subjective value systems, there is no axiom that people must agree on shared valuations, or defend their own valuation system against yours, regardless of whether you find it convincing. The player who tries their hardest to maximize their probable draws or avoid a lose condition, and the player who throws hands to keep a weaker player in the game because [reasons], are playing fundamentally different games for what [victory] even is, even as they play by the same nominal rules.
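The point that even a clearly dominant strategy can lose any given contest is easy to illustrate numerically. A minimal sketch in Python- the win probabilities and series length here are invented purely for illustration, not drawn from anything about the war:

```python
import random

def series_wins(win_prob, rounds, rng):
    """Count how many rounds a strategy wins in one short series."""
    return sum(rng.random() < win_prob for _ in range(rounds))

# Hypothetical numbers: the 'good' strategy wins 60% of individual
# rounds in expectation, the 'bad' strategy only 40%.
rng = random.Random(0)
trials = 10_000
rounds = 5  # a short best-of-five series, like one night of cards

good_losses = sum(series_wins(0.6, rounds, rng) < 3 for _ in range(trials))
bad_wins = sum(series_wins(0.4, rounds, rng) >= 3 for _ in range(trials))

# The better strategy still loses a best-of-five roughly 32% of the
# time (exactly P(X <= 2) for X ~ Binomial(5, 0.6) = 0.31744), and the
# worse strategy wins just as often.
print(f"good strategy lost the series in {good_losses / trials:.1%} of trials")
print(f"bad strategy won the series in {bad_wins / trials:.1%} of trials")
```

The simulation just restates the binomial arithmetic: a strategy that is unambiguously better in expectation still loses a short series nearly a third of the time, so the outcome of any one series cannot 'falsify' the choice of strategy- only the distribution over many plays can distinguish them.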
Similarly, a strategy of risk management is not disproven if the risk manifests regardless. Most forms of risk mitigation are about reducing the frequency of a negative outcome, not making it totally impossible, and so it could well be worse to change strategy after a bad outcome than to stay the course. Again, examples are not hard to find. The negative consequences of false positives driving unnecessary (and sometimes deadly) medical interventions do not disprove the value of screening for deadly diseases at a point where the risk of surgery can pre-empt the risk of no intervention. If someone asked what sort of outcome from a car crash would make someone falsify their belief that seat belts are a good idea, they would be looked at with suspicion, and rightly so, regardless of whether they were a bad-faith actor for [special interest group against seat belt laws] or merely incompetent. Nor are the suspicious being irrational or unreasonable for declining to engage with a question about what sort of result of [car accident] would make them falsify their strategy of [maybe we should make seat belts mandatory by law], even if there are possible negative outcomes like malefactors trying to race away from the cops and running over children- and won't someone think of the children?
Because, again, outcomes do not falsify strategies. Even emotionally-evocative outcomes chosen to try and shift the strategic choice.
Outcomes can justify a re-evaluation of strategic assessments. Outcomes may reveal new information to update strategic assumptions. Outcomes may even drive changes in strategy as decision-makers' tastes change, rather than any underlying information in and of itself. But outcomes do not 'falsify' strategies. The fact that a strategy turns out poorly (or well!) is independent of the validity of the strategic choice, as good strategies can still fail, and bad strategies can still succeed. This is because judging by the ultimate outcome is judging by [information] that- by its nature- was not available at the time of the decision about the strategy. It could only be included if one assumed it as a premise.
Which- since the point of a strategy is to try and manage risks and opportunities for the future- is assuming the conclusion to justify a decision on how to approach the future.
Which also is not a good practice as a general point of strategy. But it can work well as a propagandistic frame-setting trick when paired with discrediting pejoratives that dares the target audience to self-identify themselves with the alternative.
I think what the OP meant is that it would be lamentable if some people really are considering leaving over their belief that this place is full of Russian propaganda, not that the OP himself believes the Russian propaganda is here.
And the premise of the response still applies- if the goal is to illuminate the prevalence (or lack thereof) of Russian Propaganda, then you should solicit Russian Propaganda or adjacent views to invite observations of the contrast, not invite people to maximize attack vectors for Russian Propaganda (or adjacent) arguments-as-soldiers upon themselves.
How did you get the idea that that is or is claimed to be the goal?