Culture War Roundup for the week of July 17, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I mean, your subjective things are pretty secondary to me, for the same reason that the Nazis' subjective things are secondary to you.

Well, only assuming we don't share the same moral axioms, right? What if we share most moral axioms besides the axiom that morals are relative?

Then the axioms that define my perspective wouldn't be secondary to you, by virtue of being the same as your axioms, which are primary to you.

I think it's incorrect/incomplete.

Ok, well, that's not very informative or fun, but it's honest. Thank you.

That two people happen to share some set of subjective things does not somehow elevate them to being any more primary. You and a buddy may both happen to share the taste axiom that you like cilantro. You may happen to share it with nineteen more buddies. Your experience is still secondary to them, even though they can find some familiarity with it. And it is utterly secondary to the twenty-second guy, who dislikes cilantro. Looking at those twenty-two people, can I say that cilantro is "good" or "bad"? I think that even trying would be an error in language.

That wasn't what I meant. Your axioms are already primary to you. So if we share an axiom, that axiom is primary to you. I didn't say anything about my experience. There is just an overlap of a preexisting condition of primacy. I am not saying my axioms have any effect on what is primary to you.

Your axioms are primary to you. Let's say there is an axiom called axiom A. If axiom A is an axiom that you hold, then it is primary to you. If I hold axiom A, then axiom A is primary to me, because it is one of the axioms I hold. Therefore, if you and I both hold axiom A, then it is primary for both of us and serves as substantial moral common ground. You don't have to care about my experience or my moral axioms whatsoever. But if we do share them, then those axioms are primary for both of us. That's all I was saying. And with that common ground we can then communicate about morality. That is the basis of a shared morality. Even if you don't agree with me that moral axioms are subjective, the ones we share are still primary to both of us.
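To spell that deduction out schematically, here's a minimal sketch in Lean. All the names (Holds, Primary, and so on) are just illustrative placeholders I'm introducing for this sketch, not anything from our exchange:

    -- Minimal sketch of the deduction above. All names are illustrative:
    -- `Holds p a` means person p holds axiom a, and `Primary p a` stands
    -- in for "axiom a is primary to person p".
    variable {Person Axiom : Type}
    variable (Holds Primary : Person → Axiom → Prop)

    -- Premise: any axiom a person holds is primary to that person.
    -- Conclusion: an axiom we both hold is primary to each of us.
    theorem shared_axiom_is_common_ground
        (held_is_primary : ∀ p a, Holds p a → Primary p a)
        (you me : Person) (A : Axiom)
        (h_you : Holds you A) (h_me : Holds me A) :
        Primary you A ∧ Primary me A :=
      ⟨held_is_primary you A h_you, held_is_primary me A h_me⟩

The point is just that no extra step is needed: primacy for each of us follows from each of us holding the axiom, not from the sharing itself.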

That two people happen to share some set of subjective things does not somehow elevate them to being any more primary.

No elevation is needed. Each person already believes the thing; therefore the thing is primary to them. I am not saying that there is some magical effect creating new primacy from their shared moral axioms. I am saying that all of their moral axioms are primary to them, so if they share them, they have common ground and will agree that those axioms are primary.

Looking at those twenty-two people, can I say that cilantro is "good" or "bad"? I think that even trying would be an error in language.

I am saying that if all of those people share the axiom that cilantro is good, then they can all agree that cilantro is good. That's all.

can I say that cilantro is "good" or "bad"

If you are one of these people with the axiom that cilantro is good, then you will say that cilantro is good. If you hold an axiom that it is bad, you will say it is bad.

If you are a third party with no opinion about cilantro, then I think the moral status of cilantro will be undefined for you, or perhaps it will seem like a weird and alien thing to attach moral status to, as it does for me in real life.

I think that even trying would be an error in language.

Maybe it would be an error in language for you, as a third party with no opinion on cilantro, to say that cilantro is good or bad. But it would certainly not be an error in language for you to say that those people over there believe that cilantro is good. That would be a simple description of the reality that those people believe cilantro is morally good.

And for that strange group of people who believe cilantro is morally good, it would not be an error in language for them to say "cilantro is morally good," because that is what they believe. You would say that they are incorrect, but there have been lots of humans with moral axioms you would say are incorrect or bizarre, and I doubt you would normally say that their expressions of those weird beliefs are an error in language.

But it would certainly not be an error in language for you to say that those people over there believe that cilantro is good.

There are twenty-one people over there who like cilantro and one person who doesn't. I can't actually say that "they" believe that cilantro is good. In any event, you changed what it is that I said would be an error in language. I asked, "Looking at those twenty-two people, can I say that cilantro is "good" or "bad"? I think that even trying would be an error in language."

There are twenty-one people over there who like cilantro and one person who doesn't. I can't actually say that "they" believe that cilantro is good.

Ok, sorry if I miswrote that or wasn't clear enough. You can say that they, the twenty-one people who believe cilantro is good, believe that cilantro is good. That seems essentially definitionally true and not an error in language.

In any event, you changed what it is that I said would be an error in language. I asked, "Looking at those twenty-two people, can I say that cilantro is "good" or "bad"? I think that even trying would be an error in language."

I don't think I changed what you said; I made it clear what I thought. If there are groups of people who think cilantro is good or bad, the fact that they believe those things does not give you any ability to extract the position that cilantro is good or bad. Their moral conclusions are largely irrelevant to whether you can say cilantro is good or bad; that would have to be based on your own axioms. Most likely you wouldn't think that cilantro has moral weight, but I have no problem imagining a culture that does, like this theoretical group.

If your axioms are that cilantro is morally good, then it is not an error in language to say that cilantro is morally good.

However, as I said earlier, it is logically incoherent to say that it is objectively true that cilantro is morally good, and definitely logically incoherent to say that it is morally truer that cilantro is good than that cilantro is bad. What is objectively true is that some of these theoretical people believe that cilantro is good, and some of them believe it is bad. Determining the truth of the statement "cilantro is morally good" is where logical coherence breaks down.

You can say that they, the twenty-one people who believe cilantro is good, believe that cilantro is good.

Sure. You can say that the Nazis believed that eliminating Jews is good. Nothing interesting seems to follow from this. On your view, there are no grounds on which we can say, "The Nazis were wrong, and exterminating Jews is not good." We can only say, "There are some people over there who think that exterminating Jews is good and some people over there who think that exterminating Jews is bad. Nothing interesting seems to follow from this."

Nothing interesting seems to follow from this.

I disagree, but interesting or not, my account of the nature of morality more closely aligns with reality and has more explanatory power.

From the evidence we have, it appears that morality is relative. I am arguing that morality being relative doesn't rob us of morality: it doesn't lead to moral nihilism, and it doesn't decrease the relevance of morality in our lives.

On your view, there are no grounds on which we can say, "The Nazis were wrong, and exterminating Jews is not good." We can only say, "There are some people over there who think that exterminating Jews is good and some people over there who think that exterminating Jews is bad. Nothing interesting seems to follow from this."

I think I have made a strong argument that this is not a necessary result of moral relativism.

The behavioral result is identical in my account and your moral realist account. If I am a moral relativist who thinks the Nazis are wrong, I will say, "There are some people over there who think that exterminating Jews is good and some people over there who think that exterminating Jews is bad." I can also say, "I am one of the people who thinks that the Nazis are wrong and exterminating Jews is bad," and I can act accordingly to stop their abhorrent behavior.

Nothing has changed behaviorally from your account, as both the Allies and the Nazis are going to behave the same regardless. Even if I were a moral realist, the Nazis were going to act in line with their fucked-up beliefs, and I was still going to act in accordance with mine.

The difference between the accounts is their explanatory power. Moral relativism doesn't have the issue of needing to find a justification for moral axioms that, as far as I can tell, are fundamentally impossible to justify objectively. Can you explain to me how you can justify a moral axiom without relying on another moral axiom?

The behavioral result is identical in my account and your moral realist account.

This is going to depend on things like the determinism/compatibilism/free will debate. It cannot be freely concluded.

The difference between the accounts is their explanatory power. Moral relativism doesn't have the issue of needing to find a justification for moral axioms that, as far as I can tell, are fundamentally impossible to justify objectively.

This is not what it means to have more explanatory power. In fact, if it were, we could on similar grounds jettison the entire scientific endeavor for objective physical reality. No need to go to the trouble of looking for a justification when we can just happily settle for the subjectivist view.

This is going to depend on things like the determinism/compatibilism/free will debate. It cannot be freely concluded.

I'm not sure I believe that actual compatibilists exist, but otherwise I guess that's fair. I'll think more about how that debate interacts with this one.

This is not what it means to have more explanatory power.

I think it kind of is, but that's a larger argument. Since this doesn't feel like the core of the disagreement, how about I start by just saying it's more coherent and has at least as much explanatory power as the alternative theory. I think that to argue explanatory power in depth I would need to know more about what you think moral realism vs. moral relativism predicts, which you have said would require pulling in the debate over determinism/free will.

In fact, if it were, we could on similar grounds jettison the entire scientific endeavor for objective physical reality. No need to go to the trouble of looking for a justification when we can just happily settle for the subjectivist view.

That's not true.

Science is different in that it has no need to be justified by arbitrary axioms. It has utility as a justification, which I think we would both agree is not a valid justification for morals. All science needs to do to show that it is better science is to work.

The scientific endeavor can be tracked via its utility. If my opponents and I have different science, but their science makes better bombs and medicine, then I should reconsider my science.

But if my opponents and I have different morals, and their morals make better bombs and medicine (let's say they use children in their mines or sacrifice children to create a working immortality potion), that is not grounds to reconsider my morals. Science is judged on utility; morality is not.

Moral relativism doesn't have the issue of needing to find a justification for moral axioms that, as far as I can tell, are fundamentally impossible to justify objectively. Can you explain to me how you can justify a moral axiom without relying on another moral axiom?

I would still like an answer to this, please.
