Culture War Roundup for the week of October 31, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


(Huge thanks to @self_made_human for collaborating with me on this post.)

On Extreme Mental States & the Future

Is it possible for one to wash one's dishes in the modern world as reverently as Jesus washed feet? Is it realistic or desirable for the average human being to possess enough mental fortitude to self-immolate in the name of their cause and/or god? These are far-out, extreme mental states I'm calling on here, to be sure. But their extremity or clichéd quality does not negate the fact that we can be reasonably sure people such as Jesus or the Buddha existed: people with extreme awareness or control of their mind and body.

When thinking of the far future I often see many folks, including transhumanists and techno-optimists, ignoring these critical experiential discoveries. In an ideal scenario everyone could have their material needs provided by automated technology, and we could figure out how to compete for status goods in a non-harmful way. Alternatively, all those who wish to compete for status goods can do so, but opting out for long periods of time is seen as common, healthy, and normal.

At this point, many hand-wave away the importance of thinking about such a future because writing about utopia is clichéd, boring, or pointless to ponder. Critics often invoke the image of a world full of listless, joyful idiots walking around fed and clothed by machines, inane smiles on their faces. (Assuming they even bother to walk, and aren't just sitting around being lotus-eaters, hooked on carefully titrated doses of neurotransmitters injected right into their nucleus accumbens. Which, to be fair, sounds like a great time - which is all the more reason I'd run away screaming from anyone trying to do that to me, or, if they did manage to catch me, pull out a gun and shoot them and then myself, preferably in that order.)

As it stands, I expect the exploration of new and unique qualia to be one of the great joys of a transhumanist society, all the more for one that isn't constrained by a couple of pounds of meat stuck inside the cage of a skull. And it is a very personal endeavor: knowing that a superhuman AGI has explored the 500-dimensional qualia space of baseline human minds isn't much good if you can't do it yourself! And there's very good reason to think that such qualia exist, because evolution and even modern pharmaceutical interventions are confined to an infinitesimal slice of the possible ones that we could experience.

Humans born with the rare tetrachromat genes can perceive new, richer colors just by virtue of having a novel type of cone cell; it turns out that if you hook new sensory modalities into the human brain, it almost inevitably figures out what to do with them. On top of that you have the weird crossovers that happen in synesthesia, such as seeing sound waves in the air or tasting colors. Then you can discuss augmenting lesser-known senses such as kinesthesia or thermoception - and boy, are there a fuck-ton of sensory experiences left to try! (Not even getting into the senses that non-human animals have.)

Exploring different states of mind, ways of interacting culturally, and striving for happiness are, in my view, the main benefits of such material abundance. One of the great shifts in human evolution came with agriculture, providing enough material wealth and leisure time to enable a greater focus on wealth, writing, and intellectual pursuits.

In many ways, the modern world is an unthinkably beautiful heaven of infinite bliss compared to the world our ancestors inhabited during the long climb to civilization from the start of our species ~200,000 years ago.

If and when we can experience another great shift, how would humans organize themselves differently? Could a society actually operate if every single person inhabited a similar mental state to a Buddhist monk?

I'd argue that society could work out in these states - but doesn't need to. We could have multiple cultures that all go their own way in developing mental states that today are literally unthinkable. People could alter their minds to promote intuitive leaps far more, or the rationality project could succeed and we could have humans that are extremely rigid and logical in their thinking.

We could experiment with playing strange noises or providing exotic sensory experiences that over time shift our perceptions entirely. Pain as a non-voluntary state could be mostly removed from the adult human experience if everyone was equipped with excellent mindfulness techniques from birth. Anyone who wanted could master lucid dreaming, and spend every night flying through scintillating dreamscapes of surpassing beauty.

None of the shifts listed in the preceding two paragraphs even require invoking far-fetched or theoretical technology created in the future. Many of the most potent mental states we already know humans can achieve are currently inaccessible to the majority of us due to our environment and material wealth. Only a small fraction of humans have ever been in a position to explore these states. If this exploration can become a more common activity, we may find that our consciousness is far more malleable than we had previously imagined.

I'd like to believe that this sort of shift in society will become necessary with the advent of AI taking over more and more tasks from us. I'm not in the camp that thinks AI is conscious or will be any time soon, so even if we no longer have to do practically any physical task, even if AI can outthink us in a million different directions, the study and understanding of consciousness could become the only thing left for us to explore.

Pain as a non-voluntary state could be mostly removed from the adult human experience if everyone was equipped with excellent mindfulness techniques from birth.

Any time I have managed to ignore pain, I have ended up worse off because, it turns out, pain is sending an important signal. Ignoring it ends up causing more damage.

If you mean emotional pain, yeah maybe. "My long-term partner broke up with me after ten years, but due to mindfulness techniques I'm not upset". Too much "I am never upset or distressed" is as bad as too little, though. Perfectly calm, unfeeling people who are unaffected by distressing things might be saints - or they might be psychopaths.

My long-term partner broke up with me after ten years, but due to mindfulness techniques I'm not upset

That doesn't sound good at all to me, unless "upset" means something like depressed or anxious. Emotions like frustration, sadness, concern, and so on are part of what happens when a human being has their preferences blocked. I can postulate cognition without these emotions, but it no longer seems like human cognition.

I think that part of the problem is exactly that people suppose that, if they are truly upset, they must be anxious or depressed. However, while emotions like sadness or frustration can be directed towards constructive action, this rarely occurs with anxiety and depression. It's not even that the latter are stronger, I think: more that they are disabling rather than invigorating.

Any time I have managed to ignore pain, I have ended up worse off because, it turns out, pain is sending an important signal. Ignoring it ends up causing more damage.

Yeah, this is why I specified non-voluntary. I'm certainly not advocating for nobody ever experiencing pain again, or totally ignoring it. But as you grow in mindfulness you can, say, stub your toe, then go into a kind of state where you still perceive the pain but don't let it affect your mental state.

Perfectly calm, unfeeling people who are unaffected by distressing things might be saints - or they might be psychopaths.

Actually don't psychopaths usually rage out and have uncontrollable urges? Sure they don't feel anything for other people, but that doesn't mean they don't feel anything.

Again I'm not advocating that 100% of the planet becomes an enlightened monk, but if people wished to pursue that lifestyle and could, or could even dip in and out every now and then, I think it would lead to a lot of human flourishing.

Is it possible for one to wash one's dishes in the modern world as reverently as Jesus washed shoes?

Eh? It was feet, not shoes; it was for Passover, and there was a moral lesson He wished to impart to the apostles about being "servants of the servants of God".

I'm not entirely sure how you are trying to link this example up with the main body of your post, which is about transhumanism and how we could allegedly have bigger, better mental states, but okay.

Yeah man, I was apparently pretty mistaken to include that. Also I find it hilarious that I put shoes instead of feet, but apparently others didn't!

I can't think of an example where Jesus was known for washing shoes. There is a noted instance where He washed feet, however, and the point was modeling humility to His disciples.

Wow good catch. I need to brush up on my Bible study.

Jesus's reverence at washing shoes demonstrated a supposed moral virtue of caring for the meek, the downtrodden. This was understood, and followed, by billions. Buddhism, too, is a complex set of claims that guide people's actions, understanding, etc. Mental states aren't merely states; there isn't any value in, say, permanently 'reverently washing shoes' aside from the outcomes of it. Also tangential, but 'possessing the fortitude to self-immolate' is not that difficult - suicidal people do it every day, and it's quite analogous to fighting in a battle/war you know you'll probably die in, something done by at least a billion people, historically.

So these interesting mental states are hollow without corresponding understanding or action. If I take all your neurons and just ... immerse them in dopamine or heroin or something, forever, do you feel infinite pleasure? (no, you just die). What would it even mean to be perpetually in a state of equanimity or reverence? Imagine you're literally frozen in time, in that 'state'. Again, you're just dead, functionally.

Not that said 'states', in particular the ones Buddhism describes, aren't interesting - just that the above vision entirely decouples them from any purpose whatsoever. "After AI takes over, we'll masturbate forever, except with equanimity!"

Pain as a non-voluntary state could be mostly removed from the adult human experience if everyone was equipped with excellent mindfulness techniques from birth

Is pain anything more than ... understanding a negative imperative or harm? The person still needs to react to potential issues, right? So ... if the hand touches a fire, the thermoreceptors fire, action potentials go to the brain, action potentials go back moving the hand away, the person rapidly makes sure it doesn't happen again. Without pain? "What would it be like if you didn't hear. Well, the vibrations still travel through the ear, you still understand the noises, but it's not hearing, something else". It's a koan - pain is contingent and empty, but it already is, and making it not-exist won't change that.

If the AI did take over, wouldn't it be capable of better, more complex and subtle, 'mental states' anyway? That seems like an issue.

Jesus's reverence at washing shoes demonstrated a supposed moral virtue of caring for the meek, the downtrodden. This was understood, and followed, by billions.

Absolutely! But unfortunately billions don't have the time/energy/mental fortitude to truly cultivate these moral virtues, as our environments push us to vice. That's part of what I'd hope changes.

Also tangential, but 'possessing the fortitude to self-immolate' is not that difficult, suicidal people do it every day, and it's quite analogous to fighting in a battle/war

Not sure if you've seen the videos, but when monks self-immolate they light themselves on fire then sit there in a lotus position as they slowly burn to death. I fail to see how that's comparable to suicide or fighting in a battle. The self-control and focus required to not move a muscle or make a sound as you burn to death is almost superhuman in my view.

So these interesting mental states are hollow without corresponding understanding or action. If I take all your neurons and just ... immerse them in dopamine or heroin or something, forever, do you feel infinite pleasure? (no, you just die). What would it even mean to be perpetually in a state of equanimity or reverence? Imagine you're literally frozen in time, in that 'state'. Again, you're just dead, functionally.

So do you think enlightenment or equanimity is not worth pursuing at all? That's a different conversation. Also, to me the calculus changes if you can not just do it yourself but have thousands of others all pursuing equanimity with you. Why would that preclude understanding or action?

If the AI did take over, wouldn't it be capable of better, more complex and subtle, 'mental states' anyway? That seems like an issue.

"And it is a very personal endeavor, knowing that superhuman AGI has explored the 500 dimensional qualia space of baseline human minds isn't much good if you can't do it yourself!"

I understand if you didn't go over this with a fine-toothed comb - it is quite a long screed, hah.

So do you think enlightenment or equanimity is not worth pursuing at all

They are, but only because of what they mean for one's understanding and action generally. They aren't worth pursuing on their own, in the same way that 'orgasm' isn't worth pursuing on its own, outside the context of anything else. (as said in op "Not that said 'states', in particular ones buddhism describes, aren't interesting")

when monks self-immolate they light themselves on fire then sit there in a lotus position as they slowly burn to death.

Ah, that is difficult - but that power should be used in more complex ways than 'a billion normies being neuralinked into simulated self-immolating without actually dying'.

"And it is a very personal endeavor, knowing that superhuman AGI has explored the 500 dimensional qualia space of baseline human minds isn't much good if you can't do it yourself!"

Well, it's good for the AGI. And if the AGI is morally important and is innately capable of better states than the human, isn't it better to focus on AGI welfare/AGI overmen (dep. on Christian or Nietzschean) than the equivalent for humans?

They are, but only because of what they mean for one's understanding and action generally. They aren't worth pursuing on their own, in the same way that 'orgasm' isn't worth pursuing on its own, outside the context of anything else. (as said in op "Not that said 'states', in particular ones buddhism describes, aren't interesting")

I'm honestly not quite sure what you're getting at here, could you unpack what you mean by 'one's understanding and action generally'? Are you saying that outside of the context of a flawed world equanimity and enlightenment are pointless?

Ah, that is difficult - but that power should be used in more complex ways than 'a billion normies being neuralinked into simulated self-immolating without actually dying'.

Absolutely agreed! Part of what I was trying to do in this post is evoke the idea that there will be millions of different states people are exploring, the whole Buddhist thing is just one example.

Well, it's good for the AGI. And if the AGI is morally important and is innately capable of better states than the human, isn't it better to focus on AGI welfare/AGI overmen (dep. on Christian or Nietzschean) than the equivalent for humans?

Eh, I'm not a harsh utilitarian so I'm gonna say no. Human flourishing will always be important to me even if AGI is a bajillion times better at 'flourishing' than we are.

You have to ask the question of what a 'mental state' or equanimity really is, what is happening, what is worth doing. Let's say you made a human enlightened by, uh, hacking all their neurons out, and replacing them with the neurons of a squirrel that was enlightened. Maybe something was lost. Now let's say that the person became enlightened, in a sense, but a very minimal sense - they're precisely as enlightened as the enlightened squirrel is, and then carry on with their netflix-watching the next day. There's something confused here, surely? What if the 100iq-joe-the-janitor really does become as enlightened as the Buddha was - but still retains their desire for netflix, and continues to watch it and eat burgers and work as a janitor? I'm saying outside the context of that enlightenment relating to other parts of their life, it's basically meaningless. So - what is being enlightened, what does that mean, what more is being understood, and is anything being understood if said understanding is never used? And then - if these people are just cavorting in simulacra VR garden land for eternity, is enlightenment as valuable or meaningful there as it is in a complex world with significant demands and willed action? Compare to the enlightened rabbit vs human (or a pet rabbit vs wild rabbit!)

Eh, I'm not a harsh utilitarian so I'm gonna say no. Human flourishing will always be important to me even if AGI is a bajillion times better at 'flourishing' than we are.

Right, but the 'non-self' and 'emptiness' and 'dependent origination' bits should indicate - there isn't anything to being human, aside from all the specific aspects and experiences and dependencies. What is there to humans, at all, that the AGI doesn't have? What's the difference? Not that there aren't any, but it's probably worth checking, and just saying the word 'human' doesn't necessarily mean anything. (and in the sense of "important to me" - i mean, it could just as well be true that 'watching netflix and playing overwatch' would be more important to you than 'universal basic equanimity'. But acting on that would be bad, because then nobody would be enlightened. Similarly, if your desires - at least as you describe them - are wrong, then you should simply act differently. And then those actions are, in retrospect, your desires. Desires in this sense do not exist, then, in a proper sense, they're just descriptions of specific ways one understands and acts)

And then - if these people are just cavorting in simulacra VR garden land for eternity, is enlightenment as valuable or meaningful there as it is in a complex world with significant demands and willed action? Compare to the enlightened rabbit vs human (or a pet rabbit vs wild rabbit!)

I'm willing to say that enlightenment as I understand it is a sort of special state of equanimity, and commit to the idea that it's just as valuable now as it would be in an extremely wealthy society. Sure you aren't under significant demands, but you also have to resist intense luxury and as you say, VR garden lands. That would take a large amount of willpower itself.

Right, but the 'non-self' and 'emptiness' and 'dependent origination' bits should indicate - there isn't anything to being human, aside from all the specific aspects and experiences and dependencies. What is there to humans, at all, that the AGI doesn't have? What's the difference? Not that there aren't any, but it's probably worth checking, and just saying the word 'human' doesn't necessarily mean anything.

I don't think we have anywhere near the understanding of these mental states to make these sorts of judgements. Sure right now the best we can do is discuss enlightenment in terms of 'non-being' etc etc, but eventually I believe we will have a better understanding of how to enter + exit these states and be able to communicate their qualities with much more fidelity.

Wireheading is a staple of science fiction. Technology does not answer the harder problem of status. What if there was a drug that made everyone thin and ripped? Part of the value of being ripped or thin or smart is that it's scarce. If everyone has those attributes, they lose some of their value. Part of the value of a magic weight-loss pill is that there will always be some obese people. There is already an epidemic of men dropping out of society, delaying family formation and such... do we want to make it worse?

Technology does not answer the harder problem of status. What if there was a drug that made everyone thin and ripped? But part of the value of being ripped or thin or smart is that it's scarce. If everyone has those attributes, they lose some of their value.

I think this argument is fundamentally flawed. You could just as easily say that technology does not answer the problem of having your teeth fall out by 30. Part of the value of having good dentistry is that most people's teeth fall out by the time they're 30... if everyone has good teeth, what's the point?

To have a more charitable take, I think that as we develop technologically status will be awarded based on different lines. As I have pointed out in this post, one possible way we could allocate status once everyone is thin/ripped/healthy/immortal would be based on mental states and experience. Those who have a greater command over their own mental state, or explore and describe new mental states, could accrue higher status and the status competition could continue apace on that basis.

That being said, why do we need status competition? At the end of the day if we have AGI and our society is extremely materially wealthy, does status competition even make sense? Or do you think it's inevitable wherever we have humans?