Culture War Roundup for the week of September 5, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


On Self Actualization, Utopia, and the Arrow of Civilization

After going back and reading this post I started thinking about what a person would look like if they lived in a utopian society.

In the linked post, Gaasht puts forth the argument that:

there's so little joy evidenced in the current status quo, everyone is clawing for attention. So many advertisers, influencers, teachers, children competing for attention. craving it, needing it, feeling they'll lose so much if they can't claim it. My deep moral intuitions are on the humility side of things -- that we should not be all screaming for attention. That it is actually morally bad to be screaming for attention.

This sentiment also rings true to me on a deep level, and I want to dig a little more into the idea that many social actions that are rewarded by the current status quo are in fact morally wrong, and would not exist in a truly utopian society.

I want to briefly state that this is more of a thought experiment to understand how we can be better people; I do not believe in Whig history. To be clear, I do not think that utopia is inevitable, or even that a utopian society is desirable on balance. However, I think it's valuable for us to look through different formulations of society, as most posters here would agree that our current society is far from optimal.

Instead of looking at the overall social formations that most utopian theorists seem to get endlessly mired in, I want to think about what type of mindset and values the average individual in a utopian society would have. The easiest place to start is to describe it in opposition to the current attention-seeking, neurotic, victim-oriented mindset that so many people here know and hate. In a truly utopian society, adults should be resilient, emotionally tough, and generally willing to own up to their mistakes.

From here, it becomes much more difficult to determine what an individual should do or feel. I will look through a couple different formulations of a utopia to try and better understand what it means to be a perfect or self-actualized individual.

The Culture

Iain Banks's Culture is semi-frequently referenced here as a good outcome with regard to artificial intelligence and a future society. In the Culture, artificially intelligent starships called Minds run society and make all of the important decisions. The only times humans make decisions that actually matter are when the society comes into contact with an alien species and needs a covert agent, or when a human rejects the Culture entirely. Otherwise, humans essentially exist for pleasure, living in an incredible, beautiful paradise while their artificially intelligent babysitters make sure they don't break the rules.

While I would agree that this is by no means a doomsday scenario, and the society described is amazing compared to other potential outcomes of developing a true artificial general intelligence, something about this picture still strikes me as wrong. In any utopia that I can envision, humans would be uplifted from our narrow view of the world and would, on the whole, become more intelligent and capable of making important decisions ourselves. Creating a benevolent AI nanny strikes me as defeatist, and a course that will ultimately lead to stagnation for humanity.

The Edenists

In a science fiction series by Peter Hamilton, another society is proposed: the Edenists. Using advanced biotechnology, these people have engineered the ability to communicate telepathically with each other, and have also created 'biological machines' - essentially biologically engineered hive minds for their settlements.

In this culture the ideal person is expected to be strong and willing to sacrifice themselves for the settlement, but also to be extremely emotionally balanced, very rarely experiencing anger or any sort of negative emotion. They achieve this by having their hive minds act as therapists for any individual who becomes emotionally unbalanced. The hive mind is more than willing to put you in a simulation and bring back dead family members, or whatever else it needs to do to keep you emotionally strong.

Unlike in the Culture, humans are instrumental in this society: they form bonds with biotechnological starships and still form the backbone of the society's economic and military force.

While this situation is more appealing to me than Banks's nanny AI, it still seems to be missing something. To me, in a truly utopian society all human emotions would be properly understood and put to use; we would not simply use therapy to get rid of all anger. I also think that privacy is important, and I cannot imagine living in a society with literally zero privacy, although I am willing to admit that with trustworthy institutions and leaders it may be a better bet.

In Conclusion

I suspect that many posters here see both of these societies as horrible, and instead see the ideal society as a reversion to the standard gender archetypes of pre-modernity. I don't believe in that idea strongly enough to do it credit, but I am interested in seeing how a utopian society based on old gender roles might function.

In general, I think the question of 'How to Be a Good Man?', which philosophers have worked on for millennia, has been nearly abandoned; modern thinking has shifted away from the ideal individual and toward what an ideal society would look like.

I'm curious what others think the ideal individual would look like - not in the context of our current society, but in the context of your favored utopian society.

It is my belief that the human life without conflict and struggle that utopians dream of ceases to be human at all. Humans are built to struggle, and if there's nothing there to struggle against, they'll find something to be "traumatized" about and struggle with that. I don't think humans will ever stop fighting each other, scrabbling for status and scarce resources. We can change which resources are scarce, but we can't change people.

We are generational beings projecting our DNA forward through time one fuck, one war, one long damned waste at a time.

We will never be without trouble, without hardship, without want and need.

We are inexhaustible and insatiable.

We are built this way.

Could we change ourselves to not need struggle and still be human?

I very much doubt it. In fact, I don't think we can change ourselves at all. At least not reliably, in aggregate and in the direction we intended.

I do not think so.

To rephrase your question: would you really be happy in a world where you were an exact copy of everyone else? Or would such an existence evoke the existential horror of being part of a hive mind, making you desire to act out just to be visible in some way?

I don't think lack of struggle automatically makes everyone the same. If it does we can inject some artificial struggle once every thousand years or so.

That outcome does strike me as terrible, I'd certainly want to avoid that fate.

One of the key defining features of being human is that we can form bonds that go beyond simple kinship which allow us to create complex societies, up to and including our current globe-spanning super-civilization. As the meme goes, we actually are built different.

Of course, there will always be struggles and competitions - the only difference is that in a future utopian society they will be over who throws the best parties, who can create the most creative and lavish gifts to give away, and who can climb the status hierarchy by being the most well-liked or admired, the talk of the town.

the only difference is that in a future utopian society they will be over who throws the best parties, who can create the most creative and lavish gifts to give away, and who can climb the status hierarchy by being the most well-liked or admired, the talk of the town.

One of the things I've learned from talking with socialists is that to many what you describe is paradise despite it sounding like a hell to me.

Maybe you haven't been to the right parties yet.

Parties treated as a competition, enjoyed in a Patrick Bateman business-card dominance sense, would leave very little of the socializing that I enjoy.

Have you never socialized with other people as a group, for the purpose of socialization and wondering how you can make it better next time? (Not implying that every party/social event needs to be improved every time, or that there is always some obvious way to improve it for next time).

Competing to throw the 'best' party doesn't have to be treated as a sociopathic game of Risk, nor does partying mean going to a nightclub and competing to see who can get the most drunk or score the hottest chick. A party can be as little effort as stargazing with friends while getting drunk or high or neither. Replace 'partying' with 'hanging out with friends' if you want to quibble with the choice of words. The point is that in a utopia, what it means to compete can be entirely different and much more pro-social than in other societies.

Also, I would imagine that trying too hard to 'win' at socializing, having fun, and letting loose (the goal of a party) is the opposite of the sort of behaviour necessary for a 'good' party in the first place.

Have you never socialized with other people as a group, for the purpose of socialization and wondering how you can make it better next time?

I'm not positive what "for the purpose of socialization" means, so I'm going to assume it just means "for fun"; correct me if I'm interpreting this wrong. Yes, when something doesn't go well at a party I try to improve that aspect at the next one. I do not think of this as a ~"struggle or competition"; I think viewing it that way is a path straight down to sociopathy.

Replace partying with hanging out with friends if you want to quibble with the choice of words. The point is that in a utopia what it means to compete can be entirely different and much more pro-social than what it means to compete in other societies.

I don't think framing this as a competition works at all. I think you're trying to combine two things: friendship/kinship and the only status hierarchy that remains when you remove all other meaningful competitions. I don't think kinship survives this merge. Competitions get gamed; when there is only one competition, all energy is put into optimizing it. I've been to social events that are thinly veiled status competitions. They are soulless affairs, though still good at raising status if used right, but tiling over the social landscape with them would be a true loss.

My basic point is this: replacing myriad discrete status games with one central status game raises the stakes of that game to a maximum. Socialization-based hierarchies are popularity contests, and the real-world example of a heightened, globalized popularity contest isn't who throws the best board game night - it's politics. It's the Clintons, Trumps, Stalins and Maos of the world that win. Socialists don't eliminate all hierarchy; they make it so that only one hierarchy matters, and it's perhaps the most brutal and illegible hierarchy that exists.

To desire to be more is a part of being human.

Where's the limit to what a human can be? Who's to say? Perhaps there isn't one. If we were to attempt to trap the ideal individual in a conceptual box, to limit him to a specific set of characteristics, then at some level disappointment appears inevitable. Any given definition can be met with the retort: "That sounds great, a lot better than now, sure. But is it really all we can be?"

When put in theological terms - it's our mission to climb the ladder to God, using tools that he gave us. Estimated time of arrival: one eternity from now.

Ideal human society in this framework is one of multitudes rather than unity, one that allows for experimentation and different ways of being, acknowledging that we don't actually know what is the best way of going forward.

This is certainly an interesting answer, although it seems to me a bit of a cop-out not to characterize an end-state here. From your description it sounds like the ideal end point of humanity is essentially us becoming 'god,' which would necessarily entail a level of power, empathy, and understanding beyond current comprehension. If so, it's practically meaningless to discuss it now, because we can't fathom that level of being.

Am I reading you correctly?

it seems to me it's a bit of a cop out to not characterize an end-state here

I know, right? I just don't like the idea of arriving at some kind of end-state and calling it a day.

At the very least, this end-state should be far, far beyond what we can now comprehend. Wouldn't the universe just feel cramped otherwise?

Agreed, I suppose I hope the real end-state of humanity is far beyond our current state. But it's fun to theorize.

For instance, do you think the end state could be achieved in our current bodies, with serious genetic/guided evolution? Or do you think we would necessarily need to go to some sort of transhumanist existence to fully evolve?

Given that many people seem to think that AGI is inevitable, I can never understand how a future in which that AGI is benevolent is worse than one in which it isn't.

Is indifference better? An ASI that is akin to a force of nature, bulldozing planets (with humans on them) not out of hatred but just because we happen to be in the wrong place at the wrong time? Or would they rather create an actively malicious ASI? Hey, at least then you get a villain to unite against... even if you will very likely lose, but of course that is the case for every hero in any good story.

How about benevolent indifference? The AGI stays in its lane, humans stay in theirs, everyone is happy, right? Ah, except for those AGI builders who made the thing in order to do X (where X is to solve some problem that's too difficult for them to solve). What's to prevent the builders from just trying again, tweaking parameters or the architecture until they get what they want? Or something they didn't want but can't put back in the bottle.

If the only answer is benevolent, then the only question is what form does benevolence take? Is it being a nanny to help a drowning child? Should the ASI only intervene when we are about to commit an existential blunder? Does climate change count? It won't be an existential crisis, but it will likely result in the deaths of tens of millions and the immiseration of hundreds of millions more. Do you think drowning Bangladeshis (who emit very little carbon in the first place) would consider being saved by a benevolent ASI nannying and refuse it on those grounds?

Of course, if you think that AGI is not inevitable, other futures are possible. But given that even people close to AI research struggle with how to align said AI, and can't coordinate to slow AI development, I don't really see how it doesn't emerge at some point.

Sidenote: You can leave the Culture. You don't need to be babysat if you think you are all grown up. It is by design... anarchy.

First off, I concede that it's possible we have a future without true AGI. With how powerful the transformer/RL learning models are though, even those scaled up would give a small group of individuals a ridiculous amount of power and 'intelligence,' and I don't see that changing anytime soon.

If the only answer is benevolent, then the only question is what form does benevolence take?

Actually, my favorite takes on an AGI future involve a disinterested/distanced AI that is benevolent in some sense: removed from humanity but still caring about it (i.e. willing to step in to prevent outright extinction or other horrendous outcomes), otherwise letting us roam free. However, as you say, this AGI would also need to avoid the whole 'tile the universe in hedonium' drive, and from what AI researchers seem to think, that's a very difficult prospect to achieve.

With regards to the Culture, yes, people can leave. I was going to address this point, but my post was already over six thousand characters, so I felt I needed to prune. Long story short, people can leave the Culture, but it's almost always depicted as a bad idea in-universe, and the Minds are there every step of the way to guide people through leaving. Exit rights are definitely better than the alternative, but I wish there were another collective of humanity that was powerful and independent from the Minds. Perhaps there is and I either forgot it or didn't read far enough into the series.

Which type of 'benevolent AGI' would be your preferred outcome? And to focus back on the original question, how do you think an individual would feel/act under such an AGI?

I added the disinterested AGI as a possibility, but I don't think it matters, because the people who made it would still have the same drive to try again - since the benevolent but mostly indifferent AGI is not serving those builders' goals (whatever they might be). The only way to lock in a future with a benevolent, indifferent AGI is if it is the first AGI created and it then prevents us from building any more AGIs. But the only way to do that would be to severely curtail or heavily surveil us, which would contradict its indifference.

Long story short people can leave the Culture, but it's almost always depicted as a bad idea in universe, and the Minds are there every step of the way to guide folks into leaving.

Except it isn't presented as bad? Culture citizens are free to travel, and that is one of the more popular things to do, whether within the Culture, to other civilizations, or into the wilderness. Whole factions break off from the Culture due to philosophical differences (the Peace faction, or the Zetetic Elench, who believe they should modify themselves to better understand aliens).

but I wish that there was another collective of humanity that was powerful, and independent from the Minds

There isn't, though I think it's mentioned that humans can undergo modifications to become more like Minds. But in becoming a Mind they wouldn't be human anymore... so what's the point? When it comes to playing chess, it wouldn't matter how many chimps were tossed together to face a human player. Likewise, no number of human grandmasters put together in a room could outplay the current state-of-the-art chess-playing AIs.

Which type of 'benevolent AGI' would be your preferred outcome?

My preferred benevolent AGI is one that provides all humans with the conditions necessary to live a good life. What is a good life? That is something everyone has to decide for themselves, which is informed by a complex stew of genes, culture, education, age and more.

The only thing I am uncertain about is how to handle communities. I and some group of people might choose to live out in the wilderness as our ancestors did, forsaking modern miracles like medicine. As adults we can accept the hardships that lifestyle entails, but what about our children? Our grandchildren? Ought the benevolent AGI to intervene to offer those children basic medical care? Or education? This isn't a new ethical debate; it already exists, as in the case of Jehovah's Witnesses who object to blood transfusions and impose that on children who may need them - an objection that in some countries can be overridden by medical staff and the government.