
Culture War Roundup for the week of February 27, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

I had my first experience this week using ChatGPT for my job. I'm a web developer. I manage all of the web properties for a small non-profit.

They have an old WordPress site that is bloated with a lot of plugins. It's often impossible to get support from any plugin developer, because they take one look at the mess I'm managing and nope their way out of helping.

ChatGPT was actually helpful. It helped me diagnose the exact problem. Then it helped me locate an area to implement solutions. Then it gave me a solution. Then it told me how to test the solution. And then the solution failed to do anything. I went through all the steps a few times, and was able to realize it was giving me some bad code. I kept prompting it in different ways and asking for explanations of its code. Eventually I got to a valid solution.

Anyways, it was weird as hell. I've worked with junior programmers under me before, and I'd be happy to have someone like ChatGPT as a junior programmer. I'd never recommend it advance past junior programmer, but it basically makes for a super googler plus a semi-dumb code thief.

And I'm not saying that to be like "oh look how crap AI is". It's more like "shit, it's too far gone".


I have two young daughters. They are, by most standard metrics, pretty smart, well-adjusted little kids. I can say with strong confidence that my wife and I are better at our jobs than the AI (both our jobs involve a fair bit of text manipulation plus talking to people). But in twenty years, when my daughters are entering the workforce, I don't really have much confidence that they will be better at a text manipulation job than the current generation of AIs.

Forget future advancements, just using the current level of AI will eventually crowd out a bunch of entry level text manipulation jobs.

I look at my daughters playing at night and I think of what world they might grow up in. Right now they love playing a make-believe version of daycare. They tuck the babies in for nap time, feed them, and then spend an inordinate amount of time giving them diaper changes (including reactions to poopy diapers: "eww stinky", or "oh good, just pee"). I can't help but think that "daycare professional" might be an oddly resilient career path in the future. It's not like anyone is gonna submit their kids for surveillance in a daycare setting to train up a set of AIs.

Hopefully we'll get to a time where they are wondering about what useful jobs they can have. I was a bit of an AI apocalypse skeptic a few years ago. Most of my skepticism is gone.

My prediction is if one of your daughters is in daycare, she may be in charge of watching AIs who are watching the children.

20 years ago we couldn't imagine today, even though nothing has really changed... except that literally the entirety of human interaction in the Western world has shifted to a small device in your hand. It's why I'm shocked when someone like Chuck Klosterman doesn't feel like much has changed in our culture in 20 years (not his exact point). It's that the way culture is produced, shared, and taken in has so completely changed that it feels like nothing has.

And when AIs are watching our children in two decades, it'll feel just like yesterday.