This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I've all of a sudden seen AI blackpilling break out into the normie space around me. Not so much about FOOM, paperclipping, or Terminator scenarios, but around the sudden disruptive nature of it, and especially around economic upheaval. Not exactly sure why. Veo3 has been part of it.
For example, coworkers are suddenly aware that AI is going to completely disrupt the job market and economy, and very soon. People are organically arriving at @2rafa's wonderment at how precariously, and even past due, a great deal of industry and the surrounding B2B services stand to be dominoed over. If my observation generalizes, and middle-class normies are waking up doompilled on AI economic disruption, what is going to happen?
Let's consider it from two points of view: (1) they're right, and (2) they're wrong. (1) is pretty predictable fodder here: massive, game-changing social and economic disruption, with a difficult-to-predict state on the other side.
But is (2) that much less worrisome? Even if everyone is 'wrong', and AI is somehow not going to take away 'careers', people en masse worrying that it will is enough to manifest serious disruption. People are already starting to hold their breath: stopping hiring, stopping spending, running Hail Marys, checking out.
Somehow, it's only senior management who doesn't realize the impact. (They keep framing it as 'If we can cut costs, we'll come out on top,' instead of following the logical conclusion: if everyone stops spending, the B2B economy collapses.) I have a nontechnical coworker who recently recreated a complex business intelligence tool we purchased not long ago, using readily available AI and a little bit of coaching. He had an oh-shit moment when he realized how cannibalized the software industry is about to get. The film industry seems about to completely topple, not because Veo3 will replace it immediately, but because, who's going to make a giant investment in that space right now?
I suspect the macroeconomic shock is going to hit faster than most are expecting, and faster than actual GDP gains will be made, but maybe I'm just an idiot.
I’m continuing to make the confident prediction that AI, no matter how smart, will be given to insanity, personality disorders, just plain stubbornness, etc., and that GPP from the Sirius Cybernetics Corporation will be like mules or llamas in past days: it has its uses, but for applications requiring sanity, common sense, and on-taskness over brainpower, you still need a person. This will make the white-collar job market worse, especially at entry level, and probably some parts of the blue-collar job market. It won’t end the concept of employment. There will probably, at the end of the day, just be more NEETs and lower pay for college degrees in nothing in particular.
That's a pretty wild prediction given that it already seems to have been proven wrong. Current-gen AIs already seem poised to be pretty disruptive.
I think the main reason they aren't more disruptive yet is that they aren't done cooking. Why try to squeeze work out of an AI right now when the AI will be better and cheaper in six months?
It will always be better and cheaper in six months.
Meanwhile I've already used it to do things for my startup that I can't even imagine how I'd have paid for otherwise.
I've also had better luck using AI as a disposable coder I can rush to test and let run into walls, rather than hoping my dev team would be able to figure things out organically. The tension between streamlining and overloading a dev team is always an issue, and I've certainly found human intervention necessary for a great many edge cases that show up.
It certainly helps with the administrative load and with tracking documentation for which projects are extant and which are viable. In the weirdest sense, AI is useful as a stupid conversation partner for sanity checks, and it doesn't mind being bullied or having effusive praise rejected.
Would you mind sharing any examples? I've failed to engage with AI in any meaningful way, not for ideological/luddite reasons, but the simple inertia of doing things the way I've always done them. I'd love to try something new, and don't know where to begin.
Don't really have time to get into it atm but I didn't want to leave you hanging. Just play with it. Try to get it to do stuff. You'll be amazed.
Me too. I haven't really played around with AI, out of some weird anxiety about not being able to apply it effectively. I'm kind of sticking my head in the sand on AI to avoid thinking about my FOMO.
You can literally just start by asking it how to apply it effectively.
Think of it this way: you've missed out on a bit, sure, but AI is progressing so fast right now that the value you've left on the table so far is insignificant compared to what you can get at whatever point you decide to jump in.