This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
I've all of a sudden seen AI blackpilling break out into the normie space around me. Not so much about FOOM, paperclipping, or Terminator scenarios, but about the sudden disruptive nature of it, and especially about economic upheaval. I'm not exactly sure why; Veo3 has been part of it.
For example, coworkers suddenly aware that AI is going to completely disrupt the job market and economy, and very soon. People are organically discovering the @2rafa wonderment at how precariously, and even past due, a great deal of industry and the surrounding B2B services industries stand to be dominoed over. If my observation generalizes, and middle-class normies really are waking up to the doompill on AI economic disruption, what is going to happen?
Let's consider it from two points of view: (1) they're right, and (2) they're wrong. (1) is pretty predictable fodder here: massive, game-changing social and economic disruption, with a difficult-to-predict state on the other side.
But is (2) that much less worrisome? Even if everyone is 'wrong', and AI is somehow not going to take away 'careers', people worrying en masse that it will can still manifest serious disruption. People are already starting to hold their breath: stopping hiring, stopping spending, running hail marys, checking out.
Somehow, it's only senior management who doesn't realize the impact. (They keep framing it as 'if we can cut costs, we'll come out on top' instead of following the logical conclusion: if everyone stops spending, the B2B economy collapses.)
I have a nontechnical coworker who recently recreated a complex business intelligence tool we purchased not long ago, using readily available AI and a little bit of coaching. He had an oh-shit moment when he realized how cannibalized the software industry is about to get. The film industry seems about to completely topple, not because Veo3 will replace it immediately, but because who's going to make a giant investment in that space right now?
I suspect the macroeconomic shock is going to hit faster than most are expecting, and faster than actual GDP gains will be made, but maybe I'm just an idiot.
My gut feeling is that this AI wave will be a short panic and then basically blow over. To predict massive job loss you have to assume that jobs are already distributed rationally, and that companies are good at adapting to and using new technology efficiently. Neither of these is even remotely true!
If you've ever seen how the sausage gets made at a major company, jobs are very much withheld and created on more of an internal, political basis than any actual needs the companies have. On top of that, most major organizations are still barely adapted to using spreadsheets and the simplest algorithmic techniques that were created decades ago. Literally just using Excel to automate tasks could save these companies tens of millions of dollars a year. And yet... they don't?
So the mere fact that there's a new technology coming out that can do a bunch of fancy new stuff does not convince me that we'll have massive job loss at all. What will likely happen, and what has already been happening for a while, is that the people in white-collar roles who can use these tools will just shift more and more work off to them privately and pretend they are still just as busy. The roles might get a tad more competitive.
But we're not going to be in a doomsday scenario where everyone loses jobs, even IF AGI comes out tomorrow.
You say this is for political/irrational reasons, but I think it's because the human capital available to them is too stupid to use a spreadsheet to (e.g.) predict when they need to restock the warehouse. Something intelligent agents should be very useful at!
Currently, you can either have Pete walk the warehouse every week and plan a re-order list based on his vibes, or you can buy an expensive business intelligence package, which is risky because it requires you to switch a ton of your existing workflow over. The smart simple solution a college grad in hard sciences with a spreadsheet can come up with is not available to most businesses because that grad doesn't want to work in an import-distribution business.
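For concreteness, here's roughly what that 'college grad with a spreadsheet' solution looks like when written out. This is a minimal sketch of the textbook reorder-point rule, not anything from a real system; the SKU names, demand figures, and the ~95% service-level target are all made-up assumptions:

```python
import math

def reorder_point(daily_demand, demand_std, lead_time_days, z=1.65):
    # Classic reorder-point rule: expected demand over the supplier lead time,
    # plus safety stock for demand variability (z ~ 1.65 targets roughly a 95%
    # service level under a normal-demand assumption).
    safety_stock = z * demand_std * math.sqrt(lead_time_days)
    return daily_demand * lead_time_days + safety_stock

# Hypothetical SKUs and numbers, purely for illustration:
# (average units sold per day, std dev of daily sales, supplier lead time in days)
catalog = {
    "pallet_jack_wheels": (12.0, 4.0, 14),
    "shrink_wrap_rolls": (30.0, 9.0, 7),
}

for sku, (mu, sigma, lead) in catalog.items():
    print(f"{sku}: reorder when stock drops below {reorder_point(mu, sigma, lead):.0f} units")
```

The point is only that the math is trivial, basically two columns of sales history and a formula; the hard part, as you say, is having someone on staff who will set it up and keep it running instead of relying on Pete's vibes.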
I've worked with this type of business before, and I think you've diagnosed the problem correctly. I don't see them being able to get AI to automate any of these steps though for the same reasons that they couldn't get spreadsheets to help them.
With current AI, sure, but we might not be too far away from AI bridging that gap from the other side.