
Culture War Roundup for the week of July 10, 2023

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


The Hollywood actors' guild is on strike. They are joining the Hollywood writers' strike, which has been ongoing for a few months. I did not know this, but apparently Fran Drescher (the loudly nasal woman from "The Nanny") is the president of the union.

Is this strike a big deal? Well, for one, it's the biggest strike for over 60 years. But what caught my eye was her rationalisation. You can read a summary of the demands.

A key demand concerns generative AI. Actors do not want companies to create their own AI replicas of actors, nor to use generated voices and faces.

One possibility could be the actors raising the AI bogeyman as a cover to demand better pay. And to be sure, they are asking for a fairer split from the streaming model. Yet the AI demands are not directly linked to compensation per se, but rather ask for blanket bans. This does suggest that the AI fears are genuine. Given the very rapid progress in generative models in recent years, perhaps they are right to be worried.

Whenever I've read about job displacement from AI, invariably experts have opined that "the creative stuff will go last". Clearly the people who know their trade best are disagreeing with the experts. I'm not sure if this means that actors are paranoid or if we should disregard the expert consensus. Either way, I suspect we may see more and more of these kinds of Luddite strikes in the future, but perhaps not from the people everyone expected them from.

AI is the headline grabber, but the main issue is how residuals are calculated from streaming shows, plus the fact that most streaming services are bleeding money while traditional TV is dying.

A key demand concerns generative AI. Actors do not want companies to create their own AI replicas of actors, nor to use generated voices and faces.

I'm having trouble finding the demands actually clearly stated, but everything I've seen has been about actors not wanting companies to create AI replicas of them without paying royalties for their use or granting the actor any limits or creative control over how they are used (e.g., without such limits a replica could legally be used to fake up video of the actor supporting any product or political position).

As a pure, "don't use AI at all" claim, this doesn't actually make sense. Why would the studios want to make AI replicas of minor actors no one can recognize when they can use generative AI to create virtual actors that look like no real person? Either (1) they're seriously trying to claim the right to not pay big stars for AI replicas, (2) they want AI replicas of everyone who might one day be famous, or (3) they are worried about the legality of existing AI systems and want a data source they know they have rights to.

(Of course, while deep fakes do already exist, there's also the complication that this contract negotiation has to cover future AI technology that doesn't exist yet, so it's hard to be precise about exactly what uses they're worried about.)

Why would the studios want to make AI replicas of minor actors no one can recognize when they can use generative AI to create virtual actors that look like no real person?

Someone I heard talking about it on the World Service a few days ago (an actress representing some union or organisation or other) basically said that things like reshoots and follow-up gigs in, for instance, TV shows can be pretty important for small-time actors. What she was worried about was productions starting filming with a real actor, but then using AI to make changes or add small pieces late in the day instead of bringing the actor back. She objected to this both on the grounds that it deprives the actor of income from reshooting and that it's unfair for them not to be able to 'control' their own performance, partly in the sense that they might not be happy with the quality of the AI performance and that it might reflect poorly on their reputation.

I'm thinking that this whole strike is a massive opportunity for improvement from the pov of consumers and studios.

The quality of the output of Hollywood is abysmal and the system is corrupt and dysfunctional enough that reform is very hard. What can conceivably force improvement? Competition.

There are competing production centers in both Europe and Asia, but they lack the American financial and production resources to truly compete. This strike provides the perfect opportunity for studios to expand the capacity of their overseas production centers and for these centers to grow into fuller competitors to Hollywood. The distribution pipelines are already there, and audiences seem increasingly willing to consume content from other markets, especially younger people. A lot of English-language material can probably be sourced through the UK and produced in Europe (as is often already the case).

Downsize and diversify.

This seems like a win for everyone except maybe the inbred Hollywood system.

What changes to writing would make the output better than abysmal, in your opinion?

More real diversity. Competing creative centers.

Diversity of thought rather than just outward appearance?

In a broad sense, sure. Diversity of culture as well.

Those experts weren't experts then.

AI is a while from generating novel texts and scripts and such (don't ask me how long though); but it is about 15 minutes from getting fed a couple hundred hours of footage and being able to convincingly replicate a specific human saying whatever the fuck you want them to say.

Quite right, audio alone is trivial. David Attenborough narrates 40K https://youtube.com/watch?v=97Zfv5OMqgE

Suddenly everyone is a luddite.

I just finished listening to a radio show where the hosts, a young man and woman, were fretting over the potential dangers of AI. Even managing to make a mock AI radio broadcast. (That's exactly where the real danger is, by the way, very scary). They even talked to a very concerned NPR journalist who made it clear the potential AI takeover was no laughing matter and was a threat facing journalism that needed to be met head on. And the list just keeps growing of the various educated folk fretting over their potential obsolescence.

However, it seems rather late to say that we can't do X, Y and Z after the last 30 years of mass immigration. Why should the economic slot of 'actor' or 'journalist' be better protected than construction work? If automation were possible in any blue-collar labor, it would be automated immediately. But gasp my CAREER? How dare they...

On top of that the movie industry is a cesspit of nepotism, greed and every nasty human impulse you can imagine. Sorry, I'm not all that miffed that the last 2% of the new Marvel picture that isn't CGI is going to be AI-generated CGI of a Hollywood actor instead of the real-life unholy blend of nepotism and narcissism in human form. (they're so lifelike, almost like real human beings)

And for the opposite end of the movie industry, I'm not all that bothered by the proposition that 'movies are art' or whatever. If I want real Hollywood art I can find it AI-generated, depicting the various Oscar-nominated actresses attempting to suck on Harvey Weinstein's deformed penis, next to a compilation of them thanking him during their Oscar acceptance speeches. Followed by their #metoo headlines where they claimed it was all rape. (No trade back though)

I'm ready for the hostile AI takeover of modern high culture. Crossing my fingers that the AI version of a man in a dress trying to sell me beer is more palatable than the real one.

In the case of journalism, while you may call their arguments motivated reasoning, you are completely misrepresenting their stated position, which is not that AI automation is bad because journalists just deserve their jobs more than everyone else, but rather that journalism is an important public service the quality or accuracy of which may be negatively impacted by AI.

In the case of the actors though, you seem to be objecting to... their union trying to resist the introduction of practices which will be unfavourable to its members? That's their job?

The journalists' stated position is no different from a construction worker's. Building a house is an important service! Yet the quality of construction work has gone downhill dramatically where I'm from because of immigration. On top of that, now you have drastically increased crime within the industry. No journalist ever gave a crap. They just close their eyes to the problems and celebrate 'diversity', refugees and more immigration in between complaints about astronomically high rent and housing prices.

And that's just if we assume that the modern journalist is providing an important public service. I'd maintain that most are not, which is the opposite of construction work. And the few that are living up to the fantasy that most journalists wrap themselves in when criticized are more likely to be ostracized from the mainstream than not. Eva Bartlett and Assange come to mind.

In the case of the actors though, you seem to be objecting to... their union trying to resist the introduction of practices which will be unfavourable to its members? That's their job?

That's not the objection at all. I object to how ugly the industry is and how blind the people who work in it and support it are to the disenfranchisement of others. It's the ugliest form of hypocrisy I know of. The rich and powerful celebrating the deteriorating conditions inflicted on the poor and powerless. They then have the gall to cry about it when a similar proposal is presented for them. As if they just have to live the super privileged life of an 'artist' in the most expensive places on earth.

Motivated reasoning is a given. But that's something we all suffer from. I just expect, at the bare minimum, that people show some dignity, respect and a modicum of self-awareness. These types of people are an ugly embarrassment.

The rich and powerful celebrating the deteriorating conditions inflicted on the poor and powerless.

What is this even in reference to? Where is this happening among actors except inside your head? You seem to have constructed this idea of the heartless New York journalist who hates the working class, but on issues such as working conditions and minimum wages they are surely much more pro-workers' rights than the median. If you want people who didn't 'ever give a crap', maybe have a look at Republicans.

Eva Bartlett

Lol. A credulous hack. Just being anti-US doesn't automatically make you brave and noble or produce good journalism, especially if you're running cover for dictatorships; she is ostracised for good reason.

Nigh every actor champions pro-immigration and pro-diversity rhetoric.

You seem to have constructed this idea of the heartless New York journalist who hates the working class, but on issues such as working conditions and minimum wages they are surely much more pro-workers' rights than the median.

I can't help you with that, given I did not write such things.

If you want people who didn't 'ever give a crap', maybe have a look at Republicans.

Why not both? And for what it's worth, Republicans think just as fondly of themselves as journalists do. They don't see themselves as hating anyone, regardless of how harmful the policies they support are.

Lol. A credulous hack. Just being anti-US doesn't automatically make you brave and noble or produce good journalism, especially if you're running cover for dictatorships; she is ostracised for good reason.

But being pro-truth when it happens to side against the mainstream rhetoric on Assad does make you a good journalist, even if just by chance.

All of this seems beside the point though. Your argument about journalism and the stated position of journalists being somehow different to the position of construction workers was just bunk. I'm not surprised you would try to talk yourself away from it with irrelevant stuff, but it's not very interesting.

but rather that journalism is an important public service the quality or accuracy of which may be negatively impacted by AI.

Well, from the perspective of many here (a perspective I can sympathize with), the quality and accuracy of modern journalism is already at a low in absolute terms, so with AI, there is nowhere to go but up.

Maybe if all one reads is online crap one would think that, but in the real world there are huge numbers of journalists doing enormously important work. Just pick up the Economist, FT, New Statesman or many, many others, or tune into the World Service and you can read and hear it.

I'm with you on having no sympathy for the movie industry themselves. Still, any push back against AI is probably a good thing, if it gives us more time to solve alignment. So I plan to loudly support their strike in public.

I don't see this as even weird. People have always been much more protective of their own positions than of those they see as beneath them. Blue-collar types and other low-wage outgroups have known this for decades. When their jobs went away to China, the better-off barely bothered to sneer "you should have gone to university" at the poor unfortunates now watching as their entire town dies and their neighbors get hooked on meth.

By 'they' you presumably mean the right, who resist(ed) any expansion of the safety net that might have helped displaced workers and opposed significant government investment in industry, new or old.

One has to wonder what will happen if the unions succeed in getting this stuff banned. I mean surely someone else will use the technologies and maybe someday outcompete them.

I think this was a/the take from back when the Motte discussed the WGA strike.

Whenever I've read about job displacement from AI, invariably experts have opined that "the creative stuff will go last"

This hasn't been a reasonable position for at least a couple of years; anyone saying that is not a real expert and has an outdated conception of AI capabilities.

This hasn't been reasonable since Moravec's Paradox became known.

I'm generally an AI/ML booster, but I could understand not wanting to give the major studios an inch, here. There's already a pretty bad tendency to fuck around with the numbers to make even major films have nearly no residuals, when not passing out glorified contracts of adhesion. It doesn't exactly take a lot of creative thought to consider what sort of foolishness a sufficiently motivated bunch of Dark Triad lawyers and beancounters could do, and it takes a lot of thought to bar off every exceptionally-bad option. In that sort of low-trust environment, it's more reasonable to target bans even if you don't think in the absence of a ban that the counterparty would be able to hit you too hard.

Of course, on the flip side, while I'm impressed by them as a mostly-voluntary (they're still effectively a semi-closed shop) union, I'm not exactly impressed by them as a guild. Them wanting to avoid being automated away (or, worse, being paid peanuts for something that gets laundered through AI into the actual sales item, which is a trivial tactic) doesn't really change whether someone else will be willing to work with the studios at risk of getting automated later, and having comparable quality, or whether I should care as a buyer.

Whenever I've read about job displacement from AI, invariably experts have opined that "the creative stuff will go last". Clearly the people who know their trade best are disagreeing with the experts. I'm not sure if this means that actors are paranoid or if we should disregard the expert consensus. Either way, I suspect we may see more and more of these kinds of Luddite strikes in the future, but perhaps not from the people everyone expected them from.

This is a pretty outdated read. It was indeed the case at one point, and if you made this prediction even five years ago it might be spicy, but the consensus has moved on, and it did so at least a year ago.

Creative in this context means more than just acting or music. It would include things like hairdressers or fashion designers.

I think hairdressers will be around for longer than most of the other creative fields. Fashion designers I don't really want to touch; I think they've been a pointless profession for quite some time, AI or not.

IMO the streaming model isn't that profitable yet. We've seen a boost in the supply of media. My guess is that if they get higher wages, it will lead to a contraction in the amount of content made, which might not be a bad thing.

It's already contracting.

Whenever I've read about job displacement from AI, invariably experts have opined that "the creative stuff will go last". Clearly the people who know their trade best are disagreeing with the experts.

Just because people call something "creative", doesn't mean it actually is.

If we restrict ourselves to the domain of cognitive tasks (ignoring the complexities introduced by physical labor), then I think the speed at which different tasks get automated by AI is a decent empirical index of how much creativity a task requires. Computers reached parity with top humans in Chess relatively early because Chess, although cognitively demanding, doesn't require much creativity. Conversely, AI has had relatively little impact on research-level mathematics (as far as I'm aware), which tracks with my intuition that it requires a great deal of creativity.

Of course, the question of what would count as "automation" of art is a complex one. GPT-4 can write coherent novels, but most people agree that they're boring and not worth reading.

Either way, I suspect we may see more and more of these kinds of Luddite strikes in the future

Your use of the word "luddite" suggests that you have a negative emotional valence towards the Actors' Guild strike. But I'm not sure why. I think it's fine for employees to try to negotiate better terms with their employer. In general, people aren't obligated to provide labor to someone else, unless both parties can come to an agreement on acceptable terms (barring previously agreed upon contractual obligations).

Just because people call something "creative", doesn't mean it actually is.

If we restrict ourselves to the domain of cognitive tasks (ignoring the complexities introduced by physical labor), then I think the speed at which different tasks get automated by AI is a decent empirical index of how much creativity a task requires.

That's the AI effect transformed from a sociological observation into an axiom of some as-yet-unformulated theory of true intelligence.

That's the AI effect

No it's not. The rate at which AI has become competent at different tasks so far has tracked closely with my intuitive notions of which tasks require more creativity. If you had asked me 5 years ago "would you say that drawing a representational image requires more creativity than playing Chess, but less creativity than high level math and science?" then I would have told you "yes".

Your use of the word "luddite" suggests that you have a negative emotional valence towards the Actors' Guild strike.

It's unfortunate that Luddite has a negative connotation, but it seems useful to have a term to refer to people who are concerned about AI taking their jobs, to the point that they're willing to go on strike to enact a ban on AI. Can you suggest a more neutral term you would prefer?

I don't mind one way or the other. I was just pointing out that, as of right now, that term is typically used pejoratively. Maybe that could change in the future.

It's either lawfare or holding out for UBI, and for the first few unfortunates to get the axe, lawfare is likely a better deal.

I'm sure that even banning cloned appearances or voices won't matter, since they can just make Legally Distinct™ versions, and there will be people who are willing to defect and cash out.

If lawyers and doctors were smarter, they'd get right on it too, at least by demanding regulators put massive burdens of evidence to prove that models perform better than humans do. Far easier to pull off today, when the models are still deficient in key areas, versus in 2 to 3 years when it becomes rather obvious they're on par or better. (Obvious, not that they already aren't in most ways that matter)

If lawyers and doctors were smarter, they'd get right on it too, at least by demanding regulators put massive burdens of evidence to prove that models perform better than humans do. Far easier to pull off today, when the models are still deficient in key areas, versus in 2 to 3 years when it becomes rather obvious they're on par or better. (Obvious, not that they already aren't in most ways that matter)

Is there actually an issue here? Doctors and lawyers are already two of the relatively few fields that are legally regulated. A person with a government license must sign off on all significant practice of either, and they are legally responsible if they make any bad decisions, regardless of whether those came from their own brains, a magazine article, or the most recent LLM. So even in the "worst case", what would AI do to these fields besides make things easier for the licensees, who would probably get paid well for just reading the output, making sure it isn't insane, and signing off on it? I don't think there'd be much push for change until the general public says something like, hey, why do we have to pay this guy so much just to rubber-stamp this AI output.

Yes, we do have more legal protection than most other professions.

My concern is that it will likely still not be enough, at least outside the US (or at least it will fail faster elsewhere). Both the UK and India have such a massive under-serviced demand for healthcare that, if the cost of meeting it with automation is deregulation, they will pay it; both have in effect already committed to it. The former has midlevels, the latter homeopathic and ayurvedic quacks.

And doctors in the US are still expensive, so if hospitals decide to cut down on our numbers while retaining only the very senior or the absolute best, it's little consolation to the 90% of us who become unemployable if the remaining 10% are making bank supervising or rubber-stamping AI.

If AI takes off the way you think it will, why would you want a cloned actor when you can have one designed from scratch by the AI to be optimal for your intended audience?

Whatever value name brands bring? I didn't claim it's optimal. There are plenty of people who watch movies because they hear that their fave actor is in it.

Don't worry about the doctors and lawyers. If we have experience in anything, it's in raising barriers to entry to protect our privileged status in the economy. There are already laws on the books making unauthorized practice of law or medicine illegal. All you other learned professions are like a century behind the curve here.

If lawyers and doctors were smarter, they'd get right on it too, at least by demanding regulators put massive burdens of evidence to prove that models perform better than humans do.

Please don't encourage them. Legislation slowing the adoption of broadly superior AI over conventional doctors is likely to cost thousands of lives.

Look, I'll be honest and say that yes, automating away medicine will bring about untold benefits for the world, and that it would be sad indeed if it was delayed or crushed in its infancy.

That's true for almost everything that can be automated, I'm an unabashed transhumanist, and I want humans to be freed from toil or the ever looming spectre of death and decay.

But those high principles are far easier to espouse when you don't have good reason to think that your life, or at the very least, your wellbeing and that of those you love, will be sacrificed in the process.

Walk in my shoes, if not a mile, just a few feet into the future. After devoting a quarter of my life to medicine, including one year where I worked for free helping the poorest of the poor (a conscious choice), to the point that my mental health deteriorated to almost nil, I found a small reason to hope that I could escape the circumstances of my birth and find a small degree of happiness out in the better parts of the world. And yet, just as I've worked harder than I ever have in my life to finally get a foot in the door, it's in the process of slamming in my face.

If we were guaranteed UBI

If I knew that I could get citizenship somewhere that could take care of me

If I knew with any degree of certainty things would work out and it would be all right

Then yes, I'd be at the front of the queue, begging to have myself made obsolete, and happy when it happened.

Anyone who finds themselves in my position and still wishes to sacrifice themselves for the sake of the rest of us, I can only applaud their nigh saintly selflessness and will promise to cherish their memory till the light fades from this world.

But I'm no saint, just a deeply scared and broken individual trying to prioritize himself and his loved ones above the world, while still adding more to this world than he takes away from it. I have saved lives, and I doubt my posting on an obscure corner of the internet takes any. If that makes someone hate me and think my stance worthy of condemnation, I certainly share some of their feelings on the matter. I just don't see it outweighing my need to be safe just a little longer.

That twitterism is forever true: You gotta Get the Bag.

Once you have the Bag, you can start worrying about other shit; but you don't owe dick to shit unless you have at least a bit of bag to your name.

It only feels like a problem to me once you have The Bag and keep trying to get more Bag.

It's reverse social contract theory; and it's why I feel more sympathy for shoplifters and gang bangers than for financial criminals and patent trolls: If you don't have security and safety, why the fuck should you abide by all the restrictions of society? What have they done for you lately?

If you don't have security and safety, why the fuck should you abide by all the restrictions of society? What have they done for you lately?

This is why a large middle class is necessary for a stable society.

A radiologist friend of mine was pretty gungho on AI when ChatGPT came out, but he's pretty rapidly soured on it and is motivated to want a lot more regulation of that sort. People have been talking about automating away radiologists for decades at this point, but This Time is Different. Really.

I have little doubt that it is technologically possible to train an AI to make most major radiological diagnoses with average or above-average accuracy. The main obstacle is medical privacy laws restricting the data. HIPAA may be the most destructive statute ever enacted by the United States Congress. The fact that it passed the Senate 100-0 is perhaps the greatest indictment of democracy that exists.

Buried in this statement is the assumption that only the United States is capable of performing cutting edge medical research.

* which is probably true

I think that’s a bit much. The intent of the law was to prevent things like your medical records being used against you at work. Like you have a genetic risk for a disease and thus become effectively unemployable because your medical history or genetic data would put you in a high risk pool that would make your boss’s insurance rates go up. And had it been properly written, I don’t think it would be a problem for AI radiology at all.

I am not claiming that ChatGPT 3.5, which was the SOTA at the time of release and what he almost certainly used, is better than most doctors. I would say it's surprisingly good, about on par with a bright med student in say their third year, which is still plenty useful.

I have, on the other hand, become well acquainted with GPT-4, which is where I make that case. Leaving aside my subjective impression, not only did it pass the USMLE, it did so with a 95th-percentile score. That's not just a begrudging pass, it's stunning.

Given that the whole point of the USMLE is to comprehensively test the capabilities of a doctor, while also gating entry into residencies, I struggle to see how that isn't overwhelming evidence in its favor. If I sat the USMLE right now, even fresh as I am from an equivalent UK examination with an even lower pass rate, I couldn't do that, and if I could, it would represent at least half a year of constant studying. Since I and tens of thousands of others who aren't 95th percentile are allowed to practise as is...

Further, while GPT-4 has multimodal capabilities, being able to look at image files and understand them, such capabilities are not currently publicly available. I struggle to see how a radiologist would be making proper use of it, let alone when they almost certainly used the older and weaker 3.5, since half their job is looking for weird blobs in an image and translating them to a legible diagnosis or at least a list of findings.

Yes. It is different this time, and we have receipts to show it. Besides, I'm not one of the people who were claiming that would be the case as early as two decades ago, if only because I was too busy memorizing multiplication tables.

At least your friend is smart enough to ask for regulation, if only for the wrong reasons. I'd do the same, but only because it's convenient for me, not because I actually think it's worse at the job.

Oh, yes. To be explicit, the souring is out of his fear of its capabilities (rather, what it's clearly tracking toward), not contempt at a lack of them. He's shifted to questions about HIPAA and big tech companies controlling data, importance of bedside manner (hilarious if you knew him), etc.

My opinion - as a writer myself - is that right now, AI is approximately at the level of your average competently-written fanfiction, which is a big problem for Hollywood because they write below that level.

As we've seen in recent years, so-called professional scriptwriters have been putting out such utter shit on the big screen and prestige TV that fans of the work would often (and do!) write superior takes for free, on the internet. The only things that separate the two are gatekeeping and connections in the notoriously nepotistic and corrupt Hollywood system.

SAG and the WGA fundamentally rely on the studio system for their collective bargaining to make any sense. As soon as photorealistic 3D generative animation gets off the ground, there will be no corporate giants for them to leech off. They see a future where automated scabs run them out of business, and I can't blame them. The mediocre products they produce can in no way compete.

I think you are right about the quality of the average Hollywood screenplay, but I don't think it's even correlated with the "sit down and write a whole script for this concept" skill of the individuals involved. What is required from a screenwriter to produce an excellent movie is the skill set to get an excellent script onto the screen through the gauntlet of executives, producers, directors, and actors who all have a stake in the final product, which your average fanfic writer would be completely incapable of. An LLM may or may not fare better.

What you've said is likely true, but sinecures are rife throughout all economies and will be extracted in due course. I'm not blaming SAG or anyone else, because these are people that have invested their lives (5+ years) into their position and you can understand why they will fight and burn out. Wait until AI hits the lawyers..

My opinion - as a writer myself - is that right now, AI is approximately at the level of your average competently-written fanfiction, which is a big problem for Hollywood because they write below that level.

Man, as a reader, I have to disagree. AI can write individual paragraphs much better than most writers, but when it comes to themes, character growth, plot, and simple internal consistency between chapters, I find AI sorely lacking. Maybe I'm overestimating your average competently-written fanfic though.

I think that's mostly the short context window. AI can't keep the details straight because it has anterograde amnesia and forgets what it was talking about after a few thousand words.

Right now we are in the centaur stage where a competent fanfic writer stitching together ChatGPT-4 paragraphs and correcting the AI when necessary could produce a better product, or at least be much more prolific at creating an equivalent product.

When ChatGPT-5 is available and it has a context window large enough to fit entire novels, which will surely happen in the next ten years, that won't be true anymore.
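
For the curious, here's a minimal sketch of what that centaur loop can look like in code. It only illustrates the rolling-summary trick for working around a short context window; it assumes the OpenAI Python SDK (openai>=1.0) and the model name "gpt-4", and the human_edit() step is a hypothetical stand-in for the fanfic writer's corrections, not anything any actual writer uses.

```python
# Sketch of a "centaur" writing loop: the model only ever sees a rolling
# summary plus the most recent passage, so the short context window matters
# less. Assumes the OpenAI Python SDK and an API key in the environment.
from openai import OpenAI

client = OpenAI()

def generate_passage(summary: str, last_passage: str, instruction: str) -> str:
    """Ask the model for the next passage, conditioned on the rolling summary."""
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "You are co-writing a novel with a human editor."},
            {"role": "user", "content": (
                f"Story so far (summary):\n{summary}\n\n"
                f"Most recent passage:\n{last_passage}\n\n"
                f"Write the next passage. {instruction}"
            )},
        ],
    )
    return resp.choices[0].message.content

def update_summary(summary: str, new_passage: str) -> str:
    """Fold the accepted passage back into the rolling summary."""
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[{
            "role": "user",
            "content": (
                "Update this story summary to include the new passage, "
                f"keeping it under 500 words.\n\nSummary:\n{summary}\n\n"
                f"New passage:\n{new_passage}"
            ),
        }],
    )
    return resp.choices[0].message.content

def human_edit(draft: str) -> str:
    # Hypothetical: in practice this is where the writer fixes continuity
    # errors, flat dialogue, and forgotten plot threads by hand.
    return draft

summary, last_passage = "Nothing has happened yet.", ""
for beat in ["Introduce the protagonist.", "Introduce the antagonist.", "First confrontation."]:
    draft = generate_passage(summary, last_passage, beat)
    last_passage = human_edit(draft)
    summary = update_summary(summary, last_passage)
    print(last_passage)
```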

For sure. I'd be interested in seeing what the costs for larger context windows will look like. Claude already has a 100k context window, but it's pretty expensive. GPT-4 also has a 32K context window that's even more so. It seems that as the models get stronger they get significantly more expensive to build and operate too.

I think we will soon run into two fundamental constraints:

  1. We will run out of external data to feed the AI, meaning further training will yield diminishing returns
  2. AI-produced text will mingle with human-produced text in somewhat undetectable ways, lowering the quality of any affected data.

My opinion - as a writer myself - is that right now, AI is approximately at the level of your average competently-written fanfiction, which is a big problem for Hollywood because they write below that level.

Half the reason I wrote an 83k-word online web serial is to serve as future evidence that I'm a good writer without AI enhancement, for whatever little street cred that's worth.

I'm incorrigibly lazy, so I've regularly tried my hand at making GPT-4 write fiction both for personal consumption and to do my work for me, and have been disappointed every single time. I haven't even seen any prompt engineering trick that raises the bar to something I would enjoy reading at length.

Of course, it beats the average human by a longshot, just look at the kind of dross people produce in high school literature classes or sort by new in /r/WritingPrompts.

I'd say that a GPT-4-written TV show wouldn't appeal to me only because my standards for what's worth watching are already above the level of an average show; most people don't seem to be as discerning!

Half the reason I wrote an 83k-word online web serial is to serve as future evidence that I'm a good writer without AI enhancement, for whatever little street cred that's worth.

And hey, if you turn out to be right, your perspective, values, and ideas might become a much larger part of the agglomerate superintelligence's personality.

Thanks, though it only makes me about 0.5% happier about potentially being turned into a paperclip haha.

I'd much rather be kept around and asked myself, since I doubt I'd have a lot better to do than chat with a superintelligence!

One possibility could be the actors raising the AI bogeyman as a cover to demand better pay. And to be sure, they are asking for a fairer split from the streaming model. Yet the AI demands are not directly linked to compensation per se, but rather ask for blanket bans. This does suggest that the AI fears are genuine. Given the very rapid progress in generative models in recent years, perhaps they are right to be worried.

All you need to do is watch Rogue One to see their fears with the CGI Tarkin and Leia.

It's already happened. But I guess AI might make it cheaper and easier

Might as well shut the door now.

Anyways, I think AI and having to put out streaming numbers (to pay more equitable residuals) have been reported as the big sticking points.

The AI thing is obvious but the streaming numbers may be more interesting, in the short run, for us.

Many of these services may just be horribly unprofitable (this whole thing has been a gold rush of trying to blitzscale while everyone else is also doing it) and we'd finally know for sure. Maybe we'll have the transparency of the box office again.

It might mark the end of the streaming wars and possible consolidation if they have to pay profits they don't have.

The economics of streaming services have always seemed really weird to me. The incentives for the services are for consumers to subscribe for as many months as possible and watch as few shows as possible (zero, optimally). That doesn't seem like a formula likely to lead to high-quality shows, and yet Netflix, Amazon, and Apple have all produced a fair amount of high-production-value (i.e. at least expensive, if not to your taste) shows. I can't see how that can be sustainable for companies where they don't get any direct payments in exchange for making good shows. This feels like the stage in taking over a market segment where the companies produce high-quality products cheaply to gain market share / starve out competitors before slowly getting worse.

It doesn't seem to be possible to fund TV shows via a more direct "customers pay for the show" model for whatever reason (probably too much up-front cost, so it would look like a really expensive Kickstarter campaign trading on the reputation of the show runners with a release 1-2 years after payment?), so I don't have a better idea, unfortunately.

I can't see how that can be sustainable for companies where they don't get any direct payments in exchange for making good shows.

I think the idea is that Netflix both builds up a catalog of quality completed work and has enough of a reputation that you stay subscribed for the new, exciting stuff they bring you. Basically, they themselves summed it up as "becoming HBO before HBO can become us". HBO has been around for a while, and they sustain themselves on subscriptions and their reputation for making good shows.

If you are Netflix or any new upstart without a strong brand it is absolutely not in your interests - in the most crowded entertainment landscape in human history - to put out shit when everyone else is trying their hardest for people's eyeballs.

The problem is that Netflix's value proposition (or a streaming service's in general) drops the more competition there is. Netflix succeeded by being a one-stop shop when everyone was content to just license out their content. The more Netflix succeeded by cornering the market, the more the competition started to wake up, the more companies tried to drown out their rivals in content (aka peak TV) in hopes of becoming enough of a monopoly to be secure, or at least enough of a major player to be comfortable, and the less everyone makes.

Especially since streaming offers strictly worse profits than the box office it's helping to kill and the DVD industry it definitely already strangled. People are paying prestige TV/film levels of cash and likely not getting that return (HBO does one GOT but then has shows like The Wire and Treme that get subsidized by it). Even worse: streaming makes piracy even worse since the minute it's on stream it's also on the Bay (COVID was a great time for just rolling out of bed, torrenting a "just released" movie and having it on your screen by the time your breakfast is ready.)

so I don't have a better idea, unfortunately.

There doesn't seem to be one. Honestly, I don't think this is even specific to streaming: I think that the internet broke cinema and media and we've been basically managing the hangover since then.

When people can get it all for free it messes with the equation. Streaming was an attempt to exploit this unfortunate fact but it only seemed to be working so well because of a very artificial period of Netflix dominance (and the strike shows that some people think it wasn't working well for them anyways)

No such system will ever be as profitable as getting a direct payment for each watch, but unless some magical uncopyable format comes out, that's over.

I think the socialists are right about this one: a lack of scarcity is kind of a failure mode for capitalism and here, unlike with patents, copyright and such, we don't have a good patch.

It would be funny if they revealed the profit numbers and it turned out a lot of streaming shows and movies are actually duds.

I have to wonder how much they'd focus on trying to bring in foreign-made content, though. Squid Game was a big damn deal for Netflix, and that was made completely outside of the Hollywood system.