Culture War Roundup for the week of December 8, 2025

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


I've believed for a long time that conspiracy theories in general are a cope to deal with the fact that the people who are in charge of our institutions are just people, like everyone else, and even when competent, they simply aren't capable of being perfect all the time. When there's universal agreement that things like presidential assassinations, mass shootings, and terrorist attacks shouldn't happen, and shouldn't be able to happen, it's oddly reassuring to blame them on the malevolence of a shady cadre of global elites, because if that's the case, then the solution—getting rid of the malevolent actors—is simple. If instead such tragic events can happen despite competent, well-meaning, hard-working people doing their jobs, that lack of control makes things much scarier.

I don't want to dwell on dark subjects, so I'll go back to media remastering, which is about as low-stakes as you can get. For the past decade or so, I've made a habit of chasing down the best-sounding versions of audio recordings I want to add to my collection. In a rational world, this would mean simply finding the most recent releases, since one would normally expect the continuing improvement of technology and best practices to yield increasingly superior results. Of course, you don't have to get too far into the weeds to learn how the "loudness wars" in CD (and now digital) mastering have hobbled sound quality since the late 90s, leading some to believe that the earlier versions, made before this practice took hold, must be superior; but even then the answer is not always clear-cut. And even going back to the original vinyl doesn't solve the problem. With any given release possibly having dozens of unique versions, finding the best one through trial and error would be expensive (especially if you're trying to get your paws on rare or foreign issues), and there are no quick and dirty rules you can follow.

The best resource for researching this is the Steve Hoffman forum, hosted by the namesake mastering engineer known for his high-quality audiophile issues. While Mr. Hoffman's warm, buttery sound is controversial, his forum attracts people from the industry, and there's a lot of inside baseball regarding the way things actually operate, a lot of it coming from Steve's own recollections. While things have changed in the recording world since the introduction of the compact disc and subsequent changeover to all-digital recording, an explanation of how the process worked in the vinyl era is instructive on the pitfalls of trying to remaster older recordings for CD.

Say an album is recorded in 1975. The band recorded that album to multitrack tape. Once the recording was finished, the engineers mixed the album down to a two-track stereo tape, called the studio master, at which point their role in the process ended. But the tape still had to make it onto a vinyl record. The studio master, or a copy, would be sent to a mastering studio where this conversion was done. The limitations of the format required that bass be cut and treble boosted on the record according to a standardized formula, with the end user's equipment reversing this process during playback (this is why a special phono preamp is required to use a record player with a stereo). The mastering engineer would also make other adjustments so that the recording would fit within vinyl's physical limitations, and make large-scale adjustments to ensure that all the tracks had similar volume, equalization, etc. This resulted in a "production master", which the mastering engineer would use to cut a wax "lacquer" (more adjustments were made during this process; the art of cutting vinyl is mostly lost, and most modern reissues sound like crap because of it), which was plated with metal and thus became a "mother". The mother was used as a mold to create "stampers", which were sent to pressing plants for the manufacture of the final disc. It should be noted that copies of the studio master would be sent to foreign labels or distribution arms in other countries, and these would create their own production masters and lacquers for their domestic releases. Both the mothers and the stampers wore out with use, and popular albums would need to be cut again for reissues.
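The standardized bass-cut/treble-boost formula described above is the RIAA curve. As a minimal sketch (the time constants of 3180 µs, 318 µs, and 75 µs are the standard RIAA values; the function itself is my own illustration, not something from any mastering tool):

```python
import math

# Standard RIAA time constants, in seconds: 3180 us, 318 us, 75 us
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_preemphasis_db(f, ref=1000.0):
    """Recording pre-emphasis gain in dB at frequency f,
    normalized to 0 dB at a 1 kHz reference (the usual convention)."""
    def gain(freq):
        w = 2 * math.pi * freq
        # Magnitude of the playback de-emphasis network...
        num = math.hypot(1.0, w * T2)
        den = math.hypot(1.0, w * T1) * math.hypot(1.0, w * T3)
        # ...inverted, because cutting applies the opposite curve
        return den / num
    return 20 * math.log10(gain(f) / gain(ref))
```

Relative to 1 kHz, the cut boosts 20 kHz by roughly +19.6 dB and attenuates 20 Hz by roughly −19.3 dB, which is why a tape carrying vinyl EQ sounds shrill and bass-starved if played flat.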

Fast forward ten years: you're a young mastering engineer who just got a job with Major Label. Both you and your employer are excited about the burgeoning digital revolution, and they want you to put together a CD release of the 1975 album, which was very popular. Since the compact disc doesn't require the same compromises that vinyl did, you want to use the studio master to ensure the best possible sound and as accurate a representation as possible of what the original engineer intended. You quickly realize, however, that this will be impossible: the record was a British release, you're working for the American label that owns the rights to it, and with your deadline you don't have time to make inquiries to see if you can get the tape on loan. To make matters worse, a series of mergers and acquisitions in the past decade means that even at home, tapes have been stashed hither and yon, and no one seems to have any idea where everything is or even what anything is, since nothing is clearly marked with anything other than a track listing. Some tapes are nth-generation copies that sound terrible, one was split up oddly for an 8-track release, one has interesting mastering choices that make it sound wholly different from any version you've ever heard, several are okay, several are EQ'd for vinyl and will require processing to make usable, and one has "DO NOT USE!" written on the box in magic marker.

Of course, the one with the cautionary language is the best-sounding by far, and in your time crunch you ignore the warning and master the album from it without asking any questions. You find out later that this was indeed the copy that had been sent over from the UK, and it was marked "DO NOT USE" because it wasn't a production copy with RIAA equalization; if it had been used to cut a lacquer, it could have ruined an entire pressing. Six years later, No-Noise is invented and the label wants to release a "better" version that takes advantage of the new technology to eliminate the tiny amount of tape hiss audible on the CD issue you made. But by this time you've moved on, and the new engineer, in a time crunch, takes the warning literally and uses an inferior copy that requires him to crank the noise reduction up to 11, absolutely killing the recording. Your original remaster was fine, and any benefits of additional noise reduction would have been dubious at best, but this new "improved" version supersedes the old one and is now the only thing available in the US market. Then ten years after that, the British label decides to do a global deluxe edition touting that it was "from the original master tapes", which was true, except by now the loudness wars are in full swing and the whole thing is compressed to shit.
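To see what "cranking the noise reduction up to 11" does, here's a deliberately crude toy sketch. Real systems like No-Noise work spectrally rather than with a simple time-domain gate, and the signal values below are made up for illustration, but the failure mode is the same: a threshold set high enough to kill hiss on a noisy copy also erases quiet program material.

```python
def noise_gate(samples, threshold):
    """Crude noise 'reduction': zero out anything quieter than the
    threshold. A gentle setting removes only hiss; cranked up, it
    also eats quiet program material (reverb tails, fades)."""
    return [s if abs(s) >= threshold else 0.0 for s in samples]

# Hypothetical audio snippet: loud notes, a quiet decay, faint hiss
signal = [0.5, 0.4, 0.05, -0.03, 0.02, -0.6]
gentle = noise_gate(signal, 0.01)   # leaves everything intact here
cranked = noise_gate(signal, 0.1)   # silences the decay along with the hiss
```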

Meanwhile, back in 1986, a British label exec decided that the CDs should sound as close to the original albums as possible, and specified that the production masters should be used. The Dutch arm couldn't find their own tapes, so they requested them from the British label and were given the actual master tapes for their 1989 domestic release, which is clearly superior to every other available release, but the Dutch switched to the US mastering two years later. So now you're looking to buy the CD and you're confronted with a bevy of options. The 2002 global release is available in any store but sounds harsh and overcompressed. It's an ear-bleeder. The 1991 US version is readily available on the used market but sounds even worse. The old British version is easily available and sounds okay but not great. The original US version is kind of hard to find and sounds better than the British version but still has its problems. And the 1989 Dutch version is obviously the best, but it was only in production for two years and used copies go for top dollar.
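One rough way to quantify "harsh and overcompressed" is crest factor, the peak-to-RMS ratio of the signal: brickwall-limited loudness-war masters score markedly lower than dynamic ones. A minimal sketch, using synthetic signals rather than any real master:

```python
import math

def crest_factor_db(samples):
    """Peak-to-RMS ratio in dB; a rough proxy for dynamic range.
    Heavily limited 'loudness war' masters score low."""
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(peak / rms)

# A pure sine has a crest factor of ~3.01 dB; hard-clipping it
# (the limit case of brickwall limiting) pushes it toward 0 dB.
sine = [math.sin(2 * math.pi * i / 100) for i in range(1000)]
clipped = [max(-0.3, min(0.3, s)) for s in sine]
```

Real dynamic-range metering (e.g. the EBU R128 loudness-range measure) is more sophisticated, but the peak-to-RMS intuition is the same.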

The point of all of this is that no one intended for there to be a whole bevy of crappy releases. Record companies had discovered a gold mine in reselling albums their customers had already bought, and to do so they had to tout some improvement over the previous sound. So they latched on to anything they could find that was technically true, regardless of whether it was the best possible version or even an improvement over the original vinyl. I highly doubt the Mad Men errors were part of a cynical ploy to drum up publicity through a fuck-up, because I don't see what they have to gain. The show is from 2007, so it's not like anything is going to require a great deal of restoration. The whole 4k thing is a marketing gimmick because the benefits over 1080p are dubious to begin with on most televisions, and are completely negated by whatever lossy compression algorithm they're using. As long as it's technically in 4k they haven't lied, and the goal is to get the product out as cheaply and quickly as possible. They probably sent out the wrong files to production, or raw, unedited files, and asked for a conversion into 4k, and didn't bother to verify anything. This isn't a restoration that's done frame by frame but a combination of computer upscaling and rescanning film at higher resolution. They weren't paying anyone to pick over all 100 hours or whatever with a fine-toothed comb.

So there were obvious mistakes, but they'll be fixed in a week, and nobody will care or even remember after that. I certainly don't think a significant number of people are going to subscribe to HBO Max now because they didn't know about the 4k version of Mad Men, at least not enough to justify doubling production costs.

  1. It is wild that vinyl, tapes, and CDs were physical artifacts that still had "git branching hell" syndrome. The more things change, the more they stay the same, I suppose.

  2. I have a 4k TV and sometimes "4k" really does look better, and other times it doesn't. For a while, I was assuming that some sort of compression algorithm behind YouTube was the culprit. A friend suggests that the 60 FPS versions are what I see as better, not the resolution. I am not an A/V-phile. What's "real" 4k versus not? Do I really get more if it's 60 FPS? Are there .... more FPS out there?

I don’t know what formats you are using, but if you are streaming there’s a good chance that the service is actually streaming at less than 4K resolution much of the time, even if you have it set to 4K.

What's "real" 4k versus not?

I think what @Rov_Scam was getting at is that, if you compare a 240p video and a 1080p video, the difference in quality will be obvious even if they're encoded with a lossy codec like H.264. Whereas when comparing 1080p with 4K, the differences are so subtle that the only way you could tell them apart is if you were looking at an uncompressed video file.

The 4K videos being streamed via HBO Max are "real" 4K, in the sense that the image is 3840x2160, but because of the lossy compression used to make them playable over the internet, most viewers wouldn't be able to tell the difference between that and a 1080p version of the same video file.
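A back-of-the-envelope way to see this is bits per pixel per frame: quadrupling the pixel count without quadrupling the bitrate leaves the encoder with less to spend on each pixel. The bitrates below are illustrative assumptions, not HBO Max's actual figures:

```python
def bits_per_pixel(bitrate_mbps, width, height, fps=24.0):
    """Average encoded bits available per pixel per frame - a crude
    gauge of how starved a lossy stream is."""
    return bitrate_mbps * 1e6 / (width * height * fps)

# Illustrative, assumed bitrates (not any service's real numbers):
# a 15 Mbps "4K" stream has fewer bits to spend on each pixel than
# an 8 Mbps 1080p stream of the same content.
uhd = bits_per_pixel(15, 3840, 2160)   # ~0.075 bits/pixel/frame
fhd = bits_per_pixel(8, 1920, 1080)    # ~0.161 bits/pixel/frame
```

This is only a crude gauge, since modern codecs compress higher resolutions more efficiently per pixel, but it illustrates why a nominally "4K" stream can look no better than a good 1080p one.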

What's "real" 4k versus not?

People have been complaining about the quality of Netflix's "4K" offerings for a while now. But judging the quality of a video is much more complicated than just looking at the bitrate (1 2).

Are there .... more FPS out there?

Yes, Ang Lee's last two films were shot at 120 fps.

I haven't seen either, but even watching this clip from the latter (which has been downsampled from 120 to 60 fps), the effect is weird. Somebody in the comments said that in a strange way it makes the movie seem too real by making the artifice inherent to the medium too obvious for the viewer to suspend disbelief, which is kind of what I'm feeling when I watch it. For some reason the high frame rate makes it really obvious that you're looking at a soundstage in a way that isn't obvious merely from a 4k film shot at 24 fps.

When The Hobbit came out, it attracted controversy for being filmed at 48 fps, which many viewers found distracting in the same way. There was a period when it looked like high frame rates might be the future of cinema, but truthfully I can't remember any movie since Gemini Man touting them as a selling point. This article lists a handful of movies since then which have been filmed at unusually high frame rates, invariably 48 fps: it's quite a ways from becoming industry standard. Curiously, Sonic the Hedgehog 3 was filmed at 48 fps but released at 24 fps. Wonder why they even bothered.