This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.

It is wild that vinyl, tapes, and CDs were physical artifacts that still had "git branching hell" syndrome. The more things change, the more they stay the same, I suppose.
I have a 4K TV, and sometimes "4K" really does look better, and other times it doesn't. For a while, I was assuming that some sort of compression algorithm behind YouTube was the culprit. A friend suggests that the 60 FPS versions are what I see as better, not the resolution. I am not an A/V-phile. What's "real" 4K versus not? Do I really get more if it's 60 FPS? Are there ... more FPS out there?
In many ways it was worse: in analog, you lose fidelity from the original every time you make a copy. You see this a bit with accumulating recompression artifacts on images, but professional production pipelines manage to limit it. On film, though, generation loss has historically made effects shots look "worse" than other shots (more copies), leading to some unintuitive choices. For parts of 2001, Kubrick did multiple exposures on the same piece of film to minimize compositing noise (most notably the darks washing out to shades of gray), which is why it looks good even by modern standards.
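If you want to see digital generation loss for yourself, a minimal sketch is to re-encode the same JPEG a few dozen times and compare the result to the original (this assumes Pillow is installed; "frame.jpg" is just a placeholder filename):

```python
# Generation-loss demo: re-encode the same JPEG repeatedly and watch
# artifacts accumulate, loosely analogous to copying analog media.
# Assumes Pillow is installed; "frame.jpg" is a placeholder filename.
from PIL import Image

img = Image.open("frame.jpg").convert("RGB")
for _ in range(50):
    img.save("copy.jpg", quality=75)             # each lossy save is one "copy"
    img = Image.open("copy.jpg").convert("RGB")  # ...which becomes the next source

img.save("generation_50.jpg")  # compare this against the original frame.jpg
```

Professional pipelines limit exactly this by editing from lossless intermediates and only compressing once at the end.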
It might not be the culprit, but have you disabled "motion smoothing"? If you're noticing a quality difference on the basis of framerate, that setting is an extremely common problem, and it's often turned on by default to impress old people.
Motion smoothing on new TVs is absolutely disgusting.
Speaking as an old person, motion smoothing sucks. Old people are the only people who remember why that's called the "soap opera effect".
The effect of native 60p is not quite the same. I'm not sure about 24p on a 120 Hz display, never having had one.
I don’t know what formats you are using, but if you are streaming there’s a good chance that the service is actually streaming at less than 4K resolution much of the time, even if you have it set to 4K.
I think what @Rov_Scam was getting at is that, if you compare a 240p video and a 1080p video, the difference in quality will be obvious even if they're encoded with a lossy codec like H.264. Whereas when comparing 1080p with 4K, the differences are so subtle that the only way you could tell them apart is if you were looking at an uncompressed video file.
The 4K videos being streamed via HBO Max are "real" 4K, in the sense that the image resolution is 3840x2160, but because of the lossy compression used to make them playable over the internet, most viewers wouldn't be able to tell the difference between that and a 1080p version of the same video file.
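One rough way to see why "real" 4K can still look soft when streamed is to compare how many encoded bits each pixel actually gets. The bitrates below are ballpark assumptions for illustration, not measured figures from any particular service:

```python
# Rough bits-per-pixel comparison between streamed 4K and disc-quality 1080p.
# The bitrates are illustrative assumptions, not measured service figures.

def bits_per_pixel(bitrate_mbps, width, height, fps):
    """Average encoded bits available per pixel per frame."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

sources = {
    "Streamed 4K (~15 Mbps)":   (15, 3840, 2160, 24),
    "Streamed 1080p (~5 Mbps)": (5,  1920, 1080, 24),
    "Blu-ray 1080p (~30 Mbps)": (30, 1920, 1080, 24),
}

for name, args in sources.items():
    print(f"{name}: {bits_per_pixel(*args):.3f} bits per pixel")

# Under these assumptions the streamed 4K version gets *fewer* bits per pixel
# than either 1080p source, which is why the extra resolution often doesn't show.
```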
You can definitely tell the difference between compressed 4K and 1080p. I just picked a random video on YouTube: https://youtube.com/watch?v=oqxAJKy0ii4
The 4K version has a lot of detail that is too small to be represented in 1080p, and on a big screen you can see it. I know the YouTube 1080p version is gimped with a low bitrate, but one thing you can do is watch the video at 4K, then keep the player set to 4K but change your screen resolution to 1080p. The details will be gone.
I have no clue about HBO Max, but at least YouTube 4K has enough detail to make 4K mean something.
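If you'd rather test the detail claim offline than by changing your screen resolution, a quick sketch is to round-trip a 4K frame through 1080p and measure what doesn't survive (this assumes Pillow and NumPy; "uhd_frame.png" is a placeholder for any 3840x2160 screenshot):

```python
# Round-trip a 4K frame through 1080p and measure the detail that is lost.
# Assumes Pillow and NumPy; "uhd_frame.png" is a placeholder filename.
import numpy as np
from PIL import Image

uhd = Image.open("uhd_frame.png").convert("L")   # original 4K frame, grayscale
fhd = uhd.resize((1920, 1080), Image.LANCZOS)    # what a 1080p version can carry
back = fhd.resize(uhd.size, Image.LANCZOS)       # scaled back up for a 4K screen

a = np.asarray(uhd, dtype=np.float64)
b = np.asarray(back, dtype=np.float64)
rmse = np.sqrt(np.mean((a - b) ** 2))
print(f"RMS detail lost in the 1080p round trip: {rmse:.2f} (on a 0-255 scale)")
```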
I believe you, but this is something I'd really like to verify with a double-blind. Some sound engineers claim they can tell the difference between a 192 kHz sample rate and 96 kHz, but I'm honestly sceptical.
Honestly for the casual viewer it's very subtle. But the eye is objectively quite capable of seeing waaay past 1080p. There's a reason that 1080p laptop displays have fallen rapidly out of favor and "retina" class screens are now standard. And why 1080p is the standard for a 5 inch phone screen.
Meanwhile there is no scientific evidence that humans can hear the ultrasonic content a 192 kHz sample rate can capture. There are some reasons that someone could pass an A/B test, though. The transfer function of the equipment could be such that adding or removing ultrasonic sound also affects the audible sound. This would be an artifact of the equipment and not something different in human perception. Alternatively, the human listener could be absorbing and perceiving the ultrasonic energy in a way that's different than typical hearing. It could just be that the vibration is picked up by nerves outside the ears incidentally and therefore allows the listener to distinguish the recording.
More specifically, there has not been a single remotely credible test that would even hint at finding anyone who could hear past 20 kHz. Every time that has been claimed, the claimed ability has completely disappeared when intermodulation distortion in the playback path was removed by using a separate amplifier and speaker element for the > 20 kHz part.
Even then, nonlinear acoustics is a legitimate field of study, and you might want to verify no lower intermodulation products are appearing from thin air as it were. But IIRC that's mostly with higher amplitude sounds.
Nonlinear acoustics is a thing but the amplitudes are orders of magnitude higher and cause more or less immediate hearing damage.
Speaker elements OTOH are well known to have by far the highest distortion of any of the common audio components (right after the ear itself but that gets deep into psychoacoustics). Once that confounding factor is eliminated, tests haven’t managed to demonstrate any instances of someone hearing beyond 20 kHz. It’s not for lack of trying either as plenty of manufacturers would love to have a peer reviewed paper to point at as a sales argument for their 96 or 192 kHz sampling rate capable equipment.
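For anyone curious what that intermodulation confound looks like, here's a toy numerical sketch: two purely ultrasonic tones pushed through a mildly nonlinear "speaker" come out with an audible difference tone at f2 - f1 = 2 kHz. The quadratic nonlinearity is an illustrative assumption, not a model of any real driver.

```python
# Toy intermodulation demo: two tones above 20 kHz pass through a mildly
# nonlinear "speaker" and produce an audible difference tone at f2 - f1.
# The quadratic term is an illustrative stand-in for real driver distortion.
import numpy as np

fs = 192_000                    # sample rate high enough to carry ultrasonics
t = np.arange(fs) / fs          # one second of signal
f1, f2 = 25_000, 27_000         # both tones are above the ~20 kHz hearing limit
x = 0.5 * np.sin(2 * np.pi * f1 * t) + 0.5 * np.sin(2 * np.pi * f2 * t)

y = x + 0.1 * x**2              # second-order distortion term

spectrum = np.abs(np.fft.rfft(y)) / len(y)
freqs = np.fft.rfftfreq(len(y), d=1 / fs)
audible = (freqs > 20) & (freqs < 20_000)
peak = freqs[audible][np.argmax(spectrum[audible])]
print(f"Strongest audible component: {peak:.0f} Hz")  # ~2000 Hz, despite purely ultrasonic input
```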
I wasn't referring to ultrasonic frequencies (I don't think anyone's arguing that humans can hear literal dog whistles), but rather to the recording sample rate.
The way sample rates work, per the Nyquist-Shannon sampling theorem, is that in order to represent a frequency in a signal we need to sample at at least twice that frequency. Since humans can hear a maximum of about 20 kHz, we would only need 40 kHz to represent that. Accounting for some additional effects that require a bit of headroom (mainly the roll-off of the anti-aliasing filter), CD samples at 44.1 kHz and DVD at 48 kHz. Those sampling rates weren't chosen arbitrarily, but for the specific purpose of being able to accurately record and reproduce the full spectrum of human hearing. All sampling any higher does is increase the theoretical limit of high frequencies that can be carried on the medium. I say theoretical because microphones aren't designed to record ultrasonic frequencies and speakers and amplifiers aren't designed to reproduce them. So all a 192 kHz recording means is that, assuming you were able to record it, the medium could theoretically encode frequencies up to 96 kHz, or nearly quintuple the limit of human hearing. Professionals will record at this rate, but that's because having headroom when the files are manipulated digitally prevents certain bad things from happening if they're opened and saved a lot, though I'm not sure of the exact science behind this. Suffice it to say that higher sampling rates offer no benefit to the consumer.
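The arithmetic in the paragraph above is easy to check; a minimal sketch:

```python
# Nyquist arithmetic: a sample rate of fs can represent frequencies up to fs / 2.
hearing_limit_hz = 20_000
rates = {"CD (44.1 kHz)": 44_100, "DVD (48 kHz)": 48_000,
         "Hi-res (96 kHz)": 96_000, "Hi-res (192 kHz)": 192_000}

for name, fs in rates.items():
    nyquist = fs / 2
    print(f"{name}: captures up to {nyquist / 1000:.2f} kHz, "
          f"{(nyquist - hearing_limit_hz) / 1000:+.2f} kHz beyond the hearing limit")
```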
The story is similar with bit depth. CDs are recorded at 16 bit, but modern "Hi-Res" formats go up to 24 bit and probably higher if you look for it. All this does is lower the noise floor. There's a certain amount of noise inherent in digital recording due to rounding off the last bit, and it sounds similar to tape hiss. You've probably never heard it because it's at -96 dB at 16 bit. By comparison, the noise floor in the best analog sources is at -70 dB, and that's only if you're using filters; it's more like -32 dB naturally. An unusually dynamic recording is going to have around 20 dB of dynamic range, a more typical recording will be in the 14 dB range, and most contemporary loudness-war recordings will have less than 10 dB. Essentially no music exists that ranges from so quiet you'd struggle to hear anything to as loud as standing next to a freight train, and the technology already allows for that. 24 bit just lowers the noise floor to -144 dB, which is quieter, but with the noise floor already so low as to be inaudible at normal listening levels, the extra range is completely pointless, although if you needed to record an incredibly quiet sound and a jet engine takeoff on the same recording then hey, go for it.
This is not correct. The best analog recording media have a signal-to-noise ratio of around 70 dB without noise reduction (i.e. dynamic filtering). With noise reduction, > 90 dB SNR is possible, but you get artifacts as a tradeoff that depend on the noise reduction method.
Analog sources can quite trivially have > 100 dB SNR. A simple example would be a high-quality oscillator that's switched on and off.
This is a bit of a simplification in that it only considers the variation in short-term loudness. You need to additionally consider the difference between the short-term loudness and the quietest audible element at that moment (somewhat less than 70 dB). Not coincidentally, this 70 dB combined with 20 dB of loudness variation is close to the CD limit of 93 dB (3 dB lower than the theoretical 96 dB due to dithering).
24 / 32 bits is of course extremely useful for recording and processing audio, as you don't need to optimize the recording levels nearly as closely (and risk clipping) and you avoid excessive noise accumulation over various processing passes. There are even recorders that combine multiple converters internally to record the entire dynamic range possible with room-temperature electronics, so that the resulting SNR is always optimal without ever clipping until the operating voltage limits of the recorder itself are exceeded.
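For reference, the bit-depth figures quoted in this exchange come straight out of the ~6 dB-per-bit rule; a quick sketch of the arithmetic (the 1.76 dB term is the textbook correction for a full-scale sine, and dithering costs roughly 3 dB, as noted above):

```python
# Dynamic range per bit depth: ~6.02 dB per bit, plus 1.76 dB for a
# full-scale sine in the textbook SNR formula.
for bits in (16, 24, 32):
    rule_of_thumb = 6.02 * bits            # the "-96 dB at 16 bit" figure
    full_scale_sine = 6.02 * bits + 1.76   # textbook SNR for a full-scale sine
    print(f"{bits}-bit: ~{rule_of_thumb:.0f} dB rule of thumb, "
          f"{full_scale_sine:.1f} dB theoretical")
```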
People have been complaining about the quality of Netflix's "4K" offerings for a while now. But judging the quality of a video is much more complicated than just looking at the bitrate (1 2).
Yes, Ang Lee's last two films were shot at 120 fps.
I haven't seen either, but even watching this clip from the latter (which has been downsampled from 120 to 60 fps), the effect is weird. Somebody in the comments said that in a strange way it makes the movie seem too real by making the artifice inherent to the medium too obvious for the viewer to suspend disbelief, which is kind of what I'm feeling when I watch it. For some reason the high frame rate makes it really obvious that you're looking at a soundstage, in a way that isn't obvious merely from a 4k film shot at 24 fps.
When The Hobbit came out, it attracted controversy for being filmed at 48 fps, which many viewers found distracting in the same way. There was a period where it looked like high frame rates might be the future of the cinema, but truthfully I can't remember any movie since Gemini Man touting them as a selling point. This article lists a handful of movies since that one which have been filmed at unusually high frame rates, invariably 48 fps: it's quite a ways from becoming industry standard. Curiously, Sonic the Hedgehog 3 was filmed at 48 fps but released at 24 fps. Wonder why they even bothered.
When you're in the real world looking at things, motion blurs as you move your head. When you're watching a movie, staring at the screen without moving, this doesn't happen. The upshot is that things that would blur in real life don't blur on the screen, and it looks fake. There are also issues where film frame rates are multiples of 24 while most screens refresh at multiples of 60, which puts the frame rate out of sync with the refresh rate and causes a stutter effect.
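The refresh-rate mismatch is easy to see with a toy cadence calculation (this is only a sketch of the scheduling problem, not how any particular TV actually handles it): 24 doesn't divide 60, so frames get held for alternating 2 and 3 refreshes, while a 120 Hz display can show each frame for exactly 5.

```python
# Toy cadence calculation: how many refresh cycles each film frame gets
# on a given display. A sketch of the scheduling problem, not of any real TV.
def cadence(film_fps, refresh_hz, frames=6):
    """Refresh counts per film frame, distributed as evenly as integers allow."""
    per_frame = refresh_hz / film_fps
    holds, error = [], 0.0
    for _ in range(frames):
        hold = round(per_frame + error)
        error += per_frame - hold
        holds.append(hold)
    return holds

print(cadence(24, 60))    # [2, 3, 2, 3, 2, 3] -> uneven 3:2 pulldown, seen as judder
print(cadence(24, 120))   # [5, 5, 5, 5, 5, 5] -> even cadence, no judder
print(cadence(30, 60))    # [2, 2, 2, 2, 2, 2] -> even cadence
```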
Whoa! Yeah. I can't exactly put it into words but it's very palpable. I feel like I'm watching something that hasn't been fully mastered or edited yet. There isn't that Cinema "filter" on it that makes my brain go "Oh, cool, movies!"