
Culture War Roundup for the week of October 3, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Okay but garbage collection is better than reference counting for performance though, and reference counting is better than manual for sanity, certainly so when you're writing trees.

garbage collection is better than reference counting for performance though

These aren't the only two options available, especially with all the advanced research that has gone into static analysis these days.

And still, if you want any kind of performance, you have to worry about locality and all the other fun issues that only come up if you understand what is actually happening behind the scenes.

Doing manual memory management well requires more skill than just slapping stuff together, and understanding how it works is also very important for using automatic memory management well. If you write a nice, GCed, dynamic language naively, or use a ton of lambdas or temporary objects, it may perform very poorly, whereas making it work well with the JIT makes your code look more like C.

I'm not dismissing garbage collection wholesale. I'm dismissing programmers who have known nothing else.

Here is an example. Once upon a time I had to shove about ten thousand objects into NASA WorldWind and see how it ran. It ran like dog shit, because whatever fresh graduate they had coding it reallocated memory for each object's screen transformation every frame. With a few thousand objects to render, that triggered the GC literally multiple times a frame. That is a level of idiocy only possible if you learned to program in an environment where you were barely cognizant that memory was a resource with costs attached.
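To make the anti-pattern concrete, here is a minimal sketch, in JavaScript rather than WorldWind's Java, with hypothetical names (`draw`, `renderFrame*`), of allocating per frame versus reusing a scratch object:

```javascript
// Stand-in for the real draw call; just counts invocations.
let drawCalls = 0;
function draw(t) { drawCalls++; }

// Anti-pattern: allocate a fresh transform object for every object, every
// frame. With a few thousand objects at 60 fps, that is hundreds of
// thousands of short-lived allocations per second -- constant GC pressure.
function renderFrameNaive(objects) {
  for (const obj of objects) {
    const transform = { x: obj.x * 2, y: obj.y * 2 }; // new allocation per call
    draw(transform);
  }
}

// Fix: compute into one preallocated scratch object, reused across calls.
const scratch = { x: 0, y: 0 };
function renderFrameReuse(objects) {
  for (const obj of objects) {
    scratch.x = obj.x * 2;
    scratch.y = obj.y * 2;
    draw(scratch); // zero allocations in the hot loop
  }
}
```

Both versions draw the same transforms; the second just never hands the collector any garbage in the render loop.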

I'm not dismissing garbage collection wholesale. I'm dismissing programmers who have known nothing else.

Eh, this basically feels like a box out of the famous XKCD comic.

Because whatever fresh graduate they had coding it reallocated memory for each object's screen transformation every frame

This really is a problem, though: if you write 'clean functional immutable nodejs' where every single function creates a new immutable object, it'll perform terribly. I've done it a few times on fun hobby projects, and it really did perform badly.
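The churn that style produces is easy to see in a toy sketch (illustrative only, hypothetical `immutableStep`/`mutatingStep` names, not a benchmark):

```javascript
// "Clean immutable" style: every update allocates a brand-new object
// (the spread copies every field) and abandons the old one to the GC.
function immutableStep(state) {
  return { ...state, tick: state.tick + 1 };
}

// Mutating style: one object lives for the whole run; no per-step allocation.
function mutatingStep(state) {
  state.tick += 1;
  return state;
}

// A million steps: the immutable version churns out a million throwaway
// objects for the collector to sweep; the mutating one allocates nothing
// in the loop.
let immutableState = { tick: 0, label: "demo" };
for (let i = 0; i < 1000000; i++) immutableState = immutableStep(immutableState);

const mutableState = { tick: 0, label: "demo" };
for (let i = 0; i < 1000000; i++) mutatingStep(mutableState);
```

Immutability has real benefits for reasoning about code, but in a hot path the allocation cost is not free just because the GC hides it.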

And learning manual memory management and writing some C is definitely worth doing while learning, and many fields of programming still require it. (That said, in most contexts doing it manually, or even worrying that much about memory management, is awful and just slows you down.)

If you don't understand the difference between dick measuring over your choice of text editor, and fundamentally not understanding how a computer works, then yes, it's superficially similar.

My disgust reaction to argumentum ad XKCD aside, it does make me wonder: did programmers of old complain when things moved from byte code on punch cards to disks and actual languages, with compilers and all? I certainly had a few professors who seemed to have such a chip on their shoulder.

It wasn't byte code on punch cards to disks and actual languages. Compilers came rather early in the game, and macro-assemblers too: FORTRAN goes back to 1957, COBOL to 1959, and the well-known FORTRAN IV to 1962. FORTRAN was famously designed for punched cards, which is why the strict column rules exist in FORTRAN 77 and earlier. Aside from grouching about "you're not a real programmer until you've had to sort the box of punched cards you dropped on the ground", I don't think there was serious complaining about the move away from them. Complaints about assembler vs. high-level languages survived for much longer.

Yeah, it's probably fair that your point deserved more care and elaboration than argumentum ad XKCD can provide. Which: sorry about that! I was overly flip.

So!

Fundamentally software is a rickety tower of abstractions built on abstractions built on abstractions. At the lowest level you've got logic gates, and if you put enough of those (and some other stuff) together in the right configurations you can make stuff like arithmetic logic units; and if you put enough stuff of basically that abstraction layer together, you have yourself a CPU, and that and some other bits gets you a computer; and then you have the BIOS, the OS on top of that, and the language runtime of the stuff you're working on on top of that, and your program running on top of that. Obviously you already know this.

And the reason this basically kinda works is that a long time ago programmers figured out that the way to productivity is to have hardened interfaces at which you program; the point of these interfaces is to avoid having to concern yourself with most of the vast underground of abstractions that form a computer. Which means that most programmers don't really concern themselves with those details, and honestly it's not clear to me they should in the typical case.

That's because making maintainable software is about ensuring that you are, at all times, programming at the level of abstraction appropriate to your problem domain, going neither higher (resulting in perf issues, typically) nor lower (resulting in bugs and long implementation times as you re-invent the wheel over and over). For every guy who tanks the performance of an app by not respecting the garbage collector, there's another who decides to implement his own JSON parser "for efficiency" and hooks it up to the [redacted] API, resulting in several extremely-difficult-to-debug issues in production that I personally burned several hours fixing, all to shave milliseconds off an hourly batch process's running time. Not that I'm bitter.

So I guess that sort of statement-- "you're only a good programmer if you've used a language with manual memory management"-- feels like unjustified programmer-machismo, where someone chooses one of those abstraction layers between bare physics and the language runtime more-or-less arbitrarily and says "ah, but only if you deeply understand this specific abstraction layer can you truly be a good programmer."

Admittedly I work in distributed systems, where 99% of things that actually matter for performance occur over the network.

I'll second memory management, or more broadly understanding pointers, as being an important boundary.

There is nothing fundamentally lost moving from machine code to assembly - one's just shorthand for the other. Transforming C mechanically into assembly by hand is not hard, just pointless and tedious. But languages "above" memory really truly do lose sight of something. There is nothing in the Haskell Man's conceptual toolbox he can use to get a handle on the memory of the system he runs on.

I'll grant it's not often important, but there is a real line between "languages which require memory awareness" and languages that do not, and it's not arbitrary. A real aspect of system execution and performance is totally lost.

Abstractions leak. All of them, even down to the physical layer. A great programmer needs to know enough about the layers below to recognize when they're being bitten by a leaky abstraction. If you don't understand enough about the layers below to recognize an operating system call that doesn't work, or a CPU that mis-executes an instruction, or a memory management system that causes your code to grind to a halt while you pay the cost of deallocation, you're going to be unable to solve the problem. Thing is, physical-layer issues are very rare (and simple bit flips are much more common than something as esoteric as rowhammer). Operating system bugs are at least reasonably rare with mature operating systems. Issues due to allocators, on the other hand, happen ALL THE TIME if you have a lot of data. So they're rather more important.
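One common way the allocator abstraction leaks, even in a GCed language, is data layout. A small illustrative sketch (the names are made up; this shows the layout difference, not a timed benchmark):

```javascript
// Pointer-chasing layout: an array of heap objects. Each element is its
// own allocation, potentially scattered across the heap, so summing them
// chases a pointer per element and trashes cache locality.
const n = 100000;
const boxed = Array.from({ length: n }, (_, i) => ({ value: i }));

// Cache-friendly layout: one contiguous typed-array buffer, a single
// allocation, sequential memory access.
const flat = new Float64Array(n);
for (let i = 0; i < n; i++) flat[i] = i;

function sumBoxed(xs) {
  let s = 0;
  for (const x of xs) s += x.value; // dereference per element
  return s;
}

function sumFlat(xs) {
  let s = 0;
  for (let i = 0; i < xs.length; i++) s += xs[i]; // sequential reads
  return s;
}
```

Both sums compute the same number; the language hides the difference in layout completely until you profile, which is exactly the kind of leak you can only diagnose if you know what the layer below is doing.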

I would honestly put both our complaints in the bucket of "Programmer is ignorant about the level of abstraction they are working at." But people don't know what they don't know. And the only way out is for them to have worked at multiple levels of abstraction to know the bounds of the one they are currently operating in.

I know some really old programmers (old enough to remember punch cards) and while I wouldn't say they are unhappy with the spectacular leaps in ease and power of computing, there is definitely an element of "I had to walk uphill ten miles in the snow both ways" in their attitudes.