
Culture War Roundup for the week of October 3, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


What is your ideal programming education?

Recently, trying to teach my younger brother (a CS freshman at a Canadian university) programming devolved into a yelling session (kicked the dog there, I admit), which left me wondering about the state of programming education.

How is this CW?
  • Because in any discussion of any type of education system there is an undercurrent of disagreement between the blank-slatists and the "IQ believers" (or whatever that group is called).

  • How to teach something can also be split along CW lines. See common core, phonics vs whole language, etc.

  • On top of that there is the group representation angle. Certain groups of people are disproportionately represented in programming professions.

My thoughts/priors on the points above
  • I think IQ is very obviously correlated with programming ability; I think this is the default prior of anyone who believes in the predictive usefulness of IQ. However, I would go a step further and say that a very specific type of intelligence, one that probably correlates with IQ score but is distinct along certain dimensions, could be a better predictor of programming ability. See Dehnadi's work.

    My personal observation is that all good programmers I know show signs of high intelligence, but not everyone who shows signs of high intelligence shows programming aptitude proportional to their intelligence. I am not entirely sure if it's a "wordcel vs. shape rotator" issue; the dichotomy isn't as obvious as it is with Electrical Engineering, for example.

  • I have come across two fairly distinct methods of teaching programming. I would classify them as 'trying to impart intuition' vs. 'trying to impart knowledge.'

    • The former consists of teaching via gamified methods, where students play elaborate games built around programming puzzles, modify existing code to draw 2-D shapes and animations, etc. Once students develop some familiarity with changing words in a file to make the computer do something, they are introduced to data types, data structures, control flow, etc.

    • The latter is a more 'rigorous' approach where students are first taught fundamentals such as data types, structures, control flow, interpreter vs. compiler, etc.; then they are made to write programs. These programs are sometimes gamified, but not to the extent of the former.

    I consider the latter "imparting knowledge" method superior. It's more in line with all the hard sciences I have been taught, and all the good programmers I am aware of claim to have been taught using this method. More on this later.

  • Obvious racial stratification. But putting that aside, I think the gender stratification is worth more discussion. Even the best discussions I could find on the topic simply boil down to "differences in interest". I think that isn't the complete picture.

    I really don't want to do women dirty like this, but I have yet to come across a "good" female programmer. I really don't know what is at the root of this. My superficial intuition is that a certain aspect of becoming a good programmer is just relentlessness. Sometimes you need to try out 100 different bug fixes and read through 50 Stack Overflow and obscure forum posts to fix a certain problem or get something working. Men, in my experience, are much, much more willing to swim through the Stack Overflow and debugger sewers than women.

    But that isn't the entire picture; I just don't see women writing naturally good code, if that even is a term. By that I mean the code a person writes with knowledge of the fundamentals but no knowledge of coding best practices such as separation of concerns, loose coupling, etc. Men in my experience naturally tended to write "better" code without prior knowledge. A lot of the female students I taught used to roll their eyes when good practices were explained to them.

Intuition vs Knowledge

Programming is hard. Teaching it is also hard. Beginner tutorials tend to have an order of magnitude more views than advanced tutorials.

I am sure that the intuition-based teaching methods were born out of frustration with the fact that students couldn't connect the pieces together despite being aware of all the pieces and how they work. But having seen it first-hand, I just don't understand how it can teach someone programming at all.

My brother knows how to draw a submarine and make it sway up and down, but doesn't know that void means nothing. He is being made to write out words without knowing what they mean, and of course it's all served in a bowl of global-variable spaghetti. The professor chose a dumbed-down Java 2-D animation package called Processing to teach the class. The documentation is horrendous; it's a shadow of what Java is. Why not just use Java? Or even Python?

This is very much madness from my point of view. Changing lines in code the way the students in my brother's class are being made to do is so far removed from the act of programming, or even the primitives of programming, that I am left wondering if the "vibes" people have gotten their noses in there as well.

I was taught much differently, with an introduction to compilers, data types, conditionals, etc. All of it in C, and despite using Python for 99% of my work, I am eternally grateful for having started with C.

It is so much of an over-correction from what I assume is the traditional way of teaching programming that I just can't wrap my mind around it. It might pass for schoolchildren, but university? I mean, I get it, even MIT is teaching intro to CS in Python, but at least they are still teaching the actual language and not some bastard child of it.

I think the fact of the matter might be that demand for CS degrees far exceeds the demand for CS practitioners. The universities are not being honest with their students and are making it all seem like a game, with the hope that it will all work out for some reason.

Edit - To further clarify why I think the intuition-based method is ineffective.

Intuition is hard to impart.

Here's the submarine example from my brother's class, with some more detail. The question asks: "Make the submarine sway up and down in a wave and go from left to right."

To even a novice programmer it is immediately obvious that this means the x-coordinate needs to be incremented every frame and the y-coordinate is just sin(x). That intuition is abstracted behind a 2-D animation task. This adds excessive intellectual baggage; it's not necessary for anyone who understands a loop.

Valuable time is being wasted on making 2-d shapes do things as opposed to knowing the tools that make them do things. I could solve the submarine problem instantly because I know what a loop is.
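The whole exercise reduces to a few lines once you know the loop and the sine function. A minimal sketch in Python (the function name and the speed/amplitude/baseline constants are my own illustrative choices, not anything from the Processing assignment):

```python
import math

def submarine_positions(frames, speed=2.0, amplitude=20.0, baseline=100.0):
    """Yield (x, y) per frame: x increments each frame, y follows a sine wave."""
    positions = []
    for frame in range(frames):
        x = frame * speed                               # drift left to right
        y = baseline + amplitude * math.sin(x * 0.05)   # sway up and down
        positions.append((x, y))
    return positions

path = submarine_positions(5)
# path[0] == (0.0, 100.0): at frame zero the sub sits on its baseline
```

That's the entire "animation": one loop, one trig call. Everything else Processing adds is rendering plumbing.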

This is a subject near and dear to me, because I positively loathe working with most new programmers. I loathe this entire generation of "programmers" that never had to manage memory. That just stitches libraries together to accomplish 99% of their tasks. And if it isn't garbage collected, and it isn't a library, they flail around uselessly.

That's "programming" in the same way purchasing a bird house kit from a hobby shop is "wood working". Yes, you are technically using a tool that is also used in that activity. But 95% of the work was done for you. And they are barely even cognizant of that fact.

But I'm a fucking crazy person. I got annoyed that 86Box didn't work with my HOTAS once, so I downloaded the source and fixed it myself. A certain especially difficult level of Populous pissed me off to such a degree that I downloaded the version of DOSBox with the debugger built in and began dissecting the bytecode that affects the AI speed. Successfully, I might add. Patched a version of the game just for me that was a smidge easier for my aged reaction times. When the Oculus Quest came out, I was annoyed that a lot of Gear VR games would run on it, but the controls weren't mapped to the right buttons. Since most of those games just run Unity and use the built-in Oculus plugins, I discovered I could replace the .NET bytecode in the DLLs to remap the controls. So I took about a half dozen Gear VR games I liked, unpacked the APK files, edited the DLLs, repacked and signed the APK files, and then loaded them onto my Oculus Quest.

Lately I decided I wanted to learn x86 assembly, targeting the 8088 platform, BIOS/DOS function calls, EGA graphics, and AdLib music. I wrote a sprite editor and am currently working on an AdLib tracker. All in assembly. It's so much fun; I love it.

So yeah, I'm a fucking lunatic.

I came up through a Computer Engineering program. There was programming in all 4 years of it, largely in C++, but also Java and Python (which I hated). But other required courses covered the material properties of transistors, which I have largely forgotten. Then how to create adders and other simple electronics out of discrete components (NAND gates, XOR gates, etc.). Then some assembly on a project board I actually still have, though I can't recall at all what it ran.

I still remember, after I graduated, I was talking with a buddy of mine who did a Computer Science degree at a different school. At some point he asked me "So how did we get to where we are now, with operating systems and compilers and stuff?" I was blown away that after 4 years, and a Computer Science degree, he didn't know that.

Increasingly, the teaching of programming is "vibes based" as you put it. Fundamentally incompetent people are being handed degrees. The only reason they appear to be able to perform a task that vaguely resembles programming is that people smarter than they or their entire lineage will ever be set the table for them.

This is a subject near and dear to me, because I positively loathe working with most new programmers. I loathe this entire generation of "programmers" that never had to manage memory. That just stitches libraries together to accomplish 99% of their tasks. And if it isn't garbage collected, and it isn't a library, they flail around uselessly.

If programming is as simple as piecing together libraries, shouldn't this have a depressing effect on salaries? Yet coders are making more than ever despite the abundance of libraries. Instead of having to be super-efficient, programs are now way more complicated, with separate backends and frontends. One challenge has been replaced by another.

If programming is as simple as piecing together libraries, shouldn't this have a depressing effect on salaries?

There's also a lot more demand for software engineers now. Everything runs on apps, web sites, productivity software, etc. Everything that used to be mechanical and complicated now has an embedded computer. Availability of libraries is just one variable that has changed gradually over the course of decades.

Much love and respect to what you're doing and your opinion - I've followed along plenty on your friday threads.

But the problem with this attitude is that having everyone write in assembly or C means we'd never get anything done. Reinventing the wheel is fun (sometimes), but it's not efficient. Stitching together libraries correctly is still difficult enough that very few people can be competent software architects.

I'd agree that people need to understand they're standing on the shoulders of giants, and I have seen a marked decrease in that knowledge/attitude among younger programmers. But if I had to build some of the things I've worked on without anyone else's great code, I'd be miserable.

But the problem with this attitude is that having everyone write in assembly or C means we'd never get anything done. Reinventing the wheel is fun (sometimes), but it's not efficient. Stitching together libraries correctly is still difficult enough that very few people can be competent software architects.

My rant isn't so much "Everyone needs to write everything in assembly/C" as "Everyone, at some time, needs to have written something in assembly/C".

I sure as shit am not re-inventing the wheel constantly in my day job. I'm stitching together libraries for efficiency's sake, same as everyone else. But having at one point made my own wheel, even if not a particularly good one, I understand you aren't supposed to use a wheel flat on its side. Or try to put a flat edge on it. Which keeps me from doing pants-on-head retarded things with other people's wheels.

But like... there are lots of people who are value-positive programmers in today's environment who would never be willing or even able to do anything worth doing in assembly. I bet you're much better than they are -- you're a grandmaster while they're merely competent -- but surely you agree they're still worthwhile to employ, even if only so they can do the dumb grungy projects on which your talents would be wasted. It feels like you are picturing the counterfactual universe as one in which everyone who was currently employed as a software engineer had your talents and depth of expertise -- but I think the more likely alternative is one in which almost no one can live up to your standards, so almost no one writes software, almost nothing gets built, and our society doesn't get to have nice things.

I'm sorry you have to work with them, though. The depth of conviction that makes great programmers great also means it is torture for them to be forced to collaborate with relatively shitty programmers.

torture for them to be forced to collaborate with relatively shitty programmers

This is mostly because we want them to notice our brilliant optimizations and beautiful abstractions in PR review, and decorate our PR with :thumbsup: and :+100: and :sunglasses: emojis. But it all just goes past them and they Approve without comment.

I completely agree. My university was a little "behind" the times and I was raised on the C method; even my AP class in high school was C++. I always knew it would be of benefit to me, and now, after being in the industry: the Node.js generation is trash and I hate working with them.

Okay, but garbage collection is better than reference counting for performance though, and reference counting is better than manual management for sanity, certainly when you're writing trees.

garbage collection is better than reference counting for performance though

These aren't the only two options available. Especially with the advanced research that has gone into static analysis these days.

And even so, if you want any kind of performance you still have to worry about locality and all the other fun issues that only come up if you understand what is actually happening behind the scenes.

Being able to do manual memory management well requires more intelligence than just slapping stuff together, and understanding how it works is also very important for using automatic memory management well. If you write a nice, GCed, dynamic language naively, or use a ton of lambdas or temporary objects, it may perform very poorly, whereas making your code work well with the JIT makes it look more like C.

I'm not dismissing garbage collection wholesale. I'm dismissing programmers who have known nothing else.

Here is an example. Once upon a time I had to shove about 10 thousand objects into NASA WorldWind and see how it ran. It ran like dog shit, because whatever fresh graduate they had coding it reallocated memory for each object's screen transformation every frame. So it triggered the GC literally multiple times a frame with a few thousand objects to render. That is a level of idiocy only possible if you learned to program in an environment where you were barely cognizant that memory was a resource with costs associated with it.
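To make the failure mode concrete, here is a sketch in Python (function names and the tuple-based "screen transform" are my own invention for illustration, nothing to do with the actual WorldWind API; the point is per-frame allocation churn):

```python
def transform_allocating(objects, dx, dy):
    # Allocates a brand-new list of tuples every frame. With thousands of
    # objects at 60 fps, that is thousands of short-lived allocations per
    # frame: exactly the garbage that keeps the collector running.
    return [(x + dx, y + dy) for (x, y) in objects]

def transform_in_place(objects, out, dx, dy):
    # Reuses a preallocated output buffer: zero allocations per frame.
    for i, (x, y) in enumerate(objects):
        out[i][0] = x + dx
        out[i][1] = y + dy
    return out

objects = [(float(i), float(i)) for i in range(1000)]
out = [[0.0, 0.0] for _ in objects]   # allocated once, outside the frame loop

a = transform_allocating(objects, 1.0, 2.0)
b = transform_in_place(objects, out, 1.0, 2.0)
assert all(tuple(p) == q for p, q in zip(b, a))   # same result, no churn
```

Both versions compute the same transforms; the second just never hands the allocator fresh work on the hot path.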

I'm not dismissing garbage collection wholesale. I'm dismissing programmers who have known nothing else.

Eh, this basically feels like a box out of the famous XKCD comic.

Because whatever fresh graduate they had coding it reallocated memory for each object's screen transformation every frame

This really is a problem though. If you write 'clean functional immutable nodejs' where every single function creates a new immutable object, it'll perform terribly. I've done it a few times on fun hobby projects, and it really did perform badly.
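A toy illustration of that cost, in Python rather than Node (tuples standing in for immutable JS objects; the function names are my own construction):

```python
def build_immutable(n):
    # "Clean functional immutable" style: every step creates a whole new
    # tuple, so building n items copies O(n^2) elements in total and
    # floods the GC with short-lived garbage.
    items = ()
    for i in range(n):
        items = items + (i,)
    return items

def build_mutable(n):
    # Mutating one list in place: amortized O(1) per append, no churn.
    items = []
    for i in range(n):
        items.append(i)
    return items

assert build_immutable(100) == tuple(build_mutable(100))  # same result
```

Same output either way; the difference only shows up in the allocator, which is precisely why it bites people who have never had to think about one.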

And learning manual memory management and writing some C definitely is worth doing while learning, and many fields of programming still require it. (That said, in most contexts doing it manually, or even worrying that much about memory management, is awful and just slows you down.)

If you don't understand the difference between dick measuring over your choice of text editor, and fundamentally not understanding how a computer works, then yes, it's superficially similar.

My disgust reaction to argumentum ad XKCD aside, it does make me wonder: did programmers of old complain when things moved from byte code on punch cards to disks and actual languages, with compilers and all? I certainly had a few professors who seemed to have such a chip on their shoulder.

It wasn't byte code on punch cards straight to disks and actual languages. Compilers came rather early in the game, and macro-assemblers too: FORTRAN goes back to 1957, COBOL to 1959, and the well-known FORTRAN IV to 1962. FORTRAN was famously designed for punched cards, which is why FORTRAN 77 and earlier had strict column rules. Aside from grouching about "you're not a real programmer until you've had to sort the box of punched cards you dropped on the ground," I don't think there was serious complaining about the move away from them. Complaints about assembler vs. high-level languages survived for much longer.

Yeah, it's probably fair that your point deserved more care and elaboration than argumentum ad XKCD can provide. Which: sorry about that! I was overly flip.

So!

Fundamentally software is a rickety tower of abstractions built on abstractions built on abstractions. At the lowest level you've got logic gates, and if you put enough of those (and some other stuff) together in the right configurations you can make stuff like arithmetic logic units; and if you put enough stuff of basically that abstraction layer together, you have yourself a CPU, and that and some other bits gets you a computer; and then you have the BIOS, the OS on top of that, and the language runtime of the stuff you're working on on top of that, and your program running on top of that. Obviously you already know this.

And the reason this basically kinda works is that a long time ago programmers figured out that the way to productivity is to have hardened interfaces at which you program; the point of these interfaces is to avoid having to concern yourself with most of the vast underground of abstractions that form a computer. Which means that most programmers don't really concern themselves with those details, and honestly it's not clear to me they should in the typical case.

That's because making maintainable software is about ensuring that you are, at all times, programming at the level of abstraction appropriate to your problem domain, neither going higher (resulting in perf issues, typically) nor lower (resulting in bugs and long implementation times as you reinvent the wheel over and over). For every guy who tanks the performance of an app by not respecting the garbage collector, there's another who decides to implement his own JSON parser "for efficiency" and hooks it up to the [redacted] API, resulting in several extremely-difficult-to-debug issues in production that I personally burned several hours fixing, all to shave milliseconds off an hourly batch process's running time. Not that I'm bitter.

So I guess that sort of statement-- "you're only a good programmer if you've used a language with manual memory management"-- feels like unjustified programmer-machismo, where someone chooses one of those abstraction layers between bare physics and the language runtime more-or-less arbitrarily and says "ah, but only if you deeply understand this specific abstraction layer can you truly be a good programmer."

Admittedly I work in distributed systems, where 99% of things that actually matter for performance occur over the network.

I'll second memory management, or more broadly understanding pointers, as being an important boundary.

There is nothing fundamentally lost moving from machine code to assembly - one's just shorthand for the other. Transforming C mechanically into assembly by hand is not hard, just pointless and tedious. But languages "above" memory really truly do lose sight of something. There is nothing in the Haskell Man's conceptual toolbox he can use to get a handle on the memory of the system he runs on.

I'll grant it's not often important, but there is a real line between "languages which require memory awareness" and languages that do not, and it's not arbitrary. A real aspect of system execution and performance is totally lost.

Abstractions leak. All of them. Even down to the physical layer. A great programmer needs to know enough about the layers below to recognize when they're being bitten by a leaky abstraction. If you don't understand enough about the layers below to recognize an operating system call that doesn't work, or a CPU that mis-executes an instruction, or a memory management system that causes your code to grind to a halt while you pay the cost of deallocation, you're going to be unable to solve the problem. Thing is, physical-layer issues are very rare (and simple bit flips are much more common than something as esoteric as rowhammer). Operating system bugs are at least reasonably rare with mature operating systems. Issues due to allocators, on the other hand, happen ALL THE TIME if you have a lot of data. So they're rather more important.

I would honestly put both our complaints in the bucket of "Programmer is ignorant about the level of abstraction they are working at." But people don't know what they don't know. And the only way out is for them to have worked at multiple levels of abstraction to know the bounds of the one they are currently operating in.

I know some really old programmers (old enough to remember punch cards) and while I wouldn't say they are unhappy with the spectacular leaps in ease and power of computing, there is definitely an element of "I had to walk uphill ten miles in the snow both ways" in their attitudes.