Culture War Roundup for the week of October 3, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


What is your ideal programming education?

Recently, trying to teach my younger brother (a CS freshman at a Canadian university) programming devolved into a yelling session (kicked the dog there), and it left me wondering about the state of programming education.

How is this CW?
  • Because in any discussion of any type of education system there is an undercurrent of disagreement between the blank slatists and the "IQ believers" (or whatever that group is called).

  • How to teach something can also be split along CW lines. See common core, phonics vs whole language, etc.

  • On top of that there is the group representation angle. Certain groups of people are disproportionately represented in programming professions.

My thoughts/priors on the points above
  • I think IQ is very obviously correlated with programming ability; I think this is the default prior of anyone who believes in the predictive usefulness of IQ. However, I would go a step further and say that a very specific type of intelligence, one that probably correlates with IQ score but is distinct along certain dimensions, could be a better predictor of programming ability. See Dehnadi's work.

    My personal observation is that all good programmers I know show signs of high intelligence, but not everyone who shows signs of high intelligence shows programming aptitude proportional to their intelligence. I am not entirely sure if it's a "wordcel vs shape rotator" issue; the dichotomy isn't as obvious as it is with Electrical Engineering, for example.

  • I have come across two fairly distinct methods of teaching programming. I would classify them as 'trying to impart intuition' vs. 'trying to impart knowledge.'

    • The former consists of teaching via gamified methods where students are made to play elaborate games consisting of programming puzzles, modify existing code to draw out 2-d shapes and animations, etc. Once students develop some familiarity with changing words in a file to make the computer do something, they are introduced to data types, data structures, control flow, etc.

    • The latter is a more 'rigorous' approach where students are first taught the fundamentals such as data types, structures, control flow, and interpreter vs. compiler; then they are made to write programs. These programs are sometimes gamified, but not to the same extent as in the former.

    I consider the latter "imparting knowledge" method superior. It's more in line with all the hard sciences I have been taught, and all the good programmers I am aware of claim to have been taught using this method. More on this later.

  • Obvious racial stratification. But putting that aside, I think the gender stratification is worth more discussion. Even the best discussions I could find on the topic simply boil down to "differences in interest". I think that isn't the complete picture.

    I really don't want to do women dirty like this, but I have yet to come across a "good" female programmer. I really don't know what is at the root of this. My superficial intuition is that a certain aspect of becoming a good programmer is just relentlessness. Sometimes you need to try out 100 different bug fixes and read through 50 Stack Overflow and obscure forum posts to fix a certain problem or get something working. Men in my experience are much, much more willing to swim through the Stack Overflow and debugger sewers than women.

    But that isn't the entire picture; I just don't see women writing naturally good code, if that even is a term. By that I mean the code a person writes with knowledge of the fundamentals but no knowledge of coding best practices such as separation of concerns, loose coupling, etc. Men in my experience naturally tended to write "better" code without prior knowledge. A lot of the female students I taught used to roll their eyes when good practices were explained.

Intuition vs Knowledge

Programming is hard. Teaching it is also hard. Beginner tutorials tend to have an order of magnitude more views than advanced tutorials.

I am sure that the intuition based teaching methods were born out of frustration with the fact that students couldn't connect the pieces together despite being aware of all the pieces and how they work. But having seen it first hand, I just don't understand how it can teach someone programming at all.

My brother knows how to draw a submarine and make it sway up and down, but doesn't know that void means nothing. He is being made to write out words without knowing what they mean, and of course it's all served in a bowl of global-variable spaghetti. The professor chose a dumbed-down Java 2-d animation package called Processing to teach the class. The documentation is horrendous; it's a shadow of what Java is. Why not just use Java? Or even Python?

This is very much madness from my pov. Changing lines in code the way the students in my brother's class are being made to do is so far removed from the act of programming, or even the primitives of programming, that I am left wondering if the "vibes" people have gotten their noses in there as well.

I was taught very differently, with an introduction to compilers, data types, conditionals, etc., all of it in C. Despite using Python for 99% of my work, I am eternally grateful for having started with C.

It is such an over-correction from what I assume is the traditional way of teaching programming that I just can't wrap my mind around it. It might pass for school children, but university? I mean, I get it, even MIT is teaching intro to CS in Python, but at least they are still teaching the actual language and not some bastard child of it.

I think the fact of the matter might be that demand for CS degrees far exceeds the demand for CS practitioners. The universities are not being honest with their students and are making it all seem like a game, with the hope that it will all work out for some reason.

Edit - To further clarify why I think the intuition based method is ineffective.

Intuition is hard to impart.

Here's the submarine example from my brother's class with some more detail. The assignment asks: "Make the submarine sway up and down in a wave and go from left to right."

To even a novice programmer it is immediately obvious that this means the x-coordinate needs to be incremented every frame and the y-coordinate is just sin(x). That intuition is abstracted behind a 2-d animation task. This adds excessive intellectual baggage; it's not necessary for anyone who understands a loop.

Valuable time is being wasted on making 2-d shapes do things as opposed to knowing the tools that make them do things. I could solve the submarine problem instantly because I know what a loop is.
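To make the point concrete, here is a minimal sketch of the submarine problem in plain Python (the actual class uses Processing; the function name, amplitude, and speed parameters here are made up for illustration). Once you know what a loop is, the whole assignment is two lines inside it:

```python
import math

def submarine_path(frames, amplitude=20.0, speed=2.0):
    """Return the (x, y) position of the submarine for each frame."""
    path = []
    for frame in range(frames):
        x = frame * speed                    # move left to right each frame
        y = amplitude * math.sin(x / 10.0)   # sway up and down: y is just sin(x)
        path.append((x, y))
    return path

points = submarine_path(5)
```

In Processing the same loop is hidden inside the draw() callback, which is exactly the abstraction the novice never gets to see.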

I think it's important for novice programmers to first get an intuitive sense for what we're doing here and then get a rigorous education in it. Skipping either step is likely to go badly.

Maybe this is because I'm self-taught, but I don't think either of your two options is how I think about intro to coding.

The order I learned languages (so far) was Perl -> Python -> C -> Common Lisp -> JS -> Kotlin -> Scala -> Go (with a smattering of others in between when I needed them for specific things). I don't think explicit data structures/data types came into it (as opposed to being an implementation detail) until I hit C, but by the time I did, they were quite intuitive.

Maybe this is just me being solution oriented, but the way I've always looked at it (and introduced people to programming when I teach them) is to start with a problem they're solving, start with a blank page (I've not found that giving them starter code produces good results), and walk them through each of the relevant tools, being careful not to tell them the combinations that they need. For synthetic problems, stealing music from the internet is usually a good place to start. People like music, and getting it is a quick way to go from the "I don't know what I'm doing" to the "I'm a god" adrenaline rush that hooks people on programming. They build up a toolbox of solutions to problems, and later combine them into bigger solutions for bigger problems.

But having the power and agency to solve my own problems (real problems, not synthetic assignments created by an academic) myself is what got me into this, and it's what keeps me doing it every day.

I will, however, agree that the (often quite useless) abstractions do get in the way more than they help, and leave people in a nebulous "I have no idea what's going on" state of mind. Maybe that's the benefit of going through C at some point? It strips away almost all of the magic. Maybe I'm being hard on abstractions here, but library specific abstractions (as opposed to the ones built into the language itself) tend to be poorly done and make my life harder rather than easier.

Solving your own problems is exactly the rush. And it can't be something that someone told you would be a good problem for practicing programming. It has to be something where you want to have the result and are eager to get closer and closer step by step.

For example I wrote bots to automatically fill out various HTML forms, or modded games, built websites for gaming clans, processed and synced subtitles for downloaded movies, scraped websites like the parliamentary election result website to slice and dice the data myself, to process Wikipedia dumps in various ways etc. Nobody told me to do any of these, but such things led me through lots of classic CS topics and I read up on how to do them with a goal in mind. That's so much better than the prof dropping some artificial problem on you.

Like if I have two subtitles in two languages and one is properly synced up, but the other isn't, but might also not correspond to the first subtitle perfectly one-to-one, then how do I find out a plausible correct alignment? This leads to various algorithms like edit distance, longest common subsequence etc.
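One of the algorithms that subtitle-alignment problem leads you to is classic edit (Levenshtein) distance, which can be sketched in a few lines of Python. This is the textbook dynamic-programming version, not anything from an actual subtitle tool:

```python
def edit_distance(a, b):
    """Minimum number of single-character insertions, deletions,
    and substitutions needed to turn string a into string b."""
    m, n = len(a), len(b)
    # dp[i][j] = edits needed to turn a[:i] into b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i          # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j          # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution / match
    return dp[m][n]
```

For aligning two subtitle files you would run the same recurrence over lines instead of characters, but the idea is identical, and you only really internalize it when a problem of your own drags you to it.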

But this presupposes that you have such computational or automation use cases in your life. For example, today with Netflix existing, I might never have learned about video file formats or subtitle file formats, and never had to correct audio or subtitle sync issues. If my parents had been rich and Steam had existed, I wouldn't have had to learn how to play with the Windows registry, mess with Program Files, or understand how to use firewalls to set up LAN games. And all such endeavors open up new problems to solve: now you have to install an IDE, figure out environment variables, understand that vague C++ compiler error, read up on what the words mean, all still with the goal of getting that thing working, but without any external pressure like deadlines, so if I want, I can take a side quest into deep-diving into graph algorithms for a few days or whatever.

If you don't see such problems around you, if you don't care to customize stuff on your computer, you may look at project ideas but those always seem artificial because someone already did them and we know there is a solution. It's a schoolish problem. When you build something for yourself, you have to define the problem, the scope, then go further and further. Some riddle websites are also cool, like http://www.pythonchallenge.com/

I'm not saying that such self study is sufficient but it made my formal studies much easier because I could tie the concepts to first hand experiences.

To me programming is quite straightforward and easy to understand. Many of the things that are most difficult to understand in the field come from having to use/do things that gloss over a hundred little pieces of computing reality that make things "easier." To be honest, programming is really shallow. Programming is a textual interface for controlling computers. That's it. Everything else a computer does that might be useful belongs to a different domain. And while programming is shallow per se, all those different domains that you come in contact with via programming are very deep, and that's where most of the joy comes from for me when programming. I really hate "frameworks" and programming-language fetishism. That's all fashion. The essence of form over function.

So, to answer your question, the ideal programming education in my mind starts with real programs that do real things, but that are super simple. Literally start everyone with Hello World, then Hello Susan, then Hello <name>, etc. Read real programs that do real things, learn how they work, etc.
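That progression might look like this in Python (names are illustrative, not from any particular curriculum); each step is a real program, and each adds exactly one idea:

```python
# Step 1: the canonical first program.
print("Hello World")

# Step 2: edit a literal -- the first taste of changing code.
print("Hello Susan")

# Step 3: parameterize it -- the first function.
def greet(name):
    return "Hello " + name

print(greet("Ada"))
```

The point is that the student sees real output from real code at every step, rather than tweaking someone else's animation scaffolding.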

So what is the best way to teach data structures, algorithmic complexity, single responsibility, patterns (more things which are not programming), etc.?

The most ubiquitous data structures and algorithms are really simple to understand. Well over half of programming with ds is sticking a bunch of primitives into a struct, giving it a name, and keeping track of a list of them (or a list of refs to them). Things like "single responsibility" and "patterns" more generally are already getting into territory that I deem to be fashion. To the extent that those ideas are useful they're trivial, and they go much beyond the point of useful in their prescriptions.
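That "stick primitives into a struct, name it, keep a list of them" pattern can be sketched in Python with a dataclass (the Employee example here is made up for illustration):

```python
from dataclasses import dataclass

@dataclass
class Employee:
    """A bunch of primitives stuck into a struct with a name."""
    name: str
    age: int
    salary: float

# ...and a list of them, which is most of day-to-day data structuring.
staff = [
    Employee("Alice", 34, 90_000.0),
    Employee("Bob", 41, 85_000.0),
]

total_payroll = sum(e.salary for e in staff)
```

Everything beyond this (trees, hash maps, etc.) is rarer in everyday code than intro courses imply.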

Complexity is also pretty easy to teach by just making people perform different algorithms with pen and paper so they can feel the difference between them. From that intuition it's not difficult to understand how different algos can be a better/worse choice depending on the size of your dataset. But this can also be misleading, so I would include a section on how to test these things in the real world.
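The pen-and-paper exercise can also be done in code by counting steps instead of timing anything. A sketch comparing linear and binary search on the same sorted data (the step-counting helpers are made up for illustration):

```python
def linear_search_steps(data, target):
    """Count comparisons a linear scan makes before finding target."""
    steps = 0
    for x in data:
        steps += 1
        if x == target:
            break
    return steps

def binary_search_steps(data, target):
    """Count comparisons binary search makes on sorted data."""
    steps, lo, hi = 0, 0, len(data) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if data[mid] == target:
            break
        elif data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

data = list(range(1_000_000))
# Worst case for linear search: the target is the last element.
linear = linear_search_steps(data, 999_999)
binary = binary_search_steps(data, 999_999)
```

Seeing roughly a million steps collapse to about twenty makes O(n) vs O(log n) feel physical, which is the same intuition the pen-and-paper version builds.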

Do you have any recommendations for getting into it later in life?

I messed around with Java, html and linux in high school, then got funneled into a pure bio track for about a decade for my career. At one point I went back to learning some R and python without having the fundamentals (I guess the academic version of a script kiddie) purely for doing genome sequencing/scRNA-seq work. Now I'm trying to learn some fundamentals; I've been working through the Harvard Edx CS50 class, with hopes of trying the machine learning class next.

Any thoughts? Keep in mind I'm probably limited to 1-2 hours per day with maybe a bit more flex on the weekend.

Approach it like a craftsman.

I dive into things head first, willing to make mistakes. After I fumble through my first attempt with middling success, I'm left with usually some notion of where i could have done better. Sometimes I have a clear picture of how that is, other times I need to do some research first. Then i do it again with my new knowledge.

Repeat for 20 years.

Eventually you will organically run into the usual problems: the state of your program getting corrupted by references to objects not being well managed; using function calls that aren't thread safe in a multithreaded environment; variables being initialized in the debug environment but not the release one; your code descending into spaghetti hell because you used one kludge too many.

These are the normal problems neophyte code slingers encounter. Sometimes being forewarned by the elders helps. But in my experience most people need to learn these lessons the hard way. The best way is to code early and often on pet projects. Finishing them is optional, but gratifying.

Downthread, there's a recommendation for people to learn coding by creating mods for Slay the Spire. That's great advice IMO, and could be generalized to "learn to program by writing code that fulfills your goals" if you aren't into video games.

I've had interest in programming before but I've never really had a goal or anything to actually go with it. Tutorials are nice but trying to learn programming without a problem is like trying to learn to use a gun without any bullets.

I have touched before on the idea that a [programmer] must have something they value more than "[programming]": The Art must have a purpose other than itself, or it collapses into infinite recursion.

Something to Protect

Programming is notorious for tutorial hell. One of the reasons for this is that not enough care is taken to differentiate the act of programming from its applications. An analogue I can think of is that math is a tool: it's used in Physics, Chemistry, Engineering, etc. Similarly, programming is a tool: it's used in Physics, Engineering, Web development, etc. However, a lot of tutorials for some weird reason try to sneak in some Physics, Engineering, or Web development (not literally) teaching along with the programming. This makes a beginner come away with the impression that the task of learning to program is much more onerous than it actually is. A lot of people would be put off by math as well if Calculus courses had some physics, engineering, and biology in them.

So firstly, be clear about what you want to learn. Do you want to learn only programming? Or programming for some kind of application? You would only do yourself favors if the programming and the application-specific programming are well delineated, at least while you are learning and not yet informed enough to avoid getting overwhelmed.

My suggestion to you, based on:

At one point I went back to learning some R and python without having the fundamentals (I guess the academic version of a script kiddie) purely for doing genome sequencing/scRNA-seq work.

  • First get familiar with the act of programming.

    • Choose one language and get good at it. Python is a no-brainer for beginners in general, and especially for the type of work you have to do. Just stick with Python, no R for now, until you are not thinking in syntax anymore.

    • I suggest you watch this video and learn everything covered in it. It's far from comprehensive. It's far from perfect. But the point is to stick with something to begin with! It will give you the ABC's of Python. You can learn how to write poems and novels later.

    • Once you are comfortable enough with the basic syntax of your first language that you don't have to check the correct syntax over and over again, start doing some practice problems. There are hundreds of resources for this. LeetCode is an (in)famous one. However, be warned that even the "Easy" problems are difficult for a beginner. These problems are meant to be tests of computer science application, so if you find them too hard, here are some easier ones. The objective here is to build, for lack of a better word, "muscle memory".

    • Optional step. Once you are comfortable doing at least LeetCode mediums, you can read a book on "Data structures and algorithms" or any such book on how to write better code. But I don't think these are necessary for non-software-engineers.

  • Familiarize yourself with the tools of your trade within your programming language of choice.

    • This means learning how to use the specific libraries related to your field. You will have an easier time picking these up if you are already a good programmer. A reason non-software-engineers (especially scientists and researchers!!) write such terrible code is that they learn the requisite libraries needed for their work before becoming even half decent at programming.

  • Do a project (or projects) of your own. This is good for so many reasons. And this is an order, not a prescription.

    • Self motivating.

    • You will be tested on applying what you know

    • Most in line with real world work

    • Will force you to learn new things yourself via forum posts and documentation scouring, which is an essential skill in programming, even though it isn't programming itself.

    • Rewarding.

Do the above for long enough, some of it concurrently, and you will reach a point where you can just program. You don't think in syntax anymore; all languages become the same. For example, I can start writing in a new language within 30 minutes to an hour of looking at it, because I already know what arrays, conditionals, and loops are. The syntax is superficial to a programmer. But to achieve that, you need to first master ONE language. Your ultimate goal is to understand the meaning, not the teacher's (compiler's) password.


CS50x is good all things considered. For your needs I might suggest CS50p. But ideally you can pretty much learn programming without courses at all.

Thanks for the reply! Sorry, I was away all weekend. I'll take a crack at it.

So firstly, be clear in what you want to learn. Do you want to learn only programming? Or programming for some kind application?

Unfortunately, I think my final goals will be determined more by how much time I can carve out of the rest of my life for it rather than starting with some endpoint in mind. I'm fairly confident I never want to actually be writing the nitty-gritty code that analyzes bio data, but rather am looking for synergy with what I already know. I think at the far end if I ever end up running a bio startup incorporating machine learning it might be fun to mess around with in the beginning stages, or at the very least, be able to converse intelligently with the engineers involved. Bootstrapping a bio startup in my basement is much harder; you can do some bacterial and yeast work (probably illegally in a few different ways) for something in the range of thousands of dollars, but doing anything with mammalian cells would probably be in the 100k range just for capital costs and be more or less impossible to hide.

Anyways, that's where I'm at. I'll give those resources a try and maybe recalibrate my goals over the next few months.

I'd add reading other people's code. I picked up a lot of coding by osmosis as a kid just fumbling around existing codebases, trying to get a program to do something I wanted. I literally had no idea what a for loop was or what function calls were; I just dived in and tweaked things. Of course it works better the more background knowledge you have. But the main point is to see real code, instead of the idealized stuff that a lot of courses teach, e.g. "design patterns" just for the sake of design patterns, unrealistic standards of code cleanliness like the very opinionated Clean Code, etc. The best open source products from respected companies don't code like that, but get shit done. I'm not advocating for spaghetti code, just getting a taste for real, working codebases as opposed to toy examples with unrealistic elegance. By reading code you can pick up good or bad habits alike, but that's not a reason to avoid it.

I would put this in the "things that you probably should do" bin. The issue being there are a thousand things like this to be done: read forums, immerse yourself in the culture, read open source code, read new papers coming out, read documentation for fun, etc.

It leads back to your initial point: motivation. Those who are motivated will naturally do all of those things out of curiosity. But I am not sold on the idea that making someone uninterested do those things will make them good. Nonetheless, reading other people's code does have high returns relative to the other "things you should probably do as well".

Also, we don't demand this from engineers or any other profession (maybe barring doctors). Electrical engineers are not prescribed looking at other engineers' schematics in their free time (even if it made them better engineers). Programming is in this weird zone where it's not standardized enough, so only the most passionate of autists are the ones who make it through all the hoops.

making someone uninterested do those things will make them good

Someone who is uninterested will never become good anyway, so you might as well encourage them to do these things and find out if they are or not. I have worked 25 years in this industry and never met a developer I respected who was not in love with it.

Well, maybe those professions are being held back then. But electrical engineers are close enough to programmers in culture, I'd say. Or at least they are in my bubble. And as for other engineers, there's less of an open culture and things are proprietary. Realistic projects can only be done at company scale in industry; there's no equivalent of free software or GitHub for those professions.

Also, I don't think that other professions are really as straightforward and standardized as these conversations make them seem. Programming isn't so unique. Generic IT admin work, network engineering, infrastructure design, etc. also have a lot of the same difficulties. And someone who mucks around with their home router and built some PCs as a kid will be better at such IT work. You'll be a better car mechanic if you've been in a car modding community since growing up. You'll be better at roofing, construction planning, flooring, plumbing design, etc. if you dive into it obsessively. People just don't do that much, for whatever reason.

I am an engineering manager and the biggest change I've seen is the motivation. Me, and other older millennials and younger Xers who grew up tinkering with our PCs would never think of doing anything else for a living. We're in tech because we're into tech. The new generation is different: they're in tech because it pays well. There's such a huge demand for tech workers that zoomers' CVs mostly look like this:

  • burger flipper

  • call center operator

  • tech boot camp

  • tester/data engineer/front-end dev

As soon as I wrote that I realized that the older generations were different. They didn't have PCs growing up. My (very good) architect didn't even have a PC at home until his daughter needed one for school. He majored in chemistry, not CS. Another guy I know ended up working with computers because he was a handyman in a research institute the director of which decided to start a computer importing business. He forgot his keys on the day the shipment arrived, my guy picked the lock open, then picked it closed and was hired for being resourceful.

However, both these guys share a very important characteristic: they are very smart. Guy #1 wasn't a simple chemistry major but won not-just-participation prizes in All-Union Chemistry Olympiads. Guy #2 is just a natural-born engineer, trucks to transistors.

And this brings me to the question of scale.

  • You can't scale the way my older colleagues ended up in software engineering

  • You could try to scale the way I ended up in software engineering: give everyone a Linux PC with a bunch of compilers and interpreters and wait for the nerds to emerge, but I think that wouldn't work as well as it worked in the 90's: everything is just too user-friendly these days. I drive a car, but have no intention of becoming a car nerd

  • And finally, this leaves formal schooling as the only thing that can scale

And schools have to be schizophrenic in their curriculum. They have to teach math so people can calculate the tip and plan their spending, literature so people can read and understand references in summer blockbusters, history so they can pick a new name for their street, computers so they don't install malware and can search for stuff online. On the other hand, they have to create people who will become mathematicians, writers, historians and programmers. The only way they can do that is by forcing their students to try everything. Do they hate math? Do they love math? Do they not care either way, but are good at it?

While I think Processing is not the best tool for college-level education, I think Processing/Logo/drawing primitives in Basic/XNA/Love2D are a good introduction to programming at middle-to-high school level. The goal is to comb the student body for people who are not into programming, but are good at it nevertheless.

If I was trying to reach students at the elementary level, I'd probably build out things from the Usborne computer books from the 1980s. I remember endlessly rereading these books at the school library, and even looking back over Introduction to Computer Programming now, it is still an excellent introduction to what a computer does, how it executes a program statement-by-statement, etc. Many activities could be made into craft exercises, covering the essential ideas without having to deal with any particular programming environment.

As for university, I see far too many "introduction to programming" courses attempt to "teach programming" without giving students any idea of what the language even means. How are they supposed to solve problems with code if they don't understand what the computer does with that code? The language needs to be simple enough that this can be done in at most two lectures; Scheme is simple enough that you can have a decent go at this.

Types should be introduced early, because "what sort of things go into this function, and what comes back out?" is a really important question to ask when designing a function, and it allows machines to (partially) check students' work, instead of an over-worked TA in a lab session. The programming environment probably has to be interactive and graphical, because those damn zoomers barely know what a file is these days, and we don't have time to teach them how to drive a shell.
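The "what goes in, what comes back out" question can be made machine-checkable even outside Typed Racket; as an illustration of the same idea in Python, type hints let a tool like mypy or pyright partially check a student's answer to that question (the function here is made up for illustration):

```python
# The signature answers the design question up front:
# grades (a list of floats) go in, a single float comes back out.
def average(grades: list[float]) -> float:
    """Take a list of grades, give back a single number."""
    return sum(grades) / len(grades)

result = average([80.0, 90.0, 100.0])
```

A type checker can then flag a student calling `average("90")` before any TA ever looks at the lab, which is exactly the partial machine-checking the post describes.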

My best guess is something like SICP, taught using DrRacket, and moving to Typed Racket ASAP. Replace the hardcore EE/CS examples with simpler problems to solve, since students aren't coming in with as much mathematical sophistication and we're going to be asked to teach a cohort that's not all headed into Engineering/CompSci, and probably borrow some pedagogy/scaffolding/recipes from HTDP. Bring in some of the cool CS stuff once students know how to actually program; you want to show at least some amazing CS ideas so that you hook the people who are susceptible to such things.

Disciplined ways of structuring programs make more sense once a student has made a few big messes, so discussions of coupling, modularity, and so on can come later. But they must not be left for too long.

I have never seen a biological female coder outside of frontend and devops for some reason. I'm only a couple years in the industry though.

I don't have any strong opinions about learning coding other than to watch Uncle Bob's Clean Coding and fully internalize it, and to read and understand SICP.

I was fortunate enough to be mentored by a true expert in the field and had the fundamentals of pragmatic coding hammered into me. We would spend hours reviewing my code and arguing passionately about semantics and ontology. Being forced to explain my rationale and defend every single line I wrote made me a superior programmer to my peers in college.

I think a big part of being a truly good programmer is just having an appreciation for aesthetics and a sense of shame. You SHOULD FEEL BAD for writing bad code, even if it's truly necessary. You should feel an intuitive sense of disgust seeing a function with too many side effects. Seeing awful inconsistent naming should make you cringe. These are all good things.

I really don't want to do women dirty like this but, I have yet to come across a "good" female programmer. I really don't know what it is at the root of this.

It could just be there are so few in the first place. The proportion of coworkers I have of any gender that I consider particularly good programmers is quite low, and I've had over a period of ten years roughly... three female programming co-workers?

I don't recall them being remarkably good or bad. Like most of my coworkers I would class their code as "basically serviceable."

Have you known a lot of male coworkers that you viewed as being remarkably good coders?

I really don't want to do women dirty like this but, I have yet to come across a "good" female programmer

I know a few. One is one of the best system designers I know, the other is the kind of get-your-hands-dirty fix-anything learn-any-stack type that any company with legacy code needs.

My ideal programming education was, essentially, being taught the basics at a very young age, and then figuring out the rest on my own with the help of the immense documentation, examples, tools, and communities on the internet. If you want textbooks, download them. If you want to RTFM, it's also online. And there are a billion possible complex projects you can set yourself to. Write your own compiler, write your own hobby operating system, write a video game, write a simulator for a complex physical system... lots of stuff.

I consider the latter "imparting knowledge" method superior. It's more in line with all the hard sciences I have been taught and all the good programmers I am aware of claim to have been taught using this method. More on this later.

I'm not sure either is better tbh, or if it matters. To learn to code, one needs to learn how to solve problems, and that requires 'teaching intuition' by having people solve a bunch of problems, whether it's initially 'calculate the nth fibonacci number' or later 'design and write a simple video game'. But you also need to learn a thousand different general programming bits, plus another five hundred bits specific to a language, and that's gonna look like 'teaching knowledge' no matter how it happens (in 'java', what are all the types, what's an anonymous inner class, generics, type erasure, reflection, boxing, a reference, a package, how does the build system work). And - good luck 'intuiting' that, you need to read the docs or a bunch of stackoverflow answers that, when put together, are basically the docs.
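
For concreteness, here's the 'nth fibonacci number' starter problem sketched in Python. Comparing the naive translation of the definition against the iterative version is itself the kind of intuition-building exercise meant here:

```python
def fib_recursive(n: int) -> int:
    """Direct translation of the definition; exponential time."""
    if n < 2:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n: int) -> int:
    """Linear-time version carrying two running values."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Both agree, but try timing them at n = 35 and the lesson teaches itself.
assert fib_recursive(10) == fib_iterative(10) == 55
```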

So you need both, but good / motivated students of one will do the other by themselves.

Also, the problem with the brother's course / the processing.js thing isn't necessarily that it's too intuition-driven, but that it's too dumbed down. One could imagine a python course that was very 'teaching intuition', almost like a socratic method with code, but still used that to lead people through all the difficult parts of beginner programming, as opposed to just showing them how to move a cute little character around a 2d grid. (but then people would start failing again!)

I really don't want to do women dirty like this but, I have yet to come across a "good" female programmer. I really don't know what it is at the root of this. My superficial intuition is that a certain aspect of becoming a good programmer is just relentlessness. Sometimes you need to try out 100 different bug fixes and read through 50 stack overflow and obscure forum posts to fix a certain problem or get something working. Men in my experience are much much more willing to swim through the stack overflow and debugger sewers than women.

I think Dr. Lawrence Summers shed some light on that matter in 2005 but it didn't go over well. I think 70+ years ago women may have been more represented in programming because either programming was easier compared to today or they were not actually doing programming as it's understood today. Probably a mix of both. Coding today has many more parts...more complexity in terms of interconnectedness, hence steeper learning curve. A Node.js trading platform or app is way more complicated, more moving parts, than anything produced in the 60s.

My personal observation is that all good programmers I know show signs of high intelligence, but not everyone who shows signs of high intelligence shows programming aptitude proportional to their intelligence. I am not entirely sure if it's a "wordcel vs shape rotator" issue; the dichotomy isn't as obvious as it is with Electrical Engineering, for example.

IQ is one of those things that's necessary but insufficient.

Programming is hard. Teaching it is also hard. Beginner tutorials tend to have an order of magnitude more views than advanced tutorials.

Most people are not learning how to actually program but learning how to follow instructions for a tutorial, which are related skills, but making the leap from tutorials to deeper or fundamental understanding is harder. If you get lost again as soon as you deviate a little too much from the tutorial, that means you never understood it well to begin with.

I'll just say I've known a number of excellent women programmers. My personal opinion is that the main issue is just different interests (and often other options). Research seems to support that (roughly, women are interested in people, men are interested in things); it's one of the most replicated findings with the biggest effect sizes in psychology.

As to teaching CS, I can't really remember what worked well for me. My sense is to focus on solving problems, and building out the world of tools, knowledge, and techniques that allow you to solve larger and more complex problems. I think it's important to have something concrete to attach abstract things to when learning. But that's just a first thought.

I’ll second the “excellent women programmers” thing; though I am not in tech, I am dating a woman who is as far as I can tell “good” at programming (graduated from Carnegie Mellon roughly in the middle of the CS cohort), and she tells me that there was at least one woman in her cohort who was brilliant enough that her professors described her entering industry as “a great loss to academia”.

That said, she also describes that women were an overwhelming minority, that the entry class was 50/50 M/F but very quickly all the women left, and she’s…well, not happy, but willing, to bang her head against a programming problem for ages without apparently making any progress. (She is quite neurotic, though, and had to really work through that during her undergrad.)

Programming is hard to fake. In most school subjects it's enough to know the "teacher's password", so memorization (of facts or algorithmic processes to solve one of a few types of problems that are likely to be in the test) is a decent strategy for getting good grades.

In programming, you have to problem-solve and face uncertainty, without an option to bullshit your way out of it (the code either compiles or it doesn't, it either crashes or it doesn't, and the computer doesn't care about your emotional state or your deadline or whatever).

I think the necessary relentlessness and intrinsic motivation required is comparable to playing musical instruments or sports. And incidentally, it's mostly boys who spend insane amounts of time on practicing the guitar or football or yoyo or skateboarding or even video games etc. without any external pressure from parents and teachers.

If you don't give a shit about playing the guitar, and have no aptitude for it, a private tutor will similarly have a very hard job to try and teach you to play.

It's impossible to teach things like this, it's only possible to learn them. By that I mean that the action has to come from the learner. The teacher can't actively put anything in the learner's brain. You can lead a horse to water and so on.

Intelligence surely is a factor here but it's not the only one. I know intelligent people who are not obsessive tinkerers and less intelligent ones who constantly muck around with some stuff, building various kludges, messing with their car, building stuff around the house, repairing this or that in a custom way, etc. This itch to make things is a big component in who will actually learn to program and who won't.

@ last paragraph, 'intelligence' is just whatever causes intelligence, and if that trait makes people - in practice, in the complexities of society and technical work - smarter, then it is 'part of' intelligence too, because it really does lead to that person being smarter in the specific area

I'm saying that there's a separate personality trait that's something like the drive to make stuff despite failures, to not give up in the face of difficulty. And this is not always a Hollywood hero upward trajectory. One side of it is someone trying over and over with sub-par results or taking way longer than others with less stubbornness but more intelligence would. I know people who are relentless and put lots of energy into something fruitless and they aren't very skilled for it. They may build dangerous contraptions out of wood and metal but with lousy construction, inefficiently etc. They may obsess over reading history and politics and come out of it believing various pants-on-head tinfoil conspiracy theories, or may spend way too much time on building hopeless perpetuum mobile constructions etc.

Willingness to work hard (intrinsically driven industriousness, relentlessness, stubbornness) can be decoupled from intelligence. On the flip side, many intelligent people are lazy and coast along, wasting their potential.

There's a subtlety here, though. Why is it a separate personality trait, and not a 'component' of intelligence? Because if you are an 'intelligent person' 'wasting your potential', and that waste-of-potential is set up in such a way that it can't easily be externally fixed because you need to have that "drive" to figure out a bunch of different things to be smart, then that's just another cause of having lower intelligence.

Intelligence is usually understood as an ability, the cognitive processing power, your ability to deal in abstractions and meta levels, notice patterns, keep more stuff in your working memory, etc. It's distinct from experience, lexical knowledge, amount of acquired skills etc.

If you don't want to use the word intelligence like this, then let's name my concept intelligence_2, and understand my statement as "intelligence_2 is a distinct trait from willingness to work hard from an intrinsic drive."

Intelligence is usually understood as an ability, the cognitive processing power, your ability to deal in abstractions and meta levels, notice patterns, keep more stuff in your working memory, etc.

I mean, I could say something similar about some of these. Working memory isn't part of intelligence, it's just a separate trait. You can be incredibly intelligent, but just not have the working memory to keep a big list of facts or intermediate steps (although is this actually how working memory really works? idk.), and thus waste your potential in practice. But in practice it's a key component (not to say anything about what memory is or how it's constructed, which, idk, and the same is true of that "drive", they could all be high-level features of some more complicated underlying mechanism that doesn't have those as levers). Which is kind of my argument - intelligence is complicated and messy, it's related to many mechanisms in the brain, and there's not really a particular reason to say that the 'drive' isn't intelligence but working memory is - and we don't really know how intelligence works, so decomposing it in ways that seem convenient isn't necessarily the best approach.

If an intelligent person is externally motivated to do stuff, by teachers, parents, expectation, poverty etc. they can perform well.

So basically, drive can be substituted by something else, but the cognitive power of your brain can't be replaced through external influence.

To tie it back to the original point: just because you get good grades in high school, and get good test scores, doesn't mean you'll be good at practical programming. You can even do a full CS degree program and still not be good at programming compared to your peers who pour a lot of hours into it from this itch to create stuff.

That's true, but I'm trying to say that a person with that 'drive' will, all else equal, understand things more deeply, figure out more stuff, and therefore be "more intelligent" in every observable sense we say "intelligence", and that's part of why they're better at programming. So saying it's separate from intelligence isn't quite right imo

I think the necessary relentlessness and intrinsic motivation required is comparable to playing musical instruments or sports. And incidentally, it's mostly boys who spend insane amounts of time on practicing the guitar or football or yoyo or skateboarding or even video games etc. without any external pressure from parents and teachers.

People brought up the "interest in things vs interest in people" gender difference. But I'd point instead to how much more neurotic women are. It's incredibly difficult to keep smashing your soft brain meats against a problem, fruitlessly at first, for hours, if not days, if you are biased towards experiencing profound and prolonged negative emotions in response to failure.

Women generally avoid the chance of failure more than men, likely due to their increased neuroticism. So they won't master skillsets that require you to fail repeatedly.

I think the necessary relentlessness and intrinsic motivation required is comparable to playing musical instruments or sports. And incidentally, it's mostly boys who spend insane amounts of time on practicing the guitar or football or yoyo or skateboarding or even video games etc. without any external pressure from parents and teachers.

Yup. Good coders are people who have been coding for years on their own time, almost as a hobby or recreation. It's not like something in which you just clock in and clock out.

This is where the discussion turns to "why do we expect programmers to be obsessive and do their profession also as a hobby when nobody expects that from accountants or civil engineers or surgeons or lawyers?" And some accusations that programming is toxic and elitist and exclusionary, biased towards basement dweller neckbeard incel nerd techbros who have nothing better in their lives than messing with a computer.

As this often comes up in discussions, I have tried to think it through and here's my current opinion. Those other jobs are perhaps less fun on the whole (fewer people enjoy them as a hobby). Those other jobs are also not available for practice for kids. At the same time I would expect that good professionals would tend to keep up with developments in their field even just out of interest. And the professional skills of engineers or mechanics of any sort probably correlate to how much they tinkered with things as kids. Whether this correlation is due to direct causation or the common-cause type is another question.

The complaints typically come from two places. One is DEI, the other is from older devs with families and outdated skills. Maybe a third one: accusing tech companies of implicitly requiring unpaid labor for skill development and exploiting the naive twenty-something guys and depressing wages because "it's supposed to be fun, here's some pizza and a ping pong table, now go make me some profits."

People who have excellent careers do have their career as a lifestyle. You don't become a star lawyer by doing your 9 to 5 and going home. You don't become a star surgeon by working regular hours. If you want to do accounting for a municipal office, you don't have to worry about accounting in your spare time. If you want to manage the finances of a hedge fund, your world is centred around your career.

If you want to have a fairly regular job as a coder you don't have to center your life around it. If you want to be skilled enough to be the tech lead of a graphics engine or to write the coolest new thing in fintech, you are going to have to work very hard to develop a high proficiency. Programming is very much a skill-based profession and those who really want to master it will be better at it. You don't become a star musician, tennis player, chess player, coder or surgeon unless you really make it your life's mission.

With that said many lawyers write wills for middle class people and many surgeons are removing tonsils while working regular hours.

Okay, but is there anything interesting or unusual in this aspect regarding programming? Why does this always come up when debating software dev?

My guess is that there is an abundance of otherwise low-status kids who, by virtue of spending too much time "in the basement," can punch above their weight. And that these avenues are not properly gatekept by the usual prestige and status gatekeepers. Or is there some other reason?

A surgeon can't do surgery at home, but they are probably thinking about it, reading about it or spending more time at work.

First, a note on motivations: It's possible that autist techbros make the field the way it is. But people who usually advocate this position seem like they should also like your skill-development theory, because they are probably a fan of blaming Management for exploiting workers. The reason for the discourse is IMO because the autist nerds are low-status and so should be blamed and shunned even more so than Management.

Second, an additional theory that likewise doesn't blame nerds: software is in its infancy, and the training and techniques are not well-studied enough. Once we learn more about it, it will become legible and really become a job that an everyman can do, like plumbing.

Third, which came to mind after writing the second: all of software is automation. Any problems that become well-studied enough to be solved well, become automated away and hidden under layers of abstraction, which is how we got to the present day. With the newfound time, programmers are expected to solve the next ladder-rung of problems. Unlike car repair or plumbing which have physical movements that robots can't do easily, and so always need a person to put in some elbow grease.

I just had an idea. What if this is analogous to slut shaming? What if the point is that the "autists" give away something (programming labor) for too cheap because they enjoy it, thereby depressing the price on it? Young women slut shame their peers who are too eager to have sex with every guy for fun, because this no longer allows the more modest women to place demands on guys and sooner or later the default expectation becomes that every woman must quickly put out.

It's just an overload of the term "expect". It's not that we "expect", in the sense of having a social demand, that good programmers will be obsessive and do their profession as a hobby, it's that we "expect", in the sense of anticipated experience, that programmers won't be good unless it also happens to be the case that they're obsessive and do their profession as a hobby.

Of course, that instantly turns it into a signalling mechanism and Goodhart's it to death. But in spaces where there's less pressure on quality, the pattern is still observable.

However, I would go a step further and say that a very specific type of intelligence, one that probably correlates with IQ score but is distinct along certain dimensions, could be a better predictor of programming ability. See Dehnadi's work.

Caveat: at least some parts of the Dehnadi writeup in "The Camel Has Two Humps" were retracted, mostly on the summary side. This doesn't necessarily mean that they're wrong in the broad strokes -- but it does undermine the specific test he used.

I consider the latter "imparting knowledge" method superior. It's more in line with all the hard sciences I have been taught and all the good programmers I am aware of claim to have been taught using this method. More on this later.

I think there are benefits and costs to each approach, but I've largely emphasized the "impart intuition" approach to start, then blended in a knowledge focus as time goes on. The failure modes of "imparting knowledge" are less obvious, especially in a classroom where most problems can be reduced into knowledge questions (tests) or can have their intuition components avoided or solved by one or two members of a full classroom (long-term projects). But the larger understanding about problems as things that need to and can be resolved internally instead of by repetition is especially important in computer programming.

More seriously, knowledge-focused studies are not merely less interesting to most new students; they're also specialized to specific environments. They're important! There are a lot of problems that can arise if you see compsci as solely solving problems, not just in the bad-practices sense but in unknowingly developing wildly non-performant or unsafe code. But a lot of people come out of college with incredibly in-depth knowledge of linked lists, yet don't know a) to avoid using them outside of a job interview, or b) how to handle the garbage collector in their current language of choice.

I really don't want to do women dirty like this but, I have yet to come across a "good" female programmer. I really don't know what it is at the root of this. My superficial intuition is that a certain aspect of becoming a good programmer is just relentlessness.

I know a good few (including cis), albeit generally more at the enthusiast level rather than as a career. I think it's less common, but that's plausibly social, plausibly preferring people-focused relentlessness, plausibly downstream of having the background, or plausibly just not having the sort of near-autistic 'not letting this go' aspect.

But the larger understanding about problems as things that need to and can be resolved internally instead of by repetition is especially important in computer programming.

I agree with this. Most of being good at coding rests on your ability to detect hidden abstractions in the business logic you're writing-- subtle regularities in the domain that can be used to write easier-to-understand and easier-to-modify code.

There's this saying: "Show me your flowcharts and conceal your tables, and I shall continue to be mystified. Show me your tables, and I won't usually need your flowcharts; they'll be obvious." I think that's saying something basically similar, and I think it's true.
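
A toy illustration of the tables-over-flowcharts point, in Python (the domain and the rates are invented for the example): the business rules live in one data table instead of a branching chain of if/elses, so reading the table tells you everything the flowchart would.

```python
# Shipping rules as data: each (region, speed) pair maps to a rate.
SHIPPING_RATES = {
    ("domestic", "standard"): 5.00,
    ("domestic", "express"): 15.00,
    ("international", "standard"): 20.00,
    ("international", "express"): 50.00,
}

def shipping_cost(region: str, speed: str) -> float:
    """Look the rule up in the table; unknown combinations fail loudly."""
    return SHIPPING_RATES[(region, speed)]

assert shipping_cost("domestic", "express") == 15.00
```

Adding a new region is a one-line table edit, not a new branch threaded through control flow.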

But trying to teach how to do that seems basically similar to trying to teach someone generic problem solving, which professional educators have been banging their heads against forever.

Yes, finding these hidden abstractions feels like "reverse engineering" to me, which in software could be broadly defined as: "determining business rules from code."

What got me started on programming was fractals. To this day, I greet every new language I learn with a Mandelbrot renderer. But I believe there needs to be a hook. For me, it was pretty pictures. That got me into graphics, OpenGL, raytracing, and I learnt programming almost as a side effect. For other people it will be other things. But there needs to be a thing that you want to make the computer do. That empowering cycle, of "I speak the magic incantation and then the machine does my bidding," is what drives motivation, and motivation is the primary factor of learning programming.
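
For anyone curious, the core of such a renderer really is tiny. Here's a generic text-mode sketch in Python (not the poster's code; the viewport bounds and glyph choice are arbitrary):

```python
def mandelbrot(c: complex, max_iter: int = 50) -> int:
    """Iterate z -> z*z + c; return the step at which |z| escapes 2."""
    z = 0j
    for i in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return i
    return max_iter  # never escaped: treat as inside the set

def render(width: int = 60, height: int = 20) -> str:
    """Map each character cell to a point in [-2, 1] x [-1.2, 1.2]."""
    rows = []
    for y in range(height):
        row = ""
        for x in range(width):
            c = complex(-2.0 + 3.0 * x / width, -1.2 + 2.4 * y / height)
            row += "#" if mandelbrot(c) == 50 else " "
        rows.append(row)
    return "\n".join(rows)

print(render())
```

Swapping the inner `"#" if ... else " "` for a gradient of characters keyed to the iteration count is the natural next step, and suddenly you're learning about color mapping, and then shaders, almost by accident.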

This is a subject near and dear to me. Because I positively loathe working with most new programmers. I loathe this entire generation of "programmers" that never had to manage memory. That just stitches libraries together to accomplish 99% of their tasks. And if it isn't garbage collected, and it isn't a library, they flail around uselessly.

That's "programming" in the same way purchasing a bird house kit from a hobby shop is "wood working". Yes, you are technically using a tool that is also used in that activity. But 95% of the work was done for you. And they are barely even cognizant of that fact.

But I'm a fucking crazy person. I got annoyed that 86Box didn't work with my HOTAS once, so I downloaded the source and fixed it myself. A certain especially difficult level of Populous pissed me off to such a degree, I downloaded the version of DOSBOX with the debugger built in and began dissecting the bytecode that affects the AI speed. Successfully, I might add. Patched a version of the game just for me that was a smidge easier for my aged reaction times. When the Oculus Quest came out, I was annoyed that a lot of Gear VR games would run on it, but the controls weren't mapped to the right buttons. Since most of those games just run Unity, and use the built-in Oculus plugins, I discovered I could replace the .net bytecode in the dlls to remap the controls. So I took about a half dozen Gear VR games I liked, unpacked the APK files, edited the DLLs, repacked and signed the APK files, and then loaded them onto my Oculus Quest.

Lately I decided I wanted to learn x86 assembly, targeting the 8088 platform, BIOS/DOS function calls, EGA graphics and Adlib music. Wrote a sprite editor, currently working on an adlib tracker. All in assembly. It's so much fun, I love it.

So yeah, I'm a fucking lunatic.

I came up through a Computer Engineering program. There was programming in all 4 years of it, largely in C++, but also Java and Python (which I hated). But other required courses covered the material properties of transistors, which I largely forgot. Then how to create adders and other simple electronics out of discrete components (NAND gates, XOR gates, etc). Then some assembly on this project board I actually still have, but can't recall at all what it ran.

I still remember, after I graduated, I was talking with a buddy of mine who did a Computer Science degree at a different school. At some point he asked me "So how did we get to where we are now, with operating systems and compilers and stuff?" I was blown away that after 4 years, and a Computer Science degree, he didn't know that.

Increasingly, the teaching of programming is "vibes based" as you put it. Fundamentally incompetent people are being handed degrees. The only reason they appear to be able to perform a task that vaguely resembles programming is that people smarter than they or their entire lineage will ever be set the table for them.

This is a subject near and dear to me. Because I positively loathe working with most new programmers. I loathe this entire generation of "programmers" that never had to manage memory. That just stitches libraries together to accomplish 99% of their tasks. And if it isn't garbage collected, and it isn't a library, they flail around uselessly.

If programming is as simple as piecing together libraries, shouldn't this have a depressing effect on salaries? Yet coders are making more than ever despite the abundance of libraries. Instead of having to be super-efficient, programs are now way more complicated, split into backend and frontend, for example. One challenge has been replaced by another.

If programming is as simple as piecing together libraries, shouldn't this have a depressing effect on salaries?

There's also a lot more demand for software engineers now. Everything runs on apps, web sites, productivity software, etc. Everything that used to be mechanical and complicated now has an embedded computer. Availability of libraries is just one variable that has changed gradually over the course of decades.

Much love and respect to what you're doing and your opinion - I've followed along plenty on your friday threads.

But the problem with this attitude is having everyone write in assembly or C means we'd never get anything done. Reinventing the wheel is fun (sometimes), but it's not efficient. Stitching together libraries correctly is still difficult enough that very few people can be competent software architects.

I'd agree that people need to understand they're standing on the shoulders of giants, and I have seen a marked decrease in that knowledge/attitude among younger programmers. But if I had to build some of the things I've worked on without anyone else's great code, I'd be miserable.

But the problem with this attitude is having everyone write in assembly or C means we'd never get anything done. Reinventing the wheel is fun (sometimes), but it's not efficient. Stitching together libraries correctly is still difficult enough that very few people can be competent software architects.

My rant isn't so much "Everyone needs to write everything in assembly/C" as "Everyone, at some time, needs to have written something in assembly/C".

I sure as shit am not re-inventing the wheel constantly in my day job. I'm stitching together libraries for efficiency's sake, same as everyone else. But having at one point made my own wheel, even if not a particularly good one, I understand you aren't supposed to use a wheel flat on its side. Or try to put a flat edge on it. Which keeps me from doing pants-on-head retarded things with other people's wheels.

But like... there are lots of people who are value-positive programmers in today's environment who would never be willing or even able to do anything worth doing in assembly. I bet you're much better than they are -- you're a grandmaster while they're merely competent -- but surely you agree they're still worthwhile to employ, even if only so they can do the dumb grungy projects on which your talents would be wasted. It feels like you are picturing the counterfactual universe as one in which everyone who was currently employed as a software engineer had your talents and depth of expertise -- but I think the more likely alternative is one in which almost no one can live up to your standards, so almost no one writes software, almost nothing gets built, and our society doesn't get to have nice things.

I'm sorry you have to work with them, though. The depth of conviction that makes great programmers great also means it is torture for them to be forced to collaborate with relatively shitty programmers.

torture for them to be forced to collaborate with relatively shitty programmers

This is mostly because we want them to notice our brilliant optimizations and beautiful abstractions in PR review, and decorate our PR with :thumbsup: and :+100: and :sunglasses: emojis. But it all just goes past them and they Approve without comment.

I completely agree. My university was a little "behind" the times and I was raised with the C method, even my AP class in high school was C++. I always knew that it would be of benefit to me and now after being in the industry, the nodejs generation is trash and I hate working with them.

Okay but garbage collection is better than reference counting for performance though, and reference counting is better than manual for sanity, certainly so when you're writing trees.
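For the tree point specifically, here's a minimal Python sketch (CPython-specific behavior, offered as an illustration rather than a claim about any particular runtime) of why plain reference counting isn't enough once your trees carry parent pointers: the back-pointer forms a cycle, and only a cycle-detecting collector can reclaim it.

```python
import gc

class Node:
    def __init__(self):
        self.parent = None
        self.children = []

def make_tree_with_backpointers():
    root = Node()
    child = Node()
    root.children.append(child)
    child.parent = root   # back-pointer closes a reference cycle
    # both locals die here, but each node's refcount stays >= 1,
    # so refcounting alone never frees them

gc.collect()                  # start from a clean slate
make_tree_with_backpointers()
unreachable = gc.collect()    # CPython's cycle detector finds the orphans
```

In CPython, `unreachable` comes back nonzero here: the two nodes (plus their attribute dicts and the child list) are garbage that pure refcounting could never touch.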

garbage collection is better than reference counting for performance though

These aren't the only two options available. Especially with the advanced research that has gone into static analysis these days.

And still, if you want any kind of performance you still have to worry about locality and all the other fun issues that only come up if you understand what is actually happening behind the scenes.

Being able to do manual memory management well requires more intelligence than just slapping stuff together, and understanding how it works is also very important for using automatic memory management well: if you write a nice, GCed, dynamic language naively, or use a ton of lambdas or temporary objects, it may perform very poorly, whereas making it work well with the JIT makes your code look more like C.

I'm not dismissing garbage collection wholesale. I'm dismissing programmers who have known nothing else.

Here is an example. Once upon a time I had to shove about ten thousand objects into NASA Worldwind and see how it ran. It ran like dog shit. Because whatever fresh graduate they had coding it reallocated memory for each object's screen transformation every frame. So it triggered the GC literally multiple times a frame with a few thousand objects to render. That is a level of idiocy only possible if you learned to program in an environment where you were barely cognizant that memory was a resource with costs associated with it.
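A toy reconstruction of that anti-pattern (hypothetical names, Python instead of Worldwind's Java, and an instance counter standing in for real allocation profiling): allocating a fresh transform per object per frame, versus reusing one scratch object.

```python
class Transform:
    """Stand-in for a per-object screen transform."""
    instances = 0

    def __init__(self):
        Transform.instances += 1
        self.matrix = [0.0] * 16   # pretend 4x4 matrix

def render_naive(objects, frames):
    # the bug described above: a brand-new Transform per object, per
    # frame, which keeps the garbage collector constantly busy
    for _ in range(frames):
        for obj in objects:
            t = Transform()
            t.matrix[0] = obj

def render_reusing(objects, frames):
    # one scratch Transform, overwritten in place each time
    scratch = Transform()
    for _ in range(frames):
        for obj in objects:
            scratch.matrix[0] = obj

Transform.instances = 0
render_naive(range(1000), 10)
naive_allocs = Transform.instances      # 10,000 allocations

Transform.instances = 0
render_reusing(range(1000), 10)
reused_allocs = Transform.instances     # 1 allocation
```

Ten frames of a thousand objects is ten thousand allocations in the naive version versus one in the other, which is the difference between constant GC pauses and none.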

I'm not dismissing garbage collection wholesale. I'm dismissing programmers who have known nothing else.

Eh, this basically feels like a box out of the famous XKCD comic.

Because whatever fresh graduate they had coding it reallocated memory for each objects screen transformation every frame

This really is a problem though, if you write 'clean functional immutable nodejs' where every single function creates a new immutable object, it'll perform terribly. I've done it a few times on fun hobby projects, and it really did perform badly.
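The same failure mode is easy to reproduce in Python (used here for concreteness; the JS version behaves the same way): when every step returns a brand-new immutable object, building n items copies the whole accumulator each time, so the total work is quadratic instead of linear.

```python
def build_immutable(n):
    # 'pure' style: every step allocates a whole new tuple,
    # copying everything accumulated so far -> O(n^2) total work
    acc = ()
    for i in range(n):
        acc = acc + (i,)
    return acc

def build_mutable(n):
    # same result, one list mutated in place -> O(n) total work
    acc = []
    for i in range(n):
        acc.append(i)
    return tuple(acc)
```

Both return the same tuple; time them at n = 100,000 and the difference stops being theoretical.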

And learning manual memory management and writing some C definitely is worth doing in learning, and many fields of programming still require writing it. (That said, in most contexts doing it manually, or even worrying that much about memory management, is awful and just slows you down.)

If you don't understand the difference between dick measuring over your choice of text editor, and fundamentally not understanding how a computer works, then yes, it's superficially similar.

My disgust reaction to argumentum ad XKCD aside, it does make me wonder. Did programmers of old complain when things moved from byte code on punch cards to disks and actual languages, with compilers and all? I certainly had a few professors who seemed to have such a chip on their shoulder.

It wasn't byte code on punch cards to disks and actual languages. Compilers were rather early in the game, and macro-assemblers too. FORTRAN goes back to 1957, COBOL to 1959, and the well-known FORTRAN IV to 1962. FORTRAN was famously designed for punched cards, which is why FORTRAN 77 and earlier had their strict column rules. Aside from grouching about "you're not a real programmer until you've had to sort the box of punched cards you dropped on the ground", I don't think there was serious complaining about the move away from them. Complaints about assembler v. high-level languages survived for much longer.

Yeah, it's probably fair that your point deserved more care and elaboration than argumentum ad XKCD can provide. Which: sorry about that! I was overly flip.

So!

Fundamentally software is a rickety tower of abstractions built on abstractions built on abstractions. At the lowest level you've got logic gates, and if you put enough of those (and some other stuff) together in the right configurations you can make stuff like arithmetic logic units; and if you put enough stuff of basically that abstraction layer together, you have yourself a CPU, and that and some other bits gets you a computer; and then you have the BIOS, the OS on top of that, and the language runtime of the stuff you're working on on top of that, and your program running on top of that. Obviously you already know this.

And the reason this basically kinda works is that a long time ago programmers figured out that the way to productivity is to have hardened interfaces at which you program; the point of these interfaces is to avoid having to concern yourself with most of the vast underground of abstractions that form a computer. Which means that most programmers don't really concern themselves with those details, and honestly it's not clear to me they should in the typical case.

That's because making maintainable software is about ensuring that you are, at all times, programming in the level of abstraction appropriate to your problem domain, neither going higher (resulting in perf issues, typically) nor lower (resulting in bugs and long implementation times as you re-invent the wheel over and over). For every guy who tanks the performance of an app by not respecting the garbage collector, there's another who decides to implement his own JSON parser "for efficiency" and hooks it up to the [redacted] API, resulting in several extremely-difficult-to-debug issues in production that I personally burned several hours fixing, all to shave milliseconds off an hourly batch process's running time. Not that I'm bitter.

So I guess that sort of statement-- "you're only a good programmer if you've used a language with manual memory management"-- feels like unjustified programmer-machismo, where someone chooses one of those abstraction layers between bare physics and the language runtime more-or-less arbitrarily and says "ah, but only if you deeply understand this specific abstraction layer can you truly be a good programmer."

Admittedly I work in distributed systems, where 99% of things that actually matter for performance occur over the network.

I'll second memory management, or more broadly understanding pointers, as being an important boundary.

There is nothing fundamentally lost moving from machine code to assembly - one's just shorthand for the other. Transforming C mechanically into assembly by hand is not hard, just pointless and tedious. But languages "above" memory really truly do lose sight of something. There is nothing in the Haskell Man's conceptual toolbox he can use to get a handle on the memory of the system he runs on.

I'll grant it's not often important, but there is a real line between "languages which require memory awareness" and languages that do not, and it's not arbitrary. A real aspect of system execution and performance is totally lost.

Abstractions leak. All of them. Even down to the physical layer. A great programmer needs to know enough about the layers below to recognize when they're being bit by a leaky abstraction. If you don't understand enough about the layers below to be able to recognize an operating system call that doesn't work, or a CPU that mis-executes an instruction, or a memory management system that causes your code to grind to a halt while you pay the cost of deallocation, you're going to be unable to solve a problem. Thing is, physical layer issues are very rare (and simple bit flips are much more common than something as esoteric as rowhammer). Operating system bugs are at least reasonably rare with mature operating systems. Issues due to allocators, on the other hand, happen ALL THE TIME if you have a lot of data. So they're rather more important.

I would honestly put both our complaints in the bucket of "Programmer is ignorant about the level of abstraction they are working at." But people don't know what they don't know. And the only way out is for them to have worked at multiple levels of abstraction to know the bounds of the one they are currently operating in.

I know some really old programmers (old enough to remember punch cards) and while I wouldn't say they are unhappy with the spectacular leaps in ease and power of computing, there is definitely an element of "I had to walk uphill ten miles in the snow both ways" in their attitudes.

A couple things stand out to me from your post. I consider myself a very good programmer, and after managing around 100 other programmers I've gotten some sense of what makes good ones. Note that my market is east coast enterprise software, for whatever that's worth.

First, I find it surprising you've never met a great female coder. I think you need to work with more people. They're rare, absolutely, but if you're not in a place that hires women because they're women then the bell curve of their quality is the same as men from my experience.

I was trained in the "intuition" method. While I've met phenoms who were self-taught or bootcampers, my guess is around 2% of them are good enough for me to want to hire, vs around 10% of those who have gotten a degree (and the degree programs they went through led with an early focus on data structures, etc.).

That being said, I did find so much theory for so long to be incredibly frustrating. It's not a stretch to say that I learned more about coding in the monthlong training put on by my first company than I did in a whole year of school. I think the theory matters, but the current CS and SWE accreditation requirements need some major adjustments, IMO. I would have killed for a databases class and a requirement to use source control.

I agree with you that trying to dumb down coding instruction seems like a win but isn't. You may keep freshmen around long enough to sink into the program where they can't get out without burning money, but they'll be shittier programmers. Your brother's professor using Processing seems almost unbelievably stupid for so many reasons. Java and C# are already high level enough for a newbie to take a crack at them, much less something like Python. Why on earth would you waste a semester not developing exposure to a valuable language?

I am actually, however, grateful that my C/C++ class was second year. The place I was taught had a brutal class that was supposedly just 1 hour but was easily the hardest single programming class I took. Because of it I was able to wipe the floor with other students at a different school when I transferred, but I honestly don't know if I would have fallen in love with programming if my only runtime error messages were just segmentation fault.

Agreed that C and C++ bloooooow as starter languages. You want something with reasonable error messages and stack traces. And good IDE support-- I think statically typed is actually lower-frustration than dynamically-typed while learning because the compiler tells you if you've fucked up in a particularly obvious way before even running the program.

EDIT: Also if I never again have to write a conversion function between (pick any two) char *, wchar_t *, _bstr_t, CComBSTR, CStringA, CStringW, basic_string, and System::String it'll be too soon.

EDIT: Also if I never again have to write a conversion function between (pick any two) char *, wchar_t *, _bstr_t, CComBSTR, CStringA, CStringW, basic_string, and System::String it'll be too soon.

Oh god. Oh god no. All those repressed memories are coming back. What have you done?! I don't have enough whiskey in the house to make this go away!

my only runtime error messages were just segmentation fault.

gdb --args !!

You can even set the SIGSEGV handler to automatically attach gdb to the running program.

GDB is

  1. not easy to learn

  2. even less easy to learn if you are a part of the modern GUI/webapp/the-fuck-is-a-shell generation (so, the problem statement at hand)

  3. not something that scales to larger projects, so you can hardly say you'll use it in a real job

Compare it with, let's say, the chrome debug console. Or the vscode debugger for python. They're far more intuitive than x/10g, info all-regs, b 0x1234, ni×100, etc.

Sure, but just bt and q already gets you 50% of the benefit.

You can just run a nodejs program with --inspect and the chrome debugger will automatically find it and connect (green icon, top left corner), and then you can debug it just like you would a website! It's much nicer than gdb.

I took the theory-and-math-heavy track at school because I figured it would be interesting. It's true that I learned more during an internship than in any of those courses. In programming in particular, degrees definitely are signaling and gatekeeping.

Still, I think programming is in its infancy. Whenever discussions about "is it really engineering or not" come up and the inevitable comparison to bridges and EE happens, people point out that software is bad and buggy, and lots of people don't follow "best practices."

It could be that in a hundred years or so, they will look back and laugh at how primitive we were. Maybe training catches up and programmers become everydaymen like plumbers instead of passionate autists.

Still, I think programming is in its infancy. Whenever discussions about "is it really engineering or not" come up and the inevitable comparison to bridges and EE happens, people point out that software is bad and buggy, and lots of people don't follow "best practices."

I've had that conversation, and I'm not sure we'll ever be able to get to that level of "best practices", because the need for software so heavily outstrips the supply. If we needed 100 bridges tomorrow or the world was going to end, I'd assume there'd be some that aren't able to handle heavy trucks. See Chinese physical-engineering corner-cutting.

Plus, getting a great gay sex app together is less important than a bridge. Software isn't as consistently life-critical as designing roads is (and when it is, the standards are really high).

Finally, there's the fact that we're dealing with building things that at some level don't have simple and constant physical constraints to box us into "standards". Nobody can build a bridge that corkscrews up to the moon on their own, but you can do things with software you can't do with any other type of engineering.

A few thoughts:

I learned programming I think by something closer to the first intuitive method, and it's still how I generally encourage people to learn. I would describe it as, first write a program that does something you find interesting or useful, no matter how minor. Pay little attention to design, cleanliness, optimization etc, just bang it out and get it working. Once you have something that basically works, then work on refactoring and adjustment to make it better, possibly with the advice and supervision of somebody more senior. Expand that initial project to be bigger and do more things, or start a new project that's more ambitious, and repeat. Keep at it for a while and eventually you'll learn all of the important parts and gain experience.

I see this as useful in that it maximizes natural interest. I find it rare and difficult to build interest in reading walls of text about elaborate rule systems regarding things you've never done. Both building things you find interesting or useful and getting hands-on experience in how to make it better are much easier to maintain interest in and stick with.

I do think that there's a certain type of intelligence or way of thinking necessary to be a good programmer. It's probably kind of correlated with proper IQ, though not quite the same thing. Like many other related things, I have no idea to what extent it's genetic versus developmental, but clearly many adults just aren't capable of it.

Regarding gender, I'm not really sure why, but it seems women self-select out of development at a very high rate, most specifically American women. Of all of the developers at my company, I think under 10% are women. Women are a lot more common in testing, product reps, project managers, line management, pretty much everything but development. There are zero women in our architecture groups. I'm not honestly sure if women are actually less likely to want to advance into the higher ranks of technical skill, or if it's actually about the same for the total number of women doing development work. I've never personally worked with a woman I thought was a super awesome developer, though some other people I would trust to make such calls have made such claims of some of the other women I work with, and I'd also say not that many of the men were super awesome either.

I haven't seen any reason to adjust my priors on that beyond the 90s-era standard - anyone who has the skills and the interest is welcome to do the job, but I reject the notion that there's any significant overall bias or prejudice keeping deserving people out of the profession, or that we need to put our fingers on the scales in some way to get "better numbers" on the participation of any particular group.

I consider myself a competent (not great, certainly not "rock star") programmer. I was formally taught. I definitely have strong areas and weak areas. I am good at coding and recognizing when someone else has written crappy code, and figuring out higher-level solutions. I am not great at grinding out leetcode problems (I can do it, but not brilliantly or easily), and algorithm classes were my bane.

(I am trying to teach myself about quantum computing right now, just out of personal interest, and man, I am not enjoying having to brush up on my linear algebra.)

Coding to me is more art than science, but I learned the science and understand the need for that foundational background. Thus, I agree with most of your observations. Most people can learn to build some things by following tutorials and memorizing patterns, and that's probably good enough for putting together apps like legos, but real programming requires knowing something about the bits and the hardware, at some level, and knowing a lot about data structures and algorithms.

I think that, as you say, persistence is the key to being a good programmer. Anecdotally, I have also observed few really good women programmers, but I suspect this is more because the kind of persistence/obsessiveness involved in figuring out solutions to hard problems with no more reward than that sudden "Aha, I finally got this fucking thing to work!" is rare in women. Whether that rarity is the product of biological or social conditioning is kind of irrelevant, frankly - women just don't usually have it.

That said, I also see in newer hires a lack of that sort of persistence both among the men and the women. Probably somewhat "Old man griping about the young'uns" here, but I've seen an increasing tendency of kids with fresh CS degrees to not really know how to independently track down the solution to a problem, including scouring stackoverflow and APIs et al. They sort of throw their hands up and ask very broad questions in channels like "Hey, this thing isn't working, can anyone tell me why?" Or "How do I do [thing]?" ("Thing" being a fairly complicated multi-step process that requires experience mostly consisting of trying to do it and figuring out how to get past each obstacle one by one.) And they do this at every step. I have to gently point them at where to go to look for the solution to the next step, when I want to say "This is called being a programmer - read the documentation, use Google, and figure it out!"

My general impression is that the quality of CS grads has definitely gone down. Admittedly, I am not seeing a lot of grads from the top schools, so maybe things are better there and all those kids go to FAANG companies.

That said, I also see in newer hires a lack of that sort of persistence both among the men and the women. Probably somewhat "Old man griping about the young'uns" here, but I've seen an increasing tendency of kids with fresh CS degrees to not really know how to independently track down the solution to a problem, including scouring stackoverflow and APIs et al. They sort of throw their hands up and ask very broad questions in channels like "Hey, this thing isn't working, can anyone tell me why?" Or "How do I do [thing]?" ("Thing" being a fairly complicated multi-step process that requires experience mostly consisting of trying to do it and figuring out how to get past each obstacle one by one.) And they do this at every step. I have to gently point them at where to go to look for the solution to the next step, when I want to say "This is called being a programmer - read the documentation, use Google, and figure it out!"

I am amazed at how many programmers never read the documentation. I've had many coworkers, not just younger, give up if they don't already know the solution to the problem. It should be bog standard to peruse the API calls, and fiddle in test programs with anything you don't understand. But most don't.

They don't want to discover a solution, they want someone to figure it out for them, and tell them exactly what to do.

A level up from that would be reading the source code of the libraries yourself (assuming it's available). A level up from that would be studying the byte code as an absolutely last resort.

Because sometimes you do find bugs in libraries. Or annoying deficiencies. Once I was using an open source websocket library, but had to use a bearer token for authentication. The websocket library came years before the requirement for bearer tokens, and it did not support it at all. The guy who maintained it had weird ideological reasons to never include it. But fuck it, it's open source. So away I went hacking it in. No big deal. But it's remarkable how few programmers are truly willing to go wading through someone else's code.

Honestly, given the quality of the documentation provided by most libraries etc these days, it's often easier to jump straight into the source. At best, the behaviour is undocumented. More likely the documentation is from six months ago when it worked entirely differently. The devs probably notified everyone about the change in behaviour via Discord or something.

EDIT: I think being given solutions to problems tends to be more valuable when you've tried your best to solve the problem yourself already. Then the answer is like a flash of light. If you just read about a problem and a solution, it's just a series of facts to store away for later.

The latter is a more 'rigorous' approach where students are taught the fundamentals such as data types, structures, flow, interpreter vs compiler, etc first; Then they are made to write programs. These programs are sometimes gamified but not to the extent as the former[...] I consider the latter "imparting knowledge" method superior. It's more in line with all the hard sciences I have been taught and all the good programmers I am aware of claim to have been taught using this method.

I realized as an adult that I do not retain knowledge if I am given that knowledge before I have any way to apply it. I suspect I'm not alone in this; but regardless, I strongly prefer the teaching methodology where you are made acquainted with tools by being given problems which necessitate using those tools. By "tools", here, I refer to algorithms and data structures, among other things. (I think this is why, even though I loved my Algorithms and Data Structures courses, I hated Operating Systems and whatever one it was that taught us assembly language. I retained very little of those and do not count them among the good or useful courses I took.)

I'm aware that this "knowledge-first-to-use-it-later" approach is similar to how the hard sciences are taught; I hated it there as well.

My actual start in programming came from hacking around in the Civilization 4 Python codebase, where I built mods for Fall From Heaven 2 and by necessity had to learn programming syntax-- I was only formally educated in programming later. Contrary to what your argument above would predict I was by far the strongest coder in my graduating class, and went on to get a job in FAANG (where I was, in my judgement, roughly at the top 20% of programmer strength in the company.)

So I don't know the total of what my "ideal programmer education" consists of, but I'm pretty sure a big chunk of it would involve writing a self-designed mod for the game Slay The Spire.

Okay okay, hear me out, this has a number of advantages:

  1. Slay the Spire is entirely programming-first. There is no "editor" interface, as a Unity game would have.

  2. Slay the Spire modding has, as its first step, decompiling the codebase. This gets your student exposure to "the act of having to understand somebody else's extremely nontrivial code".

  3. The codebase is also written using fairly reasonable best practices, particularly for a gaming studio-- it uses polymorphism to deal with all the myriad cards and their effects, which allows you to see very intuitively how polymorphism is used in the wild and why it's valuable. (I know that in my own programming education all of our programs were trivial enough that interfaces and abstract classes seemed weird and pointless, and none of my instructors could give what felt like adequate explanations for their use.)

  4. You can get something pretty cool out the other side-- a game mod! Having something cool and nontrivial that you're in the process of building is worth any number of credit points in inspiring motivation to actually learn programming.

  5. It's Java, which is a very standard programming language which features automated memory management.

So I think if I were designing a programming practicum it would feature game-modding as a big part of it, with perhaps some Code Combat or similar coding game in the first couple of weeks to familiarize students with the basic syntax and philosophy around programming in some reasonably entertaining format. And, of course, some problem sets later that showcase situations where students are given no choice but to use the standard data structures and algorithms.
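The polymorphism point in (3) is worth making concrete. Here's a hedged sketch, in Python rather than the game's actual Java, and with invented card names and fields, of the shape that design takes: each card overrides one method, and the engine plays a hand without ever checking concrete types.

```python
class Card:
    cost = 1

    def play(self, target):
        raise NotImplementedError

class Strike(Card):
    def play(self, target):
        target["hp"] -= 6          # attack card: deal damage

class Defend(Card):
    def play(self, target):
        # skill card: gain block (stored on the target dict for brevity)
        target["block"] = target.get("block", 0) + 5

def play_hand(hand, target):
    # engine code: one loop, no isinstance checks; a new card type
    # slots in just by subclassing Card
    for card in hand:
        card.play(target)

enemy = {"hp": 20}
play_hand([Strike(), Strike(), Defend()], enemy)
```

This is exactly the "why would I ever want an abstract class?" question answered in miniature: the engine's loop never changes no matter how many hundreds of cards get added.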

What were you doing with FFH? I might have some of your work installed.

I'm a very poor programmer, the only thing I did was edit some Leader files to stack good traits on an Amurite leader in Ashes of Erebus so I could try to make an archmage Master of all the elements and achieve omnipotence.

I did the More Leaders Modmod!

The coding was extremely low-quality and the Python was probably buggy as hell. But it was mine.

EDIT: Wait, I think Ashes of Erebus did end up incorporating some of my work! How's that project going, by the by?

I think Ashes is basically finished. They added a race of Hamstalfar, hamsters lording over elves. That broke my immersion somewhat.

Was your More Leaders anything to do with the Minor/Emerging leaders in Ashes of Erebus?

I added a bunch of minor leaders, but I didn't do any of the mechanics behind Minor Leaders in general.

I... did not much like the Hamstalfar.

I also have the experience that I am incapable of remembering things that are not applicable.

The way I was taught Python was a cycle of:

  • Learn a new idea (conditionals, for loops, Big-O notation, lists/sets/dictionaries/strings, recursion, objects, etc.)

  • Solve a few problems that require that idea in isolation

  • Solve a few problems that require that idea in addition to everything else I know

  • repeat for the entire semester

  • Solve one problem using many (but not all) of the ideas, with minimal direction

Once you have a big enough toolbox, you have to start solving the problems instead of guessing the teacher's password.
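As a tiny example of that "many ideas at once" end-of-semester step: memoized Fibonacci touches several items on that list in one problem, namely conditionals, recursion, dictionaries, and a Big-O improvement from exponential to linear.

```python
def fib(n, memo={}):
    # the shared default dict acts as a cache across calls (a common
    # Python idiom; a module-level dict works just as well)
    if n < 2:                      # conditional: base case
        return n
    if n not in memo:              # dictionary: solved this one yet?
        memo[n] = fib(n - 1) + fib(n - 2)   # recursion
    return memo[n]                 # O(n) total instead of O(2^n)
```

Without the memo dict, fib(50) is effectively uncomputable this way; with it, it's instant — a nice way to make Big-O feel real.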

Based on my anecdotal experience of people wanting to learn programming from me, the only reason a person can't code is because they aren't motivated. I know lots of people who like the idea of coding, or like the salary of it, but they don't like coding. Most good programmers I know like it. There are things I like the idea of, or like the rewards of, but I don't like the activity of it (social games/PUA), so I stay bad at them.

I think that's way more important than the choice between games or formal whatever theory.

I was self-taught from a young age. It never really felt like “learning”, it was just “fun”, because I was so fascinated by it all. Constantly trying out new languages, thinking of new little projects to do (everything from webdev to bare metal stuff like bootloaders), I even took programming books with me when we went on family vacations. None of it was very structured, but I did end up exposing myself to a very broad variety of areas.

In general it’s very hard to teach someone a complex skill when they have neither intrinsic talent nor intrinsic motivation. I have no suggestions about what to do in that case, except for asking him which topics he finds the most interesting and encouraging him to work on those things.

Edit to expand a little more: I think all learning is essentially self-learning. In the course of my life, I have never encountered any significant evidence to the contrary. Talk about using a “gamified” approach or a “rigorous” approach is ultimately neither here nor there, because YOU can’t actually “teach” him anything with either method. He has to do the legwork himself. He may gravitate to a more rigorous approach, or a more vibes-based approach, it doesn’t really matter. In my view, the role of a teacher is mainly to be a sounding board for questions, but crucially the teacher should be an intelligent sounding board. They should not only provide answers, but they should also examine the hidden underlying assumptions behind the student’s questions, and they should correct misconceptions and guide the student in the right direction when they seem to be going down a bad path.

Teach them python. Give them all domain names, and digital ocean droplets.

Let them troll each other. Fin.

I broadly agree except for the part about women (I actually have known a lot of women who were great software developers, although I agree that women seem proportionally less likely to get good at it) and had a similar background: I started in CS, where I remained for two years (taking all of the CS classes through data structures in C) before changing my focus to a physical science. I now work in a job where I am essentially a data scientist for a group of (much smarter) physicists, all of whom can program with the kind of intuitive knowledge that you described. All of these people (who have been perfectly willing to teach themselves other things in math and physics) are hugely constrained by the fact that they don’t really know how data structures work. If they ever bother to learn, I’ll be out of a job.

I have asked some of my colleagues how they learned to program. The older ones are all completely self-taught (and are usually a little better, since this means they had to learn C or Fortran), while the people my age usually got some sort of CS-for-physical-scientists course (usually a one-semester class in Python where they learned to use numpy and matplotlib). I really don’t understand why more of them haven’t taught themselves more, and I have concluded that it must be a matter of motivation, or maybe even professional chauvinism.

Lots of scientists fundamentally don’t believe that things which happen on a computer are, or could be, that interesting. This is just insane to me given that computers are finally getting powerful enough that they can run some of their experiments with simulations (or at least use simulations to better plan them). Given the importance of these skills, I really believe that universities should require physical scientists to spend a year taking real computer science, or whatever else it takes to make more of the profession aware that these skills are becoming as important as competence in mathematics for doing original research these days.

This is just insane to me given that computers are finally getting powerful enough that they can run some of their experiments with simulations (or at least use simulations to better plan them)

This has been true for a while, though. Computing in general, whether by simulation, code for a tool, or just analyzing large amounts of data, has taken over or created many subfields in the hard sciences.

I agree, and I should have clarified that I specifically meant High Performance Computing, which is where knowledge of things like parallel programming is essential.
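For anyone who hasn't seen it, here is a minimal sketch of the data-parallel pattern that HPC work builds on: split a problem into independent chunks, compute them in parallel, and combine the results. This is my own toy example using only the Python standard library (real HPC codes would use MPI, OpenMP, CUDA, etc., and the function names here are just illustrative).

```python
# Sketch of data parallelism: integrate f(x) = x^2 over [0, 1) by
# splitting the interval into chunks and evaluating them in parallel.
from multiprocessing import Pool

def integrate_chunk(bounds):
    # Midpoint-rule integral of x^2 over [a, b) with a fixed step count.
    a, b = bounds
    n = 1000
    h = (b - a) / n
    return sum((a + (i + 0.5) * h) ** 2 * h for i in range(n))

if __name__ == "__main__":
    chunks = [(i / 4, (i + 1) / 4) for i in range(4)]  # split [0, 1) into 4
    with Pool(4) as pool:
        total = sum(pool.map(integrate_chunk, chunks))
    # the exact integral of x^2 on [0, 1] is 1/3
    assert abs(total - 1 / 3) < 1e-4
```

The embarrassing part is that the hard step, decomposing the problem so the chunks really are independent, is exactly the step a one-semester numpy course never teaches.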

I have come across two fairly distinct methods of teaching programming. I would classify them as 'trying to impart intuition' vs. 'trying to impart knowledge.'

The former consists of teaching via gamified methods: students play elaborate games built from programming puzzles, modify existing code to draw 2D shapes and animations, and so on. Once students develop some familiarity with changing words in a file to make the computer do something, they are introduced to data types, data structures, control flow, etc.

The latter is a more 'rigorous' approach where students are first taught the fundamentals, such as data types, data structures, control flow, and the difference between an interpreter and a compiler; then they are made to write programs. These programs are sometimes gamified, but not to the extent of the former.

I consider the latter "imparting knowledge" method superior. It's more in line with how all the hard sciences I know are taught, and all the good programmers I am aware of claim to have been taught this way. More on this later.

I want to push back on your last point.

I went to a university that was top 20 on this list, was one of the strongest programmers of my year, and am currently working at a FAANG. My introduction to programming was in high school and mostly self-taught -- I took two programming courses in high school, but they were primarily HTML-oriented. Before college, I was never explicitly taught about if-statements or for-loops (though I had been using them for ~3 years) and didn't know asymptotic notation, heaps, hash tables, etc.

Then I got to college and voraciously consumed all the "formal" CS education. So my education was primarily the former method, and "I turned out fine". (Though I certainly would have appreciated getting a formal CS education a year or two earlier, and rigorous education is certainly necessary at some point.)

The main caveat is that "move this frog around the screen to eat bugs" is kind of a pathetic attempt to gamify education, and I would have hated it. Working on my own projects and actually finishing them made programming seem basically magical, and (imo) that's what made me super excited to learn the formal stuff when I got to college.

Yeah, I have kinda the same story.

The first programming I did was on a graphing calculator, learning only from the calculator's manual. I had no clue what loops were, or what a condition was, much less what a data structure was or why I would need one. I just wanted a program to do some calculations for me on a test, so I hacked away in BASIC.

Then when I finally took proper programming courses, everything fell into place, and I'm fully convinced that the prior experience was essential. I was actually engaged when learning about the more complex topics of programming.