
Culture War Roundup for the week of November 21, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


There are a lot of novel bad things happening in America right now, ranging from inconvenient to life-altering. The things I've been hearing about from my social circle include major tech layoffs, inflation, and increased serious illness due to diseases like RSV and flu hitting people in unexpectedly strong ways. My general response to this has been, "well, maybe next time we shouldn't shut down the entire world over a relatively non-dangerous disease like coronavirus." Basically, I'm implying that there's a line of causation from the COVID lockdowns of a few years ago to the economy now failing, to people's immune systems now failing, etc. Do you think this is a fair response to take? To be honest, there are probably a lot of other factors at play that I'm not accounting for in that analysis, due to my unfamiliarity. These may include foreign issues, like Russia's invasion of Ukraine leading to increased energy prices, etc.

They're doing dull, rote work that often amounts to little more than taking someone else's code or architecture and adapting it very slightly to fit a specific new situation.

Come to think of it, why did computer programming stop being the "women's work" it originated as? Because that... actually kind of fits the description of secretaries and computers (as in, the job title), but in practice (in 2022, but it was true in 2010 to a large extent too) it's a little different than that. And I can kind of see it with more imperative "only do this thing" FORTRAN and, later, Excel-as-programming language, but it's weird that it doesn't apply to software as a whole (though MS' Power Apps platform might have something to say about that).

I think that it might be worth looking at the tooling and tools; I believe the software field and its developers are just uniquely bad at writing good documentation, and it's gotten to the point where you actually have to do heavier analysis to get anything done anymore.

Maybe having to dig hard to get anything done in all these damn frameworks was job security after all?

Come to think of it, why did computer programming stop being the "women's work" it originated as?

In essence, it never was. That's a just-so story spread by tech SJWs and their predecessors, based mostly on the ENIAC programmers and one article by Grace Hopper that was trying to encourage women to become programmers. The ENIAC programmers were women mostly because the machine was built during WWII, when (young) men were in short supply. As of 1960 (the first figures I can find), only 31% of "Computer Specialists" (there was no further breakdown) were female. (And yes, that's higher than the share among programmers today.)

I'd bet that "computer specialists" includes computer operators who just typed things in and didn't program.

"Computer Specialists" was later broken down into "Computer Programmers" and "Computer Systems Analysts" (plus a very small "not otherwise specified" category), neither of which would be mere typing; there were other categories for that. Some misclassification is possible, of course, but I doubt it was all that significant. There were ~13,000 computer specialists in 1960, 31% of whom were women. By 1970 (I have no intermediate data) there were 258,000, 20% of whom were women. In 1990, 35% of the 974,000 in the equivalent occupations (according to me, anyway) were women, and that's the absolute peak; we see a nadir of 22.5% in 2009, and it's been stuck around 24% since then.

Come to think of it, why did computer programming stop being the "women's work" it originated as?

Frankly, programming now is far more complicated than it used to be for developers. There are far more moving parts and our expectations of what an engineer has to know and do continue to expand at an insane pace.

Some of that comes from what we expect out of applications; the rest is self-inflicted wounds from bad software architects.

Consider what would pass for a simple CRUD application, one that would someday be maintained with:

dull, rote work that often amounts to little more than taking someone else's code or architecture and adapting it very slightly to fit a specific new situation

That app will still have:

  • A database using a language used nowhere else in the application, with its own infrastructure and design

  • A middle tier using a language used nowhere else in the application, with its own infrastructure and design. It must account for security, access to the database, and working with various clients (usually the front end).

  • The middle tier may itself integrate with other applications, and each of them use a specific security model. It has to translate information from the model used in those external apps into its own (and then sometimes back out)

  • A front end using a language used nowhere else in the application. It too must account for security and access to the middle tier. It utilizes a dizzying collection of packages (along with the worst package manager in the industry) and has to translate information from the middle tier into its own models. It has to handle user input, control-flow logic for users, and routing.

  • All of this will be managed and deployed with an ALM tool and pushed out to the cloud (if you're lucky). There's a whole 'nother set of security concerns here, the idea of environment progression, tracking work and generating release notes, running tests, and provisioning infrastructure as code itself in YET ANOTHER language used nowhere else in the application.

There are ways to simplify all this - for instance, you could in theory use a single language across an application, though that has serious downsides too.
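To make the single-language option concrete, here's a minimal sketch (my own illustration, not from the thread) of a CRUD core where storage, logic, and serialization all live in one language, with Python's stdlib sqlite3 module standing in for a separate database tier:

```python
import json
import sqlite3

# Minimal single-language CRUD core: storage, business logic, and
# serialization all in Python. Purely illustrative -- a real app would
# still need auth, migrations, a front end, and deployment tooling.

def make_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    return conn

def create_user(conn, name):
    cur = conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
    conn.commit()
    return cur.lastrowid  # id of the newly created row

def read_user(conn, user_id):
    row = conn.execute(
        "SELECT id, name FROM users WHERE id = ?", (user_id,)
    ).fetchone()
    # Serialize to JSON, as a middle tier would for its front end
    return json.dumps({"id": row[0], "name": row[1]}) if row else None

conn = make_db()
uid = create_user(conn, "alice")
print(read_user(conn, uid))  # {"id": 1, "name": "alice"}
```

Even here, the embedded SQL strings are effectively a second language, which is part of why "one language" is never quite true in practice.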

And of course setting all this up is a solid order of magnitude harder than updating it.

But a typical bug is going to cut across every one of these components. Compare that with writing an accounting program that performs an equation on a couple of numbers sent as input into the system, which is mostly what legacy computing was when women were equally represented. You could write out an entire program on a sheet of paper if you were doing it in English. That's not remotely the case with modern development.
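For contrast, the legacy-scale program described above really can fit on a sheet of paper. A sketch (illustrative, not any historical program; the function name and inputs are my own) of a whole "accounting application" of that era's scope, in Python:

```python
# The entire "application": take a couple of numbers as input, apply one
# equation, print the result. Here, a standard loan amortization formula.

def monthly_payment(principal, annual_rate, months):
    r = annual_rate / 12  # monthly interest rate
    return principal * r / (1 - (1 + r) ** -months)

# $10,000 at 6% annual interest over 36 months
print(round(monthly_payment(10000, 0.06, 36), 2))  # ~304.22
```

No database, no middle tier, no front end, no deployment pipeline: a bug can only live in those few lines.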

There are ways to simplify all this - for instance, you could in theory use a single language across an application, though that has serious downsides too.

Well, it would also help if the only language to be used in this manner wasn't fucking JavaScript, and it would also help if there was market space for any competitor to Microsoft, since they're to my knowledge the only ones who actually seem to try to unify stuff (and when they don't, they just buy the companies that sell and do hostile takeovers on the ones that don't; press F for Borland). Of course, that costs money and there's always that Embrace-Extend-Extinguish thing going on... which will almost certainly haunt that company for the rest of its days.

Meanwhile, the software development community at large would rather just sit there and suffer with (comparatively) sub-par tooling; it's telling that people brag about their favorite development environment being a shitty 1970s text editor in a way unique among tradespeople (like an electrician choosing to do knob-and-tube wiring for a new install).

It's a weird trade to be in, for sure.

I used to think software dev tools were bad until I started seeing what an average mechanical or electrical engineer has to deal with. We are simply too spoiled by free, open access to an incredible array of tools and like to bitch whenever something is slightly subpar.

Wellll, re: Microsoft, in theory you can build a full-stack app almost entirely in C# with Blazor and Entity Framework :)

Which I would know if I were a disgusting .NET-loving peasant.

Which of course I'm not.

Knowing the languages is the very easiest part, IMO. Getting to know which parts of your application interface with which other ones and which external services how exactly and how the entire CI pipeline works, that's what seems to get more complicated by the day.

Freemcflurry might have the right of it.

Or the field may have been so niche that expertise was randomly distributed. Hopper was certainly doing “real” programming and probably had a staff of card-sorting interns.

This is also compatible with a theory that women got pushed out as soon as the field became prestigious/expensive enough to attract a larger talent pool of men.

My understanding is that "computer programming" as we think of it today was never women's work. What the Buzzfeed articles called programming was more like taking a program written by a man and transcribing it on to punch cards, similar to how a secretary would take dictation in shorthand and then type it up.

That job was called "keypunch operator" and as far as I know was never considered "programming".