
Culture War Roundup for the week of November 21, 2022

This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.

Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.

We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:

  • Shaming.

  • Attempting to 'build consensus' or enforce ideological conformity.

  • Making sweeping generalizations to vilify a group you dislike.

  • Recruiting for a cause.

  • Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.

In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:

  • Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.

  • Be as precise and charitable as you can. Don't paraphrase unflatteringly.

  • Don't imply that someone said something they did not say, even if you think it follows from what they said.

  • Write like everyone is reading and you want them to be included in the discussion.

On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.


Come to think of it, why did computer programming stop being the "women's work" it originated as?

Frankly, programming now is far more complicated than it used to be for developers. There are far more moving parts and our expectations of what an engineer has to know and do continue to expand at an insane pace.

Some of that is because of what we expect out of applications; the rest is self-inflicted wounds from bad software architects.

Consider what most people would call a simple CRUD application, one that will someday be maintained with:

dull, rote work that often amounts to little more than taking someone else's code or architecture and adapting it very slightly to fit a specific new situation

That app will still have:

  • A database using a language used nowhere else in the application, with its own infrastructure and design.

  • A middle tier using a language used nowhere else in the application, with its own infrastructure and design. It must account for security, access to the database, and working with various clients (usually the front end).

  • The middle tier may itself integrate with other applications, each of which uses its own security model. It has to translate information from the models used in those external apps into its own (and sometimes back out again); a short sketch of that translation step follows this list.

  • A front end using a language used nowhere else in the application. It too must account for security and access to the middle tier. It utilizes a dizzying collection of packages (along with the worst package manager in the industry) and has to translate information from the middle tier into its own models. It has to handle user input, control-flow logic for users, and routing.

  • All of this will be managed and deployed with an ALM tool and pushed out to the cloud (if you're lucky). There's a whole 'nother set of security concerns here, the idea of environment progression, tracking work and generating release notes, running tests, and provisioning infrastructure as code itself in YET ANOTHER language used nowhere else in the application.
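
To make the middle-tier point above concrete, here is a minimal sketch of the kind of model translation being described. Everything in it is hypothetical (the ExternalInvoice and Invoice types and their field names are invented for illustration); the point is only that the middle tier spends much of its code reshaping someone else's model into its own, and that the database, front end, and pipeline layers each repeat that exercise in their own languages.

```csharp
using System;

// Hypothetical shape of a record as an external billing service might return it.
// Field names and types are invented for illustration.
public record ExternalInvoice(string InvoiceId, string AmountCents, string IssuedOn);

// The application's own internal model, with the types it actually wants to work with.
public record Invoice(Guid Id, decimal Amount, DateOnly Issued);

public static class InvoiceTranslator
{
    // The middle tier's job in miniature: take the external representation,
    // reshape it into the internal model, and (when needed) translate back out.
    public static Invoice ToInternal(ExternalInvoice ext) =>
        new Invoice(
            Id: Guid.NewGuid(),                            // external IDs are strings; we key on our own GUIDs
            Amount: decimal.Parse(ext.AmountCents) / 100m, // cents-as-string becomes a decimal amount
            Issued: DateOnly.ParseExact(ext.IssuedOn, "yyyy-MM-dd"));

    public static ExternalInvoice ToExternal(Invoice inv, string externalId) =>
        new ExternalInvoice(
            InvoiceId: externalId,
            AmountCents: ((long)(inv.Amount * 100m)).ToString(),
            IssuedOn: inv.Issued.ToString("yyyy-MM-dd"));
}

public static class Program
{
    public static void Main()
    {
        var fromPartner = new ExternalInvoice("INV-1042", "129900", "2022-11-21");
        var ours = InvoiceTranslator.ToInternal(fromPartner);
        Console.WriteLine($"{ours.Id}: {ours.Amount:C} issued {ours.Issued}");
    }
}
```

And that is only one hop: the same shuffle happens again in SQL at the database boundary and in JavaScript at the front-end boundary.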

There are ways to simplify all this - for instance, you could in theory use a single language across an application, though that has serious downsides too.

And of course setting all this up is a solid order of magnitude harder than updating it.

But a typical bug is going to cut across every one of these components. Compare that with writing an accounting program that applies a formula to a couple of numbers sent into the system as input, which is mostly what legacy computing was when women were equally represented. You could write out an entire program on a sheet of paper, even in plain English. That is not remotely the case with modern development.
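
For contrast, the whole of the kind of program being described really does fit in a handful of lines. This is a toy sketch (simple interest is an arbitrary choice of formula), but there is nothing else to it: no database, no middle tier, no front end, no pipeline.

```csharp
using System;

// A toy stand-in for the "formula on a couple of numbers" style of program:
// read principal, rate, and years from the command line, compute simple
// interest, print the result. The formula choice is arbitrary; the point is
// that this is the entire program.
public static class SimpleInterest
{
    public static void Main(string[] args)
    {
        decimal principal   = decimal.Parse(args[0]);
        decimal ratePercent = decimal.Parse(args[1]);
        int years           = int.Parse(args[2]);

        decimal interest = principal * (ratePercent / 100m) * years;

        Console.WriteLine($"Interest owed: {interest:F2}");
    }
}
```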

There are ways to simplify all this - for instance, you could in theory use a single language across an application, though that has serious downsides too.

Well, it would also help if the only language to be used in this manner wasn't fucking JavaScript, and it would also help if there were market space for any competitor to Microsoft, since they're, to my knowledge, the only ones who actually seem to try to unify stuff (and when they don't, they just buy the companies that will sell and do hostile takeovers on the ones that won't; press F for Borland). Of course, that costs money, and there's always that Embrace-Extend-Extinguish thing going on... which will almost certainly haunt that company for the rest of its days.

Meanwhile, the software development community at large would rather just sit there and suffer with (comparatively) sub-par tooling; it's telling that people brag about their favorite development environment being a shitty 1970s text editor in a way unique among tradespeople (like an electrician choosing to do knob-and-tube wiring for a new install).

It's a weird trade to be in, for sure.

I used to think software dev tools were bad until I started interacting with what an average mechanical or electrical engineer has to deal with. We are simply too spoiled with free, open access to an incredible array of tools and like to bitch whenever something is slightly subpar.

Wellll, re: Microsoft, in theory you can build a full-stack app almost entirely in C# with Blazor and Entity Framework :)
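
For anyone curious, here is a minimal sketch of what the server side of that can look like, assuming the ASP.NET Core minimal API hosting model and EF Core's in-memory provider (the Microsoft.EntityFrameworkCore.InMemory package); the Todo entity and route names are invented for illustration.

```csharp
// Program.cs for a minimal ASP.NET Core + EF Core web project (assumes the
// Web SDK's implicit usings and the Microsoft.EntityFrameworkCore.InMemory package).
using Microsoft.EntityFrameworkCore;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddDbContext<AppDb>(o => o.UseInMemoryDatabase("todos"));
var app = builder.Build();

// CRUD endpoints written directly in C#; EF Core turns the LINQ into queries.
app.MapGet("/api/todos", async (AppDb db) => await db.Todos.ToListAsync());
app.MapPost("/api/todos", async (AppDb db, Todo todo) =>
{
    db.Todos.Add(todo);
    await db.SaveChangesAsync();
    return Results.Created($"/api/todos/{todo.Id}", todo);
});

app.Run();

// The single entity and the EF Core context for it.
public class Todo
{
    public int Id { get; set; }
    public string Title { get; set; } = "";
    public bool Done { get; set; }
}

public class AppDb : DbContext
{
    public AppDb(DbContextOptions<AppDb> options) : base(options) { }
    public DbSet<Todo> Todos => Set<Todo>();
}
```

A Blazor front end would then fetch from /api/todos with HttpClient.GetFromJsonAsync and render the list in a .razor component, so the only non-C# left in the stack is markup and whatever infrastructure config the host demands.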

Which I would know if I were a disgusting .NET-loving peasant.

Which of course I'm not.

Knowing the languages is the very easiest part, IMO. Getting to know which parts of your application interface with which other parts and with which external services, exactly how they do it, and how the entire CI pipeline works: that's what seems to get more complicated by the day.