Small-Scale Question Sunday for March 19, 2023

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.

Is there a good writeup on how AI might affect employment in software engineering in the next 3 years or so?

GPT-4 performance on programming competition problems drops precipitously on problems written after 2021.

In general, people who work outside of a field have a poor understanding of what people in that field actually do - which can manifest as an underestimation of how difficult it is to automate various tasks.

At the time of this writing, the ArtStation job board has 83 open listings for concept artists, even though that should be the niche that AI art generators excel at the most.

I believe you're over-generalising from your personal work situation, and if I still worked in management consulting I might have done the same.

A lot of the stuff that junior consultants do is highly vulnerable to GPT-style AI replacement, as is the Indian outsourcing work covering the same tasks. Based on what I've heard from friends in commercial banking, it seems much the same for junior analysts and the like there, and likewise for junior lawyers at big firms.

These are fairly small groups of people, and it's well known that a lot of what they do is practically slave labour on fairly uncomplicated tasks, taken straight out of university, with great future positions and the chance to work with (and learn from) high-level people dangled in front of them.

This isn't the case for most jobs in the industrial sector and in large governmental organisations, though, and vastly more people are employed there.

I'm not saying that these organisations are doing more complex work, only that the lower-level stuff is less vulnerable to near-term AI replacement, if for no other reason than that the actual work doesn't consist all that much of (that kind of) symbol manipulation, even if people like to pretend otherwise.

You just stated this without justification, though. I think it's totally plausible, maybe even the most likely scenario, but I'd like to see some kind of reasoning for it, preferably with numbers.