
Small-Scale Question Sunday for February 16, 2025

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


I see an opportunity to replace certain human labor at my workplace with humanoid robots. For background, I am an equipment engineer at a Fortune 100 manufacturing company, and I think my own job is at relatively low risk of being automated soon.

Pros:

  • Good for my career at the company
  • Improves my skillset in case I ever want to switch jobs. I predict humanoid robots will become much more common as more companies adopt them and their skills widen and improve.

Cons:

  • I would be putting people out of a job. The robots would likely substitute for workers rather than complement them. The company recently laid off hundreds of employees and soon thereafter announced it would use a robot dog for some of their tasks. Is this just a form of natural selection?
  • It could reflect badly on me if the robot isn’t as good as promised. There are ways to temper expectations, and I plan to use them during my pitch to management.

What are The Motte’s opinions on this with regard to:

  • My career development
  • The moral implications of putting low-skilled people out of work
  • Anything else

Are you running a business, or a charity?

My perspective is that the "career development" angle is mostly illusory. If automating part of your process results in a better product or cheaper manufacturing, perhaps you will get a bonus? Certainly you will get a resume point. Perhaps it will get you a promotion, or a raise? You don't seem to think it will result in you, too, being replaced by a machine, at least not immediately, so in terms of self-interest it seems like an obvious choice.

As for the moral implications of making low-skilled people unemployed, like... if you don't do it, eventually someone else will, except you will get none of the benefits while still suffering all the possible downsides. There may be public policy arguments about this that matter from a moral or legal perspective, but unless it is your job to make or enforce public policy, you don't really have a seat at that table.

In the medium-term future (two or three centuries at most), I think that we either get widespread universal basic income, or we get rampant Luddism. Authoritarian governments and relatively culturally homogeneous nations seem likely to weather that transition better than pluralistic democracies, as identitarian competition for resources and handouts ramps up toward infinity. You will contribute to this process no matter what you choose to do in your current role; the best you can do is what is best for yourself, as that is what you have the most control over and the greatest understanding of.

My conviction is that in the future, on a much shorter timescale than your medium term, countries that embrace automation more fully will utterly dominate and destroy those that don't. Will it be the authoritarian or the democratic ones? I can see it going either way. Democratic unions blocking even automatic parking gates at the docks, versus an autocrat announcing that a robot-staffed megafactory for making drones is being built, and that those who protest will be the first to experience its products. Or a democratically-minded government allowing an unlimited productivity explosion so long as the owners are forced to dole out a pittance of the gains as universal basic income, versus a paternalistic dictator protecting his people from unemployment.

In the medium term, I think that the concept of a government will lose its meaning. The division will be between those individuals who control a force capable of credibly threatening other individuals controlling a trillion drones, and those who don't.