Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?
This is your opportunity to ask questions. No question too simple or too silly.
Culture war topics are accepted, and proposals for a better intro post are appreciated.
Suppose we developed a specific form of genetic engineering that allowed us to freely modulate a person’s compulsions and interests (for the purposes of this example, say a compulsion for accurately completing taxes). In almost all ways, the resulting human is completely normal, except:
To you, would it be ethical to take some eggs/sperm, hire some surrogates, and create a small army of these beings to have them do all of our taxes? How does your position differ from creating an artificial general intelligence (supposing we developed the capability) to do the same?
Welllll, we haven't assumed the ability to arbitrarily modulate the AI's compulsions and interests.
Which is a big question these days.
More to the point, though, are we allowing the modulated person to request that their modulation be changed if it no longer suits them, if they feel they're suffering with the current setup?
Unless you're ALSO suggesting that these behavioral changes are SO ingrained that they won't gradually shift over time as they accumulate experiences and/or head trauma.
I think that's where the ethics of it start to kick in. Suppose your modulated human one day says, "I would rather not do taxes today. In fact, can we adjust my brain a little so I can get a feeling of optimistic joy from viewing a sunset? I read some books that made that sound really nice."
(Aren't we just talking about Replicants from Blade Runner, here?)
This would make for a more nuanced thought experiment (how high a rate of behavioral drift is tolerable, what is to be done with those who experience such drift), but for the purposes of my current question, I'm assuming the modulation is 100% effective and permanent.
I'm assuming they'd never desire an adjustment because the thought would never cross their minds.
My ignorance of sci-fi is obviously showing here, as two other posts pointed to similar concepts I didn't know about (Tleilaxu, Genejack). It seems Genejack is more or less what I'm thinking of. As for Replicants, I only saw Blade Runner once many years ago, but I don't recall any modulation of interests/desires, more just enhanced capabilities and a muted emotional response?
lol there are a lot of potential scifi analogues.
Like the Meeseeks from Rick and Morty.
But I'd reiterate my point. The ethical issues mostly arise when you assume that their mental conditioning is NOT 100% effective and that it might occur to them to do something different.
If you've got a creature in front of you that WANTS to do taxes, enjoys doing taxes, wants to want to do taxes, and doesn't ever think there's anything wrong with that... and isn't otherwise causing itself harm due to some secondary effect of the programming, I don't think you're obligated to do anything other than facilitate their ability to keep doing taxes as long as that is relevant.
But I do think that's where we start to lose the analogy to AI, since we know rather less about their individual internals than we do about humans'.