ThenElection

Of course not. Your obligation is to get a well paying job at an AI company, usher in the apocalypse, and convert the universe into computronium, which can run innumerable simulations of bee lives in lands of endless flowers and honey and free of suffering.
I think of it more as a (negative) reward signal in RL. When a human touches a hot stove, there's a sharp drop in dopamine (our reward signal). Neural circuits adjust their synapses to better predict future (negative) reward, and subsequently they avoid the actions that produce it. There's a bit of a sleight of hand here--do we actually know our experience of pain is equivalent to a negative reward signal?--but it's not too wild a hypothetical extrapolation.
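The mechanism above can be sketched in a few lines. This is just an illustration of the standard TD-style update, not anything specific to brains: a two-action bandit where "touch stove" yields negative reward, and the prediction error plays the role of the dopamine dip.

```python
import random

# Toy sketch: two actions, one of which ("touch_stove") is punished.
ACTIONS = ["touch_stove", "keep_hand_away"]
REWARD = {"touch_stove": -1.0, "keep_hand_away": 0.0}

q = {a: 0.0 for a in ACTIONS}   # learned value estimates ("synapses")
alpha = 0.5                     # learning rate

random.seed(0)
for _ in range(100):
    a = random.choice(ACTIONS)       # explore both actions
    td_error = REWARD[a] - q[a]      # prediction error: the "dopamine drop"
    q[a] += alpha * td_error         # adjust estimates toward observed reward

# After learning, the greedy policy avoids the stove.
best = max(q, key=q.get)
print(best)  # keep_hand_away
```

The point is only that "sharp negative signal, then avoidance" falls out of a very small amount of machinery; whether any of it corresponds to felt pain is exactly the sleight of hand in question.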
How do atoms fit in? Well, it's a stretch, but one way to approach it is to treat atoms as trying to maximize a reward of negative energy, on a hard-coded (unlearned) policy corresponding to the laws of physics. E.g. burning some methane helps them get to a lower energy state, maximizing their own reward. Or, to cause "physical" pain, you could put all the gas on one side of a box: nature abhors a vacuum.
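To make the analogy concrete (and it is only an analogy, with a made-up toy potential): a fixed, unlearned rule like gradient descent on an energy function is "maximizing reward = negative energy" without any learning at all.

```python
# Hedged sketch of the analogy: a particle following a hard-coded
# "policy" (roll downhill) that minimizes energy, i.e. maximizes -energy.
# Nothing is learned; the rule itself stands in for the laws of physics.

def energy(x):
    return (x - 3.0) ** 2    # toy potential well, minimum at x = 3

def grad(x):
    return 2.0 * (x - 3.0)

x = 0.0                      # start away from the minimum
for _ in range(200):
    x -= 0.1 * grad(x)       # the fixed policy: always descend

print(round(x, 3))           # settles at ~3.0, the low-energy state
```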
But oysters aren't fish either. Something like ostrotarian would probably be best, but that will invariably end up confusing the people you're trying to communicate your dietary desires to.
I kind of fall into a similar category: I'm a vegetarian who eats bivalves (because no central nervous system) and caviar (because yum). When going out to eat, I say vegetarian because it communicates all the information people need to make any accommodations they want to; giving my full dietary philosophy would be more about signaling and self-aggrandizement than anything useful to them. (And, in my head, I don't really identify as anything, dietary-wise.)
I don't think it's too hard to get around that objection: just divide suffering into useful suffering and pointless suffering, and then switch the objective to minimizing the pointless suffering. Suffering from touching a hot pan is useful; suffering by immolating someone on a pyre is pointless.
That's the sleight of hand I mentioned: because qualia are so mysterious, it's a leap to assume that RL algorithms that maximize reward correspond to any particular qualia.
On the other hand, suffering is conditioned on some physical substrate, and something like "what human brains do" seems a more plausible candidate for how qualia arise than anything else I've seen. People with dopamine issues (e.g. severe Parkinson's, drug withdrawal) often report anhedonia.
That heavy philosophical machinery is the trillion dollar question that is beyond me (or anyone else that I'm aware of).
this leads you to the suspicious conclusion that the thousands of simple RL models people train for e.g. homework are also experiencing immense suffering
Maybe they are? I don't believe this, but I don't see how we can simply dismiss it out of hand from an argument of sheer disbelief (which seems just as premature to me as saying it's a fact). Agnosticism seems to be the only approach here.
Got me to wondering: has there ever been a video game or movie where the villain (hero?) becomes convinced that the only way to end all suffering in the universe is to extinguish all consciousness and life? I feel like I've seen this trope a thousand times, but I can't put my finger on one that matches it perfectly. Maybe one of the FF games? Probably some anime somewhere.