ThenElection

I don't think it's too hard to get around that objection: just divide suffering into useful suffering and pointless suffering, and then switch the objective to minimizing the pointless suffering. Suffering from touching a hot pan is useful; suffering by immolating someone on a pyre is pointless.
But oysters aren't fish either. Something like ostrotarian would probably be best, but that will invariably end up confusing the people you're trying to communicate your dietary desires to.
I kind of fall into a similar category: I'm a vegetarian who eats bivalves (because no central nervous system) and caviar (because yum). When going out to eat, I say vegetarian because it communicates all the information people need to make any accommodations they want to; giving my full dietary philosophy would be more about signaling and self-aggrandizement than anything useful to them. (And, in my head, I don't really identify as anything, dietary-wise.)
I think of it more as a (negative) reward signal in RL. When a human touches a hot stove, there's a sharp drop in dopamine (our reward signal). Neural circuits adjust their synapses to better predict future (negative) reward, and subsequently they select actions that avoid it. There's a bit of a sleight of hand here--do we actually know our experience of pain is equivalent to a negative reward signal?--but it's not too wild a hypothetical extrapolation.
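To make the analogy concrete, here's a toy sketch of the reward-prediction idea (my own illustration, not anything specified above): a tabular learner that receives a negative reward for "touch_stove," updates its prediction toward that reward, and ends up avoiding the action. All names and numbers are made up for the example.

```python
import random

# Toy sketch: one state, two actions. "touch_stove" yields a negative
# reward (the pain signal); "do_nothing" yields zero. The TD-style update
# moves the value estimate toward the observed reward, and the greedy
# policy stops choosing the painful action.
actions = ["touch_stove", "do_nothing"]
reward = {"touch_stove": -1.0, "do_nothing": 0.0}
q = {a: 0.0 for a in actions}      # predicted reward per action
alpha, epsilon = 0.5, 0.1          # learning rate, exploration rate

for step in range(100):
    # epsilon-greedy action selection
    if random.random() < epsilon:
        a = random.choice(actions)
    else:
        a = max(actions, key=q.get)
    r = reward[a]
    # nudge the prediction toward the reward actually received
    q[a] += alpha * (r - q[a])

print(q)  # q["touch_stove"] ends up near -1.0, so the greedy policy avoids it
```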
How do atoms fit in? Well, it's a stretch, but one way to approach it is to treat atoms as trying to maximize a reward of negative energy, on a hard coded (unlearned) policy corresponding to the laws of physics. E.g. burning some methane helps them get to a lower energy state, maximizing their own reward. Or, to cause "physical" pain, you could put all the gas in a box on one side of the box: nature abhors a vacuum.
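If you squint, the "hard coded policy" framing is just energy minimization: reward is negative energy, and the fixed rule is to follow the force. A toy illustration of that reading (again my own, with an arbitrary potential):

```python
# Toy analogy: treat "reward" as negative energy and the "policy" as
# following the force, i.e. moving down the energy gradient. Nothing here
# is learned; the update rule is fixed, like a law of physics.
def energy(x):
    return (x - 3.0) ** 2      # arbitrary potential well with its minimum at x = 3

def force(x):
    return -2.0 * (x - 3.0)    # -dE/dx

x, step = 0.0, 0.1
for _ in range(50):
    x += step * force(x)       # hard-coded "policy": follow the force

print(x, -energy(x))           # x converges to 3; reward (= -energy) approaches 0, its maximum
```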
Of course not. Your obligation is to get a well paying job at an AI company, usher in the apocalypse, and convert the universe into computronium, which can run innumerable simulations of bee lives in lands of endless flowers and honey, free of suffering.
Got me to wondering: has there ever been a video game or movie where the villain (hero?) becomes convinced that the only way to end all suffering in the universe is to extinguish all consciousness and life? I feel like I've seen this trope a thousand times, but I can't put my finger on one that matches it perfectly. Maybe one of the FF games? Probably some anime somewhere.
Yes, the world at that point was a powder keg, and you can name at least a dozen incidents before the assassination that could have set it off. The assassination was far from the root cause, but it was the proximate event in a spiral.
The world is in a similar state today, and normalcy bias is what prevents us from seeing it. Seemingly minor events can trigger repercussions far beyond expectations if conditions are right.
The elites of the USA (who are often said to be captured by the left) are pro-Ukraine and pro-Israel, though. A substantial fringe of academics and student protestors doesn't change that.
The risk is that this escalates to a broader conflict. Not Iran vs whoever--Iran is a paper tiger, and all other factors being equal it's good that it's now further from getting nukes than it was (one hopes). But I'm worried this triggers a series of international incidents that leads to a Taiwan war. Although it seems far-fetched, it also seemed far-fetched that the assassination of an archduke could spiral into a worldwide conflagration.
Iran needs to respond somehow, for domestic political reasons if nothing else. And, one thing leads to another, and the Strait of Hormuz ends up mined, and China decides, well, the world is going to suck for a couple years and the US is otherwise occupied, might as well take advantage of the moment.
I think the take is usually "even if someone gives fully informed consent to have a violinist attached to their circulatory system, they have the right to remove him at any time, even if it causes his death and they agreed not to initially." There are people willing to bite the bullet on this.
As a bi guy, I've dated both men and women. And it is multiple orders of magnitude easier to get a date with a man than it is with a woman. Quantitatively, my inbound like/match rate online was literally 100x higher when matching with men: I'd get as many likes in a day from men as I'd get in almost a year from women.
Sure, a fair bit of that was just casual sex. But even if 75% were only looking for casual sex, dating men would still be an order of magnitude easier than dating women.
I suspect this mismatch arises because your "average man" encompasses a lot of things that make him substantially above average.
Why should you care? Well, it's your prerogative whether to or not. But two reasons:
- As young men drop out of the caring game, that makes the market (both economically and sexually) less competitive. There are more opportunities and niches to get utility from. Still fewer than in a hypothetically static situation, but people dropping out mitigates some of the increased difficulty.
- It's far better to strive and create than to passively survive. For society, sure, but also better for you as a person. There are forms of joy that aren't available to someone just existing.
But... there's no way that Aella would actually have trouble finding a partner who wants kids and is okay with her lifestyle. Not some captain of industry, but also not some random meth addict on the street. There are plenty of total simps in tech with a solid paycheck who'd be thrilled to go for her, and she knows that.
This is all a marketing gimmick. Come save the poor whore with a heart of gold and a mind of platinum!
I often use it as a lookup tool and study aid, which can involve long conversations. But maybe that falls under "as a tool."
The last time I had a bona fide conversation with an LLM was maybe three months ago. These actual conversations are always about its consciousness, or lack thereof--if there's a spark there, I want to approach the LLM as a real being, to at least allow for the potentiality of something there. Haven't seen it yet.
That's the sleight of hand I mentioned: because qualia are so mysterious, it's a leap to assume that RL algorithms that maximize reward correspond to any particular qualia.
On the other hand, suffering is conditioned on some physical substrate, and something like "what human brains do" seems a more plausible candidate for how qualia arise than anything else I've seen. People with dopamine issues (e.g. severe Parkinson's, drug withdrawal) often report anhedonia.
That heavy philosophical machinery is the trillion dollar question that is beyond me (or anyone else that I'm aware of).
Maybe they are? I don't believe this, but I don't see how we can simply dismiss it out of hand from an argument of sheer disbelief (which seems just as premature to me as saying it's a fact). Agnosticism seems to be the only approach here.