fozz

0 followers   follows 0 users
joined 2022 November 15 15:51:22 UTC

User ID: 1869


Asking people to choose to forgo unnecessary luxuries so that starving children can eat has nothing to do with being a sociopath.

A narcissistic sociopath might try to pretend this was an unreasonable request for whatever reasons a mind like that might manufacture.

Why do you personally think it's more important for a person to have unnecessary luxuries than for a starving child to eat?

Love it. Probably the best example, and it's good to examine these limit cases.

I agree you have a special moral relationship to your children, as you made the choice to give them consciousness.

I'm an antinatalist, so I don't think we should have children. Having children is creating the (near certain) potential for (significant) suffering without the consent of the sufferer.

Apart from your unique responsibility to them because you chose to give them consciousness, your children are not owed any more of your moral "duty" than any other child. That you feel you owe them more is just a result of biology and culture.

There is no moral difference between your child and some child on the other side of the globe. Both are having a conscious experience and are capable of suffering & happiness.

If you want to participate in morality, which is engaging in the process of having positive effects on the conscious experiences of others, then you'll recognize privileging family and friends over strangers, even enemies, is nothing more than an evolutionary survival strategy with a little fondness of familiarity on top. It's utterly Self-centered, amoral, and leads to many problems.

If you disagree, and you’d like to change my mind, you could chill with calling everyone selfish cowards blind to your obvious correctness.

I'm a selfish coward. That's the plain language for what I am based on what I'm doing.

I should give my resources, at least above subsistence, to save the lives of starving children. That is the correct moral action.

I am a coward for failing to do so. That is, I am too scared to proceed with what I have reasoned is morally right.

And my fear is based on my desire to preserve the Self. I am selfish. Obsessed with Self, to the point I will let children starve rather than deny my Self.

Using less plain and direct language is just a tactic people use to preserve their delusions.

It's lying.

I told you, if you mean “in the eyes of god” or taking the view from nowhere or some other abstraction, then sure, a person’s proximity to me does not affect their moral value.

This is completely at odds with this:

It does, however, change my moral duties toward them, for a host of social, moral, and practical reasons.

3 million children really, actually die of starvation each year. Real children. You can literally, truly, concretely, actually save a number of their lives. Say, 10 lives. Just by forgoing insanely lavish luxuries that we all treat as "middle class" in the West. You wouldn't even have to forfeit your life, just a bunch of your stuff.

Saying "it's not my moral duty" makes no sense. No one is going to assign this duty to you. Reason makes your "duty" self-evident.

If you want to participate in morality, you need to acknowledge "a person’s proximity to me does not affect their moral value" as you did above, and then engage in the process of treating them as if they have the same moral value as you. By keeping them from suffering, aiding their happiness, etc. It's your "duty."

If you acknowledge "a person’s proximity to me does not affect their moral value," and yet do not make the appropriate changes to act on what you know, then you are a hypocrite, and a selfish coward.

I am a hypocrite, and a selfish coward too.

Acknowledging this is useful if it leads to action.

I understand the is/ought distinction, thank you.

Then why do you keep saying things that prove otherwise?

I am pointing out that the morality you describe is so foreign to most humans that calling it “obvious” is presumptuous at best.

It's obvious in the way I said it was:

Me: It's only because conscious experience exists that morality exists; it's only by rationally thinking through the implication of this that you can participate in morality. The moral way of assessing value is by measuring the capacity to suffer, or the other end, experience happiness/flourishing. And it's when you do that you realize there is no (unselfish) basis to place a higher value on anyone. You'll see that it's only your selfishness that blinds you to this simple truth.

If you can't see it's obvious that all people are of the same value, it's just selfishness. Like, you've been deluded into thinking your Self is privileged, and other people are privileged because of their proximity to your Self. This is obviously immoral, enforced by evolution & culture.

Again, if you had to choose to either save a starving child's life, or have a high thread count duvet cover and heated seats, and you choose the luxuries instead of the child's life and welfare, then you are a selfish coward. And that is what you (and I) are doing. Not theoretically. We're actually choosing to do it in the real, concrete world.

I’m sure you can find examples of principled equal-opportunity altruists, though I suspect many historical examples were motivated by religious and ascetic principles somewhat misaligned with yours. I said statistically zero.

I won't quibble about numbers here. My point was only that it's not at all impossible. But it's certainly not popular.

Again, is/ought. Here: https://en.wikipedia.org/wiki/Is%E2%80%93ought_problem

"If by “equal moral value,” you mean some universalist abstraction like “in the eyes of God” or somesuch, then sure."

Roughly, yes. "God" being, similar to what you said, some kind of abstraction of a universal ethic.

Well, then I’d say it’s quite a daring assertion to call something “obviously morally true” when statistically zero people who have ever lived have believed it or acted upon it...and so on...

This is all false.

There are people who've given their lives for strangers. There are people who've donated organs to strangers. There are people who've sold all their possessions above subsistence and given the proceeds to strangers. People who give all their income above subsistence to strangers.

I deny that human morality is math at all. People are not indistinguishable, interchangeable widgets. The essence of humanity is sociability - our particular relationships and cooperation with each other. Your cold math at best ignores it, and at worst denigrates it as pernicious. That's a recipe for trouble.

Ha. You feel attacked. I get it. :)

You're placing a higher value on the lives of some people due to their proximity to you. This is because you are selfish, by nature. Reputation, reciprocation, kin selection, etc. These are all "is" considerations. (It's cool we all feel it.)

It's only because conscious experience exists that morality exists; it's only by rationally thinking through the implication of this that you can participate in morality. The moral way of assessing value is by measuring the capacity to suffer, or the other end, experience happiness/flourishing. And it's when you do that you realize there is no (unselfish) basis to place a higher value on anyone. You'll see that it's only your selfishness that blinds you to this simple truth.

A man you've never met in Kenya is of equal moral value to your father. This sentence flies in the face of everything we feel, but it's obviously morally true.

You seem to be confusing is/ought.

If you choose not to give your life to save 10 people, you are a selfish coward.

I don't mean that as harshly as it sounds, as we are all born selfish cowards, wired that way as a result of billions of years of evolution. And then it's reinforced by our culture. It's super hard not to be a selfish coward.

We don't like to think of ourselves as selfish cowards, so we imagine ourselves to be moral, even when the evidence is clear.

3 million children die of starvation each year. You can literally, truly, concretely, actually save a number of their lives. Say, 10 lives. Just by forgoing insanely lavish luxuries that we all treat as "middle class" in the West. You wouldn't even have to forfeit your life, just a bunch of your stuff.

Now, I don't believe people should be forced to sacrifice themselves or sell their shit. It's a personal decision they should arrive at after doing the rational/ethical math.

But the math is clear.

If you're curtailing other people's flourishing to avoid suffering, you're a sociopath.

What do you mean?

Interesting.

I'd put it above 99%. Unless Trump dies or Twitter disappears, it's a near certainty.

From where I sit, the media has been boycotting Trump since he lost the election. I rarely hear anything about him, other than about indictments and charges I don't think will ever come to fruition.

Now that he's running for POTUS in 2024, his return to Twitter will force the same non-stop media coverage shitstorm that happened in 2016-2020. He feeds on that, as a person and as a candidate, and I don't know how he turns it down.

I actually think Elon will be the big winner here. The traffic will be off the charts & I'm skeptical advertisers will stay away for long.

In case you spent the day under a rock, Trump has been reinstated on Twitter after winning a poll by Elon Musk. Not kidding.

I'd like to keep this open & not push the discussion in any direction, so I'll keep this short.

Predictions? What does this mean for Twitter? Elon? Trump? Social media? The country? The world?

Dear lord.

The concept of being "compelled" by a logical argument should not be this difficult to understand, so I have to assume you're misunderstanding on purpose.

Of course you are free to act in any way you choose. Believing X is the right moral thing to do doesn't guarantee a person will do X.

...Why do you think that an act being logical would make it moral? Or vice versa for that matter?

If you want to participate in "morality", which is inextricably meshed with the experience of conscious beings, then the logic is airtight. If you choose not to participate in morality, none of this will concern you.

Do you want to reduce suffering, or do you want to increase flourishing?

You can do both, though the reduction of gratuitous suffering is more urgent.

The objection is simple: regardless of whether it is logical or not, the sort of universalist utilitarianism espoused by Singer and others is fundamentally inimical to human flourishing due to its tendency to promote sociopathic and self-destructive behavior.

Yes, "self" destructive behavior is absolutely necessary. (Sociopathic behavior has nothing to do with it.)

Self-destructive behavior is a dramatic way of saying selflessness, or the lack of selfishness. And normalizing this is a way forward.

Addiction to self is a big part of the problem.

Yeah, he's better at rockets than NASA, but he'll fail at being a Reddit mod. Sure.

I'm not sure Trump saying something is very good evidence it's true.

It's not that meaningful to the people currently working at Twitter.

He was able to rid himself of 90% of them, so it's sort of immaterial.

Somewhat related...

What's interesting to me is all the flak he's getting about giving people the option to either (A) "get hardcore" and work a lot to make Twitter awesome or (B) quit and get severance.

We've gotten a bit nutty about "work-life" balance. Some people don't want that. They like to work a lot. It's not like Musk is enslaving people and forcing them to do manual labor for god's sake. They get to choose to work at a sweet ass campus doing shit they love for great pay.

I'm very certain Musk, literally one of the most recognizable people in the world, can find the people he needs to run a lean & mean ship at Twitter, and make it awesome. Because plenty of people would LOVE to work 60-80 hours a week on a free speech challenge like Twitter, when it is well-positioned to be The Center of the Internet (to the extent it isn't already).

Call me a cynic, but I'm familiar with enough people who do essentially nothing while getting paid (well) for it that I can empathize a lot with Musk here. In my career, I've seen departments with 20 people handling the workload of 2 or 3, and departments that were 90% automated years ago...but the fog of bureaucracy allowed 10 people to just draw a paycheck for standing around and watching a system.

Musk doesn't want dead weight, as no business owner does.

...it's a "mugging". Singer's core thesis is that the core determinant of an act's morality is whether it is done with the goal of increasing global net utility. He uses the drowning child example as a sort of high-energy gotcha. "Oh you don't want to increase global net utility? I bet you're the sort of monster who would just stand by and watch a child drown".

The logic is airtight. The objections are some variation of "that's not realistic!" which isn't really an objection. It's just plain true there is no moral difference between a child drowning in front of you and a child starving across the globe, and all the objections I've heard are weak.

Calling a correct argument a "mugging" because you are compelled by reason and intellectual honesty to accept it doesn't really change anything.

I don't see any contradictions between Singer & Benatar. Both are trying to limit suffering. Antinatalism makes good sense to me.

@fozz appeals to sustainability in iterative games (an argument that's vulnerable to simple rug pulls)

One of the most common objections to consequentialism is based on a hypothetical situation in which a surgeon has to do a delicate brain operation on a patient who happens to be the ideal organ donor for four other patients in the hospital, each of whom will die shortly unless they receive, respectively, a heart, a liver, and – for two of them – a kidney. The doctor is highly skilled, and is confident of her ability to carry out the brain surgery successfully. If she does, her patient will lead a more or less normal life.

We agree that the consequentialist must accept that, in these circumstances, the right thing for the surgeon to do would be to kill the one to save the four, but we do not agree that this means that consequentialism should be rejected. We think, on the contrary, that the appearance of unacceptability here comes from the fact that this is one of those rare cases in which the action is right only if perfect secrecy can be expected. Moreover, it is not an action that should be recommended to others.

It's more than sustainability, as I said, though I think sustainability is at the heart of the issue and will be fine for this discussion.

I'm advocating for a sort of hybrid of utilitarianism & deontology. Ultimately, the ends are what matter. But it turns out the best way of ensuring sustainably good ends involves honoring certain non-strictly-utilitarian principles in certain circumstances.

In your thought experiment, I'd say a correct moral decision would be for the patient to choose to die in order to save the four people who would benefit from his organs. The logic is related to that of non-directed organ donations, where the donor lives.

Deontologically, we'd need to normalize rational, voluntary personal sacrifice such that surgeons would never need to accidentally kill patients on purpose to maximize life-saving organ availability. People ought to recognize suffering (regardless of proximity) & feel a very natural obligation to help—as in the child drowning in a shallow pond.

Yes, I am saying people should be willing to sacrifice their own lives to save the lives of others. If by your death you can save 10 other people, then, while I'm sure we can imagine lots of creative exceptions (e.g. they were 10 Hitlers), it's generally, and obviously, the right thing to do.

EA recognizes this foundational principle of self-sacrifice.

Giving away all your wealth above subsistence is whack-a-doodle & flies in the face of all of economics. But people are doing it. They're also donating organs to strangers. This will grow to greater and greater levels of sacrifice. Because the logic is airtight. It feels super hard, but there is no escaping the logic.

The characteristic that makes this non-political is that the government (roughly in the same position as the surgeon) has no say in any of this. Your choice to sacrifice self for the sake of others is free and personal.

I'll bet he'll find people without much problem.

And I'll bet he can run a better Twitter with 10% of the staff.

Pretty ironic to hear someone arguing a truly free speech platform—which Musk explicitly says is his most important goal with Twitter—is not that meaningful...on a website that had to be created because of fears of free speech limitations on the social media website from whence it escaped.

Elon's actions will lead directly to profits in a way that is easy to understand.

Twitter was bleeding money, losing $1.1 billion in 2020 alone.

They had a $13M meals program that was feeding less than 10% of the staff because no one showed up to HQ. It was costing hundreds of dollars per meal, with more people preparing food than eating it. It's laughably stupid.

My sense is Twitter was hyper-bloated, with ~10x more employees than they needed, so 90% layoffs seem about right. It's a microblogging website that grew to have a bunch of completely superfluous positions with people who literally contributed nothing.

Right-sizing the staff, cutting needless expenses, adding a revenue stream with a re-imagined Twitter Blue, reducing trolls/bots—these are all common sense. The advertisers will come back, as the only metric that matters is user engagement, and it's at an all-time high & will continue to grow through 2024 with what will be the most "entertaining" election in U.S. history.

Elon will turn Twitter into the profitable Center of the Internet, and a certain tribe will be pretending the sky is falling the whole time.

#RIPTwitter & #Twitterisdown were trending during the highest-engagement period in Twitter's existence. It's the fakest news that's ever been.

Participating in morality, should you choose to do so, requires allowing yourself to be persuaded/compelled by rational moral arguments.

You could be compelled by Singer's argument & fail to fully align your actions with what you've been persuaded is right & true. (I'm in this camp.)

The conclusions of EA are hard to follow. They often require one to reorient their life significantly.

They require sacrifice for the welfare of others.

Singer, Caplan, and their apologists are exactly the sort of high IQ idiots I had in mind.

Petty insults aside, what is wrong with Singer's argument?

As I said, that (i.e. semantics) is at issue.

Robbing Peter to pay Paul (to buy bed-nets and donate to Democratic causes) is not only acceptable but morally laudable if it increases net utility.

Robbing isn't the right thing, because, among other problems, it's not sustainable. It sows clear distrust & it won't ultimately help you win in pursuit of your values. It works once or twice, but falls apart in an iterative game. (But yes, robbing from the rich in order to feed impoverished children is obviously ethically solid in a vacuum.)

Instead, people ought to be compelled to freely give to the most effective charities via rational arguments & through the example of others. It needs to grow organically into a cultural norm. And, again, robbing is an unsustainable norm.

See Singer's 'Child Drowning in a Shallow Pond' thought experiment for the key premise of EA.

EA contains the rational argument for why each person is morally responsible for every other person, regardless of geographic or relational/genetic proximity.

Semantics.

I'm persuaded by the ethical arguments. If, in practice, the ethical arguments are not honored, then the "social movement of EA" is uninteresting to me.

I don't think the FTX/SBF debacle relates to EA at all.

EA is a set of ethical arguments.

That some guy within the sphere of people associated with EA might be greedy/stupid was already in my model.

It doesn't have the slightest effect on the ethical arguments.