What is this place?
This website is a place for people who want to move past shady thinking and test their ideas in a
court of people who don't all share the same biases. Our goal is to
optimize for light, not heat; this is a group effort, and all commentators are asked to do their part.
The weekly Culture War threads host the most
controversial topics and are the most visible aspect of The Motte. However, many other topics are
appropriate here. We encourage people to post anything related to science, politics, or philosophy;
if in doubt, post!
Check out The Vault for an archive of old quality posts.
You are encouraged to crosspost these elsewhere.
Why are you called The Motte?
A motte is a stone keep on a raised earthwork common in early medieval fortifications. More pertinently,
it's an element in a rhetorical move called a "Motte-and-Bailey",
originally identified by
philosopher Nicholas Shackel. It describes the tendency in discourse for people to move from a controversial
but high value claim to a defensible but less exciting one upon any resistance to the former. He likens
this to the medieval fortification, where a desirable land (the bailey) is abandoned when in danger for
the more easily defended motte. In Shackel's words, "The Motte represents the defensible but undesired
propositions to which one retreats when hard pressed."
On The Motte, always attempt to remain inside your defensible territory, even if you are not being pressed.
New post guidelines
If you're posting something that isn't related to the culture war, we encourage you to post a thread for it.
A submission statement is highly appreciated, but isn't necessary for text posts or links to largely-text posts
such as blogs or news articles; if we're unsure of the value of your post, we might remove it until you add a
submission statement. A submission statement is required for non-text sources (videos, podcasts, images).
Culture war posts go in the culture war thread; all links must either include a submission statement or
significant commentary. Bare links without those will be removed.
If in doubt, please post it!
Rules
- Courtesy
- Content
- Engagement
- When disagreeing with someone, state your objections explicitly.
- Proactively provide evidence in proportion to how partisan and inflammatory your claim might be.
- Accept temporary bans as a time-out, and don't attempt to rejoin the conversation until it's lifted.
- Don't attempt to build consensus or enforce ideological conformity.
- Write like everyone is reading and you want them to be included in the discussion.
- The Wildcard Rule
- The Metarule
I don't think the FTX/SBF debacle relates to EA at all.
EA is a set of ethical arguments.
That some guy within the sphere of people associated with EA might be greedy/stupid was already in my model.
It doesn't have the slightest effect on the ethical arguments.
I think it does. The standard conservative critique of utilitarianism is that it externalizes responsibility and incentivizes sociopathic behavior. Robbing Peter to ~~pay Paul~~ buy bed-nets and donate to Democratic causes is not only acceptable but morally laudable if it increases net utility. This characterization may be dismissed by members of the EA community as an uncharitable strawman, but as Bankman-Fried has just demonstrated, this is not a strawman at all.
Robbing isn't the right thing, because, among other problems, it's not sustainable. It sows clear distrust and it won't ultimately help you win in pursuit of your values. It works once or twice, but falls apart in an iterated game. (But yes, robbing from the rich in order to feed impoverished children is obviously ethically solid in a vacuum.)
Instead, people ought to be compelled to freely give to the most effective charities via rational arguments and through the example of others. It needs to grow organically into a cultural norm. And, again, robbing is an unsustainable norm.
See Singer's 'Child Drowning in a Shallow Pond' thought experiment for the key premise of EA.
EA contains the rational argument for why each person is morally responsible for every other person, regardless of geographic or relational/genetic proximity.
As another user put it, "real effective altruism has never been tried." And "compelled" is a telling choice of words.
Singer, Caplan, and their apologists are exactly the sort of high IQ idiots I had in mind.
Participating in morality, should you choose to do so, requires allowing yourself to be persuaded/compelled by rational moral arguments.
You could be compelled by Singer's argument & fail to fully align your actions with what you've been persuaded is right & true. (I'm in this camp.)
The conclusions of EA are hard to follow through on. They often require one to reorient one's life significantly.
They require sacrifice for the welfare of others.
Petty insults aside, what is wrong with Singer's argument?
No, being a utilitarian/rationalist requires allowing yourself to be persuaded by rational moral arguments.
As @FarNearEverywhere put it in another thread, it's a "mugging". Singer's core thesis is that the determinant of an act's morality is whether or not it is done with the goal of increasing global net utility. He uses the drowning child example as a sort of high-energy gotcha: "Oh, you don't want to increase global net utility? I bet you're the sort of monster who would just stand by and watch a child drown." Singer's critics, and critics of utilitarianism in general, point out that Singer's model has a lot of anti-social implications and second-order effects which undermine it; see the prior example of robbing Peter to pay Paul, and this bit from Existential Comics.
These criticisms are often dismissed as strawmen, and "things that would never happen" but in practice they keep happening.
The logic is airtight. The objections are some variation of "that's not realistic!" which isn't really an objection. It's just plain true there is no moral difference between a child drowning in front of you and a child starving across the globe, and all the objections I've heard are weak.
Calling a correct argument a "mugging" because you are compelled by reason and intellectual honesty to accept it doesn't really change anything.
I don't see any contradictions between Singer & Benatar. Both are trying to limit suffering. Antinatalism makes good sense to me.
and?
...Why do you think that an act being logical would make it moral? Or vice versa for that matter?
The objection is simple: regardless of whether it is logical or not, the sort of universalist utilitarianism espoused by Singer and others is fundamentally inimical to human flourishing due to its tendency to promote sociopathic and self-destructive behavior. Now, as an anti-natalist you might find that objection weak because you don't put a whole lot of stock in the value of human flourishing, but if that's the case I'm afraid I'm just going to have to cite irreconcilable differences. Do you want to reduce suffering, or do you want to increase flourishing?
The thing is that I am being intellectually honest, and that is exactly why I do not feel compelled to accept Singer's, Benatar's, or your arguments. I know what my goals and standards are, and the standard I try to hold myself to is "don't say anything you don't mean". Absolute sincerity even (perhaps especially) in the face of Armageddon (hat-tip to @DaseindustriesLtd up-thread).
If you want to participate in "morality", which is inextricably meshed with the experience of conscious beings, then the logic is airtight. If you choose not to participate in morality, none of this will concern you.
You can do both, though the reduction of gratuitous suffering is more urgent.
Yes, "self" destructive behavior is absolutely necessary. (Sociopathic behavior has nothing to do with it.)
Self-destructive behavior is a dramatic way of saying selflessness, or the lack of selfishness. And normalizing this is a way forward.
Addiction to self is a big part of the problem.