What is this place?
This website is a place for people who want to move past shady thinking and test their ideas in a
court of people who don't all share the same biases. Our goal is to
optimize for light, not heat; this is a group effort, and all commentators are asked to do their part.
The weekly Culture War threads host the most
controversial topics and are the most visible aspect of The Motte. However, many other topics are
appropriate here. We encourage people to post anything related to science, politics, or philosophy;
if in doubt, post!
Check out The Vault for an archive of old quality posts.
You are encouraged to crosspost these elsewhere.
Why are you called The Motte?
A motte is a stone keep on a raised earthwork common in early medieval fortifications. More pertinently,
it's an element in a rhetorical move called a "Motte-and-Bailey",
originally identified by
philosopher Nicholas Shackel. It describes the tendency in discourse for people to move from a controversial
but high-value claim to a defensible but less exciting one upon any resistance to the former. He likens
this to the medieval fortification, where the desirable land (the bailey) is abandoned when in danger for
the more easily defended motte. In Shackel's words, "The Motte represents the defensible but undesired
propositions to which one retreats when hard pressed."
On The Motte, always attempt to remain inside your defensible territory, even if you are not being pressed.
New post guidelines
If you're posting something that isn't related to the culture war, we encourage you to post a thread for it.
A submission statement is highly appreciated, but isn't necessary for text posts or links to largely-text posts
such as blogs or news articles; if we're unsure of the value of your post, we might remove it until you add a
submission statement. A submission statement is required for non-text sources (videos, podcasts, images).
Culture war posts go in the culture war thread; all links must either include a submission statement or
significant commentary. Bare links without those will be removed.
If in doubt, please post it!
Rules
- Courtesy
- Content
- Engagement
  - When disagreeing with someone, state your objections explicitly.
  - Proactively provide evidence in proportion to how partisan and inflammatory your claim might be.
  - Accept temporary bans as a time-out, and don't attempt to rejoin the conversation until it's lifted.
  - Don't attempt to build consensus or enforce ideological conformity.
  - Write like everyone is reading and you want them to be included in the discussion.
- The Wildcard Rule
- The Metarule
I don't think the FTX/SBF debacle relates to EA at all.
EA is a set of ethical arguments.
That some guy within the sphere of people associated with EA might be greedy/stupid was already in my model.
It doesn't have the slightest effect on the ethical arguments.
I think it does. The standard conservative critique of utilitarianism is that it externalizes responsibility and incentivizes sociopathic behavior. Robbing Peter to ~~pay Paul~~ buy bed-nets and donate to Democratic causes is not only acceptable but morally laudable if it increases net utility. This characterization may be dismissed by members of the EA community as an uncharitable strawman, but as Bankman-Fried has just demonstrated, this is not a strawman at all.

Robbing isn't the right thing, because, among other problems, it's not sustainable. It sows clear distrust & it won't ultimately help you win in pursuit of your value. It works once or twice, but falls apart in an iterative game. (But yes, robbing from the rich in order to feed impoverished children is obviously ethically solid in a vacuum.)
Instead, people ought to be compelled to freely give to the most effective charities via rational arguments & through the example of others. It needs to grow organically into a cultural norm. And, again, robbing is an unsustainable norm.
See Singer's 'Child Drowning in a Shallow Pond' thought experiment for the key premise of EA.
EA contains the rational argument for why each person is morally responsible for every other person, regardless of geographic or relational/genetic proximity.
As another user put it, "real effective altruism has never been tried." And "compelled" is a telling choice of words.
Singer, Caplan, and their apologists are exactly the sort of high IQ idiots I had in mind.
Caplan is in fact anti-Singerian on this issue, whatever his other faults or positions. Here, @VelveteenAmbush has linked to this piece where Caplan dissects Singerian logic:
The point is roughly as you say: utilitarians are only honest inasmuch as it is strategically expedient in a particular case. They cannot credibly precommit to a rule, for the simple reason that their prime directive is increasing utility; in a hypothetical scenario where they receive compelling evidence that breaking a rule would achieve that end with no cost-prohibitive second-order effects, they are ethically bound to break that rule. Of course, they can counter that deontologists are equally vulnerable to inconvenient hypotheticals (Kant's axe-wielding murderer), if not more so; but it is what it is. A utilitarian can, at most, say «I will not defect against the conventional morality to opportunistically increase utility, so long as I am not double dog sure it won't backfire». They couch it in different words: Yud says that only God gets to be a naive utilitarian (tbh based), Scott inserts his fretful caveats, Kelsey Piper has another set, @fozz appeals to sustainability in iterative games (an argument that's vulnerable to simple rug pulls) and Singer says, essentially, that the better breed of humans get to be our secret masters, our empathy-filled shepherds (or as Chesterton would've put it, omnipotent moral busybodies. I wonder what Lewis or Dostoyevsky would have made of it).
But that's what it amounts to: means justifying ends, with the ends being the greatest good for the greatest number, doing as much diligence as possible given the situation; and praying that you're not too high on your own supply, or on performance-enhancing drugs, to notice a critical hole in your too-clever-by-half plan, or a whistleblower.
P.S. I modestly propose an improvement on ethical dilemmas, to put a finer point on the real issue. (Probably this already exists in the literature.)
You're a surgeon, you have five patients, you can kill one for his organs to save four yadda yadda. The twist is: you have an intern who's a deontologist. Luckily, the intern has a well-suppressed but rapidly lethal condition, let's say diabetes or epilepsy or extreme allergy, that allows you to easily cause her death (e.g. by swapping the available insulin with saline, or releasing the pet Asian giant hornet you secretly keep for this exact scenario) and claim innocence.
A) Assuming the intern will discover and ruin your plan otherwise, do you kill her before the operation begins?
B) You may kill the Neurological Patient on your own, but you'll need the intern's help to finish all the transplantations in time. If you let her go home, you'll be able to obscure the paperwork, but two of your beneficiaries will surely die. If you solicit her help, she will inevitably check the medical records of all patients involved later, discover that the Neurological Patient had a trivial condition and, given your skill, was unlikely to die without your intent, figure it out, and report you, causing a scandal, sending you behind bars and preventing all possible future utilitarian transplantations, perhaps not just in this clinic but globally.
Again, will you DO IT or not?
To go further: if you kill the hypothetical intern before you harvest her organs, do you take pictures and sell them, presumably so that some (according to conventional morality) sick individuals can gratify themselves with them?
The funds thus acquired will allow you to purchase medical supplies to treat the local poor, maximising the utility.
Thus highlighting that utilitarianism is trivial to twist into supporting things that should not be supported. I would go so far as to say it is trivial to twist into supporting almost any action.