
I don't think the FTX/SBF debacle relates to EA at all.

EA is a set of ethical arguments.

That some guy within the sphere of people associated with EA might be greedy/stupid was already in my model.

It doesn't have the slightest effect on the ethical arguments.

EA is a social movement. Arguments in themselves are completely inert.

Not really. If we have some seemingly plausible argument that nevertheless keeps leading people to heinous actions, then a good heuristic is to reject that argument even if we can't conclusively prove it is always false. Humans are flawed, and personally evaluating the consequences and strength of every argument is hard work. I am not sure a sufficiently clever demagogue couldn't trick me into following something bad if I treated every argument as inert. However, if I put up guardrails in dangerous places - say, when adherents of one concept often turn out to be psychopathic conmen, or adherents of another frequently end up committing genocide when they rise to power - I may be restricting myself a bit, but at least I cut off a huge chunk of the possibility space where I could be convinced to follow something very bad.

Sure, this Bankman guy fucked up, but I think you'd be hard-pressed to find any ideology without its share of bad actors. It can be fair to dismiss an ideology at some point based on this heuristic, after enough consistent failures with few counterbalancing successes (communism comes to mind as an example of that category). But does EA fit that condition?

Does it consistently lead people to bad actions? More frequently than other systems?

That remains to be established. I hope the answer is "no", because we already have enough nice-looking ideological packages leading people into bad places; we don't need more. But my point is not that it's bad, my point is that it's not "inert" - we need to watch and evaluate whether it's bad or not. We can't just rely on the idea that "ideas are inert", because very often they aren't.