
Small-Scale Question Sunday for March 10, 2024

Do you have a dumb question that you're kind of embarrassed to ask in the main thread? Is there something you're just not sure about?

This is your opportunity to ask questions. No question too simple or too silly.

Culture war topics are accepted, and proposals for a better intro post are appreciated.


If you have a secular ethical framework that is not utilitarianism or something utilitarian-adjacent (e.g., consequentialism), what is it? I’m having a difficult time imagining a system that can’t be understood through some broadly-conceived utilitarian underpinning.

If the term "utilitarianism" can be extended to cover any ethical system that a reasonable person might adopt, then we run the risk of making the term vacuous.

Suppose we have a person who has to choose between two mutually exclusive options. He can spend his life becoming a great novelist, or he can spend his life working in tech and making a lot of money to spend on malaria nets for Africans. If he becomes a novelist, his work will be regarded by future generations of literature aficionados as one of the pivotal novels of the 21st century, although it will have limited impact outside of academic circles. If he instead spends his life buying malaria nets, he will save some non-trivial number of lives in the DRC (although of course the future impact of these individuals is impossible to calculate).

According to the brand of utilitarianism endorsed by Peter Singer and a number of Effective Altruists, it would be morally blameworthy of the person not to spend his life buying malaria nets and saving other people. I, on the other hand, think he is equally free from a moral perspective to choose either option, and in fact I'd be inclined to say that becoming a great novelist is the better option, because it would be a shame to waste a genuinely unique talent. How can utilitarianism accommodate my position?

You could say "well you're still basing your decision off of what you think maximizes The Good, and utilitarianism is just maximizing The Good, so it's still utilitarian". But the claim that we should pursue The Good is uncontroversial, perhaps even tautological. The purpose of a moral system is to describe, in explicit terms, what The Good is in the first place.

There is room for both. This scenario also presupposes accurate forecasting of outcomes. There is no way to know whether you'll be able to write the seminal novel of a generation, or even be a great programmer. Your position only exists in a fictitious past. If your choice is between working to save some people and doing nothing, then you should work and save some people. That is something we can actually predict the outcome of.