Friday Fun Thread for September 29, 2023

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), nor is it for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.

This is not fun. But I need to discuss it.

This is a bit naive, but I think one of the underrated aspects of living in our times is that we get to see poverty being eradicated in real time. And there is a real chance that it might be a thing of the past by the time I'm old.

I'm at a place now where I can start making a dent myself, as in donating to charities.

The weird part: I only feel the want for Asia and Latin America to improve, even though the biggest improvements are to be had in Africa. When I think of Africa, I just think that they are so far gone there is really nothing to be done about it, and in part they also deserve it. I've seen with my own eyes Asians of various flavors put their lives on the line just to make a living. Africans, not so much. The ones I've met have in general been much more prone to a life of free riding, crime, and whatnot than their poor Asian counterparts, who are working machines making massive sacrifices for a better standing.

The dilemma is that the highest "ROI" on my charity money is obviously to be had in Africa. But I don't want to help Africans.

Object-level: obviously, Effective Altruism's concept of effective charities is what you want here. The usual place to start is GiveWell, but all of their top charities work mostly in Africa. Individual EAs often donate to other specific causes, though, and there are many smaller projects and charities catalogued in various places: GWWC, the places Open Philanthropy gives grants to, or just browsing posts on the EA Forum. Some of them do most of their work in specific regions, some of which are in Asia, so you'd just look around there.

I think, even from a universalist hedonist utilitarian perspective, the longtermist idea that something like AI safety research or governance work is more important than malaria nets is very compelling: marginally preventing a few dozen unnecessary deaths versus playing a part in shaping the entire future. Or at least, it would be if AI safety research and governance work were net positive or doing anything important, which isn't obvious. But however you approach it, the issue of AI and future technology transforming everything does seem to eat every other ethical concern if you think enough about it. Holden Karnofsky, previously co-CEO of Open Philanthropy, recently stepped down to focus on AI and is now "Director of AI Strategy".

the issue of AI and future technology transforming everything does seem to eat every other ethical concern if you think enough about it

If someone thinks that some goal is so important that all tools are justified and ethical concerns are unimportant, then it will not go well.

I despise such hand-wringing over whether or not something as basic as cause prioritization is warranted. The question is whether it's true, and everything follows downstream of that.

Do you deny the general principle that some things can be considered more important than others? If not, then your issue is with the object-level arguments for why AI is the most pressing issue of our time. Anyone who doesn't see the blistering speed of progress, and the obvious issues arising from us creating something smarter than us that we are not ~100% sure we can control, is, to put it bluntly, not making full use of even their own human intelligence. I don't trust their judgment of what a superhuman one would do.

Otherwise it's like saying "Oh no, won't someone think of the clogged toilets!" when your ship is about to hit an iceberg. Humans have been trading things off against each other for as long as we've existed, and I don't want to waste both of our time by giving a billion examples of this being true.

Cause prioritization is entirely fine; deciding that anything is justified to reach goal X is not.

Who exactly says "anything" is justified? That's a strawman if I've ever seen one.

Even Yudkowsky claims that dropping bombs on data centers is justified, not that we should blow up the entire planet in advance or return to the stone age.

Serious problems justify serious solutions, that's the whole point.

maybe I misunderstood

the issue of AI and future technology transforming everything does seem to eat every other ethical concern if you think enough about it

or extrapolated it too far

Fair enough, but I'd like to reframe your concerns with a hypothetical example:

Imagine we spot an asteroid on some deep-space scan that has a significant nonzero chance of hitting Earth within a decade and causing a mass extinction event. For anything but sub-1% odds, whatever intervention is necessary should take precedence over everything else.
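
To make the arithmetic behind that threshold explicit (a back-of-the-envelope sketch; the ~8 billion world population figure is my assumption, not part of the original claim): even at exactly 1% odds, the expected death toll is 0.01 × 8,000,000,000 = 80,000,000 people, which already dwarfs almost any other cause you could fund.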

As for AI, plenty of people think the odds are much, much worse, and the timescales shorter.

My position is that some basic and minimal rules should be upheld, for several reasons.

  1. Many ethical positions are actually coordination rules: a society with random murder, rape, and looting is simply less efficient than one that manages to avoid such destructive tendencies (and while you can claim that some external looting may be efficient, it has become less efficient over history, and for an asteroid impact we would want global coordination anyway).

  2. If scenario X gives unlimited power to the powerful, they will happily invent a fake scenario X or exaggerate a real one, so we should limit incentives to do that.

  3. There are many ethical positions that I would not want to abandon, even if someone credibly claims that abandoning them would have good consequences (I do not care how much convincing sophistry is applied to argue that slavery and rape should be legal; I am going to oppose it anyway, even if superintelligent aliens were to arrive and announce that it should be done).

  4. Scenario X may be based on a serious mistake and not actually apply.

For an asteroid impact: I would accept a 50% asteroid tax; I would not accept slavery or outlawing criticism of the government.

In general I would not accept "any intervention necessary", as it often results in counterproductive interventions or utterly unneeded evil. Though I have no big illusions about my potential influence, and I would probably be convinced to support stupid policies anyway: lockdowns initially seemed like a good idea to me (I have not yet examined whether it made sense to start them, or whether doing so was stupid/evil/based on pure panic).

Note that we have had several historical cases of scenarios (2) and (4) happening.
