The linked post seeks to outline why I feel uneasy about high existential risk estimates from AGI (e.g., 80% doom by 2070). When I try to verbalize this, I view considerations like
- selection effects at the level of which arguments are discovered and distributed,
- community epistemic problems, and
- increased uncertainty due to chains of reasoning with imperfect concepts
as real and important.
I'd be curious to get perspectives from the people of the Motte, e.g., telling me that I'm the crazy one & so on.
Regards,
Nuño.