
Five More Years | Slate Star Codex

slatestarcodex.com

On this day five years ago, Scott made a list of graded predictions for how the next five years would pan out. How did he do?

He correctly predicted that Democrats would win the presidency in 2020. He correctly predicted that the UK would leave the EU and that no other country would vote to leave. He seemed to be under the impression that Ted Cruz would rise up to take Trump's mantle, but to my mind the only person in the Republican Party with a meaningful chance of opposing Trump is DeSantis. I think a lot of the technological predictions were too optimistic (specifically the bits about space travel and self-driving vehicles), but I don't work in tech and am not really qualified to comment.

Near the end of the article, in a self-deprecating moment, he predicts with 80% confidence that "Whatever the most important trend of the next five years is, I totally miss it". To my mind, the most significant "trend" (or "event") of the last five years was Covid, and I think he actually did okay on this front: the second-last section of the article covers global existential risks:

Global existential risks will hopefully not be a big part of the 2018-2023 period. If they are, it will be because somebody did something incredibly stupid or awful with infectious diseases. Even a small scare with this will provoke a massive response, which will be implemented in a panic and with all the finesse of post-9/11 America determining airport security.

  1. Bioengineering project kills at least five people: 20%
  2. …at least five thousand people: 5%

Whether you think those two predictions came to pass naturally depends on where you sit on the lab-leak hypothesis.

Roe v. Wade substantially overturned: 1%

fits pretty nicely with

At least one prediction here is horrendously wrong at the “only a market for five computers” level: 95%

But if COVID was the result of gain-of-function research (which seems to me pretty likely), then yeah, maybe it's the bioengineering one.

Despite the 5% chance of 5,000 deaths, I would say that, if you squint, the passage "If they are, it will be because somebody did something incredibly stupid or awful with infectious diseases. Even a small scare with this will provoke a massive response, which will be implemented in a panic and with all the finesse of post-9/11 America determining airport security. Along with the obvious ramifications, there will be weird consequences for censorship and the media, with some outlets discussing other kinds of biorisks and the government wanting them to stop giving people ideas. The world in which this becomes an issue before 2023 is not a very good world for very many reasons." holds up pretty well.

Good reminder that these sorts of predictions are really hard, perhaps impossible. All in all, I would consider this to be a pretty reasonable effort at a nearly impossible thing.
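
For anyone who wants to grade the full list themselves rather than eyeball it, one common way to turn probability-tagged predictions into a single number is a Brier score. Here is a minimal Python sketch; the four entries are the predictions quoted above, and the True/False outcomes are placeholders for whatever resolution you personally assign (the bioengineering ones hinge on the lab-leak question), not an official grading, and this isn't Scott's own scoring method, which groups predictions into calibration buckets.

```python
# Minimal sketch: Brier score for probability-tagged predictions.
# Lower is better; guessing 50% on everything scores exactly 0.25.
# The outcomes below are illustrative placeholders, not a definitive grading.

predictions = [
    # (description, stated probability, outcome you assign)
    ("Bioengineering project kills at least five people", 0.20, True),
    ("...at least five thousand people",                  0.05, True),
    ("Roe v. Wade substantially overturned",              0.01, True),
    ("At least one prediction is horrendously wrong",     0.95, True),
]

def brier_score(preds):
    """Mean squared error between stated probabilities and 0/1 outcomes."""
    return sum((p - float(happened)) ** 2 for _, p, happened in preds) / len(preds)

print(f"Brier score: {brier_score(predictions):.3f}")
```

Flipping any of the True/False values and re-running shows how much a single confident miss (like a 1% call that resolves yes) moves the overall score.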