The Irony of “Longtermism”

See also: The Irony of “Progress Studies”

As defined by Will MacAskill on the EA Forum, longtermism is “the view that the most important determinant of the value of our actions today is how those actions affect the very long-run future.”

The irony is that this only holds true in the abstract. According to an Open Philanthropy estimate and surveys of AI experts, there’s a roughly 50% chance of transformative artificial intelligence emerging by around 2050. If that happens, basically nothing else we do in the meantime will matter, at least not in terms of total expected utility. Ensuring that the AI is safe, human-aligned, benevolent, etc., is of primary and nearly sole importance.
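To make the expected-utility claim a bit more concrete, here’s a toy decomposition (the notation and the simplification are my own illustration, not anything from the Open Philanthropy estimate):

$$
\mathbb{E}[U] = p\,\big(q\,U_{\text{aligned}} + (1-q)\,U_{\text{misaligned}}\big) + (1-p)\,U_{\text{no TAI}}
$$

where $p \approx 0.5$ is the chance of transformative AI by around 2050 and $q$ is the chance it turns out aligned. If $U_{\text{aligned}}$ is astronomically larger than every other term, then $\mathbb{E}[U] \approx p\,q\,U_{\text{aligned}}$, and the only lever worth pulling is $q$, i.e., near-term alignment work.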

If you take this idea seriously, you should be obsessed with the short term to the exclusion of all other timescales.

This has practical implications. For example: should you expend energy cultivating the next generation of scientists, or just focus on your own research output? If we seriously only have 30 years, the latter becomes much more compelling.

Similarly, if you’re serious about longtermism, the altruistic case for having children becomes much weaker. Particularly precocious offspring might be able to do productive longtermism-relevant work in their mid-twenties, but a child born today would reach that age only a few years before 2050, which is not enough time to recoup the cost of raising them. [1][2]

The average age of respondents to the 2019 EA Survey was 31; similarly, the SSC survey gives a median reader age of 30. So even if transformative AI is a bit more than 30 years out, it’s not as if the existing cohort of longtermists will die off first. Will MacAskill will be just 63 in 2050, Hilary Greaves 71, and Toby Ord 70. Nick Bostrom will be the oldest at 77, but he reportedly skips meals to drink a vegetable “elixir,” so I think he’ll be okay.

Quick step back: I’ve been a bit duplicitous here. The longtermist mindset is something like: “We care immensely about the long term, and that’s exactly why we focus so intensely on the short term.” But that’s precisely my point. This isn’t a deep contradiction, and it isn’t hypocrisy. It’s just ironic.

Score another point for nominative anti-determinism.


Footnotes
[1] If you’re really altruistic (and perhaps sociopathic), you could have kids you never see, though children treated that way are unlikely to follow in your footsteps, and may in fact be perversely likely to become some kind of scorned anti-longtermist supervillain.

[2] This also means Alexey Guzey’s offhand criticism doesn’t land with any particular force.


As usual, you could have skipped this entire post and just read a tweet instead.