Replying to Robert Wiblin on Young Rationalists

[Edit 2021/02/07] I got a couple of ages wrong in the original spreadsheet. Mean/median age at founding was previously listed as 26, but is actually 27. Mean and median age in 2020 were listed as 36 and 35, but are actually both 37. This very slightly weakens my arguments. The histograms were also off and have been corrected. Thanks to Gytis Daujotas for catching these.

In response to Where are all the Successful Rationalists, Robert Wiblin writes:
The EA community seems to have a lot of very successful people by normal social standards, pursuing earning to give, research, politics and more… Typically they aren’t yet at the top of their fields but that’s unsurprising as most are 25-35.

He’s basically right. On the SSC 2019 survey, the median reader was 30. [1]

So Wiblin’s right, but his comment raises a much more important question: why do rationalists think being young is incompatible with being at the top of your field?

Take, for example, Patrick Hsu. He’s 29, an assistant professor at Berkeley, and has his name on several seminal CRISPR papers. He reads and endorses Alexey Guzey, so he’s clearly into weird blogs on the internet. It’s not unreasonable to think that people like this should be part of the rationality community.

Or to take a more visible case, consider all the very young startup founders.

Brian Timar looked at the companies he’s interested in, and found that the average and median age at founding was 30. If you scan YC’s top 100 companies, many of the founders were under 35 when they started:

92% of founders were under 30 when they started their companies, mean and median age were both 27.

But okay, we might not expect to know about founders who are just getting started. You’re not successful until your company actually succeeds. So how old are those same founders now? Still pretty young.

38% are still under 35, and 85% are under 40. Mean and median are both 37.

Data for both charts here.

Granting Wiblin’s point about youth, I’ll ask again: where are all the successful rationalists?


It gets a lot worse when you consider that, ideologically, rationalists should be uniquely well positioned to start a company.

Linear Returns from Wealth
For a normal person, the expected financial value of a startup may be high, but the expected returns to personal happiness are very low. A billion dollars will probably only make you a little bit happier than a million, so it makes sense to be risk averse and keep your day job. But for a rationalist utilitarian, returns from wealth are perfectly linear! Every dollar you earn is another dollar you can give to prevent malaria. So when it comes to earning, rationalists ought to be risk-neutral, and skew more heavily than normal people towards starting companies.
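Here’s a minimal sketch of that argument. The numbers are made up purely for illustration (a guaranteed career outcome versus a 1% shot at a huge exit with slightly higher expected value), not taken from any survey:

```python
import math

# Illustrative numbers only: a "safe" career outcome versus a long-shot
# startup whose expected dollar value is slightly higher.
safe = 500_000                       # guaranteed outcome
risky = [(0.01, 50_000_000),         # 1% chance the startup works out
         (0.99, 50_000)]             # 99% chance it mostly doesn't

expected_dollars = sum(p * x for p, x in risky)

# Personal happiness is roughly logarithmic in wealth, hence risk aversion.
happiness_safe = math.log(safe)
happiness_risky = sum(p * math.log(x) for p, x in risky)

print(f"Expected dollars, risky: {expected_dollars:,.0f} (vs {safe:,} safe)")
print(f"Log utility, safe:  {happiness_safe:.2f}")
print(f"Log utility, risky: {happiness_risky:.2f}")
```

Under log utility the safe job wins by a wide margin, so a normal person should take it. A utilitarian donor values every marginal dollar equally, so only the expected dollar figure matters, and the risky option comes out ahead.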

Willingness to be Weird and Lame
Paul Graham says “One of the biggest things holding people back from doing great work is the fear of making something lame.” Meanwhile, rationalists totally disregard this fear, spending years working on things no one else cares about. Will MacAskill has been speaking for years on the importance of keeping EA weird, but even before that there was a decade of posts about nootropics, longevity and superintelligence, long before it was cool.

Another panel from back in 2009: “I think we can fairly say that we’re all, Peter maybe less so, not afraid to being weird”

Low Probability High Return Bets
Rationalists are also already into doing things with a tiny probability of huge impact, as described in OpenPhil’s Hits-based Giving. This is the entire justification for caring about things like AI Safety. You multiply out the probabilities, and preventing even a 1-in-a-thousand chance of extinction turns out to be a very effective use of time.
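To spell out that multiplication with purely illustrative numbers (neither figure is from OpenPhil):

```python
# Purely illustrative numbers, not OpenPhil's.
p_prevent_extinction = 1 / 1_000      # "even a 1-in-a-thousand chance"
lives_at_stake = 8_000_000_000        # roughly the current population,
                                      # ignoring all future generations

expected_lives_saved = p_prevent_extinction * lives_at_stake
print(f"{expected_lives_saved:,.0f} expected lives saved")  # 8,000,000
```

Even a very successful conventional career in, say, global health saves orders of magnitude fewer lives than that expected value, which is the hits-based argument in miniature.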

Slant Towards Tech
Many rationalists are in the UK, but today the epicenter skews more towards the Bay Area. As I wrote last time, 40% work in software engineering, so they should be relatively well poised to start companies.

So seriously, where are all the successful rationalists?


Many qualities are ascribed to startup founders. Visionary, optimist, contrarian, workaholic.

What you don’t hear is founders praised for their intellectual honesty.

That shouldn’t come as a surprise. It’s no secret that you need a kind of unreasonable self-confidence to pitch VCs. Less discussed but analogous is the process of recruiting early employees. It is not very compelling to offer: “Come work for me, if we’re incredibly lucky there’s a minuscule chance you’ll get .1% of a billion-dollar company, which comes out to $500 thousand after dilution, $250 thousand after tax, over 4 years, minus the strike price.”
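Spelled out, the arithmetic inside that pitch looks something like this (the halving for dilution and for tax are the rough figures implied above, not precise ones):

```python
# The arithmetic implied by the pitch above, probabilities aside.
company_value = 1_000_000_000
equity = 0.001              # 0.1% offered to an early employee
dilution = 0.5              # later rounds roughly halve the stake
tax = 0.5
vesting_years = 4

payout = company_value * equity * dilution * (1 - tax)
per_year = payout / vesting_years
print(f"${payout:,.0f} total, ${per_year:,.0f} per year, before strike price")
# $250,000 total, $62,500 per year -- and only in the lucky case where
# the company is actually worth a billion dollars.
```

Not nothing, but hardly a pitch that sells itself, which is the point.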

Alex Danco asks Are Founders Allowed to Lie? It’s a good piece and you should read it, but the fact that he even has to ask means it’s possible the answer is “yes”, which is just not something any normal industry says about its leaders.

The easiest explanation is that founders really are just consciously manipulative, but I worry we’re underestimating how hard that would be. It takes enormous energy to maintain a lie, and tremendous sociopathy to do so consciously. From what I can tell, a lot of these founders are actually disproportionately philanthropic. I can’t rule out that this is just a PR move or whatever, but this whole idea just feels somewhat extreme and conspiratorial.

So okay, maybe it’s unconscious? Maybe founders are uniquely out of touch with reality and genuinely believe that they’re very likely to beat the overwhelming odds against them?

Again, this doesn’t feel right. Sure, you have to be optimistic, but you can’t be straight up delusional and continue to function at a very high level. Kara Swisher and Elon have a great exchange about this, maybe my favorite moment in any interview ever:

[KS:] What about things that are just critical of you that you don’t like? Do you think you’re particularly sensitive?

[EM:] No. Of course not. Count how many negative articles there are and how many I respond to. One percent, maybe. But the common rebuttal of journalists is, “Oh. My article’s fine. He’s just thin-skinned.” No, your article is false and you don’t want to admit it.

Do you take criticism to heart correctly?

Yes.

Give me an example of something if you could.

How do you think rockets get to orbit?

That’s a fair point.

Not easily. Physics is very demanding. If you get it wrong, the rocket will blow up. Cars are very demanding. If you get it wrong, a car won’t work. Truth in engineering and science is extremely important.

Right. And therefore?

I have a strong interest in the truth.


If founders aren’t liars or delusional, what could explain their seemingly irrational optimism?

Rather than general dishonesty, my theory is that founders neglect one kind of reasoning very specifically. The same kind most rationalists are obsessed with: taking the outside view.

I’m using “outside view” as a kind of general term for meta-level thinking, consulting base rates, or using Bayesian epistemology. Basically, it means not trusting your first-order estimates too much, looking around to see whether or not those estimates are justified, and reasoning “from behind the veil”. As Inadequate Equilibria describes it:

Modest epistemology doesn’t need to reflect a skepticism about causal models as such. It can manifest instead as a wariness about putting weight down on one’s own causal models, as opposed to others’…

If we were fully rational (and fully honest), then we would always eventually reach consensus on questions of fact. To become more rational, then, shouldn’t we set aside our claims to special knowledge or insight and modestly profess that, really, we’re all in the same boat? [2]

Here’s a more concrete example: A rationalist has a good startup idea, so they set out to calculate expected value. YC’s acceptance rate is something like 1%, and even within YC companies, only 1% of them will ever be worth $1 billion. So your odds of actually having an exit of that magnitude are 10,000 to 1, and then you’re diluted down to 10% ownership and taxed at around 50%. Of course, there are exits under and above a billion, but back-of-the-napkin, you’re looking at an expected $5,000 for 10+ years of work so grueling that even successful founders describe it as “eating glass and staring at the abyss”. [3]
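For concreteness, here’s that back-of-the-napkin expected value, using exactly the numbers above:

```python
# Back-of-the-napkin expected value, using the figures from the paragraph above.
p_yc_accept = 0.01            # ~1% YC acceptance rate
p_unicorn = 0.01              # ~1% of YC companies ever worth $1B
exit_value = 1_000_000_000
ownership = 0.10              # stake left after dilution
tax = 0.50

expected_take = p_yc_accept * p_unicorn * exit_value * ownership * (1 - tax)
print(f"${expected_take:,.0f}")  # $5,000, before dividing by 10+ years of work
```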

This is so deeply ingrained in my head as the rational way to think that it took me a long time to realize that other people just fundamentally don’t approach problems this way. I would venture to guess that the normal train of thought is closer to: “Most startups fail, but that’s because their ideas are bad. Since my idea is very good, I’ll neglect the base rates. I am very special.”

Does that sound mean? It shouldn’t. There’s nothing wrong with thinking that you’re special. It’s not a moral belief, or a claim to entitlement. It’s just the understanding that you are not a median member of the general population, so base rates about “anyone who has ever applied to YC” don’t apply.

The rationalist sees the 1% acceptance rate and gets intimidated. Normal people see that applying to YC explicitly does not require a business plan, incorporation, existing revenue, or an introduction, and understand that any idiot with a couple hours can fill out a web form. Accordingly, they totally ignore the base rate.

That’s just the acceptance part of starting a company; much more important is actually coming up with an idea you believe in. Rationalists tend to accept the Efficient Markets Hypothesis. They look at an industry, think “what are the odds I know more than people who have done this for a decade?” and assume any seeming inefficiencies are just a Chesterton’s Fence.

That’s not what normal people do at all. Normal people look at an industry, they see a gross inefficiency staring them in the face, and they think “wow, that’s grossly inefficient!”

And then sometimes, they even set out to solve it.


[1] If the median rationalist is now 30, and Yudkowsky started writing in 2007, was his audience mostly teenagers?

[2] To be fair, Yudkowsky is specifically attempting to correct against modest epistemology, concluding with an exhortation to not take the outside view so much and instead “spend most of your time thinking about the object level”. To be clear, this is not his solution for all humans, nor his model of perfect rationality. It’s targeted specifically at the kinds of people who read this book and who he believes are a) disproportionately likely to overvalue the meta level and b) disproportionately likely to have good object level beliefs.

[3] I’m not exaggerating. In fact, this is a massive oversimplification of the estimates of startup success rationalists actually put together.