No Revival for the Industrial Research Lab

Bell Labs is dead. It remains dead. And we have killed it.

In fact, even the concept seems to be dead. The top result in Google for Industrial Research Lab is a retrospective history. Pluralize the query as Industrial Research Labs and the top result is a 1946 index of the many labs that used to exist. Try Corporate Research Lab and you’ll get the cheery article The Death Of Corporate Research Labs.

In that last piece, Rosenthal determines the cause of death:

Lack of anti-trust enforcement, pervasive short-termism, driven by Wall Street’s focus on quarterly results, and management’s focus on manipulating the stock price to maximize the value of their options

Looking backwards, this is a tragedy. How shall we comfort ourselves, the murderers of all murderers?

But looking forward, it’s an immense opportunity! So long as we remedy the underlying causes, the great industrial lab may rise again. We may already be on the right track. Antitrust pressure is rising, and privately held companies not subject to Wall Street’s demands are booming.

What about short-termism? Via Tyler Cowen, Warren (2014) finds “no clear evidence of flawed short-term oriented management practices”.

This is good news! The industrial research lab has fallen, but with the causes gone, it will rise again.

As Ben Southwood concludes in his piece for Works in Progress: “we will see the return of various large in-house labs.” [1]


I am not so optimistic. Regardless of the antitrust situation, industrial research labs will not return.

Southwood and Rosenthal’s findings are both derived from Ashish Arora, Sharon Belenzon, et al.'s The Changing Structure of American Innovation, worth reading in full. As its authors conclude:

It seems unlikely that corporate research will rediscover its glory days… For some time, quick wins from low-hanging fruit (such as optimizing auction or advertising formats) may cover up the problem, but the fundamental challenge of managing long-run research inside a for-profit corporation remains a formidable one… incumbent firms continue to rely on outside inventions to fuel their growth. In the longer run, therefore, university research will remain the principal source of new ideas for such inventions. [emphasis mine]

In other words: why bother innovating when you can let someone else do it for you? Firms will still engage in the development half of R&D, but this will take the form of translating existing findings into products, rather than breakthrough fundamental research. Arora & Belenzon again:

In summary, the new innovation ecosystem exhibits a deepening division of labor between universities that specialize in basic research, small start-ups converting promising new findings into inventions, and larger, more established firms specializing in product development and commercialization (Arora and Gambardella, 1994). Indeed, in a survey of over 6,000 manufacturing- and service-sector firms in the U.S., Arora et al. (2016) find that 49 percent of the innovating firms between 2007 and 2009 reported that their most important new product originated from an external source.

…As a result, federal research dollars for the university sector grew from an estimated level of $420 million (1982 dollars) in 1935-1936 to more than $2 billion (1982 dollars) in 1960 and $8.5 billion in 1985. Between 1960 and 1985, the share of university research of GNP grew almost twofold from 0.13 to 0.25… Corporate labs historically operated in an environment where university research and start-up inventions were scarce.

Arora & Belenzon cite a 4.25x increase from 1960 to 1985. What about the years since? Courtesy of the NSF:

Federal funding has continued to increase rapidly, up another 3.5x since 1985. For its part, institutional funding (meaning internal funding in the form of endowments, gifts and so on) has grown over 6x. Faced with this bounty, it’s no wonder firms lost their appetite for footing the bill themselves.


To some extent, this is great news! The industrial research lab may be dead, but that doesn’t mean innovation is over, it’s just coming from universities instead. More funding means more science can get done, and who cares if it happens in industry or academia?

Unfortunately, the research we see today is of a different nature. In a section titled “Inventions originating from large corporate labs are different”, Arora & Belenzon enumerate the kinds of innovations we’ve lost in the shift towards university labs:

  • Corporate labs work on general purpose technologies
  • Corporate labs solve practical problems
  • Corporate labs are multi-disciplinary and have more resources

Again, the paper is worth consulting for full details, but it suffices to say that different mechanisms for attracting, nurturing, and managing talent will result in different types of outputs. None of this is to say that the corporate labs were on balance superior to today’s university labs, merely that we are missing out on some innovations (and likely getting others in return).

More sensationally, it’s worth worrying that there’s been a profound exodus of top talent out of research and into industry. As Patrick Collison describes:

One thing that I think is underemphasized is that [corporate research labs] competed on the basis of compensation. They just paid more than other potential sources of employment. PARC’s strategy for aggregating the best computer scientists in the world was to pay them more than they would be earning in academia. And in the 70s, you couldn’t go to Google and earn millions of dollars a year, Silicon Valley hadn’t really left the launchpad.

To a significant extent, the same thing applies to Bell Labs. They were quite explicit that their strategy was to compensate really well and present more favorable employment than academia… It could be the case that because there are so many high return loci for super talent people to go and deploy their talents, you could never quite aggregate talent to the same extent.

…it’s hard to think of any major successes from these kinds of labs over the last 10 or 15 years. [lightly paraphrased for clarity]

As the lamentation goes, “The best minds of my generation are thinking about how to make people click ads.” It’s hard to prove that there has been a brain drain, but if so, it would indicate a strict loss in quality, as opposed to the more stylistic shifts Arora & Belenzon describe.


Bell Labs is dead, and it’s not coming back. Since corporate labs are uniquely suited to some kinds of valuable research, this loss is troubling.

Where does that leave us?

One option is to attempt a massive overhaul of the entire system. If corporate labs have dwindled in the face of the government-funded university system, we ought to redirect some portion of those tax dollars to industry instead and seek a better balance. To some extent, this already happens. DARPA awards funding to small businesses through its Small Business Innovation Research grants, as does the NIH.

Even here, corporate labs won’t have the same incentives they had in the glory days from 1940-1970. So long as the university system thrives, firms will pursue growth through innovations discovered externally.

In that light, I have a different proposal: Instead of reviving the corporate research lab out of nostalgia, we should consider the more specific goals these labs accomplished, and then target them directly. If we are eager for more multi-disciplinary research, or for research more closely tied to practical problems, we ought to build institutions to pursue those particular aims.

Even more broadly, we can think of institutions as just one of many possible mechanisms to allocate human capital. Nadia’s Helium Grants are also a mechanism for talent allocation, as is venture capital, and as is Substack.

The scientific world is not merely a collection of modules that produce research in hermetic isolation. Rather, it is better understood as an interconnected ecosystem. Increased profitability in software may cause cost disease elsewhere. An exciting new topic in one field could cause a genius exodus in another. Without this understanding of how it all interacts, attempts to recreate a single piece of the 1960s without the supporting context are doomed from the start.

We should seek to understand conditions as they exist today, think deeply about the particular aims we wish to satisfy, then design improved mechanisms within a contemporary environment.


Thanks to Ashish Arora and Nintil for their comments.


See also
Nintil – Fund People not Projects
Alexey – Reviving Patronage and Revolutionary Industrial Research
Nadia Eghbal – Seed stage philanthropy


Footnotes
[1] Ben Southwood read the same paper as Rosenthal and came to the opposite conclusion: the decline was the result of too much antitrust enforcement. The two only appear correlated because success driven by corporate research leads to antitrust. His full quote is “Perhaps antitrust bodies will be restrained, and we will see the return of various large in-house labs.”, but my summary of the conclusion as optimistic still stands.


Frequently Asked Questions

What about Google Quantum?
I’m horrifically unqualified to make this judgement, but this does stand out as an important achievement.

Jack Hidary of Google’s quantum computing initiatives once said: “We literally created a spreadsheet of the experts in this space. We only came up with 800 names globally.” So maybe this is the exception that proves the rule, and Google was able to succeed precisely because there is not a thriving university system for quantum computing.

It’s also worth understanding the achievement as a result of Google’s partnership with NASA, though I don’t know the details of each party’s contributions.

What about AI?
Again, I’m not really confident here, but as I understand it, breakthroughs in AI consist largely of scaling up existing techniques, or inventing new techniques to enable greater scale.

If that’s a fair summary, the apparent dominance of firms in AI research would seem to be a product of their outsized resources, namely compute and data.

Still, why invest in research instead of applying the findings from academia? My guess is that the field is just moving too quickly, and being even a year or two ahead makes a huge difference.

Does this present a promising blueprint for other fields? Maybe. There are other fields that could see dramatic progress powered largely by advances in compute. Or maybe other fields that are amenable to AI-driven progress sooner than we expect. Though even in these cases, I would not expect Google to take over unless the results translate easily into profits. Instead, we’ll likely continue to see Google partner with universities, while staying focused on their own core competence.

What about OpenAI?
OpenAI is a non-profit wrapped in a capped-profit LP, managed like a startup.

But okay, incorporation aside, why does it exist? It could be that AI Safety really is the primary concern, and OpenAI was founded by a small group of eccentric billionaires motivated by a contrarian research hypothesis. Or maybe that was once the ostensible excuse, and now it’s just a regular startup that bootstrapped talent agglomeration through hype.

I’m not sure, and I very much hope to read the history of OpenAI once someone (or something) writes it.

What about the development half of R&D?
Although the dedicated in-house lab is dead, corporate R&D spending is not.

From Nicholas Bloom in his Conversation with Tyler:

The share of R&D in the US and Europe… funded by the government has been declining over time. In fact, in the US, when you go back to the '60s, roughly two-thirds of it is funded by the government and one-third by private firms. Now it’s the reverse.

According to the NSF, it might be more like 75% private firms:

(I assume “private firms” just means non-governmental, as opposed to “firms not listed on public markets”.)

So yes, corporate R&D spending is very high, but remember that it’s a broad category.

When I think about Bell Labs, I think about Claude Shannon’s Information Theory, a leap in basic research that powered the information age, although it wasn’t invented for any narrow purpose.

In contrast, the D in R&D stands for “development”, and won’t yield this kind of fundamental breakthrough. Take a look at Google’s 10-K. The costs are broken down as:

  • Cost of revenues
  • Research and development
  • Sales and marketing
  • General and administrative
  • European Commission fines [for antitrust violations]

As described by Quartz, “Much of those costs [of revenues] were from the fees Google pays companies like Apple to be the default search engine on iPhones and other devices, which are called traffic acquisition costs.”

So the R&D line refers to Google’s actual published research and the cost of cutting-edge projects like their Quantum Computing initiative, but it’s also just the cost of hiring engineers to work on ads. So yes, corporate R&D spending is high by some measure, but a lot of it is development, and a lot of that development has nothing to do with what we think of as research.

Exactly how much is research and how much is development? I’m not sure, and in some cases, it’s not even clear where you would draw the line.

Progress Studies: A Discipline is a Set of Institutional Norms

In a world with Progress Studies, academic departments and degree programs would not necessarily have to be reorganized. That’s probably going to be costly and time-consuming. Instead, a new focus on progress would be more comparable to a school of thought that would prompt a decentralized shift in priorities among academics, philanthropists, and funding agencies. Over time, we’d like to see communities, journals, and conferences devoted to these questions.

Patrick Collison and Tyler Cowen, We Need a New Science of Progress

Contrary to the article’s title, what we have now is not a “Science of Progress”. It is at best a “Subculture of Progress”, but really, more like a subculture of demanding progress.

How do we bridge the gap?

When William James defined psychology as “the science of mental life”, he did not imagine the institution of academic psychology as it exists today. Psychology is much more than James’s broad charter. It is also an established standard of rigour (e.g. p < 0.05, placebo-controlled RCTs), a canonical body of established knowledge founded on that standard, and living practitioners tasked with upholding truth and banishing heresy.

The science of psychology relies also on living institutions. It is a set of journals, grant-making organizations, academic departments and conferences, all with their associated level of prestige.

And then there is folk knowledge: Taboos born from historical failures never to be questioned. Social threads that mediate relationships between practitioners. The particular cultures and subcultures that span those threads. Foundational assumptions held sacred. [1]

As Tyler once described it: “You need barely scratch the surface in our prevailing ideologies to find central questions almost completely unaddressed.”

Unless Progress Studies gains acceptance from the existing institutions, it must strive to build new ones in its name. Otherwise, it risks never ascending to become a genuine “Science of Progress”.

Be careful, however, not to cross into institutional role play. It would be too easy to replicate the trappings of “real sciences” without any of the associated benefits. In a quest for legitimacy, we must be careful to ask who we’re seeking it from, lest all power stem from the same corrupted source.

Cargo Cult Science, Serious Social Science

To avoid cargo cult science, you have to first understand the purpose of the thing you’re trying to replicate. It is not enough to build something that looks like an airplane from the outside if you haven’t understood the engine itself.

As Aaron Swartz writes in Serious Social Science:

The first thing that comes is the numbers. Real science papers are filled with tables and graphs and regressions on piles of data, so the social scientists decide to do all that.

…This is not to say that there is anything intrinsically wrong with using math or jargon or making grand claims. But to adopt these habits reflexively is to put the means before the ends. Scientists do not use math because it is complicated but because, for what they are doing, it is effective.

No one has yet attempted to artificially imbue Progress Studies with mathematical complexity, but there have been other attempts to be more like a real science. Remember Jasmine Wang’s attempt to compile a canon of knowledge in the early days of Progress Studies? In the year since, that canon has gone largely untouched by today’s practitioners. [2] Though it still serves as an interesting reference, very little of Wang’s canon is actually widely cited. [3]

I’m prone to my own prescriptive behavior. This whole series is an exercise in trying to explain what Progress Studies ought to be and how it should function. But I’ll admit, these sorts of top-down efforts are unlikely to have much impact.

Instead, the shortest path to becoming a legitimate science is to:

  1. Publish good research
  2. Make the case that it could not exist under an existing field
  3. Label it Progress Studies

This is deceptively simple. In reality, the quality of research can only be judged within a particular institutional framework. We know work is good in other fields because it’s influential, highly cited and revered by the associated scientific community. Despite our fondness for the term “independent researcher”, no such thing has ever been possible. [6]

Minimum Viable Norms

If we don’t need prestigious conferences or journals, what is required?

At a minimum, norms must:

  1. Enable substantive discourse
  2. Which in turn progresses the field
  3. Resulting in a coherent standard of quality
  4. Allowing us to publish good research and label it Progress Studies

Where are we currently in this process?

Without established standards of rigour: authors can go back and forth criticizing each other’s work without making any progress.

Without foundational assumptions: it is too easy to dismiss an entire body of research on grounds it is not even attempting to assert or contest.

Without technical jargon: complex concepts must be rehashed each time, or worse, deployed with different definitions to suit the context. [7]

We already see a bit of this happening. My exchange [8] with Noah Smith is admirable in some sense (at least we are replying to each other), but regrettable in another. I did not really engage with his arguments, but merely attempted to criticize the cultural moment he chose to partake in. In response, he dodged my meta-level objection to optimism as an inconsistent interpretation of the data, and instead chose to double down on his object-level claims.

In other cases, I’ve seen outsiders alienated by the entire concept of Progress Studies as relying on the naive assumption that “progress” is a good worth pursuing. Rather than being seen as challenging complacency, we’re accused of perpetuating the status quo. Though Tyler Cowen’s Stubborn Attachments attempts to serve as this foundational definition, it is a frequently misunderstood book, still lacking in proper exegesis.

Finally, our jargon is simply not well established, nor are its operationalizations. Just as Effective Altruism settled on the DALY, Progress Studies has attempted to coalesce around various measures of productivity and growth. Since Cowen dodged the question in Stubborn Attachments, writing instead of the nebulous Wealth Plus, we’ve turned to specific metrics like GDP and Total Factor Productivity. Unfortunately, both are poorly understood, and don’t proxy well for the kind of progress we actually care about. [9]

We may discover further along that more is required, but these three are a good place to start. In the coming months, it will be up to us to propose, experiment with, and coalesce around better norms.

As the Swartz piece concludes:

[It’s] unlikely that the existing disciplines can be reformed. Instead what is needed is a culture of serious social science built outside the existing systems of academia… there is certainly much more to do, including building structures to do the work in.

Science advances one funeral at a time, but scientific institutions merely decay. While tenured professors eventually die, the institutions do not. Without natural senescence, an immortal being can be arbitrarily dysfunctional. [10] [11]

As absurd as it sounds, it is easier to construct a new field from scratch than to reform the existing ones. Without the entrenched interests and sacred institutions, we might actually stand a chance.


Footnotes

[1] See also David Chapman.

[2] For that matter, it’s not clear to me that there are actually Progress Studies researchers. Mostly, it seems to be a side project for people whose real work is in building non-profits or working at think tanks. [4] [5]

[3] Perhaps you once skimmed Vannevar Bush’s Science: The Endless Frontier (or at least read Nintil’s article about it), but I don’t know anyone who claims to have actually understood Heidegger’s The Question Concerning Technology.

[4] Jason Crawford is at least full-time, though his grant from EA Funds describes it as “Telling the story of human progress to the world, and promote [sic] progress as a moral imperative” which sounds more like propaganda than research. That’s not a bad thing, we do need science educators! But first there must be a science.

[5] I occasionally get emails from people wowed by my blog’s prominence despite not having been around very long: “One does not just show up on the internet and write/think this well out of nowhere.” My answer is that very few other people are even trying! Scott Alexander is among the most popular authors, and has been working a demanding full-time job this entire time. As has Nintil, until he quit very recently. Leopold Aschenbrenner is not employed, but only because he’s a full time student.

[6] A more accurate title is perhaps “extra-institutional researcher” which I first heard here, but that’s a mouthful.

[7] The term “jargon” evokes esoteric slang. What I really mean is technical language, consistently defined and operationalized.

[8] Noah Smith wrote Techno-optimism for the 2020s, I responded with Isolated Demands for Rigour in New Optimism, which he has since replied to in a series of posts (1, 2). This gets messy after a while, but so long as everyone is linking back to the previous posts, it’s not too hard for a reader to follow along.

[9] Certainly, they do not proxy well for the progress I think we ought to care about.

[10] I believe some version of this is attributed to Peter Thiel, but can’t find the source.

[11] Harvard is the oldest institution of higher education in the US, and still ranked #1. Oxford is the “oldest university in the English-speaking world”, and depending on who you ask, shares that #1 ranking.


Appendix: Abolish Peer Review

Rather than attempt to identify a set of “minimum viable norms”, should we just assume all the trappings of academia are necessary? It’s not an ideal system, but that doesn’t mean you can pick and choose which parts you want and hope the whole thing still holds together.

That’s a good argument, and I agree we will have to do the work of explaining why some norms are not worth keeping.

In machine learning, arXiv has already eroded the importance of journals and conferences. It is still very important to get accepted to NeurIPS, but that very acceptance is contingent on having your pre-print widely cited.

In Progress Studies, many of the conventional trappings are being replaced as well, for better or for worse.

Instead of citations, we have retweets, and instead of journals, anyone can publish on their own blog. There is even a grant system! Though Emergent Ventures is central to Progress Studies, Jason Crawford is funded by a variety of sources, including “Open Philanthropy, the Long-Term Future Fund, and Jaan Tallinn”.

And instead of pre-publication peer-review, we have post-publication rebuttals.

Can post-hoc review even be called a legitimate standard of knowledge production?

Remember that while peer review has been around in some form since the 18th century, the term itself only took off around 1967.

As Scientific American writes:

Science and The Journal of the American Medical Association did not use outside reviewers until after 1940 (Spier, 2002). The Lancet did not implement peer-review until 1976 (Benos et al., 2006). After the war and into the fifties and sixties, the specialization of articles increased and so did the competition for journal space.

I wasn’t sure about this claim, but as a sanity check, Wikipedia confirms:

The present-day peer-review system evolved from this 18th-century process,[11] began to involve external reviewers in the mid-19th-century,[12] and did not become commonplace until the mid-20th-century.

So it is not quite accurate to say there was no peer-review system before 1970, but it is worth understanding that our modern system is a relatively recent invention.

And yet, so much of the legendary science we now hail as transformative pre-dates 1970. Nintil’s Peer Rejection in Science summarizes breakthrough discoveries once considered crankery. How many of these would never see the light of day in today’s system? Or from Alexey Guzey’s Peer Review is a Disaster:

Peer reviewers in your field are your competitors, who have not themselves solved the problem you claim to be able to solve. They have both personal and professional interest (especially so if funding is limited) in giving low scores to grant applications of competing teams and to recommend rejection of their journal submissions. Further, since they’re experts in the grant application topic, while rejecting your paper or grant application, they can lift your research ideas and then pursue them themselves. This happens more frequently than you would expect.

This is not a niche view held merely among outsiders. Richard Smith, former editor of the British Medical Journal, once published the widely cited article Peer review: a flawed process at the heart of science and journals, where he writes:

Famously, it is compared with democracy: a system full of problems but the least worst we have.

…You can steal ideas and present them as your own, or produce an unjustly harsh review to block or at least slow down the publication of the ideas of a competitor. These have all happened.

Of course, any method will have false negatives and false positives. I’m not claiming Progress Studies’s current process of post-hoc review is obviously better, merely that it is bad in a different way, and thus has the opportunity to produce knowledge that would not otherwise be possible.

We have to try something new, and while meta-science tries to come up with an improved mechanism, we might as well get started experimenting.

Don't Read the News

Following my recent criticism of a Stat article, you may be wondering, who can we trust?

I have written several harsh criticisms in the past, railing against Substack and Lambda School (2). Let me be clear: in none of these cases do I mean to imply that I prefer the alternative. I am merely attempting to correct simple factual errors and reduce the status of what I perceive to be over-hyped institutions in my particular corner of the internet.

So sure, Substack has its problems, but I am not telling you to run off and use Wordpress! [1] Lambda School’s CEO has lied, but that does not mean you should attend a competing bootcamp, or get a 4-year CS degree. [2]

Analogously, there was a bad Stat article, but I am certainly not recommending that you go off and read CNN or Huffpost or whatever. The only reason I don’t critique those other sources is because I already know they’re unreliable, and I assume you do as well. [3]

And yet, presumably, you would like to “stay informed”. So what’s the solution?

One option is to rigorously fact check everything you read, but that’s cumbersome and still bottoms out somewhere. I found errors in the Stat article, but then took reports from the CDC at face value. More importantly, you just don’t have the time.

Instead, I propose a much simpler solution: don’t read the news.

Could it be that simple? Surely there are serious repercussions for being so dangerously and completely uninformed?

Here are some of the headlines on the front page of the New York Times:

  • See the complete list of insults President Trump posted on Twitter from 2015 to 2021.
  • Bryan Cranston tells Kara Swisher why he won’t play Donald Trump.
  • Biden’s Stimulus Plan Will Bring Relief, but There’s One Flaw
  • Joe Did It. But How?
  • Democrats Are About to Control Congress. What Will They Do?
  • Man Lived Undetected at O’Hare Airport for 3 Months, Officials Say
  • Improve Your Life With These Tiny Chores

I compiled those on January 18th when I wrote a first draft of this post. On the 25th as I prepare to publish, it’s not much better:

  • Are We Ready for a Monday Without Trump?
  • I Want to Call the Capitol Rioters ‘Terrorists.’ Here’s Why We Shouldn’t.
  • Something Special Just Happened in Russia
  • Ninja, a Gaming Superstar, Has a Message for Parents
  • Rupert Murdoch, Accepting Award, Condemns ‘Awful Woke Orthodoxy’

Wow! How can you not click those? How did Joe do it? What’s the one flaw of his stimulus plan? What are these tiny chores I can use to improve my life? Why won’t Bryan Cranston play Trump?

This is not news. It’s clickbait, and it’s bad for you. I don’t mean to pick on the NYT. It’s among the best of the popular outlets, and it is still horrible.

Here’s Aaron Swartz writing in 2006:

None of these stories have relevance to my life. Reading them may be enjoyable, but it’s an enjoyable waste of time. They will have no impact on my actions one way or another.

…With the time people waste reading a newspaper every day, they could have read an entire book about most subjects covered and thereby learned about it with far more detail and far more impact than the daily doses they get dribbled out by the paper. But people, of course, wouldn’t read a book about most subjects covered in the paper, because most of them are simply irrelevant.

…I have not followed the news at least since I was 13 (with occasional lapses on particular topics). My life does not seem to be impoverished for it; indeed, I think it has been greatly enhanced.

You might think such a person would be civically disengaged to a slovenly degree, but that couldn’t be further from the truth! In his brief life, Aaron led a successful campaign against SOPA, helped create Creative Commons and attempted to create a proto-Sci-Hub. On a less political note, Aaron is credited with the co-creation of Reddit, RSS and Markdown.

It was not despite, but thanks to his news-aversion that Aaron was able to build projects with continued relevance a decade later. Rather than being caught up in the news of the day, he worked on things that actually matter in the long run.

And so convinced by his arguments and inspired by his life, I also don’t read the news. I stopped in 2013 when I first came across his writing, and have never looked back. Like Aaron, I find this has substantially improved my quality of life.

Frequently Asked Questions

I’m still not convinced; news has a lot of merit, and you haven’t come close to a full refutation of its supposed benefits.
For a longer treatment, see Rolf Dobelli’s Avoid News, an excellent and persuasive perspective. There’s more on the harm of news in Aaron’s full piece, as well as his earlier All News is Bad News.

How do you know anything about what’s going on?
I do read, just not the news. I have a long research agenda, and read according to the work I want to publish in the next few months. I do subscribe to a couple of regular sources, but only other blogs that publish infrequently. I also skim Marginal Revolution, which takes all of 3 minutes.

But mostly, my friends and family tell me about the news, because they are all reading it. If something truly important happens, I am fairly confident that I will find out.

Isn’t that unfair? Aren’t you just shifting the burden of labor onto your friends, and benefiting from their curation?
Yes, it is unfair. That’s why I have attempted to propose a better scheme: N friends will take turns reading the news and update the others if anything important happens, while each expending just 1/Nth of their current effort. To date, no one has accepted this proposal, or even considered it seriously.

But for the most part, the news simply isn’t important. The 2020 presidential election had no immediate impact on me, nor did the recent inauguration. Rather than anxiously waiting for live updates, I would rather see well-reasoned retrospectives days or even months after the event. I avoided all political news after the 2016 presidential election, but then read Edward Luce’s The Retreat of Western Liberalism. Similarly, I avoided nearly all Covid news once I had already committed to a fairly strict lockdown, then read Apollo’s Arrow.

You don’t know what you’re missing!
I do occasionally sample the news for this exact reason, and regularly find that I am missing approximately nothing of consequence.

What if I have to make a decision informed by current events?
Occasionally, a genuinely important event will surface.

Say you need to decide whether to flee the country to avoid Covid. Even then, reading the news is not what helps. You should identify the matter at hand, consider it carefully, and then make a decision. At that point, you may wish to consult news articles, but that is very different from reading them regularly or following a specific outlet. You are deciding what to view with a specific purpose in mind, rather than being passively fed content that simply makes you miserable and anxious.

What about your civic duty? It’s important to be an informed voter.
Although it’s a short article, writing The Epistemic Pain of Prop 22 took a week of full-time background research. I am fairly confident that I spent more time thinking seriously about my ballot than 99% of voters.

Again, this has nothing to do with reading the news. When an election comes around, I encourage you to become informed and make the best decisions you can! That may involve reading voter guides, thinking deeply about your values, and yes, maybe even consulting the news. But even here you are free to remain ignorant on every other day.

Note that even this level of engagement is only acceptable if you are a genuinely conflicted voter! If you were pretty sure every day of the last 4 years that you were going to vote against Trump, you have no excuse for trying to “stay informed”, as your decision had already been made.

I read the Swartz/Dobelli articles, and I now think even books and blogs cause the same harms as the actual news.
That’s fair. I’ll admit to sometimes being sidetracked by Marginal Revolution, and can relate to this quote from the Swartz piece:

Edward Tufte notes that when he used to read the New York Times in the morning, it scrambled his brain with so many different topics that he couldn’t get any real intellectual work done the rest of the day.

In the past, I have had to cut down my media diet further to avoid distractions. This choice was easy to execute because I do not receive automatic newsletters, so reading those outlets is an intentional choice every time.

I’ve taken the further action of blocking some sources on my main browser, such that I’m forced to open a different application, wait a few seconds, and then navigate to the site. This is a minor burden, but it’s enough to prevent me from getting locked into compulsive habits.

In considering your media diet, think not only about what value it brings you, but about the potential harm. I can skim today’s posts on MR in just a couple of minutes and see if anything catches my interest. There is rarely anything aggravating that will ruin my mood or “scramble my brain”.

What about listening to the news or watching it on TV?
Even worse. It is too easy to be stimulated by things that don’t matter, and too hard to skim or skip ahead.

Reading the news is enjoyable.
It might be stimulating in the moment, but that’s not the point. The point is that it’s detrimental to your overall quality of life, and the tradeoff isn’t worth it.

What about particular news stories with breaking updates?
I’ll admit to neurotically refreshing the NYT map every 30 seconds on election night just like the rest of you. Though I look back on this as a tremendous waste of mental energy, it really was fun to participate in the collective orgy of anxiety and madness.

But think of this as an occasional vice, the way you think about gambling or drinking. It is a fun thing to indulge in on occasion, but it is not a good way to live your life.

But seriously, what do you read?
I read Marginal Revolution, Alexey Guzey’s Twitter, Byrne Hobart’s Medium, Gwern’s newsletter, and a few blogs. I occasionally read Hacker News.

Occasionally, upon finding a great new source, I will binge read the best pieces. When I first found out about Everything Studies, I felt nearly enlightened. But after reading his archives, I feel that I’ve properly internalized the blog’s worldview. I still check it occasionally, but the marginal impact of each new post on my thinking is fairly low.

Sometimes blogs have blogrolls that list other blogs the author likes. These are also great sources of new writing that don’t require you to actually read the news.

For what it’s worth, I have enjoyed the blogs from Nintil, Andy Matuschak, Dormin, Devon Zuegel, Mark Lutter, Vitalik, Dan Luu, Ben Kuhn, Zvi, sam[ ]zdat, Sarah Constantin, The Scholar’s Stage, Aaron Swartz and Scott Alexander.

I would love to see a Best Of compilation for Matt Levine or Andrew Gelman, please let me know if these exist. Both seem like good sources, but the backlogs are simply too big.

What do you read in the morning? How do you start your day?
Because I’m unemployed, I wake up without an alarm and don’t consume caffeine. That means by the time I’m out of bed, I’m ready to do whatever I’ve planned for the day, and do not need to spend the first hour of my morning “waking up” or shaking off grogginess.

What about “dead time”? What do you do while you’re commuting or waiting for water to boil?
Since I’m unemployed and under fairly strict lockdown, I have very little dead time. When I do have dead time, I think and let my mind wander.

Why do so many people report having their best thoughts in the shower? Probably because it’s the only time we have without artificial stimulation. If you listen to podcasts in the shower, you’re cheating yourself. There’s nothing magical about water; every other piece of “dead time” could be equally valuable if we weren’t so intent on cramming it full of useless trivia.

This isn’t a question, but I’m still not totally convinced.
Seriously, go read the earlier articles:

Aaron Swartz: I Hate the News

Rolf Dobelli: Avoid News

Then go read Andy Matuschak’s Why Books Don’t Work and consider how many of his arguments apply even more strongly to the news.

Should I unsubscribe from Applied Divinity Studies?
I don’t send out emails for all my posts, only the ones I really take pride in. That ends up being about twice a week. If you feel that it’s a serious distraction, you should filter these emails, and read them only when you have time.

I’ve also made an effort to write on things that have lasting importance. Even when I address a recent event, as in Was Vaccine Production Actually Delayed?, it is intended not as an object-level claim, but as a meta-level warning against getting caught up in a broader trend without careful thought.

Having said that, I wouldn’t subscribe to my own blog, nor do I subscribe to many of the blogs I like. I read the backlogs, manually check the domain when it comes to mind, and read new posts when it’s convenient for me, without the stress of watching newsletters pile up in my inbox.

That might sound wasteful, but it’s far less wasteful than the alternative.


[1] Having said that, Ghost really does seem good if you want a paid newsletter with flat fees, your own domain, and customization beyond a theme color.

[2] I am also not telling you not to do those things.

[3] I have occasionally cited a mainstream news source at face value. In these cases I am careful to only use it for illustrative purposes such that the quality of the piece as a whole does not hinge on the reliability of a single source.