Thanks to the deliberate ephemerality of modern social media, I have lost track of who pointed me at this “Unbundling the University”… web essay? Manifesto? I’m not quite sure how to place it in genre terms. It’s a big long thing whose core idea is fairly obvious from the title, but also well captured by this bit from the “Executive Summary”:
21st century universities have become a massive “bundle” of societal roles and missions — from skills training to technology development to discovering the secrets of the universe. For the sake of many of these societal roles and arguably for the sake of the universities themselves, Universities need to be unbundled and in particular, we need to unbundle pre-commercial technology research from academia and universities.
Universities have been accumulating roles for hundreds of years but the process drastically accelerated during the 20th century. (For an extended version of this story, see section 2.) A non-exclusive list of these missions might include:
Moral instruction for young people
General skills training
Vocational training for undergraduates
Expert researcher training
A repository of human knowledge
A place for intellectual mavericks / a republic of scholars
Discovering the secrets of the universe
Inventing the technology that drives the economy
Improving and studying technology that already exists
Credentialing agency
Policy think tank
Hedge fund
Sports league
Dating site
Lobbying firm
Young adult day care
These different missions all come with money, status, and vested stakeholders. Money, status, and stakeholders in turn have created a self-perpetuating bureaucratic mess that many people are unhappy with for drastically different reasons. One point of agreement, regardless of which roles you care most about: Universities are no longer balancing these missions well.
This is in many ways a familiar complaint within academia, though the primary thrust of this particular piece is decidedly non-academic. The main concern of the author, Ben Reinhardt, is what he calls “pre-commercial technology research,” an admittedly somewhat nebulous category that falls in the stage between showing that some phenomenon can be exploited to do a thing that is useful in principle, and working out the fiddly details of doing it in a way that is useful in practice and can be scaled up to commercial use. This sort of work goes on at universities, but is a poor fit there in a variety of ways, and thus Reinhardt argues that it needs to be pulled out of academia and supported in some new kind of space.
(He would pretty clearly also be in favor of “unbundling” a bunch of the other functions from that long list, as I suspect would a lot of academics. It’d probably be interesting to map out where they overlap and where they diverge, but that’s a different topic…)
This is supported with a long list of features of the modern university that make it ill-suited to “pre-commercial technology research,” many of which are familiar complaints, chiefly that modern universities are excessively bureaucratic and that academic incentives favor novelty over practicality. I generally agree with a lot of this part. There’s also an extensive discussion of how the existing system evolved into such an awkward state, and I have a few more problems with this part of the argument. In broad terms, I think Reinhardt gives a bit too much weight to a particular kind of incentive structure in a way that skips past some important features of how science actually works.
And, most crucially, I think that there’s an unexamined assumption at the root of this that makes me question the assignment of blame involved in the whole thing.
On the historical side of things, I can’t really speak to the accuracy of the really early bits, but as it gets closer to the 20th century, I know a bit more about the process, and there are some key bits missing. Like a lot of Americans, his discussion gives a bit too much weight to WWII and skips too lightly past WWI. A not-insignificant part of the involvement of research scientists in military activities during WWII was a very conscious move by leaders of the field to try to head off conscription, prompted by the experience of losing promising scientists like Moseley at Gallipoli and Schwarzschild in the trenches of the Eastern Front. They pushed hard to promote their work as militarily useful and thus protect their people (and to some extent their resources) from being thrown into the infantry. According to a colleague in History, this was the primary thing Heisenberg was doing with the German atomic program— he asked for and got protection for his team, but never had anything close to the resources needed to mount a serious push for weapons development.
On the structure-of-research part of this, Reinhardt talks about publication and peer review as primarily a status thing, which to some extent it is, but I think that’s a secondary effect. The primary reason for the emphasis on publishing results is that science is, at its core, a distributed collaborative endeavor: progress isn’t made by individuals working alone, but by a community of people sharing knowledge in a way that allows everyone to build on individual discoveries. The explosion in scientific progress starting in the early modern period is in no small part due to the adoption of the norm of publishing results openly, which moves us from alchemy to modern science. Sure, the individual researchers gain status by being recognized for their contributions, but the community as a whole gains even more by having access to all the individual results.
(I have a book-length version of this argument that you should totally buy.)
Similarly, the institutionalization of scientific research, and the move from individual tinkerers to research laboratories tied to universities, is only secondarily about the status conferred by the affiliation. The primary reason science gets more institutional is that as time goes on, the problems defining the cutting edge get harder and harder. Reinhardt laments the way that technology development shifts from individual inventors tinkering on their own to university labs, but this largely just reflects the increased complexity of the problems that are available to study, and the resources required to get started.
State-of-the-art physics research in 1800 is the kind of thing that can be pursued by the idle rich in their free time, but by the early 1900s it’s become an operation that requires more than the pocket change of the landed gentry. By the 1950s, cutting-edge physics consumes not the full resources but at least the pocket change of nation-states. That necessarily brings with it a lot of institutionalization just to manage operations on that scale. Other sciences follow with some delay, but the general trajectory is the same: the resources needed to make meaningful advances in technology from new discoveries in science have steadily increased, and are now beyond the kind of thing any individual can command.
An obvious rejoinder here would be that “pre-commercial technology research” doesn’t require being at the cutting edge, as it’s just working out the fiddly details needed to apply things that were cutting-edge several steps ago. But that just increases the lag a little, it doesn’t change the general trend: science, and thus the development of technology based on science, has gotten dramatically more expensive because the low-hanging fruit has long since been picked.
And that kind of brings us around to what seems to me to be the crucial unexamined assumption, which is that “pre-commercial technology research” needs a special space. On the contrary, as it’s described, it seems like we know exactly where it actually belongs, which is under the auspices of industry.
Reinhardt makes a couple of references to Bell Labs and Xerox PARC and the great corporate labs of yore, mostly to lament that they no longer exist as an alternative to the academic prestige hierarchy. He seems to take for granted, though, that this is the correct state of affairs— that this “pre-commercial technology research” can’t be housed in a commercial enterprise, but must exist in some other kind of space.
But that’s only true if you insist that to be commercially funded the work must be profitable on a very short time scale, too short for the development of new technology. Which is every bit as much a historically contingent choice as any of the decisions made by academics that make doing this sort of development difficult at universities. You can perfectly well imagine a world in which major corporations fund “pre-commercial technology research” in-house, because we were in that world well into the twentieth century. It’s only in the last handful of decades that American businesses have decided to prioritize this quarter’s stock price above all else, and thus dumped all the pre-commercial stages of technology development— from basic research up through initial products— off their own balance sheets.
“Well, sure, but that ship has sailed,” you might say. “You can’t really expect corporations to re-orient their thinking in that way…” And I agree that it would be a heavy lift. But is it a heavier lift than asking for a fundamental overhaul of academic culture? Or the creation of a third kind of space for this stage of development?
And that, for me, is where the whole thing kind of falls apart. I don’t disagree that a lot of the things Reinhardt laments are problems, or that it would be nice to spin some of these functions off to a different category of institution. But at the same time, I don’t really buy that this is exclusively a problem of and for universities— if businesses are suffering for a lack of this kind of development, they’re perfectly capable of funding it, rather than expecting some combination of Directional State University and the NSF to pick up the tab.
(This is even before you factor in the absolute clusterfuck of the last couple of weeks in Washington, which calls into question the willingness of the current Powers That Be to have NSF/NIH/NASA pay for anything, no matter where the work is done…)
I absolutely agree that academic culture and institutions are a poor fit for “pre-commercial technology research,” but the fact that they’re where this happens is not solely due to academics. In fact, I suspect that a great many people in academia are doing this work only reluctantly. They’re stuck doing it there because the industrial sector has chosen not to pay for doing it in a more appropriate context, and given that, I’m not at all convinced that anybody’s going to be in a hurry to pony up the cash to do this kind of work in a hypothetical third space that doesn’t exist yet.
But, you know, I could well be wrong— Reinhardt is, after all, associated with an organization that’s trying to do this kind of thing, and they’ve clearly managed to attract at least enough funding for a spiffy website. I’m just skeptical that there’s any real appetite for doing this at scale, from any of the players in academia, government, or industry.
Been a while since I did a big-picture academic science thing, and there’s probably another coming soon. If you want to see whether that actually happens, here’s a button:
And if you want to take issue with any of this, the comments will be open:
One thing that’s missed here IMO is the giant tech companies, which over the past ~15+ years have in fact been plowing incredible amounts of money into nominally unprofitable research. ChatGPT and friends, for example, originally came out of the somewhat-legendary paper “Attention Is All You Need,” published by a group at Google. To say nothing of all the other stuff Google has funded (robot farming, internet from high-altitude balloons, various self-driving car projects)…
When we laud Bell Labs as an example of industrial research, I think people often elide the truly unusual incentives that created it:
- Corporate tax rates were high (~50%) and top individual tax rates higher (~90%), so there was no good way for company executives to use their excess profits to enrich themselves or shareholders. This made corporate research investment much more favorable - what else was the company going to do with the money?
- AT&T in particular was a highly regulated monopoly, so it had large guaranteed profits and (as above) no great way to spend them.
- As part of its monopoly deal, AT&T was also required to license its patents to anyone for free. This meant there was less reason for Bell Labs to be secretive, and its discoveries were much more able to get out into the world. Nobody was charging royalties on the transistor or laser, or fighting in court to keep Shannon's work on communications theory secret.
No company today operates under these incentives, so in that sense Bell Labs really was a kind of "third space". I think academia is the better seed ground for this kind of work going forward, as awkward as the fit is.