With a worldwide recession on, my impression is that the carnage in research has not been as severe as might be feared, at least in the United States. I know of two notable negative impacts:
- It’s quite difficult to get a job this year, as many companies and universities simply aren’t hiring. This is particularly tough on graduating students.
- Perhaps 10% of IBM research was fired.
In contrast, around the time of the dot com bust, AT&T Research and Lucent had one or several roughly 50%-sized rounds of firings, wiping out much of the remainder of Bell Labs and triggering a notable diaspora from the respected machine learning group there. As the recession progresses, we may easily see more firings as companies in particular reach a point where they can no longer support research.
There are a couple positives to the recession as well.
- Both the implosion of Wall Street (which had been siphoning off smart people) and the general difficulty of getting a job straight out of an undergraduate education suggest that the quality of admitted PhD students may increase. In half a decade, when they start graduating, we might see some new and very creative ideas.
- The latest stimulus bill includes substantial additional research funding. This is particularly welcome news for those at universities, because it will compensate for other cutbacks which may be necessary there as endowments or state funding fall. It’s also particularly good for young researchers at universities who just got a position or get one this year, as the derivative of research funding impacts them most.
There are two effects going on: Does a recession cause us to refocus on other possibly better ideas? Or does it cause us to focus on short term survival? The first effect helps research while the second effect does not. By far, most of the money invested by governments to fight the recession has gone towards survival, but a small fraction in the US is going towards other possibly better ideas, with a portion of that going towards research.
We could hope for a larger fraction of money heading towards new ideas, rather than rescuing old, but there is a basic issue: the apparatus for creation and use of new ideas in the US is simply too small—it may not be able to effectively use more funding. In order to justify further funding for research, we may need to be more creative than simply “give us more”.
However, being more creative here is easy, because we have a historical model of what works. Throughout much of the 1900s, Bell Labs created many inventions which are fundamental to modern society, including the transistor, C (and C++), Unix, the laser, information theory, etc. In my view the vital ingredients for success are:
- Access to cutting edge problems. Even extraordinarily intelligent researchers can simply end up working on the wrong problem. Without direct access to and knowledge of such problems, researchers can end up inventing their own, which occasionally works out well but more often does not.
- Free time. This is both obvious and yet a common failure mode. Researchers at universities have many more demands on their time, including teaching, fundraising, mentoring, and running a university. Similarly, researchers at corporations can be sucked into patching an existing system rather than thinking about the best way to really solve a problem.
- Concentration. Two researchers working together can often manage much more than one apart, as each can bring relevant expertise and viewpoints necessary to solve a problem. This remains true up to the point where communication becomes a substantial overhead, which in my experience is about 5, but which we might imagine technology helps improve.
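As a rough back-of-the-envelope illustration of that last point (treating coordination as purely pairwise communication, which is only a crude model I am assuming here, not a precise claim), the coordination overhead grows quadratically with group size, since every pair of collaborators needs its own channel:

% illustrative only: coordination modeled as pairwise communication channels
\[
\binom{n}{2} = \frac{n(n-1)}{2}, \qquad n=2 \Rightarrow 1, \quad n=5 \Rightarrow 10, \quad n=10 \Rightarrow 45 .
\]

By five people the coordination load is already an order of magnitude beyond that of a pair, which is roughly where the returns to adding more researchers seem to flatten out.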
Bell Labs managed to satisfy all three of these desiderata. Some research universities manage to achieve at least access and concentration to some extent, but hidden difficulties exist. For example, professors often don’t work with other professors, because they are both too busy with students and must make a case for tenure based on work which is unambiguously their own. I’m not extremely familiar with existing national labs, but I believe they often fail on the first point, access to cutting edge problems; at least, research at national labs has had relatively little impact on newer fields such as computer science.
So, my suggestion would be funding research in modes which satisfy all three desiderata. The natural and easy way to do this is by the government partially subsidizing basic research at those corporations which have decided to fund basic research. In computer science at least, this includes Microsoft, IBM, Yahoo, Google, and what’s left of Bell Labs at AT&T and Alcatel. While this is precisely the conclusion you might expect from someone doing research at one of these places, it’s also what you would expect of someone intensely interested in research who sought out the best environment for it. In economic terms, these companies have, for reasons of their own, decided to provide a public good. As long as we are interested, as a nation or as a civilization, in subsidizing this public good, it is desirable to do so as efficiently as possible.
Some people might think that basic research done at a university is inherently more desirable than the same in industry. I don’t see any reason for this. For example, it seems that patentable research is about as likely to be patented at a university as elsewhere, and hence equally restricted for public use over the duration of a patent. Other people might think that basic research only really happens at universities or national labs, but that simply doesn’t agree with history.
Given this, it’s odd that the rules for NSF funding, the premier source of funding for basic science in the US, generally require university participation on proposals. This restriction naturally makes it easier for researchers at universities to acquire grant money than for researchers elsewhere. I don’t understand why this restriction is desirable from the viewpoint of a government wanting to subsidize research effectively.
Regarding the last point (why shouldn’t NSF fund non-university places), I think the answer is simple: education. Every NSF grant that goes in has to discuss education, and my experience on panels is that while this isn’t the most important thing in a proposal, it’s not ignored. NSF (at least) views education, of undergrads to some degree but especially of graduate students, as a top priority. This is something that labs can’t really compete on. After all, most of my budget for grants is to pay for students, not for other things. Which seems like a fairly reasonable model.
I don’t agree with Hal. Through their internship programs, industrial research labs have done and are doing quite a bit for training new academic researchers. Just off the top of my head, Lillian Lee (Cornell), Andrew Ng (Stanford), Ben Taskar (Penn), David Yarowsky and Jason Eisner (JHU) are former Bell Labs/AT&T Labs interns. Back to John’s point, the federal government does subsidize industrial research significantly through R&D tax credits. My experience with government funding of industrial research in the European Union left me wary of the perverse incentives and marriages of convenience that such funding can encourage.
I could see a model where government funds research grants that students apply for in order to become “free labor” at companies that are willing to hire them under conditions that give the public a suitable return on investment, e.g., publication of their results and non-exclusive government rights to the IP.
But I wonder if the funding from such a program addresses the bottleneck of industrial research. From my own experience, the cost of management / training overhead is at least as high as that of compensating interns. Intern programs work best as recruiting vehicles, and I think the main reason we’re seeing less interest in interns is that there’s less urgency about recruiting.
Of course, there’s industrial research beyond internship programs. Smaller companies pursue government funding, e.g., through SBIR/STTR. For larger companies, there are, as Fernando cited, the R&D tax credits.
The opportunities are out there, if not quite as numerous as those of us in industry might like. But I think that’s part of the price of being in the private sector. Research grants are nice, but revenue from paying customers is even better. Should companies really be competing with universities for public funds?
hrm. having never been in an industrial research lab (except as an intern :P), i’m less knowledgeable about money issues than john and fernando… but i would find it hard to believe that lillian, andrew, ben, david and jason all went to bell labs without previously having taken at least one or two years of study at their phd locations, years which were probably funded by government money (either through RAships directly via research grants or TAships largely via returned overhead on research grants). i learned a lot when i was an intern (at microsoft though, not bell labs) and i strongly encourage all of my students to do one or two internships during their phd studies. it’s a great learning experience. but it’s not something you do day one — you have to already know a lot in order to get enough accomplished during 3 months. and i don’t think it compares to the 4-6 years of “normal” phd study (including classes, advising and research projects).
since sbir/sttrs were brought up, my experience (sample of one!) with these has been horrific. a fair number of faculty in my department were approached by a “local business” to apply for one of these. the problem was that the guy wasn’t actually a business. he was just a smart (he had a phd in pure math) entrepreneur who wanted to team up solely for the purpose of getting an sbir/sttr. it was a ridiculous showing of an attempt to game the system. perhaps this is an isolated incident, though. (i hope.)
Getting those cats out of AT&T/Lucent was probably the best thing that ever happened to machine learning. -C.
I’m not long into research and never was at an industry “lab” (but in industry as a developer), but to me it seems fairly obvious why governments historically prefer universities: a) The results are all public and b) they get disseminated while they are still hot (the abovementioned educational advantage).
That said, it seems to me that a) varies widely across disciplines and, further, appears to be declining as universities are ramping up their own IP agendas. If this development continues, it would effectively blur the distinction between industrial and university research and we will see how that affects funding.
Apologies for the link bait, but I had a rather long dissenting comment and ended up reposting it over at my blog. I’ll link to this comment thread.
Having been a professor at Carnegie Mellon, a research staffer at Lucent Bell Labs, and a researcher/developer at a small company partly funded by an NIH SBIR, I can contribute some perspective.
The pre-divestiture Bell Labs is an anomaly. In its heyday of invention, Bell Labs was effectively funded through research grants. As a monopoly, AT&T was heavily restricted by the government (see AT&T’s official history). In particular, the government fixed their profit margin over costs (which included research), and they weren’t allowed to profit from the research they produced without ringing anti-trust bells. Hence, they invented all sorts of things they gave away. By the decline under Lucent (when I was there), the marching orders (research vs. practice) weren’t so clear.
I found academia to be very demanding in terms of teaching, management of our grad program, advising students, and research. I felt I had way more time to devote to research at Bell Labs, and I found it easier to get other researchers’ time. What was harder was getting people to work for you or connecting in any way to Lucent’s business.
Despite NSF’s lip service to education, and occasional funding moves in that direction (e.g. Hopkins summer schools), I’ve mainly seen grants evaluated on research grounds. Ironically, that’s also been true of SBIRs — the review boards are mostly professors. Ditto DARPA, who I think of as doing prototype-driven research, not product development.
Ironically #2, when co-authoring grants for NSF and NIH with Columbia University faculty while at our small for-profit company, we were the ones lobbying for freely open-sourcing results, whereas the professors and research staff wanted to maintain intellectual property. I find this a horrible conflict of interest. We had conflicts at CMU from faculty running companies on the side.
As far as I know, NSF doesn’t require fundees to make their research available free. Certainly the research grants and SBIRs we’ve applied to don’t have this restriction. DARPA tries to extract government-use rights, but not public domain.
The other angle to get a pure research job is to get a research scientist position at a place like Carnegie Mellon. It’s like being a tenure-track faculty member in that you can apply for research funding and supervise students, but there are no teaching requirements or admin (or tenure).
So maybe what we need is large monopolistic companies funding research. Research labs seem to reach their zenith along with their company’s monopolies.
There are many interesting comments, to which I would like to make a consolidated reply.
I agree with Fernando that government funding of industrial research can be badly done. However, that is effectively how Bell Labs worked, as Bob points out, so there is at least one approach which was effective.
The Tax Credit approach to funding doesn’t seem very desirable to me, because it amounts to judging research via the tax code, and research is, in my view, one of the most intrinsically difficult things to judge. I can easily imagine this approach being abused, with large parts of the research it covers having little substantial academic or long term value.
The SBIR/STTR approach is perverse because it forces university involvement. It’s easy to imagine that Hal’s story is repeated many times.
Education is certainly important. To the extent that NSF’s core mission is education, reserving some amount of its funding for education seems entirely reasonable. However, research is undeniably also a core mission of NSF, and I see little reason to introduce an unnecessary conflation of these goals, as the current process does.
With respect to Suresh, I partly agree that much of the point of a stimulus is about spending money “now”, rather than worrying about the future. However, I’m happy that the stimulus funds basic research as well, and I believe it is appropriate to do so. Without the funding, basic research would necessarily be pinched more by the leaner times, and (frankly) I think that good basic research simply requires long time frames without too much struggling for survival. It’s just not the kind of thing that can be freely stopped and restarted. While you snigger about AT&T’s dedication to research, I think a simpler view is that when the government subsidy for research there (implemented via a regulated monopoly with an upper bound on profits) ended, research declined. It’s just cause and effect, delayed a bit by system inertia.
Ingo suggests that university research is more quickly disseminated. There is some truth to this: research can be immediately taught in a class. Nevertheless, the historic record is that much research of long term value has been done in an industrial setting, and it’s hard to imagine something slowing the dissemination of a new idea more than not having it.
I think my views are most closely aligned with Bob’s, except that I’m not resigned to monopoly based funding. Partly, that’s because I’m at Yahoo, which is not a monopoly, and I see research working effectively for the company and also expect substantial long term impact outside of Yahoo. While I’m not personally keen on grant writing, having a rule that proposals must come from or at least involve universities smacks a bit of protectionism. If people at companies, or even on their own, can effectively compete with research at universities, why shouldn’t they be allowed to, at least with respect to the portion of NSF funding dedicated towards basic research?
The complicated relationship between the Federal government and AT&T was in no way “government funding of industrial research.” Bell Labs research in the public interest was just one of the ways AT&T curried favor with the government to protect its monopoly. R&D tax credits do not judge research one way or the other, they just relate to the total R&D expenditures of the corporation. There may be questions about what expenses are allowable, but that does not impinge on the content of research. Which is a good thing, I believe.
If a government imposes a regulatory structure which makes the effective cost of research zero, it seems like government funding to me. I realize that the money never went through the hands of the government, but this still seems quite similar to a 100% research tax credit in effect.
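To sketch the arithmetic under a deliberately simplified cost-plus model (the margin m and the split of recognized costs into operations C and research R are illustrative symbols I am assuming here, not AT&T’s actual regulatory accounting):

% illustrative rate-of-return regulation: allowed profit is a fixed margin m over recognized costs
\[
\text{profit} = m\,(C + R) \qquad\Rightarrow\qquad \frac{\partial\,\text{profit}}{\partial R} = m > 0 ,
\]

so each research dollar is recovered through rates and even raises allowed profit by m dollars; the net cost of research to the firm is at most zero (indeed negative when m > 0).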
You seem to be arguing (by implication at least) that NSF, at least for the portion of NSF’s budget being spent on research, should not be judging the content of research in giving awards. That’s a pretty significant policy change.
No, research by Bell Labs was one of the ways AT&T paid the public for the privilege of its monopoly. It wasn’t a subsidy, it was a cost of doing business. Which was increasingly cut back after the monopoly was broken. I didn’t say that NSF should not judge the content of academic research, but I do not believe that the government is particularly good at judging the value of specific industrial research. When the government provides specific research funding to industry, this often distorts the company’s research effort in perverse ways, and the distortions can also spread in various ways to the government itself through regulatory capture.