
Furthering Human Knowledge


A recent graduate with whom I did considerable work wrote and asked me the following question:

What are your thoughts about how academia versus industry contribute to the expansion of the world’s knowledge? My thoughts from speaking to people from both sides are that it seems that in academia, knowledge is much more open to sharing, but progress is much slower (because of multiple things: the need to apply for grant money, the multiple roles that professors must juggle, the difficulty of the problems being tackled, the lack of readily accessible consumer data, etc.), and that in industry, there are probably tens or hundreds of companies trying to do the same thing (like spam detection, for example) who do not want to share their knowledge, but that the competition and the money incentive make development faster. Do you think that working in one versus the other is more effective or efficient if the furthering of knowledge is the ultimate goal? This is one of the big questions I’m thinking about as I’m trying to figure out where I best see myself in 10 years.

I’ve been thinking about this problem, and some variants of it, for a considerable period of time, so the response was fairly lengthy. On the off chance that others might be interested, I post it here:

And this is why I love working with students— they always ask the simple questions :-).

There may have been a time when the answer to this was actually simple. The idea was that academics worked on long-term, highly speculative research topics, generally funded by the government, and published papers for all the world to see. Then industry would pick up some of these ideas, figure out the commercial application, and take the product that resulted from applying that research to market. Academics worked on the long-term (5-10 year) problems, and industry worked on the short-term (6 months to 2 years). A simple and rational pipeline, with each group knowing what it was doing and how it interacted with the other. If you wanted to work on the high-risk, high-impact (but likely to fail) stuff, you became an academic. If you wanted to have immediate impact on the real world, with some better guarantee of success, you worked in industry.

This is a nice picture, but if it has ever actually been true in the computer/software sector, it was before my time. As long as I’ve been watching and taking part, real research has been done across a combination of academia, industrial research labs (Bell Labs, BBN, Xerox PARC, Digital’s SRC, Sun Labs, MSR), and product groups at various companies, both large and small. I remember a time when it seemed that everyone in the industry worked in Research and Development; the difference between being an academic and working for a company was whether you wrote the paper before or after releasing the product.

But there are some changes that have occurred over the past 10 or more years that have made the picture even more muddled than it was, and thus make the question harder to answer.

One of these changes is that the amount of research funding going to academics in the CS field has been going down, and the risk tolerance of the funding agencies has been going down with it. There was a time when NSF and DARPA would give out lots of money, and be willing to take chances on things (like the internet) that didn’t have any obvious payback and might not have worked at all (there were plenty of research projects that didn’t). As the amount of money has decreased, the willingness to take risks has decreased as well— while there are some projects that seem pretty risky, for the most part the perception is that for a project to get funded, you need to show that it will succeed. This leads to people getting startup funding to do a year or so of research, applying to funding agencies for that work (or a small follow-on to the work), and then using the funding to do something new that they can then ask for the next round of funding to support. Again, it isn’t always like this, but it is like this often enough that it can be problematic. By the way, even if the actual amount of money hasn’t gone down (I believe that it has), it certainly hasn’t gone up in proportion to the number of academics who are trying to get it. So that makes things strange, as well.

At the same time, the number of industrial research labs seems to be decreasing, along with the funding available for such labs. Big places are willing to do some really adventurous stuff (look at Google X, or at least the rumors from Google X), but the work is not done in the open, may not be shared, and when it is, it is often covered by patents. Which is natural; these companies want a return on their investment. But it does limit the spread of the innovation or knowledge. In a lot of cases, companies are now willing to take more of a chance on research than academics, because they realize that the payoff for being the first to figure something out is so huge. So some of the really speculative, long-range work is being done in industry, but you hardly hear about it (think of the self-driving car).

And then there is a third trend. What seems to be happening more and more is that innovation and real research are being outsourced to startup companies. If you have an innovative idea, you start a company and work on the idea. Then if the idea works out, you either take over the world or (more likely) get bought by a larger company. This is a really attractive model for those who fund innovation; they have essentially outsourced the problem to the VC community. The government doesn’t have to spend any money at all. The existing companies only have to buy when they know that the idea works. And the VCs are willing to put up the initial money, because the payback for the companies that get bought or go public is large enough to make the initial investment profitable. This certainly seems to be the way Oracle does research these days (they don’t even do much hiring; most of the people they add to the company come in through acquisitions). Ryan Adams recently had his little company bought by Yahoo, so sometimes the line between academic, startup, and established company can be even more blurred.

Of course, such outsourcing also means that the time horizon of startup research is dictated by the patience of the VC community, which seems to be limited to a couple of years (at best). And the research had better have a clear commercial application.

All of this has to do only with how the initial research gets funded. The real question centers on how you could most effectively add to human knowledge. Which is a lot harder than just getting funding, because once you get funding, you then need some way to get people to recognize what you have done.

Academics do this by writing papers and giving talks, which sometimes works. But there are a lot of papers that get written and never really get read, and a lot of talks that are heard and immediately forgotten. By the same token, there are lots of products that are really great pieces of technology that, for one reason or another, never get adopted. Before inventing Java, James Gosling invented NeWS, an elegant and fully functional window system. But NeWS went nowhere; instead the industry of the time adopted X-windows, which a lot of us thought was not technically as nice. Dick Gabriel and I have been arguing for a long time over why LISP lost out to C, or why Multics lost out to Unix, but whatever the reason was, it was not purely technical. I remember being told by Ivan Sutherland, who has done more great technology than just about anyone I know, that all you can do as a technologist is make sure that the work is good; adoption is a question outside of your control. A hard lesson, but true.

After all of this evasion, let me try to answer the real question, which is what you should do if you want to push forward the boundaries of knowledge. And the answer will depend on how you want to live your life, not on which is more likely to push those boundaries successfully. As an academic, you have much more freedom, deciding on your own research agenda and working with students whom you will get to advise and direct. In industry, you probably won’t have that sort of freedom for 5 to 10 years (and that’s if you enter with a Ph.D.); you will be part of someone else’s group for that time, working on their problems. But in industry you will not have the worries over funding (my funding cycle when at Sun Labs was a couple of weeks), and the people you will be working with will have a higher level of skill and discipline than is generally found in students. But the problems you will work on will be constrained, to some extent, by the market, and you may not be able to share all you know. The environment of a startup gives you some of the freedoms of an academic, but also brings the pressures of the market.

And, of course, a final consideration is just what is meant by “furthering human knowledge.” One way of doing this is to come up with something that no one has ever thought of before, and to get everyone to understand it. This might be a new theorem, a new system, or a better way of doing things. Java, for all its flaws, certainly contributed to human knowledge in some way; the IP protocols did the same. But these sorts of contributions are few and far between. When they happen, it is a combination of insight, perspiration, and luck; no one knows how it really happens, but when it does it is pretty amazing.

But the other way to further human knowledge is to train the next generation in how to further knowledge. This can also be done in all of the contexts spoken about above— I mentored a lot of people when I was in industry, start-ups teach in their own kind of way, and being a professor is (supposedly) explicitly about that. As my career has moved along, I’ve grown to appreciate this way of furthering knowledge more and more; it may not be as romantic or immediately satisfying, but you end up playing the long game, knowing that your contributions will outlast you and not be subject to quite so many whims. Which is the other reason that I love working with students— it is part of my attempted contribution to the expansion of human knowledge.


1 Comment

  1. Brian Filsoare

    December 18, 2015 @ 1:40 am


    This is a wonderful, thoughtful piece. I find myself quoting it and paraphrasing it whenever I get together with my academic or start-up friends. Thanks for sharing, it is great food for thought, and yet another reminder of what great work comes out of Harvard.
