
NBER WORKING PAPER SERIES

ARE IDEAS GETTING HARDER TO FIND?

Nicholas Bloom Charles I. Jones John Van Reenen Michael Webb

Working Paper 23782

NATIONAL BUREAU OF ECONOMIC RESEARCH 1050 Massachusetts Avenue Cambridge, MA 02138 September 2017

We are grateful to Daron Acemoglu, Philippe Aghion, Ufuk Akcigit, Michele Boldrin, Pete Klenow, Sam Kortum, Peter Kruse-Andersen, Rachel Ngai, Pietro Peretto, John Seater, Chris Tonetti, and seminar participants at the CEPR Macroeconomics and Growth Conference, George Mason, Harvard, the Minneapolis Fed, the NBER growth meeting, the NBER Macro across Time and Space conference, the Rimini Conference on Economics and Finance, and Stanford for helpful comments and to Antoine Dechezlepretre, Dietmar Harhoff, Wallace Huffman, Keith Fuglie, Unni Pillai, and Greg Traxler for extensive assistance with data. The Alfred Sloan Foundation and the European Research Council have provided financial support. An online data appendix with replication files can be obtained from . The views expressed herein are those of the authors and do not necessarily reflect the views of the National Bureau of Economic Research.

NBER working papers are circulated for discussion and comment purposes. They have not been peer-reviewed or been subject to the review by the NBER Board of Directors that accompanies official NBER publications.

© 2017 by Nicholas Bloom, Charles I. Jones, John Van Reenen, and Michael Webb. All rights reserved. Short sections of text, not to exceed two paragraphs, may be quoted without explicit permission provided that full credit, including © notice, is given to the source.

Are Ideas Getting Harder to Find? Nicholas Bloom, Charles I. Jones, John Van Reenen, and Michael Webb NBER Working Paper No. 23782 September 2017 JEL No. O3,O4

ABSTRACT

In many growth models, economic growth arises from people creating ideas, and the long-run growth rate is the product of two terms: the effective number of researchers and their research productivity. We present a wide range of evidence from various industries, products, and firms showing that research effort is rising substantially while research productivity is declining sharply. A good example is Moore's Law. The number of researchers required today to achieve the famous doubling every two years of the density of computer chips is more than 18 times larger than the number required in the early 1970s. Across a broad range of case studies at various levels of (dis)aggregation, we find that ideas -- and in particular the exponential growth they imply -- are getting harder and harder to find. Exponential growth results from the large increases in research effort that offset its declining productivity.

Nicholas Bloom Stanford University Department of Economics 579 Serra Mall Stanford, CA 94305-6072 and NBER nbloom@stanford.edu

John Van Reenen Department of Economics, E62-518 MIT 77 Massachusetts Avenue Cambridge, MA 02139 and NBER vanreene@mit.edu

Charles I. Jones Graduate School of Business Stanford University 655 Knight Way Stanford, CA 94305-4800 and NBER chad.jones@stanford.edu

Michael Webb Department of Economics Stanford University 579 Serra Mall Stanford, CA 94305-6072 mww@stanford.edu

1. Introduction

This paper applies the growth accounting of Solow (1957) to the production function for new ideas. The basic insight can be explained with a simple equation, highlighting a stylized view of economic growth that emerges from idea-based growth models:

$$\underbrace{\text{Economic growth}}_{\text{e.g. 2\% or 5\%}} \;=\; \underbrace{\text{Research productivity}}_{\text{(falling)}} \;\times\; \underbrace{\text{Number of researchers}}_{\text{(rising)}}$$

Economic growth arises from people creating ideas. As a matter of accounting, we can decompose the long-run growth rate into the product of two terms: the effective number of researchers and their research productivity. We present a wide range of empirical evidence showing that in many contexts and at various levels of disaggregation, research effort is rising substantially, while research productivity is declining sharply. Steady growth, when it occurs, results from the offsetting of these two trends.

Perhaps the best example of this finding comes from Moore's Law, one of the key drivers of economic growth in recent decades. This "law" refers to the empirical regularity that the number of transistors packed onto a computer chip doubles approximately every two years. Such doubling corresponds to a constant exponential growth rate of around 35% per year, a rate that has been remarkably steady for nearly half a century. As we show in detail below, this growth has been achieved by engaging an ever-growing number of researchers to push Moore's Law forward. In particular, the number of researchers required to double chip density today is more than 18 times larger than the number required in the early 1970s. At least as far as semiconductors are concerned, ideas are getting harder and harder to find. Research productivity in this case is declining sharply, at a rate that averages about 6.8% per year.
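As a rough cross-check of that figure, the 18-fold increase in required researchers maps directly into an average annual decline in research productivity once a time span is assumed. The sketch below is our own back-of-the-envelope arithmetic, not the paper's calculation; the early-1970s-to-mid-2010s window is an assumption rather than the paper's exact sample.

```python
import math

# Back-of-the-envelope check (not the paper's calculation): the same output
# growth -- a doubling of chip density every two years -- now takes roughly
# 18 times as many researchers, so research productivity (growth per
# researcher) has fallen by roughly a factor of 18 over the period.
factor_more_researchers = 18
years = 2014 - 1971          # assumed window, for illustration only

avg_annual_decline = math.log(factor_more_researchers) / years
print(f"Implied average decline in research productivity: {avg_annual_decline:.1%}/year")
# With this assumed window the answer is about 6.7% per year, in the same
# ballpark as the ~6.8% per year reported in the text.
```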

We document qualitatively similar results throughout the U.S. economy. We consider detailed microeconomic evidence on idea production functions, focusing on places where we can get the best measures of both the output of ideas and the inputs used to produce them. In addition to Moore's Law, our case studies include agricultural productivity (corn, soybeans, cotton, and wheat) and medical innovations. Research productivity for seed yields declines at about 5% per year. We find a similar rate of decline when studying the mortality improvements associated with cancer and heart disease.

Finally, we examine firm-level data from Compustat to provide another perspective. While the data quality from this sample is not as good as for our case studies, the latter suffer from possibly not being representative. We find substantial heterogeneity across firms, but research productivity is declining in more than 85% of our sample. Averaging across firms, research productivity declines at a rate of around 10% per year.

Perhaps research productivity is declining sharply within every particular case that we look at and yet not declining for the economy as a whole. While existing varieties run into diminishing returns, perhaps new varieties are always being invented to stave this off. We consider this possibility by taking it to the extreme. Suppose each variety has a productivity that cannot be improved at all, and instead aggregate growth proceeds entirely by inventing new varieties. To examine this case, we consider research productivity for the economy as a whole. We once again find that it is declining sharply: aggregate growth rates are relatively stable over time,1 while the number of researchers has risen enormously. In fact, this is simply another way of looking at the original point of Jones (1995), and for this reason, we present this application first to illustrate our methodology. We find that research productivity for the aggregate U.S. economy has declined by a factor of 41 since the 1930s, an average decrease of more than 5% per year.
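The same arithmetic links the factor-of-41 decline to the quoted annual rate. The sketch below assumes a window of roughly seven decades purely to illustrate the conversion; the window is our assumption, not the paper's exact sample period.

```python
import math

# Convert a cumulative factor-of-41 decline in aggregate research
# productivity into an average annual rate. The ~72-year window (roughly
# 1940 to the early 2010s) is an assumption for illustration only.
decline_factor = 41
years = 72

avg_annual_decline = math.log(decline_factor) / years
print(f"Average decline: about {avg_annual_decline:.1%} per year")
# ~5.2% per year under these assumptions, consistent with the "more than
# 5% per year" figure in the text; the exact value depends on the sample window.
```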

This is a good place to explain why we think looking at the macro data is insufficient and why studying the idea production function at the micro level is crucial; Section 3 below discusses this issue in more detail. The overwhelming majority of papers on economic growth published in the past decade are based on models in which research productivity is constant.2 An important justification for assuming constant research productivity is an observation first made in the late 1990s by a series of papers written in response to the aggregate evidence.3 These papers highlighted that composition effects could render the aggregate evidence misleading: perhaps research productivity at the micro level is actually stable. The rise in aggregate research could apply to an extensive margin, generating an increase in product variety, so that the number of researchers per variety -- and thus micro-level research productivity and growth rates themselves -- are constant. The aggregate evidence, then, may tell us nothing about research productivity at the micro level. Hence the contribution of this paper: study the idea production function at the micro level to see directly what is happening to research productivity.

1 There is a debate over whether the slower rates of growth over the last decade are a temporary phenomenon due to the global financial crisis, or a sign of slowing technological progress. Gordon (2016) argues that the strong US productivity growth between 1996 and 2004 was a temporary blip and that productivity growth will, at best, return to the lower growth rates of 1973–1996. Although we do not need to take a stance on this, note that if frontier TFP growth really has slowed down, this only strengthens our argument.

2 Examples are cited below after equation (1).

3 The initial papers included Dinopoulos and Thompson (1998), Peretto (1998), Young (1998), and Howitt (1999); Section 3 contains additional references.

Not only is this question interesting in its own right, but it is also informative about the kind of models that we use to study economic growth. Despite large declines in research productivity at the micro level, relatively stable exponential growth is common in the cases we study (and in the aggregate U.S. economy). How is this possible? Looking back at the equation that began the introduction, declines in research productivity must be offset by increased research effort, and this is indeed what we find. Moreover, we suggest at the end of the paper that the rapid declines in research productivity that we see in semiconductors, for example, might be precisely due to the fact that research effort is rising so sharply. Because it gets harder to find new ideas as research progresses, a sustained and massive expansion of research like we see in semiconductors (for example, because of the "general purpose technology" nature of information technology) may lead to a substantial downward trend in research productivity.

Others have also provided evidence suggesting that ideas may be getting harder to find over time. Griliches (1994) provides a summary of the earlier literature exploring the decline in patents per dollar of research spending. Gordon (2016) reports extensive new historical evidence from throughout the 19th and 20th centuries. Cowen (2011) synthesizes earlier work to explicitly make the case. (Ben) Jones (2009) documents a rise in the age at which inventors first patent and a general increase in the size of research teams, arguing that over time more and more learning is required just to get to the point where researchers are capable of pushing the frontier forward. We see our evidence as complementary to these earlier studies but more focused on drawing out the tight connections to growth theory.

The remainder of the paper is organized as follows. Section 2 lays out our conceptual framework and presents the aggregate evidence on research productivity to illustrate our methodology. Section 3 places this framework in the context of growth theory and suggests that applying the framework to micro data is crucial for understanding the nature of economic growth. Sections 4 through 7 consider our applications to Moore's Law, agricultural yields, medical technologies, and Compustat firms. Section 8 then revisits the implications of our findings for growth theory, and Section 9 concludes.

2. Research Productivity and Aggregate Evidence

2.1. The Conceptual Framework

An equation at the heart of many growth models is an idea production function taking a particular form:

$$\frac{\dot{A}_t}{A_t} = \alpha S_t. \qquad (1)$$

Classic examples include Romer (1990) and Aghion and Howitt (1992), but many recent papers follow this approach, including Aghion, Akcigit and Howitt (2014), Acemoglu and Restrepo (2016), Akcigit, Celik and Greenwood (2016b), and Jones and Kim (2018). In the equation above, $\dot{A}_t/A_t$ is total factor productivity growth in the economy. The variable $S_t$ (think "scientists") is some measure of research input, such as the number of researchers. This equation then says that the growth rate of the economy -- through the production of new ideas -- is proportional to the number of researchers.

Relating $\dot{A}_t/A_t$ to ideas runs into the familiar problem that ideas are hard to measure. Even as simple a question as "What are the units of ideas?" is troublesome. We follow much of the literature -- including Aghion and Howitt (1992), Grossman and Helpman (1991), and Kortum (1997) -- and define ideas to be in units so that a constant flow of new ideas leads to constant exponential growth in $A$. For example, each new idea raises incomes by a constant percentage (on average), rather than by a certain number of dollars. This is the standard approach in the quality ladder literature on growth: ideas are proportional improvements in productivity. The patent statistics for most of the 20th century are consistent with this view; indeed, this was a key piece of evidence motivating Kortum (1997). This definition means that the left hand side of equation (1) corresponds to the flow of new ideas. However, this is clearly just a convenient definition, and in some ways a more accurate title for this paper would be "Is exponential growth getting harder to achieve?"

We can now define the productivity of the idea production function as the ratio of the output of ideas to the inputs used to make them:

$$\text{Research productivity} := \frac{\dot{A}_t/A_t}{S_t} = \frac{\#\text{ of new ideas}}{\#\text{ of researchers}}. \qquad (2)$$

The null hypothesis tested in this paper comes from the relationship assumed in (1). Substituting this equation into the definition of research productivity, we see that (1) implies that research productivity equals $\alpha$ -- that is, research productivity is constant over time. This is the standard hypothesis in much of the growth literature. Under this null, a constant number of researchers can generate constant exponential growth.
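As a minimal sketch of how equation (2) is taken to data (our own illustration with invented numbers, not the paper's replication code), one can divide a growth series by a research-input series and check whether the resulting research productivity series is roughly constant, as the null predicts.

```python
import numpy as np

def research_productivity(growth_rates, researchers):
    """Equation (2): idea output (proportional growth) per unit of research input."""
    return np.asarray(growth_rates, float) / np.asarray(researchers, float)

# Illustrative numbers only: output growth is roughly flat across periods
# while effective research input rises steeply.
growth = [0.02, 0.02, 0.02, 0.02]               # e.g. TFP growth by decade
effective_researchers = [1.0, 3.0, 9.0, 23.0]   # index of research effort

alpha_hat = research_productivity(growth, effective_researchers)
print("research productivity by period:", np.round(alpha_hat, 4))
print("cumulative decline factor:", round(alpha_hat[0] / alpha_hat[-1], 1))
# Under the null of equation (1), alpha_hat would be flat; here it falls by
# a factor of 23 even though growth itself never changes.
```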

The reason this is such a common assumption is also easy to see in equation (1). With constant research productivity, a research subsidy that increases the number of researchers permanently will permanently raise the growth rate of the economy. In other words, "constant research productivity" and the fact that sustained research subsidies produce "permanent growth effects" are equivalent statements.4 This clarifies a claim in the introduction: testing the null hypothesis of constant research productivity is interesting in its own right but also because it is informative about the kind of models that we use to study economic growth. The finding that research productivity is declining virtually everywhere we look poses problems for a large class of endogenous growth models. Moreover, the way we have defined ideas gives this concept economic relevance: exponential growth (e.g. in semiconductor density or seed yields or living standards) is getting harder to find.

4 The careful reader may wonder about this statement in richer models -- for example, lab equipment models where research is measured in goods rather than in bodies or models with both horizontal and vertical dimensions to growth. These extensions will be incorporated below in such a way as to maintain the equivalence between "constant research productivity" and "permanent growth effects."

2.2. Aggregate Evidence

The bulk of the evidence presented in this paper concerns the extent to which a constant level of research effort can generate constant exponential growth within a relatively narrow category, such as a firm or a seed type or Moore's Law or a health condition. We provide consistent evidence that the historical answer to this question is no: research productivity is declining at a substantial rate in virtually every place we look.

This finding raises a natural question, however. What if there is sharply declining research productivity within each product line, but the main way that growth proceeds is by developing ever-better new product lines? First there was steam power, then electric power, then the internal combustion engine, then the semiconductor, then gene editing, and so on. Maybe there is limited opportunity within each area for productivity improvement and long-run growth occurs through the invention of entirely new areas. Maybe research productivity is declining within every product line, but still not declining for the economy as a whole because we keep inventing new products. An analysis focused on microeconomic case studies would never reveal this to be the case.

The answer to this concern turns out to be straightforward and is an excellent place to begin. First, consider the extreme case where there is no possibility at all for productivity improvement in a product line and all productivity growth comes from adding new product lines. Of course, this is just the original Romer (1990) model itself, and to generate constant research productivity in that case requires the equation we started the paper with:

$$\frac{\dot{A}_t}{A_t} = \alpha S_t. \qquad (3)$$

In this interpretation, $A_t$ represents the number of product varieties and $S_t$ is the aggregate number of researchers. Even with no ability to improve productivity within each variety, a constant number of researchers can sustain exponential growth if the variety-discovery function exhibits constant research productivity.

This hypothesis, however, runs into an important well-known problem noted by Jones (1995). For the U.S. economy as a whole, exponential growth rates in GDP per person since 1870 or in total factor productivity since the 1930s -- which are related to the left side of equation (3) -- are relatively stable or even declining. But measures of research effort -- the right side of the equation -- have grown tremendously. When applied to the aggregate data, our approach of looking at research productivity is just another way of making this same point.

To illustrate the approach, we use the decadal averages of TFP growth to measure the "output" of the idea production function. For the input, we use the NIPA measure of investment in "intellectual property products," a number that is primarily made up of research and development spending but also includes expenditures on creating other nonrival goods like computer software, music, books, and movies. As explained further below, we deflate this input by a measure of the average annual earnings for men with
