We in the modern, western world, prospering for the past two hundred years thanks in large measure to technological creativity, live in exceptional times. Most societies have not welcomed the application of new ideas to production. Vested interests, tradition, and even malnutrition stifle technological creativity and change. Aside from a handful of western nations of the past two centuries, historians know of no society that has remained technologically creative for much more than three generations, about seventy-five years. Our society is all the more remarkable for having developed institutions that almost routinize technological progress.
Technological change, however, does not proceed along a predictable trajectory. While it's tempting to view past inventions as fully formed and quickly disseminated, there is nothing natural or inevitable about them. Uncertainty permeates the process of searching for and winnowing all the potential solutions to a problem. Nevertheless, although the path of a particular discovery may be unknown, technology does emerge in distinct patterns: Inventors congregate around bottlenecks in technical systems, and they respond to their society's structure of monetary rewards and prestige, and to its cultural norms.
The overriding characteristic of uncertainty stems from several sources. Looking back over many generations of invention and innovation, Nathan Rosenberg, economist at Stanford University, points out that new technologies typically come into the world in a primitive condition. The telephone, the transistor, the electronic digital computer, and the gas turbine engine are among prominent new technologies of the past century for which leading scientists and businesspeople failed to anticipate future uses and larger markets.
Not only do inventions often emerge in a primitive state, Rosenberg explains, but their eventual success may hinge on complementary inventions. The laser, for example, had to wait for the later development of fiber-optic cable to realize applications in telephone signal transmission. New technological regimes take years to replace established technology, because it is difficult to conceptualize and build a new system. Restructuring a factory around an electric power source instead of steam or water power, for example, entailed new principles of factory organization, from the layout of machinery to new relationships among employees. An innovative technology, moreover, sometimes accelerates improvements in the existing technology, as when companies made better gas lamps shortly after the introduction of the incandescent electric light bulb.
Over these past two centuries, scientists, engineers, and other technologists have increasingly worked within the context of a complex technical system. The spread of electric motors required an interconnecting array of innovations, including dynamos to generate electricity, techniques to transmit power over long distances, small motors to convert electricity to useful energy, and new alloys. Aircraft and telecommunications networks constitute similar complex systems.
As such systems grow, a constraint on further expansion typically emerges. Thomas Hughes, a historian of technology at the University of Pennsylvania, calls such a constraint a "reverse salient" -- a backward bulge in an advancing front. Inventors often aim to correct a reverse salient, to bring that component in line with the rest of the front. Communities of inventors and innovators congregate at these sites, because a number of companies may experience the same problem at the same time. Given this clustering, technical progress can have a compelling inner logic that identifies future directions for research offering high payoffs. Uncertainty about the path of a particular technology remains, however, and is resolved only through extended trials of competing solutions.
The uncertain path of invention and innovation can be traced through the development of aircraft landing gear during the 1930s, a story that revolves around a reverse salient and a complementary invention. At the time, according to Walter Vincenti, aeronautical engineer at Stanford University, the U.S. aircraft industry consisted of many small companies, all experimenting with various combinations of aeronautical components. The main objective was to raise an aircraft's speed: People wanted to fly from place to place quickly; operators wanted to maximize the number of flights per year; and the military wanted the fastest planes in the sky. Landing gear, and the aerodynamic drag it caused, emerged as one reverse salient in the overall effort to increase speed.
Aircraft designers experimented with a number of possible solutions. Streamlined "trouser"-type metal fairings could enclose the wheels to reduce drag. Alternatively, the landing gear could retract into the wings or fuselage through a system of hand-cranks or motorized lifts. It was far from obvious which approach would prevail. Retraction offered less drag and higher speed, but designers also had to consider cost, reliability, weight, and ease of maintenance.
The community of competing designers, Vincenti says, "felt their way along by small, progressive increments." They mentally conceived variants and made calculations on paper. But theoretical calculation was not sufficient. The companies also had to make extensive trials of each new design, using wind tunnels and the accumulated experience of maintenance crews.
Each variant of landing gear, moreover, bore on other design issues. John Northrop, a self-taught draftsman, engineer, and entrepreneur, had developed a unique multicellular wing construction. The stowage space needed for retractable landing gear would interrupt the multicellular arrangement, something Northrop was reluctant to allow. So when early wind-tunnel tests showed that the trouser-type gear reduced drag almost as much as retraction, Northrop's course seemed clear: He used the trousers on his airplanes from 1931 until 1934.
Other improvements in aircraft design, however, would challenge this decision. Following his development of a small, fast fighter plane in 1934, Northrop and his team began to see the trade-offs differently and the company gradually switched to retractable gear. Although Northrop did not record the reasons, Vincenti presents several speculations. Flight speeds were rising well above 250 miles per hour, which weighed in favor of retraction. Maintenance and reliability of retracting systems, which early on proved troublesome, were becoming tolerable.
Then a new, seemingly minor component technology tipped the scales. A common method of raising and lowering retractable gear used a hydraulic cylinder. But the sliding leather packings tended to leak, causing costly maintenance problems. The introduction in 1940 of the hard rubber O-ring solved this leakage problem. Niels Christensen, an independent inventor, had perfected the O-ring -- a rubber doughnut nestled inside a grooved metal housing -- while developing automobile brakes. He won a patent in 1937, but could not interest manufacturers until the aircraft buildup of World War II. Christensen sold the military on his invention after tests on a Northrop plane. Thus the O-ring became the critical complementary device that allowed a much larger technology to advance.
The process of winnowing landing gear variations took a decade or so, and settled a fundamental piece of aeronautical engineering: Aircraft at speeds above 200 to 250 miles per hour should have retractable gear. The outcome, Vincenti observes, was determined by a consensus of the design community, which was cognizant of the speed, cost, and maintenance trade-offs. While the aircraft designers were not "blind" as they ran through this winnowing process, they foresaw neither the progression nor its outcome.
TWO PATHS FOR MUTATION
Reverse salients seem to generate two broadly different kinds of responses. Joel Mokyr, economic historian at Northwestern University, distinguishes between macroinventions and microinventions, a description more useful than simply "large" and "small." Microinventions are the incremental steps that improve and adapt existing techniques. Retractable landing gear represented an important microinvention.
When a reverse salient cannot be corrected within an existing system, the problem may require an entirely new approach. The much rarer macroinvention, Mokyr argues in his book The Lever of Riches, constitutes a radical new idea and a break from previous technique. Without clear-cut parentage, it resembles a new biological species. Ballooning, Newcomen's steam engine, the screw propeller, and chemical fertilizers were all radical departures from previous technologies. Until the Montgolfier brothers' hot-air balloon lifted two men in 1783, previous attempts at flight had sought to imitate a bird's flapping wings and tail. Because microimprovements to existing techniques eventually peter out, says Mokyr, technical progress would slow to a crawl without macroinventions that give birth to new systems.
Macroinventions seem to occur in clusters, after long periods of stasis or microimprovements. Thus the later Middle Ages, a period rich in macroinventions, was preceded and followed by more than two centuries of merely gradual improvement. One force driving these bursts of macro-activity, Mokyr speculates, is the urge to emulate -- if I see someone hit the jackpot with an invention, I try harder to make mine succeed. The interaction of innovators also may make one technology conditional on another. Power machinery made it possible to produce the high-pressure steam engine, which led, around 1800, to an efficient locomotive.
Changes in the social soil also may make an economy more receptive to technological shocks. Certain societies have been far more conducive to technological creativity than others. Mokyr describes the social conditions that set the stage: A cadre of resourceful innovators must be willing and able to challenge their physical environment, and such a group is more likely to thrive in a society that's well-nourished, rational, and open to experiment. The economic and social institutions must encourage innovators with adequate incentives in terms of money and prestige. Otherwise, innovators who might have directed their attention to technology become priests, generals, or poets. The ancient Greeks appreciated sports and learning, but stigmatized production as an inferior activity. Medieval Europeans, by contrast, were more appreciative of useful knowledge, and so created macroinventions of their own and adopted many inventions from elsewhere, including the Islamic world.
Finally, Mokyr argues that innovation requires diversity and tolerance toward the eccentric, since inventors often rebel against the status quo. Bell and Edison had an outsider's mentality and sought the thrill of a major technological transformation -- yet they were financed by that consummate insider, the capitalist J.P. Morgan.
THE SPUR OF ENRICHMENT
Some scholars, including Mokyr, and a popular strain of Americana depict macroinventions as governed by individual genius and luck. Others view macroinventions as the outcome of economic forces, with investments in inventive activity influenced by the assessment of potential financial returns. While this question remains open, both camps agree that the opportunity for the successful innovator to enrich himself has been a critical factor spurring invention in general. The rise of the mass market and of the corporate form of business raised the likelihood of enrichment over the past two centuries, and helps explain the prolonged stream of technological creativity.
Prominent inventors in early industrial America were highly motivated by financial returns, find economists Zorina Khan of Northeastern University and Kenneth Sokoloff of the University of California at Los Angeles. Patent records from 1790 to 1846 reveal that inventive activity was concentrated in southern New England and New York, regions with access to extensive markets. New England's inventors also excelled in supplying new machinery and techniques to other regions. Financial orientation and experience in an industry seemed to be as important as unique technical gifts: Most of these inventors were merchants, manufacturers, or farmers who actively pursued the returns to their discoveries through royalties and licensing fees.
Until the mid-nineteenth century, this process was largely carried out by men (rarely women) working alone or in pairs or small groups. But in the subsequent few decades, science became increasingly important in stimulating new technology in the United States, and the locus of invention began to shift from the independent workshop to the corporation, reports Leonard Reich, economic historian at Rutgers University. Corporations offered well-paid, secure positions to technologists, and absorbed much of the risk of the inventive process. Industrial research labs were established at the turn of the century, and by 1931 more than sixteen hundred U.S. firms reported supporting a research lab, employing a total of nearly thirty-three thousand people. Corporate support of scientists became steadily more frequent as companies grew larger and more profitable, and as industrial products and processes took on greater complexity. "Research made more research imperative," Reich says, since companies in a competitive and fluid environment could ill afford complacency. And once a technology had been established, the people within firms who used it were best situated to know which improvements would yield big payoffs.
The large corporation paid employees well but not extravagantly, and excelled at coordinating specialized knowledge. The "technostructure," as Harvard economist John Kenneth Galbraith calls it, took ordinary men, informed them narrowly and deeply, and then combined their knowledge with that of other specialized but ordinary men. "No individual genius arranged the flights to the moon," Galbraith observes. "It was the work of organization -- bureaucracy."
In recent years, enrichment has again become a central feature of emerging, high-tech industries, which use incentive pay and equity stakes for key individuals. While creative people are driven by a love of their work, financial incentives play an increasing role, even in the realm of academic science when its links to technology are tight. Perhaps no industry is as dependent on science as biotechnology. Here, a small number of highly productive "star" scientists have been central to the diffusion of science and the commercial success of biotech applications. Analyzing patent and journal citations, sociologist Lynne Zucker and management professor Michael Darby of UCLA find that the most productive star scientists have extensive commercial ties, including equity stakes in firms. Stars who affiliate with firms and have patented discoveries are cited in journals over nine times as frequently as their academic peers without patents or commercial ties.
Stars thus can provide value to both their firms and their universities. They tend to found or join start-ups, and their university affiliation is a signal of credibility to investors. These new firms, in turn, provide more resources to scientists, Zucker and Darby report. A scientist working through a firm spends far less time writing grant proposals, and so can make faster progress in bench work. And since techniques for gene replication require a lot of tacit knowledge and hands-on experience, scientists and their firms are able to appropriate the returns from their new developments.
ACTION AND REACTION
For thousands of years, technological change tended to run out of steam and revert to a period of stagnation. The industrial revolution meant that suddenly continual, large-scale change was the norm. The critical invention of the nineteenth century, "how to invent," gave us a set of scientific tools -- theory, careful measurement, and accurate instruments -- and Joel Mokyr believes the process of invention has since become more efficient, in that fewer false turns are taken. New chemical compounds to block disease-causing enzymes can now be drawn from computer models, rather than from cumbersome physical experimentation. Modern industry became a powerful machine for stimulating research, and the rise of large cities and markets allowed innovation to spread more rapidly. Opportunities for enrichment multiplied as well, in both the corporate and the academic worlds.
Until the mid-1970s, the size of firms grew, and the number of self-employed fell. Then the trend of a century was reversed, with big firms shrinking and small ones on the rise. Technological creation has begun moving out of Galbraith's giant corporation and, via the university lab, into the entrepreneurial startup. Invention and innovation allow small firms to enter industries and remain viable where they otherwise would experience an inherent cost disadvantage. Falling communication costs can also lower the barriers to entry and allow more collaboration among small players.
New technological advances are hardly assured, however. Funding for basic research remains critical, since many innovations in high-tech industries such as drugs, instruments, and information processing have been based directly on academic research. Yet recent trends in spending on research and development in the United States suggest a problem. Corporate outlays for R&D have fallen significantly during the 1990s, with most of the decline coming out of basic research. Federal financing, which in the past has sponsored much of the nation's high-risk research, has suffered as well. As a result, total, inflation-adjusted spending on research has been stagnant of late. It may be hard to remain on the frontier of invention without adequate investment.
Hyper-accentuated enrichment and the profit motive, moreover, are not always compatible with the diffusion of knowledge. Josh Lerner of the Harvard Business School argues that the strengthening of patent protection over the past fifteen years, and the subsequent growth of patent litigation in the United States, have created a substantial "innovation tax" afflicting creative small firms. Increased corporate sponsorship of academic research, meanwhile, has led to new restrictions on communicating the research results -- delayed publication, deletion of certain results, and even the refusal to allow publication at all. Protecting turf is no less an instinct than building a better mousetrap, but it can put a damper on a society's receptiveness and response to radical change.
Scientists and engineers, like everyone else, are influenced by their patrons and customers. The cultures of their communities thus affect the pace and direction of technological change.
In the U.S. machine tool industry during the 1950s, argues David Noble, historian of technology at York University in Toronto, an automation technology called numerical control, or NC, won out over rival approaches not because it proved technically or commercially superior, but rather because it gave military planners greater control over production, and came from technicians predisposed to abstract, quantitative solutions.
The U.S. Air Force, developing aircraft with unprecedented machining requirements, wanted to reduce the dependence of contractors on skilled, and strike-prone, labor. The military engaged an MIT laboratory, which developed a way to record the movements of the machine tool numerically. No longer would the skills and tacit knowledge of a machinist determine essential tooling; that knowledge would instead be translated into instructions by a computer programmer.
Machine tool builders, however, remained reluctant to invest in the costly NC machines. There was too much electronics involved, the programming took weeks, and besides, most jobs fell within the bounds of what a machinist could already do. So the Air Force procured one hundred machines for contractors, and conditioned certain contracts on a commitment to NC technology -- shielding NC from the rigors of the marketplace. Other systems that potentially were simpler, more reliable, and cheaper were rejected by the military, Noble contends, as too reliant on human machinists.
Widespread diffusion of NC machine tools did not start until the mid-1970s, when Japanese firms replaced the inflexible hardwired NC circuits with the softwired minicomputer. Picking commercial winners and losers has never been government's function. But in this case, the military and its academic advisors cut off innovations that might have enjoyed more general commercial success.
Thomas P. Hughes, "The Dynamics of Technological Change: Salients, Critical Problems, and Industrial Revolutions," in Giovanni Dosi, Renato Gianetti, and Pier Angelo Toninelli, editors, Technology and Enterprise in a Historical Perspective, Clarendon Press, 1992.
Joel Mokyr, The Lever of Riches: Technological Creativity and Economic Progress, Oxford University Press, 1990.
David F. Noble, Forces of Production: A Social History of Industrial Automation, Oxford University Press, 1986.
Leonard S. Reich, The Making of American Industrial Research: Science and Business at GE and Bell, 1876-1926, Cambridge University Press, 1985.
Merritt Roe Smith and Leo Marx, editors, Does Technology Drive History?: The Dilemma of Technological Determinism, MIT Press, 1995.
B. Zorina Khan and Kenneth L. Sokoloff, "Schemes of Practical Utility: Entrepreneurship and Innovation Among Great Inventors in the United States, 1790-1865," Journal of Economic History, Vol. 53 No. 2 (June 1993), pp. 289-307.
Edwin Mansfield, "Microeconomic Policy and Technological Change," presented at the Federal Reserve Bank of Boston's 1996 Conference on Technology and Growth, proceedings forthcoming.
Nathan Rosenberg, "Uncertainty and Technological Change," presented at the Federal Reserve Bank of Boston's 1996 Conference on Technology and Growth, proceedings forthcoming.
Walter G. Vincenti, "The Retractable Airplane Landing Gear and the Northrop 'Anomaly': Variation-Selection and the Shaping of Technology," Technology and Culture, Vol. 35, No. 1 (1994), pp. 1-33.
Lynne G. Zucker and Michael R. Darby, "Virtuous Circles of Productivity: Star Bioscientists and the Institutional Transformation of Industry," National Bureau of Economic Research Working Paper 5342, November 1995.