Industrial Research - Where It's Been, Where It's Going
Fusfeld, Herbert I., Research-Technology Management
Three critical transition periods have affected the nature and conduct of industrial research: 1870-1910, World War II, and a period of declining technical self-sufficiency that began around 1970 (see illustration, next page).
Between 1870 and 1910, industrial research gave the new science-based chemical and electrical industries the more focused and more timely technical support needed to conduct their business and pursue growth opportunities. The companies no longer had to depend on the unpredictable advances generated externally. From this start, and with the added stimulus of World War I, there was a steady growth of industrial research up to World War II.
World War II had three consequences of profound significance for industrial research.
* A great reservoir of technical advances became available for further development in commercial areas.
* Public expectations were raised for the potential of science and technology in new products.
* New techniques of systems development were successful in planning and conducting complex technical programs that required the generation of new knowledge as an integral part of the planned program.
This last point was critical for the growth of industrial research after World War II, particularly for U.S. corporations. A generation of industry executives, from R&D managers to CEOs, had either participated in developments such as radar and the atomic bomb, or they had observed the activity and the results.
Confidence in the ability to include R&D as part of a broader objective within a time and cost schedule was embedded in corporate management of the 1950s and 1960s. That confidence was a major factor in making technical change the basis for competitive advantage. In addition, management support for R&D was encouraged in those years by relatively low interest rates, which made R&D a good investment and long-term research acceptable. For U.S. corporations, commercialization of R&D was made easier by American dominance of world trade while Europe and Japan were rebuilding their industrial base.
In this favorable climate, the postwar expansion of industrial research included growth of the great corporate central laboratories of AT&T, GE, IBM, Du Pont, and many others. These laboratories provided the concentration of technical resources needed to generate the major advances in electronics, computers, advanced materials, and communications that pushed industrial growth in the late 20th century.
Declining Technical Self-Sufficiency
For 30 years following World War II, corporations were able to plan growth strategies based on technical resources that existed internally or that were easily available and affordable in a reasonable time. In that sense, the corporation was technically self-sufficient.
That situation changed for most corporations sometime between the mid-1970s and the late 1980s, as increasing demands upon industrial research came up against a limiting characteristic of technical progress: In any given field, there is a steady rise in the cost and complexity for generating significant new technical advances. By the end of the 1970s, the demands upon industrial research were growing faster than the increases in R&D necessary to meet them. The ability of a corporation to set forth a growth strategy with confidence that it possessed, or could easily obtain, the technical resources necessary to support that strategy was gradually lessening. In short, the technical self-sufficiency of corporations began to decline.
The past 10-15 years have changed the conduct of industrial research in certain critical ways. Technology managers have had to develop access to external sources of technology, not simply to have more information but to identify partners for future research programs. Increased corporate activity with universities, with government consortia, with cooperative research programs, and with joint ventures all results from the common pressure to access external technology. …