Scholars Award: The Long Arm of Moore's Law: New Institutions for Microelectronics Research, 1965-2005
Cyrus C.M. Mody
Rice University
Project Summary
Moore's Law (the doubling of components in commercial integrated circuits roughly every 24 months) is the metronome of the microelectronics industry, one of the largest contributors to U.S. GDP. This study extends recent histories of microelectronics by examining the indirect impact of Moore's Law on institutions of American science. The study focuses on three such impacts: the emergence of research communities dedicated to radical alternatives to silicon, such as superconducting computing or molecular electronics; the establishment of institutions for the new field of microfabrication; and the adaptation of miniaturization techniques for use by disciplines such as molecular biology.
Intellectual merit
This study's intellectual merit lies in tying progress in miniaturization to the wider context of the late and post-Cold War era. Previous studies of post-1970 microelectronics focus on corporate actors following economic logic. This study traces the involvement of a more diverse set of actors: creative scientists and engineers proposing radical replacements for silicon; Vietnam War protesters on campuses such as Stanford; and scientists, grant officers, and universities coping with the tight budgets of the 1970s. It will yield a clearer picture of how American science and science policy were shaped in the late and post-Cold War period by a desire to benefit (from) microelectronics.
Broader impacts
This study has two broader impacts. In terms of content, it gives the public some ownership of Moore's Law. Technophilic writers often describe Moore's Law as beyond human control, but this study shows that public expectations about how much scientists should interact with the market, other disciplines, and the military formed a background for miniaturization. Methodologically, this study demonstrates the utility of geographically dispersed and interdisciplinary collaborations for making the history of technology policy-relevant. This award augments collaborative work sponsored by the Center for Nanotechnology in Society at UCSB. Research funded by this award will be published in articles co-authored with faculty and students in business schools and departments of English, sociology, history, and science and technology studies.
Moore's Law is the now half-century-old observation that the number of microelectronic components that can most profitably be crammed onto a silicon chip doubles about every two years. This trend makes it possible for computer chips to become steadily lighter, smaller, more complex, faster, and cheaper. Without it, we would not have many of the information technologies that are now commonplace in many societies: laptops, cell phones (especially those incorporating a camera and other devices), the internet, aerial drones, Big Data, and ubiquitous sensors and surveillance. The use and manufacturing of technologies made feasible by Moore's Law have transformed many nations' economies and many social arenas. Less widely acknowledged is that the use and manufacturing of these technologies have also transformed the organization and practice of science and science policy in the US and around the world.

That American science has changed in the years since Moore's Law was promulgated is hard to dispute (though it is still an open question how widespread those changes are across the American research enterprise). Significant trends in that time include: increasing academic entrepreneurship and patenting; declining corporate basic research capacity; increasing use of research consortia; declining importance of military markets and R&D funding relative to civilian markets and R&D funding; and declining funding and prestige of the physical sciences relative to the life sciences. Most histories thus far trace the origins of these trends to the emergence of a biotechnology industry and the rise of neoliberal economics during the Carter and Reagan administrations. However, my NSF Scholars Award has allowed me to do extensive archival and interview research demonstrating that the microelectronics industry was as important as biotech and neoliberalism in shaping each of the shifts in emphasis in American science listed above.

Specific findings from this project include:
- Due to cutbacks in defense funding, the US microelectronics industry turned increasingly toward civilian markets in the mid-'60s, and those markets dwarfed military markets by the mid-'70s. At the same time, academic microelectronics research turned increasingly toward civilian applications due to both funding cutbacks and political pressure to demonstrate "relevance" to the problems of civil society.
- An increasingly competitive environment, plus the highly visible failure of several attempts to turn basic research discoveries into commercial products in the 1970s, led the microelectronics industry to cut back sharply on basic research in the 1980s.
- The continuing need for basic research led the microelectronics industry to form some of the earliest US research consortia.
- The increasing expense of the tools needed to conduct academic research related to microelectronics led federal agencies to experiment with shared user facilities where equipment could be rented by a broad user base. Often, the same academic microelectronics user facilities supported by the federal government were also members of (or operated as) some of the early microelectronics research consortia.
- As corporate microelectronics basic researchers left industry for academia, they brought with them an applied focus and a network of contacts in industry and government. Often, they founded or led academic centers or facilities that were widely seen as innovative pioneers of university-industry cooperation.
- Policymakers and journalists cited academic microelectronics centers as models for the successive waves of federally funded university-industry research centers that began in the mid-'80s.
- As semiconductor manufacturing became less vertically integrated, and more expensive and specialized, in the '90s, academic microelectronics researchers' contributions were pushed further from the factory floor. At the same time, federal funding for biomedical and life science research grew rapidly relative to funding for the physical and engineering sciences. Thus, from the late '80s onward it became much more common for academic microelectronics researchers to forge collaborations with the life sciences and to form biotech start-up companies.

The question of what drove large-scale changes in American science since 1970 is neither idle nor academic: it has deep implications for science policy and for the views of policymakers, scientists, executives, academic administrators, and the wider public on the proper aims, conduct, and organization of research. As the study supported by this NSF Scholars Award shows, acknowledging and understanding the role that the microelectronics industry, and the microelectronics-buying public, played in reshaping American science must be part of any conversation about how to make science achieve what society asks of it more efficiently and democratically.