The Regulatory Environment for Science - Princeton


February 1986

NTIS order #PB86-182003


Recommended Citation: U.S. Congress, Office of Technology Assessment, The Regulatory Environment for Science—A Technical Memorandum, OTA-TM-SET-34 (Washington, DC: U.S. Government Printing Office, February 1986).

Library of Congress Catalog Card Number 86-600503

For sale by the Superintendent of Documents
U.S. Government Printing Office, Washington, DC 20402


Foreword

The vitality of American science is a principal component of our country’s economic and intellectual success. Congress, among other institutions, bears responsibility for nurturing creativity and exploration in science while providing appropriate safeguards. Various direct and indirect forces, however, are continually altering the agenda for scientific endeavor, its methods and practices, and the dissemination of results. It is therefore essential that Congress, in providing direction and funding for science at the Federal level, understand these forces and how they are changing.

This technical memorandum, requested by the Task Force on Science Policy of the House Committee on Science and Technology, provides a “snapshot” of factors affecting science in the 1980s, and focuses on emerging issues that will require consideration by Congress.

JOHN H. GIBBONS
Director



Regulatory Environment for Science
OTA Project Staff

John Andelin, Assistant Director, OTA
Science, Information, and Natural Resources Division

Nancy Carson Naismith
Science, Education, and Transportation Program Manager

Project Staff

Marcel C. LaFollette, Project Director, OTA Fellow

Kathi Hanna Mesirow, Analyst

Barbara Berman, Research Analyst

Eugene Frankel, Senior Analyst

Administrative Staff

Marsha Fenn, Administrative Assistant

Betty Jo Tatum, Secretary

Christopher Clary, Secretary

Gala Adams, Clerical Assistant

Contractors

Michael S. Baram
Bracken and Baram

Sheldon Krimsky
Department of Urban and Environmental Policy
Tufts University

Jon D. Miller
Public Opinion Laboratory
Northern Illinois University

Dorothy Nelkin
Program on Science, Technology, and Society
Cornell University

Diana Dutton
School of Medicine
Stanford University

Regulatory Environment for Science
Advisory Panel

Harvey Brooks, Chairman
Benjamin Peirce Professor of Technology and Public Policy
Harvard University

Harold P. Green
Associate Dean for Post J.D. Studies
The National Law Center
George Washington University

Clifford Grobstein
Department of Biological Science and Public Policy
University of California, San Diego

Jonathan A. King
Professor of Molecular Biology
Massachusetts Institute of Technology

Alan McGowan
President
Scientists’ Institute for Public Information

John Shattuck
Vice President of Government, Community, and Public Affairs
Harvard University

Irving S. Johnson
Vice President of Research
Lilly Research Laboratories

Contributors

Philip Alexander
Massachusetts Institute of Technology

Nicholas A. Ashford
Massachusetts Institute of Technology

R.J. Augustine
National Institutes of Health

George Bush
Council on Governmental Relations

Daniel Callahan
The Hastings Center

Arthur Caplan
The Hastings Center

Rosemary Chalk
American Association for the Advancement of Science

David Cheney
U.S. Congress, Library of Congress
Congressional Research Service

Daryl E. Chubin
Georgia Institute of Technology

John Doble
The Public Agenda Foundation

George Dummer
Massachusetts Institute of Technology

John T. Edsall
Harvard University

Howard Gobstein
U.S. General Accounting Office

Milton Goldberg
Council on Governmental Relations

Loren Graham
Massachusetts Institute of Technology

Susan P. Hadden
University of Texas, Austin

Judith C. Harris
Arthur D. Little, Inc.

Gerald Holton
Harvard University

Eric Holtzman
Columbia University

Mary E. Curtis
John Wiley & Sons, Inc.


Deborah Marquis Kelly
American Psychological Association

Melvin Kranzberg
Georgia Institute of Technology

Sanford P. Lakoff
University of California, San Diego

Rachel Levinson
National Institutes of Health

Charles MacKay
National Institutes of Health

Louis Menand
Massachusetts Institute of Technology

Alexander Morin
National Science Foundation

Herbert Morton
American Council of Learned Societies

Thomas H. Moss
Case Western Reserve University

Thomas Murray
Medical Branch, University of Texas, Galveston

Mark V. Nadel
U.S. General Accounting Office

Dorothy Nelkin
Cornell University

Robert Pearson
Social Science Research Council

Carroll Pursell
University of California, Santa Barbara

Harold Relyea
U.S. Congress, Library of Congress
Congressional Research Service

Sal Restivo
Rensselaer Polytechnic Institute

Jurgen Schmandt
University of Texas, Austin

Jeffrey K. Stine
U.S. House of Representatives
Committee on Science and Technology

Judith Swazey
Acadia Institute

Ed Tocus
U.S. Food and Drug Administration

Charles Weiner
Massachusetts Institute of Technology

Dael Wolfle
University of Washington, Seattle

Leo Young
U.S. Department of Defense

Robert Newton
National Science Foundation



Contents

CHAPTER 1: Executive Summary
  Forces Shaping Science
  Importance of This Issue for Congress

CHAPTER 2: Historical and Political Context for Regulation of Research
  Changing Political Concerns
  The 1960s: Public Criticism of Science
  The “Limits to Inquiry” Debate
  Recent Restrictions on Scientific Communication
  Political Influences on Regulation

CHAPTER 3: Social and Political Rationales for Controls on Research
  Regulatory Forces on the Research Agenda
    To Fulfill Political Objectives or Avoid Political Effects
    To Avoid Environmental Damage
    To Promote or Avoid Predictable Economic Consequences
    To Preserve Moral Values
  Regulation of Research Procedures and Protocols
    To Protect Human Health and Safety
    To Protect Animals Used in Experimentation
    To Protect the Environment
  Regulation of the Dissemination of Scientific Knowledge
    To Uphold Scientific Standards
    To Protect a Professional or Economic Interest
    To Protect the Health, Privacy, or Safety of Individuals
    To Protect National Military and Economic Security
  Summary

CHAPTER 4: The Mechanisms for Direct Control of Research
  Mechanisms for Individual or Research Group Control
    Agenda Controls
    Controls on Procedures
    Communication Controls
  Regulations Imposed by Institutions
  Controls by Professional Organizations in Science and Engineering
  Mechanisms for Government Regulation
    Agenda Controls
    Controls on Procedures
    Communication Controls
  The Role of the Courts in the Regulation of Research
  Mechanisms for Social Control Outside of Government
  Summary

CHAPTER 5: Mechanisms for Indirect Control of Research
  Administrative Requirements on Universities That Accept Federal Grants or Contracts
  Protective Legislation and Agency Rulemaking
    Antidiscrimination Statutes
    Small Business Innovation Research Programs and Use of Services Regulations
    Privacy Legislation
    Environmental Legislation
    Occupational and Public Health and Safety Regulations
    Right-To-Know Legislation

CHAPTER 6: Institutional Differences
  Profile of a Representative Industrial Laboratory
    Control of the Research Agenda
    Controls on the Research Process
    Management of Risks
    Restrictions on Communication of Information
  Profile of a Representative Nonprofit Independent Laboratory
    Control of the Research Agenda
    Control of the Research Process
    Management of Risks
    Controls on Communication of Information
  Profile of a Representative University Laboratory
    Controls on the Research Agenda
    Controls on the Research Process
    Management of Risks
    Restrictions on Communication of Information

CHAPTER 7: Community Control of Research: Two Case Studies
  Research Involving Recombinant DNA Molecules
  Testing Chemical Warfare Agents
  Comparison of the Cases
    Origins of Local Regulations
    Types of Local Interventions
    Stage of the Scientific Enterprise Affected
    Social Risk Assessment
    Parties Affected by the Proposed or Actual Regulations
    Legal Issues
    Impacts of the City’s Interventions Beyond Its Borders
  Arguments For and Against Regulations
    Recombinant DNA Controversy
    The Case of Chemical Weapons Research
  General Policy Implications

CHAPTER 8: Research Policy Issues That May Warrant Congressional Attention in the Future
  Who Bears the Burden of Proof?
  The Scientist’s Role in Assuring Safe Research
  Ex Post Facto Restrictions on Research Communication
  “Gray Areas” of Sensitive Information and Broader Classification
  Impact of New Communications Technologies
  Patents
  Public Education on the Basis and Procedure for Regulation
  Shift in the Jurisdiction for Regulation
  General Issues

APPENDIX A: The Regulatory Environment for Science
APPENDIX B: Public Attitudes Toward Science
APPENDIX C: Environmental Concerns and Laboratory Siting: The Morris Township-Bellcore Case
REFERENCES

List of Tables

2-1. Funding of Research and Development by Source in 1960 and 1985
3-1. Justifications for Control of Research
B-1. European Attitudes Toward Science, 1978
B-2. Expectations for Future Scientific Achievements, 1979-83
B-3. Attitude Toward Contribution of Science, by Attentiveness, 1957-83
B-4. Expectation for Future Achievements, by Attentiveness, 1979-83
B-5. Public Concerns About Science, 1957-83
B-6. Concerns About Science, by Attentiveness, 1957-83
B-7. Public Assessment of the Risks and Benefits of Science, 1972-85
B-8. Public Confidence in Science and Selected Other Institutions, 1973-84
C-1. Comparison of Three Cases Involving Local Control of Research


Chapter 1

Executive Summary

Photo credit: National Institutes of Health


This OTA technical memorandum examines the social and legal forces that act to restrict or regulate scientific and engineering research in the United States today. Recent controversies over the use of animals in experimentation, the risks associated with recombinant DNA research, and national security controls on scientific communication have focused congressional attention on the policy issues raised when government intervenes in the research process. As each issue has arisen, Congress has been called on to decide when and where intervention is appropriate, and how to structure intervention so as to protect public health and safety or national security without unduly retarding scientific progress. At the request of the Task Force on Science Policy of the House Committee on Science and Technology, OTA looked at the entire “regulatory environment” for research, with the goals of analyzing the structures and mechanisms for regulation and of identifying significant policy issues that may require congressional attention in the future.

Although scientists have always exercised restraints on their work, the present system of government-based, legally enforceable regulations is relatively new (ch. 2). Until 1945, constraints were limited to social prohibitions on sensitive topics, some controls on agricultural research, and national security controls on technical communication during wartime. Post-1945 arrangements for the support of science treated it as distinct from other types of government programs, in that it should be free from direct government control or economic self-interest and that scientists could be trusted to govern their own affairs.

The uncovering of a number of examples of abuses of human subjects, growing fears that scientific research was posing high risks to human health, the identification of research with government-sponsored activity, and the social and political climate of the 1960s and 1970s (ch. 2) led to a series of regulatory actions that began to constrain not just what topics scientists should pursue, but also how they should be pursued and the results disseminated. More recent controversies over controls on scientific and technical information deemed vital to national military or economic interests indicate further erosion of the trust in scientists’ governance which characterized the postwar arrangements. The increased regulation may also indicate that science is simply included as a target of society’s increasing willingness to regulate all types of institutions, professions, or activities.

A wide spectrum of social and political rationales (ch. 3) may justify controls linked to a specific part of the research process: selection of topic, experimentation or other procedures, and dissemination of results. The moral and ethical concerns expressed in attempts to restrict research are not new. What is new, however, is the raising of such concerns to the level of government action or legally enforceable regulations. Some governmental regulations manifest concerns about the potential risks of a line of research; they demonstrate that society wants to protect the health and safety of experimental subjects. Government restraints on the communication of scientific and technical information seek to protect militarily sensitive information or to curtail economic losses associated with international technological competition. Public opinion data show that such regulation may reflect the American public’s willingness to restrict research when the risk is perceived to be too great, despite the concurrent existence of widespread public support for science as a cultural activity deserving Federal support.

Analysis of the mechanisms by which restraints are imposed at the laboratory, institutional, or governmental levels (ch. 4) shows that controls in the modern research environment are widespread, synergistic, and cumulative. They affect every stage of the research process—what topics may be pursued, how they may be pursued, and when and to whom the research may be disseminated. Institutional mechanisms include formal administrative policies, institutional review committees, or institutional cooperation with external requests for constraints. Professional societies set up codes and guidelines and may cooperate with government attempts to impose dissemination controls. Government control mechanisms include: review commissions and ethics advisory boards, legislative review of proposals or projects, moratoria, regulations on the use or possession of substances used in research, interpretation of agency regulations, contract provisions, and dissemination or publication controls. The channels through which government can affect research—legal regulations, formal administrative controls, judicial actions, priority-setting through budget allocations—have increased in the last decade, largely because of increased Federal support of science (and, therefore, increased channels for implementation of regulations), but also because of general demands for accountability and the widening impact of science on society. The very multiplicity of mechanisms for restraint increases the possibility that such regulations will be implemented piecemeal, in isolation, and without coordination, and that they therefore may produce an adverse synergistic effect on the progress of science and the research base for innovation.

As the case study in chapter 5 shows, the regulatory effects, especially on the research process, are not confined to basic research in universities, even though most of the discussion of and complaints about overregulation have been concentrated there. Many of the mechanisms described in chapter 4—e.g., controls on research materials, human subjects regulation, dissemination controls—apply with equal force to research in industry and private laboratories.

In many cases, science may not have been so much singled out for control, however, as simply sharing in society’s growing propensity for regulating all types of specialized institutions or activities. Such regulations include administrative reporting requirements for Federal grants and contracts, social programs legislation (e.g., affirmative action), Occupational Safety and Health Administration (OSHA) regulations and right-to-know laws, and laws and policies relating to international diplomatic relations. Although these actions are designed to serve a public objective and not to restrain research, they can add to an existing financial and administrative burden of intentional controls, and their effects can be much more difficult to avoid after they are legally in force. Such regulations could have a long-term adverse effect on innovation and progress in research. There is clearly a need for better documentation and monitoring of possible unintentional adverse effects on research and, in some cases, there may be a need to consider specific legislative exemptions for research.

The local government interventions described in chapter 7, and in the case study in appendix C, point to a potential for increased confrontations between State and local authorities and the Federal Government regarding the jurisdiction for regulation. Should science be controlled through a combination of self-regulation and broad Federal oversight, insulated from local laws? The emergence of a number of cases in which research facilities have been the subject of local protests indicates that research no longer wears a mantle of unquestionable civic respectability. Instead, it is subject to the same political influences and attitudes at the local level as are other institutions.

Given these circumstances, several changes may occur in the near future (ch. 8). One is a shift in who must bear the burden of proof for control of research. That responsibility is increasingly shifting to the regulated researcher, who must prove that the research is safe or anticipate whether dissemination of the research results may have some adverse effect on the national interest. As this situation changes, the Federal Government will increasingly have to consider the appropriate role for scientists in the regulatory process itself. How much should be left to informal practice and how much required through legally enforceable regulation? Congress can also expect to confront a number of communications-related issues in the future. These issues relate to the need to protect both freedom of speech and the freedom of scientific inquiry necessary to cultivate progress and innovation. How should these freedoms be balanced with the very real need to protect national military and economic interests? Whether through ex post facto restrictions on hitherto unclassified research or through the broadening of “gray areas” of sensitive information, the short-term goals of communication controls must be balanced carefully against their long-term effects on the Nation’s science and technology base and on opportunities for U.S. scientists to benefit from interactions with foreign colleagues.

Computerization of the scientific communication system may also raise in the next decade equally difficult issues regarding not only the protection of intellectual property but also the ease and speed of classification of information. Issues of patent reform will continue to create the potential for significant secondary effects on the research system—both in what type of basic research is sponsored by industry and in interference with intercollegial communication of ideas. Finally, the apparent increase in regulatory activity at the State and local level may be an indication of a jurisdictional shift in the initiative for regulation, from Federal to State or local, with the accompanying potential for “Balkanized” regulations and differential strictness of regulation.

This new “regulatory environment for research” raises many important questions for the Science Policy Task Force and for Congress. First, how can Congress assure balance among the protection of public health and safety, the rights of citizens to govern their local communities, and the freedom of individual scientists, whether of speech or action? Second, how can the regulatory process and the opportunities for public discussion of regulation be structured so that competing interests are negotiated before issues reach a stage of controversy and hostility? Third, which issues should receive congressional or State or local attention and which are best left to the self-regulation of the research communities? And, fourth, what can Congress do to assure that this environment does not unduly erode innovation and creativity in U.S. industry or unreasonably damage the Nation’s investment in university research?

FORCES SHAPING SCIENCE

Scientific research*—i.e., the organized, systematic search for knowledge about, insight to, or understanding of a subject—is significantly influenced by its social and political context. For example, the pressures of U.S. economic competition in world markets and the linking of research accomplishments to national stature affect which research is funded and which research results may be widely disseminated. Increased public awareness of the negative side effects of the research results or processes has created pressure for government control. Thus, scientific research can be constrained both for political and social purposes, and when it is regarded as a negative force out of control or a force that may become negative if allowed to continue.

*The report is not concerned with controls on the application of research knowledge in medical practice, commercial development of a product, or similar exploitation of research results.

No matter what the field or institutional setting (university, government, or industry), the research process has certain common characteristics—e.g., in the use of a scientific knowledge base, in the methods of investigation, and in the training and education of its participants—that are independent of specific project goals. Restraints on research may affect the choice of which subject to investigate or which to fund (controls on topic); the method by which that investigation proceeds, including the tools of research and the objects or animals manipulated during the research (controls on procedure); and the timing of and audience for descriptions of the research and its results (controls on communication). This report analyzes the influences at each of these stages.

Different social or political mechanisms can influence the research process in different ways. Public approval or disapproval of research topics or procedures may be expressed in political demonstrations against laboratories, through referenda and local initiatives, as well as through moral condemnation and social pressure.¹

More formal control is exercised through, for example, laws passed specifically to direct some aspect of the research process, through Federal interpretation of the language of such laws, or through the provision or denial of research funding.

¹Loren R. Graham, “Concerns About Science and Attempts to Regulate Inquiry,” Limits of Scientific Inquiry, Gerald Holton and Robert S. Morison (eds.) (New York: W.W. Norton & Co., 1979), pp. 1-22.

Other economic or political forces can affect research through government actions intended to have some other effect. Economic considerations, the need to protect proprietary interests, Federal protections on public health and safety, and other Federal and State legislation may influence industrial or other nonacademic research. For example, the Food and Drug Administration’s (FDA) regulatory requirements, which govern the introduction of new drugs, are reported to have slowed pharmaceutical industry research on new drugs, particularly on “orphan drugs” (drugs for rare diseases),² where the cost of those regulations has not been outweighed by favorable economic and market conditions. In contrast, however, the even more stringent requirements of Nuclear Regulatory Commission and FDA regulations on radiopharmaceuticals appear not to have affected that research adversely. More favorable economic and market forces allow these firms to overcome the effect of any regulatory burden.

The attitudes and professional values of the scientific community itself have played a prominent role in influencing and sometimes constraining research activities. Self-imposed constraints were used in the 1970s, for example, during the debate over recombinant DNA. Molecular biologists exercised "restraint and caution" in their research procedures and adhered to a voluntary moratorium on recombinant DNA research, even though they "had no certain proof that the need for limitation existed or that the consequences of it would be positive."3

Finally, control on the communication of scientific and technical information may be implemented for reasons associated with economic or military protection. For both basic and applied research, such controls may take the form of a prior restraint on research publication or a denial of access to laboratories. When controls are imposed on basic research in universities, however, the benefits of such controls may not be perceived by the institution as outweighing the adverse effects on the education and training of students. Because such restrictions often appear to violate traditions of academic freedom, universities may oppose their implementation. Critics of sweeping controls argue that, in the long run, such restraints could harm the quality of the scientific work force, the traditional climates for creativity, and the progress in basic science which is necessary to technological advancement.

2Barry S. Roberts and David Z. Bodenheim, "The Drug Amendment of 1962: The Anatomy of Regulatory Failure," Arizona State Law Journal, vol. 1982, No. 3, 1982, p. 587.

3Clifford Grobstein, A Double Image of the Double Helix (San Francisco, CA: W.H. Freeman & Co., 1979), p. 2.

This OTA report takes a look at the entire range of social, political, and economic forces that restrain all stages of the research process, in all types of institutional settings, and that prompt changes in research projects or create sufficient political pressure for the development of legislation or administrative controls. In examining this "regulatory environment," the OTA project attempts to locate the common ground, the similarities among what on the surface may seem to involve dramatically different issues and controversies. Restrictions on communication, for example, are not confined only to basic researchers in universities. Controls and regulations—both internally and externally imposed—also affect scientists in industry and in government at all stages of research. The OTA study looks at research—regardless of where or by whom it is conducted—as a universal activity, searching for common factors in the mechanisms, justifications, and effects of the regulatory environment. A few restrictions apply equally to all parts of the research system, to industries as well as universities; others apply to specific parts of the process or only to one field. The differences may be only in the extent to which restrictions are enforced or publicly discussed.


IMPORTANCE OF THIS ISSUE FOR CONGRESS

Increased awareness of how science and technology affect both social structure and social values and vice versa has prompted increased pressure for political intervention in what heretofore has been a decisionmaking activity dominated by scientists or science managers; but such intervention worries researchers who are accustomed to substantive control over all aspects of their own work. So there is a search underway for institutional forms that could permit more public involvement in critical policy decisions and yet still preserve "the flexibility needed for the pursuit of scientific research."4 Congress may desire or may be asked to play a role in developing these new arrangements.

Agency regulations—and many of the secondary controls on research—are also related directly to the amount of Federal support available to scientific and engineering research and to priority setting for allocation of that support. As the Task Force document, An Agenda for a Study of Government Science Policy, states:

. . . the immediate goals to which science can be expected to contribute, such as improved health, a cleaner environment, and enhanced technological innovation, cannot be considered in isolation. Broader societal goals . . . should be taken into consideration when formulating the goals for science.5

Another aspect that relates directly to the work of Congress is the suggestion that some regulations instituted for legitimate and laudable social or political reasons may be having secondary, unanticipated, and adverse effects on the quality of science and may thereby diminish science's usefulness to society. Regulation, according to the Task Force Agenda, "is one of the few areas in which the aims of science and the aims of society are not necessarily congruent. The manner in which these conflicting aims are accommodated is of significant importance to both science and

4Panel on Science and Technology, Science and Dangers (Washington, DC: President's Commission for a National Agenda for the Eighties, 1980), p. 13.

5U.S. Congress, House Committee on Science and Technology, Task Force on Science Policy, An Agenda for a Study of Government Science Policy, 98th Congress (Washington, DC: U.S. Government Printing Office, 1985), p. 8.

society. . . ." The Task Force has focused on two aspects of this issue in particular: 1) how to shape the future regulatory environment for science while still responding to the necessity to avoid the ill effects arising from regulating science;6 and 2) how "the legislative and regulatory authorities representing society as a whole can protect public health, safety, and values while avoiding the imposition of unnecessary restraints on science."7

The topic of the regulatory environment for research thus involves discussion of some of the most basic questions of American political philosophy: public control v. self-rule, Federal v. local jurisdiction, the feasibility of regulation and the importance of consent by the regulated, government regulation v. individual liberty, and how to balance the conflicting rights and values of different social institutions. The events and the debate will continue. Congress can expect to confront these issues again and again.

For many research areas, the question in the 1980s is not whether there should be any limits but, instead, "what those limits should be. . . . [A]nd, if we can define those boundaries, what control options will maintain them most effectively?"8 The organized scientific community now appears to acknowledge the need for "some political and public input to the setting of the general directions and agenda of scientific research," just as the political sphere appears to have accepted the importance of "some degree of self-governance and internal agenda-setting" by the scientific community. The real issues have become, as Harvey Brooks notes, "where the lines should be drawn and the appropriate processes by which the scientific and political communities should negotiate the scientific agenda."9


Chapter 2

Historical and Political Context for Regulation of Research


In the context of the current extensive and often generous support for scientific and engineering research in the United States, it is easy to forget that, although scientists have always exercised a variety of restraints on their own work, the present governmentally imposed, legally enforceable constraints on research topics, procedures, and communication are relatively new. Most of the current regulatory schemes were developed alongside the post-World War II arrangements for Federal financing of research and were influenced by the political attitudes and assumptions governing those arrangements and how they were instituted.

Before World War II, many scientists considered Federal research grants to private universities—where most basic research was conducted—to be improper if not unconstitutional.1 In the 1930s, for example, the leaders of the National Academy of Sciences "objected on principle to letting private universities accept government funds."2 In part, this attitude had to do with the scientists' fears of losing autonomy. Some university research was supported by the professors themselves. They were not required to account to the government for their time or for minor expenditures; "They simply did what research their other duties and their pocketbooks allowed them to do."3 But objections were also linked to concern that government funding might provide the opportunity for restraints on research, as had happened during wartime. During World War I, for example, scientists had accepted military restrictions on their communications; they were subject to censorship and were, in some cases, persuaded to delay publication until the end of the war.4 The American Chemical Society had even opposed President Wilson's order transferring gas warfare research from Bureau of Mines control to War Department control, not out of anti-war fervor but because it "feared the numbing effect of the . . . 'red tape' of War Department methods upon the spirit of originality, daring and speed in following new trails, so essential to the successful prosecution of research."5 The chemists predicted a "national disaster" if the "fast machine" of gas research was slowed. Such attitudes of arms-length cooperation with government were prevalent in the scientific community during the first part of the century.

1Don K. Price, "Endless Frontier or Bureaucratic Morass?" The Limits of Scientific Inquiry, Gerald Holton and Robert S. Morison (eds.) (New York: W.W. Norton & Co., 1979), pp. 75-92.

2Ibid. In 1945, Frank B. Jewett, President of the National Academy of Sciences and also of Bell Labs, opposed the creation of the National Science Foundation on these grounds. In letters to Vannevar Bush, Jewett stated that private initiative should furnish the means for fundamental research: "Every direct or indirect subvention by Government is not only coupled inevitably with bureaucratic types of control, but likewise with political control and with the urge to create pressure groups seeking to advance special interests." Merton J. England, A Patron for Pure Science (Washington, DC: National Science Foundation, 1982), p. 35.

3James Penick, Jr., et al. (eds.), Politics of American Science, 1939 to the Present (Cambridge, MA: The MIT Press, 1972), p. 7.

Before the 1940s, industry supported a substantial proportion of the Nation's research and development (R&D) effort; the Federal Government played a relatively minor part. Scientists in the 1930s felt confident in asserting that "most of our great advances in the past have been through private initiative," including both industry and private foundations.6 Even as late as 1940, the Federal Government paid for only about 19 percent of the Nation's $345 million expenditures for scientific research and development.7

Since 1940, these funding patterns have changed dramatically and, along with them, the regulatory environment for U.S. research in science and

4A. Hunter Dupree, Science in the Federal Government (Cambridge, MA: Harvard University Press, 1957); also see Harold C. Relyea, "Increased National Security Controls on Scientific Communication," Government Information Quarterly, vol. 1, No. 2, 1984, pp. 187-188.

5David Rhees, American Philosophical Society, personal communication, 1985.

6Robert H. Kargon (ed.), The Maturing of American Science (Washington, DC: American Association for the Advancement of Science, 1974).

7John R. Steelman, Science and Public Policy, vol. 1 (New York: Arno Press, reprinted from 1947), p. 11.


engineering. As the Federal Government has assumed an ever larger share of all U.S. research funding, the institutional responsibility for nourishing the research system has begun to shift. In 1960, the government was funding about 57 percent of all basic and applied research in the United States; industry, 37 percent; universities, 3 percent. By 1985, the government's share was nearly 50 percent; industry's, over 41 percent; and universities', 6 percent. The responsibility for basic research also appears to be shifting to the Federal Government (from 60 percent in 1960 to almost 67 percent in 1985) while the responsibility for applied research has shifted to industry (from 40 percent in 1960 to nearly 55 percent in 1985). (See table 2-1.) Who funds and sponsors research can have considerable impact on the locus for regulation and on the type of regulatory mechanism chosen. The shifts of funding source in the last 5 to 10 years, therefore, may be one explanation for the signs of strain described in this report, as industry becomes subjected to regulations originally intended for basic research conducted in a university setting (e.g., recombinant DNA regulation), and universities are asked to comply with regulations originally intended for industry.

Table 2-1.—Funding of Research and Development by Source in 1960 and 1985 (percent)

                                          1960     1985
Basic research:
  Federal Government . . . . . . . . .    60.0%    66.6%
  Industry . . . . . . . . . . . . . .    28.5     18.7
  Universities/colleges . . . . . . .      6.0     10.0
Applied research:
  Federal Government . . . . . . . . .    56.0     39.7
  Industry . . . . . . . . . . . . . .    40.0     54.6
  Universities/colleges . . . . . . .      2.0      3.6
Basic and applied research:
  Federal Government . . . . . . . . .    57.0     49.5
  Industry . . . . . . . . . . . . . .    37.0     41.4
  Universities/colleges . . . . . . .      3.0      6.0
Development:
  Federal Government . . . . . . . . .    68.0     45.2
  Industry . . . . . . . . . . . . . .    31.6     54.3
  Universities/colleges . . . . . . .       NA       NA
Research and development:
  Federal Government . . . . . . . . .    64.6     46.7
  Industry . . . . . . . . . . . . . .    33.4     50.0
  Universities/colleges . . . . . . .      1.0      2.0

SOURCE: Division of Science Resources Studies, National Science Foundation, 1985.

Prior to the postwar infusion of Federal funds, government aid to science in the universities had also been managed with a philosophy of "loose control"—sponsors of unclassified research left the researchers more or less free to conduct their research as they believed scientifically appropriate and free to disseminate their results, subject to minor supervision and general accountability. The scientists perceived any threat to their autonomy as a questioning of their authority and expertise. During World War II, of course, that autonomy had been curtailed, but after the wartime security restrictions were lifted, government control of research tended to return to the prewar management model, expressing a basic political trust in the productiveness and reliability of scientists. George Pimentel, Professor of Chemistry at the University of California at Berkeley, characterizes that post-1945 philosophy as one of "fund creative people, but don't tell them what to do."8 Especially over the last 40 years, however, the climate of unassailable autonomy has evolved into the current climate of strong economic support coupled with attentive direction. Researchers now operate in a mixed environment of incentives and restrictions,9 which often replace scientists' own professional judgments about what subjects to work on and how to proceed.

A quite different shift in emphasis has also occurred in where the research is performed and therefore in who actually does the research. In 1940, 70 percent of government-funded basic research took place in government facilities. By 1944, only 30 percent was performed in government facilities; 50 percent was performed by private firms; and 20 percent in universities.10 After the war, the pattern of funding again changed, but the distribution among performers remained similar. Industrial spending for R&D began to increase. By 1982, industry was carrying out even more of the Nation's research (72 percent); universities, 9 percent; Federally Funded Research and

8U.S. Congress, House Committee on Science and Technology, Science Policy Task Force, Hearing, Feb. 28, 1985.

9Thane Gustafson, "Survey of the Structure and Policies of the U.S. Federal Government for the Support of Fundamental Scientific Research," Systems for Stimulating the Development of Basic Research (Washington, DC: National Academy of Sciences, 1978), p. I-82.

10David Noble, The Forces of Production (New York: Alfred A. Knopf, Inc., 1984).


Development Centers, 3 percent; nonprofit institutions, 3 percent; and government labs, 13 percent.11 The impact of this shift was to extend government control of research—through grant and contract provisions—into the private sector.

Finally, the discovery during World War II—and in the subsequent U.S. nuclear program—that basic research could have considerable value for maintaining the Nation's military security led to a fourth change in how science was funded and organized. Increased Department of Defense spending led to an increased proportion of research either totally classified—and hence performed away from traditional research networks—or else having the potential for classification (or similar control) because of its potential military applications. The course of the last 40 years has also seen significant shifts in the proportion of basic research sustained by the defense agencies and, as a consequence, shifts in the climate of more or less classification of new areas of basic research.

11William C. Boesman, Science Policy Research Division, "U.S. Civilian and Defense Research and Development Funding," Report No. 83-183, Congressional Research Service, Aug. 29, 1983.

CHANGING POLITICAL CONCERNS*

Before World War II, national politics had only minimal influence on the research agenda for science and engineering. Because the Federal Government funded very little university research, for example, it did not have the administrative mechanisms for exerting influence. Only in a few selected fields, such as agriculture, did the agenda respond to political influence.12 Moreover, even if scientists wanted society to benefit from their activities, the traditions of science offered no patterns to guide them and few mechanisms through which to provide advice to society. Scientists who were distrustful of government argued that financial dependence on government might damage the "autonomy of their intellectual activities" in unpredictable ways.13

This independence was put aside temporarily during World War II, when thousands of scientists and engineers worked, either as military or civilian personnel, in such government research projects as the Committee on Medical Research (part of the Office of Scientific Research and Development (OSRD)) and the Manhattan Project. Most were required to shift to a different line of research. They conducted their inquiries under government control and with government funding. Although new and different, this relationship with government proved to be successful for both parties.

*This section benefits from work done at OTA by George Hoberg, Massachusetts Institute of Technology, in August 1984.

12André Mayer and Jean Mayer, "Agriculture: The Island Empire," Daedalus, vol. 103, summer 1974, pp. 83-96.

13Lewis E. Auerbach, "Scientists in the New Deal," Minerva, vol. 3, summer 1965, pp. 457-482.

As the war was ending, the scientists who had been administering the Federal research effort began to discuss how the science-government relationship might be sustained and structured after the war. The incentives were many. Not only had World War II fostered the creation of a formal administrative relationship between government and science but the results of scientific projects such as radar and penicillin had also demonstrated the power and potential of government-funded, government-directed science. By and large, the community of scientists and friends of science agreed on the need for a government agency to channel funding for basic research. They disagreed, however, about the institutional structure of such an organization and about who would exercise (and to what extent) political control over the research agenda.14

There were two well-defined perspectives on how the postwar relationship should be structured. The most prominent spokesman for a model of loose Federal control was Vannevar Bush, a Massachusetts Institute of Technology engineer who was Director of the wartime Office of Scientific Research and Development, President of the Carnegie Institution of Washington, and

14Noble, op. cit., p. 192.


a principal science advisor to President Roosevelt. Actively opposed to the Bush position was Senator Harley Kilgore (D-WV), who was supported by such scientists as Harold C. Urey, Edward U. Condon, and Harlow Shapley. A group within the executive branch, led by Presidential Assistant John R. Steelman, also opposed the Bush position and participated in the postwar debate on how the National Science Foundation would be structured.

Bush's perspective on control of research was most clearly articulated in the 1945 report Science—The Endless Frontier. Written at President Roosevelt's request, the "Bush report" outlined a plan for organizing science after the war. Bush wanted to create a secure funding base for American scientific research while protecting science's traditional independence in matters of agenda, procedure, and communication. Because there had been such clear separation between science and government before the war (and because the circumstances that had brought them together during the war were clearly unusual), the Bush report had to construct a basic argument for support. It found the justification in a classic American metaphor: "Basic United States policy" had traditionally been to advance all types of frontiers; thus the Federal Government must take on new funding responsibilities to assure adequate cultivation of those "areas of science in which the public interest is acute" but where private sources may not supply sufficient resources.15 "Scientific progress is essential," the Bush report stated, to wage war on disease, to assure the future of American industry, and to prevent future military conflicts.16

In its plan for how such responsibilities would be fulfilled, the Bush report provides a measure for subsequent change in the regulation and control of research. The report proposed five principles to guide the government's new role in science: 1) Whatever the extent of support may be, there must be stability of funds over a period of years so that scientists may undertake long-range research programs. 2) The agency to administer

15Vannevar Bush, Science—The Endless Frontier, a report to the President on a Program for Postwar Scientific Research (Washington, DC: National Science Foundation, 1980 (reprinted from Office of Scientific Research and Development, 1945)), p. 12.

16Ibid., p. 5.

such funds should be composed of citizens selected only on the basis of their interest in and capacity to promote the work of the agency. They should be persons who understand the peculiarities of scientific research and education, but need not be scientists. 3) The agency should promote research through contracts or grants to organizations outside the Federal Government, but should not operate any laboratories of its own. 4) Control of policy, personnel, and the method and scope of supported research should be left to the research institutions themselves. 5) And finally, the agency should be responsible to the President in that policies and procedures would be guided by the executive branch. The advocated policy was that "scientists should have control over how these funds were distributed, to ensure that the best science was supported as it had been by OSRD during the war."17 Bush was not, however, "asking for free access to the Treasury; funds expended in this way would represent only a small proportion of those spent on research and development through the mission agencies of the Executive Branch . . . ."18

Although the report acknowledged the necessity of wartime security restrictions, it advocated that, when the war was over, scientists should once again enjoy freedom of inquiry. Controls should also be lifted on scientific information—especially that related to medicine—of potential use to civilian institutions.19 Bush believed that removing the wartime controls would help to recover "that healthy competitive scientific spirit so necessary for expansion of the frontiers of scientific knowledge."20 Scientific progress, the report continued, results from "the free play of free intellects, working on subjects of their own choice, in the manner dictated by their curiosity for exploration of the unknown."21 And open publication of the research would be to the benefit of the Nation.

17Alex Roland, testimony before the U.S. Congress, House Committee on Science and Technology, Science Policy Task Force, Mar. 7, 1985.

18Ibid.
19Bush, op. cit., p. 28.
20Ibid., p. 12.
21Ibid.


One of the committees that assisted Bush, the Committee on Science and the Public Welfare,* gave strong support to the idea that traditional models of university research be preserved. University research must not be "distorted" by the government's encouragement to examine short-range problems at the expense of more fundamental problems, for ". . . the freedom of the scientist may be decreased by the introduction of some degree of commercial control."22 Society must guard science against too much control by industry as well as by government. That committee urged the new agency "to devise ways and means of allocating funds in large measure without determining what particular problems are to be worked on and who is to carry them out." "Variety" and "decentralization" foster novelty, they wrote.23

The Medical Advisory Committee voiced similar concerns. If Federal aid was "misdirected," it could do "serious harm" to the development of medical science. Therefore, the new agency's direction and policies should be administered by people "who are experienced in research and who understand the problems of the investigator."24

The government should encourage "individual initiative and freedom of research." Control that is too close (or, in the Committee's words, "regimentation") could lead to "mediocre work" and "disastrous impairment . . . of research itself."25

Industry-based research, if it was to flourish after the war, also required some special arrangements. Patent laws designed to "stimulate new inventions" would "make it possible for new industries to be built around new jobs and new processes" and would help small industries, the Bush committees asserted. Although they were concerned about the domination of markets by big industry, the committees did not support government ownership of patents;26 patent policy was

*Chairman of the Committee was Isaiah Bowman, President of the Johns Hopkins University.

22Bush, op. cit., p. 91.
23Ibid., p. 94.
24Ibid., p. 92.
25Ibid., p. 93.
26Daniel Kevles, The Physicists (New York: Alfred A. Knopf, Inc., 1978), pp. 342-344.

to be left to the discretion of the new science agency's governing board.27

These and many other of the Bush report's recommendations were subsequently incorporated in legislation (which Bush helped to draft) introduced by Senator Warren Magnuson (D-WA) in 1945.

The Bush report did not, however, represent a consensus in either the scientific or political communities. During the war, Senator Harley Kilgore held a series of hearings on post-war planning for science, before a subcommittee of the Senate Committee on Military Affairs. And in 1945, he introduced a bill which expressed his ideas for a national science foundation. He favored a director who was appointed by and much more politically responsible to the President than had been advocated in the Bush report. Moreover, Kilgore's position was that "organizations receiving funds should be free to conduct their research and development work in a manner which they think most productive, subject only to a routine supervision and review by the foundation."28

Soon after the publication of Science—The Endless Frontier, President Truman's Scientific Research Board, objecting to what they considered to be an "underlying anti-democratic sentiment" in the Bush report, issued their own report. The White House study, directed by John R. Steelman, placed the basic questions of science policy in a political context: "Public policy cannot be shaped in a vacuum and recommendations for a national policy on science must necessarily reflect many considerations but remotely connected with the laboratory."29

The "Steelman report" was just as effusive as the Bush report in its praise of the social benefits emanating from scientific advance and in the tone of its underlying rationale for support of science: "It is difficult to think of any other national activity which more directly benefits all the people or which makes a larger contribution to the national welfare and security."30

But, despite agreement on the need for a significant Federal role in funding basic science and in training scientific personnel, the reports differed on the organization of funding and on the control of the research process. The Steelman report did not object to funding research in government laboratories; it recommended maintaining the extant distribution of funds among universities, industry, and government labs31 and it recommended that the new agency's director be appointed by and responsible to the President.

27Merton J. England, A Patron for Pure Science (Washington, DC: National Science Foundation, 1982), p. 14.
28Penick, et al., op. cit., vol. 1, p. 9.
29Steelman, op. cit., vol. 1, p. 6.
30Ibid., vol. 1, p. 26.

The Steelman report also concluded that government security regulations should not be applied widely but instead should be applied "only when strictly necessary and then limited to specific instruments, machines or processes. They should not attempt to cover basic principles or fundamental knowledge."32 In the conclusion to Volume I, "A Program for the Nation," the report states:33

. . . it is sometimes argued that . . . the world is in its present state because the physical sciences have developed too rapidly and have unleashed forces too strong for us to control. It has even been suggested that a moratorium should be called in science, while we catch our breaths.

This is a doctrine of weaklings and of men of little faith in the ultimate capacity of our people. There can never be too much knowledge, though it can be discovered at uneven rates in various fields. The cure is not to slow down the runner who is ahead—but to extend a helping hand to those who are behind.

The differences between the Bush and Steelman reports represented more than the usual political disagreements about the administration of a new agency. They reflected fundamentally different conceptions of the relationship between government and science. The political perspective represented by the Kilgore hearings and the Steelman report regarded science as a special interest. Although large-scale government support for science was a new phenomenon, science was not considered to be sufficiently different from other policy areas to warrant any special political relationships.34 The characteristics of science were not believed to "justify a departure from our traditions of democratic government or from tested principles of administrative organization,"35 including the principles of close accountability and avoidance of the concentration of power.

31Ibid., vol. 1, p. 27.
32Ibid., vol. 3, p. 37.
33Ibid., vol. 1, p. 68.
34Noble, op. cit., p. 15.

The conservative view represented by the Bush report regarded government intervention as a potential threat to scientific liberty,36 an attitude viewed by some as reflecting a lack of faith in the competence of government administrators. But the Bush report supporters were also convinced that science was distinct from other types of government programs, that it must be free from political control, and that, to be successful, scientists should be able to direct their own affairs. Non-scientists might administer the foundation, but it would be the scientists who, through advisory groups and a system of review by scientific peers, would decide how research should be conducted and would influence the research agenda. This demand to have "support without control," according to one commentator, amounted to "bestowing upon science a unique and privileged place in the public process—in sum, for science governed by scientists, and paid for by the public."37

On July 22, 1947, Congress passed legislation (S. 526, National Science Foundation Act of 1947, 80th Congress, 1st session) to establish a National Science Foundation (NSF). This legislation contained no patent provisions, no authority for support for the social sciences, no mechanisms for geographical distribution, and a large degree of autonomy from Presidential control.38 The governing structure was the most important point of argument, however. When President Harry Truman vetoed this first NSF legislation, he objected primarily to the bill's provisions for lack of political control. In his veto message, Truman stated:39

. . . this bill contains provisions which represent such a marked departure from sound principles for the administration of public affairs that I cannot give it my approval. It would, in effect, vest the determination of vital national policies, the expenditure of large public funds, and the administration of important governmental functions in a group of individuals who would be essentially private citizens. The proposed National Science Foundation would be divorced from control by the people to an extent that implies a distinct lack of faith in democratic processes.

35 Steelman, op. cit., vol. 1, p. 31.
36 England, op. cit., pp. 35-36.
37 Daniel S. Greenberg, The Politics of Pure Science (New York: New American Library, 1967), p. 107. There was precedent for this arrangement in the National Advisory Committee on Aeronautics (NACA), of which Bush was chairman in 1939. See James Killian, Sputnik, Scientists and Eisenhower (Cambridge, MA: The MIT Press, 1977).
38 England, op. cit., pp. 78-80.
39 Congressional Record, vol. 93 (Washington, DC: U.S. Government Printing Office, Nov. 17, 1947), p. 10568.

Three years later, after extended debate and political maneuvering, another bill was passed by Congress (National Science Foundation Act, May 10, 1950, 64 Stat. 149) and signed by President Truman. This bill represented a compromise between the opposing political groups, but probably reflected the preferences of a substantial portion of the scientific community. The director of the NSF would be appointed by the President, and the bill included a mandate for evaluating and coordinating all Federal research efforts. It also provided that these responsibilities be shared with a part-time National Science Board, organized along the lines suggested by Bush. The bill did not change patent granting procedures.

The Foundation's first director, Alan Waterman, was previously the chief scientist at the Office of Naval Research (ONR). He considered any "centralized evaluation of Federal research impossible and inappropriate."40 His experience at ONR undoubtedly influenced the shape he gave to the new foundation, for that agency had maintained an unusual contract research program, especially in basic research. Established by an Act of Congress in 1946, ONR was to "provide scientific liaison with the War Department and with that novel and highly effective civilian organization, the Office of Scientific Research and Development."41

The philosophy that guided ONR was best understood in its view of the basic researcher as one "motivated by curiosity and interest in science rather than applicability," and the administrator as influenced by the agency's "practical mission." The key was to keep these perspectives separate: "In this way selected mission-related basic research may be supported . . . without controlling or disturbing the aim of the investigator or the course of the research."42 The ONR contracting system extended the traditional military R&D contracting with industry to research establishments, particularly academic institutions, thereby enabling the government to utilize the most skilled scientists and engineers available to do weapons research.

40 Kevles, op. cit., p. 360; England, op. cit., p. 149.
41 Alan T. Waterman, "Pioneering in Federal Support of Basic Research," Research in the Service of National Purpose: Proceedings of the Office of Naval Research Vicennial Convocation, F. Joachim Weyl (ed.) (Washington, DC: Office of Naval Research, 1966), p. 3.

After several years of debate between the Budget Bureau, NSF, and other agencies over NSF's role in Federal science policy, on March 19, 1954, President Eisenhower issued Executive Order 10521, which established the new agency's role.43 NSF's role in policy development and evaluation was to be "cooperative rather than . . . regulatory." NSF was not made the principal Federal sponsor of basic research; instead, the order sanctioned a pluralistic system of Federal support. It encouraged other agencies to sponsor basic research that was "closely related to their missions."44

The Order declared that one of the purposes for NSF's establishment had been to develop and encourage pursuit of an appropriate and effective national policy for the promotion of basic research and education in the sciences. From time to time, NSF would recommend to the President Federal policies that would strengthen the national scientific effort, and it would furnish guidance toward defining the responsibilities of the Federal Government in the conduct and support of scientific research. The Foundation, in concert with each Federal agency concerned, would review the scientific research programs and activities of the Federal Government in order to formulate methods for strengthening the administration of such programs and activities by the responsible agencies; it would study areas of basic research where gaps or undesirable overlapping of support might exist; and it would make recommendations to the heads of agencies concerning the support given to basic research.

42 Warren Weaver, quoted in Weyl, op. cit., p. 5.
43 England, op. cit., ch. 10.
44 Ibid., ch. 15.


Sharp divisions over the political control of research were not unique to the debate on the National Science Foundation. In the late 1940s and 1950s, intense debate preceded the creation of both the Atomic Energy Commission (AEC) and the National Aeronautics and Space Administration (NASA), debates that also focused on patent policies, communication restrictions, and mechanisms for political control of each agency.

The legislation creating the Atomic Energy Commission—the Atomic Energy Act of 1946 [Public Law 585]—gave the Federal Government "an absolute monopoly over all aspects of atomic energy research, development, and production," including provisions to control the dissemination of data related to atomic weapons and the production or use of fissionable material.45 This tight Federal structure essentially removed control of even peacetime atomic energy research, or research directed at civilian power applications, from the scientific community that had developed the research field in the first place. Moreover, it created a situation in which all atomic weapons or atomic energy information was "born classified." As political analyst Harold Relyea and others have pointed out, these provisions meant that no special governmental effort was necessary to bring such information under the statute's "umbrella of secrecy."46 The Act also prohibited the issuance of patents for inventions useful in the production or utilization of fissionable material. Although these patent provisions were relaxed somewhat in the subsequent Atomic Energy Act of 1954, and although that revision also authorized the controlled involvement of private industry in nonmilitary atomic technologies, many of the most stringent controls on research initiated by the original Act—including those on who may do such research or have access to technical information for such research—remained in force.

The initial legislation in 1958 to create the National Aeronautics and Space Administration contained language that would have created a much looser policy on patents for that agency. But draft bills in both the House and the Senate, which modeled their patent provisions on the Atomic Energy Act, would have enabled the government to maintain ownership of patents generated by NASA-funded research. The final legislation gave patent ownership to the government, but also gave the administrator of NASA the authority to waive title. Similar discussions and debates over the political control of research or of research products took place during the development of other Federal agencies and programs.

45 Harold C. Relyea, "Information, Secrecy, and Atomic Energy," New York University Review of Law and Social Change, vol. 10, No. 2, 1980-1981.
46 Ibid., p. 269.

Another critical outcome of the postwar support of science was the burgeoning of the National Institutes of Health (NIH), which before the war had been a small "oldline" Federal health research organization. The Public Health Service had been created in 1912 to increase biomedical research directly related to large public health problems. At the end of the 1920s, an effort to establish NIH promoted a stronger Federal role in the encouragement of research, and in 1930, the Ransdell Act expanded and redesigned the Hygienic Laboratory of the Public Health Service into NIH. Public and congressional desire to find a cure for cancer resulted in the creation of the National Cancer Institute in 1937.

During World War II, advances in biomedical research had helped to demonstrate dramatically the effects of Federal funding of biomedical research. This success reinforced intensive lobbying during the 1950s, by public interest groups and a number of powerful individuals, for aggressive NIH-funded research focused on specific health problems. Congress often responded to this pressure by identifying and funding research areas that had broad public appeal, but that were scientifically misunderstood. The director during this period, James A. Shannon, was able, however, to mediate between the call for targeted research and the need for basic medical research; he persuaded Congress that funding of both basic and applied research was essential to reach the goals of diagnosing and curing disease.

Reflecting on the sweep of events during the formation of the current bureaucratic arrangements for science, Don K. Price has observed that the scientists engaged in constructing these arrangements adopted a three-part tactic to avoid, in particular, the constraints in choice of topic which had characterized pre-war agricultural research.47

47 Price, op. cit., p. 77.


First, they sought to combine research with university teaching. They regarded such an arrangement as "the best way of strengthening basic research in the one setting most free of commercial self-interest or political pressure—the university,"48 thereby obtaining a stable base from which to defend science's independence against "popular passions or economic self-interest." Second, they focused on the mechanism of the project grant, because it "offered a tactic to avoid detailed congressional control of funds" and also allowed Federal support to universities "without adopting a general program of aid to higher education."49 And third, the pattern of organization proposed by the scientists gave them a political authority not dependent on popular votes.50 In many cases, they gained this control through a growing system "of policy planning by part-time advisers under government grants and contracts." But as Price notes carefully, the authority gained by the scientists was not the type defensible as a Constitutional right; rather, it was a delegated authority. "[I]t depended on the continued confidence among elected politicians in the assumption on which the tacit bargain was founded—that basic research would lead automatically to fruitful developments."51

48 Ibid., p. 78.
49 Ibid.
50 Ibid.
51 Ibid., p. 80.

THE 1960s: PUBLIC CRITICISM OF SCIENCE

In the 1960s, questioning of several of these basic assumptions began to shape a new political receptivity to science. More and more questions were raised about the negative effects of scientific knowledge, including both its informative value and its use as technology. News reports of callous abuse of human subjects in scientific experimentation led to political calls for increased social accountability. Vigorous criticism of science came from a number of quarters: intellectual and theological questioning of the philosophical foundations of science; the linking of science with war (which came out of the protest against the Vietnam war and nuclear escalation); concerns about science's "technological side effects" on the environment; and ethical questions about research procedures. As a 1971 Organisation for Economic Co-Operation and Development report, Science, Growth, and Society, observed, "Scientific research itself became associated in the minds of many with war, and with environmental and social deterioration resulting from the large-scale application of technology."52

It is important to recognize, however, that theological or political efforts to control or regulate research are neither unique to the United States nor new. In 1927, for example, an English cleric suggested that "every physical and chemical laboratory be closed for about ten years to enable society at large to assimilate the staggering amounts of new scientific knowledge." Although he reportedly spoke partly in jest, the shock of such a suggestion produced considerable reaction in the United States as well, and the Bishop's remark became the stimulus for debate over the "primacy of ends over means" and the moral depth of science.53 In the 1930s, many scientists expressed their apprehension about "anti-intellectuals who wish to impose ideological or theological constraints on research,"54 and humanists and theologians voiced their concern that science was like an engine out of control. Many of these same concerns cropped up again in the 1960s in several widely circulated intellectual criticisms of the scientific establishment and social values, such as Herbert Marcuse's One-Dimensional Man (1964).

52 Science, Growth and Society (Paris, France: Organisation for Economic Co-Operation and Development, 1971).

An early example of the 1960s questioning was the controversy that arose over Project Camelot.55

53 Carroll Pursell, "'A Savage Struck by Lightning': The Idea of a Research Moratorium, 1927-37," Lex et Scientia, vol. 10, October-December 1974, pp. 146-158.
54 Price, op. cit., p. 76.
55 For more extensive discussion of Project Camelot, see Technical Information for Congress, report to the U.S. Congress, House Committee on Science and Technology, Subcommittee on Science, Research, and Technology, July 1979, pp. 145-179. Also see Irving L. Horowitz, "The Life and Death of Project Camelot," Trans-action, November-December 1965, pp. 4-10.


In his 1961 message on the defense budget, President John F. Kennedy, motivated by the first Cuban crisis and growing instability in some developing countries, had proposed to increase the U.S. capability in dealing with "guerilla forces, insurrections, and subversion," by strengthening military resources of anthropological, cultural, and other social science data in relevant geographic regions. The result of this proposal was Project Camelot, a Department of Defense (DOD) project in applied research in the social sciences. The project would have attempted to study the political, economic, and social preconditions of instability and potential Communist usurpation of power in several developing countries. Political reaction to the project, however, was strong and significantly negative. Congress opposed DOD intrusion into foreign policy and the military takeover of foreign policy research, and feared the potential damage to foreign relations with Latin American countries. Social scientists were concerned about military sponsorship of social science research and, more generally, about the relationship between the Federal Government and the social science community in the utilization of social science research and data in serving national purposes. As a result of the controversy, Project Camelot was eventually suspended.

Later in the decade, as the universities became the institutional arena for protest against the Vietnam War, some of that activity was directed against university involvement in scientific research supported by the Department of Defense. Boycotts and petitions were spearheaded by Scientists and Engineers for Social and Political Action, later renamed Science for the People. One of its founders, Charles Schwartz, described the animating beliefs of this group as a reaction to the "specific corruption of science": "Science as a whole is being abused by the powerful political, industrial and military interests, and we are all losers."56

This political movement was strengthened when scientists at over 30 schools, following the lead of scientists at Harvard University and the Massachusetts Institute of Technology, held a work stoppage for one day on March 4, 1969, interrupting their research to protest the war and the use of science for military purposes.57 At some institutions, the result of such activities was that classified weapons research was transferred to laboratories that were off-campus and separately administered.

56 Penick, et al., op. cit., p. 430.

Concerns about military domination of academic science were not confined to campus activists and scientists. Senator J. William Fulbright proclaimed in a 1967 Senate speech:58

The universities might have formed an effective counterweight to the military-industrial complex by strengthening their emphasis on traditional values of our democracy, but many of our leading universities have instead joined the monolith, adding greatly to its power and influence.

Acting on these concerns, Congress passed an amendment in August 1969 to the Defense Authorization Bill of 1970 which included language prohibiting DOD from funding basic research not directly related to a specific military function or operation. Called the "Mansfield Amendment" after Senator Mike Mansfield (D-MT), who was a cosponsor and one of its most outspoken defenders, Section 203 of Public Law 91-121 sought to realign the funding patterns for basic science:59

The intent of the provision is clear. It is a mandate to reduce the research community's dependence on the Defense Department when it appears that the investigation under consideration could be sponsored more reasonably by a civilian agency. After all, the National Science Foundation was created by Congress back in 1950 specifically to channel federal funds into basic research.

Mansfield proclaimed that the amendment was "neither anti-military nor anti-research"; rather, its intention was to reinforce the role of the NSF as the "primary source" of basic research funds, because the role of DOD in sponsoring basic research was "intended to be incidental rather than predominant."60

57 Jonathan Allen (ed.), March 4: Scientists, Students, and Society (Cambridge, MA: The MIT Press, 1970).
58 David Dickson, The New Politics of Science (New York: Pantheon, 1984).
59 U.S. Congress, House Committee on Science and Astronautics, Subcommittee on Science, Research, and Development, "National Science Policy," Hearings on H. Con. Res. 666, U.S. House of Representatives, 91st Cong., 2d sess., 1970.
60 Penick et al., op. cit., pp. 344-346. Also see Rodney W. Nichols, "Mission-Oriented R&D," Science, vol. 172, Apr. 2, 1971, pp. 29-37.


The direct and immediate effect of the Mansfield Amendment was not very great. It was legally in effect for only 1 year and was not renewed. Only 220 of the 6,600 research projects that were reviewed were affected by the amendment, involving a total of $8.8 million, or only 4 percent of Defense funds for academic research. In the following year, the amendment's language was changed from "direct and apparent relationship" to military needs, to "in the opinion of the Secretary of Defense, a potential relationship."61 Nevertheless, the Mansfield Amendment did signify a change in policy toward the support of basic science, and in the words of former Presidential science advisor Edward E. David, Jr., "its influence has continued to be felt throughout the Department of Defense . . . [and] it has drastically reduced the willingness of many other Federal agencies to fund basic scientific work that cannot be clearly related to their current missions."62

The amendment appears to have created a climate of greater caution and uncertainty in making research grants, not only in DOD but in other agencies.

By the 1960s, the Federal agencies were also playing an active role in shaping the Nation's environmental future, often without any clear statement of national environmental policy. The U.S. Bureau of Reclamation, the U.S. Corps of Engineers, the U.S. Department of Agriculture, the Federal Highway Administration, and similar agencies had been reshaping the landscape.63 Their actions, however, were not always benign; some appeared to result in inadvertent, unanticipated degradation of the environment, destruction of plant or animal species, or chemical contamination of waterways. Although some of the environmental degradation may have actually been due more to deployment and expansion in the scale of old technologies, or due to actions taken to accomplish political goals, the negative effects were often blamed on science and technology.

61 Dickson, op. cit., p. 122.
62 Edward E. David, Jr., "The Federal Support of Mathematics," Scientific American, vol. 252, May 1985, p. 45.
63 Lynton K. Caldwell, Science and the National Environmental Policy Act: Redirecting Policy Through Procedural Reform (University, AL: The University of Alabama Press, 1982), p. 8.

A legislative result of this concern was the passage of the National Environmental Policy Act of 1969 (NEPA), which focused on the Federal Government's role in shaping and protecting the environment. The legislation signaled a need to anticipate environmental impact, and to do some of that anticipation through research. Lynton Caldwell has observed that "the task set for its authors was to redirect national policy toward the environment," but "the method was procedural reform," including the instigation of research.64

In attempting to make Federal agencies accountable for "actions that significantly affected the quality of the human environment,"65 NEPA requires Federal agencies to prepare environmental impact assessments for all major actions significantly affecting the environment,66 and it creates administrative requirements that agencies either cite research knowledge as evidence for decisions or, as necessary, commission and conduct research of their own. NEPA, Title I, Section 102 mandates agencies to "utilize a systematic, interdisciplinary approach which will insure [sic] the integrated use of the natural and social sciences and environmental design arts in planning and decisionmaking."67 One of the responsibilities of the Council on Environmental Quality created by NEPA was "to conduct investigations, studies, surveys, research, and analyses relating to ecological systems and environmental quality."68 The Council on Environmental Quality later specified in new regulations that "if scientific uncertainty exists but can be cured by further research, the agency must do or commission the research."69

In recent years, the courts have taken a more active role in requiring the agencies to fulfill this mandate.

The increased environmental regulation of industry—as well as other social legislation—had a number of unplanned effects on the scientific research system.* In the late 1960s and early

64 Ibid., p. 9.
65 Ibid.
66 Mark Reeve, "Scientific Uncertainty and the National Environmental Policy Act—The Council on Environmental Quality's Regulation 40 CFR Section 1502.22," Washington Law Review, vol. 60, No. 1, 1984-1985, p. 101.
67 Caldwell, op. cit., p. 12.
68 National Environmental Policy Act, Title II, Section 205(5).
69 Reeve, op. cit., p. 101.

*To be discussed in more detail in ch. 5.


1970s, Congress created the Environmental Protection Agency, the Occupational Safety and Health Administration, the National Highway Safety Commission, the Consumer Product Safety Commission, the Mining Enforcement and Safety Administration, and the Equal Employment Opportunity Commission. In addition, the jurisdiction and enforcement powers of several existing Federal agencies (such as the Federal Trade Commission) were expanded. This explosion in protective regulation can be attributed to many things, such as changes in the underlying technology of industry or changes in perception about what constitutes potential risk, combined with increasing awareness and growing intolerance of risks. Traditional protections, especially market and liability laws, seemed inadequate to encourage socially responsible behavior. Moreover, the scientific research system—as it participated in the regulation—was becoming increasingly visible and more federally dependent. In addition to its use in forming regulatory policy, science was expected to comply with protective regulations and social programs originally directed at industry or the professions.

Perhaps the most significant science policy debate of the 1960s—in its long-term effects on public and political attitudes toward research and in resulting regulation—surrounded the use of human subjects in scientific experiments. The U.S. mass media had in the 1960s carried a number of reports about unsavory situations—here and abroad—in which prisoners, children, the poor, and the elderly were exposed to unwarranted risks in the name of "experimentation." The issue was politically volatile, and it touched on fundamental questions of who should set the standards for control of scientific research. In a 1966 article that captures the spirit of that debate, Henry K. Beecher wrote:70

. . . it is absolutely essential to strive for [informed consent to experimentation] for moral, sociologic and legal reasons. The statement that consent has been obtained has little meaning unless the subject or his guardian is capable of understanding what is to be undertaken and unless all hazards are made clear. If these are not known, this, too, should be stated. In such a situation the subject at least knows that he is to be a participant in an experiment. . . . Ordinary patients will not knowingly risk their health or their life for the sake of "science."

70 Henry K. Beecher, "Ethics and Clinical Research," The New England Journal of Medicine, vol. 274, June 16, 1966, p. 1360.

Prior to 1963, investigational or experimental new drugs, for example, could be used in research involving human subjects if the drugs were labeled and intended solely for investigational use.71 The Food and Drug Administration (FDA) had no direct control over the drugs, the investigators, or the research to be done; "there was no requirement that a patient be told that he or she was to receive an investigational drug."72 That autonomy changed in 1963 when FDA, in response to the 1962 Drug Amendments (known as the Kefauver-Harris Amendments), issued Investigational New Drug regulations requiring that "any person or manufacturer seeking to study a new drug in human subjects . . . prepare and present to the FDA an acceptable plan for the investigation."73 In 1966, the U.S. Public Health Service began to require all institutions (e.g., universities, commercial laboratories) to which it made grants to establish boards to review investigations involving human beings: ". . . to safeguard the rights and welfare of research subjects, to ascertain whether the methods used to gain their consent were appropriate, and to evaluate the risks and benefits of the experiment."74 One analyst has characterized events of this time as the "legalization of ethical choices."75 By the early 1970s, several legislative and administrative actions had attempted to implement such safeguards. Requirements for institutional review of research involving human subjects had gone from a matter of agency policy to one of Federal law. The institutional review boards required for each institution receiving Department of Health and Human Services (DHHS)

71 Alexander M. Schmidt, "The Politics of Drug Research and Development," The Social Context of Medical Research, Henry Wechsler (ed.) (Cambridge, MA: Ballinger Publishing Co., 1981), p. 243.
72 Ibid.
73 Ibid., p. 253.
74 Stanley Joel Reiser, "Human Experimentation and the Convergence of Medical Research and Patient Care," Annals of the American Academy of Political and Social Science, vol. 437, May 1978, p. 18.
75 Frank P. Grad, "Medical Ethics and the Law," Annals of the American Academy of Political and Social Sciences, vol. 437, May 1978, pp. 19-36.


funds became the principal means for enforcing national political and social expectations. The intent was to encourage researchers and the grantee institutions to self-regulate so that detailed Federal Government regulations might not be necessary.76

76 Dael Wolfle, Emeritus Professor, Graduate School of Public Affairs, University of Washington, personal communication, 1985.

THE “LIMITS TO INQUIRY” DEBATE

These various controversies, protests, and political debates took their toll on both the complacency and the autonomy of the scientific community. And in the 1970s, many researchers themselves became actively engaged in an intense debate revolving around social accountability and the acceptability of limits on scientific inquiry. The 1960s "human subjects" debate was by no means resolved and had stimulated regulation of research procedures at both the Federal and State levels. Advances in molecular biology raised new questions about the risks of genetic manipulation. Social surveys of public opinion were showing that the American people did not have an unqualified faith in science and were willing to support some controls on the research process. And, finally, general political calls for increased fiscal accountability in government accounting led some politicians to focus attention on shortcomings in the research grants and contracts system. These and many other issues and controversies became the fodder for discussions throughout the scientific community—in journals and at meetings—about science's social responsibility, about ethical behavior of researchers, and about the appropriateness of limitations on scientific inquiry.

A central focus for one of the debates was how to balance the potential risks and benefits of the “new” biology. The controversy was heightened by two factors: the rapidity of advances in the research, and the connections—often pointed out by the researchers themselves—between the potential applications of the research and public policy. When, for example, a research team at Harvard Medical School successfully isolated a human gene in 1969, a scientific frontier with unusual potential had been extended, but biologists on that team also recognized that misuse of the techniques of genetic manipulation would be undesirable. A member of the team, Jon Beckwith, in fact, publicly voiced his concern over undesirable side effects. As molecular biologists began to develop exciting laboratory techniques for manipulating and recombining DNA across species barriers, more and more biologists began to discuss the potential outcomes. These discussions led to a dramatic example of self-regulation by the scientific community.

In 1973, immediately following a major research conference, biologists Maxine Singer and Dieter Söll wrote a letter to Science77 in which they appealed to the National Academy of Sciences to establish a committee to study various problems of recombinant DNA research and to recommend specific actions or guidelines in the light of potential hazards.78 That committee recommended the instigation of a voluntary moratorium on certain forms of rDNA research and the formation of what later became a national committee to review proposals for research using these techniques, the NIH Recombinant DNA Molecule Program Advisory Committee, organized in 1974. Molecular geneticists who voluntarily imposed the moratorium asked others in the world to do likewise. Fears that flaws in these techniques might allow ecological disaster or create “new diseases” were also the impetus for proposals for regulation at State and local levels.79 In February 1975, however, an international group of biologists, meeting at the Asilomar Conference Center in Pacific Grove, California, agreed that the voluntary moratorium be lifted and that future research be conducted under a set of rigid guidelines to be developed by the NIH Advisory Committee.

77 “Letters to the Editor,” Science, vol. 181, Sept. 21, 1973.
78 Daniel Callahan, “Recombinant DNA: Science and the Public,” Hastings Center Report, vol. 7, April 1977, p. 20.
79 Judith A. Johnson, “Regulation of Recombinant DNA Products,” Issue Brief 85090, Library of Congress, Congressional Research Service, Science Policy Research Division, Apr. 3, 1984; and Sheldon Krimsky, Genetic Alchemy: The Social History of the Recombinant DNA Controversy (Cambridge, MA: The MIT Press, 1982).


The justification for that moratorium—and for subsequent regulation of the research—is discussed in chapter 3, but it is important to emphasize that, in this case, researchers were generally supportive of formal government commissions and legislation; most accepted some regulation as inevitable and realized the importance of shaping the controls to fit their research needs. There was also relatively little public input to the early stages of the debate.80 The result of the national scientific debate and congressional attention was the implementation in 1976 of a National Institutes of Health document, Guidelines for Research Involving Recombinant DNA Molecules, which: 1) imposed restrictions on the types of experiments that might be performed at NIH grantee institutions, and 2) specified minimum levels of physical and biological containment for permissible recombinant DNA experiments. In 1977, in compliance with the National Environmental Policy Act, NIH adopted an environmental impact statement for the 1976 Guidelines.

At about this same time, a number of public opinion surveys appeared to be indicating a decline in the public’s traditionally high support for and confidence in science.81 (See app. B for a general discussion of public attitudes toward science.) Some survey data indicated that more and more non-scientists were inclined to question science’s traditional autonomy or to express a lack of confidence in science’s ability to solve social problems through research. People seemed to confuse science with technology, and to see science “in a very technological, instrumental light.”82 As a part of its new Science Indicators series, the National Science Board (NSB) decided to include a chapter on public attitudes toward science. Using data from Opinion Research Corporation surveys in 1972, 1974, and 1976, the chapters of the NSB reports described a public that, although still holding science and technology in high regard, was much less supportive than it had been in 1957, the date of the last previous comprehensive survey. While 90 percent of the public thought that the world was “better off” because of science in

80 Callahan, op. cit., p. 20.
81 Amitai Etzioni and Clyde Nunn, “The Public Appearance of Science in Contemporary America,” Daedalus, vol. 103, summer 1974, pp. 191-206.
82 Ibid., p. 203.

1957, only 70 percent of the public held the same view in 1972.83 Similar results were obtained in the 1974 and 1976 studies.84 The percentage of those willing to say that the world was worse off because of science did not increase significantly, but the percentage of persons who were uncertain, undecided, or felt that things were about equal did increase substantially over the 15-year period spanned by the four studies.

In a 1979 national study also sponsored by the National Science Board, several of the 1957 questions were repeated, offering an opportunity for comparison across two decades. In 1979, 81 percent of the public still agreed that scientific discoveries were making their lives “healthier, easier, and more comfortable” and 86 percent expressed the view that scientific discoveries were “largely responsible” for the standard of living in the United States.85 In a comparable national study in 1983, Jon D. Miller found that 85 percent of American adults continued to agree that science made their lives healthier, easier, and more comfortable.86 Contradictory evidence, however, was provided by other surveys sponsored by NSB, which had asked respondents to assess the relative benefits and harms of science and to weigh the two.

The data from the 1970s thus indicated that although only about one in 20 Americans believed that science does more harm than good, about one-third were not sure where the balance fell. Some of this uncertainty may have reflected a wary attitude toward science; some may have been due to a lack of interest or information. In the 1970s “limits of inquiry” discussion, however, the scientists found the potential “wariness” frightening.

If there was a significant change in public opinion, why did it occur? Some argue that respect

83 Science Indicators—1972 (Washington, DC: National Science Board, 1973).
84 Science Indicators—1974 (Washington, DC: National Science Board, 1975); and Science Indicators—1976 (Washington, DC: National Science Board, 1977).
85 Jon D. Miller, et al., The Attitudes of the U.S. Public Toward Science and Technology (Washington, DC: National Science Foundation, 1980).
86 Jon D. Miller, “A National Survey of Adult Attitudes Toward Science and Technology in the United States,” Annenberg School of Communications, University of Pennsylvania, Philadelphia, 1983.


for scientists diminished because the scientific establishment became “identified with the general power structure”; others believe that it was because of “an exchange of roles between science and religion in relation to the stability of the prevailing political system.”87 Some senior scientists, unaccustomed to public criticism, sincerely believed at the time that scientific values and traditions were under serious attack.

Many linked the change to science’s new status as a visible target in the Federal budget. Congressional attitudes—which were moving away from relatively unquestioning support—may have been influenced by the social discussion. Some criticism was undoubtedly prompted by the scientists’ own doubts about “the omnicompetence of science in human affairs.” But political scientist Don K. Price speculated in 1972 that the politicians’ questioning resulted most “from the normal disposition of anyone who lends or grants money to want to know what use is being made of it, and whether the terms of the bargain are being kept.”88

Such normal policy questions received widespread publicity when, in 1975, Senator William Proxmire (D-WI) launched an unprecedented attack against National Science Foundation funding of social science research. Proxmire, then Chairman of the Senate Appropriations Subcommittee that reviews the NSF budget, established the “Golden Fleece of the Month” awards to illustrate what he regarded as examples of waste in the government—occasionally attacking projects that he asserted were, “at best, of nominal value to the American taxpayer.” Scientists, angered at the attack, countered that projects with obscure titles and subjects may nevertheless deal with relevant and important problems. They feared a dangerous precedent if immediately applicable science was perceived to be the only worthwhile science.

The political momentum of the “Golden Fleece” awards was eventually slowed when a behavioral scientist who had received such an award filed suit against Proxmire, arguing that, as a result of Proxmire’s actions, he had suffered a loss of respect in his profession, was “held up to public scorn,

87 “Limits of Scientific Inquiry,” Daedalus, spring 1978, p. vii.
88 Price, op. cit., p. 80.

and suffered a loss of income and ability to earn income in the future.” A Federal district court in Madison, Wisconsin, granted a summary judgment in Proxmire’s favor on the grounds that he enjoyed absolute immunity under the Speech and Debate Clause of the Constitution; but in 1979 the Supreme Court held that the researcher was not a public figure simply by virtue of receiving Federal funding, and that congressional immunity did not extend to statements made outside Congress.

Despite a lessening of political criticism following the Supreme Court decision, the scientific community reacted as if the integrity of science in general had been questioned. Many researchers seemed to be underestimating the demand for accountability inherent in acceptance of public funding.89 “Those [scientists] who came of age during the fifties and sixties,” Robert S. Morison observed in the 1970s, “may never quite understand why they have suddenly become ‘accountable’ to a ‘participatory democracy’.”90

Insensitivity may not be the entire explanation, however, for public perceptions of science in general were also changing. Only a decade before, science had had unquestioned social authority in the culture and its research funding was ample, growing, and relatively easily acquired; by the 1970s, research was being conducted in a social climate that admitted scientific authority to questioning and in which other demands were biting into science’s portion of the Federal budget. Scientists who had been trained before the war probably also could not have imagined the extent of regulation of the research process which had begun to occur.

Some scientists who took part in these discussions reacted negatively to the proposal of citizen participation in what were traditionally considered to be “scientific” matters.91 “How,” one commentator asked, “can public participation be arranged without clashing with the very meaning of science as a consensual activity among

89 Robert S. Morison, “Commentary on ‘The Boundaries of Scientific Freedom’,” Newsletter on Science, Technology, & Human Values, June 1977, pp. 22-24.
90 Ibid., p. 24.
91 “Limits of Scientific Inquiry,” op. cit., p. viii.


trained specialists?”92 Scientists who had previously assumed total control over their research now talked about being at the “mercy of citizens’ groups” who were seeking input to the decisionmaking process.93

Despite this reaction—or perhaps because of it—many leaders of the scientific community appear to have believed that public scrutiny inevitably implied restrictions. According to Dorothy Nelkin, by the 1970s, it was no longer a question of whether there would be public control, but of who would participate, how control would be organized, and how much influence the public would have on research decisions.94 The discussions then moved to consideration of how the situation could be shaped to “protect” basic science, and to when and how much the public representatives would actually be involved.95

The scientists, philosophers, and policy analysts were not, however, always in agreement about the question under debate. Some regarded it as a debate over “limits to free scientific inquiry,” viewing proposed regulation as an attempt to inhibit researchers’ freedom to pursue intellectual inquiry. Others began to frame the debate in terms of funding priorities. Andre Hellegers once made this point forcefully, arguing that freedom of inquiry was not under assault because science was, in fact, “royally” funded.96 Hellegers cited the two central issues for science as: 1) “Given finite resources, how much should the public invest in an enterprise such as science?”; and 2) “How far (if at all) should any enterprises, in the name of freedom of inquiry, be allowed to infringe on the freedom of others?”97 Should low priority be given to those activities that have adverse consequences?98

To Hellegers, the real topic of importance was “the ordering of priorities in things which affect both science and human values.”99

92 Ibid., p. 232.
93 Ibid., p. viii.
94 Dorothy Nelkin, “Intellectual Property: The Control of Scientific Information,” Science, vol. 216, May 14, 1982, p. 207.
95 Andre Hellegers, “The Ethical Dilemmas of Medical Research,” and Barry Casper, “Value Conflicts in Restricting Scientific Inquiry,” in Regulation of Scientific Inquiry, Keith M. Wulff (ed.) (Boulder, CO: Westview Press, 1979).
96 Ibid., p. 9.

Don K. Price has observed that in the 1970s scientists were often inclined to blame their problems on politicians. This tendency was exacerbated by historic differences in the outlooks of the two groups. Scientists and politicians operate in different time frames—Congress in the short term and scientists in the long term. Conflicts often arise from the imposition of a political paradigm onto the agenda of the scientific community. But Price believes that the problems themselves evolved from three other factors. First, the political strategy for the support of science was devised by scientists themselves and was based on the experience of private philanthropy before World War II. Second, the political authorities had accepted science “as the dominant intellectual approach to public issues, which scientists and other liberal intellectuals agree must therefore be regulated in the public interest.” And third, the U.S. constitutional structure is “too decentralized to sustain the integrated and long-term view of public policy which might justify the support of science as an intellectual and educational enterprise.”100

The academic debate over “limits to scientific inquiry” can be seen as a response to social pressures, to events and progress within science, and to the scientists’ fear that public support for science was declining. The academic scientific community believed that it was necessary to defend the very core of science—which they perceived as under attack. They saw the humanists’ criticism and the attempts to regulate as threats to the legitimacy of modern science.

97 Ibid., p. 11.
98 Ibid., p. 22.
99 Ibid., p. 29.
100 Price, op. cit., p. 76.


RECENT RESTRICTIONS ON SCIENTIFIC COMMUNICATION

From the earliest days of the Nation, Federal policy has largely been supportive of open communication, free exchange of information, and wide publication in scientific research.101 From time to time, however, recognition has been given in Federal law to “circumstances that constitute what have been thought to be obvious and compelling reasons for imposing official secrecy on research or restrictions on the dissemination of certain kinds of research findings.”102 For example, the first War Powers Act, signed 11 days after Pearl Harbor, gave the President the authority to censor all communications with foreign countries. But the scientific community has also made some attempts at voluntary control. In 1940, for example, editors of various professional journals cooperated with a special committee of the National Research Council to review papers for possible defense information.103 This combination of Federal support for open communication, defense-related restrictions imposed on a case-by-case basis, and occasional voluntary cooperation by the scientific community continues today.

Because of this history, it is noteworthy that the most controversial regulatory issue for science in the 1980s has been the imposition of restrictions on the communication of basic science. In part, the new restrictions have resulted from the changing nature of information, especially its status as a valuable property or national commodity, and from the growth in modes of dissemination of information. The decreasing distinction between basic and applied research added to the difficulty of assigning national security classification according to the information’s potential for application. And, especially in the last few years, there is an increased perception that the export of U.S. technology is weakening this country politically and economically on a worldwide basis.104

101 Harold C. Relyea, “Shrouding the Endless Frontier—Scientific Communication and National Security: The Search for Balance,” Striking a Balance: National Security and Scientific Freedom, Harold C. Relyea (ed.) (Washington, DC: American Association for the Advancement of Science, 1985), p. 75.
102 Ibid.
103 Harold C. Relyea, “Increased National Security Controls on Scientific Communication,” Government Information Quarterly, vol. 1, No. 2, 1984, pp. 187-188. See also Michael M. Sokal, “Restrictions on Scientific Publication,” Science, vol. 215, Mar. 5, 1982, p. 1182; and Michael M. Sokal and Janice F. Goldblum, “From the Archives,” Science, Technology, & Human Values, vol. 10, spring 1985, pp. 24-27.

This series of disputes first arose in the late 1970s, when the National Security Agency (NSA) and later the National Science Foundation attempted to prevent university-based cryptology researchers from publishing their unclassified work on encryption schemes. In these cases, the Federal Government invoked the Invention Secrecy Act and the International Traffic in Arms Regulations, regulations intended to control the export of munitions and related technology. The result of discussions between the universities and the government was the adoption of a voluntary prepublication review process, under which copies of manuscripts on cryptology are sent to NSA at the same time they are circulated to colleagues or submitted to journals.105 This system of voluntary prior restraint was endorsed in 1980 by the American Council on Education, and in 1981, NSF amended its policies on research grants to require similar prior restraint on “potentially classifiable results.”106

At about the same time, the Department of Defense—concerned especially about the leak of information on Very High Speed Integrated Circuits—began to use the Export Administration Regulations to restrict public communication of results and to control the access of foreign scholars to U.S. university research.107 The presidents of five major universities objected to these restrictions and to the trend of increased control that they represented. That protest and the prior debate over restrictions on cryptology research were principal factors in the initiation of a special National Academy of Sciences-National Academy of Engineering-Institute of Medicine panel, under the direction of Dale Corson, which issued its seminal report “Scientific Communication and National Security” in 1982.108 The Corson panel report has been a touchstone for subsequent reaction to government actions to restrict scientific and technical communication.

104 Interim Report of the Committee on the Changing Nature of Information (Cambridge, MA: Massachusetts Institute of Technology, Mar. 9, 1983).
105 John C. Cherniavsky, “Case Study: Openness and Secrecy in Computer Science Research,” Science, Technology, & Human Values, vol. 10, spring 1985, pp. 99-104.
106 See sec. 794 on National Security in the NSF Grant Policy Manual.
107 Ibid., p. 102.

108 Scientific Communication and National Security, report by the Panel on Scientific Communication and National Security, Committee on Science, Engineering, and Public Policy (Washington, DC: National Academy Press, 1982).

POLITICAL INFLUENCES ON REGULATION

The science policy developed over the last 40 years reflects certain assumptions about the nature of science, the character of scientists, and the political management of science. There have been important assumptions about the ability of scientists to govern their own affairs; often, discussion of the peer review system will figure prominently as evidence of whether or not this governance “works.” Other assumptions are made about whether, given the current structure of science, effective and equitable regulation is possible; and related to that are a host of assumptions about the nature of expertise—especially the belief that, on scientific matters (even those with heavy policy components), scientists alone can best identify promising projects and areas of research. Historian Alex Roland, in his March 7, 1985, testimony to the Science Policy Task Force of the House Committee on Science and Technology, articulated this perspective well when he observed that “scientists understand nature’s laws better than anyone else; they are in the best position to see the potential applications of their undertaking.”

Some assumptions relate to the conduct of research—such as where it is best performed—or to the appropriate relationship between research and university education. Other assumptions relate to the process of regulation. There are, for example, strong opinions about how far government “interference” should extend in all aspects of science policy and about who should participate in the development of policy about controls. Assumptions about who should control research and at what stage are also inextricably linked to the question of who is the best judge of science, who is the expert, and who evaluates science.

How are changes in these assumptions—and in the social relations of science—affecting the intensity and extent of the regulatory environment for research?

Without doubt, there is new pressure for a balance between the push for scientific and technical progress and the demand for regulation. Congressional management of science and technology today may require special legislative effort to reconcile the complexity and sophistication of new technological challenges with society’s regulatory capabilities.

Another important force shaping enforcement of Federal regulation on scientific research is a national fear of failure, especially in international technological competitiveness. Science policy leaders argue for increased funding in order to keep the United States from “falling behind” in certain scientific fields. But these same arguments are used by the executive branch to justify increased restrictions on scientific communication. Such attitudes have repercussions on scientific research through the setting of national research priorities and through pressures to achieve competitive status (or to “maintain the lead”) in all areas of science.

One of the most visible changes—to be discussed in chapter 4—is in the creation of specific political or bureaucratic mechanisms for implementation of social controls on research. As a handle for enforcing regulation, the requirement for financial accountability inherent in the research system has been successful. The post-war contract between scientists and government had allowed the scientists, through the process of peer review, to make decisions about the allocation of government funds to specific projects but required the universities to be accountable financially to the government. Thus, regulatory requirements—committees to review research for ethics, regulations on the disposal of hazardous materials—could be tied to the award and management of money.

Changes have also occurred in the amount and type of public participation in decisionmaking on issues related to science and technology. Changes in public beliefs about the value of “expert” v. lay opinions on political or social issues involving science or technology have reinforced the trend toward less autocratic control of science by scientists. Until the last decade or so, when policymakers turned to scientists for advice in making decisions on technically intensive public policy issues, the practice was to distinguish between the technical and the political, or normative, aspects of a problem.109 Today, the involvement of more laypersons in that decisionmaking process on regulating research has not only shifted some control from the scientists but has introduced more sensitivity to normative concerns.

109 Loren R. Graham, “Comparing U.S. and Soviet Experiences: Science, Citizens, and the Policy Making Process,” Environment, vol. 26, September 1984, p. 8.

These and other influences on U.S. research have helped to change the nature and character of the politics within which research is conducted. When the Bush and Steelman reports outlined their visions for how the Federal Government should sponsor and finance a national structure for scientific research, there was little reason to believe that those same arrangements could become the vehicles through which research might be regulated according to prevailing social or political attitudes. Science was to be managed with loose reins. It was not perceived as either requiring suspicious administration or warranting externally-imposed controls. The specific links between the events that stimulated much of the current regulation and the concurrent shifts in public attitudes are not well understood, but it is clear that, in the 40 years since Bush, something has changed. That shift is linked in some way to the original assumption that guided the design of the system as well as to the assumptions that underpin priority-setting and funding today. Science is now clearly conducted within a regulatory environment that affects its agenda, its procedures, and its communications. The next four chapters describe the existing situation—why regulation occurs, how it occurs, and where it occurs.


Chapter 3

Social and Political Rationales for Controls on Research

Harold Green has observed that, all things being equal, there is no question that “a scientist has the freedom to think, to do calculations, to write, to speak and to publish”—as long as these activities remain within the area of abstractions.1 Research, of course, involves more than abstract thinking. Scientists experiment, observe subjects, record data, and describe their work to others. These activities can be affected by social mores and customs and can in turn affect society, the environment, or the people and objects involved in the research. When research violates the social norms, or when society perceives risks or dangers in the research process, then restrictions may be implemented either by society or voluntarily by the research community.

This chapter develops a typology of the various reasons used to justify either legally enforceable regulations or social restraints on scientific activity. In most cases, the rationale described is one that is used to explain why scientists should not do research on a certain topic or in a certain way, or should not describe their results to a particular group of people. In a few instances, the justification may be used to discourage scientists from pursuing one research line or encourage them to pursue another, or to protect government or commercial rights in scientific information.*

1 Harold P. Green, “The Boundaries of Scientific Freedom,” Regulation of Scientific Inquiry, Keith M. Wulff (ed.) (Boulder, CO: Westview Press, 1979), p. 140; reprinted from Newsletter on Science, Technology & Human Values, June 1977, pp. 17-21.
*Issues or events are described in this chapter as examples of when a particular justification or rationale may have been employed. A scientific topic or project may have been regulated for more than one reason, but all these reasons are not necessarily listed for each example.

This chapter describes the rationales or legal justifications for two types of regulation of scientific research. One type includes legal barriers, incentives, or other actions that have some binding or controlling effect. The other type of regulation includes forces or actions—ranging from changes in funding to negative public opinion—that do not have the force of law but may have some important effects because of the way in which science is funded and is dependent on political and social support for the continuation of that funding.

Table 3-1.—Justifications for Control of Research

Regulatory forces on the research agenda:
• To fulfill political objectives
• To avoid environmental damage
• To promote or avoid specific economic consequences
• To preserve moral values

Regulation of research procedures and protocols:
• To protect human health and safety
• To protect animals used in experimentation
• To protect the environment

Regulation of the dissemination of scientific knowledge:
• To uphold scientific standards
• To protect a professional or economic interest
• To protect the health, privacy, and safety of individuals
• To protect the national military or economic security

REGULATORY FORCES ON THE RESEARCH AGENDA

Attempts to control the research agenda of a field or laboratory may take one of two forms. Opponents of a research topic may wish to suppress a project or a line of research because they are convinced that application of the research could bring harm. Others may be unwilling to grant legitimacy to a morally (or politically) objectionable idea by implying that it is worthy of scientific attention. Proponents of a research topic may seek to alter the research agenda to include that topic.

For these reasons, justifications are likely to beaffected by beliefs about the probability and typeof any eventual application. The regulator is con-vinced that, should the research ever be done,some imaginable (or predictable) result or find-ing would be unacceptable or undesirable. Thepotential of research for application—and the reg-ulator’s ability to envision such application—thuscan have considerable effect on the predispositionto control. Harvey Brooks points out, “the regu-latory climate for research which is influenced byits potential applications will depend on theuniqueness of the relation of these applicationsto the substantive content of the research, sincethe amount and richness of the applications varyconsiderably among types of research.”2 Some ob-servers, however, argue that all regulation occursbecause of the anticipation of some effect, al-though they may distinguish between attempts tolimit inquiry because of: 1) “anticipated deleteri-ous consequences of the inquiry itself” (e. g., ef-fects of the research procedure on experimentalsubjects); and 2) “anticipated deleterious conse-quences of applications of knowledge obtained bythe inquiry.”3

The most common justifications for restraints on the research agenda are political, environmental, economic, and moral concerns.

To Fulfill Political Objectives or Avoid Political Effects

Political reasons may underlie both the encouragement and the suppression of research, when society perceives that research could achieve a specific advantage or result in a negative effect. The protection of national economic or military security, for example, may justify either the redirection of research toward military goals or the inhibition, discouragement, or prohibition of weapons development research outside of government control. Both justifications were used during the 1970s controversy over research on the laser separation of isotopes of uranium. That research topic became an active area of controversy in 1976, when experiments in both government and private industry labs showed a promising new approach to laser isotope separation. A few months later, a private consortium, Jersey Nuclear-Avco Isotopes, Inc. (JNAI), applied to the U.S. Nuclear Regulatory Commission for a license to build a $15 million facility for large-scale experiments using one of these approaches.4 Because laser isotope separation was believed to promise a cheaper, easier way to obtain enriched uranium—for both nuclear powerplants and weapons—these new developments provoked both considerable controversy and attempts to classify the work. Many observers believed that existence of a perfected process would increase the risk of unintentional proliferation of nuclear weapons, would undermine existing international safeguards, and could aid terrorists. In proposing a moratorium on further research and development, physicist Barry Casper argued that “there is still time to stop and consider whether laser enrichment should be developed, in light of its broader consequences.”5 Proponents of the research argued that laser isotope separation required sophisticated facilities and was not a “garage” technology adaptable by terrorists, and that therefore those fears were groundless.*

2Harvey Brooks, Benjamin Peirce Professor of Technology and Public Policy, Harvard University, personal communication, 1985.

3Barry M. Casper, “Value Conflicts in Regulating Scientific Inquiry,” Regulation of Scientific Inquiry, Keith M. Wulff (ed.) (Boulder, CO: Westview Press, 1979), p. 15.

At the international level, the nuclear nations have tried to curb the proliferation of nuclear weapons by preventing additional countries—and especially countries considered to be politically unstable—from doing nuclear research that could produce weapons-grade plutonium. International political objectives have also justified government actions that discouraged or denied permission to foreign students from certain countries who wanted to study nuclear engineering in the United States. By preventing access to advanced training in certain fields, the United States was effectively attempting to control the other country’s research agenda.

4Barry M. Casper, “Laser Enrichment: A New Path to Proliferation?” Bulletin of the Atomic Scientists, January 1977, p. 29.

5Ibid.

*In fact, a special panel of consultants appointed by JNAI concluded that the JNAI process was probably less proliferation-prone than the centrifuge process which was being commercialized, or than the process being developed at Los Alamos.


National economic priorities and international standing may justify redirecting research toward topics related to technological competition. When a country decides to shore up its prestige in the international scientific community, it often concentrates on achieving or maintaining superiority in some but not all scientific fields. Such justifications may be discerned in, for example, the current debate on funding high energy physics and the superconducting supercollider. Belief that the success of U.S. industry in competing in world markets is increasingly tied to research has prompted several regulatory actions in the field of biotechnology. In 1984, for example, the Cabinet Council on Natural Resources and Environment asked 14 agencies to develop a framework for the regulation of gene splicing.6 George Keyworth, President Reagan’s Science Advisor, has also suggested that the National Institutes of Health should broaden its mission by paying more attention to the needs of the biotechnology industry through more funding of generic applied work in biotechnology, promotion of intellectual support for biotechnology companies, and training of bioprocess engineers and other needed personnel.7 International relations may also affect research when science and technology are used as tools for political diplomacy, as in scientific exchange programs.

Government actions may also be directed at encouraging or discouraging research related to specific domestic political goals. During the Reagan Administration, the regulatory system itself has been used to influence the funding of research pertaining to specific regulatory issues. Executive Order 12498 (Jan. 4, 1985) instituted an Office of Management and Budget (OMB) review of regulation-related research proposed by executive branch agencies. Through its power to approve the appearance of any research on the regulatory calendar, OMB can control the agencies’ research agenda before funding. In the late 1960s, both domestic and international politics related to U.S. involvement in the Vietnam War shaped a number of acrimonious debates over whether universities should accept Department of Defense funding.8

6An Administration official was quoted as saying that the framework was aimed at avoiding “federal actions that could affect the industry’s competitiveness.” Business Week, May 21, 1984, p. 40.

7Barbara Culliton, “NIH Role in Biotechnology Debated,” Science, vol. 229, July 12, 1985, pp. 147-148.

Special difficulties arise when the justifications for control are linked to a controversial social or political issue. Various forms of research to detect XXY and XYY chromosomal aberrations, such as the screening of male newborns to identify and study prospectively the development of those with a XYY karyotype, combine basic epidemiological research with longitudinal followup of “experimental” (XYY) and “control” groups, including potential therapeutic intervention.9 In the mid-1970s, at Harvard Medical School, objections by Harvard University faculty members and by geneticists elsewhere in the Boston academic community resulted in the voluntary termination of a research project on XXY and XYY children.10 The project staff argued that research should proceed because of its potential therapeutic value to the patients. They were sincerely attempting to advance science and “to bring what they perceived as the benefits of science to the resolution of a social problem.”11 Their opponents, with equal sincerity, sought to expose and stop what they perceived as a “misuse or abuse of scientific hypotheses and techniques.”12 Scientists critical of the research topic argued that there was no scientific evidence linking XYY and antisocial behavior,13 and that the research should be stopped because its goals directly contradicted American political beliefs about the rights of individuals. Other critics believed that the research had the potential of being just the first step in an attempt to determine a genetic basis for antisocial behavior. Infants tagged as having such a trait might be treated differently all their lives and therefore identification might become a self-fulfilling prophecy.14 And finally, some asserted that the research should be stopped because the benefits to society were dubious.15 Defenders of the research called this latter argument “a misplaced ideological approach.”16 In this case, both proponents and opponents had to weigh the importance of protecting the rights of individuals against the importance to society of predicting (and therefore possibly preventing) criminal behavior, the importance to future generations of developing chromosome screening for the detection of genetically linked illnesses,17 and the importance to current patients should reliable therapy ever become available.

8Dorothy Nelkin, The University and Military Research: Moral Politics at MIT (Ithaca, NY: Cornell University Press, 1972).

9Dorothy Nelkin and Judith A. Swazey, “Science and Social Control: Controversies Over Research on Violence,” Report No. 1970, conference proceedings, Institute for Studies in Research and Higher Education, Norwegian Research Council for Science and the Humanities, p. 5.

10Barbara J. Culliton, “XYY: Harvard Researcher Under Fire Stops Newborn Screening,” Science, vol. 188, June 27, 1975, pp. 1284-1285; Frederick Hecht, “Biomedical Research Ethics and Rights,” Science, vol. 188, 1975, p. 502; and Loretta Kopelman, “Ethical Controversies in Medical Research: The Case of XYY Screening,” Perspectives in Biology and Medicine, winter 1978, pp. 196-204.

11Nelkin and Swazey, op. cit., p. 211.

12Ibid.

13Culliton, “XYY,” op. cit., p. 1,284.

Political concerns can also drive the research agenda when an individual or a group of researchers attempt to redirect the agenda of an institution or a field away from one topic and toward another considered to be more socially or politically acceptable. Most often, such actions occur at the individual or personal level; but on occasion there have been loosely coordinated actions by groups. In the 1950s, for example, the Society for Social Responsibility in Science and the Committee for Social Responsibility in Engineering “required their members to take a pledge upon joining the organization which stated that they would not engage in research for destructive purposes.”18 In the 1960s and 1970s, many scientists switched fields rather than work on topics connected to weapons or to the military. Some attempted to choose research topics that they considered to be more socially relevant or more expressive of their own moral or political philosophy. Some rejected certain topics out of protest (again, on moral or political grounds) against U.S. military action in Southeast Asia or because they espoused general pacifist objections to their country’s military research agenda. Decisions to reject a line of research were, however, more often related to the proposed military sponsorship of the research than to any specific application of the particular investigation. In the late 1960s and early 1970s (as discussed in ch. 2), during controversy

14Kopelman, op. cit., notes 11 and 13.

15Ibid., p. 200.

16Ibid.

17Culliton, “XYY,” op. cit.; and Hecht, op. cit., p. 502.

18Rosemary Chalk, “Drawing the Line: Science and Military Research,” unpublished manuscript, May 1983, p. 8.

over the presence of classified military research on university campuses, for example, the organization Scientists and Engineers for Social and Political Action actively attempted to persuade researchers to forego participation in war research or weapons production.19

Rejection of a research line by individuals or groups can be a form of “conscientious objection in science.”20 Individuals who “draw the line” in this way may simply decide to have nothing to do with research linked to the military or, more specifically, with nuclear weapons or chemical-biological warfare. Many physicists, whose line of interest and expertise would fit them notably for the scientific task involved, justify their refusal to work on nuclear weapons research on moral grounds. More recently, a few graduate students in the field of artificial intelligence—where the proportion of Department of Defense funding is increasing—are reported to have either switched their thesis topic to one unrelated to military applications or, in an extreme case, left school or switched fields altogether.

In the 1980s, social anxiety about the nuclear arms race has had a direct effect not in inhibiting but in stimulating research. Funding for—and researchers’ interest in—arms control research has increased.* Physicians, psychiatrists, and other medical professionals have encouraged and supported new research efforts on the medical consequences of nuclear war or the psychological effect of the nuclear arms race on children.

To Avoid Environmental Damage

Environmental concerns that provoke the imposition of regulation can trigger similar conflicts in values. At issue here is the narrowness of the relation of the potential application to the overall substance and goals of the research. Does regulation undertaken because of the fear of one particular application serve to deny the potential benefits to society of other possible applications perhaps not now clearly visible?21 This justification underpins, for example, the legal action to halt deliberate release of genetically altered organisms. In a suit discussed in more detail in appendix A, the Foundation on Economic Trends has charged that the National Institutes of Health failed to evaluate adequately the environmental impact of experiments involving the release of genetically altered organisms into the environment. The plaintiffs are seeking to halt the work altogether because they are convinced that the potential long-range benefits of such research are simply not worth the potential risks to the environment.

19Ibid., p. 8; see also Colin Norman, “Classification Dispute Stalls NOAA Program,” Science, vol. 227, Feb. 8, 1985, p. 155.

20Chalk, op. cit.

*For example, the International Security Program of the John D. and Catherine T. MacArthur Foundation.

21Brooks, op. cit.

To Promote or Avoid Predictable Economic Consequences

International competition in trade has been used to justify suspending one line of research (or to cut back on its funding) because another line appears more promising. Such a situation currently exists in the field of silicon electronics; work in that area has been so successful that research on alternative technologies has been cut back.

In another recent case, predictions of adverse economic effects alleged to result from the eventual application of research projects have stimulated protests that may yet lead to restraints. In 1980, California Rural Legal Assistance filed a lawsuit on behalf of 19 farm workers, which charged the University of California “with unlawfully spending public funds on mechanization research that displaced farm workers.”22 The plaintiffs believe that the research—intended to develop larger, more efficient agricultural machines and new farm methods—would reduce the need for human labor in agriculture. They are convinced that such innovations would have an adverse economic effect on the workers displaced by machinery, on small farms, and on consumers, and therefore that public funds should not be used to support such research. Defenders of this research argue that mechanization research should continue “in order to create more desirable jobs and to keep the American fruit and vegetable industry competitive in the international economy.”23 (See app. A for further discussion.)

22Philip L. Martin and Alan L. Olmstead, “The Agricultural Mechanization Controversy,” Science, vol. 227, Feb. 8, 1985, p. 601.

23Ibid., p. 604.

To Preserve Moral Values

In some instances, a society or a group within the society may perceive the very exploration of a topic (or the legitimacy granted to the topic by a serious research effort) as a threat to moral or social beliefs. That is to say, the research hypothesis contradicts the social or political beliefs of the opponents. Early 20th century attitudes to human sexuality, for example, acted to inhibit all types of research relating to sexuality, contraception, and reproduction.24 Research was discouraged because of fear that it might encourage or condone “immoral” behavior; religious and moral leaders objected to laboratory consideration of what were considered to be private, personal matters.

Such objections continue to be raised today. In public opinion polls run in 1983, approximately one-quarter of the adult population of the United States were willing to endorse the statement that one of the “bad effects of science” is that it breaks down people’s ideas of “right and wrong.”25

On occasion, therefore, opponents of a research topic or hypothesis believe that any exploration (however well-controlled) might endanger the social cohesion of the community. Such concerns fuel contemporary objections to research that would attempt to link human intelligence to genetic inheritance. American psychometrician Arthur R. Jensen sparked a controversy in 1969 when he argued that IQ is genetically fixed.26 Jensen proposed that social interventions aimed at boosting minority students’ IQ scores—e.g., Headstart and other compensatory educational measures—were a waste of time and money. Opponents of the Jensen research are convinced that even to consider such research as scientifically legitimate and morally acceptable would be a racist act. The

24For a brief review of this history, see Emily H. Mudd, “The Historical Background of Ethical Considerations in Sex Research and Sex Therapy,” Ethical Issues in Sex Therapy and Research, William H. Masters, Virginia E. Johnson, and Robert C. Kolodny (eds.) (Boston, MA: Little, Brown & Co., 1977), pp. 1-10.

25Jon D. Miller, A National Survey of Adult Attitudes Toward Science and Technology in the United States (Philadelphia, PA: Annenberg School of Communications, University of Pennsylvania, 1983).

26See Richard Lewontin, “Race and Intelligence,” Bulletin of the Atomic Scientists, March 1970. Also see Arthur R. Jensen, “How Much Can We Boost IQ and Scholastic Achievement?” Harvard Educational Review, vol. 39, 1969, pp. 1-123.


mere existence of the research project was perceived as an insult to members of certain minority groups. The objections can also go beyond the desire to avoid offending certain social groups. “The critics of such research,” Harvey Brooks writes, “believe that the risks of political misuse of the resulting knowledge outweigh any possible social benefits.”27 In some cases, then, the principal objection may be to undesirable application of the research knowledge; the secondary objection, offense to a social or cultural minority group.

On occasion, however, the very idea of doing such research on a taboo subject has been sufficient to warrant social regulation. This justification plays a role in the regulations promulgated by the Department of Education (ED) to implement the 1978 Amendments to Section 439 of the General Education Provisions Act, commonly referred to as the “Hatch Amendments” after their originator, Senator Orrin Hatch. The ED language aims to prevent specific subject matter, teaching methods, psychological tests, or educational research from being utilized or conducted without parental knowledge or consent. It would prohibit “research” designed to “reveal” such things as: political affiliations; mental or psychological problems potentially embarrassing to a student or his or her family; sex behavior and attitudes; illegal, antisocial, self-incriminating, and demeaning behavior; critical appraisals of other individuals with whom respondents have close family relationships; legally recognized privileged and analogous relationships such as those of lawyers, physicians, or ministers; or income.

27Brooks, op. cit.

REGULATION OF RESEARCH PROCEDURES AND PROTOCOLS

With the exception of protests over the use of animals, social criticism in the early 20th century was more likely to be directed at the topics than at the procedures of research. The moral and ethical concerns expressed in attempts to control how scientists conduct their research are not new, however. What is new is the raising of such concerns to the level of government action or legally enforceable regulation.

In these cases, the rationale for external control is most often that the scientific community’s own safety procedures have been or are predicted to be inadequate or insufficient to prevent harm to human beings, animals, or the environment. The motivations for managing the risks inherent in the research process are straightforward: to comply with Federal, State, and local laws and regulations and thereby to avoid enforcement actions or civil or criminal sanctions for noncompliance; and to comply with common law duties (e.g., to act with due care) and thereby to avoid personal injuries or environmental degradation, as well as any liability or duty to provide compensation which could arise from claims brought by the injured parties.

To Protect Human Health and Safety

By far the most visible and vocal science policy debates on regulation have been those surrounding how to protect human health and safety. Although the preoccupation with safety is a recent phenomenon, “it has taken hold so universally and absolutely that this generation hardly recognizes the possibility of a different world.”28

Regulations set by local or Federal authorities to protect the health and safety of workers (e.g., requirements for certain types and amounts of safety equipment for persons working with hazardous chemicals) apply with equal force to laboratories and, in some cases, may have been written specifically to apply to laboratory workers. The Occupational Safety and Health Act of 1970 (discussed in more detail in ch. 5) includes protections for research workers who might be subjected to unnecessary hazards on the job.

28Robert L. Sproull, “Federal Regulation and the Natural Sciences,” Bureaucrats and Brainpower: Government Regulation of Universities, Paul Seabury (ed.) (San Francisco, CA: Institute of Contemporary Studies, 1979), p. 86.


Emotional controversy has surrounded efforts to extend special legal protections to the human subjects of experimentation (see box in ch. 4 for the specific regulations). Human subjects are used in all parts of science. They are used to test new forms of diagnostic procedures, treatments, or medicines. Carefully controlled clinical trials in drug research are necessary to prove effectiveness, to set dosage, and to uncover unknown side effects before drugs may be licensed for general use. Human subjects must be observed for research on mental disorders. Private industry uses them to test new consumer goods, or in research on how to make products more useful.

The types of experimental situations involving human subjects can be classified generally into four categories:

1. experiments done to test physiological states and environmental manipulation, both internal and external, in “normal” subjects;
2. studies of human performance and process—e.g., memory or vision;
3. the trial of new methods, procedures, or drugs on persons who are ill; and
4. the use of terminally ill patients to test potentially dangerous drugs or procedures.29

In the latter case, research may be conducted only as a “compassionate” procedure not requiring the review of a local ethics board if it is an “emergency” treatment with potential therapeutic value and involving a new or investigational drug or device.

Historically, the impetus for controls on the use of human subjects has been either the documentation of abuse of subjects or questions raised about potentially risky research. In the 1960s and 1970s, studies such as the Milgram “psychology of obedience” project (in which subjects were encouraged to act with increasing severity against other subjects),30 or the Public Health Service Tuskegee study, in which over 300 black men with syphilis were examined and tested but not treated for more than 40 years in order to observe the complications arising during terminal stages of the disease,31 served to focus public and congressional attention on the need for more formal governmental surveillance of research on human subjects. In these and other cases, critics were able to show that human beings were subjected, usually without their knowledgeable consent, to the risk of some potential harm: death, physical abuse or injury; psychological abuse or injury; damage to interpersonal relations (e.g., loss of trust in others); legal jeopardy (e.g., creating and revealing a record of criminal behavior); career damage; economic harm; or invasions of privacy.32

29Robert E. Hodges and William B. Bean, “The Use of Prisoners for Medical Research,” The Journal of the American Medical Association, vol. 202, Nov. 6, 1967, p. 177.

30Stanley Milgram, Obedience to Authority (New York: Harper & Row, 1974).

Of all aspects of the human subjects debate perhaps the most sensitive has been the use of subjects with “limited civil freedom,” a classification that includes prisoners, residents of institutions for the mentally ill and retarded, and children/minors.33 As research institutions are often located in large urban areas, subjects are frequently drawn from the disadvantaged in those cities. Hospitalized or incarcerated subjects also provide a convenient, stable population that can be monitored with ease.34 The large U.S. prison population, in fact, makes it possible for a research project to choose subjects with any necessary characteristic. Proponents of the use of such subjects argue that, moreover, there are also considerable advantages to society: prisoners are provided with a break from monotony, a feeling of altruism, and some monetary reward; research on mental illness and retardation cannot proceed without access to such patients.

Foremost in the discussion of whether minors or institutionalized subjects should be used is the question of coercion; for these subjects’ peculiar position renders their “consent” to participation questionable and may also lead to subtle, and often unintended, abuse by experimenters.35 Opponents argue that because such research may be carried out in prisons or mental hospitals, it does not receive the scrutiny and criticism by colleagues which may be routine or required in normal research settings.

31James H. Jones, Bad Blood: The Tuskegee Syphilis Experiment (New York: The Free Press, 1981).

32Donald P. Warwick, “Types of Harm in Social Research,” Ethical Issues in Social Science Research, Tom Beauchamp, et al. (eds.) (Baltimore, MD: The Johns Hopkins University Press, 1982), pp. 105-110.

33Robert J. Levine, Ethics and Regulation of Clinical Research (Baltimore, MD: Urban & Schwarzenberg, 1981).

34Ibid.

35Alexander M. Capron, “Medical Research in Prisons,” The Hastings Center Report, June 1973, p. 4.

Concern that special populations might not be adequately covered by existing regulations led in the 1970s to the suggestion of a moratorium on research involving prisoners. In some countries—e.g., England—prisoners may not be used as subjects of experiments. The World Medical Association’s Declaration of Helsinki (1964, revised in 1975) states that the only appropriate subjects are those “in such a mental, physical, and legal state as to be able to exercise fully [their] power[s] to consent.” Dissension over exactly how to treat prisoners has apparently stymied recent Food and Drug Administration efforts to finalize its regulations on such research.

Additional questions may be raised about how human subjects are used in social science research. Quite a bit of controversy arose in the 1960s and 1970s over deception that occurred in such research. Because they used stooges or engaged in covert observation of unsuspecting people, some social scientists appeared to be using “dubious means to achieve questionable ends.” The researchers insisted that deception was only used to advance human understanding and thus was beneficial to human welfare, that it helped in the study of “underdog” social groups such as homosexuals, and that deception in research—just as deception in muckraking journalism—could help to expose the unethical conduct of the power elite. Critics argued that any study that involved the violation of moral norms could not advance the human welfare, that “a chain of lies was not morally justified,”36 and that no gain could offset the magnitude of potential discomfort to the subject.37

The tremendous acceleration of medical research—e.g., in immunology, genetics, and biomedical engineering—has created new controversies for those fields. Many medical researchers

36Donald P. Warwick, “Social Scientists Ought to Stop Lying,” Psychology Today, February 1975, pp. 38-40, 105-106.

37See Tom L. Beauchamp, et al. (eds.), Ethical Issues in Social Science Research (Baltimore, MD: The Johns Hopkins University Press, 1982).

would like to use new techniques or technologies on patients before they have been fully tested. They believe that if a procedure could help a patient, then they have a responsibility to try it, even if they are not sure it will work. Others believe that the physician’s responsibility is to be certain that a technique will result in some benefit. The conflict between these two perspectives raises such questions as: Who is or is not an experimental subject? What are the justifications for delay in using a new technique? Ethicist Thomas Murray has pointed out, in discussion of the “Baby Fae” baboon heart transplant, that even in a desperate therapeutic situation, certain rules should be followed. He suggests that four questions should be asked before an experimental treatment is used: Is the scientific background right? Is the next experimental subject naturally a human being? Is there no superior alternate therapy available? And can the researcher get truly informed consent—or informed consent from a guardian or parent?38

Similar questions are being raised for the use of human somatic-cell gene therapy39 when opponents ask whether such research is playing with the very essence of humanness, or when animal rights groups object to the use of primates as a substitute for human experimental subjects.40 Those arguing for proceeding with the research cite the potential benefit to existing patients. The “bottom line” for that debate—as with many others—has become, as Alexander Capron has written, “when may a society, actively or by acquiescence, expose some of its members to harm in order to seek benefits for them, for others, or for society as a whole?”41 In some other contexts, society has already answered Capron’s question by putting some people in jeopardy to protect the whole population. We select firemen and members of the military forces, sometimes by conscription, sometimes by lottery, sometimes by offering incentives. We have also used some of these

38“The MacNeil/Lehrer News Hour: The Baby Fae Case,” transcript 42385, Thirteen, Nov. 16, 1984, pp. 1-9.

39See “Points to Consider in the Design and Submission of Human Somatic-Cell Gene Therapy Protocols,” prepared by the Working Group on Human Gene Therapy of the NIH, Federal Register, vol. 50, No. 14, Jan. 22, 1985.

40Judith A. Johnson, “Human Gene Therapy, Updated 10/30/84,” Library of Congress, Congressional Research Service, Science Policy Research Division, Apr. 3, 1984.

41Capron, op. cit., p. 4.


means—incentives and lotteries—to select subjects for experiments.42

In the debates over experimentation on fetuses (either still in the womb or newly aborted or miscarried), the emotionally charged issue of abortion—as a potential “source” of fetuses or fetal tissue—has often been the implicit or explicit justification for controls. (See ch. 4 for discussion of specific regulations on use of fetuses.) Similar debates are now raging in England.43

On occasion, objections to research have focused on accusations that a city or special population might be “experimented on.” Concern that research might jeopardize the health and safety of the general public was, for example, expressed during the 1970s’ recombinant DNA controversy in Cambridge, Massachusetts (see ch. 7).44 The argument was not that such research was intrinsically “bad” or that it might not result in positive gains for society. The argument was that the safety of the research procedures was untested and the consequences of an accident—even if only remotely possible—were potentially so negative that the community might be unwilling to risk any mistake. In such cases, until the procedures can be proved to be reliable, the public and the legislative bodies have acted to suspend research temporarily—until public study and debate can take place. The Catch-22 in this scenario is often that the procedures cannot be proven to be safe without trying them in some way.

To Protect Animals Used in Experimentation

The first attempts at social regulation to protect the welfare of animals (to obtain legal protection for members of nonhuman species) date to 19th-century England, although social concern—in the form of cultural reverence for some animals, and/or repulsion at cruel treatment of animals—may be found in many countries for hundreds of years.45

The controversy over the experimental use of animals is characterized by certainty of moral position on both sides. Thomas H. Moss of Case Western Reserve University observes that:

. . . those who are convinced that laboratory animals are cruelly or unnecessarily used have often characterized the scientific establishment as insensitive to animal pain, and lacking in basic compassion toward living creatures. Those who are convinced that animal experiments are natural and appropriate tools to serve the advancement of science . . . have often characterized the animal welfare movement as irrational and as blindly myopic in the sense of moral outrage at animal suffering but lack of recognition of human health needs.46

In its attempt to abolish totally the use of animals in experiments, the animal liberation movement is saying:

. . . that animals and humans have similar interests . . . those interests are to be counted equally, with no automatic discount just because one of the beings is not human.

This argument extends to such similarities as avoiding physical pain.47

The justifications for the various positions are based on a number of philosophical arguments related to perceptions of the appropriate relationship between humans and animals. Members of the animal rights community believe that animals possess a consciousness and certain attributes—e.g., symbolic communication, self-awareness, and anticipation of future events—which imbue animals with rights as exercised by humans. They see the animal rights movement as the progeny of the "humanitarian" movement, as the logical successor to the civil rights movement and the feminist movement. In comparison, the animal welfare movement, which includes some animal

42 Dael Wolfle, Emeritus Professor, Graduate School of Public Affairs, University of Washington, personal communication, 1985.

43 See Mary Warnock, A Question of Life: The Warnock Report on Human Fertilization and Embryology (New York: Basil Blackwell, Inc., 1985).

44 Sheldon Krimsky, Genetic Alchemy: The Social History of the Recombinant DNA Controversy (Cambridge, MA: The MIT Press, 1982).

45 Harriet Ritvo, "Plus Ça Change: Antivivisection Then and Now," Science, Technology, & Human Values, vol. 9, spring 1984, pp. 57-66.

46 Thomas H. Moss, "The Modern Politics of Laboratory Animal Use," Science, Technology, & Human Values, vol. 9, spring 1984, pp. 51-56.

47 Peter Singer (ed.), In Defense of Animals (New York: Basil Blackwell, Inc., 1985), p. 9.



researchers, believes that as humans, in the words of Arthur Caplan, we hold a certain "moral stewardship" over animals, which requires that we treat them with respect even in the service of humans. The concept of moral stewardship infuses a spectrum of regulatory activity, ranging from outright bans on the use of certain animals (or of any animal in certain types of research) to National Institutes of Health regulations governing the treatment, handling, and use of animals in laboratories. (See ch. 4.)*

To Protect the Environment

If research involves the use of toxic chemicals or biological materials known or suspected of causing some adverse effect on the environment by altering the natural composition of air, water, or soil, or by destroying or altering the ecological balance, then regulation of the research process may be implemented with the specific intention of protecting the environment.

*Concern about this issue has also led to a National Academy of Sciences study of the numbers of animals used in the United States in research and testing and to an OTA report, Alternatives to Animal Use in Research, Testing, and Education, OTA-BA-273, January 1986.

The dangers in the introduction of new plant or animal species or new genetic forms have been obvious for decades in the destruction caused by, for example, the introduction of such nonnative species as kudzu vine and gypsy moths. Comparison of the effect of a current line of research to the past adverse effects of nondeliberate alterations of the balance between species or environments forms the basis of many attempts to regulate research on environmental grounds.48 Because it is virtually impossible to design tests to predict the ecological risk from a nonnative species, these concerns have been raised again and again and have now reached the courts via legal action to prevent agricultural research involving deliberate release of genetically engineered bacteria.

48 See, for example, Winston J. Brill, "Safety Concerns and Genetic Engineering in Agriculture," Science, vol. 227, Jan. 25, 1985, pp. 381-384.

REGULATION OF THE DISSEMINATION OF SCIENTIFIC KNOWLEDGE

Open communication, through such things as publications, symposia, and face-to-face meetings, has always been an essential aspect of scientific endeavor. Unrestricted interaction sets forth a framework within which peer review, criticism, and data sharing can occur; it provides the arena for cross-fertilization of ideas, and helps avoid duplication of effort.49 As the American Association for the Advancement of Science, Committee on Science in Promotion of Human Welfare, stated in a 1965 report:

Each separate study of nature yields an approximate result and inevitably contains some errors and omissions. Science gets at the truth by a continuous process of self-examination which remedies omissions and corrects errors. The process requires free disclosure of results, general dissemination of findings, interpretations, conclusions,

49 NATO Science Committee, "Open Communication in Science," NATO Science & Society, 1983.

and widespread verification and criticism of results and conclusions.50

Because openness in science also encourages uninhibited dissemination of results outside of the laboratory, the justifications for restraining such communication center primarily around the potential effect of the information, the information's "value" (economic or otherwise), and who is perceived to "own" the information (e.g., the scientist or the organization that supported the scientist's work).

Regulation of scientific communication is far from simply a process of stamping a label on a document, however; it involves restraints or controls on, for example: 1) who may know certain

50 Harold C. Relyea, "Shrouding the Endless Frontier—Scientific Communication and National Security: The Search for Balance," Striking a Balance: National Security and Scientific Freedom, Harold C. Relyea (ed.) (Washington, DC: American Association for the Advancement of Science, 1985), p. 76.


scientific data or information, 2) the dissemination of printed documents, 3) who has access to an electronic communication system, 4) descriptions of processes or computer programs, and 5) even who may share or receive certain cell lines or biological strains.51

Physicist Lee Grodzins has pointed out that the appropriate point of classification often may not be a specific formula or instruction but the knowledge that a result can be accomplished, for "once it is disclosed that something can be done, then someone will be able to duplicate it."52 Joan Bromberg, a historian of science, adds that "keeping secret that a research program exists is one way to hold the edge in a field," because such "revelations also give hints at the correct direction for research."53

Because of the differing values of the groups involved in the communication of science, the information developed during research frequently becomes the object of dispute or tension between those who sponsor and those who conduct research. In general, this tension derives from conflicting desires to disseminate and to restrict access to information. As each actor defines the areas of restriction differently, the tension grows.

This tension is particularly apparent in contemporary restraints on communications relating to national security and commercial property rights in such things as biological materials. In these cases, a lack of consensus on boundary definitions has resulted in increasingly large "gray areas" of information perceived as possible candidates for restriction in the future. The more that military systems depend on advanced technology—including such things as large-scale integrated circuitry, space technology, and microbiology54—the more that basic research appears to have the potential for military importance. For technology related to international industrial competition, similar uncertainty about what may prove to be important in the future has stimulated restrictions on the

51 Patrick D. Kelley and Ernest G. Jaworski, "Agreements Covering Exchanges of Biological Materials," American Association for the Advancement of Science, annual meeting, New York, May 1984.

52 "Openness and Secrecy in Scientific and Technical Communication," seminar, Dec. 11, 1984, Massachusetts Institute of Technology, Cambridge, MA.

53 Ibid.
54 NATO Science Committee, op. cit.


sharing of information with citizens of other countries and on the freedom of industrial scientists or industry-supported university professors to converse openly with their colleagues about their work.

Four main rationales may underpin actions to restrict communication:

• to uphold scientific standards;
• to protect a professional or economic interest;
• to protect the health, privacy, or safety of individuals; and
• to protect national military and economic security.

To Uphold Scientific Standards

Within science, both traditions and good laboratory practice govern the flow of dissemination of research results—who can communicate, who will receive communications, and when and where communication takes place. A junior member of a research team may be restricted from discussing his or her own original work until the team's publication is ready, or scientists in one laboratory group may refrain from discussing their work with colleagues elsewhere in the organization or with journalists until a writeup is submitted to a scientific journal. The ultimate justification for most such controls—whether reinforced by laboratory "rules" or by the pressure of tradition—is to uphold scientific standards, to assure that only verifiable and replicable science is presented as legitimate science.

The peer review system in scientific journals, for example, seeks to filter out reports of scientific work that do not meet the highest standards of research in the field. Readers must be able to accept publication confidently as a seal of legitimacy and accuracy, thereby allowing them to trust the author's conclusions without replicating the experiment or redoing the research. In theory, the norms of good scientific practice justify acceptance or rejection of communications; in practice, the current agenda and occasionally the biases of the research field may determine which topics are favored, as well as the mode of presentation.



The goal of preserving the quality and integrity of science and the goal of protecting the public are both used to justify an unusual but effective restriction on the timing and, in a few cases, the actual publication of articles in medicine. In 1969, Franz J. Ingelfinger, editor of The New England Journal of Medicine, began to worry that premature disclosure of unevaluated and unauthenticated medical research results before they were published in a peer-reviewed medical journal (and hence presumed to be evaluated and authenticated) could be dangerous to the public. He argued that such reports might contribute to false expectations of unevaluated drugs or treatments or, on occasion, might advocate treatments later found to be useless or potentially harmful.55 Ingelfinger therefore instituted an editorial policy that denied publication to an article if its conclusions or major data had appeared in a medical news publication or similar unrefereed format. This rule has been continued and reinforced by the Journal's subsequent editor,56 and similar practices are followed by editors at other journals.

To Protect a Professional or Economic Interest

Scientists have always exercised a form of self-regulation in publication in order to achieve personal or professional rewards. Timing and placement of publication, for example, can significantly affect a scientist's career success.

55 Barbara J. Culliton, "Dual Publication: 'Ingelfinger Rule' Debated by Scientists and Press," Science, vol. 176, June 30, 1972, pp. 1403-1405.

56 Arnold S. Relman, "The Ingelfinger Rule," The New England Journal of Medicine, vol. 305, 1981, pp. 824-826.

57 Kelley and Jaworski, op. cit.

Protecting an economic advantage has also long been accepted as a legitimate motive for communications restraint in commercial circles; businesses control publication to protect their economic rights in the material, to make a profit, or to avoid a loss. Examples of such motivations for restrictions may be the protection of patent rights, the maintenance of competitive advantage, or the protection of rights in biological materials.57 Industry, in fact, has cited the ability to protect intellectual property as a major determinant of success:

There is a direct correlation between the security of patent rights and industry's willingness to commit large sums to the inherently risky efforts needed to find and develop new technologies.58

In industrial research, the sponsor wants to protect the proprietary nature of the research and may not want competitors to have access to the information resulting from the sponsored research. This justification for secrecy now extends widely as more and more universities enter into research agreements with industrial sponsors. The institutions' naturally opposing views about the value of information are often a subject of negotiation in university-industry relations, where the traditional openness of the university could act against the commercial interests. Most frequently, the resolution is a contract provision that allows a prespecified delay of publication in order to permit the sponsor to file a patent application. Some university research projects will submit to a delay to allow an industrial sponsor to review a document for proprietary data.

A desire to secure or protect certain legal rights of the generator or sponsor of the research may also motivate restrictions. With respect to a new product or process developed during research, three outcomes are possible: it may be kept secret; it may enter the public domain; or it may be granted a patent. The patent laws grant an exclusive right, for a fixed period of time, to commercial exploitation of an innovative product or process to the person who discloses the invention to the U.S. Patent Office. Data or analyses collected during research may also receive protection via the copyright laws, which prevent plagiarism. The first U.S. regulation concerning the use of research results was, in fact, stated in the patent and copyright clause of the Constitution.59

Premature disclosure of patentable information could endanger the legal rights of the inventor and

58 Alexander MacLachlan, testimony before the U.S. Congress, Science Policy Task Force, House Committee on Science and Technology, Apr. 25, 1985.

59 Harold Relyea, Congressional Research Service, personal communication, 1985.



therefore restriction is chosen. In other cases, disclosure might be used to establish such a right.

An organization may take action to impede dissemination (or to hasten dissemination) in order to preserve a corporate image or administrative power. Similar action might be taken to support the mission of a government agency. In a few reported cases, businesses have acted to impede the dissemination of scientific data in order to protect the company's legal position or to avoid adverse publicity. When studies of the toxic effects of vinyl chloride on rats revealed cancer, one Italian researcher, for example, found that his industrial sponsor refused to let the evidence be released. It was some time after that suppression occurred that cancers were found in workers in the United States who had been exposed to vinyl chloride.60

To Protect the Health, Privacy, or Safety of Individuals

Federal regulations as well as informal controls on the publication of data from human subjects research often seek to control dissemination in order to protect the privacy or safety of individuals described in the reports, or to protect subjects who participate in research on controversial topics or illegal activity.61

Some research information may have the potential to harm the public welfare either because of what is said or when it is said. In those cases, dissemination is regulated (delayed or prohibited) to protect the public health and safety. Announcements pertaining to some real or potential public health problems could cause panic, and so restraint is used in dissemination or publicity prior to publication—a justification that has been used for restricting communication of data on possible modes of transmittal of acquired immunodeficiency syndrome (AIDS). Dissemination of a result may also be delayed or controlled to prevent a potential adverse economic, social, or political reaction. This justification is used to avoid

60 John T. Edsall, "Scientific Freedom and Responsibility: Report of the AAAS Committee on Scientific Freedom and Responsibility," Science, vol. 188, May 16, 1975, pp. 687-693, and vol. 189, July 18, 1975, pp. 174-175.

61 Barry Barnes, Who Should Know What? Social Science, Privacy and Ethics (Cambridge, England: Cambridge University Press, 1979).

the panic or devaluation of property that might follow publication of an earthquake prediction for a specific area. The "Paigen" report, which analyzed the alleged health effects of the chemicals dumped at Love Canal, was later criticized by a panel of scientists for improper epidemiologic methods that "fueled rather than resolved public anxiety."62 One of the most important questions raised in such situations is how, in the face of great scientific uncertainty, adverse economic effects should be weighed against possible health risks to individuals or the general public.

Opponents of such restrictions argue that they inhibit free discussion in a democracy. The American public has a right to be told the technical information, even if the public policy decisions are ultimately based on normative rather than technical grounds. In the laser isotope separation case discussed earlier in this chapter, the argument was made that severe classification of such research might not, in the long run, prevent dissemination of the scientific "trick" or secret of the laser isotope separation process to other countries, but that it could discourage public discussion in the United States on whether the work should continue. The "lid of secrecy" would "effectively preclude public scrutiny," one observer wrote.63

To Protect National Military and Economic Security

Protection of national security has been used for centuries as a justification for government regulation of technical information.64 During World War II, American scientists and engineers accepted two kinds of censorship or control of communications—voluntary (justified in the spirit of patriotism) and mandatory. Scientific journal editors practiced extensive voluntary censorship during the war—they believed that some type of censorship was necessary to prevent the leakage of vital information to U.S. enemies.65 Scientific publications of all types were also subject to mandatory review by the Office of Censorship. And scientists engaged in weapons research were, of course, subject to the military's restraints and controls on all their activities, including conversations as well as written communications. In peacetime, however, the "conflicting imperatives of national security and open scientific communication" have occasionally led to controversy and legal action and are continually the subject of vigorous debate.66 The tension between these two objectives arises primarily in a clash between the justifications for restraint and openness. The government wants to control all information that could be of possible value to potential enemy states; the scientists stress that such measures could damage scientific progress and creativity and abridge traditional scientific freedom.67

62 Lewis Regenstein, America the Poisoned (Washington, DC: Acropolis Books Ltd., 1982), p. 140.

63 Casper, "Laser Enrichment," op. cit., p. 38.

64 Relyea, "Shrouding the Endless Frontier," op. cit., p. 80.

65 Michael M. Sokal, "Restrictions on Scientific Publication," Science, vol. 215, Mar. 5, 1982, p. 1182; and Michael M. Sokal and Janice F. Goldblum, "From the Archives," Science, Technology, & Human Values, vol. 10, spring 1985, pp. 24-27.

The motives for restrictions justified by national security can be both economic and military. Often, the restriction is indicative of whether the research has demonstrated application.* Although "national security" is vaguely defined in the law, and the uses of the term range from "defense of the United States" and "public peace and safety" to "financial policies of the United States," there is agreement among policy analysts that the concept provides the President a broad grant of administrative discretion to justify all sorts of policies.

Recently, the justifications for communications restraints based on national security considerations have tended to relate to quite specific perceptions about the importance of science in an international context. First, those who believe that the United States' lead over the Soviet Union in some important areas of military technologies is diminishing attribute that situation to Soviet absorption of U.S. technologies. Second, the military systems themselves have become more dependent on sophisticated new technologies and on the science that feeds them. Third, proponents

66 Richard D. DeLauer, "Scientific Communication and National Security," Science, vol. 226, Oct. 5, 1984, p. 9.

67 NATO Science Committee, op. cit.
*". . . know-how is a precious commodity leading to the commercial or military products that determine the fortunes of nations in peace and in war. Yet sometimes it is hard to tell where scientific knowledge leaves off and engineering know-how begins." DeLauer, op. cit.

of restrictions believe that a steadily increasing share of these technologies is dual-use in nature, that is, that they can have both military and nonmilitary applications.68 And fourth, such national policies as East-West detente and scientific exchanges with the Chinese are perceived to have increased the opportunities for leakage of technical information of all types. Such rationales were clearly stated in the draft "National Policy on the Transfer of Scientific and Technical Information," issued by the executive branch on June 15, 1984:

The acquisition of advanced technology from the United States by Eastern Bloc nations for the purpose of enhancing their military capabilities poses a significant threat to our national security. Intelligence studies indicate that a small but significant target of the Eastern Bloc intelligence gathering effort is science and engineering research performed at universities and federal laboratories. At the same time, our leadership position in science and technology is an essential element in our economic and physical security. The strength of American science requires a research environment conducive to creativity, an environment in which the free exchange of ideas is a vital component.69

The government has recently justified the application of export control regulations to basic research as necessary to protect: 1) tangible goods, including technical data, that relate to national security; and 2) the domestic economy. The application limits "information of any kind that can be used, or adapted for use, in the design, production, manufacture, utilization, or reconstruction of articles or materials."70

68 Panel on Scientific Communication and National Security, Committee on Science, Engineering, and Public Policy, Scientific Communication and National Security (Washington, DC: National Academy Press, 1982), p. 11.

69 U.S. Congress, "Scientific Communication and National Security," hearings before the House Committee on Science and Technology, May 25, 1984.

70 15 CFR 379.1.a.

The ensuing controversy over the wide-scale application of these controls has led to a reaffirmation by the Department of Defense/University Forum that: "No restriction may be placed upon the conduct or reporting of fundamental research that has not received national security classification." However, how the various participants in such restrictions define what is or is not fundamental research can determine the extent of restriction. This window of uncertainty prompted the Department of Defense (DOD) to state its definition of "fundamental research":

For DOD purposes the decision whether a particular research activity is or is not fundamental will be determined primarily by considering the following easily identified characteristics: 1) performer (for example, university, industry, in-house); 2) budget category (for example, 6.1, 6.2); 3) sponsoring DOD entity; 4) special contract provisions. . . . Unclassified contract research supported by 6.1 funding shall be considered 'fundamental.' Similarly, unclassified research performed on campus at a university and supported by 6.2 funding shall with rare exceptions be considered 'fundamental.'71

In the disputes over restrictions on scientific communication, DOD sees itself as "caught in a dilemma." In the words of the Defense Science Board:

If it vigorously attempts to regulate the flow of scientific information in the scientific community, it could jeopardize the strength and vitality of the very community it is seeking to revitalize for the sake of national defense. On the other

71 Leo Young, "Commentary: The Control of Government-Sponsored Technical Information," Science, Technology, & Human Values, vol. 10, spring 1985, pp. 82-86.

SUMMARY

In 1979, Miller, Prewitt, and Pearson conducted a public opinion poll for the National Science Foundation in which they asked respondents about specific types of scientific studies and whether scientists should be "allowed" to conduct those studies. Because the structure of the poll questions implied regulation, the responses can be interpreted as one measure of the public's willingness to restrain certain types of scientific inquiry.75 The results indicate that a majority of the respondents would have liked to prohibit research dealing with the creation of new life forms and with the gender of children. Opposition to genetic

75 Jon D. Miller, et al., The Attitudes of the U.S. Public Toward Science and Technology (Washington, DC: National Science Foundation, 1980); also Jon D. Miller, The American People and Science Policy (New York: Pergamon Press, 1983).

hand, if DOD abandons any attempt at regulation in the university context, it could seriously compromise and, in certain cases, totally undercut other efforts to control the outflow of militarily critical technology. The middle ground is a difficult one to establish.72

The Corson panel of the National Academy of Sciences, fearful that the government policy could begin to endorse a form of "blanket justification" for restricting some fields of basic research, attempted to clarify the limits of acceptable restraints. The Corson panel's report73 stated that communication restrictions should not be applied to any area of university research, be it basic or applied, unless it involves a technology meeting all the following criteria:

• The technology is developing rapidly, and the time from basic science to application is short;
• The technology has identifiable direct military applications, or it is dual-use and involves process or production-related techniques;
• Transfer of the technology would give the U.S.S.R. a significant near-term military benefit; and
• The U.S. is the only source of information about the technology, or other friendly nations that could also be the source have control systems as secure as ours.74

72 Report of the Defense Science Board Task Force on University Responsiveness to National Security Requirements, January 1982.

73 Panel on Scientific Communication, op. cit.
74 Ibid.

engineering declined somewhat when the question specified plants and animals rather than humans, but there was still substantial disapproval of scientific research in this area. In contrast, only one-quarter of the population expressed opposition to studies that would involve weather modification or the extension of the average human life span.

Such data tend to indicate that Americans, in deciding whether research should be restricted or prohibited, may make such decisions based on whether they believe that the restriction would respond to moral or social objections. The results may also indicate that the public is more likely to approve regulation for reasons relating to the immediate protection of human health or safety or to preservation of the moral order than for reasons relating to potential long-term damage to the environment or depletion of economic resources.

Although most Americans believe that the government has some control over the work of scientists, the public does not appear to be willing to endorse more direct public control. A 1983 Annenberg study asked whether the government presently has any control over "what scientists do," and 77 percent of the respondents indicated that they thought that the government did have that kind of control.76 When asked if the government "should" have control over what scientists do, 67 percent of the public agreed that this kind of control was appropriate. In a 1979 study, respondents were asked whether "most citizens are well enough informed" to help set goals for scientific research or to decide which new technologies should be developed.77 Approximately 85 percent of the public indicated that they did not feel that most citizens had the knowledge needed either to set research goals or to select technologies.78 Contradictory evidence of such willingness to participate was found, however, in a pilot study conducted in 1979 by the Public Agenda Foundation, which concluded that public participation in decisionmaking can extend to priority-setting based

76 Miller, The American People and Science Policy, op. cit.
77 Miller, et al., op. cit.
78 Ibid.

on limiting or restraining certain areas of research.79

Available public opinion data suggest that the public is not unwilling, based on a number of rationales, to restrict the scope of scientific inquiry, especially in areas bearing on human health and safety, such as genetic engineering. The data also suggest that a larger portion of the public would be comfortable with the genetic modification of plants and animals, but that there is substantial concern about work involving changes in the human genetic structure. Other evidence, such as increased demonstrations, publicity, and legislative initiatives, indicates that, in the eyes of the general public, some regulation of experimentation on animals is supported.

It is possible that because the public appears to place such high value on science's contribution to human health and to quality of life, and because usefulness and application play such a significant role in the public's evaluation of scientific priorities, a willingness to regulate may indicate that, in such instances, the perceived risk is believed to outweigh perceived benefit, even though there may be inadequate evidence to sup-

79Public Agenda Foundation, Science Policy Priorities and the Public, a report to the National Science Foundation on a Pilot Project to Assess Public Attitudes About Priorities and Indicators of Quality for Scientific Research, New York, 1983.


Chapter 4

The Mechanisms for Direct Control of Research


At the laboratory bench level, each researcher controls his or her own activities, deciding which questions to answer and how to go about answering them. Controls are also part of the normal procedures of a research field or discipline—for example, the peer review system that governs the contents of disciplinary journals. Other, more formal control takes place at the laboratory or institutional level, through set policies or such mechanisms as review committees. And finally, legal and administrative regulation of research occurs at all levels of government, most likely in response to public opinion or public protest. This chapter looks at the administrative mechanisms for control or influence at all stages of the research process.

In an idealized model of scientific freedom, a scientist sets his or her own research agenda, performs the research without fear of repercussion or criticism, and describes the work to anyone and everyone. Sissela Bok characterizes such freedom as "freedom of limitless thought and unfettered speech."1 For scientists in some fields, that freedom has traditionally been perceived to encompass autonomy of action as well as responsibility for how the work is conducted. Biologist David Baltimore observes that contemporary research in molecular biology, for example:

. . . has grown up in an era of almost complete permissiveness. Its practitioners have been allowed to decide their own priorities and have met with virtually no restraints on the types of work they can do.2

In many research facilities and in many laboratories, open communication has been routine.

1Sissela Bok, "Freedom and Risk," Limits of Scientific Inquiry, Gerald Holton and Robert S. Morison (eds.) (New York: W.W. Norton & Co., 1979), p. 110.

2David Baltimore, "A Biologist's Perspective," Limits of Scientific Inquiry, Gerald Holton and Robert S. Morison (eds.) (New York: W.W. Norton & Co., 1979), p. 37.

Zoologist Alexander Faberge describes the "open traditions" in one field:3

Not only is it normal to discuss one's ongoing research, but also to give away one's research material in the form of genetic stocks, with the verbal understanding that the giver should be given time to publish first. Such genetic stocks, or cultures of organisms, sometimes take years of work to prepare, and are placed in culture collections, freely available.

"It would be an unheard-of matter," Faberge continues, "to keep genetic stocks private . . ." These and similar aspects of the sharing of research data are discussed in a 1985 report from the National Academy of Sciences (NAS) which notes that, in general, access helps in reanalysis and verification.4 In fast-moving areas, the sharing of research data may jeopardize a researcher's patent rights or the commercial return on a discovery, so there are significant forces against openness. Nevertheless, the NAS committee concluded that without data sharing, scientific understanding and progress would be impeded.

Even in early modern science, however, communication of ideas was not totally open. Inventors delayed or repressed publication out of fear of ecclesiastical or political displeasure. Because the community of science rewards priority, researchers have also delayed sharing data until credit is assured, usually through publication. Historian David Hull attributes such secretiveness to science's "intrinsic competitive nature."5 Technological skills and knowledge regarded as applied have not always been tightly controlled by their possessors; skills were assumed to be the property of those who exercised and developed them.6

3Alexander C. Faberge, "Thoughts on the Origins of Secrecy and Openness in Science," American Association for the Advancement of Science, annual meeting, New York City, May 1984.

4National Academy of Sciences, Committee on National Statistics, Sharing Research Data (Washington, DC: National Academy Press, 1985).

5David Hull, "Openness and Secrecy in Science: Their Origins and Limitations," Science, Technology, & Human Values, vol. 10, spring 1985, pp. 8-9.

Although unqualified sharing (especially before publication) has never been the norm, until several decades ago scientists exercised relatively few restraints on their communications to others.

Contemporary attitudes to openness in university science are also influenced by the concept of academic freedom in general. In the United States, "academic freedom" has stood for personal freedom of the academic, rather than the collective freedom of the Nation. Because of this emphasis on the individual, some academics have regarded regulation more as a limit to which they are obliged to submit. Limits have been, therefore, equated with "responsibilities."7 But, Walter Metzger points out, "academic freedom and scientific freedom are different species of freedom. . . ."8

The former is an ideology of a profession, across the disciplines, with common duties; the latter is the ideology of various professions in a discipline (e.g., science). And the latter need not be connected to a university.

Historian Carroll Pursell takes a more pragmatic view. Science, he observes, is always "regulated" in the sense that it is given shape, direction, and impetus by something. There is, he writes, "a tendency to take it simply as the working of some invisible hand until the public (in the form of government, mobs, or whatever) takes a more visible hand. This is, of course, nonsense."9 Communities have, for example, never tolerated (for very long) any researcher who tackles topics outside the boundaries of accepted moral behavior or social beliefs or who knowingly

6Ernan McMullin, "Openness and Secrecy in Science: Some Notes on Early History," Science, Technology, & Human Values, vol. 10, spring 1985, pp. 14-23.

7Walter P. Metzger, "Academic Freedom and Scientific Freedom," Limits of Scientific Inquiry, Gerald Holton and Robert S. Morison (eds.) (New York: W.W. Norton & Co., 1979), p. 102.

8Ibid., p. 107.
9Carroll Pursell, Department of History, University of California, Santa Barbara, personal communication, 1985.

puts the community at risk through hazardous or dangerous procedures (e.g., experimenting with explosives in Times Square). Such "moral regulations"—enforced through social condemnation or disapproval—have been the predominant controls on research for centuries, other than those which arose in connection with military research. Neither Federal nor local governments in the United States had formal laws, rules, or policies by which the subjects or procedures could be controlled.

This environment changed for American scientists in the 1940s. When the United States entered World War II, the scientific community joined in the war effort and, just like millions of other people, scientists relinquished to the Government their personal autonomy over how they did their work. They accepted government control over agenda, over process, and—in the case of information considered to be of military importance—over dissemination even to their colleagues. When almost everyone in science was working behind the secrecy fence, the communications restrictions did not seem so onerous. There was a camaraderie and free exchange that participants recall as frequently greater than in the structure of university departments. Within the Manhattan Project, for example, Robert Oppenheimer successfully convinced General Leslie Groves not to implement irrevocable application of "compartmentalization." Groves wanted to keep scientists from sharing information with their colleagues in other parts of the project; Oppenheimer argued that some decompartmentalization was necessary for progress at both the laboratory and the individual level. Oppenheimer's perspective was that the scientists were the best judges of how to reach their goal. He also believed that the creative scientist required intellectual "elbow room" for the cross-fertilization that could be vital in a new field. Each individual had to feel free to pursue research whatever way he or she wished.


MECHANISMS FOR INDIVIDUAL OR RESEARCH GROUP CONTROL

At the level of individual choice, then, researchers in a democratic system decide what line of research to pursue, how to test their hypotheses, which data to gather and how to gather it, and when and to whom to tell about their results. The scientists simply seize on personal freedoms available to all citizens.

The amateur astronomer provides an excellent example of how little externally imposed controls can affect a researcher who is working outside conventional institutional settings for research, academic or otherwise. There are approximately 10,000 amateur astronomers in the United States, many of them engaged in the search for new astronomical bodies, or in recording astronomical phenomena. Almost all work at their own expense, in return for the reward of discovery, the joy of creative activity. Such an individual can decide what to do, can build his or her own equipment according to any schedule, and can elect to tell everyone or no one about the results. To achieve recognition and acceptance by the community of professional astronomers, however, a researcher must adhere to the standards and norms that govern conduct in the field and must subject that work to review by colleagues, usually through the journal peer review system.

Agenda Controls

In research groups, internal factors may not only direct but also constrain research. At any given point in the development of a scientific speciality, for example, there exists some finite set of research topics that are considered by the members of the speciality to be legitimate, interesting, and feasible.10 If peers do not consider an area to contain "interesting" questions and hence to be intellectually stimulating or professionally rewarding, then researchers may suspend research out of concern for their professional reputations. Influence on the researchers to control or restrain their own work may also come from the social environment, as when the public raises questions about the morality of a project. Other research topics may receive little attention and no funding because peers consider them to be outside the boundaries of "legitimate" science; research on parapsychology often falls into this category. Researchers working on new chemicals for introduction into commerce or on new pesticide formulations may tend to shape their investigations so that the chemical will withstand scrutiny and secure approval under such Federal environmental regulations as the Toxic Substances Control Act or the Federal pesticide laws.

The cost of instrumentation and the internal allocation of resources can be devices for dramatically controlling the research agenda. The major experimental facilities for high energy particle physics research, for example, are few in number, and because the demand for time on the accelerators far exceeds the time available, laboratories

10Daryl E. Chubin and Terence Connolly, "Research Trails and Science Policies: Local and Extra-Local Negotiation of Scientific Work," Scientific Establishments and Hierarchies, Sociology of the Sciences, vol. 6, Norbert Elias, et al. (eds.) (Hingham, MA: Reidel, 1982), p. 203.


Box A.—NIH Study Sections as a Mechanism for Control of Agendas

The peer review system, whereby scientists advise the Federal Government on how to allocate research funds, is representative of a two-pronged mechanism for control of the research agenda: decisions on funding individual proposals both set the current agenda and affect the direction of a field or discipline. The multilevel process used by NIH provides a good example of such controls.

A research proposal submitted to NIH is first scanned by an officer in the Division of Research Grants, who refers the proposal to the most relevant of 45 chartered study sections and to the most appropriate Institute of NIH. Each study section is composed of approximately 15 scientists and a full-time executive secretary who is an NIH scientist. The executive secretary nominates members of the study sections for selection by the NIH Director. Study section members serve a 4-year term and meet three times a year to review NIH proposals.

The study sections assign to each proposal a priority score based on scientific merit. Reviewers are not aware of the total budgetary requests on the proposal. The relevant Institute then receives the ranked proposals and passes the funding decision on to its advisory council, which is composed of both scientists and public figures. Because institutes with larger budgets are able to fund more projects, some critics claim that high quality research in some disease categories may go unfunded, while lower priority or lesser quality research may be funded in more "glamorous" disease categories. The study sections also conduct workshops to report on the status of the field, thereby providing long-term direction to the field and sending signals to the scientific community about areas of potential interest to the granting agency.

adopt specific criteria for selecting experiments and assigning priorities to them. The International Committee for Future Accelerators of the International Union of Pure and Applied Physics recommends four criteria be used for selecting experiments and determining their priority: 1) scientific merit, 2) technical feasibility, 3) capability of the experimental group, and 4) availability of the resources required.

At the individual level, research may be restricted not only by mechanisms driven by political, economic, or professional concerns but also by personal values. Especially when alternatives are limited, individuals "feel forced to choose among projects they would normally not consider, the moral questions . . . become more immediate and controversial."11 The decision not to participate in military-supported or weapons-related research, therefore, may also be a decision to alter one's lifetime research agenda. This decision is a personal one, an individual rather than a collective control.12

In each of these instances, the regulatory force may actually be outside the research group, but the group or the individual chooses to suspend a line of research in response to either moral or economic pressure.

Controls on Procedures

Within each scientific field or research speciality, the social influence of tradition and "standard practice" also governs aspects of the research process. Even though these practices are voluntary, they carry the authority of social norms. On occasion, they may later form the basis for institutional or governmental regulation.

Many controls are linked to the formal principles and rules that govern admission to a profession. Since antiquity, the medical profession, for example, "has formalized principles and rules of conduct for its members in prayers, oaths, and codes."13 Universal codes adopted by more than one scientific field may relate basic ethical and moral principles to research practice. The Nuremberg Code of 1947, for example, serves as the model for many international and national codes pertaining to clinical research in general and to such topics as organ transplantation.14 Another

11Rosemary Chalk, "Drawing the Line: Science and Military Research," unpublished manuscript, May 1983, p. 24.
12Ibid., p. 22.
13Judith P. Swazey, "Protecting the 'Animal of Necessity': Limits to Inquiry in Clinical Investigation," Limits of Scientific Inquiry, Gerald Holton and Robert S. Morison (eds.) (New York: W.W. Norton & Co., 1979), p. 136.
14Ibid., p. 132.


important code has been the World Medical Association's Declaration of Helsinki (1964, revised 1975).

In some research areas, laboratory groups have not refrained from imposing voluntary clinical moratoria—prohibiting researchers from, for example, using on patients a procedure still considered to be experimental. Such a moratorium can last weeks, even years.15 It is linked to perceptions of the risks associated with premature use of a procedure and to the incomplete nature of the research process, however, not to the topic.

The most dramatic instance of a voluntary moratorium on basic research was, of course, that imposed by molecular biologists in the 1970s. Recombinant DNA regulation arose first in the form of a moratorium called by researchers in the field. What began in private discussions was dramatically brought to public attention when, as described in chapter 2, biologists proposed a voluntary suspension of certain types of genetic research. This extraordinary step was followed by the 1975 Asilomar meeting, when researchers from the United States and elsewhere discussed the appropriateness of continuing the moratorium. After that meeting, action moved to the Federal Government level—to legislation and the formal development of National Institutes of Health (NIH) guidelines for the research.

Communication Controls

Through the centuries, individuals have also exercised self-restraint in dissemination of research

15Judith P. Swazey and Renee C. Fox, "The Clinical Moratorium: A Case Study of Mitral Valve Surgery," Experimentation With Human Subjects, Paul A. Freund (ed.) (New York: George Braziller, 1969).


results, either by withholding the information altogether or by delaying dissemination for a short period of time. Scottish mathematician John Napier, who had experimented with a new form of artillery around 1600, "took great pains to conceal the workings of his invention."16 In 1947, Massachusetts Institute of Technology professor Norbert Wiener refused to supply a paper of his to an industry scientist engaged in military research.17 In the 1980s, many biologists assert that researchers in certain areas are refraining from sharing information and substances with colleagues.18

The science community as a whole may support an investigator's desire to avoid premature disclosure, especially when data are incomplete or not yet published in a refereed journal. Such delay may also be linked to the researcher's desire to maintain professional security through temporary but exclusive control of knowledge. A dramatic example of group self-censorship occurred before World War II when a group of physicists tried to limit the publication of scientific research relating to nuclear fission. By the middle of 1940, most physics journals had agreed informally "to delay the publication of any article that might help a knowledgeable scientist build an atomic bomb."19

16Cited in Chalk, op. cit., p. 9.
17Norbert Wiener, The Atlantic Monthly, letter to the editor, January 1947; reprinted in Science, Technology, & Human Values, vol. 8, No. 3, summer 1983.
18Biologists interviewed by Sandra Panem "uniformly agreed that there was more . . . free information exchange in the 1970s than currently." Sandra Panem, "The Interferon Dilemma: Secrecy vs. Open Exchange," The Brookings Review, winter 1982, p. 20.
19Michael M. Sokal and Janice F. Goldblum, "From the Archives," Science, Technology, & Human Values, vol. 10, spring 1985, p. 24.

REGULATIONS IMPOSED BY INSTITUTIONS

The attitudes and policies of research organizations can regulate the agenda, procedures, and communication of a project, through both informal guidelines enforced by social pressure and formal rules enforced by threat of dismissal or penalty. Although the extent and stringency of rules may vary, most organizations have specific experimental protocols and safety procedures and have policies on what subjects are not acceptable. A research group may decide deliberately to take up or drop a specific line of research for political or moral as well as scientific considerations. Those decisions—whether to pursue certain topics, or how to disseminate results—can reflect such factors as: 1) fear of social criticism or protest, 2) accommodation to the pressures of the surrounding social or political climate, or 3) a desire to protect property rights (e.g., when a team delays publication until assured of patent protection or keeps a project secret in order to assure first publication).20

As a result of discussions from the Vietnam era, some U.S. research laboratories or universities decided not to allow classified research to be performed on their campuses.21 Ohio State University, which accepted over $5 million in defense contract research in 1982, now bans any campus research on offensive weapons.22 Other universities impose procedural restrictions—e.g., on whether classified work entailing any restrictions on publication may be accepted23—or have placed such work off-campus and attempted to insulate it from the research environment of undergraduate or graduate students.

To enforce such administrative policies on acceptable research topics or procedures, organizations employ a variety of mechanisms, ranging from informal guidelines and review committees, to formal administrative rules. In most cases, these controls are enforced through social pressure or reprimand. Many universities have adopted guidelines for the acceptance of externally sponsored research which govern, for example, the terms of university-industry cooperative projects, the use of human subjects in experiments, or the handling of dangerous biological materials.24 Such policy documents may govern with a velvet glove, however. As the Harvard University guidelines note, "The pursuit of truth in the academic community is impossible without a measure of mutual trust between its members, and no set of detailed principles and criteria can be a substitute for this

20As described in, for example, James D. Watson, The Double Helix (New York: Atheneum, 1968).

21Robert C. Cowan, "Degrees of Freedom," Technology Review, August/September 1985, p. 6; also see Dorothy Nelkin, The University and Military Research: Moral Politics at MIT (Ithaca, NY: Cornell University Press, 1972).

22Kansas City Times, Jan. 13, 1983, p. A-7.
23Chalk, op. cit., p. 13.
24As described in, for example, Nicholas H. Steneck, "The University and Research Ethics," Science, Technology, & Human Values, vol. 9, fall 1984, pp. 6-15.

trust."25 The Harvard report further points out that:

. . . the principal means by which the faculty exercises control over the quality of the scholarly activities of its members is through its role in recommending the selection of its own members and through the professional standards that it and the University apply in the selection process.26

To administer such rules on a routine basis, universities and private laboratories set up institutional committees to review safety procedures, to decide which topics to pursue, or to review the quality or process of publication. Committees are dominated by members of the institution and are appointed and funded by the institution. Many exist as a direct result of Federal regulations tied to grant or contract funds. Institutional Animal Care and Use Committees, Institutional Review Boards, and the Institutional Biosafety Committees, for example, are all required of institutions receiving funds from the Public Health Service, NIH, or other compliant agencies. Institutional review boards, which are (administratively and financially) local committees of the institution, nevertheless must include both scientist and nonscientist members from the local community, all of whom "examine and pass judgment on the risks and benefits of a proposed study and on the adequacy of the consent proceeding as described in the research protocol."27

Some institutional committees govern the administration of a specific research program. For example, in 1977, the Monsanto Co. and Harvard University set up a special independent advisory committee to oversee aspects of their $23 million research agreement. The five-person committee, established out of concern for the public interest, assures that "both sides honor their contractual promises to protect academic freedom—namely, the right to publish—and to develop any products that may emerge in a manner consistent with the public good."28 The Health Effects Institute,

25Report of the Committee on Criteria for Acceptance of Sponsored Research in the Faculty of Arts and Sciences (Cambridge, MA: Harvard University Press, October 1983), p. 3.

26Ibid.
27Swazey, op. cit., p. 139.
28Barbara J. Culliton, "Harvard and Monsanto: The $23-Million Alliance," Science, vol. 195, Feb. 25, 1977, p. 759.


a private research firm, insulates scientists from pressure by using two independent committees, one that creates the research agenda, another that reviews finished work.29 In the national laboratories, visiting committees, composed of scientists from other institutions, are used to evaluate the quality of work in the lab.

Institutional restraints on communication may result not just in suspension of publication but also in deliberate, albeit temporary, delays. At a 1984 seminar on secrecy in science, many university deans of research remarked that it was not unusual for them to receive requests to delay the submission of a Ph.D. dissertation to University

29Science, Feb. 15, 1985, p. 729.

Microfilms, Inc. (a general clearinghouse for U.S. theses and dissertations) and that it was not unusual for graduate schools to cooperate—as a matter of policy—by granting 1-year delays.30 Typical reasons for such requests were: 1) to allow the student time to seek patent protection, 2) to allow time for first publication in a journal of record, 3) to provide a degree of protection for industrial sponsors of research, 4) to protect the safety and welfare of informants used in the research, or 5) to protect militarily-sensitive information.

30"Openness and Secrecy in Scientific and Technical Communication," seminar, Vanderbilt University, Nashville, TN, Sept. 24, 1984.

CONTROLS BY PROFESSIONAL ORGANIZATIONS IN SCIENCE AND ENGINEERING

Professional codes or industry association guidelines may act as a regulatory force on the research conduct of members. These rules are voluntary standards that reflect private consensus on public matters. They are enforceable primarily through the social pressure of membership in the association and hence are effective only when such membership is useful or necessary to acquiring or maintaining employment in the field or in acquiring a government grant or contract.

In some scientific fields these rules may form the foundation for State licensing procedures—as in the case of physicians or engineers. In other fields, the codes may pertain to accepted procedures in the field or to testing. The American Psychological Association, for example, has issued guidelines for research psychologists who use animals, as have such groups as the Society for Neuroscience, the Society of Toxicology, and the International Society for the Study of Pain.31 The American Psychological Association has also developed a set of Ethical Principles in the Conduct of Research with Human Participants. The American Anthropological Association and the American Sociological Association adopted new codes of ethics for research in 1971. The Evaluation Research Society has established professional standards for evaluation research. The American Chemical Society developed the Chemists' Creed and Professional Employment Guidelines, which set standards for practice in research settings. Most scientific societies have, at minimum, guiding principles for the conduct of research. Some have extensive and detailed guidelines for agenda, procedures, and communication of research results.

31Science, vol. 228, May 17, 1985, p. 830.

Recent concern about the leakage of militarily sensitive information to the Soviet Union from the United States and its western allies has led the Federal Government to request the specific cooperation of the scientific and technical societies in regulating communication. The Government has asked societies to restrict access to certain meeting sessions—that is, the session would be neither classified nor open to all meeting participants. The American Association for the Advancement of Science Committee on Scientific Freedom and Responsibility has identified examples of "self-imposed restrictions" in a few professional societies that exclude non-U.S. citizens from meeting sessions dealing with militarily sensitive topics.32 The Society for the

32Robert L. Park, "Intimidation Leads to Self-Censorship in Science," Bulletin of the Atomic Scientists, vol. 41, No. 3, spring 1985, pp. 22-25.


Advancement of Material and Process Engineering and the Society of Manufacturing Engineers, among others, have voluntarily censored themselves, limiting attendance at their meetings (or at specific meeting sessions) to U.S. citizens only.33 At least half a dozen professional societies are either reconsidering or reformulating their policies on restricted meetings.*

33Janice R. Long, "Scientific Freedom: Focus of National Security Controls Shifting," Chemical & Engineering News, July 1, 1985, p. 9.
*The AAAS is distributing a survey to determine the extent and pervasiveness of this trend.

MECHANISMS FOR GOVERNMENT REGULATION

Federal, State, and local governments use legislation, executive (e.g., Presidential) directives, agency rulemaking, and so forth, to exert control on the conduct and dissemination of research. Such mechanisms differ from entity to entity.

A comparison of two Federal agencies that regulate biomedical research demonstrates how different mechanisms can constrain research. The Food and Drug Administration (FDA) closely monitors drug research and applies its program of control—e.g., in requiring the assurance of consent by subjects involved in clinical trials—uniformly throughout the United States. Tighter requirements for the use of human subjects were instituted in 1962. This organizational approach is tied to FDA's principal mission of regulation aimed at protecting the consuming public; its enforcement power to regulate research comes from the fact that it must approve the marketing, advertising, and distribution of all drugs sold in the United States. The principal regulatory efforts of this agency are thus directed at research in the pharmaceutical industry.34 In contrast, NIH, as an organization that supports and conducts basic research, applies a philosophy of encouraging academic freedom and imagination in the research it supports through its extramural project grants program.35 The NIH approach uses a system of decentralized, institutional review committees that operate under generalized ethical guidelines. NIH also "takes direct responsibility for the protection of research subjects under its own system of national review of project applications."36 Its principal regulatory effects are felt in the university research labs, although the regulations on molecular biology research or on the use of human subjects in experiments have also been applied to some industry research.

Because of the variety of mechanisms and enforcement in the executive branch agencies, new fields often face a thicket of duplicative or even conflicting requirements. The December 31, 1984, Proposal for a Coordinated Framework for Regulation of Biotechnology37 issued by the Office of Management and Budget is evidence of increasing concern about this problem. In May 1984, the White House Cabinet Council established a working group on biotechnology to review Federal regulatory rules and procedures relating to the biotechnology industry. All three affected agencies (FDA, Environmental Protection Agency, and U.S. Department of Agriculture) would review biotechnology products and processes. Under the Framework, review would proceed on a case-by-case basis, each with its own staff, consultants, and expert scientific advisory committees. The Recombinant DNA Advisory Committee (RAC) would continue to oversee rDNA experiments related to biomedical research, and the National Science Foundation (NSF) would form a review committee to examine the potential environmental effects of basic research experiments employing rDNA. All five advisory committees would report to a parent committee, the Biotechnology Science Board, which would receive summaries of all recombinant DNA, recombinant RNA, or cell fusion applications and may undertake itself,

34Alexander M. Schmidt, "The Politics of Drug Research and Development," The Social Context of Medical Research, Henry Wechsler (ed.) (Cambridge, MA: Ballinger Publishing Co., 1981), pp. 233-262.
35William J. Curran, "The Approach of Two Federal Agencies," Experimentation With Human Subjects, Paul A. Freund (ed.) (New York: George Braziller, 1969), p. 449.
36Ibid.
3749 Federal Register 50856-50907.

Page 61: The Regulatory Environment for Science - Princeton

or request that the agency committee review, aspecific proposal. In addition, the Board wouldevaluate review procedures and committee re-ports, conduct evaluations of broad scientific is-sues relating to this research, develop guidelines,and provide “a forum for public concern. ” As ofthis writing, NIH is in the process of reviewingcomments on the proposed regulations, and nofinal rule has been issued.

Such coordination is unusual, however. Normally, government regulation of research is administered through a number of uncoordinated and therefore potentially conflicting mechanisms. These include: national or local commissions to review general procedures and policies; tax credits; legislative review and veto; moratoria; controls on acquisition or possession of materials needed for research; interpretation of regulations; contract provisions; and publication or communication review or classification.

Agenda Controls

Governmental Review Commissions

The Government may set up a commission or board to review research, either with an eye to improving research procedures or to resolving some dispute. An example at the local level is the Cambridge Experimentation Review Board, which ruled on the safety of rDNA research in the 1970s at the request of the Cambridge City Council. At the national level, the NIH RAC approves experimental procedures and must consider any proposed changes in the NIH Guidelines for Research Involving Recombinant DNA Molecules. The NIH committee, composed of both scientists and nonscientists, meets several times a year to examine special cases of recombinant DNA research, petitions for exemption, and proposals for changing the Guidelines. The Committee must also recommend to the Director of NIH any proposed change in the Guidelines.

National commissions have been used to formulate guidelines for research, but most have not had any power to restrict or delay actual research—e.g., the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, established by Congress in 1974 to develop guidelines for such research.


Tax Credits

Legislation offering tax credits or similar financial incentives to encourage or discourage research is usually passed to provide incentives for private funding on designated topics (e.g., research on alternative energy sources). In 1980, Congress gave small businesses and universities the opportunity to obtain patent rights on inventions developed with Federal funds.38 In response to the energy crisis of the 1970s, over 30 States passed laws to promote solar energy. Most of these laws provide tax incentives (income or property tax reductions) to stimulate private sector research, development, and commercialization of solar systems.39 Congress has been similarly active in passing legislation that promotes research on alternative energy systems. And the 98th Congress enacted the Orphan Drug Amendments to the Food, Drug, and Cosmetic Act40 in order to stimulate private research and development (R&D) of drugs for rare diseases.

The Internal Revenue Code also currently allows businesses the option of deducting or amortizing expenditures for research and experimentation over a period of 60 or more months. The Economic Recovery Tax Act of 1981 (Public Law 97-34) provides a 25 percent tax credit for incremental research expenditures made after 1981. That legislation reflects a deliberate attempt by Congress to reduce tax burdens in order to stimulate research. It includes tax credits for laboratory equipment leases, and for portions of payments to universities to perform basic research.41

These provisions are set to expire at the end of 1985. To stimulate research by private industry, Congress has tried to lower the cost of private R&D through a combination of tax policy, direct spending, and patent legislation.

Social science research was specifically excluded from the areas of research spending for which a tax credit is allowed. Bills introduced in the 98th Congress, which would have created additional incentives for corporate investment in research and development, also excluded social science research from their definition of "qualified research" for which a corporation may take a tax credit.

The Congressional Budget Office estimates that the Federal Government gives the high-technology research system approximately $1.5 billion in the form of tax credits.42 There are currently three major tax-related mechanisms that high-tech research industries may employ: 1) straight deductions, year by year, of research expenses such as salaries and equipment [Internal Revenue Code, Section 174]; 2) the R&D tax credit, which expires in 1985 and allows a company to deduct from its taxes 25 percent of amounts that exceed the previous 3-year average of amounts the company spent on research [Public Law 97-34]; and 3) the funding of research through R&D limited partnerships, possible through several provisions of the tax code.

Legislative Review

Congress has several times attempted to control specific research projects or types of projects through legislative review of proposals or projects. In 1975, Representative Robert E. Bauman of Maryland introduced an amendment to the NSF authorization bill (H.R. 4723) which would have allowed Congress to review all NSF proposals prior to the final awarding of the grants. The Bauman amendment passed the House, but was deleted in Senate subcommittee and therefore not included in the final bill. The strongest argument against the amendment was the burden that would be placed on Congress to review thousands of grants. A second argument was that Members of Congress are not scientists and therefore are not necessarily competent to judge specific research proposals. What might sound frivolous and inconsequential to a layperson can be of major importance to scientific development. A third argument was that consistency would require Congress to oversee the grants made by all other agencies, including the Department of Defense (DOD). Other arguments included the added length of time to receive a grant, the politicizing of the award decisionmaking process, and the possibility of making NSF more conservative (and possibly less innovative) in its effort to please Congress. In 1983, the Supreme Court effectively ruled in INS v. Chadha (103 S. Ct. 2764) that legislative vetoes, such as that proposed in the Bauman amendment, were unconstitutional.

42Congressional Budget Office, Federal Support for R&D and Innovation, April 1984, pp. 76-83.

In the 99th Congress, Representative Robert G. Torricelli of New Jersey introduced a bill entitled the "Information Dissemination and Research Accountability Act" (H.R. 1145), which has a similar evaluative intent. The purpose of the bill is to "promote the dissemination of biomedical information through modern methods of science and technology and to prevent the duplication of experiments on live animals." It calls for a National Center for Research Accountability, located within the National Library of Medicine, which would provide for a comprehensive, full-text literature search before any research proposal involving the use of live animals could be approved. Thus, all animal research proposals would be funneled by the potential granting agency through the Center prior to approval. The President would appoint 20 persons to serve as members of the Center. Critics of the bill claim that the process would unnecessarily prolong the already extensive grants review process and would duplicate the peer review process. In addition, the duplication of research that the bill intends to eliminate is often an important and required aspect of research in that it enhances validity and reliability. Proponents of the bill feel that it will prevent the unnecessary use of animals in research.

Moratoria

The most dramatic device used by government to control research is the legally enforceable ban or moratorium. Moratoria on science—or the proposal of same—are effective ways to focus attention on a serious issue or to express a political perspective that the researchers appear to have been ignoring. The 1974 act prohibiting experimentation on human fetuses (see box B) is an example of such a ban. In 1975, the Department of Health, Education, and Welfare prohibited the funding of research on in vitro fertilization without review.

43Michael Gold, "Research Off Limits," Science 85, vol. 6, April 1985, p. 36.


Box B.—Fetal Research

The controversy over research on fetuses or fetal tissue illustrates the implementation and effect of various mechanisms for government control of research.

Between 1970 and 1972, advisory groups within the National Institute of Child Health and Human Development debated NIH policies on the review and funding of human fetal research. Reports of abuses by researchers in other countries had led to pressure on NIH to declare a moratorium on any research with the living fetus before or after abortion, lest such abuses be repeated in the United States. Congressional debate and legislation (National Research Act, Public Law 93-348, Section 213) created the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, which produced recommendations that became the core of the Federal regulations. The Commission intended the existing independent, local institutional review boards (IRBs), with the addition of a national Ethics Advisory Board, as the means for control and consideration of fetal research. The fetal research guidelines were adopted as Federal regulations on July 29, 1975, but an Ethics Advisory Board was not chartered until 1977 or convened until 1978, so a de facto moratorium existed during this time on both fetal research and in vitro fertilization.

The Ethics Advisory Board was allowed to die in 1980 and its absence means that the privately supported IRBs are still the only locus of practical control. Supplementing the Federal regulations are 25 State statutes and the Uniform Anatomical Gift Act (passed in all 50 States by 1973). That act governs research on dead fetuses generally by including explicit language to that effect. Department of Health and Human Services (DHHS) regulations cover only DHHS-funded research or research institutions; other research is governed by State statutes, when they exist. They range from strict (no research except that of therapeutic value to mother or fetus) to those more liberal than the Federal regulations.1

In summer 1974, the National Commission for the Protection of Human Subjects declared a national moratorium on all fetal research, which remained in effect until the Commission issued interim regulations in 1975.

In several cases in Boston in the 1970s, State-level protest also attempted to halt fetal research. Because the research in question used dead fetal tissue obtained from abortion procedures, antiabortion groups charged that publication of findings based on research using such tissue was unethical. In response to the public protest, the 1974 session of the Massachusetts legislature passed "An Act Prohibiting Experimentation on Human Fetuses," which treated such research as a felony offense punishable by up to 5 years in prison. The law (amended in 1976) applies only to fetal tissue and does not prohibit experimentation on live fetuses in efforts that might be considered "beneficial" to the fetus. Many legal scholars consider the law to be imprecise and, because of that imprecision, to have the potential for producing a "chilling effect" on researchers. On the other hand, opponents of the research have pushed for stronger legislation that would make violation a criminal offense. Controversy has continued with the introduction of legislation in Congress proposing a national moratorium on research on live fetuses.

Rules restricting federally funded research on human fetuses, set in place in 1974, specify that individual research proposals must be reviewed by a federally appointed Ethics Advisory Board. When the Ethics Advisory Board was allowed to lapse in 1980, the effect was a de facto ban. No research in this area can be done until it is reinstated. Four succeeding Secretaries of DHHS have failed to lift the moratorium. In January 1984, the NIH sent a request to DHHS for reestablishment of the board, but as of this writing, the Secretary has not acted on the request.

1John C. Fletcher and Joseph D. Schulman, "Fetal Research: The State of the Question," Hastings Center Report, vol. 15, April 1985, pp. 6-11.


A moratorium was proposed in 1976 on all R&D surrounding laser isotope separation of uranium. The proposal was to suspend all projects in the United States, "pending the results of efforts to achieve agreement with other industrialized nations to halt their work in this area."44 The proposers acknowledged the complexity of implementation (and improbability of success) of such a moratorium, but used the proposal itself as a way to further public discussion of the policy on proceeding with the research. The Lilienthal-Acheson proposals in 1946 for regulation of atomic energy would have entailed a similar ban on research in certain sensitive fields.45

Controls on Procedures

Acquisition or Possession of Materials Used in Research

Research may be regulated through controls on the acquisition or possession of the chemicals or other substances necessary to do the research. For example, the Environmental Protection Agency (EPA) is authorized by the Toxic Substances Control Act (TSCA) to regulate the approximately 60,000 chemicals subject to the act, at all stages of their development.46 TSCA recordkeeping and reporting requirements apply to research on chemicals, and additional regulatory requirements apply when the chemicals are introduced into commerce.47

Federal regulations control the possession, use, and disposal of radioactive substances, including their use in research. Handling of nuclear and radioactive materials is governed primarily by the Nuclear Regulatory Commission (NRC), under the Atomic Energy Act of 1954.48 The NRC has regulatory power over any materials made in a reactor. It does not have any jurisdiction over accelerator materials or over naturally occurring radioactive materials, although some States do regulate these substances. If the research institution is located within an "agreement" State, the investigator must be licensed by the State radiation control agency. If the institution is within a "nonagreement" State, it must apply to NRC for a license. All research must comply with Federal and State OSHA regulations, which regulate exposure to toxic substances in laboratories.

44Barry M. Casper, "Laser Enrichment: A New Path to Proliferation?" Bulletin of the Atomic Scientists, January 1977, pp. 28-41.

45The complete text of the plan was contained in a State Department document, Publication 2498 (1946).

4615 U.S.C. Section 2601.

4715 U.S.C. Section 2607.

48See especially Title 10 of the Act.

For research using radioactive substances, then, there are various degrees of licensing and permits, depending upon what is done and the substance in question. Regulatory jurisdiction varies from State to State and between the Federal Government and the State. In addition, there is a tiered system of control based on the quantity of material in question. (NRC has established categorical exemptions from certain regulatory requirements for certain low-level radioactive materials.) The requirements for a license pertain to the personnel and their qualifications, the facility, uses of the material, estimated human exposures, recordkeeping and reporting systems, and disposal practices.49 Large institutions conducting a lot of research may apply for a broad license, which delegates considerable decisionmaking power to the institution's Radiation Safety Committee. In all cases, radiation safety officers must be approved by the licensing agency, and NRC must know and approve the qualifications of such individuals. Most research institutions have a radiation safety officer, even if they do not have a radiation safety committee. Licenses can be very precise, authorizing one investigator to use a specific material for a specific period of time.

There are also regulations for disposal of radioactive waste. All Federal institutions must comply with the Clean Air and Clean Water Acts, if applicable. In addition, the institution must apply for a discharge and disposal permit from the State air pollution agency and/or the State water pollution agency. Both Federal and local regulations may apply to disposal as well.

If an institution is found to be in violation of the laws governing radioactive materials, civil monetary penalties can be imposed, and its license suspended or revoked; in addition, the organization would probably receive damaging press coverage. At large institutions, safety precautions may be emphasized, therefore, because of the consequences of a single mistake, primarily the suspension of all of the institution's research using radioactive materials.

4910 CFR Parts 30-32.

Department of Transportation regulations enacted under the Hazardous Materials Transportation Act govern shipments of certain research materials (e.g., radioactive materials, etiologic agents, poisons, corrosives, flammables) by requiring specific carriers, containers, or handling practices.50

Federal laws prohibiting the possession of narcotics act to restrict research in a number of fields. Researchers who wish to use controlled or illegal substances as a legitimate part of their research must first register with the Department of Justice; they are then investigated by the Drug Enforcement Administration and their research is validated by FDA. If approval is granted, the researcher must request the drug from the National Institute on Drug Abuse. If the research also involves human subjects, additional reporting requirements must be fulfilled. According to the Drug Abuse staff of FDA, the administration of these regulations used to be very time-consuming, but recent streamlining of the approval process and an apparently diminished interest in research involving controlled substances have reduced the number of requests and the time required to process them.

Requirements for Procedural Review Committees

Various Federal and State commissions have been given principal responsibility to implement legal regulations that apply to research procedures—in the case of the rDNA commissions, requirements for physical containment of the biological materials used in the research. By focusing on a very specific step in the research process, the Federal Government was able to establish standards of protection for workers and the community, as well as to set criteria for researcher (and institutional) accountability.51 Once the rDNA guidelines were set (with the aid of the scientists), the discussion moved back to the sphere of the policymakers and administration officials, who were under political pressure from environmental and public interest groups.

5049 U.S.C. Sections 1803 et seq. and 49 C.F.R. Parts 100-179.

51Dorothy Nelkin, "Threats and Promises: Negotiating the Control of Research," Holton and Morison, op. cit., p. 199.


Regulation may also occur at the very end of the research process, as in EPA regulation of field testing of new insecticides or of agricultural research involving rDNA. In a recent case involving insecticide-producing soil bacteria, the EPA has required the company to apply for an experimental use permit and to submit more research data on various aspects of the bacteria (e.g., its longevity in soil and its effect on nontarget species), an action that has the effect of restricting or delaying the use and dissemination of results from the project.

The Federal Government may also require that an institution set up review committees to monitor, approve, and shut down projects. The three principal types are Institutional Biosafety Committees (IBC), Institutional Review Boards (IRB), and Institutional Animal Care and Use Committees.

IBCs are mandated by the NIH Guidelines for Recombinant DNA Research and have served as the major locus of responsibility for oversight of that research since 1978. The latest NIH regulations52 stipulate that an IBC must have "no fewer than five members so selected that they collectively have experience and expertise in recombinant DNA technology and the capability to assess the safety of recombinant DNA research experiments and any potential risk to public health or the environment." At least two members must have no affiliation with the institution apart from their membership on the IBC and should be chosen to represent the interest of the surrounding community with respect to health and protection of the environment. Unless exempt from review under the Guidelines, all rDNA research must receive IBC approval. Certain categories of research considered to be of questionable or high risk must be referred to the NIH Recombinant DNA Advisory Committee for approval before work can be initiated. There are currently 301 IBCs registered with the Office of Recombinant DNA Activities of NIH. Approximately 250 are academic IBCs; the rest are industrial. Compliance by industry is voluntary.

"Basic DHHS Policy for Protection of Human Research Subjects"53 requires that all research involving human subjects conducted by the Department of Health and Human Services (DHHS), or funded in whole or in part by a Department grant, contract, cooperative agreement, or fellowship, undergo review by an IRB. (See box C.) As with an IBC, IRB membership is specified in the Federal regulations. The IRB must have five members and, per the Guidelines, "be sufficiently qualified through experience and expertise of its members, and the diversity of the members' backgrounds including consideration of the racial and cultural backgrounds of members and sensitivity to such issues as community attitudes, to promote respect for its advice and counsel in safeguarding the rights and welfare of human subjects." According to the Office for Protection from Research Risks, the office responsible for the implementation of the IRB regulations, there are currently over 5,000 operating IRBs in the United States. A 1979 study by Jeffrey M. Cohen and William B. Hedberg determined that, in 1 year, a typical IRB was in session 41 times and reviewed 278 proposals, involving approximately 80,000 potential research subjects.54 That IRB cost the university $36,000 during the year, or about $130 per proposal.

5249 Federal Register 227, Nov. 23, 1984.

5345 CFR 46.

Institutional Animal Care and Use Committees (IACUC) are required of all institutions that receive Public Health Service (PHS) funds for research involving animals. Under revised PHS policies released May 1, 1985, those committees, which must have lay members, will have the responsibility for reviewing research plans and monitoring compliance. Prior to the establishment of these guidelines, nearly 1,000 institutions already had animal assurances through PHS. It is not known how many new committees will have to be established once the guidelines take effect. The new guidelines also require that all animal facilities be accredited by the American Association for the Accreditation of Laboratory Animal Care, a voluntary association that certifies animal handling facilities, or conduct an assessment based on the NIH Guidelines for the Care and Use of Laboratory Animals. Accreditation is a necessary condition if an institution wants to receive PHS funds for research involving animals. By January 1, 1986, each institution receiving PHS funds must submit an "assurance of compliance" (containing such documents as descriptions of the facilities and the membership and procedures of the local IACUC) to the NIH Office for Protection from Research Risks.

54Jeffrey M. Cohen and William B. Hedberg, "The Annual Activity of a University IRB," IRB, May 1980, pp. 5-6.

Contract Provisions

The provisions of research grants and contracts are used to enforce such things as limitations on spending, allocations of funds among budget categories, requirements that organizations be accredited or meet certain accreditation standards, and requirements that research designs or procedures be approved by the monitoring committees described in the previous section.

In the case of recombinant DNA research, the Asilomar recommendations were adopted by NIH, which then issued a set of guidelines covering research in designated categories. The guidelines applied to all recombinant DNA research funded by NIH. Today, the amended NIH Guidelines for Recombinant DNA Research set forth the generic requirements for safety in recombinant DNA research. These safety requirements apply through contract provisions to all recombinant DNA research in the United States which is conducted at or sponsored by an institution that receives any support for rDNA research from NIH.55 Failure to comply can lead to termination of NIH funding or other NIH sanctions. Although limited to institutions funded by NIH, the Guidelines have been adopted or followed by virtually all Federal agencies, State and local agencies, and private organizations.

Communication Controls

Reporting requirements in contracts can include regulations on the "deliverables" of a project, usually through prepublication review. Such contractual provisions are currently being invoked in order to restrict the flow of information believed to be linked to national military security or related to the Nation's ability to compete in world markets. Contract provisions have been used, for example, to prohibit foreign nationals from be-

5545 Federal Register 77384, Nov. 21, 1980, and 77409.


Box C.—Research on Human Subjects

Until the 1960s—with the exception of the Nuremberg Code (1947)—there were no legal decisions from the courts and no Federal or State laws concerned directly with how humans were used in experiments (again, with the exception of violations of criminal law). NIH and PHS first began in 1953 to develop formal standards for human experimentation, but these efforts were not productive. The turning point was the increased volume of clinical investigations funded by the Federal Government (the shift in research performer) and government regulation of experimental drugs (manufacturing, distribution, and safety) in interstate commerce through FDA. Another factor was the thalidomide tragedy abroad, which triggered the 1962 Kefauver-Harris Amendments to the U.S. Food and Drug Act. Congress, in debate and not in the original bill, added the requirement to the law that subjects or patients be informed that they were to receive an experimental drug not fully licensed by the Federal Government and that their consent be obtained prior to receiving it. In 1966, the U.S. Surgeon General issued the first PHS Policy and Procedure Order for extramural research on human subjects.1 This Order required all institutions receiving PHS funds to review projects for the potential of abuse, and it led to the formalization of the current system of IRBs.2

At present, most categories of human subjects research funded by Federal money or conducted at institutions that receive Federal subsidies are subject to some form of review or regulation. Federal agencies with oversight in this area specifically list those categories that are exempt or not subject to regulation; but research conducted without direct or indirect Federal funding is not subject to human subjects regulations. In addition, some populations may not be adequately protected by existing regulations.

All research funded in whole or in part through DHHS by direct award, cooperative agreement, or fellowship, then, is subject to human subjects regulations [45 C.F.R. 46, Section 46.101]. Research conducted at or sponsored by institutions which do not receive DHHS funds is not legally required to comply, but according to Charles McKay, of the NIH Office for Protection from Research Risks, 96 percent of the 500 major institutions conducting research apply DHHS regulations to research not funded by DHHS. Compliance is indicated through a statement of assurance. Regulations of FDA cover clinical investigations with regard to products specified in the Food, Drug, and Cosmetic Act for marketing. These include drugs, biologicals, blood and blood products, devices, and food additives. In addition, the U.S. Department of Agriculture (USDA) regulates clinical investigations of food additives. For the Department of Defense, the Veterans Administration, and 20 other agencies, varying degrees of regulations are tied to direct or indirect use of their funds. All use IRBs and informed consent practices. There is an effort underway by the Office of Science and Technology Policy—and near completion—to have cross-agency uniform regulations very similar to the existing DHHS regulations.

Despite these efforts, research on human subjects does take place under conditions or in institutions exempt from DHHS regulation. DHHS itself exempts educational research, test development, interview and observation research, and research involving specimens and/or medical records, as long as the information taken from these sources is recorded in such a manner that subjects cannot be identified and the information revealed would not place the subject at risk. The Department of Justice has exemptions in its coverage of prison research, and has legal prohibitions against regulation of research on recidivism and probation. The Department of Defense exempts epidemiological research from regulation. Industrially related research (e.g., academics doing consultative work designing systems to improve worker efficiency) is not subject to regulation. All product marketing research (testing of products not FDA or USDA regulated) is exempt from DHHS regulations; some of it is covered by consumer safety laws, but those apply only after the product is on the market. Moreover, the regulatory intent is to protect the public rather than the subjects.

Judith P. Swazey, "Protecting the 'Animal of Necessity': Limits to Injury in Clinical Investigation," The Limits of Scientific Inquiry, Gerald Holton and Robert S. Morison (eds.) (New York: W.W. Norton & Co., 1979), p. 138.

William J. Curran, "The Approach of Two Federal Agencies," Experimentation With Human Subjects, Paul A. Freund (ed.) (New York: George Braziller, 1969).


One of the solutions to criticism has been to sharpen the procedures for ensuring that subjects have knowingly and willingly consented to participate in an experiment. There is now an extensive body of law and regulatory controls, and a literature debating the social and moral questions surrounding informed consent to human experimentation. Today, the United States regulates research to protect the physical or psychological health or the privacy of the human subjects used in research under guidelines established by the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The National Research Act of 1974 instructed the Commission to: 1) "identify the basic ethical principles which should govern research involving human subjects and to recommend guidelines and mechanisms for assuring that such principles are observed;" 2) "clarify the requirements of informed consent to research in the cases of children, prisoners, and the institutionalized mentally infirm;" and 3) "investigate the use of psychosurgery and recommend policies for its regulation."

ing employed on a research project.56 More recently, the Federal Government has used grant and contract terms as a means of directly controlling dissemination of research results,57 by requiring that the researcher or institution allow prepublication review by the sponsoring or interested Federal agency. Contracts for classified work, of course, routinely include such provisions, but some agencies have begun to invoke prior approval provisions previously considered to be pro forma (or have considered revising their standard contracts to include such provisions). The purpose of requested delays or review is to allow the opportunity for either classification or alteration (editing and censorship).

In one such case in 1980, NSF refused to fund parts of a cryptology proposal submitted by a computer scientist because of the national security implications of his work. A later decision awarded the funds to the researcher with the stipulation that he take responsibility for seeking prior constraint as required by the content of his work. NSF eventually drafted new language for all its contracts to require that a grantee take responsibility for notifying the cognizant NSF Program Director if data, information, or materials developed in the course of research appear to require classification. NSF retains the option—after review of the information—to defer dissemination, distribution, or publication.

56Scientific Freedom and National Security, Issue 5 (Washington, DC: American Association for the Advancement of Science, March 1985), p. 7.

57Interim Report of the Committee on the Changing Nature of Information (Cambridge, MA: Massachusetts Institute of Technology, Mar. 9, 1983), Section 4.5.

The mechanism of classification (or the threat of the possibility of classification) can also be used to delay publication. This tactic has been used by the National Security Agency to delay journal publication of a number of scientific articles on cryptography.58 From 1982 to 1985, Federal regulations on the export of technical information have been used to bar certain foreign nationals from attending otherwise open society meetings and to require researchers to withdraw unclassified technical papers scheduled for presentation at professional society meetings because foreign nationals might attend those meetings.59

In addition to contract provisions, there are nine major legal mechanisms by which the U.S. Government can restrict formal and certain informal communication of scientific and technical information.

1. The Arms Export Control Act of 1976 and the resulting International Traffic in Arms Regulations and U.S. Munitions List, administered by the Department of State, authorize control of the export and import of defense articles and defense services, including export of technical data related to defense articles.

2. The Export Administration Act of 1979, implemented through the Export Administration Regulations, Commodity Control List (CCL), and Militarily Critical Technologies List (MCTL), and administered by the Department of Commerce,

58Ibid.

59Free Trade in Ideas: A Constitutional Imperative (Washington, DC: American Civil Liberties Union, May 1984); Long, op. cit., see especially the table of meetings that have been restricted.


authorizes control of the export of tangible goods, including technical data, in the interest of national security and foreign policy and, to a lesser extent, to protect the domestic economy. MCTL designates arrays of technical information, expertise, or equipment that DOD believes would make a significant contribution to the military potential of another country if exported. The unclassified version of this list is over 200 pages long; the classified version is reported to be over 700.60 Neither the CCL nor the MCTL was intended to be a control document; rather, both are reference lists of the sensitive technologies.

3. Executive Order 12356 of 1982 authorizes classification of information, including that pertaining to "scientific, technological, or economic matters," that is "owned by, produced by or for, or is under control of U.S. government" for national security purposes. The order contains a specific exemption for "basic scientific research information not clearly related to national security."

4. The Atomic Energy Act of 1954, as amended, places explicit controls on scientific information and defines restricted data. The act is not limited to nuclear physics nor even to activities of the Federal Government; its language is sufficiently broad to allow the extension to "privately generated" knowledge as well. A 1981 amendment allows the Secretary of Energy to adopt regulations on the dissemination of unclassified information regarding either the design of facilities or their security measures.61

5. The Invention Secrecy Act of 1951 authorizes the defense agencies to review applications submitted to the Patent and Trademark Office

60Long, op. cit.

61Harold C. Relyea, "Shrouding the Endless Frontier—Scientific Communication and National Security: The Search for Balance," Striking a Balance: National Security and Scientific Freedom, Harold C. Relyea (ed.) (Washington, DC: American Association for the Advancement of Science, 1985), p. 85.


and, if publication of the patent is deemed harmful to national security, to declare the invention secret for a period of 1 year (with the restriction annually renewable). The justification for the Invention Secrecy Act is to allow the defense agencies to review patent applications with the goal of catching inadvertent violations; if publication of a patent is judged potentially harmful to national security, a 1-year, renewable secrecy order is issued.62

6. The Freedom of Information Act contains provisions allowing but not requiring agencies to exempt certain types of information from mandatory disclosure.

7. Executive Order 12333 on Intelligence, issued December 4, 1981, allows for the covert collection of information by agents posing as academics.

8. A number of DOD directives on national security and classification based on 10 U.S.C. 140c allow that agency to restrict information developed by scientists under DOD contract. Directive 5230.25 outlines the conditions under which DOD can withhold classified data from general public dissemination in accord with export control laws. An October 1984 memorandum on "Publication of the Results of DOD-Sponsored Fundamental Research" sets forth policies on publications. And Directive 5230.24 (November 1984) requires all newly created technical documents in DOD to carry statements defining distribution and indicating how requests for the document should be handled.

9. The Immigration and Nationality Act may be used to refuse admission to, or to deport, foreign scholars engaged in U.S. research activities.

62Ibid.

THE ROLE OF THE COURTS IN THE REGULATION OF RESEARCH

Statutory law governs the role that the judicial branch may play in the regulation of research: the courts respond to and interpret existing government action. The Federal courts have no jurisdiction other than to interpret the laws enacted by Congress or regulatory actions taken under Federal, State, or local statutes.

The courts have frequently been used, however, in environmental disputes to require Federal agencies to perform, commission, or use research to support environmental policy decisions. The intent—through lawsuit—has been to force the agencies to increase or, in some cases, to improve their use of research. The effect, at least in the 1970s, was to strengthen the quality and increase the amount of environmental science research. Environmental lawsuits are, however, just part of a general shift in the use of the judicial system to affect social policy.63 More dramatic use of the courts occurs when, after a plaintiff has filed suit, the court issues an injunction that prohibits the research from going forward at all. In the case study presented in chapter 7, the Supreme Judicial Court of the Commonwealth of Massachusetts at first issued a temporary restraining order against a municipal public health order that had attempted to stop research at a local laboratory; that court later ruled to uphold the city's right to impose such a ban.

In May 1984, Federal District Judge John Sirica, in effect, put a moratorium on all field tests of genetically modified microbes being conducted by the University of California.64 The experiments in question involved tests of genetically engineered bacteria (Pseudomonas syringae) designed to prevent frost formation on plants. In September 1983, a group of plaintiffs led by Jeremy Rifkin and the Foundation on Economic Trends claimed that NIH was violating National Environmental Policy Act (NEPA) requirements for an environmental impact statement. NIH claimed that its approval process more than satisfied the NEPA requirements.65 On April 12, 1984, Rifkin filed a motion for injunction to prevent the researchers from proceeding with a field test experiment scheduled for May. On May 16, Judge Sirica granted the motion for a preliminary injunction and ordered: 1) that NIH be enjoined from approving experiments involving the intentional release of recombinant DNA, and 2) that the University of California be enjoined from proceeding with the experiment until final resolution of the case. In subsequent court action, the U.S. Court of Appeals for the District of Columbia ruled (February 27, 1985) that experiments could proceed if their potential environmental effects were properly evaluated. NIH must now prepare environmental assessments.

63Donald L. Horowitz, The Courts and Social Policy (Washington, DC: The Brookings Institution, 1977).

64Marjorie Sun, "Rifkin and NIH Win in Court Ruling," Science, vol. 227, Mar. 15, 1985, p. 1,321.

Recently, an additional legal action has introduced controversy into what has usually been a quiet region of the scientific community, agricultural research. In 1979, attorneys filed a lawsuit, on behalf of 17 farm workers and the California Agrarian Action Project, that charged the University of California with unlawfully spending public funds on mechanization research that displaced farm workers.66 (For a full description of the case, see app. A.)

65Judith A. Johnson, "Recombinant DNA: Legal Challenges to Deliberate Release Experiments," U.S. Congress, Library of Congress, Congressional Research Service, Science Policy Research Division, report No. 85-502, Jan. 7, 1985.

66Philip L. Martin and Alan L. Olmstead, "The Agricultural Mechanization Controversy," Science, vol. 227, Feb. 8, 1985, pp. 601-606.

MECHANISMS FOR SOCIAL CONTROL OUTSIDE OF GOVERNMENT

Public opinion can be a potent force for pressuring government to enact regulation or to enforce existing rules more stringently. It is most often expressed through the mechanisms of public meetings, picketing, protest, violence, or political action organizations. For example, recent activities by animal rights organizations have included break-ins, vandalism, and theft of animals, data, and equipment from laboratories using animals in their research; in some cases, direct threats have been made to the lives and safety of investigators and their staff. More moderate groups have used the traditional policy arena to seek change, lobbying Congress and State legislators for stricter laws. In summer 1985, DHHS Secretary Margaret Heckler suspended funding for a head trauma study utilizing anesthetized baboons at the University of Pennsylvania after members of the Animal Liberation Front (ALF) illegally raided the University lab, stealing videotapes and destroying equipment. DHHS officials contend that while the research protocol was considered scientifically justifiable, the laboratory had violated its animal welfare assurance to NIH, and that the decision to suspend funds was unconnected to the sit-in conducted by ALF at the NIH Bethesda campus.

A number of church groups, in the United States and abroad, have recently provided an additional mechanism for expression of social control on research. Through conferences, newsletters, and "pastoral letters," religious organizations have attempted to educate their members, raise public awareness about the theological and ethical dilemmas posed by research, or to sway public action concerning regulation of specific areas of research. In 1979, for example, the World Council of Churches sponsored a World Conference on Science, Faith, and the Future at which scientists, theologians, trade unionists, businessmen, and politicians met to discuss the nature of science and of faith. More recently, the Episcopal Diocese of Massachusetts convened a Biotechnology Study Group to develop a study guide for use in churches. The guide addresses the implications of new developments in gene therapy, genetic engineering, and fetal research. Theologians have also taken more direct approaches in registering their concern, such as the 1983 "Theological Letter Concerning the Moral Arguments Against the Genetic Engineering of the Human Germline Cells," a letter signed by representatives of virtually every major church organization in the United States. Most recently, the House of Bishops of the Episcopal Church adopted an official position on genetic engineering that "encourages . . . research directed to an increase in human understanding of vital processes, recognizing that human DNA is a great gift of God . . ."67 In addition, the Bishops asked that Congress ensure that FDA or an appropriate agency seek advice from ethicists and the lay public to assure that use of genetic engineering is ethically acceptable.68

67Cited in Science, vol. 230, No. 4724, Oct. 25, 1985, p. 423.
68Ibid.

SUMMARY

There have always been measures of control imposed on the scientific community. Most often the controls have been in the form of self-restraint, imposed by the scientific community through peer review and peer pressure. These controls have been shaped by scientific and technological criteria as well as by the social values and norms apparent in the ethical codes and standards adopted by most scientific societies. It is only recently in the history of science that so many institutions and social forces have influenced in so many ways the conduct of science.

As described in this chapter, research can be directly controlled at all stages of the scientific process. Forces can affect the agenda, the process, and the dissemination of results. These forces can be exerted informally, through professional and peer pressure; formally, through institutional mechanisms; or legally, through Executive orders, legislation, and agency rulemaking. Finally, the courts can be used as a mechanism for interpretation of legal controls.

The value of this "cobweb" of direct control lies in its democratic and pluralistic nature. The danger of this system, however, lies in the potential for uncoordinated and potentially conflicting mechanisms for direct control. The potential complexity of this approach to regulation may be exacerbated when compounded by the many levels and forms of indirect control of research, to be examined in chapter 5.


Chapter 5

Mechanisms for Indirect Control of Research

Photo credit: National Institutes of Health


Research can often be as effectively restrained by the secondary or tertiary impacts of laws or policies intended to do something else as by deliberate imposition of regulatory law or policy. Governmental activity can control the nature and direction of science even when regulations would seem to have very little to do with regulating the scientific enterprise.1 In fact, many of the constraints that have been most burdensome to research institutions—both financially and administratively—were not intended to affect the substance of scientific work. Such constraints as the clerical and managerial burdens of social security taxes, equal opportunity and affirmative action requirements, environmental protection, or occupational health and safety "limit the autonomy of administrators and the freedom of research workers."2 Research institutions that do not question the importance of such general domestic policy actions may nevertheless question the use of research grants or contracts, and in particular the withholding of Federal funds, to force an institution to comply. They argue that constraints are imposed not to sustain or ensure the quality of research but in an effort to secure "short-term practical results, regional distribution of funds, and other criteria more or less irrelevant to scientific excellence."3 In these cases, science has not been so much singled out for regulation as caught up in society's growing willingness to regulate all kinds of specialized institutions and activities.4

1Melvin Kranzberg, Georgia Institute of Technology, personal communication, 1985.

2Don K. Price, "Endless Frontier or Bureaucratic Morass?" The Limits of Scientific Inquiry, Gerald Holton and Robert S. Morison (eds.) (New York: W.W. Norton & Co., 1979), pp. 75-92.

3Ibid., pp. 75 and 81.

The most pervasive control on the scientific agenda is, of course, the supply of money, but that control is uncoordinated. Such influences are more likely to result in what historian Melvin Kranzberg terms a "shotgun approach" to regulation via funding. This situation occurs when, in response to a new government program directed at a narrow topic (e.g., some form of cancer), researchers alter their research descriptions in order to obtain funding for basic or arcane research they are already pursuing.

This chapter looks at some examples of government laws or actions that, without so intending, appear to be exerting some regulatory influence on research. They can be distinguished from the forces discussed in chapter 4 by the fact that they are not intended to restrain or inhibit the process of scientific research. This chapter addresses first those forces that act both at the project and the institutional level through the grant and contract funding mechanisms of the Federal Government. Such restraints were among those most frequently mentioned in OTA's survey of university administrators and laboratory directors. A second type of restraint results from the implementation of legislation or rulemaking related to social programs, such as antidiscrimination statutes or privacy legislation, and from occupational and public health and safety legislation.

4Harvey Brooks, Benjamin Peirce Professor of Technology and Public Policy, Harvard University, personal communication, 1985.

ADMINISTRATIVE REQUIREMENTS ON UNIVERSITIES THAT ACCEPT FEDERAL GRANTS OR CONTRACTS

In the last 40 years, rules governing the research grant system, the procedures for assuring accountability in the administration of such grants, and the system of Federal support to higher education in general have placed ever more administrative and legal requirements on the universities.5 These requirements have stimulated considerable antagonism and hostility because many university administrators have seen them as an intrusion into their autonomy. In the 1970s, Steven Muller expressed these feelings during congressional testimony when he called for universities to:

. . . resist the tendency of the federal government to attach a growing body of regulations and conditions to its measures of support for higher education. . . . At stake is the essential need of the university to maintain the unfettered freedom of the human mind to apply its powers and methods of reason.6

5David Dickson provides some discussion of the larger issues in The New Politics of Science (New York: Pantheon Press, 1984).

To many observers, relations between the government and the universities appear to have deteriorated over the last decade; they attribute this change to policies and practices inherent in Federal support of research. These policies act as a form of indirect regulation on research. Under most conditions, Robert Sproull observes:

. . . the principal investigator does not feel the weight of this pyramid on his back. Although no one has ever calculated how much more research could be supported if this towering apparatus was made leaner, the investigator can frequently ignore it all. There is a growing intrusion, however, into the control of an investigator's research.7

A major change in public control of science—and a handle for enforcing regulation—has been the increased requirement for financial accountability imposed by the Federal Government. The post-war "contract" between government and science allowed scientists, through the process of peer review, to make decisions about the allocation of government funds to specific projects, but required that universities be accountable fiscally to the government. The universities do not argue over the need for accountability, but rather over how it can be accomplished. This indirect regulation derives in part from the retroactive application of increased standards and from the interpretations applied to key principles contained in indirect cost calculation and in Office of Management and Budget (OMB) Circulars A-21 and A-110.

6Steven Muller, "A New American University," testimony before the U.S. Congress, House Committee on Science and Technology, published as National Science and Technology Policy Issues, 1979.

7Robert L. Sproull, "Federal Regulation and the Natural Sciences," Bureaucrats and Brainpower: Government Regulations of Universities, Paul Seabury (ed.) (San Francisco, CA: Institute of Contemporary Studies, 1979), p. 72.

Indirect costs. Donald Kennedy, President of Stanford University, has referred to the university-government indirect cost reimbursement system as "the basic fabric of understanding and trust that has supported science for 30 years."8 During those decades, "government policy has held indirect costs to be an entirely legitimate part of total research costs," but as these cost rates have increased, and the resources for research at the Federal level have become more constrained, the rates—and the accounting necessary to calculate them—have become an increasing source of tension. Many university administrators and scientists argue that the process of including an indirect cost rate in a research proposal significantly affects research because it can act to array a principal investigator and his/her sponsor or granting agency against the institution—a polarization that is "damaging to morale and, ultimately, to research."9

Indirect costs are paid as grant overhead to institutions to cover maintenance, administrators' salaries, and other operating expenses. A 1984 General Accounting Office (GAO) report calculated that, in the last decade, indirect costs have increased so much that they now account for about 30 percent of all National Institutes of Health (NIH) extramural grant expenditures (up from 21 percent in 1972).10 Each institution receiving a grant negotiates its rate. The Department of Health and Human Services (DHHS) may audit an institution to determine whether indirect cost claims are valid. OMB's Circular A-21, "Cost Principles for Educational Institutions," lists allowable categories of indirect costs. Government rules allow for the varying circumstances of institutions, but they do require that methods used for calculations be consistent with sound accounting principles. The disagreement over the reality of indirect costs, and over what the costs of research should include, forms a major part of the problems surrounding this topic.

Indirect cost rates are based on the most recent actual expenditures for indirect costs. The cost rates are recomputed annually and submitted to a granting agency for evaluation; agencies use one of two main systems for determining and reimbursing indirect costs. NIH requires that principal investigators submit proposals that specify only the direct costs of the research proposed. Peer reviewers see only the direct cost budget. A separate award is made to the institution for indirect costs. Other agencies, such as the National Science Foundation (NSF), the Department of Defense (DOD), and the National Aeronautics and Space Administration (NASA), require that proposals include both direct and indirect costs, and reviewers see the complete budget. If an award is made, the approved budget must be based on the last negotiated indirect cost rate for the institution on the grant. If this rate is out of date (as can occur), then rising indirect cost rates can limit the funds available to a researcher on a project-by-project basis.

8Donald Kennedy, "Government Policies and the Cost of Doing Research," Science, vol. 227, Feb. 1, 1985, p. 482.

9Sproull, op. cit.

10National Academy of Sciences, Strengthening the Government-University Partnership in Science (Washington, DC: NAS, 1983), p. 129.

As indirect cost rates rise, research budgets buy less. Some researchers "consider payment of indirect costs a subsidy for higher education and a diversion of support for research."11 University administrators, however, see indirect cost recovery as essential to maintaining the research infrastructure.

Budget Circulars A-21 and A-110. Relations between the universities and the Federal Government were strained considerably in 1979 as a result of revisions to OMB Circular A-21 (Cost Principles for Educational Institutions). In order to allocate indirect costs, the universities were required to institute "time and effort" reporting for employees whose salaries were charged in any part to a research grant. The rule required faculty to account for 100 percent of the time for which they were compensated, regardless of the fraction devoted to federally supported work. Although OMB believed these procedures to be necessary to determine if research funds were being used for the purposes designated by Congress, academic scientists considered them a violation of their autonomy. Emotions ran high. A. Bartlett Giamatti, President of Yale, proclaimed:

11U.S. General Accounting Office, Assuring Reasonableness of Rising Indirect Costs on NIH Research Grants—A Difficult Problem (Washington, DC: 1984).

"Never have I seen the lash of federal regulations applied to a crucial area of the nation's intellectual life with such seeming indifference to financial and human consequences."12 The stringency of these OMB requirements has since been relaxed somewhat. In fall 1982, for example, DHHS awarded 22 contracts to large universities to test a procedure whereby coordination audits would be carried out by public accounting firms and university auditing staff. For research administrators, this change meant added responsibility and the imposition of a cost previously borne by the Federal granting agency, but it allowed the independent auditors to conduct an audit that better reflects the research environment in the institution under investigation. This single-audit concept, now allowable for all grantee institutions under 1982 revisions to Circular A-21, provides for greater flexibility in university reporting of time and effort. Some universities, Yale University and Stanford University in particular, have negotiated agreements with OMB that allow them to eliminate time and effort reporting for the time being.13

Concerns about the regulatory nature of the Federal grant relationship date back to the 1960s. A U.S. Commission on Government Procurement—set up in response to widespread concern about grants and contracts administration rules—found that procurement-type Federal controls were being inappropriately applied to grant-type assistance relationships. The Commission's recommendations to deal with this problem were implemented eventually in the Federal Grant and Cooperative Agreement Act of 1977 (Public Law 95-224), which distinguished Federal assistance from Federal procurement and required that grant relationships entail minimal government involvement in grantee affairs. However, OMB guidance issued to implement the act failed to require that Federal agencies use the grant in the fashion envisioned by the act, thereby vitiating its regulatory-reducing effect.14

12Quoted in Dickson, op. cit., p. 78.

13See discussion of these agreements in Report of the Workshop on the Effort Reporting Requirements of OMB Circular A-21 (Washington, DC: National Academy of Sciences, 1984).

14Robert Newton, National Science Foundation, personal communication, 1985.


In the late 1970s, the National Commission on Research concluded that the administrative and fiscal controls used by Federal agencies in the support of academic research interfered with the conduct of research. During the same period, NSF and the Association of American Universities conducted an experiment in grant administration that resulted in NSF delegating to grantees the responsibility for administering most Federal controls. More recent discussions have focused on problems caused by discrepancies between Federal rules, and so there are now proposals for a simplified and standardized Federal approach to the support of academic research. Such an approach might reverse or remedy the fragmentation of research programs caused by multiple sources of Federal support, might reduce overall research costs, and might increase research productivity. Federal administrative and accounting rules might begin to match the realities of how research is conducted. One way would be to recognize the researcher's program of research as the administrative and accounting unit. Under the aegis of the National Academy of Sciences Government-University-Industry Research Roundtable, a Federal effort is being undertaken to demonstrate such a simplified arrangement, including a demonstration project for the Federal support of academic research in Florida.

PROTECTIVE LEGISLATION AND AGENCY RULEMAKING

Antidiscrimination Statutes

Several statutes bar recipients of Federal financial assistance from excluding persons, because of their color, race, sex, or national origin, from participation in federally supported activities. These antidiscrimination statutes apply to recipients of NSF and NIH grants, and compliance must be assured prior to receipt of funds.

Section 602 of the Civil Rights Act of 1964 requires Federal agencies and programs to issue regulations implementing Title VI of the Civil Rights Act. The act provides that no person shall, on the grounds of race, color, or national origin, be excluded from participation in, be denied the benefits of, or be otherwise subjected to discrimination under any program or activity receiving Federal financial assistance. These regulations are also applicable to sub-grantees, contractors, and subcontractors of a grantee. Grant applicants must issue an “Assurance of Compliance” to be filed with the agency.* Similar assurances must be filed by the grantee to assure compliance with the Rehabilitation Act of 1973 and the Age Discrimination Act of 1975.

Title IX of the Education Amendments of 1972 prohibits the exclusion of persons on the basis of sex from any education program or activity receiving Federal financial assistance. NSF interprets this to apply to grants under its Science Education Programs but not to grants for scientific research. Public Health Service (PHS) grantees, however, are required to submit an assurance to the Office for Civil Rights, Office of the Secretary, DHHS, before a grant, sub-grant, or contract under a grant may be made. In addition, all PHS grantees are encouraged to “adopt practices that will eliminate sex discrimination and encourage sex fairness, including but not limited to, using language that represents both genders, avoiding sex stereotyping, and representing women equitably in leadership and policymaking positions.”

*See National Science Foundation and National Institutes of Health Grant Policy Manuals.

Small Business Innovation Research Programs and Use of Services Regulations

Federal research funding and how the grantees use those funds are subject to laws and guidelines that are intended to promote opportunities for small businesses and minority-owned businesses, and to favor American businesses over foreign interests. Under Public Law 97-219 (the Small Business Innovation Development Act), each agency with a qualifying research and development (R&D) budget in excess of $100 million must establish a Small Business Innovation Research (SBIR) program. Each agency must set aside 1.25 percent of its extramural R&D obligations for its SBIR program. The SBIR program is intended to provide a mechanism for opening up Federal R&D opportunities for small high-technology firms. The Departments of Agriculture, Energy, Transportation, and the Interior; the Nuclear Regulatory Commission; the Environmental Protection Agency (EPA); DOD; DHHS; NASA; and NSF are all required to participate in this program. In addition, funds flow indirectly to small firms through the subcontracting requirements of Public Law 95-507, which requires that large prime contractors subcontract part of their Federal work to small firms.
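The 1.25-percent set-aside described above is simple arithmetic; a minimal sketch of the calculation follows. The function name and the dollar figures are invented for illustration, and the sketch ignores the act's phase-in schedule, applying only the full 1.25-percent rate and the $100 million qualifying threshold stated in the text.

```python
def sbir_set_aside(extramural_rd_obligations, threshold=100_000_000, rate=0.0125):
    """Sketch of the SBIR set-aside under Public Law 97-219's 1.25-percent rule.

    Agencies whose qualifying extramural R&D obligations do not exceed the
    $100 million threshold have no SBIR obligation under the act.
    """
    if extramural_rd_obligations <= threshold:
        return 0.0
    return extramural_rd_obligations * rate

# A hypothetical agency obligating $400 million in extramural R&D
# would set aside $5 million for its SBIR program.
print(sbir_set_aside(400_000_000))
```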

Executive Order 12138 of 1979 establishes a national program to foster the role of women in business, by encouraging preference in procurement and the deposit of Federal funds. Executive Order 11625 of 1971 strives for the same goals for minority-owned businesses. Recipients of NSF grants are also encouraged, but not required, to use banks owned at least 50 percent by minority groups or women. In addition, according to the International Air Transportation Fair Competitive Practices Act of 1974, grantees must use a certified U.S. flag carrier for foreign transportation of persons or property, for purposes of the research, unless such a carrier is not available.

Privacy Legislation

The Privacy Act of 1974 (Public Law 93-579) provides certain safeguards for individuals against invasions of personal privacy. These include: 1) the right of individuals to determine what information about them is maintained in Federal agencies’ files and to know how that information is used; and 2) the right of individuals to have access to such records and to correct, amend, or request deletion of information in their records that is inaccurate, irrelevant, or outdated.15

The act imposes requirements on Federal agencies with respect to the manner in which they collect, use, disseminate, and maintain records containing information pertaining to specific individuals. Thus, information obtained for one purpose cannot be used for other purposes without the consent of the concerned individual. This regulation applies to records maintained by PHS with respect to grant applications, grant awards, and the administration of grants, as outlined in DHHS regulations that implement the Privacy Act.16 Records maintained by grantees, however, are not subject to these regulations.

15 Public Health Service, “Grants Policy Statement,” December 1982, p. 15.
16 45 CFR 5b.

ENVIRONMENTAL LEGISLATION

Until the end of 1984, EPA regulations for the control of hazardous wastes under the Resource Conservation and Recovery Act (RCRA)17 had limited influence on some research activities, because they established regulatory exemptions for “small quantity” generators18 but held all other generators subject to the full range of regulatory requirements. To avoid regulatory costs and other burdens, many research institutions sought to stay within EPA’s limit for small quantity generators. Chemical associations also promoted the concept and practice of waste stream reduction in process industries and laboratories.19 In November 1984, amendments to RCRA severely limited these exemptions by requiring EPA to regulate generators of as little as 100 kilograms a month.20 It will be more difficult for research facilities to avoid RCRA regulations in the future.

17 42 U.S.C. Sections 6901 et seq.
18 40 CFR Part 261.
19 See Less Is Better: Laboratory Chemical Management for Waste Reduction (Washington, DC: American Chemical Society, 1985).
20 42 U.S.C. Section 6921(d), added by Public Law 98-616 (1984).

Other laws administered by EPA have led to further unintentional regulation of research activities. Research laboratories that emit air and water pollutants are subject to EPA regulations under the Clean Air Act21 and the Clean Water Act,22 as well as corresponding State statutes. Researchers will have to secure permits and comply with EPA and State permit restrictions on the discharge of these pollutants. These “end-of-pipe” emission restrictions can influence the research process significantly, especially when the research involves highly toxic substances.

Laboratory research in industry, universities, and government may also be affected by such things as EPA regulations on disposal of hazardous waste, health regulations set by the State or city in which the laboratory is located, and regulations on handling nuclear and other hazardous substances.

21 42 U.S.C. Sections 7401 et seq.
22 33 U.S.C. Sections 1251 et seq.

OCCUPATIONAL AND PUBLIC HEALTH AND SAFETY REGULATIONS

As will be illustrated in chapter 6, government regulations intended for business and industry can have quite different effects when applied to laboratory-based, university research activities. Richard W. Lyman has observed that “universities are not quite the uniquely subtle and complex organisms they like to consider themselves, but they do possess a good many characteristics that make regulations suitable to a steel mill not always relevant or appropriate [to a university].”23

Most of these regulations are of two types: regulations designed to apply to production or pilot plant activities, and regulations aimed at achieving some social goal.

Some observers propose that, in principle, the cost imposed on various economic or research activities by regulation should be proportionate to the potential environmental, health, or safety impact of those activities; others, however, disagree sharply with the argument that such regulation takes industrial financial resources away from research, and cite a lack of evidence for those effects.24

The protection of worker health is required by such legislation as the Occupational Safety and Health Act,25 by the Occupational Safety and Health Administration (OSHA), and by other regulations.26 The act concerns the general duty of employers to provide safe working conditions for their employees and employer compliance with OSHA regulations. The regulations limit worker exposure to various physical, chemical, and other agents and hazards to health and safety, such as noise, radiation, or toxic chemicals. In addition to requirements for recordkeeping, medical surveillance, and monitoring, the regulations also set forth, on a generic basis, employee rights to request OSHA inspections and to obtain access to medical records, exposure records, and labels on hazardous chemical containers.

23 Richard Lyman, quoted in Seaburg, op. cit.
24 Brooks, op. cit.; for a general discussion of the effect of health and safety regulation on R&D, see, for example, Nicholas Ashford and George Heaton, “Regulation and Technological Innovation in the Chemical Industry,” Law & Contemporary Problems, vol. 46, No. 3, summer 1983, pp. 109-137; also see Nicholas Ashford, et al., “Using Regulations to Change the Market for Innovation,” Harvard Environmental Law Review, vol. 9, No. 2, 1985.

Right-to-Know Legislation

The need for people to obtain information on the risks they run by working with hazardous chemicals has recently prompted more specific legislation, which could have a significant unintended regulatory impact on research laboratories, especially those based in universities. Although current Federal regulations in this area do not apply to such laboratories, recent State and proposed Federal “right-to-know” legislation could have the secondary effect of creating new controls on university laboratories. These laboratories work with hundreds of chemicals in experimental situations or do research on hazardous chemicals.

25 29 U.S.C. Sections 651 et seq.
26 29 CFR Part 1900.


In November 1983, OSHA issued a “hazard communication” rule,27 which establishes that workers in the manufacturing industry have a right to know about the chemical and physical hazards in their workplaces. Manufacturers and importers of hazardous chemicals, and their customers who use the chemicals in subsequent manufacturing activities, thus have disclosure duties under the rule.

The rule initially requires that manufacturers and importers of chemicals provide their customers with labeled containers and a Material Safety Data Sheet (MSDS) for each purchased chemical that has been determined to be hazardous. It also requires that all firms in the manufacturing sector—both “downstream” customers and chemical manufacturers themselves—institute hazard communication programs to provide information to and train workers who could potentially be exposed to the chemicals.

The rights created by the OSHA rule do not currently extend to workers in research laboratories that are not within the manufacturing sector. For research laboratories within the manufacturing sector, however, employers must:

● ensure that labels on incoming containers of hazardous chemicals are not removed or defaced;
● maintain any MSDSs that are received with incoming shipments of hazardous chemicals, and ensure that they are readily accessible to laboratory workers; and
● provide laboratory workers with information and training on hazardous chemicals in their work areas at the time of their initial assignment, and when a new hazard is introduced into their work area.

Chemical manufacturers must comply with the rule’s labeling and MSDS requirements by November 25, 1985. Employers, thereafter, must meet the worker communication requirements by May 25, 1986.

The rule provides that it is intended to preempt any State law pertaining to this subject, although the OSHA administrator has indicated that OSHA would not assert preemption over any State rules until the effective date of the Federal standard (Nov. 25, 1985). Nevertheless, in the past 3 years, numerous State and local governments have enacted right-to-know laws and ordinances. These enactments can be classified as follows:

27 29 CFR Section 1910.1200.

1. Worker right-to-know laws are designed to expand the rights created by OSHA by giving workers in other sectors access to technical information concerning the hazardous substances to which they are exposed, and by increasing the list of hazardous substances to which such access rights apply.
2. Comprehensive community right-to-know laws are designed to give local officials and/or residents access to technical information concerning hazardous substances at facilities within a community, without regard to the use to be made of that information.
3. Limited community right-to-know laws are designed to give specified local officials access to technical information concerning hazardous substances at facilities within a community, for the purpose of facilitating appropriate responses in the event of emergency and/or protecting emergency response personnel.

Such laws are likely to face legal challenges on the grounds that right-to-know requirements have been preempted by the OSHA rule.28 Like the OSHA rule, typical State right-to-know laws are limited in their application to a specific list of hazardous substances, rather than to hazardous substances generally. For example, the Massachusetts law29 directs its Commissioner of Public Health to compile a substance list. The initial list included approximately 2,000 substances.

New Jersey’s law30 has been challenged, and found invalid by a U.S. District Court, in its application to the manufacturing sector, due to OSHA preemption.31 It is still in effect for other industries, however, and requires State agencies to compile three sets of lists: a short list of “Environmental Hazardous Substances” (now approximately 154 substances); a more comprehensive list of “Workplace Hazardous Substances” subject to less thorough reporting requirements (about 800-1,000 substances); and a “Special Health Hazard” list of substances that pose special hazards because of their known carcinogenicity, mutagenicity, teratogenicity, flammability, explosiveness, corrosivity, or reactivity.

28 See, for example, New Jersey Chamber of Commerce v. Hughey, 12 OSHC 1121 (D.N.J., Jan. 3, 1985), which invalidated New Jersey’s right-to-know law insofar as it applied to manufacturing industries.
29 Massachusetts General Laws c. 111F.
30 Chapter 315 of the Acts of 1983.
31 New Jersey Chamber of Commerce v. Hughey, op. cit.

The New Jersey law, as still valid, applies to research laboratories that are part of facilities engaged in:

● pipelines, transportation, communications, electric, gas, and sanitary services;
● wholesale trade, nondurable goods;
● automotive repair, services, and garages;
● miscellaneous repair services;
● health services;
● educational services; and
● museums, art galleries, and botanical and zoological services.

The law also applies to State and local governmental laboratories. By exclusion, therefore, the law does not apply to retail trades, the professions, most service industries, and R&D laboratories that are not part of a covered facility.

The Massachusetts law, as yet unchallenged and now being implemented, exempts research laboratories not involved in the production or manufacture of goods for direct commercial sale.

Comprehensive right-to-know laws, of course, are far less restrictive. Massachusetts, for example, requires that an MSDS for each substance at an installation be filed with the Department of Environmental Quality Engineering (DEQE) and, on request, with a designated municipal coordinator from the community in which the installation is located. Thereafter, any community resident who has reason to believe that a hazardous substance may be endangering public health or safety may request an investigation by the municipal coordinator. If an investigation is deemed necessary, DEQE can provide relevant MSDSs to the petitioning resident in appropriate cases. The investigation is intended to ascertain what, if any, State or local action is necessary to protect public health or safety.

The New Jersey law is even less restrictive. It requires employers to submit environmental surveys to the Department of Environmental Protection (DEP) and the health department of the county in which the facility is located, as well as pertinent parts of the survey to local fire and police departments. Any person making a written request can obtain a copy of the survey from the DEP. In addition, a list of workplace hazardous substances at each installation, and the MSDS for each one, were to be maintained by the Department of Health and made available on request to any person.

In the 99th Congress, a number of legislative proposals have sought to extend community right-to-know and accident control provisions. H.R. 963, introduced February 6, 1985, by Representative James Florio (D-NJ), would amend the Occupational Safety and Health Act of 1970 to permit States to adopt more stringent right-to-know provisions for access to information. The “Chemical Manufacturing Safety Act of 1985,” H.R. 965, also introduced on February 6, 1985, by Representative Florio, would amend the Toxic Substances Control Act to add provisions concerning a community’s right to know of the risks, emergency planning, and liability for hazardous substance releases. Research laboratories and hospitals are exempt if the substance is used under the direct supervision of a technically qualified person.

H. Con. Res. 53, introduced by Representative Bob Edgar (D-PA), is a concurrent resolution expressing the sense of the Congress that all persons have a fundamental right to know when they are exposed to hazardous substances that may be dangerous to their health. In part, it declares that the Assistant Secretary for Occupational Safety and Health should immediately revise the hazard communication standards to extend right-to-know protection to employees in any service or industry that employs hazardous substances, and that the Federal right-to-know standards should set only the minimum requirements that the States must follow. On March 6, 1985, Senator Alfonse D’Amato (R-NY) introduced S. 606, the “Community Right to Know Act of 1985.” It provides for the annual notification to a city or county of the presence of hazardous substances in or near such city or county (within a 10-mile radius) by the owner or operator of such a facility.

On March 21, 1985, Representative Robert Wise (D-WV) introduced H.R. 1660, the “Hazardous Materials Manufacturing Safety Act of 1985.” Similar to H.R. 965, the bill would amend the Toxic Substances Control Act to add provisions concerning a community’s right to know of the risks, emergency planning, and liability for hazardous substance releases. And finally, H.J. Res. 225, “The Hazardous Substances ‘Right-to-Know’ Resolution,” introduced by Representative Bruce Vento (D-MN) on April 4, 1985, mirrors the H. Con. Res. 53 expression of a person’s fundamental right to know when handling or exposed to a hazardous substance that may threaten his or her health and well-being. It also declares that OSHA should immediately revise its Hazard Communication Standard to extend “right-to-know” protection to all workers in industries and services, and that the Federal standards should be only minimum requirements that the States must follow.*

*These legislative initiatives contrast sharply with the political assumptions directing similar European efforts, where public disclosure is limited. In 1979, the European Community adopted a Directive, commonly called the Sixth Amendment, containing provisions for the testing of chemicals to be placed on the market, and for the notification to governments of the results of such tests, as well as for chemical classification, packaging, and labeling. The Sixth Amendment requires an importer or manufacturer proposing to place a new chemical on the market to submit a premarket notification to the appropriate regulatory body of the member nation where the substance is produced or imported. The notice must contain health, environmental, and physico-chemical test data on the substance, estimated volume and uses, and recommended precautions. On the basis of the data submitted, packaging and labeling requirements may be imposed. Exempted are substances subject to other regulatory programs, such as medicinal and radioactive substances, waste substances and pesticides, research substances for evaluation, and substances marketed in quantities of less than 1 metric ton per year per manufacturer.

The most important European Community enactment on risk communication for the control of chemical accident hazards is the “Seveso Directive.” This Directive is named for the town in Italy where explosions at a Hoffmann-La Roche plant in 1976 spewed dioxin into the community, necessitating removal and testing of its inhabitants.

Adopted in 1982, the Seveso Directive is to be fully implemented by the member nations in 1989. Under the directive, a manufacturer who conducts activities which involve one or more of 178 designated substances must provide to the competent national authority: information on the substances and the processes used, information on the installation (facility), information on possible major accident situations, new information relevant to safety and hazard assessment on a periodic basis, and special information on multiple installations close to dangerous substances.

Each member nation is required to designate a competent authority who will be responsible for receiving the information from the manufacturer, examining it, requesting additional information, developing an off-site emergency plan, organizing inspections, and determining that the manufacturer takes the most appropriate steps to prevent major accidents and limit accident consequences. The directive thus represents the most complete and integrated approach taken to date to prevent chemical accident hazards. The directive reflects European values in vesting full responsibility for public safety in public officials, in restricting the flow of risk information to protect industrial secrecy, and in affording citizens and interest groups no access to the risk communication process or to the information outputs of the process.


Chapter 6

Institutional Differences*

Research in the United States is performed in a number of different institutional settings—in university laboratories, in industry, in nonprofit institutions, and in government laboratories. The following profiles give examples of how some of the mechanisms described in chapter 4 apply in three different institutional settings. The profiles are based on interviews and data gathered in actual laboratories.

PROFILE OF A REPRESENTATIVE INDUSTRIAL LABORATORY

After acquisition in 1981 by a major international chemical firm, this laboratory was made part of a newly formed pharmaceutical products division. The laboratory is one of the largest U.S. producers of radioactive chemical compounds for life science research and of radiopharmaceuticals; the division’s research activities are principally aimed at new product development, at improving production efficiencies, and at developing radioisotope marking procedures for the fields of radiopharmacology, life science chemistry, and biotechnology. In addition, the division conducts research on industrial health and safety improvement techniques.

Because the laboratory’s primary function is to develop and supply radioactive materials, a large portion of the research is conducted to accomplish this corporate marketing or production objective. Some research is aimed at developing a marketable pharmaceutical that will receive U.S. Food and Drug Administration (FDA) approval. Other research seeks to improve the efficiency or safety of production processes. A small portion of the division’s research is conducted under contract to other industrial firms, principally to develop product-specific radioisotope marking procedures usable in their own research.

Control of the Research Agenda

Because the laboratory is a commercial facility, its research decisions are strongly influenced by such considerations as the potential profit to be earned from new product sales, or the savings to be anticipated from production efficiencies and safety improvements. Funds for research are deemed an investment and, as such, are expected to pay dividends. The laboratory’s research agenda therefore depends on corporate judgments concerning the potential of research to yield dividends.

*This chapter is based on the regulations in force in three different laboratories. Interviews and data collection were performed by Michael Baram and Raymond Miyares, Bracken & Baram, Boston, MA, under contract to the Office of Technology Assessment.

In this regard, U.S. tax policy has, in recent years, promoted research investments by granting business deductions or credits for research expenditures—for example, through the Internal Revenue Code and the 1981 Economic Recovery Tax Act (see ch. 4). The effect of these provisions is to lower the effective cost of research to businesses and thereby to make the research more likely to be profitable. To the extent that tax breaks are given without regard for the content of research, all forms of research are stimulated. Virtually all of this laboratory’s research falls within the categories eligible for preferential tax treatment under current law.
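The cost-lowering effect of a research credit is straightforward arithmetic; a minimal sketch follows. The function name, the 25-percent rate, and the dollar figure are all hypothetical, and the sketch deliberately ignores the 1981 act's actual incremental-credit mechanics, which applied only to increases over a base amount.

```python
def effective_research_cost(expenditure, credit_rate):
    """Sketch: effective cost of research after a tax credit on expenditures.

    Both arguments are hypothetical illustration values, not figures from
    the Internal Revenue Code or the 1981 Economic Recovery Tax Act.
    """
    return expenditure * (1.0 - credit_rate)

# A firm spending $1,000,000 with a hypothetical 25-percent credit
# bears an effective research cost of $750,000.
print(effective_research_cost(1_000_000, 0.25))
```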

The most important regulatory impacts on the selection of research opportunities at this laboratory, however, are the result of policies or regulations of FDA. By specifying the evidence of safety and efficacy it will require for new drug approval, the FDA sets forth much of the research agenda for product development. The FDA regulations also control the activities that may be undertaken with investigational new drugs. By specifying the protocols to be followed for each step of such research, including toxicology and pharmacology procedures for animal testing and the use of human subjects, the FDA regulations chart the course of new drug development research.


Outside of FDA requirements, the principal indirect impacts on the selection of research opportunities are the higher costs resulting from compliance with health, safety, and environmental regulations. Sometimes firms engage in research in order to control and reduce these costs. For example, the high cost of hazardous and low-level radioactive waste disposal at licensed facilities stimulates interest in development, production, and recycling processes that produce less waste. At this laboratory, decisions to undertake research on ways to recover waste carbon isotopes and to reduce the curies of tritium used to produce tritium-marked products are attributable, in part, to waste disposal cost increases.

Regulations can also raise the cost of the research itself and thereby deter a firm from undertaking it. This firm, for example, has thus far declined to use P-3 level (high hazard) micro-organisms in its biotechnology laboratory because of the elaborate substance approval process required by National Institutes of Health (NIH) guidelines. In interviews, company officials supplied other examples of research that they abandoned because of high regulatory compliance costs:

● The cost of establishing and maintaining a laboratory “closed box” acceptable under NIH guidelines, and emission controls adequate to meet State air pollution requirements, was deemed so substantial that planned research utilizing aflatoxins was abandoned.
● Similarly, the cost of obtaining an antidote for the venom of a poisonous African snake, in compliance with NIH guidelines, was considered prohibitive, and research utilizing the venom was dropped.
● The Occupational Safety and Health Administration (OSHA) xylene and toluene exposure standards (to protect workers) were considered too costly to comply with in research on liquid scintillation cocktails, so that research was redirected to find scintillators with less toxic components.

Controls on the Research Process

The protocols specified in FDA regulations dictate not only the type of research necessary to support a new drug but also the research procedures to be utilized. Thus, to satisfy FDA, research must meet regulatory requirements for a “scientifically well controlled” study. Some of these steps are undertaken for no other reason than FDA requirements; in the absence of FDA regulation, the research might be designed in a more streamlined fashion.

For other types of research, regulatory requirements play a far less pervasive role. Research is designed so as to be most likely to yield the desired information. Because regulatory agencies have little interest in production efficiencies, for example, much of the research conducted in this area is designed to suit the objectives of the researcher.

Management of Risks

Much of the division’s research is undertaken by a permanent staff of scientists and technicians who expect a long-term career with the company. The long-term consequences of radiation and toxic chemical exposure are as important to them as immediate health and safety effects.

Since its acquisition by an international corporation, this laboratory has undertaken to conform its occupational and environmental health and safety practices to those of its parent company, which enjoys a national reputation for responsibility in this area. The laboratory has a safety executive committee of senior managers, with subcommittees on safety awareness, process hazards, equipment safety, chemical safety, and radiation safety. Every employee is required to attend a monthly safety seminar, and regular inspections are made, not only of the obvious hazards of chemical storage areas, radiation shielding, and electrical wiring, but also of the more mundane, such as chair hazards. On-the-job injuries, lost work time, restricted work time, medical treatment, and off-the-job injuries are reported monthly to corporate headquarters.

The laboratory workers are protected by several Federal and State regulatory programs. OSHA regulations, the most comprehensive of these, set general safety standards as well as exposure standards for toxic chemicals. Nuclear Regulatory Commission (NRC) regulations specifically govern the safety of laboratory work on radioactive materials and require dissemination of information to workers on any materials used in the laboratory and reporting of any incident involving radiation exposure. NIH guidelines govern the containment and security of micro-organisms used in biotechnology research.

The firm’s preoccupation with safety also extends to environmental and community hazard concerns. The laboratory conducts annual emergency training programs for local fire officials, police, and hospitals. In addition, it complies with a panoply of Federal and State requirements for the construction and operation of industrial facilities. The State plumbing code, for example, requires separate piping in the biotechnology laboratory for human contact and for contact with organisms. The Resource Conservation and Recovery Act and corresponding State requirements, as well as NRC regulations, govern the disposal of hazardous and radioactive waste. Federal and State air quality regulations limit emission of radionuclides into the air. And, in addition, regulations under the Clean Water Act restrict the disposal of waste into publicly owned treatment facilities as well as the discharge of such waste into surface and groundwater. Under the Clean Water Act, even de minimis spills of hazardous substances must be reported.

Beyond these specific regulatory programs, the potential for liability under the Federal Superfund law, corresponding State laws, and the common law influences the management of laboratory risks. Such potential for liability provides an incentive for due care in the management and disposal of wastes and the emission of pollutants into the environment. However, the incentive operates further to cause the company to reduce waste generation so as to avoid strict liability even where due care has been exercised.

Restrictions on Communication of Information

As a manufacturer, the company is subject to both the OSHA hazard communication standard and the State’s right-to-know law, notwithstanding that law’s research laboratory exemption. Under both provisions, it must label all hazardous products sold and supply its customers with Material Safety Data Sheets (MSDSs) for them. Company practice is to compile a notebook of MSDSs for all its products sold and to disseminate the same notebook to all customers with a printout listing which products they purchased during the previous 18 months.

The laboratory was in virtual compliance with the OSHA standard prior to its issuance and is encountering no significant difficulty in adapting its prior community liaison activities to the State’s right-to-know requirements. For laboratory workers, it is required, under the OSHA standard, to:

● ensure that labels on incoming containers of hazardous chemicals are not removed or defaced;

● maintain any MSDSs that are received with incoming shipments of hazardous chemicals, and ensure that they are readily accessible to laboratory workers; and

● provide laboratory workers with information and training on hazardous chemicals in their work areas at the time of their initial assignments, and when a new hazard is introduced into their work area.

The State has required further that the company supply a list of the hazardous substances that it uses, and an MSDS for each one, to local officials in both the State capital and the town in which the lab is located. Neither the OSHA nor the State list of hazardous substances, however, includes radioactive materials.

Although the laboratory often seeks to disseminate a product developed in its research laboratories, corporate policy necessarily restricts dissemination of information about its research or about research results. Such information may be legally patentable or protectable as a trade secret, and corporate policy is to control dissemination or publication of research data, especially on process refinements or improvements, unless there is a valid business reason for dissemination. On the other hand, the parent company’s policy is to disseminate fully any proven safety improvements developed by corporate research laboratories.

Again, the potential for liability affects the availability of information on research. The laboratory has a “document retention program” that requires disposal of all documents not required to be retained for a business purpose or by regulation, the objective of which is to avoid a paper trail that could be used in enforcement or liability proceedings against the company. This program can be seen as a prudent response to the liberal discovery rights afforded litigants in the courts today, but it may also cause information on prior research to be lost to researchers within the company.

PROFILE OF A REPRESENTATIVE INDEPENDENT NONPROFIT LABORATORY

This cancer research center is one of several teaching hospitals affiliated with a major medical school, although its Board of Trustees is fully independent of the school. The center is involved in cancer research of virtually all types, from cell biology to clinical trials, with a special emphasis on childhood cancers. Its research divisions include specialties in: biostatistics and epidemiology; cancer control, genetics, and pharmacology; cell growth and regulation; immunogenetics; medical and pediatric oncology; medicine; and tumor immunology and virology.

The center administers an annual budget which, in 1984, exceeded $55 million. Of this amount, approximately 45 percent was for patient care and services and the remainder was for research. The center maintains a 57-bed inpatient facility and provided care through nearly 24,000 outpatient visits in 1984. Approximately 50 percent of the center’s patients participate in some form of clinical study, however, and the line between research and patient care is not always precise.

Approximately 80 to 85 percent of the center’s research is funded by Federal grants from NIH and the National Science Foundation (NSF). Of this amount, about 70 percent comes from the National Cancer Institute (NCI). Of the remainder, most comes from private granting organizations such as the American Cancer Society and the National Leukemia Foundation (principally for student fellowships). Approximately 5 percent of research funds come from other private sources.

The center’s therapeutic research often involves highly toxic chemical agents or radiation, and, of course, the use of human subjects. The hazards of the research may be concentrated on these subjects in some circumstances, and human subjects must be informed of, and consent to, acceptance of the related risks before participating in a clinical study in the hope of achieving therapeutic benefits.

The center’s research staff of 600 includes approximately 350 with doctoral degrees. Much laboratory research is conducted by technicians. Such technicians typically have a high turnover rate, so that long-term exposure to laboratory chemicals and radiation is unusual. Nevertheless, some research personnel, both professional and nonprofessional, perform research over a period of several years.

Control of the Research Agenda

The center has a fairly formal administrative hierarchy that sets the general theme of the research to be undertaken. Because the center is itself mission-oriented (i.e., devoted to cancer research), and its principal funding sources share nearly the identical research mission, the center’s broad outlines of research are not generally affected by considerations of funding availability. To be sure, the work of an individual researcher or laboratory might be terminated if funding were withdrawn. Much of the research staff is dependent on continuing funding for their employment. Further, if NCI were abolished or fundamentally redirected, the center’s research agenda might be substantially affected. The importance of cancer research in American public health policy, however, makes the possibility of a major redirection remote.

Box A.—An Assessment of Regulatory Forces by Lab Directors and Research Administrators

In March 1985, OTA sent questionnaires to 32 university research administrators (deans and vice presidents) and 112 laboratory directors around the United States. The two groups were selected because of their varying roles in the university research process; administrators tend to be intimately involved in the research administration policies established and followed within the university, and the lab directors are more involved with concerns linked to their particular field. The laboratory director group was selected from the Gale 1984-85 Research Centers Directory. Out of the 7,427 entries for the United States and Canada, at least two were chosen from every State and from the disciplinary divisions according to the following proportions: agricultural and nutrition sciences, 5 percent; astronomy, 5 percent; engineering (research), 20 percent; life sciences, 20 percent; mathematics and computer sciences, 20 percent; physics, 25 percent; and social sciences, 5 percent. Laboratories of all sizes were chosen randomly within the above parameters. The response rate for this group was 23 percent.

The research administrators were chosen from the top 32 research universities in the country based on the level of Federal funding for research, according to the NSF Academic Science R&D Funds: Fiscal Year 1980, Surveys of Science Resources Series. Forty-four percent of the research administrators responded to the survey. The aggregate response rate was 27.7 percent.

Participants were asked to identify major regulatory forces in their research institutions, trends in regulation of research, and channels or forums for discussing solutions to potential problems. In general, there were few differences in the forces listed between the responses of the two groups.

Major Regulatory Forces.—The regulatory forces listed by the respondents can be classified into four major areas: controls on substance, whereby nonscientific forces are setting or influencing the research agenda; controls on process, whereby the nature of the research is not under consideration, but the means and methods used to accomplish the research goals are regulated; administrative constraints (including the funding process); and restrictions on dissemination of research results.

Clearly, the regulatory forces that are most keenly felt by both groups are the Federal guidelines for the protection of human participants and animals in research. But following closely in the ranking are many unintentional regulatory forces, such as environmental, health, and safety legislation intended to protect the general public; or radiation safety regulations, environmental and worker protection laws, and Good Laboratory Practice Standards intended to protect labor.

Administrative constraints were listed as a major area of concern by 37.5 percent of all respondents. Perhaps predictably, university administrators saw the financial accounting requirements as most pernicious. When the responses are disaggregated, 64 percent of the administrators v. 23 percent of the lab directors listed financial accounting requirements as a major force.

Both groups found “social regulations” to play a major role. Social regulations include laws to encourage small business and minority business subcontracting, Fair Labor Standards, and Equal Employment Opportunity Commission and Affirmative Action requirements. Because respondents were not asked to rank their answers, it is not clear how much of a force they feel these regulations present. The frequency of responses, however, indicates that they are not as significant a force as the administrative and accounting requirements.

There was not a strong indication that either group feels that there are major regulatory forces affecting the research agenda, although a few respondents from each group indicated that national priority setting is increasingly affecting the research agenda, particularly as a result of increased defense spending and concern about industrial competitiveness. Only 17.5 percent of the respondents listed controls on dissemination of research results as a major force.

When asked whether they believed that the controls they had listed were any different from controls experienced by other universities, 86 percent of the administrators responded that they did not feel “singled out.” The major regulatory forces affecting them are most likely the same for other research institutions.

Indeed, it is this public policy commitment, rather than the mission of funding sources, that appears to affect the center’s selection of research opportunities. For example, the center has thus far declined to undertake a large amount of epidemiological research in the area of cancer prevention because of its doubts about the long-term commitment of NCI to such research. A cancer prevention epidemiological study would require a commitment to long-term funding without any important intermediate benefits to be derived from the research. National policy, in contrast, focuses on research with measurable output within a short time span, and the center has not been satisfied that funding for cancer prevention or epidemiological studies would not be terminated before usable results could be achieved.

With this exception, the principal factor governing the choice of research activities is the individual strategy of the principal investigator. Because staff investigators are selected on the basis of the congruence of their expertise and interests with the center’s research mission, the focus of its effort is maintained without any formal structure to review research proposals for consistency with the center’s objectives.

Virtually all of the research conducted at the center is investigator-initiated, and very little is contract work. NIH and NSF proposals, as well as others, are peer reviewed, and this review, in some instances, may tend to discourage funding of highly innovative research projects in favor of more conventional undertakings. However, the vagaries of peer review are regarded as less influential on an organization such as this center than they might be for another research facility of lesser reputation.

The impact of all types of government regulation on the selection of research opportunities is also fairly minor. The cost of compliance with regulatory requirements is incorporated into grant applications and is rarely so significant that funding is jeopardized. Moreover, the very properties that make a substance or procedure a candidate for regulation often also make it attractive as a subject for research.

Control of the Research Process

Because NIH funding is so important to the center’s research, the protocols established by that agency strongly influence the conduct of that research. Moreover, NIH guidelines have become, in many instances, the industry custom for laboratory research, and thus are followed even beyond the jurisdiction of NIH. NIH has published guidelines for animal testing, for the use of human subjects, for recombinant DNA research, and for the use of investigational drugs. The latter guidelines apply when NCI provides a pharmaceutical on which research is to be conducted; if an investigational drug were supplied by a private firm, the firm would have to obtain FDA approval and meet FDA requirements established for such drugs. In general, the center attempts to relieve its researchers from the nonsubstantive burdens of these guidelines and to assign to administrative staff the responsibility for paperwork and managerial burdens.

A more fundamental effect on the conduct of research is perhaps caused by the structure of the usual grant agreement. Although NIH is authorized to make research grants for up to 7 years, the typical grant is generally for 3 to 5 years and never longer. On the average, an NIH grant provides 3½ years of funding. This limit is set not so much by policy considerations, which arguably favor longer grants that are relatively easy to administer and require burdensome administrative procedures less frequently, as by the demands of peer review. Often reviewers recommend funding for less than the total period of time requested in order to allow for thorough review at frequent intervals. A possible impact of such temporal limits, however, is to promote research protocols that permit tangible work products in shorter time periods.

A principal limitation of NIH funding is that it does not fully cover experimental patient care. This is not a reflection of any NIH judgment that it is not responsible for experimental patient care, but rather a dictate of the limitations of available funds. Third-party providers (e.g., Medicaid and Blue Cross/Blue Shield) likewise do not assume responsibility for experimental clinical procedures. Thus, a gap exists, at least in theory, between NIH funding and third-party coverage. In practice, the gap is less significant than it appears, in part because the line between clinical research (nonreimbursable) and “best patient care” (reimbursable) is not precisely drawn. However, as pressure to control medical costs increases, third-party providers are likely to draw a more restrictive line of distinction, and more experimental patient care may have to be funded by NIH.

Management of Risks

The center is subject to OSHA, Environmental Protection Agency, and NRC regulations governing workplace safety, environmental protection, and radiological health. The center’s Division and Laboratory Chiefs are responsible for assuring compliance with such regulations by those they supervise. In addition, the center is subject to the Joint Commission on Accreditation of Hospitals, a hospital self-regulating organization that approves radiation, chemical, and biological safety programs. The State and city in which the laboratory is located further regulate aspects of health, safety, and environmental hazards, including fire and chemical hazards and the biohazards of recombinant DNA.

As noted, NIH guidelines govern the use of human subjects in research, and these guidelines generally require the approval of the research protocol by an Institutional Review Board and the informed consent of the human subject or legal representative. Typically, the center has found, its patients are relatively sophisticated about the nature of their disease, the hazards of research, and the potential for harm. The center is a tertiary care facility, and the majority of its patients have substantial experience with medical procedures and treatments. Therefore, the content of its informed consent form can sometimes be relatively more technical than corresponding documents at other facilities, such as a primary care hospital.

The informed consent form states that the center has “no formal policy” with regard to compensation of research subjects who are injured in the course of research. In practice, medical costs incurred as a result of such injury might be paid by the third-party provider. In particular circumstances, however, the center might seek to assume or divide these costs.

Controls on Communication of Information

Because the center regards its staff as both intelligent and well educated, it relies on the provision of chemical and radioactive hazard information as the principal instrument for managing risk. The center employs one half-time chemical safety information officer whose exclusive responsibility is to provide relevant information about substance hazards and appropriate precautions and responses, as well as to perform laboratory inspections.

The center, however, is not subject to either the OSHA hazard communication standard or the State right-to-know law. The OSHA standard, as noted, applies only to workers within the manufacturing sector. The State law exempts research laboratories not involved in the production or manufacture of goods for direct commercial sale. The regulations issued under the law require an application for an exemption to be filed by each laboratory. The center has filed such an application and, while that application is pending before the State department of public health, the center is exempt from right-to-know requirements.

A principal objective of the research conducted at the center is the dissemination of its results. Dissemination serves the interests of the center’s mission by advancing the state of knowledge about cancer and also enhances the reputation and standing of the center and its staff. Therefore, the center will not accept research funding that requires secrecy. Nevertheless, some limits on dissemination are accepted. Grant agreements may specify, for example, that publication of research results may be delayed for up to 3 months in order to allow for review by the sponsor. Such limits do not generally appear in NIH or NSF grant agreements. Indeed, the policy of Federal funding organizations is to stimulate the dissemination of research results, and failure to publish may be considered negatively in evaluation of new grant proposals.


A theoretical restriction may also derive from the practice of some center staff to consult privately for industry. In such circumstances, the staff member could feel disinclined to publish results seemingly adverse to private clients. Such an impact may be completely unconscious and impossible to demonstrate, but the restriction may nevertheless operate.

PROFILE OF A REPRESENTATIVE UNIVERSITY LABORATORY

The Department of Chemical Engineering at this major research university engages in a range of research projects and activities in support of the academic development and advancement of its students and faculty. Although much of this research is funded from outside sources, the department does not principally engage in client services, but rather seeks grant support for research into chemical transformation and separation processes, and energy-intensive functions of relevance to the interests of its members. The principal areas of interest include coal conversion, synfuels developments, utilization of micro-organisms in chemical processes, and polymer production. Approximately 30 to 40 percent of the department’s research funds come from private sources (e.g., one international corporation contributes $1 million annually); the remainder are Federal grants, principally from the Departments of Energy and Defense (DOE and DOD) and NSF.

Much of the department’s research is conducted at a very small scale, utilizing small quantities (often less than a gram) of chemicals in any particular step. However, a great variety of chemicals are involved in the department’s research programs, so that the cumulative hazard of the research is highly variable. An exception to this pattern of low volume and high variety is in the area of combustion engineering, in which substantial volumes of petroleum products are burned in the course of research.

In many instances, laboratory research is conducted by students rather than by staff technicians. Because the turnover rate for both students and technicians is fairly high, most of the people who work in the department’s labs are not subject to long-term exposure to laboratory chemicals. The students, however, may be more vulnerable to risks because they are relatively naive and untrained in laboratory safety procedures and may be somewhat less cognizant than career workers of the hazards of their research. This situation influences the amount and type of safety measures taken by the university.

Controls on the Research Agenda

Because the university lacks the formal hierarchical administrative structure of a commercial enterprise, decisions to undertake research are relatively individual and idiosyncratic, rather than directed by an overall institutional strategy. Clearly, the availability of funding affects the research agenda, but this financial incentive differs from that of an industrial laboratory, which must make its research choices based, in part, on an estimate of the market impact of the anticipated results. Further, because of its stature in academic circles, the department carries considerable weight in negotiations with potential sponsors over the content and conduct of its research activities.

This is not to say that research sponsors have no influence over the manner in which their funds are utilized. Many Federal sponsors and some foundations, as well as virtually all commercial sponsors, are mission-oriented. Such sponsors make their grant awards based on whether they perceive the research to promote their particular mission objectives. Approximately 75 to 80 percent of the department’s research funds come from such mission-oriented sponsors, which include DOD and DOE, the petroleum and chemical manufacturing industries, integrated circuit manufacturers, and pharmaceutical companies. The remainder comes from sponsors seeking to support “basic” research: some DOE programs, NSF, and private foundations.

The impact of regulation on the selection of research opportunities is, by contrast, fairly minor compared to the academic interests of the faculty and the mission objectives of sponsors. Because the department’s research seeks to be at the cutting edge, regulation addressing the specific substances or processes under investigation may not yet have been developed. The cost of complying with health, safety, and environmental regulations is rarely so significant that research is foreclosed. To the extent that such regulation raises the cost of certain lines of inquiry and thus may divert attention to other research activities, this effect is countered somewhat by intellectual interest in studying “problem” chemicals and processes: those that may be the subject of extensive regulatory attention.

Because combustion engineering research has different characteristics from other research conducted in the department, the effect of regulation of this research is also somewhat different. In particular, research into the emissions produced by combustion processes requires the use of substantial volumes of fuel. Often, from a purely research perspective, the fuel of choice would be benzene, but benzene is the subject of such intense regulatory scrutiny that researchers are reluctant to use it if a relatively less problematic alternative such as toluene is available. This reluctance stems both from current regulatory activities (principally by OSHA) and from the concern that department researchers share with regulators over the hazards of the substance.

A similar effect may be discerned with respect to the use of radioisotopes in research. Because of the regulatory burden of becoming licensed to handle radioisotopes and the cost of their disposal under NRC regulations, their use is discouraged if alternative research procedures are available. Again, the effect may be to skew the allocation of research resources.

Controls on the Research Process

Most research grant agreements specify how research activities are to be conducted, but the level of specification varies considerably with the sponsor. Sponsors of basic research typically require a proposal that sets forth the research protocol in sufficient detail to allow reviewers to judge the technical adequacy of the research. Such protocols are often thereafter incorporated by reference into grant agreements, but most sponsors do little supervision or monitoring of performance under these agreements.

Industrial sponsors may provide “foundation” grants intended to support the department’s general research activities, rather than any particular project. Such broad grants are less likely to constrain the specific conduct of research. When industrial sponsors underwrite a particular research project, however, the grant agreement may include specification of the research protocol. Surveillance of the department’s research by such sponsors is also somewhat more extensive, in that reporting requirements are more often imposed and site visits are more frequent.

Mission-oriented Federal agencies, such as DOD and DOE, tend to specify research protocols in the greatest detail. Such protocols are drafted not only to assure the technical adequacy of the research, but also to assure that research results will be usable by the sponsor in achieving its objectives. Commonly specified details include performance requirements, cost allocations, equipment, milestones, and personnel. This greater specification is typically accompanied by greater supervision and monitoring of the research. The Department of Defense is especially strict in its surveillance of the research it sponsors; however, this university does not automatically comply with DOD’s (or any other sponsor’s) requests for secrecy in the conduct of research. Because it is an academic institution principally devoted to education, the university’s laboratories are widely open to students and faculty. Classified or weapons-related research is deemed an inappropriate activity for an academic institution, although it may be conducted at university-affiliated laboratories off campus.

Management of Risks

The university maintains a Safety Officer and an Office of Environmental Medical Services, which are intended as a resource to consult in the design of laboratory risk management activities and to facilitate compliance with environmental, health, and safety regulations. In addition, the university faculty maintains nine standing committees that develop risk management procedures to be used in research:

● Council on Environmental Health and Safety,
● Committee on Assessment of Biohazards,
● Committee on Radiation Protection,
● Committee on Safety,
● Committee on Animal Care,
● Committee on the Use of Humans as Experimental Subjects,
● Committee on Toxic Chemicals,
● Committee on Radiation Exposure to Human Subjects, and
● Committee on Reactor Safeguards.

The university will not approve a grant agreement or research contract that has not been approved by the committee having jurisdiction over such research.

In general, the safety practices established by these offices and committees are equivalent to, or more stringent than, corresponding regulatory requirements. Often, the faculty have been involved in the development of Federal regulatory requirements, and their influence is reflected in the requirements adopted. Nevertheless, researchers are always alert to proposals for regulatory requirements that specify unachievable standards or unworkable administrative burdens. For example, a new State plumbing code originally would have required that no micro-organisms be disposed of in the wastewater of laboratories where biotechnological research was being conducted, even though there are safe and acceptable levels of micro-organisms commonly allowable in wastewater generated by nonresearch facilities.

The department’s laboratories are subject to a range of environmental and occupational safety and health requirements that typically include: 1) safety or health standards for emissions of, or exposure to, a particular substance; and 2) documentation of compliance with such standards. Many of these regulations specify the chemical substance as the unit of regulatory attention, and the paperwork burden of reporting requirements varies with the number of chemical substances used in research. Because department research typically utilizes tiny quantities of a multitude of chemical substances, the paperwork burden is substantial even though the exposure and emission standards may be fairly easy to achieve.

An example is provided by the regulations under the Resource Conservation and Recovery Act (RCRA) for hazardous waste disposal. These regulations essentially prohibit the disposal of substances on the hazardous waste list by conventional means (an emission standard of zero) and instead require that licensed hazardous waste transporters and disposal facilities be utilized. The regulations further specify the packaging and paperwork requirements to be followed in the disposal of hazardous wastes. At this university, hazardous waste from chemical engineering laboratories is sent to the safety office, where such waste is collected from all parts of the university. Much of this waste is unique and packaged in tiny vials. Traditional practice has been to combine vials of compatible wastes in “laboratory packs” (conventional drums lined with absorbent material) before shipping them off for disposal. The regulatory manifest, however, requires that the contents of each vial in the lab pack be separately identified, and monthly and annual generator reports required under RCRA also must include information on each vial. Because the waste in each vial may be unique, the information necessary to complete the manifest may not be routinely available, and the safety officer may encounter some difficulty in preparing the waste for shipment. A barrel of industrial waste, in contrast, is likely to contain only one waste type that is routinely generated. The paperwork burden for this barrel is correspondingly light: only one substance needs to be identified on the manifest, and the information to be provided is the same day after day.

Similarly, the State’s Clean Air regulations require an individual permit for each vent through which air pollution emissions are made. The university has approximately 20 such permits for the Department of Chemical Engineering, and each one is supposed to include a specification of the substances being emitted as well as the technology being employed to reduce those emissions. Because laboratory work varies over time, however, any specification of substances in the permit is necessarily uncertain. Moreover, if an honest effort is made to specify all of the substances likely to be utilized in the laboratory, the permit application must then demonstrate that the technology is in place to reduce emissions of the full list of substances. Nothing in this State law exempts from the permit requirement substances that are being emitted in de minimis quantities.

Page 93: The Regulatory Environment for Science - Princeton


Essentially, these two environmental regulatory programs were devised with a different model of facility in mind; it is increasingly apparent that regulations devised for industrial facilities may be poorly suited for application to research laboratories. RCRA regulations do include a partial exemption for small quantity generators, but this is of no use to the university, which clearly does not qualify as a small generator because of its substantial total volume of waste. If paperwork burdens are to be more closely related to the hazards of the regulated activity, new efforts are required to tailor regulatory requirements to the type of enterprises being regulated.

One such effort is suggested by the statement of projected rulemaking issued by OSHA on April 29, 1985, concerning health hazards of chemicals in laboratories. In that statement, OSHA observed:

Existing OSHA standards are designed to protect employees who are engaged in work involving exposure to only a few toxic chemicals during relatively standardized, continuous or repetitive processes. In contrast, laboratory workers are exposed to a multitude of toxic substances under frequently changing or unpredictable conditions. OSHA will examine whether prudent work practices and protective equipment, chosen for the specific facility and task, are more effective, feasible and economical for laboratory work than adhering to OSHA’s current substance-specific exposure standards.

Such a proposal appears to be better suited to achievement of the goals of environmental, health, and safety regulations than the traditional approaches. The effect of traditional regulations, in many instances, is to consume the time of safety personnel in the documentation of compliance rather than to stimulate such people to devote their time to analyzing problems for which routinized solutions are not readily achievable.

Restrictions on Communication of Information

It is the formal written policy of the university that people who may be exposed to hazards should be informed about the nature of these hazards and how to protect themselves and others who also may be exposed. Faculty, administration, and research supervisory personnel are responsible for promoting safe practices and for informing individuals working in laboratories about safety in connection with the work being conducted.

This policy derives from the university’s assessment of its own responsibility under ethical and general liability principles rather than from a particular hazard disclosure regulation or statutory requirement. For example, the OSHA hazard communication standard applies only to workers within the manufacturing sector. Therefore, no university laboratory is subject to its requirements.

Similarly, the State right-to-know law exempts research laboratories not involved in the production or manufacture of goods for direct commercial sale. The regulations issued under the law require an application for exemption to be filed by each laboratory. The Department of Chemical Engineering has filed such an application for its laboratories, which is presently pending before the State Department of Public Health (DPH). Under the regulation, the department is exempt until DPH rules on the application.

Nevertheless, the State law does affect university operations in other ways. A number of vendors have terminated all business in the State in response to the law, so that alternative vendors have had to be found in some instances. In addition, MSDSs being supplied with chemical products sometimes appear to have been prepared by lawyers to achieve minimal compliance with regulatory standards and to provide the least incriminating information possible, rather than by persons desirous of promoting proper management of substance hazards. For this reason, when the university is establishing safety procedures, it frequently uses MSDSs prepared by an independent service, rather than those supplied by manufacturers.

Clearly, a principal objective of academic research is publication and dissemination of research results, both to advance the state of knowledge in the research field and to advance the reputation and study of the department and its faculty. As noted, therefore, the university will not approve funding arrangements that require secrecy in the conduct of research and the dissemination of results.

Nevertheless, some limits on dissemination are common. Grant agreements may specify, for example, that publication of research results must await a release by the sponsor. In some circumstances—for example, where proprietary information has been licensed to the department for the conduct of research—the sponsor may require that articles proposed for publication be submitted for review in advance to assure that inappropriate disclosures of patentable material or trade secrets are not made.

The department’s agreement with the corporation cited above, for example, illustrates how these provisions operate. Under that agreement, the university will hold the patent on any discovery made in the course of the research funded by the corporation, subject to the corporation’s royalty-free license. If the university does not develop its patent, then the rights will revert to the corporation. The corporation is also given 10 days to review articles proposed for publication and to make any objections. Of course, the principal restraint on publication under this arrangement may not be the formal restrictions of the grant agreement but the desire to maintain a harmonious relationship with a major source of research funding.



Photo credit: National Institutes of Health


Chapter 7

Community Control of Research: Two Case Studies*

This chapter describes two cases involving precedent-setting interventions into scientific inquiry by a local government in Massachusetts. The first describes the city’s two-phase regulation of recombinant DNA molecule technology—in 1977, passage of the country’s first law regulating rDNA research, and in 1981, a revised law, enacted in response to research and development (R&D) activities of newly established biotechnology firms. The second case describes the city’s efforts to proscribe the handling and testing of certain chemical warfare agents by a consulting firm under contract with the Department of Defense (DOD). The public controversy over the second case was kindled in October 1983 and has been the subject of litigation since March 1984, when the city promulgated its first regulation.

The case studies that follow describe the events leading up to the respective regulations, discuss the possible national impacts of these types of cases, and survey the arguments presented in favor of and opposed to local regulation of research. The report also examines the general policy implications of these cases on the issue of freedom and accountability in the conduct of scientific research.

RESEARCH INVOLVING RECOMBINANT DNA MOLECULES

The controversy in Cambridge, Massachusetts, over research involving the use of recombinant DNA molecules began in Spring 1976. At that time, the administration of Harvard University was considering a proposal for the renovation of one of its biological laboratories. The purpose of the renovation was to construct a facility that would conform to requirements of the National Institutes of Health (NIH) for performing certain classes of rDNA experiments, designated at the time as “moderate risk.” NIH was also in the process of issuing guidelines that defined six classes of gene-splicing experiments: research exempted under the guidelines; P-1; P-2; P-3; P-4; and research prohibited under the guidelines. The planned Harvard laboratory was expected to meet the performance and physical containment specifications

*This chapter was prepared by OTA staff, based largely on work performed under contract for OTA by Sheldon Krimsky and on reviewer comments thereon. Dr. Krimsky is Associate Professor in Urban and Environmental Policy, Tufts University, and in 1984 was appointed Chairman of the Cambridge Scientific Advisory Committee that played a major role in the Arthur D. Little controversy detailed in this chapter. Professor Krimsky’s contract report was reviewed by numerous experts, both within and outside of OTA, including Arthur D. Little, Inc., and other participants in the controversy.

of a P-3 facility, designed to provide a protective barrier against the release of experimental organisms. A laboratory of this type required several hundred thousand dollars in equipment and special construction techniques.

When plans for the $380,000 research laboratory were being discussed by the university administration, several Harvard scientists questioned having an rDNA facility in a densely populated area close to other research and teaching activities. The issue was taken up by Harvard’s university-wide Committee on Research Policy. The Committee responded by holding an open meeting for the Harvard community, which was also attended by a member of the Cambridge city council and a reporter from a weekly newspaper, The Boston Phoenix. A news story on the meeting, “Biohazards at Harvard”—the first media report of the controversy surrounding the new laboratory—appeared in the Phoenix on June 8, 1976.1 Troubled by the story, Cambridge Mayor Alfred Vellucci decided to hold hearings on rDNA

1Charles Gottlieb and Ross Jerome, “Biohazards at Harvard,” Boston Phoenix, June 8, 1976.


research at Harvard. Mayor Vellucci was supported and advised by several scientists in the city, including some of Harvard’s own faculty.2 When the city council held hearings on June 23 and July 7, 1976, scientists and physicians affiliated with Boston-area universities and hospitals were among those who testified. Academic and biomedical research centers outside of Cambridge, contemplating rDNA research at the P-3 level, were concerned that the imposition of a city-wide ban on certain rDNA experiments would eventually affect their own institutions.

Harvard’s Committee on Research Policy agreed unanimously that the research should proceed despite its potential hazards. According to the Committee, the new facility provided a sufficient margin of safety. Harvard set up a parallel review committee comprised exclusively of scientists. Known by the name of its chairman, the Branton Committee also issued a favorable response to the proposed rDNA facility. On June 14, 1976, a week prior to the first Cambridge hearing, the Harvard Corporation authorized construction of the P-3 laboratory.3

Subsequent to the public hearings, the city council, frustrated by the technical complexity of the issues and perplexed by the polarization of viewpoints, voted on the recommendation of one of its members to establish the Cambridge Experimentation Review Board (CERB). The city council order contained no specifications about the composition of the citizen board, leaving the appointments to the discretion of the city manager. The city council also requested that Harvard and the Massachusetts Institute of Technology (MIT) accept a 3-month, good-faith moratorium on any P-3 level rDNA research. Both universities accepted the moratorium, thus giving the newly established review board an opportunity to evaluate the risks. Since the new laboratory was expected to be completed by the spring of 1977, the city’s moratorium on research did not

2For a detailed account of Cambridge, MA’s involvement in the rDNA controversy, see ch. 22, Sheldon Krimsky, “Local Initiatives for Regulation,” Genetic Alchemy: The Social History of the Recombinant DNA Controversy (Cambridge, MA: The MIT Press, 1982).

3Marc M. Sadowsky, “Rosovsky Approves DNA Research Lab,” Harvard Crimson, June 15, 1976. Also, Richard Knox, “Harvard and Genetics Controversy,” Boston Globe, June 22, 1976.

postpone any work. However, Harvard proceeded with the laboratory’s construction without assurances that an occupancy permit would be issued.

Members of CERB were appointed by the city manager in late August 1976. The manager consciously avoided the appointment of any biologists to the nine-member committee on the grounds that they were already divided on the question. (Initially appointed as a full member, the Commissioner of Health and Hospitals subsequently became ex officio.)

CERB met over a period of 4 months between September and December 1976. Harvard and MIT agreed to a 3-month extension to the good-faith moratorium on P-3 experiments otherwise scheduled to elapse in September. The citizens’ committee issued its report to the city manager and the Commissioner of Health and Hospitals in January 1977. The report stated that P-3 rDNA research may be permitted on the stipulation that additional safeguards be added to the requirements of the NIH guidelines. CERB also recommended passage of a new ordinance that included the creation of a Cambridge Biohazards Committee (CBC) to oversee all rDNA research in the city. The committee’s recommendations were enacted into law on February 7, 1977. The law was not subjected to legal challenge by any of the affected parties. Overall, public reaction to the outcome was favorable and controversy subsided quickly.

A second debate over rDNA activities erupted in Cambridge during 1980. This time the issue was over R&D activities in genetics. Biogen, a newly formed Swiss biotechnology firm seeking its commercial and management headquarters in the United States, chose a site in an area of Cambridge zoned for manufacturing and light industry. Undaunted by the city’s reaction to rDNA experiments 4 years earlier, Biogen officials notified the city manager and the health commissioner of the firm’s interest in selecting a site and its willingness to conform to all Federal and local regulations.

CBC called a public hearing on October 28, 1980. Unlike the first rDNA debate, public opposition was mild. No biologists testified against siting the new biotechnology facility or spoke in support of additional local controls. Furthermore, beyond those employed by Biogen, Boston-area scientists were not present at the hearing. Public reaction centered around the release of genetically modified biological agents into the air and water, particularly when cultures of rDNA molecules were prepared in large scale.

In response to public anxieties over commercial gene splicing, the city manager once again called on the Cambridge Experimentation Review Board to respond. Since CBC was responsible for implementing the rDNA ordinance, CERB considered it wise to involve this body in any decisions on revising the law. Thus, CERB chose to hold its hearings in collaboration with CBC. The joint committee developed a consultative relationship with representatives of Biogen, Harvard, and MIT. After several months of hearings and deliberations, the CERB-CBC review panel issued recommendations emphasizing safeguards against the promiscuous release of genetically modified organisms and, to a somewhat lesser degree, against occupational hazards. The Cambridge city council voted the recommendations into law on April 23, 1981. In contrast to the extensive publicity surrounding the passage of the first rDNA law, this new enactment was accompanied by little public discussion, and was only mildly acknowledged by the national media.

The new law established a permit system for all institutions intending to use recombinant DNA molecule technology. The ordinance distinguishes between small scale and large scale permits, the latter being required for cultures of genetically modified organisms in volumes greater than 10 liters. The deliberate release into the sewers, drains, or the air of any organism containing rDNA molecules is prohibited. For fermentation processes, the law also requires effective sterilization of spent organisms before they are released into the waste stream.4

During the second rDNA debate, the city convened a citizen review process while Biogen was in the planning stages of siting and constructing its new facility. None of the firm’s research was held up as a consequence of the city’s deliberations. Similarly, Harvard’s P-3 laboratory was scheduled for completion in the spring of 1977, several months after the city’s moratorium on P-3 experiments was terminated. Neither of the two Cambridge rDNA laws was subjected to a legal challenge. The universities considered that option but favored a negotiated settlement that avoided litigation. The 1981 Cambridge rDNA law is still in effect and is administered by the Commissioner of Health and Hospitals, who currently heads the Cambridge Biohazards Committee.

4A brief history of the passage of rDNA legislation in nine cities and towns (including Cambridge, MA) and two States is presented in Sheldon Krimsky, et al., Municipal and State Recombinant DNA Laws (Medford, MA: Tufts University, June 1982).

TESTING CHEMICAL WARFARE AGENTS

The second case centers around Arthur D. Little, Inc. (ADL), a multi-faceted management and technology consulting firm with its world headquarters in Cambridge, Massachusetts. The firm, which has been operating in Cambridge since the early part of the century, has offices in Europe, Canada, and South America, and a work force of 2,500.

Around June 1982, ADL decided to renovate an existing chemical laboratory with state-of-the-art safety features that would enable the firm to take on work with highly toxic chemicals. The renovated laboratory was designed to meet the specifications of DOD for working with “chemical surety materials”—chemical warfare agents—consisting mainly of nerve and blister agents.

The company’s investment in the laboratory exceeded $750,000. The Philip L. Levins Laboratory was planned to occupy 1,300 square feet in ADL’s Acorn Park, a 40-acre complex located at the northern boundary of Cambridge, near the adjoining towns of Arlington and Belmont. Because of the extensive renovation required, ADL applied for and was issued a building permit on December 10, 1982. Approximately a month later, ADL personnel met with the Cambridge city manager,


the fire chief, and officials of the police department to inform them about the new testing facility. Notification of the police was in conformity with DOD stipulations; surface shipments of the chemical nerve and blister agents require a police escort. ADL disclosed the general nature of the facility and indicated that, among its functions, it would be used for testing chemical agents supplied by the army. According to an official of ADL, the company was not requested to provide “specific names and toxicities of the materials it was planning to test in the new laboratory.”5

ADL requested that city officials keep confidential the location of the laboratory and the type of work to be undertaken there. Public safety considerations were given by the firm as the reason it requested nondisclosure. The firm maintained that its policy of confidentiality would reduce the chances that the laboratory would be a target for vandalism or terrorism. The city manager, police, and fire chiefs complied with the request. ADL filed for an occupancy permit on May 18, 1983. The certificate of occupancy was issued on May 25. The laboratory was approved for operation by DOD on September 19, 1983.

Responding to the cooperative arrangement that existed between the fire departments of Cambridge and its neighboring towns, ADL also contacted officials of Arlington and Belmont in September 1983 to inform them of the new facility. At a meeting with Arlington’s town manager, officials of the police and fire department, and the town’s civil defense officer, ADL continued its policy of requesting confidentiality about the nature of its facility.

Arlington’s town manager, however, informed ADL that he planned to introduce the issue of the laboratory at the upcoming meeting of the town selectmen. On that same day, October 14, 1983, ADL issued a press release announcing the establishment of a laboratory to be used for “advanced chemical analysis of toxic and hazardous chemicals so as to develop improved methods for detecting, identifying, and detoxifying such materials and new means of protecting people from

5Reid Weedon, Vice President of Arthur D. Little (ADL), comments made on Mar. 7, 1985, during a community debate between ADL and the North Cambridge Toxic Alert.

them.” The news release omitted any mention of chemical nerve or blister agents. On October 20, 1983, the story of the laboratory was reported in the Arlington Advocate and the Boston Globe. The Globe speculated that chemical warfare agents may be among the agents handled at the facility. On October 17 and 24, 1983, respectively, the Arlington selectmen and the Cambridge city council held public meetings at which the ADL matter was discussed.

At the October 24 Cambridge council meeting, the nature of the chemical warfare agents supplied to ADL under DOD contract was disclosed by company officials. By that time, the company had begun work on the DOD contract. The council also heard residents of the North Cambridge community voice a strong protest against ADL’s testing of chemical nerve and blister agents adjacent to a densely populated area. In response to public concerns, at the same meeting the city council voted to establish a “citizens’ scientific advisory board” to review the risks associated with the ADL laboratory. Individual councillors requested that ADL accept a moratorium on its tests of chemical warfare agents until the city completed its risk assessment. ADL, having to contend with its DOD contract requirements, did not accept a moratorium.

By early winter, the Scientific Advisory Committee (SAC) had still not been appointed, although responsibility for implementing the orders passed by the city council had passed to the city manager. Long delays between council orders and their implementation are not unusual in Cambridge. As the city’s principal fiscal agent, the manager must consider the financial impacts of council orders and the practical consequences of its policies. In this instance, however, the hiatus between the time the SAC was created by the council and the time its members were appointed is indicative of the city manager’s hope that the controversy could be resolved quickly. In late winter, however, the conflict intensified when the Cambridge Commissioner of Health and Hospitals issued an emergency regulation (March 13, 1984) that prohibited “testing, storage, transportation and disposal of five specified nerve and blister agents within Cambridge, until SAC and an independent hazard assessment has been completed and these recommendations have been reviewed by the Commissioner’s office.”6

Three days later, ADL received a temporary restraining order against enforcement of the regulation from a Massachusetts superior court judge. On March 27, 1984, the temporary restraining order was converted into a preliminary injunction. The injunction against enforcement of the city regulation remained in effect until February 27, 1985, after a decision was issued by the Superior Court.

The city manager appointed the membership to the Cambridge SAC on March 26, 1984. Following established tradition, the manager accepted recommendations from the council. The committee was comprised of 16 members, including scientists, individuals in the fields of public and occupational health, and residents from North Cambridge.

SAC completed its inquiry and issued a report in September 1984. The cornerstone of its decision was a series of worst-case scenarios in which different volumes of nerve agent are hypothetically released into the environment. The analytical calculations for the worst-case scenarios were developed by a risk assessment consultant hired by the city. Building on those calculations, SAC concluded:

. . . the benefits of research with these chemicals do not justify lethal risks to the general public. For this reason, the SAC believed that storage and testing of these chemical warfare agents within the densely populated city of Cambridge in the quantities and concentrations used by ADL is inappropriate.

6Melvin Chalfen, Commissioner of Health and Hospitals, City of Cambridge, “Order on the Testing, Storage and Transportation of Chemical Nerve and Blister Agents,” Mar. 13, 1984.

The majority of the SAC members judged the risks associated with any such work to be unacceptable.7

On receipt of the SAC report, the Commissioner of Health and Hospitals made his interim order—prohibiting any person from testing and handling three nerve agents and two blister agents—into a permanent regulation on September 18, 1984. Hearings before the Massachusetts Superior Court resumed. The judge severed the issues into the questions of Federal supremacy and the reasonableness of the Cambridge order. On December 14, 1984, the Court ruled in favor of the city on the supremacy issue. The decision on whether the Cambridge regulation was reasonable or not and whether it conformed to State law was rendered on February 26, 1985. Once again the ruling favored the city. On the following day, the Superior Court judge proclaimed the September 1984 order of the city “valid and enforceable.” The injunction, which had been in effect for 11 months, was removed by the court order.

ADL appealed the case to the Massachusetts Appeals Court on March 12, 1985. The court gave ADL immediate relief by reinstating the injunction against the order, pending the outcome of the appeal. In response, the city petitioned the Supreme Judicial Court (SJC) of the State and asked that it take the case over from the Appeals Court. The SJC agreed and heard the case on April 4, 1985. In a four to one decision issued on August 1, 1985, the SJC upheld the Cambridge regulation banning the testing, storage, transportation, and disposal within the city of the five chemical warfare agents.

7Scientific Advisory Committee for the City of Cambridge, Report to the City Manager on the Use of Chemical Warfare Agents at Arthur D. Little Levins Laboratory (Cambridge, MA: September 1984).

COMPARISON OF THE CASES

Origins of Local Regulations

The city’s involvement in both rDNA research and chemical weapons testing started with citizen concerns over the research slated for a renovated laboratory facility. Harvard’s P-3 laboratory, designed to conform to NIH specifications for working with rDNA molecules, was in its planning stages when the city council learned of its prospective use. In contrast, Arthur D. Little’s testing laboratory for chemical toxins was completed and set for operation by the time its use

became known to the Cambridge citizens. In both instances, existing facilities owned by the respective institutions were significantly renovated. Building permits were obtained and several hundred thousand dollars in renovation costs were allocated. The planned P-3 laboratory was reported in the media after information was obtained at a university hearing attended by several outsiders. Harvard neither attempted to keep the laboratory’s presence confidential nor sought to inform city officials and the public of its intentions to construct the facility. Funding for the renovated moderate containment P-3 laboratory and for the research for which it was designed came from science funding agencies of the Federal Government.

ADL’s Levins Laboratory was paid for entirely out of company funds. The laboratory was planned specifically for the testing of toxic substances. It was anticipated that one major source of funding for recouping the investment in the laboratory was DOD. Other potential clients were Federal and State environmental agencies and those segments of the private sector that, increasingly, have become responsible for the control of toxic substances. ADL sought to have the laboratory’s purpose and function known only to a select number of local officials in Cambridge, Arlington, and Belmont, Massachusetts. Public safety was the company’s reason for nondisclosure of the laboratory’s purpose to the general public. ADL’s efforts to preserve the confidentiality of the lab and the chemical warfare agents it was testing were thwarted when a local official from the neighboring town of Arlington, informed about the facility, filed a report with the town selectmen.

Types of Local Interventions

When the Cambridge city council learned of Harvard’s plans for a new laboratory, it requested both Harvard and MIT to accept a good-faith moratorium on rDNA experiments classified as P-3 or greater under the 1976 NIH guidelines, until CERB issued its recommendations. Harvard and MIT complied. No other intervention was taken by the city until the release of CERB’s report.

In contrast, ADL was unwilling to accept a general moratorium on its testing of chemical nerve and blister agents pending investigation by a citizens’ committee. However, on February 16, 1984, ADL did agree to a 30-day moratorium on performing any work on new contracts involving chemical warfare agents.

In neither of the two cases did the city attempt to withhold building permits or change the zoning regulations. ADL obtained its building permit in December 1982, long before the city council became involved in the issue. Neither of the voluntary moratoria affected any ongoing research projects. The ADL voluntary moratorium was short-lived and probably not disruptive. The rDNA moratorium was targeted to research that awaited completion of the new laboratory. There were several months between the end of the rDNA moratorium (January 1977) and the opening of the P-3 facility at Harvard (spring 1977).

In response to ADL’s unwillingness to accept a general testing moratorium, an action that might have threatened its contract with DOD, the city council urged the Commissioner of Health and Hospitals to act. After several months of discussion and consultation, the commissioner issued an interim public health order that prohibited the testing of five chemical warfare agents. A court injunction kept the order from being enforced during the entire period of litigation. As of the writing of this report, the commissioner’s order was the sole nature of the city’s intervention into ADL’s testing program. SAC did recommend an ordinance that, if passed, potentially could affect research at universities and other R&D firms. To date, the proposed supertoxin ordinance has not been acted on by the city.

One month after CERB issued its report on rDNA research, the city council passed an ordinance incorporating the principal elements of the recommendations. The rDNA law, amended in 1981, requires that all individuals or institutions undertaking experiments involving the production of recombinant DNA molecules must be licensed. Except for minor differences, the requirements for research are ostensibly equivalent to the guidelines issued and periodically amended by NIH. The law sets additional requirements for a large scale permit for which there is no counterpart in the NIH guidelines.

In conclusion, the Cambridge rDNA ordinance followed the general framework of the Federal NIH guidelines. It permitted academic and commercial research to continue while incorporating additional safeguards. The city's intervention in the testing of chemical warfare agents involved a specific, local prohibition against the use of five chemicals. This was the first stage in a long-term plan supported by some city officials to regulate all highly toxic chemical agents in research and commerce. In May 1984, Health Commissioner Murray Chalfen issued a report that included a proposed ordinance on toxic chemicals and hazardous materials. The proposal, along with the SAC's recommendations, is currently under review by the city.

Stage of the Scientific Enterprise Affected

The first rDNA ordinance in Cambridge had its direct impact on university research, particularly the field of molecular genetics. The regulatory intervention was directed at a specific technique of scientific inquiry, namely, plasmid-mediated gene transfer, which is of fundamental significance to genetics research. Any scientific discipline that planned to use the technique, however, was ipso facto under local regulation.

The revised rDNA law of 1981 was a direct response to the emergence of commercial biotechnology. Its principal effect was on R&D applications of gene splicing. Special attention was given to large volumes of genetically modified organisms. The utilization of large cultures represents a stage beyond basic science. Organisms genetically modified to produce a desired product are tested in pilot plant bioreactors with capacities of a hundred to several hundred liters, a stage in product development prior to manufacturing and production. The Cambridge law sets environmental and occupational safety requirements specifically for large cultures of rDNA-generated organisms.

ADL contracted with DOD to develop detection kits for nerve agents, to study the means by which fabrics may be made impermeable to them, and to investigate methods of detoxification. The firm's R&D work incorporated the expertise of analytical chemists, product development chemists, and electronics specialists. The order issued by the city on chemical warfare agents was not targeted at a particular research technique or methodology, as in the rDNA case; instead, it prohibited the use of five substances cited in an ADL-DOD contract. The regulation was, therefore, directed at the application of science and technology for solving targeted problems. In contrast to the rDNA case, ADL's research was not designed to generate new science. The purpose of the research was to supply the Army with new information on the handling, detection, and detoxification of chemical warfare agents.

Social Risk Assessment

The two cases illustrate different approaches to social risk assessment. This is particularly evident in the composition, goals, and functions of the two citizens' committees. CERB was a committee composed of nonexperts in the subject matter under consideration, namely molecular genetics. Of the eight members, the one who came closest in expertise to the field was a physician, board-certified in infectious diseases. The membership of the committee was chosen to reflect racial, ethnic, and neighborhood diversity. It was divided equally between men and women. In an internal memo, one member likened CERB's function to that of a jury in a legal proceeding.8 This memo clarified the role of nonexperts in a technical controversy. CERB was asked to review the debate among scientists on the safety of rDNA research; but it was not asked, nor was it equipped or prepared, to undertake a risk assessment. After receiving testimony from experts, CERB members weighed the strengths of the arguments and on that basis made their decisions.

8. A detailed account of CERB's decisionmaking process is contained in Sheldon Krimsky, "A Citizen Court in the Recombinant DNA Controversy," Bulletin of the Atomic Scientists, vol. 34, No. 8, October 1978, pp. 37-43.


In contrast, the Cambridge Scientific Advisory Committee was composed of experts and nonexperts with respect to the problems of highly toxic agents. Of the 16 members, 10 had advanced degrees in one or more of the relevant fields: physics, chemistry/biochemistry, chemical engineering, biology, and public health. SAC was presented with three tasks: 1) to undertake a risk assessment of ADL's use of chemical warfare agents, 2) to make a determination about acceptable risks, and 3) to advise the city council on a risk management plan.

Although the structures and goals of the two social risk assessment processes differed, both SAC and CERB were given the charge of determining whether the respective research activities should be prohibited, unconditionally permitted, or conditionally permitted. Also, both processes resulted in a proposed framework of risk management involving the creation of a new institutional structure for the city.

Parties Affected by the Proposed or Actual Regulations

The first Cambridge rDNA law had a direct impact on biomedical scientists, including biochemists and molecular geneticists who study gene structure and function. The revised law primarily affected R&D firms that were investigating commercial and medical applications for genetically modified organisms. In the former case, scientists responded as a community to the prospect of being regulated and opposed differential standards of research between Cambridge and other parts of the country. In the latter case, Harvard and MIT joined with Biogen to ascertain the impacts of licensure on rDNA research in their respective institutions. The revised law created new formal requirements for academic and commercial institutions, but the actual requirements for individual investigators in academe remained unchanged.

The Cambridge emergency order on nerve and blister agents did not single out the names of any institutions. However, no institution other than ADL is known to have been directly affected. The agents prohibited for use were taken directly from an ADL-DOD contract. For all practical purposes, therefore, the order was directed at ADL. The regulations covering the use of supertoxins recommended by SAC were, however, much broader in scope and, if passed, probably would affect research at other institutions. For example, SAC proposed that certain designated hazardous materials proposed for testing, use, storage, or disposal within the city must be reported to the Commissioner of Health and Hospitals at least 3 months prior to the date of planned entry into the city. The substances designated for reporting include: chemical warfare agents (as provided in a list), other nerve agents of different chemical structure from those listed when used in chemical weapons R&D, biological warfare agents, and other highly toxic agents as the Commissioner may designate. SAC also proposed that each use of the regulated agents be reviewed by the Commissioner and given a site evaluation in writing after appropriate information is provided. Should the Commissioner find that the use of the regulated chemical presented an unacceptable hazard to public health or safety, then a site assignment could not be given and the Commissioner could prohibit the use of the materials. And, finally, SAC recommended that in addition to chemical warfare agents, the City of Cambridge develop policies to regulate other supertoxins.

To date, the city has not acted on these recommendations, which, if adopted, could significantly affect university research. At the least, the proposed regulations would apply to any experimental uses of substances designated by DOD as chemical warfare agents. Most broadly interpreted, the rules might regulate research employing any highly toxic chemical such as dioxins, chemotherapy agents, or potent mutagens. In the former case, the impact on academic scientists would be minimal, because chemical warfare agents are not widely used in university laboratories (although analogs and close derivatives of them may be more readily found). In the latter case, many chemical and biomedical facilities would be affected, because it is not uncommon to find some quantities of highly toxic agents in most well-equipped laboratories.


Legal Issues

The authority of cities and towns to enact health and safety regulations is firmly established under State laws. Both the rDNA law and the order on chemical warfare agents are examples of such powers exercised by the city of Cambridge. Three generic legal questions arise when the city regulates an activity under public health and safety statutes: 1) Are there any procedural errors in the process of issuing regulations? 2) Is the regulatory action arbitrary or capricious? 3) Is the regulation preempted by, or does it conflict with, Federal and/or State laws or authority?

No legal challenges were directed at either of the Cambridge rDNA laws. Similarly, the laws of other cities and towns were also enacted and implemented without challenge.9 Because no Federal rDNA laws were passed and because Congress has yet to express a policy on whether it occupies the field of regulations for gene splicing, the argument for preemption in either of the rDNA cases is generally considered weak. NIH guidelines may have the force of law for those who receive Federal funds, but the agency lacks legislative authority to preempt other political jurisdictions from passing more stringent rules.

Harvard and MIT were prepared to challenge the legality of the rDNA laws if they had prohibited or substantially inhibited scientific research. As it turned out, the universities avoided litigation and accepted rDNA standards somewhat stricter than those which were required of other academic institutions in the country. The "Balkanization" of standards for scientific research was a great concern to researchers during the Cambridge debate and for years thereafter as Congress considered Federal legislation; but the predicted adverse consequences on scientific research from local rDNA laws never materialized. None of the 13 communities that passed rDNA legislation have placed undue burdens on scientific research, and scientists have adapted easily to the additional local requirements.

9. Sheldon Krimsky, "Local Monitoring of Biotechnology: The Second Wave of rDNA Laws," Recombinant DNA Technical Bulletin, vol. 5, No. 2, June 1982, pp. 79-85. To date, the following cities and towns have passed ordinances on recombinant DNA research. In Massachusetts: Amherst; Belmont; Boston; Cambridge; Canton; Lexington; Newton; Shrewsbury; Somerville; Waltham. In other States: Berkeley, CA; Princeton, NJ; Emeryville, CA.

Cambridge's public health regulation on chemical warfare agents took a different legal course. ADL challenged the order immediately after it was issued. Counsel for ADL argued that the regulation was invalid on all three grounds cited above. The legal question with the widest implications was whether DOD-sponsored research performed at a private facility was protected against local regulations. Is this a case where Federal supremacy over local authority applies?

ADL offered the following arguments:

1. Congress authorized DOD to establish a chemical warfare program, and this includes the authority to issue requirements for handling and disposing of chemical warfare agents.

2. The framers of the U.S. Constitution as well as Congress intended the Federal Government to have exclusive responsibility for national defense. The city's regulation prohibiting ADL from conducting defense-related testing of chemical warfare agents is tantamount to interference with government functions and represents a clear conflict with the Federal interest.

3. If Cambridge is free to prohibit such work by a duly contracted agent of the Federal Government, then so too is any other community. If all jurisdictions followed Cambridge, Federal programs in chemical warfare research would be frustrated.

4. Because ADL is a contractor of the government, the firm is invested with "derivative sovereign immunity," which allows the supremacy clause of the Constitution to apply to it with equal force to that of the Federal Government.

Counsel for the city argued that two conditions must be satisfied for Federal supremacy to hold: either the Federal Government has explicitly preempted the field of toxic substances regulation, or a fundamental conflict exists between the Federal and local governments on the regulation of these substances. According to the city, Congress never stipulated that testing of toxic substances would be exclusively regulated by the Federal Government. Moreover, on the question of jurisdictional conflict, the city maintained that the Federal Government possesses other facilities at which to carry out such tests. The facts do not demonstrate that prohibition of such tests in Cambridge represents a fundamental conflict between local and Federal purpose.

On December 14, 1984, a State Superior Court judge ruled that Federal supremacy was not in effect for this case. Subsequently, on February 26, 1985, after reviewing arguments on the reasonableness of the regulation and its legality with respect to State law, the same court found the regulation "valid and enforceable." The city's arguments prevailed on all the legal points.

The Massachusetts Supreme Judicial Court also upheld the regulation, stating in its decision on August 1, 1985, that the regulation constituted a permissible attempt by the city to protect its inhabitants under local police powers derived from State statutes. The court rejected arguments by ADL that the ruling violated the firm's right to due process or constituted an unjustified interference in its contract with DOD. The court also ruled that the regulation is not invalid under the Supremacy Clause of the U.S. Constitution. The SJC failed to find within Federal statutes congressional intent to preempt local communities from passing health and safety regulations for chemical warfare agents. The court affirmed the right of local health authorities to prohibit activities as long as the regulations are not "unreasonable, arbitrary, whimsical, or capricious."

The legal similarity between the rDNA and the chemical weapons issues is very narrow. In both cases there are Federal guidelines or regulations for certain experimental activities. In both cases, the city chose to augment or supersede the role of a Federal agency. But from that point, the legal issues evolved quite differently.

IMPACTS OF THE CITY’S INTERVENTIONS BEYOND ITS BORDERS

The 1976 rDNA debate was covered extensively by the national and international media. Little research has been done on the impact of the debate outside the United States, but within this country there is documentation about direct and indirect effects on other municipalities and on national policies. Nearly two dozen city/town governments and State legislatures considered passing laws that would have extended coverage of the NIH guidelines to privately funded institutions. In response to the first Cambridge debate, two States and four local governments enacted rDNA legislation. Several communities modeled their citizen review process closely on that of Cambridge. The City of Berkeley passed an rDNA law that incorporated verbatim sections of the Cambridge ordinance. By 1978, however, the ripple effect of the first Cambridge rDNA controversy had taken its course and was affecting only a handful of university communities. The national debate subsided and so did the involvement of town and municipal bodies.

A second wave of community responses broke after Cambridge passed its 1981 law. An additional seven communities in the greater Boston area, including the City of Boston, passed similar laws directed at commercial biotechnology but also applicable to scientific research. In an unusual case, a law passed in the City of Waltham, Massachusetts, prohibited the use of human experimental subjects in recombinant DNA research. This is, perhaps, the first U.S. law prohibiting human genetic engineering.

The rDNA events in Cambridge also had reverberations in Congress. The publicity surrounding the Cambridge controversy was one of the key factors influencing some Members of Congress to file bills that would place gene splicing under Federal regulation. Of the two leading bills, the Senate version, sponsored by Edward Kennedy (D-MA), paid close attention to the events in Cambridge.10 The Kennedy bill contained weak preemption language, signifying a respect for the rights of communities like Cambridge to establish standards of safety for rDNA research in excess of those required by the Federal Government. Despite considerable congressional activity, however, no legislation emerged during the years of peak public interest between 1977 and 1980.

10. The leading congressional bills were introduced by Representative Paul Rogers (D-FL), H.R. 4759, on Mar. 9, 1977, and Senator Edward Kennedy (D-MA), S. 1217, on Apr. 1, 1977.

The extensive publicity around the citizen participation process in the Cambridge rDNA affair probably did have some influence on the reorganization of the Recombinant DNA Advisory Committee (RAC) in 1978. Department of Health, Education, and Welfare (HEW) Secretary Joseph Califano expanded the size of the RAC from 16 to 25 members to accommodate more public participation. Cambridge became a model for environmental groups like Friends of the Earth and the Sierra Club, which lobbied Congress and HEW for broadening public involvement in the decisionmaking process. One of the members of the Cambridge citizens' committee was appointed to an expanded RAC in 1979, when 30 percent of its membership was drawn from the fields of public health and public interest.

The ADL debate over the testing of chemical warfare agents is over a year old. It has been accompanied by a limited amount of national publicity. Lower court decisions were picked up by three national television news networks. The ABC TV news magazine program "20/20" produced a segment on the debate. National Public Radio also broadcast a program on "Morning Edition," October 3, 1984, describing the Cambridge-ADL debate.

Cambridge is one of at least 12 cities in the United States containing firms that have contracted with DOD to conduct research with chemical warfare agents. This list became public as a consequence of the "20/20" broadcast. There have been no reported actions taken by any of these communities in response to the Cambridge prohibition, but it is too early in the legal process to speculate whether the case might serve as a precedent for local regulation of research involving highly toxic chemicals.

ARGUMENTS FOR AND AGAINST REGULATIONS

Recombinant DNA Controversy

For Regulation

NIH released its first set of guidelines for rDNA research on the same day the city of Cambridge held public hearings to discuss Harvard's planned P-3 laboratory. The guidelines were issued in response to concerns by molecular biologists that gene splicing might result in the unexpected creation of a new epidemic pathogen, a toxin-producing bacterium, or a coliform bacterium harboring a human cancer virus. In Cambridge, the debate centered on whether the research should be done at all and whether the NIH guidelines provided a sufficient margin of safety against an accident or unintended outcome.

Scientists spoke forcefully on both sides of the issue. Those against the use of a P-3 facility at Harvard for rDNA experiments cited three deficiencies in NIH's role as the overseer of the research. First, they argued that the guidelines were constructed from untested a priori hypotheses, and they placed little confidence in the regulation's effectiveness as a containment strategy. Second, it was pointed out that the NIH guidelines had no force over R&D activities that were not funded by the Department of Health and Human Services. At the time, biotechnology firms had not sought entry into the city, but that was thought not to be far off. Third, opponents argued that NIH had not enlisted sufficient participation from the general public and other segments of the scientific community. Some scientists maintained that rDNA molecule technology was an unknown and uncharted area of research with unpredictable risks. They felt it should not be done in proximity to classrooms and other research activities.

When the city was approached by the first of several biotechnology firms planning to locate in Cambridge, a new set of public anxieties arose. By that time the city's rDNA law had been in effect for 3 years. The principal rationale for passage of the revised law was the concern over large volumes (over 10 liters of culture) of genetically modified organisms, and the potential hazards associated with occupational exposure and environmental release.

The citizens' committee was not aware of any regulatory body at the Federal or State level which set standards for large-scale work involving rDNA molecules. After consultation with experts in fermentation engineering and the sterilization of spent organisms in large vessels, the citizens' committee proposed revisions in the 1977 law. Among the restrictions cited in the revised law was:

There shall be no deliberate release into the environment, that is the sewers, drains, or the air, of any organism containing recombinant DNA and further that any accidental release shall be reported to the Commissioner of Health and Hospitals within five days.11

The new law created a system of accountability according to which biotechnology firms were required to have special licenses for large-scale work. The system included periodic inspections to ensure that the environmental release provision was respected by the technology and practiced by the institution.

Against Regulation

The principal opposition to local regulation of rDNA research in 1976 came from scientists, graduate students, and university administrators. They emphasized the confidence that the vast majority of scientists had in the NIH guidelines. RAC was cited as an exemplary system of oversight and one that a local community could not duplicate. The importance of uniform national guidelines was stressed. Science, it was said, cannot flourish in a patchwork of regulations. If Cambridge enacts restrictive rDNA regulations, scientists will find it necessary to move away from the city to other areas more conducive to their research. The universality of the scientific method requires uniformity in the social context within which research is carried out. This norm would be violated if each community passed its own research guidelines.

Opponents of regulation also stressed the benefits of rDNA research. These benefits might be delayed significantly if restrictive local regulations were established. Those critical of local regulation emphasized that the risks of rDNA research were at best hypothetical and quite likely nonexistent, while the benefits were real. Not a single case of illness had been linked to an agent of an rDNA experiment. In their view, a significant margin of safety was already provided by the NIH guidelines.

11. Krimsky, et al., Municipal and State Recombinant DNA Laws, op. cit.

The Case of Chemical Weapons Research

For Regulation

The arguments for regulating chemical warfare agents centered around the potential adverse public health consequences associated with their accidental or intentional release. The Cambridge Scientific Advisory Committee examined several worst-case scenarios in which quantities of 10, 100, and 500 ml of nerve agent were hypothetically released from the testing facility. SAC concluded that such an accident was unlikely but not impossible; in the event of a 100 ml release, members of the general public might be located within range of lethal doses of such agents.12 The committee cited an independent consultant report that estimated that between 10 and 30 members of the general public might be located within range of lethal levels of such agents in one of several worst-case scenarios. The case in question involved a sudden release of 100 ml of sarin in the form of a gaseous cloud.13

The SAC report stated that there were no satisfactory regulatory mechanisms for managing the use of supertoxic agents in the city. Having concluded that even relatively small quantities of chemical warfare agents used in R&D could pose a risk to the public, the committee proposed a municipal ordinance for regulating such agents in particular and supertoxins in general. SAC made no distinctions in its regulatory program between types of R&D or between university and nonuniversity uses of supertoxins.

12. Scientific Advisory Committee for the City of Cambridge, op. cit., p. 2.

13. TRC Environmental Consultants, Inc., Community Risks from Experiments with Chemical Warfare Agents at Arthur D. Little (Hartford, CT: 1984).


More than half the members of the committee favored a ban on any research involving chemical warfare agents on the grounds that the "risks associated with any such work [are] unacceptable." A smaller number of members expressed opposition to the research on ethical grounds: that any work on chemical weapons is morally reprehensible. They believed that no clear distinction can be drawn between offensive and defensive research. The city's legal arguments for its regulation, however, focused exclusively on issues of public health and safety. City council debates also centered on public health issues, in contrast to the rDNA episode, when some councillors questioned the morality of genetic engineering. To some degree, however, the psychological impact of the term "chemical warfare agents" was a relevant factor in the public's sensibility to the issue.

Against Regulation

Arthur D. Little's case against the city's ban can be classified according to the following categories: 1) safety of the facilities; 2) errors and deficiencies of the SAC report; 3) discriminatory nature of the action; 4) misunderstood goals of the research; 5) compliance by ADL with all Federal, State, and local laws and regulations; and 6) violation of Federal supremacy.

1. The company maintained that its laboratory is among the safest that exist for the work intended. The laboratory satisfied DOD specifications for handling chemical warfare agents. ADL was also in compliance with Federal and State environmental regulations. The firm argued that its laboratory advances the state of the art for the safe handling of hazardous substances. To further increase the margin of safety, ADL agreed not to store more than certain minimal volumes of the chemical agents.

2. ADL also argued that the committee's technical analysis was flawed. According to company spokespersons, the report drew conclusions from assumptions that do not reflect ADL's operations. One of the risk scenarios developed by SAC assumed greater quantities of chemicals than ADL claimed it would ever have on hand. Furthermore, SAC did not determine the probability of its worst-case accidents. It did not describe how chemicals stored in secure containers could be released into the environment by some accident. The SAC report did not take account of the many barriers there are to the kind of accident it postulated. In fact, if there were an accident, the company held, the effects would not be felt beyond ADL. According to the company, the city's attempt to ban the five chemicals was unreasonable and invalid because it was not shown that the research posed any potential health hazard.

3. The company also believed that the city's action was discriminatory. Selected city officials, including the city manager, were first informed about ADL's plans for the laboratory in January 1983, but it was more than a year later, and after an occupancy permit was issued, that ADL was ordered to cease its testing. In its letter to the public, ADL wrote: "We worked closely with the Cambridge City Manager and the relevant public safety officials throughout the planning and construction of the facility, and they expressed complete confidence in its safety and security. We hired outside consultants to check our findings and designs."

ADL also faulted the city for not allowing the company to remedy any defects that may have been found in its safety program. As a result, the city's prohibition imposed upon ADL nearly a million-dollar loss in the cost of the laboratory in addition to substantial losses in present and future DOD contracts.

ADL also argued that it had been singled out for regulation. According to the company, there are many risks to the people of the city that are far greater than its testing program, yet the city focused attention on a state-of-the-art testing laboratory that uses small quantities of chemicals. If the city wishes to regulate toxic substances, ADL proposed, it should treat all institutions and all substances on a comparable basis. The determination to regulate should not depend on whether the research is done at a profit or nonprofit institution, involves basic or applied science, or is carried out under contract from DOD or under a grant from NIH.

4. ADL correctly surmised that some of the public concern over its research was motivated by concerns over the morality of chemical weapons research. In a letter to the public, ADL clarified the ethical basis of its contract with DOD:

We believe something must be done to control the threat of uncontrolled toxic chemicals in the environment. We have the professional capabilities and the resources to help solve some of the inherent problems. That is why we went to the expense of constructing a safe, secure facility for research designed to find better ways of protecting people from the effects of uncontrolled environmental hazards.

The firm assured the citizenry that its research on chemical and nerve agents is exclusively for "defensive and protective purposes."

We are using existing substances in analytic tests in order to develop better methods of detecting minute quantities of these agents in the environment and safer, more effective means of destroying them on a large scale. We are also working to develop better protection, including clothing for people who might be exposed to these substances.14

5. All Federal, State, and local regulations had been met before ADL's lab went into operation. The facility had been inspected by DOD, State agencies, and city officials. The company received an occupancy permit. The city's ban thus was perceived by the company as an afterthought to all regulations that were in effect prior to and during the time the laboratory was under construction.

14. John F. Magee, President of Arthur D. Little, Letter to the Public, Jan. 28, 1985.

6. The supremacy arguments have been outlined in detail in the section of this report comparing the rDNA research and chemical weapons testing. In summary, ADL contended that the city has no authority to interfere with a contract of the Federal Government when all Federal and State safety standards are met. The city's ban on the testing and storage of the agents is argued to conflict with the Federal authority governing national defense and is therefore unconstitutional. If other municipalities passed similar prohibitions, there would be a direct conflict between the policies of the U.S. Government and the actions of local communities. Under such conditions, the policies of the Federal Government are preemptive, the company stated.

Although the principal opposition to the city's action banning the testing and storing of five chemical warfare agents came from ADL, there was some criticism expressed by university representatives about the proposed regulations for supertoxins contained in the SAC report. MIT officials argued that SAC's approach to chemical regulation would have a "harsh and adverse effect on the conduct of research in chemistry, biology, nutrition and food science" at universities. Because SAC made no provisions for volume exclusions in its proposed regulations of chemical warfare agents or closely related chemicals, many substances used in the course of research would fall under the proposed criteria. According to the MIT officials, if enacted, these criteria would be an obstacle to scientific research without offering any additional protection to public health.

GENERAL POLICY IMPLICATIONS

The central issue underlying both case studies is the extent to which local communities are justified in regulating research. Beyond this similarity, there is considerable variation in how these cases relate to issues of scientific freedom and social accountability. The rDNA case involves a well-defined scientific population, a Federal funding agency, local universities, and a city government. The case of chemical weapons testing is about private contract research. It too involves city government and a Federal funding agency. But a well-defined scientific constituency is absent.

Three policy issues stand out in the rDNA episode. First, should science be self-regulated and therefore insulated from State and local laws? Second, does NIH oversight of rDNA experiments provide a legal basis for Federal supremacy and, if not, should Congress establish legislation toward that purpose? Third, to what extent, if at all, is scientific research a right granted under the First Amendment?

NIH has been the de facto regulator of federally funded rDNA experiments. Scientists, however, have had an influential role in the establishment and implementation of guidelines. Through the NIH structure, the molecular geneticists have had what has been ostensibly a self-governing apparatus somewhat analogous to a peer review process. The Cambridge debate threatened this tradition of self-governance, which began at Asilomar and evolved into the formation of the Recombinant DNA Advisory Committee. The city also challenged the idea of uniform safety standards for experiments in molecular genetics.

Although Cambridge scientists were the only ones directly affected by the city's intervention, the possibility of multiple sets of guidelines for rDNA technology, based in part on local standards, troubled scientists throughout the country. Many biologists who opposed congressional intervention preferred it over a patchwork of regulations. According to Rockefeller University biologist Norton Zinder, the uniformity of scientific practice transcends local interests:

The proliferation of local options with different guidelines in different states and different cities can only lead to a situation of chaos, confusion, and ultimately to hypocrisy amongst the scientists involved.15

Most legal scholars agreed that the NIH guidelines did not provide a basis for preempting the Cambridge law. No judicial challenge was made on the reasonableness of the Cambridge rDNA law in the context of the Federal guidelines. Perhaps because the Cambridge rDNA laws (first and second) added very little to the substance of the NIH guidelines, a legal challenge was avoided. Had the city banned rDNA research, the question of preemption most certainly would have been addressed in litigation, if not through congressional action.

Preemption was not the only legal question raised in the early rDNA debate. Facing the prospect of Federal regulation, some scientists argued that rDNA legislation would infringe on their rights to engage in research. Prompted by several inquiries, in 1977 the American Civil Liberties Union (ACLU) began the task of formulating a policy on whether, or to what extent, scientific inquiry is a civil liberty protected under the First Amendment. Special committees of the ACLU began drafting policy statements that provided a civil liberties perspective on scientific research. Thus far, the Board of Directors of the ACLU has not reached a consensus on the wording of such a policy.

15. National Academy of Sciences, Research With Recombinant DNA: Academy Forum (Washington, DC: December 1977).

ADL's legal battle with Cambridge did not attract sympathetic support from other scientists. Most university-affiliated scientists did not view the possible restriction on specific contract research as a conflict between the local community and freedom of scientific inquiry. The applied nature of the testing work and the fact that the results would probably be classified contributed to this attitude.

The policy dilemma is best interpreted as a conflict between the rights of a firm to accept Federal contract research under Federal guidelines and the rights of a city to set its own standards of public health and safety, including a prohibition of research it deems hazardous. The outcome of the ADL case has implications for any federally contracted research on nongovernmental property that involves hazardous or potentially hazardous procedures or materials. For example, a community might decide to establish prohibitions against certain animal experiments. As a consequence, contract research and basic science would be affected adversely. Cases of this nature have not been widespread, but they are appearing. In Washington Grove, Maryland, residents have expressed opposition to the testing of chemical nerve agents in the vicinity of a school. Morris Township, New Jersey, has been the site of a controversy involving Bell Communications Research (Bellcore), an AT&T spin-off company. At issue has been the use and storage of highly toxic gases, such as arsine, commonly used in semiconductor research (see discussion in app. C). Neither congressional policies nor case law has settled the debate over Federal supremacy in these cases. If the ADL litigation continues beyond the Massachusetts courts, Federal judicial interpretation may set some explicit parameters for local control of private sector research.

Chapter 8

Research Policy Issues That May Warrant Congressional Attention in the Future

In OTA's brief survey of laboratory directors and research administrators,* the respondents were asked to reflect on how things have changed from when they were just starting out in science and to list the major constraints then as opposed to now. The trend most noted by both groups is understandably the increase in administrative and "bureaucratic" requirements of grant procurement and administration. Fifty percent of the university administrators and 32 percent of the lab directors noted that administrative requirements for the investigator have increased substantially. Time spent on detailed administrative work means less time and money spent on research.

Other changes in the regulatory environment noted by the respondents were the greater chance of litigation and the appearance of more actors involved in the scope and definition of research. Respondents see this latter trend as a result of increased Federal funding for research, which necessarily involves more political actors, and increased media coverage, which attracts more public attention to the research process. Many also mentioned increased controls on dissemination of research results as being a significant difference between the climate of, say, 30 years ago and today.

*See discussion in ch. 6.

About one-third of the respondents stated that they were not aware of any research areas where the trend is toward fewer rather than more controls. Of those laboratory directors who did cite an area where controls have eased, 16 percent felt that changes in the National Science Foundation (NSF) procurement and granting procedures have helped to ease the administrative controls resulting from grants arrangements with that agency. Recombinant DNA research was also listed as an area where the trend has been toward more relaxed regulations. Other areas mentioned were human subjects research (where expedited review processes and exemptions have made the approval process less difficult) and a tendency toward decreased controls at the Federal level with a simultaneous increase in controls at the State, local, and institutional levels.

If these research policy issues, which have dominated the discussions for the last decade, appear to be either resolving or, at least, not creating major controversies among the research community, then what issues do appear to be emerging for congressional attention?

WHO BEARS THE BURDEN OF PROOF?

The burden of proof for control of research appears to be changing. Increasingly, the individual researcher or research facility must prove that the research is safe rather than the regulator prove that it is unsafe.

A shift in responsibility is clearly occurring in the case of restrictions on scientific and technical communication. Under schemes proposed in 1980 by the American Council on Education, for example, cryptology researchers were asked to carry the burden of deciding which papers to submit to the National Security Agency for review.1 A similar shift in the burden of responsibility occurred in 1980 changes in NSF grant policy, which made the grantee responsible for notifying NSF if, in the course of an NSF-supported project, "information or materials are developed which may affect the defense and security of the United States."2

If a fundamental constitutional right is involved, then in the past the courts have placed the burden of proof on the government to show a compelling need to infringe. But some legal scholars argue that the situation is now muddied because it is increasingly difficult to distinguish between pure speech and "impure" special action.

1. Massachusetts Institute of Technology, Interim Report of the Committee on the Changing Nature of Information (Cambridge, MA: Mar. 9, 1983), Section 4.5.

2. National Science Foundation, Grant Policy Manual, Section 794c.

THE SCIENTIST’S ROLE IN ASSURING SAFE RESEARCH

One of the principal unresolved issues is that of who should be involved in the regulatory process. What is an appropriate role for the individual scientist, for a professional science or engineering society, or for the public?

To what degree should the scientific community itself take central responsibility for both policing its own safety procedures and participating in the broad-scale development of regulation? There are differing views on the extent to which scientists should be involved. Do scientists have some special right to be exempted from consideration of these issues? Or is it as John Edsall wrote in 1975:3

The responsibilities are primary; scientists can claim no special rights, other than those possessed by every citizen, except those necessary to fulfill the responsibilities that arise from the possession of special knowledge and of the insight arising from that knowledge.

The conflict between these varying interests is made clear in the specific provisions of the Export Administration Act, for example, where scientists, whether employed by academic institutions or industry, are expected to comply with the requirements of the Act and other "applicable provisions of law" when communicating research findings "by means of publication, teaching, conferences, and other forms of scholarly exchange."4

3. John T. Edsall, Scientific Freedom and Responsibility, report of the American Association for the Advancement of Science, Committee on Scientific Freedom and Responsibility (Washington, DC: American Association for the Advancement of Science, 1975), p. 5.

4. Harold C. Relyea, "The Export Administration Act of 1985: Implications for Scientific Communication," memorandum to the Committee on Scientific Freedom and Responsibility, American Association for the Advancement of Science, June 8, 1985, p. 9.

Many agencies have reached out to the affected research community, asking scientists to review proposed regulations, both formally and informally, and thereby hoping to assure that the regulations are written in such a way that they are enforceable and can and will be complied with (i.e., are not far-fetched). Such an approach tests those scientists' belief that the regulations are necessary to protect society. For example, some social scientists argue that in the case of recombinant DNA the process was flawed "precisely because the political authorities put too much reliance in the judgement of the researchers themselves" and that this situation led to "the capture of a regulatory agency by those it is supposed to regulate."5 Others argue that the recombinant DNA case was "a model of responsible public policy decisionmaking for science and technology."6

How much should research be controlled by legal regulation, how much by institutional rules, and how much left to informal practice or to the codes or guidelines of professional societies? Strong arguments can be made that, when restraint is desirable, it should not involve the government. Regulatory enforcement, court cases, or congressional legislation may be inappropriate settings in which to make social decisions about the dangers and risks of research. Neither the current regulatory laws nor the agencies that enforce them are geared to address social or ethical issues. For many of the recent regulatory debates involving science, Congress has legislated solutions to fit one particular situation or crisis. While these procedures or rules may work well to adjudicate among differing scientific or legal aspects of problems, they are not always constructed in such a way as to resolve or negotiate compromise easily on moral or ethical points. Critics of a new line of research may be left to feel that they have no real forum from which to effect change.

5. Susan Hadden, as quoted in Sanford A. Lakoff, "Moral Responsibility and the Galilean Imperative," Ethics, vol. 91, October 1980, pp. 110-116.

6. Harold P. Green, "The Boundaries of Scientific Freedom," Regulation of Scientific Inquiry, Keith M. Wulff (ed.) (Boulder, CO: Westview Press, 1979); reprinted from Newsletter on Science, Technology, & Human Values, June 1977, p. 118.

How extensive and complete should regulatory legislation be? If the decision is to rely on self-regulation, what criteria will be used? The philosophy behind both the Institutional Review Board (IRB) system and the Institutional Biosafety Committees is a form of "monitored self-regulation,"7 in which the process of regulation is subject to review and monitoring by government authorities. The extensive use in the U.S. regulatory system of consensual voluntary codes and standards* is in this tradition, but this self-regulation for certain forms of research appears to have been questioned in many recent cases (e.g., animal experimentation). Should sanctions be imposed on professional communities or institutions that fail in their self-regulation? Or shall the disciplinary action continue to be directed at individuals?

7. Harvey Brooks, Benjamin Peirce Professor of Technology and Public Policy, Harvard University, personal communication, 1985.

One alternative to increased regulation might be better education of young scientists in the rationale for and the ethical aspects of regulation. Today, such education occurs primarily through apprenticeship, through informal learning. Congress might be asked to consider encouraging—e.g., through fellowships—education in the ethics or procedures of regulation.

*See discussion in ch. 4.

EX POST FACTO RESTRICTIONS ON RESEARCH COMMUNICATION

An individual researcher and the Federal Government often can have overlapping but not identical interests in suppressing or disseminating scientific and technical information. In this respect, controls on research resemble government controls in all parts of society. "Most decisions about regulation involve decisions among competing societal 'goods,' not decisions between 'goods' and 'bads'."8 To achieve greater benefits, society may be inclined to accept greater risks; but in some situations the risks are experienced differentially by particular groups or individuals. The Department of Commerce has, for example, interpreted such normal scientific activities as presenting papers and talking with colleagues as a potential "export" of technology, which could be construed as requiring a scientist to obtain an export license before participating in such activities. In the opinion of some observers, this interpretation constitutes a prior restraint on speech, a "governmental intrusion on the scholarly exchange of ideas."9

8. Ibid.

9. American Civil Liberties Union, Free Trade in Ideas: A Constitutional Imperative (Washington, DC: May 1984), p. 18.

These interests have been placed in especially sharp contrast when the Government has attempted to restrict communication about research undertaken independently and without Federal support, or when the Government retrospectively classifies research that was not conducted under classification or even with military funding. Similar issues are raised when there are attempts to control the dissemination of militarily or internationally "sensitive" but previously unclassified information or to control access to facilities.10 In the decision to classify or control scientific information, the risks to national security must be weighed against the long-term value of free flow of information among a nation's scientists and against principles of scientific and academic freedom.

One consequence of U.S. restrictions may be the inhibition of U.S. scientists' access to information abroad. Several European members of the North Atlantic Treaty Organization (NATO) are reported to be considering the establishment of a new technology transfer agency to coordinate their political response to the controls placed by the United States on the flow of advanced technology.11 The NATO Science Committee recently wrote that, around the world, "certain important research institutions are . . . already being overcautious" in communication of research results.12

10. Harold C. Relyea, National Security Controls and Scientific Information (updated Sept. 11, 1984), issue brief IB82083, Library of Congress, Congressional Research Service, Government Division, Washington, DC, 1984.

11. NATO Science Committee, "Open Communication in Science," NATO Science & Society, 1983.

12. Ibid.

The Committee expressed fear that the combination of an increasing amount of classification—for reasons relating both to national military and economic security—and increased international industrial competition could impede cooperation between the scientific communities of friendly nations. They also emphasized that restrictions will make it more difficult for small nations to obtain access to much-needed research results and that more classification may lead to more costly duplication.

"GRAY AREAS" OF SENSITIVE INFORMATION AND BROADER CLASSIFICATION

"Significant attempts have been made to restrict the flow of information in cases where it has been felt that, though unclassified, it was of such sensitive nature that our 'enemies' could use it to their advantage."13 For example, Executive Order 12356, a classification order issued by President Reagan, "appears to allow classification to be imposed at any stage of a research project and to be maintained for as long as government officials deem prudent."14 John Shattuck, of Harvard University, observes that that order "could inhibit academic researchers from making long-term intellectual investments in nonclassified projects with features that make them likely subjects for classification at a later date."15

As discussed in chapters 3 and 4, the use of Export Administration Regulations and International Traffic in Arms Regulations to identify and control "gray areas" of research previously unclassified and usually not considered covered by those regulations has raised a number of questions about the potential of long-term adverse effects on the U.S. scientific base. Increasing the areas of unclassified but severely restricted information not only inhibits communication among colleagues who could benefit from interaction but may also point out to opponents those scientific areas of potential fast progress.

13. Alan McGowan, Scientists' Institute for Public Information, New York, NY, personal communication, 1985.

14. John W. Shattuck, Federal Restrictions on the Free Flow of Academic Information and Ideas (Cambridge, MA: Harvard University, January 1985).

15. Ibid.

IMPACT OF NEW COMMUNICATION TECHNOLOGIES

The information revolution creates new opportunities or methods for delaying or classifying information as well as opportunities for more open dissemination. "As long as there were significant delays in publication of new scientific results, the review process for commercially sponsored research offered only modest impediments to scientific openness."16

16. Brooks, op. cit.

If recognized by the scientific community as a legitimate publication that signifies a claim to priority, publication on electronic networks could provide a new channel for scientific interaction. It could also have the effect of increasing the amount of classification of scientific information if, because of the speed of publication on such networks, the Government feels compelled to act quickly to classify without adequate information to justify such classification.


New technologies also give rise to questions about how much control an originator/creator has over the research project's results or data, its intellectual property. This issue is discussed in OTA's forthcoming report on intellectual property.

PATENTS

Many observers propose the need for revision of the patent system because they believe that the existing policy inhibits the progress of science and stifles invention and innovation. Federal Government policy has been to retain title and rights to inventions resulting from federally funded research and development (R&D) made either by government contractors or grantees or by in-house government employees.17 However, only about 5 percent of the 25,000 to 26,000 patents currently held by the government have been used.

A related issue is whether there is a need for a uniform government-wide patent policy. Several pharmaceutical firms have also begun to use the patent law to restrict research uses of patented products and procedures, even for experimental use (heretofore regarded as exempt).18 Such action raises new questions about the interpretation of current law when there is competition in a fast-moving field and where the language of the law may be at variance with contemporary research practices.

An Organisation for Economic Co-operation and Development (OECD) task force has recently recommended that OECD countries adopt the U.S. policy of making the date of conception, rather than the date of filing, the legal date of a patent. If this were followed, then pressure to keep data confidential pending patent filing would be much reduced because there would be a grace period of 12 months after publication for filing of a patent application. The main reason for delay of publication in most university-industry agreements is to allow time for filing for foreign patents. Revisions in the patent laws could therefore result in significant long-term effects on research.

17. William C. Boesman, "Government Patent Policy: The Ownership of Inventions Resulting From Federally-Funded R&D," issue brief IB78057, Library of Congress, Congressional Research Service, Science Policy Research Division, 1985.

18. Jeffrey L. Fox, "Patents Encroaching on Research Freedom," Science, vol. 224, June 3, 1984, p. 1080.

PUBLIC EDUCATION ON THE BASIS AND PROCEDURE FOR REGULATION

Understanding of and education in science play a vital role in the public's willingness to support the regulation of research. If the public understands the inadvertent and unintended effects of government regulations, then it may be more likely to support changes in policy which more accurately implement the intent of the law. Many observers have described to OTA a growing disassociation between what the public believes should be controlled and what the government actions are controlling. The government is presumed to be acting on behalf of the public, but, for example, is there evidence of public support for the increased controls being placed on scientific communication?

As the case studies presented in chapter 7 and appendix C show, how the public perceives or calculates the risks of research may greatly influence its willingness to control research and may similarly influence public beliefs about when controls or legal regulation should be imposed.


SHIFT IN THE JURISDICTION FOR REGULATION

The discussions of such actions as the right-to-know legislation, the Arthur D. Little and Bellcore cases (chapter 7 and appendix C), and the animal experimentation controversy show that the public—through either community protest or referendum—can act to control, direct, or influence the topic choice, experimental procedures, or communication of science. Although the evidence is limited, such cases hint at the beginning of a jurisdictional shift in the regulatory arena for science, especially from the Federal to the State and local levels. This change may be a reaction to either real or perceived laxity in Federal regulations for health and safety protection; it may relate to broader issues of the exertion of local control over land use and community activity; or, in some cases, it does relate to the larger agenda of national political groups—e.g., protests linked to nationwide efforts to stop all nuclear power, abortions, or the use of animals in research.

This jurisdictional shift raises the spectre of a number of negative effects on research caused by inconsistencies or variations in the strictness of Federal and local regulations. As Allen G. Marr, Dean of the Graduate Division of the University of California, argued in a letter to OTA:

Regulations promulgated uniformly on the basis of federal law are far superior to patchwork regulation by state law or local ordinance. Codes of ethical professional practice are an important complement but not a full substitute.

An issue raised by participants in the Arthur D. Little case (see ch. 7) was that protests over research involving hazardous chemicals might have the unintended result of segregating such research. States without the resources for developing comprehensive regulations (or for assuring compliance) might become dumping grounds for research no other States want.

Is there a need for a new jurisdictional framework by which Congress can deal with these issues, or are they best resolved at State and local levels?

GENERAL ISSUES

Underlying many of these issues are questions not resolvable through legislative activity but to which the Science Policy Task Force of the House Committee on Science and Technology, in its deliberations, should attend. Since societal constraints may be imposed, a fundamental question is that of what constitutes "research." For example, does there exist some constitutional protection for research, and if so what does the legal definition of "research" include? Does it include not only thinking about a problem or talking to other scientists but also experimentation? The definition of what is or is not basic research currently plays a role in the dissemination of Department of Defense (DOD)-sponsored research results. Defining a project as falling within Federal budgetary category 6.1 (fundamental research), for example, can determine whether or how it is classified by DOD.19 The definition of what is or is not research also plays a role in regulation of biomedicine. Experimental surgical procedures, for example, may be justified as therapeutic and not be subjected to review by an ethics committee. A medical researcher who refers to a project as a "pilot study" or as "innovation" can keep it outside such regulatory control mechanisms as IRBs.20 Better understanding of these and other definitional questions will be essential to future attempts to resolve many of the issues mentioned above.

19. Janice R. Long, "Scientific Freedom: Focus of National Security Controls Shifting," Chemical & Engineering News, July 1, 1985, pp. 7-11.

In setting an agenda for science, should policymakers look only to the potential benefits of the research proposed, or should equal consideration be given, before funding, to the risk posed by the research? If so, what parameters should be used to make those determinations? André Hellegers once said that he, for example, would "assign a very low priority" to any inquiry that "does not, in the inquiry, harm nature, but which may be dangerous in its consequence . . ."21 Ruth Macklin made a similar point in her essay "On the Ethics of Not Doing Scientific Research" when she wrote:

There is surely some disutility attached to an outcome that fails to benefit people who might otherwise have been helped by research. But unless we subscribe to a research imperative that places freedom of scientific inquiry above all other values when potential danger lurks, we need to examine closely the value dimensions of each instance of decisionmaking under uncertainty.22

20. Arthur Caplan, The Hastings Center, Hastings-on-Hudson, NY, personal communication, 1985.

To approach full understanding of this question, one must also consider what consequences—e.g., only the most probable or only the most negative—are to be included in such a determination and also what relative weights should be assigned to various potential outcomes. Is there not just one but a spectrum of possible ways in which society might use the results, and what relative weights can be assigned to the better or worse consequences?23

In setting funding priorities, Congress may increasingly have to confront determinations of what are the boundaries of control of science's overall agenda. Such questions have been raised in connection with the current shift toward military dominance of basic research funding and with the increased numbers of arrangements between universities and industry. Will such shifts result in increased, long-term restrictions on communication, and in controls on procedures as well as on agenda-setting? How might such changes affect—positively and negatively—the research process and the openness of scientific communication?

21. André Hellegers, in Regulation of Scientific Inquiry, Keith M. Wulff (ed.) (Boulder, CO: Westview Press, 1979).

22. Ruth Macklin, "On the Ethics of Not Doing Scientific Research," Hastings Center Report, vol. 7, December 1977, pp. 11-15.

23. Brooks, op. cit.

And, finally, as the case studies and many of the examples show, the flow of public information plays a significant role in the regulatory environment for science. There is, of course, an urgent need for truly sensitive information to be protected by the classification system, whether for reasons of military security or economic protectionism, and such arguments are equally valid for industrial or academic protection of intellectual property. Arbitrary and capricious use of secrecy and classification, however, may inadvertently damage the progress of science by inhibiting the free flow of information among researchers and the flow of information to the public. In the latter case, inadequate or incomplete information could, in fact, increase the probability of arbitrary regulation at the local level and, on matters relating to national policy debates, inhibit free political discourse.


Appendixes


Appendix A


Regulatory Forces on Specific Fields—Two Case Studies

These case studies illustrate the regulatory forces at work on two different research areas—agricultural research and research on acquired immunodeficiency syndrome (AIDS). The first area, agricultural research, is highly organized and highly controlled. Regional needs largely determine the substance and agenda of research, even though the actors who set the agenda may be part of centralized government or of a multinational corporation. Recently, two well-publicized legal actions have had a dramatic influence on the research process in agriculture, attempting to prohibit research from progressing on certain topics or in a certain manner. The second area, research on AIDS, offers an interesting contrast in the types of forces regulating a "hot" research field. Here, the methodological traditions of science, and some significant liability and privacy issues, may be colliding with the push by advocacy groups for acceleration of the research.

Regulatory Forces on Agricultural Research1

In April 1863, the U.S. Department of Agriculture (USDA) was given authority to use about 40 acres of land at the west end of the Capitol Mall in Washington, DC, as an experimental farm. Since then, food and agricultural research in the United States has expanded and now includes three major performers: the USDA, the State agricultural experiment stations (SAES), and private industry, all of which do both mission-oriented research and basic research.

For its research activities, USDA maintains three separate operating agencies—Agricultural Research, Cooperative Research, and the Extension Service—and an Office of Science and Education, which sets broad agricultural research policies. The Agricultural Research Agency is responsible for most of USDA's in-house agricultural research and is accountable and responsive to Congress and the executive branch for broad regional, national, and international concerns. The Cooperative Research Agency administers Federal funds that go to States for agricultural research. The Extension Service disseminates research results through, for example, publications, public meetings,

1This section summarizes material in U.S. Congress, Office of Technology Assessment, United States Food and Agricultural Research System (Washington, DC: U.S. Government Printing Office, December 1981).

and demonstrations. The four SAES geographical regions are each headed by a deputy administrator and typically include a central station, often located on the campus of a State's land-grant university, and a number of branch stations throughout the State. Major private performers in agricultural research include such companies as General Foods Corp., Ralston Purina Co., and Campbell Soup Co.

In general, public sector agricultural research focuses on biological technologies, while the private sector sponsors research on mechanical and chemical technologies. In 1978, total Federal expenditures for all research and development (R&D) were $26.2 billion. USDA's expenditures were $381 million, or only 1.5 percent of the total U.S. R&D budget. The total private sector agricultural R&D budget is about three-fourths of the total USDA contribution.

A multiplicity of actors—consumers, producers, investors, in-house scientists, scientific societies, the regulatory agencies, the executive branch, and Congress—thus determines research priorities in agriculture. Within the SAES system, for example, line-item administrators and scientists set specific project priorities according to their assumptions about what is the greatest need in their field and what would be of greatest value to the State. Traditionally, federally sponsored research has been managed through a classification system based on geography, type of research (i.e., basic or applied), problem area, and program structure. A less direct, but no less influential, determinant of research direction has been Federal environmental and safety regulations, such as regulations on chemical residues or additives in food.

Recently, two separate legal actions have introduced controversy into what has usually been a quiet region of the scientific community. In 1979, attorneys filed a lawsuit, on behalf of 17 farm workers and the California Agrarian Action Project, which charged the University of California with unlawfully spending public funds on mechanization research that displaced farm workers. That suit is still in litigation.

Mechanization research includes the development of machinery, crop varieties, chemical herbicides, growth regulators, and laborsaving methods of handling, transporting, and processing crops. Lawyers for the farm workers allege that such research displaces farm workers, eliminates small farms, harms consumers, impairs the quality of rural life, and impedes collective bargaining, thereby failing to satisfy the government's obligation to consider the needs of small and family farmers, as specified in various Land-Grant


Acts and the Hatch Act of 1887, which authorizes the agricultural experiment stations.

The plaintiffs are demanding that all mechanization research at the University of California be halted until a fund is created to assist and retrain farm workers.2 Their supporters feel that Federal funding for research on laborsaving devices is an improper use of Federal money; such research is best supported by the marketplace, because agribusiness is the group that stands to gain most from its benefits. Opponents see the suit as a battle between consumerism and good science at best, and as the imposition of Federal controls on research and a violation of academic freedom at worst. They cite many cases in which mechanization research has resulted in lower prices for the consumer and more humane working conditions for the farm worker. The case, as yet unresolved, raises issues about: 1) the social costs of innovation through agricultural research, and 2) the legal and social responsibilities of those who conduct research that might adversely affect certain populations. In the legal sense, the case raises questions about the propriety of Federal expenditures for research activities that might primarily benefit private interests.

The other legal action and public protest, launched by activist Jeremy Rifkin and the Foundation on Economic Trends, has attempted to halt the deliberate release of genetically engineered products into the environment, a technology of potential use to the agricultural industry as a means of increasing and improving crop production. A Federal appeals court has ruled that experiments involving the release of genetically altered organisms into the environment can proceed, provided that their potential ecological effects have been properly evaluated. Rifkin's group has also filed suit, along with the Humane Society of the United States and the Minor Breeds Conservancy, against the Department of Agriculture on the issue of transferring genes between species, such as injecting genes for human growth hormone into livestock to promote more rapid and exaggerated growth. As of September 1985, a hearing date had not been scheduled.

These actions demonstrate that even the most highly controlled of research fields may be disrupted by new regulatory forces.

Research on AIDS3

Much of what we know about the biology of acquired immunodeficiency syndrome (AIDS) is also a

2California Agrarian Action Project (CAAP) et al. v. The Regents of the University of California, Case 516427-5, Superior Court of the State of California, Oakland (filed Sept. 4, 1979).

3This section summarizes material in U.S. Congress, Office of Technology Assessment, Review of the Public Health Service's Response to AIDS (Washington, DC: U.S. Government Printing Office, February 1985).

result of federally sponsored research activities; for example, Public Health Service (PHS) grantees "discovered" AIDS as a syndrome, PHS has conducted surveillance of AIDS, and PHS investigators and others have made significant scientific advances, including the discovery of the probable etiologic agent for AIDS. Furthermore, PHS investigators are extensively involved in collaborations with non-Federal researchers, both nationally and internationally. Thus, the environment in which AIDS research is funded, conducted, and reported is influenced not only by the politics of the disease (i.e., the special populations that it affects), but also by the traditional funding mechanisms, the grants review process, and the way in which commercial interest turns basic research results into vaccines. In the case of AIDS, these forces appear to have both contributed to and impeded the research.

Although AIDS funding increased substantially in fiscal years 1984 and 1985, the history of specific funding for AIDS has been marked by tension among the individual PHS agencies, the Department of Health and Human Services (DHHS), and Congress. Through the Assistant Secretary for Health, individual PHS agencies have consistently asked DHHS to request specific sums from Congress; DHHS has submitted requests for amounts smaller than those suggested by the agencies; and Congress typically has appropriated amounts greater than those requested by the Department. Except when prodded by Congress, DHHS has maintained that PHS agencies should be able to conduct AIDS research without extra funds. However, threatened cuts in overall funding and personnel levels have restricted the ability of affected agencies to redirect resources.

An additional indirect influence on the research has been the uncertainty of project staff levels. At critical times in the planning stages, the number of personnel needed to conduct and support the research has actually been reduced in several of the PHS agencies. As a result, the PHS agencies have been unable to plan their activities adequately because they have not known how much funding and staff will be available to them. Now that an etiologic agent for AIDS has been discovered and the research could move into areas where several agencies have overlapping expertise, the jurisdictional uncertainty—because of the uncertainty in resource allocation—has intensified competition.

The question of whether AIDS funds should come out of existing PHS agency budgets, or whether such funds should augment agency budgets, also reflects concern about the perceived appropriateness of PHS funding of AIDS-related research and the perceived magnitude and importance of the AIDS epidemic. Some scientists have expressed concern that research on other diseases will suffer because funds are being


transferred from these areas to AIDS-related research; other observers believe that AIDS-related research may be delayed because of wrangling over such transfers.

In AIDS research, some sharing of information has taken place through the informal networks that exist among PHS agencies and among their researchers, but tensions also surround formal communication. Since the National Cancer Institute's work on HTLV-III was announced in April 1984, formal information-sharing on a management level has increased substantially, and centralized coordination of activities is also on the increase. Members of the PHS Epidemiology and Prevention Task Force have agreed to distribute articles prior to publication, and to discuss studies at the planning stage in order to avoid unnecessary redundancies and to ensure that all the necessary areas are being covered. Many of these sharing and coordinating activities would have taken place regardless of any directive from PHS central management, but in other instances, earlier PHS guidance might have led to better coordination—e.g., PHS might have directed the National Cancer Institute (NCI) to share AIDS virus culture with the Centers for Disease Control, an action which was not taken.

Another factor that may have impeded the generation and dissemination of information is the Federal grants application and approval process for extramural research. National Institutes of Health (NIH) research grants take about 16 months from conceptualization to award; contracts, about 14 months. The first round of extramural grants awarded by NIH for AIDS research took 14 months to be processed, in part because of negotiations with the Office of Management and Budget over the specific language used in the agreements. Since that time, some steps have been taken to speed up the process, such as mail balloting instead of face-to-face meetings by reviewers. Shortening the process, however, may only increase concerns about the quality of the research activities funded.

Federal regulations covering commercial development of drugs, biologics, and devices have also impeded open communication. In the United States, Federal policy tends to leave commercial development of technologies, including technologies derived from Federal biomedical research, to the private sector. Once under commercial sponsorship, R&D activities are considered proprietary and will not be made public unless voluntarily released. The Food and Drug Administration (FDA), therefore, cannot even divulge the protocols being used by the five companies under license from NCI to develop AIDS screening tests to

Federal researchers who are not directly involved in these activities. Yet the Federal researchers will generally share their research, including their research materials; their primary concerns are the qualifications of the private researchers and the quality control processes they have established. In the case of AIDS, the sharing of information developed by commercial firms was enhanced in small part because PHS selected the companies that would get the HTLV-III culture developed by the NCI laboratory. Other laboratories, however, have now cultured the virus and sold or given it to companies other than those selected by PHS, and the status of those companies' activities is formally unknown to any Federal researcher at FDA.

Finally, AIDS has been described as a "legal emergency" as well as a medical crisis. Much of the concern centers on the social discrimination experienced by members of the groups at risk of contracting the disease, especially homosexual men and intravenous drug users. Two sections of the Public Health Service Act, therefore, have been used to protect confidentiality in federally sponsored research. Section 242a authorizes the Secretary of DHHS to protect the privacy of individuals participating in research on mental health, including research on the use and effect of alcohol and other psychoactive drugs, by: 1) withholding from all persons not connected with the conduct of such research the names or other identifying characteristics of such individuals; and 2) prohibiting persons authorized to protect the privacy of such individuals from being compelled to identify them in any Federal, State, or local civil, criminal, administrative, legislative, or other proceedings. Section 242m(d) provides that information may not be used for any purpose other than the purpose for which it was supplied unless consent has been given.

As more is known about the relationship between HTLV-III and AIDS, or as other diagnostic tests are developed, confidentiality safeguards will have to be implemented without sacrificing the surveillance needs of public health officials or data sharing among researchers. Informed consent will be an especially difficult issue, for example, because the first test to be applied will detect exposure to HTLV-III only through the presence or absence of antibodies, and persons who test positive may actually be carriers of HTLV-III, may develop AIDS, or may remain well. Regulatory decisions, therefore, will have to balance the public health concerns surrounding the transmissibility of AIDS with the social stigma or employment discrimination that may accompany identification as a potential carrier or potential victim.


Appendix B

Public Attitudes Toward Science*

The attitudinal environment for science includes the attitudes of the public toward science and technology in general or institutional terms. These attitudes may be usefully grouped as hopes and expectations, reservations and concerns, and confidence in the leadership of the scientific community. This appendix will examine both the literature and the data relevant to each of these three sets of attitudes.1

Hopes and Expectations

In broad strokes, the literature and the public opinion data from the 4 decades since 1945 portray a public that has a high regard for the past achievements of science and technology and high hopes for even more

*This appendix is based on an OTA contractor report written by Jon D. Miller, Northern Illinois University.

1The following data sources were utilized in secondary analyses. The 1957 National Association of Science Writers study was based on personal in-home interviews with a national probability sample of 1,919 Americans. The interviews were conducted by the Survey Research Center at the University of Michigan. The field work for this survey was completed just prior (about 2 weeks) to the launch of Sputnik I by the Soviet Union and is the last measurement of American attitudes toward science and technology prior to the Space Age. For a full description of the study and results, see R.C. Davis, The Public Impact of Science in the Mass Media, Survey Research Center monograph No. 25 (Ann Arbor, MI: University of Michigan, 1958).

The 1979 survey of public attitudes toward science and technology was based on personal in-home interviews of a national probability sample of 1,635 adults. Sponsored by the National Science Foundation (C-SRS78-16839), the field work was conducted by the Institute for Survey Research at Temple University. For a description of the design and results of the study, see Jon D. Miller, et al., The Attitudes of the U.S. Public Toward Science and Technology (Washington, DC: The National Science Foundation, 1980).

The 1981 survey of public attitudes toward science and technology was based on telephone interviews with a national probability sample of 3,195 adults. The study was sponsored by a grant from the National Science Foundation (NSF 8105662) and was conducted by the Public Opinion Laboratory at Northern Illinois University. For a description of the sample and results, see Jon D. Miller, A National Survey of Public Attitudes Toward Science and Technology (DeKalb, IL: Northern Illinois University, 1982).

The 1983 survey of public attitudes toward science and scientists was based on telephone interviews with a national probability sample of 1,630 adults. Sponsored by the Annenberg School of Communications at the University of Pennsylvania, the survey was conducted by the Public Opinion Laboratory at Northern Illinois University. For a description of the design and results of the study, see Jon D. Miller, A National Survey of Adult Attitudes Toward Science and Technology in the United States (Philadelphia, PA: University of Pennsylvania, Annenberg School of Communications, 1983).

The 1985 data on attentiveness to science policy were taken from a telephone survey of a national probability sample of 1,514 adults. The study was sponsored by Family Circle magazine and conducted by the Public Opinion Laboratory at Northern Illinois University. A technical report on this study will be released by the Public Opinion Laboratory in the fall of 1985.

The 1981-82 study of the attitudes of science policy leaders was based on telephone interviews with a national sample of 282 individuals. Sponsored by a grant from the National Science Foundation (NSF 8105662), the survey was conducted by the Public Opinion Laboratory at Northern Illinois University. For a description of the study design and the methods used to identify and sample science policy leaders, see Jon D. Miller and Kenneth Prewitt, "National Survey of the Non-Governmental Leadership of American Science and Technology," a report to the National Science Foundation, 1982.

spectacular results in the future. Even though most Americans still see science as a magic black box,2 the evidence is clear that they also believe that their current standard of living is in large part the result of modern science and technology. The substantial gains in medical science, for example, symbolize the achievements of science for most Americans.

The first national study of public attitudes toward science in the post-war years was conducted in 1957 by the Survey Research Center of the University of Michigan. Sponsored by the National Association of Science Writers (NASW) and the Rockefeller Foundation, the study was designed to assess the public's interest in and knowledge about science and technology, the major sources of information on current science issues, and the appetite of the public for science news. The study included personal interviews with a national probability sample of 1,919 adults, and the field work was completed in early October 1957, just 2 weeks prior to the launching of Sputnik I. Inadvertently, the 1957 NASW study became the only existing set of measures of the attitudes of the American public toward science prior to the beginning of what is often termed the Space Age. The study offers, therefore, a unique opportunity to look back to the calm of the mid-1950s.

The 1957 NASW study found a public that believed that science and technology had won the war, created "miracle" drugs, and would continue to produce a cornucopia of benefits for American society.3 Almost 90 percent of the American adults polled said that the world was "better off because of science." When asked why they thought so, slightly more than half cited medical advances and about 40 percent pointed to the American standard of living. When asked to name some potential "bad effects" of science, 90 percent of the adults in the study could not think of a single possible negative effect. Ninety-four percent of the population were willing to agree that science was making their lives "healthier, easier, and more comfortable." Ninety percent agreed that "most scientists want to work on things that will make life better for the average person" and 88 percent felt that science was "the main reason for our rapid progress." It would be hard to imagine a more supportive public.

Despite the influence of the space race and the continued growth of post-war science and technology,

2Jon D. Miller, "Scientific Literacy: A Conceptual and Empirical Review," Daedalus, vol. 112, No. 2, 1983, pp. 29-48.

3Davis, op. cit.


there were few efforts made in the 1960s to measure public attitudes toward science. The next systematic effort to assess the attitudes of the American people toward science and technology was initiated by the National Science Board (NSB) in 1972. As a part of its new Science Indicators series, the NSB decided to include a chapter on public attitudes toward science. The NSB staff prepared a set of questions and used the Opinion Research Corporation to collect national data sets in 1972, 1974, and 1976.

These NSB surveys found a public that still held science and technology in high regard, although less so than in 1957. In contrast to the almost 90 percent of the public that thought in 1957 that the world was "better off" because of science, only 70 percent of the public held the same view in 1972.4 Similar results were obtained in the 1974 and 1976 studies.5 While the absolute level was down somewhat, a substantial majority of the total public held very positive views of the contributions of science to American society.

In a 1979 national study of public attitudes toward science and technology, also sponsored by NSB, several of the 1957 questions were repeated, offering an opportunity for comparison across 2 decades. In 1979, 81 percent of the public still agreed that scientific discoveries were making their lives "healthier, easier, and more comfortable" and 86 percent expressed the view that scientific discoveries were "largely responsible" for the standard of living in the United States.6 In a comparable national study in 1983, Miller found that 85 percent of American adults continued to agree that science made their lives healthier, easier, and more comfortable.7

Although the material gains attributable to science have undoubtedly influenced public attitudes toward science, there is evidence that Americans also have a commitment to the value of science per se. In a 1983 national survey, Louis Harris found that 82 percent of American adults agreed that scientific research "which advanced the frontiers of knowledge" was worth supporting "even if it brings no immediate benefits."8 Only 14 percent of Americans rejected this idea.

Evidence from the European Barometer indicates that western Europeans hold very similar views about the positive contributions of science and technology to their standard of living.9 (See table B-1.) Approximately three-quarters of western Europeans were willing to agree with a statement that science had been, and would continue to be, a major factor in improving their lives. There was a high degree of consensus among countries, ranging from 80 percent in Northern Ireland to 65 percent in the Netherlands.

4National Science Board, Science Indicators—1972 (Washington, DC: 1973).

5National Science Board, Science Indicators—1974 (Washington, DC: 1975); and National Science Board, Science Indicators—1976 (Washington, DC: 1977).

6Miller, et al., op. cit.

7Miller, A National Survey of Adult Attitudes Toward Science and Technology in the United States, op. cit.

8Louis Harris, "The Road After 1984: The Impact of Technology on Society," a report prepared for the Southern New England Telephone Company, 1984.

9Jacques-René Rabier, Euro-barometer 10a: Scientific Priorities in the European Community, October-November 1978 (Ann Arbor, MI: Inter-University Consortium for Political and Social Research, 1981).

Table B-1.—European Attitudes Toward Science, 1978

"Science will continue in the future as it has done in the past to be one of the most important factors in improving our lives."

Country                        Percent agree
Northern Ireland . . . . . . . . . .  80%
United Kingdom . . . . . . . . . . .  79
Italy  . . . . . . . . . . . . . . .  76
France . . . . . . . . . . . . . . .  75
Ireland  . . . . . . . . . . . . . .  75
West Germany . . . . . . . . . . . .  71
Luxembourg . . . . . . . . . . . . .  70
Denmark  . . . . . . . . . . . . . .  67
Belgium  . . . . . . . . . . . . . .  66
Netherlands  . . . . . . . . . . . .  65

Another facet of the attitudinal environment for science is reflected by the public's expectations for future achievements. The belief that science has contributed to the health and comfort of the society is, of course, inherently retrospective, and so it is important to inquire whether the public expects similar achievements from science in the future, or whether the public thinks that the frontiers of science have been explored thoroughly. Beginning with the 1979 study for NSB, one series of questions has asked respondents to assess how likely it is that science will achieve certain results in the next 25 years. The results indicate that a large segment of the public holds high expectations for future outcomes from science and technology. By 1983, a majority of the public thought that it was "very likely" that within the next 25 years science would find a cure for the common forms of cancer, have people working in a space station, and find efficient sources of cheap energy (see table B-2).

In contrast, a substantial portion of the public indicated that they did not expect science to be able to cure mental retardation, communicate with alien beings, or put whole communities of people into outer space. These results indicate that the public does have some ability to differentiate between likely and less likely outcomes and that the optimism found in several previous responses is not a simple yea-saying reaction.

From these aggregate results, it would appear that a significant portion of the American people hold some positive general attitudes toward science. It is also important to inquire whether the attentive public* for

*In brief, the basic dimensions of an "attentive public" for science policy are a high level of interest in the topic, combined with the perception of being adequately informed. For a discussion of this concept in more depth, see Jon D. Miller, "Public Attitudes Toward the Regulation of Research," contractor report prepared for the U.S. Congress, Office of Technology Assessment.


Table B-2.—Expectations for Future Scientific Achievements, 1979-83

"Now let me ask you to think about the long-term future. I am going to read you a list of possible scientific results and ask you how likely you think it is that each of these will be achieved in the next 25 years or so."

Percent saying that the following results                 Very     Possible, but    Not likely   Number of
will be achieved in the next 25 years:           Year     likely   not too likely   at all       people surveyed
A cure for the common forms of cancer  . . . .   1979      46           44             8           1,635
                                                 1983      57           36             6           1,630
A cure for mental retardation  . . . . . . . .   1983      11           40            47           1,630
A way to put communities of people in
  outer space  . . . . . . . . . . . . . . . .   1979      17           38            42           1,635
People working in a space station  . . . . . .   1983      52           34            12           1,630
Humans communicating with alien beings . . . .   1983      14           33            51           1,630
More efficient sources of cheap energy . . . .   1979      57           34             7           1,635
A safe method of disposing of
  nuclear wastes . . . . . . . . . . . . . . .   1983      29           41            26           1,630

science policy shares these same positive views of the past and future results of science. Fortunately, the national data sets collected in 1957, 1979, 1981, and 1983 are available for retabulation for this purpose.

The one question relevant to this section of the analysis that has been asked repeatedly throughout the last 3 decades has been the agree-disagree statement concerning the contribution of science to making our lives "healthier, easier, and more comfortable." A retabulation of three previous studies indicates that there has been some decline in the proportion of both the attentive public and other citizens willing to agree with the statement, but 9 of 10 members of the attentive public for science policy and 8 of 10 other Americans still hold that belief (see table B-3). At all three measurement points, at least 10 percent more of the attentive public were willing to agree with this view than were other citizens.

Some of the more recent data sets allow an examination of the difference in expectations between those who follow science policy matters and those who do not. In general, people who were attentive to science policy issues were more optimistic about the future achievements of science and technology than were those who were nonattentive (see table B-4). The general pattern of expectations by the attentive public and by the others did not differ significantly.

In summary, both the existing literature and selected retabulation of available data indicate that most Americans have a positive image of science and/or scientific research. Those who have a high level of interest in science and who feel reasonably well informed about it tend to hold even more positive views about the past and future benefits of science.

Reservations and Concerns

Throughout the post-war years, there has been some level of wariness about some of the possible negative effects of science among a substantial minority of the American people. On balance, these reservations have not offset the high levels of positive affect and expectation described above, but it is necessary to review and understand the magnitude and substance of these attitudes.

The 1957 NASW study found some reservations about the effects of science, but these were muted and most often accepted as the price of gaining good things from science. Slightly over 40 percent of the public were willing to agree that science “makes our way of life change too fast” and 23 percent agreed with the statement that “one of the bad effects of science is that it breaks down people’s ideas of right and wrong.” Although 70 percent of the adults in the 1957 study agreed that “the things that happen in this world are mostly controlled by God” and about half felt that “one of our big troubles is that we depend too much on science and not enough on faith,” when asked to assess the net effect, 9 out of 10 concluded that the world was better off because of science. There was

Table B-3.—Attitude Toward Contribution of Science, by Attentiveness, 1957-83

                                                                 Number surveyed
                                  Attentive   Not           Attentive   Not
                           Year   public      attentive                 attentive
Percent agreeing that      1957   98%         93%           183         1,736
science makes our lives    1979   89          79            232         1,313
healthier, easier, and     1983   92          82            398         1,232
more comfortable


Table B-4.—Expectation for Future Achievements, by Attentiveness, 1979-83

Percent saying it is “very likely” that the following
results will be achieved in the next 25 years:          Year   Attentive public   Not attentive
• A cure for the common forms of cancer . . . . . . .   1979   55%                44%
                                                        1983   68                 54
• A cure for mental retardation . . . . . . . . . . .   1983   13                 10
• A way to put communities of people in outer space .   1979   13                 10
• People working in a space station . . . . . . . . .   1983   62                 49
• Humans communicating with alien beings . . . . . . .  1983   17                 13
• More efficient sources of cheap energy . . . . . . .  1979   76                 53
• A safe method of disposing of nuclear wastes . . . .  1983   35                 27

N (1979) = 307 attentive, 1,328 not attentive; N (1983) = 398 attentive, 1,232 not attentive

“Now let me ask you to think about the long-term future. I am going to read you a list of possible scientific results and ask you how likely you think it is that each of these will be achieved in the next 25 years or so.”

some wariness, but not enough to offset the desire for increased health and comfort.

Karen Oppenheim repeated some of the 1957 NASW items in a national survey conducted by the National Opinion Research Center in 1964 and found that the level of public wariness or concern was increasing.10 The NSB-sponsored studies in the early 1970s found the same trend.

Four of the items originally used in the 1957 NASW study were replicated in a 1983 survey sponsored by the Annenberg School of Communication at the University of Pennsylvania.11 A comparison of the results indicates that the reservations expressed by those citizens included in the 1957 study have remained largely unchanged over the last quarter century.

The concern over the impact of science on the pace of change in society has also continued at virtually the same level. In both years, about 4 in 10 Americans expressed some concern that science was causing our lives to “change too fast” (see table B-5).

10Karen Oppenheim, Acceptance and Distrust: Attitudes of American Adults Toward Science, master’s thesis, University of Chicago.
11Miller, A National Survey of Adult Attitudes Toward Science and Technology in the United States, op. cit.

Table B-5.—Public Concerns About Science, 1957-83

                                               1957    1983
• We depend too much on science and
  not enough on faith . . . . . . . . . . .    50%     50%
• One trouble with science is that it
  makes our way of life change too fast . .    43      44
• The growth of science means that a few
  people could control our lives . . . . .     32      35
• One of the bad effects of science is
  that it breaks down people’s ideas of
  right and wrong . . . . . . . . . . . . .    23      29
Number . . . . . . . . . . . . . . . . . .     1,919   1,630

Finally, these data indicated a persistent public concern about the potential for a few people to control the lives of the total society, using the power of science. In both years, about one-third of the adult population was willing to agree with the statement that the “growth of science” meant that a few people could “control our lives.”

Recent studies indicate a renewed concern about the tie between science and weapons. A 1983 national survey by Harris found that 74 percent of adults in the United States were willing to agree with the statement that “with the development of nuclear, chemical, and biological weapons, science and technology may end up destroying the human race.”12 Another 1983 study found that one-quarter of the adult population thought that it was very likely that “wars in space” would occur in the next 25 years, and an additional 36 percent of the American people thought that space wars were “possible.”13

Do people who pay more attention to science (the “attentive public”) have the same kinds of reservations as those reviewed above? A retabulation of the 1957 and 1983 data indicated that the attentive public holds many of the same reservations found in the previous data, but that the proportion of persons holding those reservations is slightly lower among the attentive public than other people (see table B-6). Although 4 in 10 members of the attentive public were concerned that society depends too much on science and not enough on faith, only 2 in 10 felt that science tended to break down people’s ideas of right and wrong. The proportion of the attentive public concerned about changes in the pace of life and in the loss of the control of their lives to science did not differ significantly from the proportion for the nonattentive public.

12Harris, op. cit.
13Miller, A National Survey of Adult Attitudes Toward Science and Technology in the United States, op. cit.


Table B-6.—Concerns About Science, by Attentiveness, 1957-83

                                          Attentive         Not attentive
Percent agreeing that . . .             1957    1983       1957    1983
• We depend too much on science
  and not enough on faith . . . . .     44%     43%        50%     53%
• One trouble with science is that
  it makes our way of life change
  too fast . . . . . . . . . . . . .    34      36         44      48
• The growth of science means that
  a few people could control our
  lives . . . . . . . . . . . . . .     31      31         32      38
• One of the bad effects of science
  is that it breaks down people’s
  ideas of right and wrong . . . . .    16      24         24      32
Number . . . . . . . . . . . . . . .    183     321        1,736   1,309

In summary, from the literature and from selective retabulation of previous data, it appears that a substantial portion of the American people hold some reservations about the impact of science on society. In the context of the very positive views found in the preceding section, it would appear that a significant portion of the American people recognize and understand that science involves both the potential for substantial benefits and the possibility of damage or misuse.

Confidence in Science

Given the combination of positive hopes and expectations and the simultaneous level of concern, how does the public reconcile these attitudes? Is there an overall view of science? In general terms, the attitude research data from the last three decades suggest that people have concluded that the benefits outweigh the potential harms from organized science in the United States.

As noted earlier, 88 percent of the adults studied in 1957 reported that they felt that the world was better off because of science.14 The preceding analyses have demonstrated that the level of concern was about as high in 1957 as in the early 1980s. The conclusion that the world was still better off for the contributions of science could be interpreted, therefore, as an assessment that the benefits outweighed the past and prospective risks.

14Ibid.

Beginning in the 1970s, the surveys sponsored by the NSB asked each respondent to make an assessment of the relative benefits and harms and to weigh the two. Similar questions have been asked since that time by Cambridge Reports.15

The data from the last 15 years indicate that a solid majority of Americans believe that science does more good than harm (see table B-7). Only about 1 in 20 Americans believe that science does more harm than good, but about one-third of U.S. adults are unsure as to where the balance falls. Some of this uncertainty may reflect a lack of interest or information. About 5 percent of current respondents are simply unable to answer the question.

Although the exact items discussed above have not been used in a survey that would allow the separation of attentives and nonattentives for analysis purposes, the 1983 Annenberg study did include two items that reflect the same attitude. Each respondent was asked to agree or disagree with the statement that “the benefits of science outweigh whatever harm it does.” Two-thirds of the attentive public agreed with the statement in comparison to 55 percent of nonattentives.16 The same sentiment was measured with a paired statement worded in the other direction. When asked to agree or disagree with the statement that “science is likely to cause more problems than to find solutions,” only 16 percent of the attentive public and 27 percent of the nonattentive public agreed. These results suggest that those who are interested in science issues and who follow science policy matters believe that the benefits of science outweigh its potential harm. This same view is reflected in the larger public, but it is not likely to be as solidly rooted in the larger population as it would be in the attentive public.

A second approach to reconciling the potential for good and harm from science is reflected in measures of confidence in the leadership of organized science. Most Americans have considerable pressure on their time and do not normally set aside a significant portion of time to consider the flow of issues in areas like science policy. If people (especially those attentive to science) have confidence in the leadership of major scientific organizations and corporations, then the leaders can be relied on to monitor the process and the public can wait until a real controversy emerges before becoming concerned about the issue.

The evidence from the General Social Survey17 indicates that the leadership of the scientific community has been and continues to be held in high esteem (see table B-8). The only major institution in American society that has consistently claimed a high level of confidence from more Americans has been medicine, which may be viewed as at least closely related to the scientific community.

15National Science Board, Science Indicators—1984 (Washington, DC, 1985).
16Miller, A National Survey of Adult Attitudes Toward Science and Technology in the United States, op. cit.
17J.A. Davis and T. Smith, General Social Surveys 1972-1984: Cumulative Code Book (Chicago, IL: National Opinion Research Center, 1984).

In summary, the literature and the reanalysis of previous data can be interpreted to indicate that most Americans see the benefits of science as greater than any potential harms or risks. This view is apparently held even more firmly by the attentive public for science policy. The evidence may also be interpreted to indicate that the leadership of the scientific community is held in high regard, implying a degree of trust in their monitoring of the work of organized science. But it should be noted that no major survey to date has specifically addressed the philosophical and political issue of direct regulation of scientific research. Episodes such as those discussed in chapter 7—and the existence of considerable congressional legislative activity resulting in regulation—can be interpreted just as strongly as indicators of, if not a lack of trust, at least a wariness on the part of some communities and constituencies.

Table B-7.—Public Assessment of the Risks and Benefits of Science, 1972-85

        Science and technology . . .
        Do more good   Do more harm   Do about the same   Don’t know/
Year    than harm      than good      amount of each      not sure      Number
1972    54             4              31                  11            2,209
1974    57             2              31                  10            2,074
1976    52             4              37                  7             2,108
1983    73             3              21                  3             1,466
1984    63             5              27                  5             1,864
1985    58             5              32                  5             1,866

“Overall, would you say that science and technology do more good than harm, more harm than good, or about the same amount of each?”

SOURCES: Opinion Research Corp. (1972, 1974, 1976); Cambridge Reports (1983, 1984, 1985).

Table B-8.—Public Confidence in Science and Selected Other Institutions, 1973-84

Have a “great deal of
confidence” in:         1973    1974    1975    1976    1978    1980    1982    1984
Medicine                54      60      50      56      46      52      46      52
Scientific community    37      45      38      43      36      41      38      47
Education               37      49      31      37      28      30      33      29
Organized religion      35      44      24      30      31      35      32      32
Military                32      40      35      39      29      28      31      37
Major companies         29      31      19      22      22      27      23      32
Press                   23      26      24      28      20      22      18      17
Television              19      23      18      19      14      16      14      13
Organized labor         15      18      10      12      11      15      12      9
Executive branch        29      14      13      13      12      12      19      19
Congress                23      17      13      14      13      9       13      13
Supreme Court           31      33      31      35      28      25      30      35
N =                     1,504   1,484   1,490   1,499   1,532   1,469   1,506   943

“I am going to name some institutions in this country. As far as the people running these institutions are concerned, would you say you have a great deal of confidence, only some confidence, or hardly any confidence at all in them?”

SOURCE: James A. Davis and Tom W. Smith, General Social Surveys Cumulative File, 1972-1984 (Ann Arbor: Inter-University Consortium for Political and Social Research, 1984), p. 152.


Appendix C

Environmental Concerns and Laboratory Siting:

The Morris Township-Bellcore Case*

From February 1984 to May 1985, Morris Township, New Jersey, was embroiled in a controversy over the siting of a new research facility for Bell Communications Research, Inc. (Bellcore). The site plan of the proposed telecommunications research complex was debated extensively before the township planning board. At issue was the storage, use, and disposal of highly toxic and flammable gases. A group of residents formed the Concerned Citizens of Morris Township (CCMT)—an organization that spearheaded opposition to the research facility on the grounds that the work being planned there was potentially hazardous to public health and environmental quality. The Morris Township-Bellcore case draws attention to the efforts by citizens to set community standards of acceptable risk for privately financed research that requires the use of toxic materials. This brief case study summarizes the key events of the controversy, examines the legal basis of local regulation for the proposed research, reviews the justifications for and against siting the research facility, and finally, casts some preliminary comparisons to the two Cambridge, Massachusetts, cases discussed in chapter 7, in which recombinant DNA research and chemical warfare agents were regulated.

Historical Background

This case began like many land use decisions in communities throughout the United States. In the late 1970s, residents of a suburban neighborhood in Morris Township, New Jersey, raised concern over the development of a parcel of land adjacent to single-family subdivisions and a recreational area. The issues expressed during this period were predominantly those of traffic, noise, density, and aesthetics. In February 1980, after 15 public hearings over a 12-month period, the Morris Township planning board approved a plan submitted by the Southgate Corporation, developer of the site. The 58-acre parcel, called the Southgate Office Park Complex (Southgate Complex), was designated exclusively for office use.

*This case study was prepared for OTA by Sheldon Krimsky, Tufts University.

Three years later, during the summer of 1983, with three office buildings under partial completion, the Southgate Corporation leased the site to Bell Communications Research, Inc. (Bellcore), a research organization owned by seven regional telephone companies. Bellcore is a by-product of AT&T’s court-ordered divestiture of the Bell System. The Bell System Plan of Reorganization stipulated that the regional telephone companies create a central services organization to provide them with research and technical services.

On behalf of its tenant, the Southgate Corporation submitted an amended site plan in December 1983, which included the construction of an additional building devoted to research, and the use, as a laboratory, of two floors of a building previously approved as office space. Bellcore had planned to locate its Morris Research and Engineering Center at the Southgate location. A number of AT&T employees at Bell Labs facilities in Murray Hill, New Jersey, and Whippany, New Jersey, were expected to be transferred to the new center.

The proposed facility was devoted to advanced research in semi-conductors and fiber optics. This type of research commonly employs toxic gases such as arsine, phosphine, and diborane, as well as liquefied hydrogen.

Residential abutters to the site had attended a planning board meeting in February 1984 to discuss traffic patterns when they learned that toxic and flammable gases would be used under the amended site plan. Within a month, a core group of residents organized themselves into CCMT.

The citizens framed their opposition to the research facility on two principal grounds: 1) the health effects of an accidental release of toxic gases into their neighborhoods, and 2) a potential release of untreated or partially treated toxic effluent from the research facility into Loantaka Brook—a major tributary of the Great Swamp National Wildlife Refuge.

CCMT’s main effort to prevent construction of the research facility was directed at the Morris Township Planning Board, a body consisting of nine appointed members legally responsible for land use decisions. Over two dozen public hearings were held by the planning board on the Bellcore case between December 1983 and May 1985. CCMT brought in paid consultants, some from outside the State, to testify in its behalf on the potential hazards to the community of the proposed facility. Eventually, CCMT drew support from a broad range of constituencies covering Morris Township and neighboring communities. Included among these were: the Harding Township Environmental Commission; over 50 Harding residents; the Great Swamp Watershed Association; 14 civic associations with a putative representation of 2,000 households in Morris Township and neighboring communities; and an official of the U.S. Fish and Wildlife Service.

A letter signed by 14 representatives of civic associations expresses the intensity of public opposition:

We question the need for Bellcore to impose the laboratory on a community that does not want it . . . We emphatically state that the Bellcore laboratory is not welcome and that we will pursue every means available to expose and publicize the fact that, in this instance, Bellcore has failed to fulfill its role as a responsible corporate citizen . . .1

What started out as a controversy involving abutters to an industrial site soon evolved into a regional conflict over a proposed research and engineering center. As community pressure grew, so did Bellcore’s impatience with the uncertainty of locating its new research home. The company made serious attempts to communicate its position that “the small quantities of chemicals that [the company] plans to use and the ‘state-of-the-art’ safety systems and procedures that it plans to employ will make the Southgate facility safe beyond any reasonable question whatsoever.”2

In May 1984, Bellcore submitted an environmental information document to the planning board, describing its prospective laboratory operations, providing a representative chemical inventory for the new complex, and outlining safety procedures for the storage and handling of toxic materials. The company also hired risk assessment consultants to present its case before the planning board. Bellcore scientists provided an additional source of technical assistance to the company during the protracted debate.

In the 18-month period during which the planning board held public hearings on Bellcore’s proposed research complex, proponents and opponents of the amended site plan were assigned scheduled sessions at which to present their respective arguments. On May 3, 1985, the planning board prepared for a final vote on the site plan. However, at the outset of the session prior to the vote, Bellcore made an unexpected announcement that it was withdrawing several controversial elements of its site plan, including the new laboratory building and the use of certain toxic and flammable gases. The planning board hastily accepted the modified proposal by a vote of 9-0. Realizing that even a vote in its favor would not end the controversy or the delay in construction, the company appears to have capitulated to the concerns of the citizen protestors. In response, the citizens’ group chose not to appeal the final decision of the planning board—despite some uneasiness among CCMT members that they had not seen a completed version of the adopted site plan. This decision brought to a close an 18-month controversy over potentially hazardous research in telecommunications.

1Thomas Fuschetto, Jr., “Bellcore Protest Continues to Build,” Observer-Tribune, Mar. 28, 1985.
2Michael Grove, Vice President and General Counsel, Bellcore, letter to James Stenger, Concerned Citizens of Morris Township, Mar. 11, 1985.

The Legal Dimension

Local planning boards derive their authority to exercise land use controls from State statutes. In New Jersey, Chapter 57 of the State Land Development Ordinance sets forth principles of municipal land use controls that include the promotion of public health and safety and protection against man-made and natural disasters. In the written opinion of the Morris Township counsel, “both municipal land use law as well as the Morris Township ordinances provide sufficient legal basis to deny the [Bellcore] application if the Board feels it would present an unacceptable risk.” The key to the planning board’s authority to proscribe research is in the interpretation of “unacceptable risk”—a vague and elusive term that was the centerpoint of much of the public debate.

The Southgate Complex is on land zoned for office and laboratory use, a point emphasized by Bellcore in its repeated contention that the amended site plan was in conformity with zoning requirements for the parcel. CCMT claimed that it was within the purview of the planning board to restrict research activities that pose a threat to human health, public safety, or environmental quality, even though the parcel is zoned for laboratory use. It argued that the zoning classification “research” is only a guide. Each activity must be carefully examined under this broad category (which includes everything from pencil and paper operations to the storage and use of hazardous chemicals) to determine whether it conforms to community standards of acceptable risk.

Acting in a quasi-legal manner but without strict rules of evidence, the planning board heard testimony from both sides, cross-examined witnesses, and permitted adversaries to question one another. A decision by the planning board is subject to an appeal in the State courts if the petitioner files the appeal in accordance with accepted guidelines.


Citizen opposition to the facility was not directed at a particular research program per se, but rather at the chemical substances that were a critical part of the research activities.

Arguments For and Against a Research Ban

In its presentation before the planning board, Bellcore maintained that the site plan was in conformity with the zoning requirements of the parcel. Moreover, the proposed laboratory facility was designed to meet or exceed all Federal, State, and local laws on handling toxic materials. Company officials argued that “their plans are a logical extension of work done safely since 1941 at Bell Labs in Murray Hill, where Bellcore scientists are working now until their company opens a home for them.”3

Bellcore cited results of its commissioned risk assessment studies that examined the case of a worst-credible arsine leak. The conditions defining the worst-credible case are a failure in the mechanical scrubber (a device that filters out unwanted gases) resulting in a slow leak of arsine, or an accidental release of arsine as a result of a tube fracture. According to those studies, the maximum exposure of any citizen in the community would be about one-fortieth of the safe arsine levels permissible for workers.

The risk assessment consultant to CCMT developed a worst-case scenario that differed considerably from the cases cited by Bellcore. The storage of 1,500 gallons of liquid hydrogen on the roof of the laboratory building was the basis of one potential worst-case accident. A CCMT consultant cited as a plausible event a large hydrogen leak that could cause an explosion that would rupture the arsine tank and send toxic gases out toward the neighborhood.

CCMT was uncompromising on the matter of storing toxic gases on the roof of the proposed facility. The citizens group was not persuaded by company statistics on the low probability of hydrogen explosions, or the gas detection and monitoring systems planned for the new facility. Opponents of the facility fixed their attention on the worst-case explosive release of toxic gases. That became the standard against which they would judge acceptable risk.

An article in Technology Review, which was distributed widely among members of CCMT, fueled the citizens’ resolve against accepting a compromise on the storage of toxic gases. Passages of the article read:4

3Timothy Mullaney, “Neighbors, Firm Struggle Over Chemical Risk,” Daily Record, Apr. 22, 1985.

4Joseph La Dou, “The Not-So-Clean Business of Making Chips,” Technology Review, May/June 1984, pp. 23-25, 32, 36.

Acute inhalation [of arsine gas] can cause rapid destruction of red blood cells, followed by severe kidney damage, and if the patient is not immediately treated—death. Given sufficient low-level exposure over time, arsine also may be carcinogenic. The accidental release of the contents of a 20-pound cylinder of 100 percent phosphine would have to be spread over 1,792 acres—or 276 city blocks—before being diluted to the permissible exposure level of 0.3 ppm.

A second argument, which evolved somewhat later in the controversy, centered around the environmental protection of the Great Swamp National Wildlife Refuge. The proposed laboratory facility borders on Loantaka Brook, which flows into the Great Swamp. In response to the prospect of having pretreated emissions from the research facility flush into Loantaka Brook, a spokesperson for the Great Swamp Watershed Association said:

[A]ny accidental discharge of hazardous materials from the Bellcore facility could impair the Woodland Treatment plant operation and seriously degrade water quality in the brook and further downstream in the Great Swamp.5

Environmentalists also expressed concerns about seepage of toxic materials into the groundwater from accidental spillage or a gas cylinder rupture. It was stressed that two streams running through Southgate feed a major drinking water source for 600,000 people. By dramatizing the potential environmental impacts, CCMT was able to build a broad coalition of supporters, consisting of civic associations and environmental protection groups, to oppose the Southgate site of the research facility.

A key difference between Bellcore and CCMT on the conceptualization of risk is exemplified by the terms “worst-credible case” and “worst-possible case” as applied to an accidental release of hazardous substances. In emphasizing the former phrase, Bellcore urged the community, in evaluating the risks, to consider plausible accidents and not extremely remote or unrealistic events. However, CCMT fixed on the worst event that was conceivable, without considerations of probability. Neither side introduced a quantitative assessment of the likelihood that any accident could take place. Each party argued its case within a preferred model of risk assessment, the choice of which is more a question of culture than of science. This difference made a negotiated settlement between adversary groups extremely difficult.

The Morris Township case is thus not one of a community regulating a form of objectionable research. Barely any interest was expressed by citizens about the nature of the semi-conductor and fiber optic research planned for the site. The entire focus of the debate was on the types of chemicals on site and the possibilities of their release into the environment. Had the planning board ruled against Bellcore, the decision would also not have established a legal precedent for similar cases that might arise elsewhere in the township. Planning board decisions are rendered for specific circumstances and do not accumulate as in case law. However, had such a decision been made, it would likely have created for the township an informal regulatory precedent against similar proposals involving research with highly toxic gases. Although the company withdrew the proposal before a planning board vote was taken, a mood has been created in Morris Township that, while not codified into law, may be no less effective in proscribing such activities should they arise in a future site plan.

5Sally Dudley, letter to the editor, Observer-Tribune, Mar. 21, 1985.

Comparisons With Two Cambridge, Massachusetts, Cases

Table C-1 illustrates some comparisons and contrasts between the Morris Township case and the two Cambridge controversies. Notably, both the Arthur D. Little (ADL) and Bellcore cases involve private sector research for which highly toxic chemicals are required. Citizens of both communities cast their opposition to the respective research activities on public health grounds, emphasizing worst-case scenarios. Opponents of the research in both cases were not persuaded by comparative risk analysis and arguments that they consider the probability of a worst-case accident. Environmental factors became important in the Morris Township debate, but were of little significance in the issue of chemical warfare research. However, the morality of research was discussed, to some degree, in both the debate over rDNA molecules and chemical warfare agents. Moral discourse about the nature of the research program did not arise among Morris Township citizens.

Bellcore’s research program is a mixture of basic and applied science/engineering. The character of its research is a blend of what we would find at a university and what might be carried out at ADL. In Morris Township, the restraints on research came prior to the construction of a laboratory; that compares favorably with the rDNA case. In contrast, ADL had initiated its research before local restraints were imposed.

The social instruments for regulating research are markedly different between the Cambridge and Morris Township cases. The Cambridge city council and the health commissioner played key roles in the control of rDNA molecules and chemical nerve agents. In contrast, the township planning board acted as the exclusive social instrument for regulating Bellcore’s research program. The public health officials of the township did not have a visible role in the controversy.

Whereas codification of research restrictions emerged in both the Cambridge cases, no formal restrictions on research were imposed in the Morris Township situation. In the latter case, withdrawal by Bellcore of its proposed solid state laboratory revealed the importance of a local cultural barrier to specific types of research. The barrier, although informal and uncodified, may have the persistence and efficacy of a law.


Table C-1.—Comparison of Three Cases Involving Local Control of Research

Type of research:
  rDNA—Cambridge: Basic science
  Arthur D. Little (ADL)—Cambridge: Applied chemical and engineering
  Bellcore—Morris Township: Basic and applied science and engineering

Nature of institutions:
  rDNA—Cambridge: Academic/nonprofit
  ADL—Cambridge: Consultant/for profit
  Bellcore—Morris Township: Private sector/for profit

Stage of the research at outset of local intervention:
  rDNA—Cambridge: Not yet begun
  ADL—Cambridge: 7 months ongoing
  Bellcore—Morris Township: Not yet begun

Source of research funding:
  rDNA—Cambridge: NIH and NSF primarily
  ADL—Cambridge: DOD
  Bellcore—Morris Township: Private sector: regional telephone companies

Origins of controversy:
  rDNA—Cambridge: National and within scientific community
  ADL—Cambridge: Local and centered on a city and two towns
  Bellcore—Morris Township: Local and centered on a township

Stimulus of local involvement:
  rDNA—Cambridge: Newspaper story on Harvard’s plan to build P-3 genetics lab
  ADL—Cambridge: Newspaper story on ADL’s new lab for testing chemical warfare agents
  Bellcore—Morris Township: Planning board hearing on site plan for commercial development

Primary regulatory agent:
  rDNA—Cambridge: City council
  ADL—Cambridge: Public health commissioner
  Bellcore—Morris Township: Township planning board

Time period of controversy:
  rDNA—Cambridge: Stage 1: 7 months; Stage 2: 5 months
  ADL—Cambridge: 20 months as of July 1985
  Bellcore—Morris Township: 18 months

Codification of ruling:
  rDNA—Cambridge: Municipal ordinance regulating rDNA activities
  ADL—Cambridge: Public health order banning uses of certain chemical warfare agents
  Bellcore—Morris Township: None; withdrawal of planned research by firm

Institutional response to community reaction:
  rDNA—Cambridge: Universities accept temporary moratorium
  ADL—Cambridge: ADL rejects moratorium; litigates public health order
  Bellcore—Morris Township: Followed process through planning board; finally withdrew proposal, no litigation

Actual interference with research:
  rDNA—Cambridge: No appreciable delay
  ADL—Cambridge: Not prevented or appreciably delayed to date
  Bellcore—Morris Township: Research was delayed and finally prevented at site

Judicial action:
  rDNA—Cambridge: No legal test of moratorium or rDNA ordinance
  ADL—Cambridge: Litigation taken on research ban
  Bellcore—Morris Township: No litigation

Nature of community involvement:
  rDNA—Cambridge: Primarily from academic sector; no grass roots organizations
  ADL—Cambridge: No organized opposition at the early stages; intense community organizing after release of SAC(a) report
  Bellcore—Morris Township: Organized opposition at the outset; coalition-building with other townships and regional groups

Perceived community risk:
  rDNA—Cambridge: Unspecified speculative scenario of creation and release of disease-carrying organisms
  ADL—Cambridge: Explosive release of nerve agents exposing residents
  Bellcore—Morris Township: Explosion of hydrogen tank and release of arsine gas; also release of toxic chemicals into fragile preservation area and groundwater

(a) Scientific Advisory Committee


References


Albury, Randall, The Politics of Objectivity (Victoria, Australia: Deakin University Press, 1983).

Allen, Jonathan (ed.), March 4: Scientists, Students, and Society (Cambridge, MA: The MIT Press, 1970).

American Civil Liberties Union, Free Trade in Ideas: A Constitutional Imperative (Washington, DC: ACLU, May 1984).

Auerbach, Lewis E., “Scientists in the New Deal,” Minerva, vol. 3, summer 1965, pp. 457-482.

Averch, Harvey A., A Strategic Analysis of Science and Technology Policy (Baltimore, MD: Johns Hopkins University Press, 1985).

Barber, Bernard, et al., Research on Human Subjects (New York: Russell Sage Foundation, 1973).

Baron, Charles H., “Fetal Research: The Question in the States,” Hastings Center Report, vol. 15, April 1985, pp. 12-16.

Beauchamp, Tom, et al. (eds.), Ethical Issues in Social Science Research (Baltimore, MD: Johns Hopkins University Press, 1982).

Beecher, Henry K., “Ethics and Clinical Research,” The New England Journal of Medicine, vol. 274, June 16, 1966, pp. 1354-1360.

Berg, Kåre, and Tranøy, Knut Erik (eds.), Research Ethics (New York: Alan R. Liss, Inc., 1983).

Bremer, Howard W., “Commentary on Rosenzweig,” Science, Technology, & Human Values, vol. 10, spring 1985, pp. 49-54.

Brill, Winston J., “Safety Concerns and Genetic Engineering in Agriculture,” Science, vol. 227, Jan. 25, 1985, pp. 381-384.

Brooks, Harvey, “The Resolution of Technically-Intensive Public Policy Disputes,” Science, Technology, & Human Values, vol. 9, winter 1984, pp. 39-50.

Budiansky, Stephen, “Rifkin’s Foot in the Door,” Nature, vol. 312, Dec. 20-27, 1984, p. 689.

Bush, Vannevar, Science—The Endless Frontier, a report to the President on a Program for Postwar Scientific Research (Washington, DC: National Science Foundation, 1980; reprinted from Office of Scientific Research and Development, 1945).

Caldwell, Lynton K., Science and the National Environmental Policy Act: Redirecting Policy Through Procedural Reform (University, AL: The University of Alabama Press, 1982).

Callahan, Daniel, “Recombinant DNA: Science and the Public,” Hastings Center Report, vol. 7, April 1977, pp. 20-23.

Capron, Alexander M., “Medical Research in Prisons,” The Hastings Center Report, June 1973, pp. 4-6.

Carter, Grace M., et al., The Consequences of Unfunded NIH Applications for the Investigator and His Research (Santa Monica, CA: The Rand Corp., October 1978).

Casper, Barry M., “Laser Enrichment: A New Path to Proliferation?” Bulletin of the Atomic Scientists, January 1977, pp. 28-41.

“The Chemist’s Right to Know and Obligation to Tell,” Chemical & Engineering News, Jan. 28, 1985, p. 38.

Cherniavsky, John C., “Case Study: Openness and Secrecy in Computer Science Research,” Science, Technology, & Human Values, vol. 10, spring 1985, pp. 99-104.

Chubin, Daryl E., “Open Science and Closed Science: Tradeoffs in a Democracy,” Science, Technology, & Human Values, vol. 10, spring 1985, pp. 73-81.

Cole, Jonathan O., “Research Barriers in Psychopharmacology,” American Journal of Psychiatry, vol. 134, No. 8, August 1977, pp. 896-898.

Committee on Science in the Promotion of Human Welfare, “Secrecy and Dissemination in Science and Technology,” Science, vol. 163, Feb. 21, 1969, pp. 787-790.

Cournand, André, “The Code of the Scientist and Its Relationship to Ethics,” Science, vol. 198, Nov. 18, 1977, pp. 699-705.

Cowan, Robert C., “Degrees of Freedom,” Technology Review, August-September 1985, p. 6.

Culliton, Barbara J., “Gene Therapy: Research in Public,” Science, vol. 227, 1985, pp. 493-496.

Culliton, Barbara J., “Science and Secrecy,” Science, vol. 220, June 17, 1983, p. 1258.

Culliton, Barbara J., “XYY: Harvard Researcher Under Fire Stops Newborn Screening,” Science, vol. 188, June 27, 1975, pp. 1284-1285.

Culliton, Barbara J., “Grave-Robbing: The Charge Against Four From Boston City Hospital,” Science, vol. 186, Nov. 1, 1974, pp. 420-421, 423.

Curran, William J., “Experimentation Becomes a Crime: Fetal Research in Massachusetts,” The New England Journal of Medicine, vol. 292, Feb. 6, 1975, pp. 300-301.

Curran, William J., “The Approach of Two Federal Agencies,” Experimentation With Human Subjects, Paul A. Freund (ed.) (New York: George Braziller, 1969).

Curran, William J., “The Law and Human Experimentation,” The New England Journal of Medicine, vol. 275, No. 6, Aug. 11, 1966, pp. 323-325.

David, Edward E., Jr., “The Federal Support of Mathematics,” Scientific American, vol. 252, May 1985, pp. 45-51.


Delahunt, William D., “Biomedical Research: A View From the State Legislature,” Hastings Center Report, vol. 6, April 1976, pp. 25-26.

DeLauer, Richard D., “Scientific Communication and National Security,” Science, vol. 226, Oct. 5, 1984, p. 9.

Dickson, David, The New Politics of Science (New York: Pantheon Press, 1984).

Dunlap, Thomas R., DDT: Scientists, Citizens, and Public Policy (Princeton, NJ: Princeton University Press, 1981).

Dupree, A. Hunter, Science in the Federal Government (Cambridge, MA: Harvard University Press, 1957).

Edsall, John T., Scientific Freedom and Responsibility (Washington, DC: American Association for the Advancement of Science, Committee on Scientific Freedom and Responsibility, 1975).

England, J. Merton, A Patron for Pure Science (Washington, DC: National Science Foundation, 1982).

Etzkowitz, Henry, “Entrepreneurial Scientists and Entrepreneurial Universities in American Academic Science,” Minerva, vol. 21, summer-autumn 1983, pp. 198-233.

Faberge, Alexander C., “Open Information and Secrecy in Research,” Perspectives in Biology and Medicine, vol. 25, winter 1982, pp. 263-278.

Ferguson, James R., “National Security Controls on Technological Knowledge: A Constitutional Perspective,” Science, Technology, & Human Values, vol. 10, spring 1985, pp. 87-98.

Ferguson, James R., “Scientific Freedom, National Security, and the First Amendment,” Science, vol. 221, 1983, pp. 620-624.

Fletcher, John C., and Schulman, Joseph D., “Fetal Research: The State of the Question,” Hastings Center Report, vol. 15, April 1985, pp. 6-11.

Fox, Jeffrey L., “Patents Encroaching on Research Freedom,” Science, vol. 224, June 3, 1984, pp. 1080-1081.

Goggin, Malcolm L., “The Life Sciences and the Public: Is Science Too Important To Be Left to the Scientists?” Politics and the Life Sciences, vol. 3, August 1984, pp. 28-40.

Grad, Frank P., “Medical Ethics and the Law,” Annals of the American Academy of Political and Social Sciences, vol. 437, May 1978, pp. 19-36.

Graham, Loren R., “Comparing U.S. and Soviet Experiences: Science, Citizens, and the Policy-Making Process,” Environment, vol. 26, September 1984, pp. 6-9, 31-37.

Green, Harold P., “The Role of Science in Science Policy,” Politics and the Life Sciences, vol. 3, 1984, pp. 42-44.

Green, Harold P., “A Legal Perspective,” Bulletin of the Atomic Scientists, vol. 10, December 1981, pp. 28-30.

Greenberg, Daniel S., The Politics of Pure Science (New York: New American Library, 1967).

Grobstein, Clifford, “Biotechnology and Open University Science,” Science, Technology, & Human Values, vol. 10, spring 1985, pp. 55-63.

Grobstein, Clifford, A Double Image of the Double Helix (San Francisco, CA: W.H. Freeman & Co., 1979).

Gustafson, Thane, “Survey of the Structure and Policies of the U.S. Federal Government for the Support of Fundamental Scientific Research,” National Research Council, Committee for Joint U.S./U.S.S.R. Academy Study of Fundamental Science Policy, Systems for Stimulating the Development of Fundamental Research (Washington, DC: National Academy of Sciences, 1978).

Hadwiger, Don F., “U.S. Agricultural Research Politics: Utopians, Utilitarians, Copians,” Food Policy, August 1984, pp. 196-205.

Hays, Samuel P., “The Structure of Environmental Politics Since World War II,” Journal of Social History, summer 1981, pp. 719-738.

Hecht, Frederick, “Biomedical Research: Ethics and Rights,” Science, vol. 188, 1975, p. 502.

Hewlett, Richard G., “‘Born Classified’ in the AEC: A Historian’s View,” Bulletin of the Atomic Scientists, vol. 10, December 1981, pp. 20-27.

Hodges, Robert E., and Bean, William B., “The Use of Prisoners for Medical Research,” The Journal of the American Medical Association, vol. 202, Nov. 6, 1967, pp. 177-179.

Holton, Gerald, and Morison, Robert S. (eds.), Limits to Scientific Inquiry (New York: W.W. Norton & Co., 1979), pp. 129-146.

Horowitz, Donald L., The Courts and Social Policy (Washington, DC: The Brookings Institution, 1977).

Hull, David, “Openness and Secrecy in Science: Their Origins and Limitations,” Science, Technology, & Human Values, vol. 10, spring 1985, pp. 4-13.

Jennings, Bruce, “Science and Democracy: A Commentary,” Politics and the Life Sciences, vol. 3, 1984, pp. 47-48.

Johnson, Judith A., “Recombinant DNA: Legal Challenges to Deliberate Release Experiments,” report No. 85-502, U.S. Congress, Library of Congress, Congressional Research Service, Science Policy Research Division, Jan. 7, 1985.

Johnson, Judith A., “Regulation of Recombinant DNA Products,” issue brief IB 85090, U.S. Congress, Library of Congress, Congressional Research Service, Science Policy Research Division, Apr. 3, 1984.

Kargon, Robert H. (ed.), The Maturing of American Science (Washington, DC: American Association for the Advancement of Science, 1974).

Kennedy, Donald, “Government Policies and the Cost of Doing Research,” Science, vol. 227, February 1985, pp. 480-484.

Kevles, Daniel, The Physicists (New York: Alfred A. Knopf, 1978).

Kopelman, Loretta, “Ethical Controversies in Medical Research: The Case of XYY Screening,” Perspectives in Biology and Medicine, winter 1978, pp. 196-204.

Krimsky, Sheldon, Genetic Alchemy: The Social History of the Recombinant DNA Controversy (Cambridge, MA: The MIT Press, 1982).

Lakoff, Sanford P., “Moral Responsibility and the ‘Galilean Imperative,’” Ethics, vol. 91, October 1980, pp. 110-116.

Lasswell, Harold D., National Security and Individual Freedom (New York: Da Capo Press, 1950).

Leskovac, Helen, “Regulating Research in the Public Interest,” Politics and the Life Sciences, vol. 3, 1984, pp. 55-57.

Long, Janice R., “Scientific Freedom: Focus of National Security Controls Shifting,” Chemical & Engineering News, July 1, 1985, pp. 7-11.

Long, Janice R., “Federal Stance on Secrecy of Science Information Relaxes,” Chemical & Engineering News, Oct. 8, 1984, pp. 14-15.

Macklin, Ruth, “On the Ethics of Not Doing Scientific Research,” Hastings Center Report, vol. 7, December 1977, pp. 11-15.

Marston, Robert Q., “Research on Minors, Prisoners, and the Mentally Ill,” The New England Journal of Medicine, vol. 288, Jan. 18, 1973, pp. 158-159.

Martin, Philip L., and Olmstead, Alan L., “The Agricultural Mechanization Controversy,” Science, vol. 227, Feb. 8, 1985, pp. 601-606.

McCraw, Thomas K. (ed.), Regulation in Perspective: Historical Essays (Cambridge, MA: Harvard University Press, 1981).

McMullin, Ernan, “Openness and Secrecy in Science: Some Notes on Early History,” Science, Technology, & Human Values, vol. 10, spring 1985, pp. 14-23.

Miller, Julie Ann, “Lessons From Asilomar,” Science News, vol. 127, Feb. 23, 1985, pp. 122-123, 126.

Milunsky, Aubrey, and Annas, George J. (eds.), Genetics and the Law II (New York: Plenum Press, 1980).

Morison, Robert S., “Commentary on ‘The Boundaries of Scientific Freedom’,” Newsletter on Science, Technology, & Human Values, June 1977, pp. 22-24.

National Academy of Sciences, Committee on National Statistics, Sharing Research Data (Washington, DC: National Academy Press, 1985).

National Academy of Sciences, Scientific Communication and National Security, report by the Panel on Scientific Communication and National Security, Committee on Science, Engineering, and Public Policy (Washington, DC: National Academy Press, 1982).

Nelkin, Dorothy, Science as Intellectual Property (New York: Macmillan Publishing Co., 1984).

Nelkin, Dorothy, “Intellectual Property: The Control of Scientific Information,” Science, vol. 216, May 14, 1982, pp. 704-708.

Nelkin, Dorothy, The University and Military Research: Moral Politics at M.I.T. (Ithaca, NY: Cornell University Press, 1972).

Nichols, Rodney W., “Mission-Oriented R&D,” Science, vol. 172, Apr. 2, 1971, pp. 29-37.

Organisation for Economic Co-Operation and Development, Science, Growth and Society (Paris, France: OECD, 1971).

Panem, Sandra, “The Interferon Dilemma: Secrecy v. Open Exchange,” The Brookings Review, winter 1982, pp. 18-22.

Park, Robert L., “Intimidation Leads to Self-Censorship in Science,” Bulletin of the Atomic Scientists, vol. 41, No. 3, spring 1985, pp. 22-25.

Pelikan, Jaroslav, Scholarship and Its Survival: Questions on the Idea of Graduate Education (Princeton, NJ: The Carnegie Foundation for the Advancement of Teaching, 1983).

Penick, James, Jr., et al. (eds.), The Politics of American Science, 1939 to the Present (Cambridge, MA: The MIT Press, 1972).

Peterson, James C. (ed.), Citizen Participation in Science Policy (Amherst, MA: The University of Massachusetts Press, 1984).

Pion, Georgine M., and Lipsey, Mark, “Public Attitudes Toward Science and Technology: What Have the Surveys Told Us?” Public Opinion Quarterly, vol. 45, 1981, pp. 303-316.

Powledge, Tabitha M., “Fetal Experimentation: Trying to Sort Out the Issues,” Hastings Center Report, vol. 5, April 1975, pp. 8-10.

Pursell, Carroll, “‘A Savage Struck by Lightning’: The Idea of a Research Moratorium, 1927-37,” Lex et Scientia, vol. 10, October-December 1974, pp. 146-158.

Reeve, Mark, “Scientific Uncertainty and the National Environmental Policy Act—The Council on Environmental Quality’s Regulation 40 C.F.R. Section 1502.22,” Washington Law Review, vol. 60, No. 1, 1984-1985, pp. 101-116.

Reiser, Stanley Joel, “Human Experimentation and the Convergence of Medical Research and Patient Care,” Annals of the American Academy of Political and Social Science, vol. 437, May 1978, pp. 8-18.

Relyea, Harold C. (ed.), Striking a Balance: National Security and Scientific Freedom (Washington, DC: American Association for the Advancement of Science, 1985).

Relyea, Harold C., National Security Controls and Scientific Information (updated Sept. 11, 1984), issue brief IB 82083, U.S. Congress, Library of Congress, Congressional Research Service, Government Division, 1984.

Relyea, Harold C., “Increased National Security Controls on Scientific Communication,” Government Information Quarterly, vol. 1, No. 2, 1984.

Relyea, Harold C., “Information, Secrecy, and Atomic Energy,” New York University Review of Law and Social Change, vol. 10, No. 2, 1980-1981.

Robertson, John A., “The Scientist’s Right to Research: A Constitutional Analysis,” Southern California Law Review, vol. 51, September 1978, pp. 1203-1279.

Rosenzweig, Robert M., “Research as Intellectual Property: Influences Within the University,” Science, Technology, & Human Values, vol. 10, spring 1985, pp. 41-48.

Rubin, Jeffrey S., “Breaking Into the Prison: Conducting a Medical Research Project,” American Journal of Psychiatry, vol. 133, February 1976, pp. 230-232.

Schmidt, Richard, and Shure, Cecile, “Government Secrecy,” Newsletter on Intellectual Freedom, vol. 33, 1984, pp. 129, 159-169.

Science Council of Canada, Regulating the Regulators: Science, Values and Decisions (Ottawa, Ontario: 1982).

Seabury, Paul (ed.), Bureaucrats and Brainpower: Government Regulations of Universities (San Francisco, CA: Institute of Contemporary Studies, 1979).

Shapiro, Martin, “On Predicting the Future of Administrative Law,” Regulation, vol. 6, 1982, pp. 18-25.

Shattuck, John W., Federal Restrictions on the Free Flow of Academic Information and Ideas (Cambridge, MA: Harvard University, January 1985).

Singer, Peter (ed.), In Defense of Animals (New York: Basil Blackwell Inc., 1985).

Singer, Peter, “Sense and Sensibility in Animal Research,” New Scientist, Oct. 25, 1984, pp. 35-36.

Sokal, Michael M., “Restrictions on Scientific Publication,” Science, vol. 215, Mar. 5, 1982, p. 1182.

Steelman, John R., Science and Public Policy (New York: Arno Press, reprinted from 1947).

Swazey, Judith P., and Fox, Renee C., “The Clinical Moratorium: A Case of Mitral Valve Surgery,” Experimentation With Human Subjects, Paul A. Freund (ed.) (New York: George Braziller, 1969).

Teich, Albert, and Thornton, Ray (eds.), Science, Technology and the Issues of the Eighties: Policy Outlook (Boulder, CO: Westview Press, 1982).

Tiefel, Hans O., “The Cost of Fetal Research: Ethical Considerations,” New England Journal of Medicine, vol. 294, Jan. 8, 1976, pp. 85-90.

U.S. Congress, House Committee on the Judiciary, Subcommittee on Courts, Civil Liberties and the Administration of Justice, Oversight Hearing: 1984 Civil Liberties and the National Security State, Nov. 3, 1983.

Wax, Murray L., and Cassell, Joan (eds.), Federal Regulations, Ethical Issues and Social Research (Boulder, CO: Westview Press, 1979).

Weart, Spencer R., Scientists in Power (Cambridge, MA: Harvard University Press, 1979).

Wechsler, Henry (ed.), The Social Context of Medical Research (Cambridge, MA: Ballinger, 1981), pp. 27-71.

Weyl, F. Joachim, Research in the Service of National Purpose, proceedings of the Office of Naval Research, Vicennial Convocation (Washington, DC: Office of Naval Research, 1966).

The White House, President’s Scientific Research Board, Science and Public Policy: Administration for Research, 3 vols. (Washington, DC: Oct. 4, 1947).

Wilson, John T., Academic Science, Higher Education, and the Federal Government, 1950-1983 (Chicago, IL: University of Chicago Press, 1983).

Wulff, Keith M. (ed.), Regulation of Scientific Inquiry (Boulder, CO: Westview Press, 1979).

Young, Leo, “Commentary: The Control of Government-Sponsored Technical Information,” Science, Technology, & Human Values, vol. 10, spring 1985, pp. 82-86.
