America's Investment in the Future


Transcript of America's Investment in the Future

Page 1: America's Investment in the Future

America’s Investment in the Future

NSF Celebrating 50 Years

Foreword by Walter Cronkite. Photos by Felice Frankel.

Page 2: America's Investment in the Future

On the cover: Part liquid, part magnet, this ferrofluid serves as a symbol of NSF’s dual mission—research and education. Precisely ground particles of magnetite in the ferrofluid remain loose and suspended in an oil. When the ferrofluid is placed in a magnetic field, it recalls its magnetic heritage. Here, a drop of ferrofluid rests on a glass sheet placed above a piece of yellow paper. Photographer Felice Frankel then placed seven small magnets under the paper. The attraction between the magnetite and the magnets creates the beautiful flower-like form.

Page 3: America's Investment in the Future
Page 4: America's Investment in the Future

America’s Investment in the Future

NSF Celebrating 50 Years

Foreword by Walter Cronkite. Photos by Felice Frankel.

Page 5: America's Investment in the Future

Cover and opening photographs for each chapter ©2000 by Felice Frankel. These photographs cannot be reproduced without permission of the photographer.

Printed in the United States of America

Book and cover design: Low + Associates Inc.

National Science Foundation
4201 Wilson Blvd.
Arlington, VA 22230
703-292-5111
TDD 703-292-5090
www.nsf.gov

NSF 00-50

www.nsf.gov

Page 6: America's Investment in the Future

Table of Contents — iii

Contents

Foreword by Walter Cronkite v

Introduction by NSF Director Rita R. Colwell 1

Internet: changing the way we communicate 4

Advanced Materials: the stuff dreams are made of 18

Education: lessons about learning 32

Manufacturing: the forms of things unknown 48

Arabidopsis: map makers of the plant kingdom 62

Decision Sciences: how the game is played 76

Visualization: a way to see the unseen 88

Environment: taking the long view 102

Astronomy: exploring the expanding universe 118

Science on the Edge: arctic and antarctic discoveries 132

Disasters & Hazard Mitigation: living more safely on a restless planet 146

About the Photographs 160

Photo Captions and Credits 163

Acknowledgments 166

National Science Board Members 170

National Science Foundation Executive Staff 171

About the National Science Foundation 172

Page 7: America's Investment in the Future
Page 8: America's Investment in the Future

It may seem ironic that I—a man who failed first-year physics at the University of Texas—am writing the foreword to a book about the National Science Foundation.

I’m not a scientist. I’ll never experience the thrill of formulating a new algorithm or unlocking a new gene sequence. But you don’t have to be a scientist (or even have passed a physics course) to understand and appreciate the National Science Foundation. If you’ve ever surfed the Web or sent an e-mail, thank NSF. NSF also played a role in the development of wireless communications, advanced medical diagnostics, and more accurate weather forecasting. The list of scientific discoveries and engineering feats that NSF has supported over the past fifty years will surprise you.

In 1982 I had the good fortune to accompany marine biologists on a deep-sea dive off the coast of Mexico. It was an NSF-funded research mission. As The Alvin descended deeper and deeper into the ocean, I observed a world that I never knew existed, a world beneath the surface that is vast and varied.

This book, like my adventure on The Alvin, opened my eyes. Eighteen years ago, I was so awed by the ocean’s secrets I didn’t stop to think about how they were revealed. Today, when we read a story about cloning sheep or about amazingly strong molecules, most of us don’t stop to think about the years of trial and error, experimentation, and analysis it took to get to the headline. Even though we’re not scientists, this book can help us all to see beneath the surface of things and to appreciate how NSF enables researchers to advance the frontiers of knowledge in every direction.

Congress established the National Science Foundation in 1950 to transform wartime research into a peacetime engine for prosperity and national security. The Foundation has succeeded masterfully, albeit quietly, in achieving these goals. Maybe that’s because NSF does not operate any laboratories, conduct any experiments, or land any astronauts on the moon. Rather, NSF is the nation’s single largest funder of laboratories and experiments, of the kind of exploratory research that quietly plants seeds today that make headlines tomorrow.

This book tells the stories behind those headlines—stories about the men and women who are helping us to understand the world around us. For the past fifty years, this has been the story of the National Science Foundation.

Near the end of A Reporter’s Life, I wrote: “A career can be called a success if one can look back and say, ‘I made a difference.’” After reading this book, I think you’ll agree that the National Science Foundation is doing just that.

Foreword — v

Walter Cronkite today (top) and aboard the NSF-funded expedition on The Alvin in 1982.

Foreword by Walter Cronkite

Page 9: America's Investment in the Future
Page 10: America's Investment in the Future

Introduction — 1

When she assumed her post in August 1998, Dr. Rita Colwell became the Foundation’s first female director. She is a nationally respected scientist and educator. Before she became NSF director, Dr. Colwell was president of the University of Maryland Biotechnology Institute and professor of microbiology at the University of Maryland, where she produced the award-winning film Invisible Seas. While at the University of Maryland, Dr. Colwell also served as director of the Sea Grant College and as vice president for academic affairs.

Unique among federal research agencies, the National Science Foundation’s mission is to advance learning and discovery in all disciplines of science and engineering and to foster connections among them. Our job is to keep science and engineering visionaries focused on the furthest frontier, to recognize and nurture emerging fields, to prepare the next generation of scientific talent, and to ensure that all Americans gain an understanding of what science and technology have to offer.

Retired hockey star Wayne Gretzky used to say, “I skate to where the puck is going, not to where it’s been.” At NSF, we try to fund where the fields are going, not where they’ve been. In marking our fiftieth anniversary, we are celebrating this kind of vision and foresight.

For example, as chronicled in this book, NSF began funding efforts in the mid-1980s to expand what was largely a Department of Defense networked computer system into the civilian realm. NSFNET linked NSF-supported supercomputer centers at five universities and was open to all academic users. Response was so great that NSF was soon able to turn much of the burgeoning network over to the private sector. Meanwhile, a student working at one of the NSF supercomputer centers developed the first major Web browser, Mosaic. Other NSF-funded research led to the first widely used Internet routers, the gateways and switches that guide information around the globe. Besides enabling the freer flow and more sophisticated manipulation of information, the Internet has triggered a surge of new business activity, which some say will amount to $1.3 trillion in e-commerce activity by 2003.

All in all, NSF’s role in the birth of the Internet is a perfect example of how the right public investment can lead to huge societal pay-offs. Other stories you’ll read here highlight NSF’s instrumental role in such important innovations as Magnetic Resonance Imaging, or MRI, one of the most comprehensive medical diagnostic tools; the identification of the “ozone hole” over the Antarctic and chlorofluorocarbons as the probable cause; advances in the underlying mathematics of computer-aided design (CAD) and computer-aided manufacturing (CAM), two techniques that power much of U.S. industry’s global competitiveness; and many other discoveries.

The point to remember is that these and other advances came only after long years of publicly funded basic research that NSF had identified as among the most promising avenues for exploration. Businesses, understandably, tend to make research investments that pay off in the short term. Other federal agencies have as their mission the oceans, energy, defense, or health, and they fund research that directly relates to those missions. In contrast, NSF’s mandate is broad, deep, and long: to invest in educational programs and fundamental, multidisciplinary research of strategic, long-term interest to the nation.

Science—The Endless Frontier, a 1945 report by Vannevar Bush, a respected engineer and President Franklin Roosevelt’s science advisor, made the case for why the federal government should actively promote the progress of research and science-related education. On May 10, 1950, President Harry Truman signed the bill creating the National Science Foundation. Now, fifty years later, we are reaping the rewards of this prescient

The National Science Foundation at 50

Where Discoveries Begin

At the National Science Foundation, we invest in America’s future. Our support of creative people, innovative ideas, and cutting-edge technologies has led to thousands of discoveries vital to our nation’s health and prosperity.

Page 11: America's Investment in the Future
Page 12: America's Investment in the Future

Introduction — 3

commitment. As a whole, we are a healthier, better educated, and more accomplished nation. Advances in knowledge have accounted for half of the net new growth in the U.S. economy since the end of World War II. This is a mighty return on investments made by NSF, other government agencies, and their partners in the science and engineering community. NSF’s share of that investment totals nearly $4 billion in fiscal year 2000.

We have reason to celebrate NSF’s historical accomplishments, but looking back is an unusual posture for the Foundation. We are more accustomed to anticipating new frontiers so that we can enable researchers, students, and educators to get where they need to be. As we look ahead today, one of the highest priorities for NSF and its partners is information technology research. There is no area of life that has not been dramatically altered by the advent of computers. In my own work, tracking the environmental conditions that give rise to cholera outbreaks, I’ve gone from very early studies using an IBM 650 (a model now on display at the Smithsonian Institution) to recent, highly complex analyses of data collected from global satellites and remote sensing systems. NSF’s multidisciplinary connections, its historically strong relationships with the nation’s research universities, and its commitment to the public good make the agency a natural leader with regard to information technologies. That is why NSF has been asked to lead the federal government’s initiative to develop faster, more powerful computers and networks.

Another priority for NSF in the next few years will be to nurture the development of an emerging area known as “biocomplexity.” The NSF-led

biocomplexity initiative will lead to a better understanding of the interaction among biological, physical, and social systems. As this book illustrates, many of the most exciting discoveries occur at the intersections of multiple disciplines, where chemists help biologists see how blood vessels can be repaired with polymers and social scientists learn from mathematicians how to study the seeming chaos of human interactions. NSF is committed to joining what were once discrete disciplines into a more powerful understanding of the whole of nature.

The education of our nation’s youth also remains a major concern. In an economy ever more driven by knowledge and ideas, it’s paramount that we discover better ways to prepare a culturally diverse and globally competitive workforce of scientists, engineers, and other citizens. NSF has always encouraged innovation in the teaching of science, mathematics, and engineering at all grade levels and among the general public. We will continue to build the kind of synergistic partnerships among researchers, educators, policymakers, parents, and students that lay the groundwork for true reform. As NSF Deputy Director Joseph Bordogna has said, “It’s not enough just to discover new knowledge; we need to train people in the use of that new knowledge if the American workforce is to prevail in the twenty-first century.”

As we look to the century ahead, it is apparent that science and technology will continue to be the propelling and sustaining forces of our nation’s well-being. Our quality of life will in large measure depend on the vigor of our economy, the health of our planet, and the opportunities for enlightenment. Wherever the next research challenge lies, you will find the National Science Foundation.

—Rita R. Colwell, Director

Dr. Colwell has studied the causes and cycles of the infectious cholera bacterium for more than 30 years. While working in Bangladesh recently, she demonstrated how to use sari cloth as an excellent, affordable water filter to screen out plankton associated with cholera transmission.

Page 13: America's Investment in the Future

The Internet: changing the way we communicate

Page 14: America's Investment in the Future

From a sprawling web of computer networks, the Internet has spread throughout the United States and abroad. With funding from NSF and other agencies, the Net has now become a fundamental resource in the fields of science, engineering, and education, as well as a dynamic part of our daily lives.

Page 15: America's Investment in the Future

To understand the changes brought by the Internet, one has only to go back a few decades, to a time when researchers networked in person, and collaboration meant working with colleagues in offices down the hall. From the beginning, computer networks were expected to expand the reach—and grasp—of researchers, providing more access to computer resources and easier transfer of information. NSF made this expectation a reality. Beginning by funding a network linking computer science departments, NSF moved on to the development of a high-speed backbone, called NSFNET, to connect five NSF-supported supercomputers. More recently NSF has funded a new backbone, and is playing a major role in evolving the Internet for both scientists and the public. The Internet now connects millions of users in the United States and other countries. It has become the underpinning for a vibrant commercial enterprise, as well as a strong scientific tool, and NSF continues to fund research to promote high-performance networking for scientific research and education.

Page 16: America's Investment in the Future

The Internet — 7

A Constellation of Opportunities

• A solar wind blasts across Earth’s magnetic field, creating ripples of energy that jostle satellites and disrupt electrical systems. Satellite data about the storm are downlinked through Goddard Space Flight Center in Greenbelt, Maryland, passed on to a supercomputer center, and uploaded by NSF-funded physicists at the University of Maryland and at Dartmouth College in Hanover, New Hampshire. Using the Internet, researchers work from their own offices, jointly creating computer images of these events, which will lead to better space-weather forecasting systems.

• An NSF-funded anthropologist at Penn State University uses his Internet connection to wade through oceans of information. Finally, he chooses the EMBL-GenBank in Bethesda, Maryland, and quickly searches through huge amounts of data for the newly deciphered DNA sequence of a gene he’s studying. He finds it, highlights the information, downloads it, and logs off.

• It’s time to adjust the space science equipment in Greenland. First, specialized radar is pointed at an auroral arc. Then an all-sky camera is turned on. The physicist controlling the equipment is part of a worldwide team of researchers working on NSF’s Upper Atmospheric Research Collaboratory (UARC). When she’s finished making the adjustments, the physicist pushes back from her computer in her Ann Arbor, Michigan, office, knowing the job is done.

• The Laminated Object Manufacturing (LOM) machine’s camera, or LOMcam, puts a new picture on the Web every forty-five seconds, and the molecular biologist watches. From his office, he can already tell that the tele-manufacturing system is creating an accurate physical model of the virus he is studying. The San Diego Supercomputer Center’s LOMcam continues to post pictures, but the biologist stops watching, knowing that he will soon handle and examine the physical rendering of the virus, and learn more about it than his computer screen image could ever reveal.

This is the Internet at work in the lives of scientists around the globe. “The Internet hasn’t only changed how we do science, it permits entirely new avenues of research that could not have been contemplated just a few years ago,” says George Strawn, executive officer of NSF’s Directorate for Computer and Information Science and Engineering (CISE) and former division director for networking within CISE. “For example, the key to capitalizing on biologists’ success in decoding the human genome is to use Internet-based data engines that can quickly manipulate even the most massive data sets.”

A Public Net

The Internet, with its millions of connections worldwide, is indeed changing the way science is done. It is making research more collaborative, making more data available, and producing more results, faster. The Internet also offers new ways of displaying results, such as virtual reality systems that can be accessed from just about anywhere. The new access to both computer power and collaborative scientists allows researchers to answer questions they could only think about a few years ago.

Working from one of the NSF-supported supercomputing centers, researcher Greg Foss used the Internet to collect weather data from many sites in Oklahoma. He used the data to create this three-dimensional model of a thunderstorm.

Page 17: America's Investment in the Future

8 — National Science Foundation

It is not just the scientists who are enthralled. While not yet as ubiquitous as the television or as pervasive as the telephone, in the last twenty years, the Internet has climbed out of the obscurity of being a mere “researcher’s tool” to the realm of a medium for the masses. In March 2000, an estimated 304 million people around the world (including nearly 137 million users in the United States and Canada) had access to the Internet, up from 3 million estimated users in 1994. U.S. households with access to the Internet increased from 2 percent in 1994 to 26 percent in 1998, according to the National Science Board’s (NSB) Science and Engineering Indicators 2000. (Every two years, the NSB—NSF’s governing body—reports to the President on the status of science and engineering.)

In today’s world, people use the Internet to communicate. In fact, for many, email has replaced telephone and fax. The popularity of email lies in its convenience. No more games of telephone tag, no more staying late to wait for a phone call. Email allows for untethered connectivity.

The emergence of the World Wide Web has helped the Internet become commonplace in offices and homes. Consumers can shop for goods via the Web from virtually every retail sector, from books and CDs to cars and even houses. Banks and investment firms use the Web to offer their clients instant account reports as well as mechanisms for electronic financial interactions. In 1999, the U.S. Census Bureau began collecting information on e-commerce, which it defined as online sales by retail establishments. For the last three months of 1999, the bureau reported nearly $5.2 billion in e-commerce sales (accounting for 0.63 percent of total sales), and nearly $5.3 billion for the first quarter of 2000. More and more, people are going “online to shop, learn about different products and providers, search for jobs, manage finances, obtain health information and scan their hometown newspapers,” according to a recent Commerce Department report on the digital economy. The surge of new Internet business promises to continue, with some experts estimating $1.3 trillion in e-commerce activity by 2003.
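A quick sense of scale (ordinary arithmetic, not a figure from the Census Bureau report): if $5.2 billion was 0.63 percent of the quarter’s retail sales, total retail sales for that quarter were roughly $5.2 billion / 0.0063, or about $825 billion. In other words, the online share was still well under one percent of retail trade even as it grew.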

Among the Web’s advantages is the fact that it is open twenty-four hours a day. Where else can you go at four in the morning to get a preview of the upcoming art exhibit at the New York Museum of Modern Art . . . fact sheets on how to prepare for an earthquake or hurricane . . . a study on the important dates of the Revolutionary War . . . or hands-on physics experiments for elementary school teachers?

With all of this information, the Internet has the potential to be a democratizing force. The same information is readily available to senior and junior researchers, to elementary school students, and to CEOs. Anyone who knows how and has the equipment can download the information, examine it, and put it to use.

While the original Internet developers may not have envisioned all of its broad-reaching capabilities, they did see the Internet as a way of sharing resources, including people, equipment, and information. To that end, their work has succeeded beyond belief. The original backbone rates of 56 kbps (kilobits per second) are now available in many homes.

Page 18: America's Investment in the Future

In March 1997, NSF launched its Partnerships for Advanced Computational Infrastructure (PACI) program. “This new program will enable the United States to stay at the leading edge of computational science, producing the best science and engineering in all fields,” said Paul Young, former head of NSF’s Directorate for Computer and Information Science and Engineering.

The program consists of the National Computational Science Alliance (the Alliance), led by the University of Illinois at Urbana-Champaign, and the National Partnership for Advanced Computational Infrastructure (NPACI), led by the San Diego Supercomputer Center. The partnerships offer a breadth of vision beyond even what NSF has hoped for. They will maintain the country’s lead in computational science, further the use of computers in all disciplines of research, and offer new educational opportunities for people ranging from kindergartners through Ph.D.s.

The Alliance’s vision is to create a distributed environment as a prototype for a national information infrastructure that would enable the best computational research in the country. It is organized into four major groups:

• Application Technologies Teams that drive technology development;

• Enabling Technologies Teams that convert computer science research into usable tools and infrastructure;

• Regional Partners with advanced and mid-level computing resources that help distribute the technology to sites throughout the U.S.; and

• Education, Outreach, and Training Teams that will educate and promote the use of the technology to various sectors of society.

In addition, the leading-edge site at the University of Illinois at Urbana-Champaign supports a variety of high-end machines and architectures, enabling high-end computation for scientists and engineers across the country.

NPACI includes a national-scale metacomputing environment with diverse hardware and several high-end sites. It supports the computational needs of high-end scientists and engineers across the country via a variety of leading-edge machines and architectures at the University of California at San Diego. It also fosters the transfer of technologies and tools developed by applications and computer scientists for use by these high-end users. Another major focus includes data-intensive computing, digital libraries, and large data set manipulation across many disciplines in both engineering and the social sciences.

NSF recently announced an award to the Pittsburgh Supercomputing Center to build a system that will operate at speeds well beyond a trillion calculations per second. The Terascale Computing System is expected to begin operation in early 2001, when Pittsburgh will become PACI’s latest leading-edge site. Through these partnerships, PACI looks to a strong future in computational science.

PACI: Computer Partnerships

Page 19: America's Investment in the Future

10 — National Science Foundation

From the outset, NSF limited the amount of time it would support CSNET. By 1986, the network was to be self-supporting. This was a risky decision, because in 1981 the value of network services was not widely understood. The policy, which carried forward into subsequent NSF networking efforts, required bidders to think about commercialization from the very start. When the 1986 deadline arrived, more than 165 university, industrial, and government computer research groups belonged to CSNET. Usage charges plus membership fees ranged from $2,000 for small computer science departments to $30,000 for larger industrial members. With membership came customer support.

The Launch of NSFNET

While CSNET was growing in the early 1980s, NSF began funding improvements in the academic computing infrastructure. Providing access to computers with increasing speed became essential for certain kinds of research. NSF’s supercomputing program, launched in 1984, was designed to make high-performance computers accessible to researchers around the country.

The first stage was to fund the purchase of supercomputer access at Purdue University, the University of Minnesota, Boeing Computer Services, AT&T Bell Laboratories, Colorado State University, and Digital Productions. In 1985, four new supercomputer centers were established with NSF support—the John von Neumann Center at Princeton University, the San Diego Supercomputer

From Modest Beginnings

In the 1970s, the sharing of expensive computing resources, such as mainframes, was causing a bottleneck in the development of new computer science technology, so engineers developed networking as a way of sharing resources.

The original networking was limited to a few systems, including the university system that linked terminals with time-sharing computers, early business systems for applications such as airline reservations, and the Department of Defense’s ARPANET. Begun by the Defense Advanced Research Projects Agency (DARPA) in 1969 as an experiment in resource-sharing, ARPANET provided powerful (high-bandwidth) communications links between major computational resources and computer users in academic, industrial, and government research laboratories.

Inspired by ARPANET’s success, the Coordinated Experimental Research Program of the Computer Science Section of NSF’s Mathematical and Physical Sciences Directorate started its own network in 1981. Called CSNET (Computer Science Network), the system provided Internet services, including electronic mail and connections to ARPANET. While CSNET itself was just a starting point, it served well. “Its most important contribution was to bring together the U.S. computer science community and to create the environment that fostered the Internet,” explains Larry Landweber, a professor at the University of Wisconsin and a CSNET principal investigator. In addition, CSNET was responsible for the first Internet gateways between the United States and many countries in Europe and Asia.

Satellites, the Hubble Space Telescope, and observatories around the world provide data to Earth-bound scientists. Once received, the data are sent, via the Internet, to researchers around the country. At the University of Illinois-based National Center for Supercomputing Applications, Frank Summers of Princeton University used the data to create a model of a galaxy formation.

Page 20: America's Investment in the Future

The Internet — 11

Center on the campus of the University of California at San Diego, the National Center for Supercomputing Applications at the University of Illinois, and the Cornell Theory Center, a production and experimental supercomputer center. NSF later established the Pittsburgh Supercomputing Center, which was run jointly by Westinghouse, Carnegie Mellon University, and the University of Pittsburgh.

In 1989, funding for four of the centers, San Diego, Urbana-Champaign, Cornell, and Pittsburgh, was renewed. In 1997, NSF restructured the supercomputer centers program and funded the supercomputer site partnerships based in San Diego and Urbana-Champaign.

A fundamental part of the supercomputing initiative was the creation of NSFNET. NSF envisioned a general high-speed network, moving data more than twenty-five times the speed of CSNET, and connecting existing regional networks, which NSF had created, and local academic networks. NSF wanted to create an “inter-net,” a “network of networks,” connected to DARPA’s own internet, which included the ARPANET. It would offer users the ability to access remote computing resources from within their own local computing environment.

NSFNET got off to a relatively modest start in 1986 with connections among the five NSF university-based supercomputer centers. Yet its connection with ARPANET immediately put NSFNET into the major leagues as far as networking was concerned. As with CSNET, NSF decided not to restrict NSFNET to supercomputer researchers but to open it to all academic users. The other wide-area networks (all government-owned) supported mere handfuls of specialized contractors and researchers.

The flow of traffic on NSFNET was so great in the first year that an upgrade was required. NSF issued a solicitation calling for an upgrade and,

equally important, the participation of the private sector. Steve Wolff, then program director for NSFNET, explained why commercial interests eventually had to become a part of the network, and why NSF supported it.

“It had to come,” says Wolff, “because it was obvious that if it didn’t come in a coordinated way, it would come in a haphazard way, and the academic community would remain aloof, on the margin. That’s the wrong model—multiple networks again, rather than a single Internet. There had to be commercial activity to help support networking, to help build volume on the network. That would get the cost down for everybody, including the academic community, which is what NSF was supposed to be doing.”

To achieve this goal, Wolff and others framed the 1987 upgrade solicitation in a way that would enable bidding companies to gain technical experience for the future. The winning proposal came from a team including Merit Network, Inc., a consortium of Michigan universities, and the state of Michigan, as well as two commercial companies, IBM and MCI. In addition to overall engineering, management, and operation of the project, the Merit team was responsible for developing user

In 1996, researchers and artists envisioned the information superhighway this way. Since then, individuals, schools and universities, organizations, and government agencies have added millions of connections to the Internet.

Page 21: America's Investment in the Future

12 — National Science Foundation

support and information services. IBM provided the hardware and software for the packet-switching network and network management, while MCI provided the transmission circuits for the NSFNET backbone, including reduced tariffs for that service.

Merit Network worked quickly. In July 1988, eight months after the award, the new backbone was operational. It connected thirteen regional networks and supercomputer centers, representing a total of over 170 constituent campus networks and transmitting 152 million packets of information per month.

Just as quickly, the supply offered by the upgraded NSFNET caused a surge in demand. Usage increased on the order of 10 percent per month, a growth rate that has continued to this day in spite of repeated expansions in data communications capacity. In 1989, Merit Network was already planning for the upgrade of the NSFNET backbone service from T1 (1.5 megabits per second, or Mbps) to T3 (45 Mbps).
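To see what sustained growth of 10 percent per month implies, a back-of-the-envelope compounding calculation (added here for illustration; these figures are not from the original text) is enough:

    1.10^12 ≈ 3.1               traffic roughly triples every year
    1.10^36 ≈ 31                and grows about thirtyfold over three years
    45 Mbps / 1.5 Mbps = 30     the T1-to-T3 upgrade was itself a thirtyfold step

In other words, roughly three years of 10-percent-per-month growth is enough to absorb an entire T1-to-T3 upgrade, which is why capacity planning never stopped.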

“When we first started producing those traffic charts, they all showed the same thing—up and up and up! You probably could see a hundred of these, and the chart was always the same,” says Ellen Hoffman, a member of the Merit team. “Whether it is growth on the Web or growth of traffic on the Internet, you didn’t think it would keep doing that forever, and it did. It just never stopped.”

The T3 upgrade, like the original network implementation, deployed new technology under rigorous operating conditions. It also required a heavier responsibility than NSF was prepared to assume. The upgrade, therefore, represented an organizational as well as a technical milestone—the beginning of the Internet industry.

In 1990 and 1991, the NSFNET team was restructured. A not-for-profit entity called Advanced Networks and Services continued to provide backbone service as a subcontractor to Merit Network, while a for-profit subsidiary was spun off to enable commercial development of the network.

The new T3 service was fully inaugurated in 1991, representing a thirtyfold increase in the bandwidth on the backbone. The network linked sixteen sites and over 3,500 networks. By 1992, over 6,000 networks were connected, one-third of them outside the United States. The numbers continued to climb. In March 1991, the Internet was transferring 1.3 trillion bytes of information per month. By the end of 1994, it was transmitting 17.8 trillion bytes per month, the equivalent of electronically moving the entire contents of the Library of Congress every four months.

An End and a Beginning

By 1995, it was clear the Internet was growing dramatically. NSFNET had spurred Internet growth in all kinds of organizations. NSF had spent approximately $30 million on NSFNET, complemented by in-kind and other investments by IBM and MCI. As a result, 1995 saw about 100,000 networks—both public and private—in operation around the country. On April 30 of that year, NSF decommissioned the NSF backbone. The efforts to privatize the backbone functions had been successful, announced Paul Young, then head of NSF’s CISE Directorate, and the existing backbone was no longer necessary.

From there, NSF set its sights even higher. In 1993, the Foundation offered a solicitation calling for a new, very high performance Backbone Network Service (vBNS) to be used exclusively for research by selected users. In 1995, Young and his staff worked out a five-year cooperative agreement

Page 22: America's Investment in the Future

Mosaic: The Original Browser

By 1992, the Internet had become the most popular network linking researchers and educators at the post-secondary level throughout the world. Researchers at the European Laboratory for Particle Physics, known by its French acronym, CERN, had developed and implemented the World Wide Web, a network-based hypertext system that let users embed Internet addresses in their documents. Users could simply click on these references to connect to the reference location itself. Soon after its release, the Web came to the attention of a programming team at the National Center for Supercomputing Applications (NCSA), an NSF-supported facility at the University of Illinois.

The history of NSF’s supercomputing centers overlapped greatly with the worldwide rise of the personal computer and workstation. It was, therefore, not surprising that software developers focused on creating easy-to-use software tools for desktop machines. The NSF centers developed many tools for organizing, locating, and navigating through information, but perhaps the most spectacular success was the NCSA Mosaic, which in less than eighteen months after its introduction became the Internet “browser of choice” for over a million users, and set off an exponential growth in the number of decentralized information providers. Marc Andreessen headed the team that developed Mosaic, a graphical browser that allowed programmers to post images, sound, video clips, and multifont text within a hypertext system. Mosaic engendered a wide range of commercial developments including numerous commercial versions of Web browsers, such as Andreessen’s Netscape and Microsoft’s Internet Explorer.
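To make the hypertext idea concrete, here is a minimal, illustrative sketch in present-day Python (standard library only) of what any Web client, from Mosaic onward, does at bottom: fetch a page and collect the Internet addresses embedded in its anchor tags. The URL is a placeholder, and this is of course not the NCSA Mosaic code itself.

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        """Collect the target addresses of <a href="..."> anchors in a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    # Fetch one document and list the documents it points to -- the essence of
    # hypertext: addresses embedded in the text itself, ready to be followed.
    with urlopen("http://example.org/") as page:   # placeholder address
        text = page.read().decode("utf-8", errors="replace")

    collector = LinkCollector()
    collector.feed(text)
    for link in collector.links:
        print(link)

A graphical browser adds rendering, images, and navigation history on top of this fetch-and-follow loop, but the loop itself is the heart of the Web.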

Page 23: America's Investment in the Future

14 — National Science Foundation

with MCI to offer the vBNS. That agreement was recently extended to keep the vBNS operating through March 2003. The vBNS has met its goal of pushing transmission speed from its starting point of 155 Mbps to speeds in excess of 2.4 billion bits per second by the turn of the century.

The vBNS originally linked the two NSF supercomputing leading-edge sites that are part of the Foundation’s Partnerships for Advanced Computational Infrastructure (PACI) program. NSF soon tied in another thirteen institutions. By 2000, the network connected 101 institutions, including 94 of the 177 U.S. universities that have received high-performance computing awards from the Foundation.

“In ensuring that vBNS will be available at least through March 2003, NSF is living up to its Next-Generation Internet commitments while charting the course for new research applications that capitalize on that infrastructure,” says NSF’s Strawn. “The new Information Technology Research program—begun in fiscal year 2000—has spurred an overwhelming response of proposals from the academic community, which proves that these tools have become critical to research in science and engineering.”

Research on Today’s Internet

For researchers, the expanding Internet means more—more data, more collaboration, and more complex systems of interactions. And while not every university and research institution is hooked up to the vBNS, all forms of the Internet have brought radical changes to the way research is conducted.

Ken Weiss is an anthropologist at Penn State University and an NSF-supported researcher studying the worldwide genetic variability of humans. While he is not actively seeking a hook-up to the vBNS, he says the Internet has had a significant impact on his research. For example, he uses email constantly, finding it more convenient than the phone ever was. And he has access to much more data. He can download huge numbers of gene sequences from around the world and do research on specific genes.

Weiss is an active user and enthusiast, but he does not necessarily agree that more is always better. “The jury is still out on some aspects, such as the exponential growth of databases, which may be outpacing our ability for quality control. Sometimes the data collection serves as a substitute for thought,” says Weiss.

Other disciplines are seeing an equally phenomenal surge of information. Researchers can now get many journals online when they once needed to work geographically close to a university library. The surge of data is both a boon and a problem for researchers trying to keep on top of their fields. But no one is asking to return to the pre-Internet days, and no one is expecting the information growth to end.

On a more profound level, the Internet is changing science itself by facilitating broader studies. “Instead of special interest groups focusing on smaller questions, it allows people to look at the big picture,” says Mark Luker, who was responsible for high-performance networking programs within CISE until 1997 and is now vice president at EDUCAUSE, a nonprofit organization concerned with higher education and information technology.

Page 24: America's Investment in the Future

Routers, sometimes referred to as gateways or switches, are combinations of hardware and software that convey packets along the right paths through the network, based on their addresses and the levels of congestion on alternative routes. As with most Internet hardware and software, routers were developed and evolved along with packet switching and inter-network protocols.
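As a purely illustrative sketch of the path-selection step just described (written in Python against an invented forwarding table; it is not the Fuzzball or any production router, and it ignores the congestion-based choices real routers also make), the core per-packet decision is a longest-prefix lookup:

    import ipaddress

    # Toy forwarding table: destination prefix -> outgoing link (values invented).
    FORWARDING_TABLE = {
        ipaddress.ip_network("10.1.0.0/16"): "link-to-regional-network",
        ipaddress.ip_network("10.1.5.0/24"): "link-to-campus-network",
        ipaddress.ip_network("0.0.0.0/0"):   "default-link-to-backbone",
    }

    def next_hop(destination: str) -> str:
        """Choose the most specific (longest) prefix that contains the address."""
        addr = ipaddress.ip_address(destination)
        matches = [net for net in FORWARDING_TABLE if addr in net]
        best = max(matches, key=lambda net: net.prefixlen)
        return FORWARDING_TABLE[best]

    print(next_hop("10.1.5.20"))   # the more specific campus prefix wins
    print(next_hop("192.0.2.7"))   # nothing else matches, so use the default route

Real routers repeat this decision millions of times per second and also exchange the table entries themselves with neighboring routers.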

NSFNET represented a major new challenge, however, because it connected such a diverse variety of networks. The person who did more than anyone else to enable networks to talk to each other was NSF grantee David L. Mills of the University of Delaware. Mills developed the Fuzzball software for use on NSFNET, where its success led to ever broader use throughout the Internet. The Fuzzball is actually a package comprising a fast, compact operating system, support for the DARPA/NSF Internet architecture, and an array of application programs for network protocol development, testing, and evaluation.

Why the funny name? Mills began his work using a primitive version of the software that was already known as the “fuzzball.” Nobody knows who first called it that, or why. But everyone appreciates what it does.

Fuzzball: The Innovative Router

Page 25: America's Investment in the Future

16 — National Science Foundation

By “special interest groups,” he means the more traditional, individualistic style of science where a researcher receives a grant, buys equipment, and is the sole author of the results. The current trend is for multiple investigators to conduct coordinated research focused on broad phenomena, according to Tom Finholt, an organizational psychologist from the University of Michigan who studies the relationship between the Internet and scientists.

This trend, Finholt and others hasten to add, has existed for a long time, but has been greatly enhanced by the Internet’s email, Web pages, and electronic bulletin boards. In addition, formal collaboratories—or virtual laboratories of collaborators—are forming around the globe. The Space Physics and Aeronomy Research Collaboratory (SPARC) is one of these. Developed as the Upper Atmospheric Research Collaboratory (UARC) in Ann Arbor, Michigan, in 1992 and focused on space science, this collaboratory has participants at sites in the United States and Europe. Scientists can read data from instruments in Greenland, adjust instruments remotely, and “chat” with colleagues as they simultaneously view the data.

“Often, space scientists have deep but narrow training,” says Finholt. SPARC allows them to fit specialized perspectives into a bigger picture. “Space scientists now believe they have reached a point where advances in knowledge will only be produced by integrating information from many specialties.”

Furthermore, the collaboration is no longer as physically draining as it once was. Now, scientists like Charles Goodrich of the University of Maryland and John Lyon of Dartmouth College can continue collaborating on space-weather research, even when one person moves away. While Goodrich admits that the work might be done more easily if it could be done in person, he is sure that both the travel budget and his family life would suffer if he tried. “You can put your data on the Web, but not your child,” he quips.

Expectations for the Internet of Tomorrow

If the past is a guide, the Internet is likely to continue to grow at a fast and furious pace. And as it grows, geographic location will count less and less. The “Information Superhighway” is not only here, it is already crowded. As Luker says, it is being divided, as are the highways of many cities, allowing for the equivalent of HOV lanes and both local and express routes. The electronic highway now connects schools, businesses, homes, universities, and organizations.

And it provides both researchers and business leaders with opportunities that seemed like science fiction no more than a decade ago. Even now, some of these high-tech innovations—including virtual reality, computer conferencing, and tele-manufacturing—have already become standard fare in some laboratories.

Tele-manufacturing allows remote researchers to move quickly from computer drawing boards to a physical mock-up. At the San Diego Supercomputer Center (SDSC), the Laminated Object Manufacturing (LOM) machine turns files into models using either plastic or layers of laminated paper. The benefits are especially pronounced for molecular biologists who learn how their molecules actually fit together, or dock. Even in a typical computer graphics depiction of the molecules, the docking process and other significant details can get lost among

A networked virtual laboratory allows scientists to work remotely on a design project. This virtual laboratory and lab worker were created by specialists at the University of Illinois at Chicago.

Page 26: America's Investment in the Future

The Internet — 17

the mounds of insignificant data. SDSC’s models can better depict this type of information. They are also relevant to the work of researchers studying plate tectonics, hurricanes, the San Diego Bay region, and mathematical surfaces.

To make the move from the virtual to the physical, researchers use the network to send their files to SDSC. Tele-manufacturing lead scientist Mike Bailey and his colleagues then create a list of three-dimensional triangles that bound the surface of the object in question. With that information, the LOM builds a model. Researchers can even watch their objects take shape. The LOMcam uses the Web to post new pictures every forty-five seconds while a model is being produced.
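The “list of three-dimensional triangles” works much like the ASCII facet files still used for rapid prototyping. The short Python sketch below (the file name and the single triangle are invented, and this is not SDSC’s actual software) shows what writing such a list can look like:

    # Each triangle bounding the object's surface is three (x, y, z) corner points.
    # A single made-up facet stands in for the thousands a real model would need.
    triangles = [
        ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)),
    ]

    def write_facets(path, tris):
        """Write triangles in an ASCII, STL-like format a model builder can read."""
        with open(path, "w") as f:
            f.write("solid model\n")
            for a, b, c in tris:
                f.write("  facet normal 0 0 0\n    outer loop\n")
                for x, y, z in (a, b, c):
                    f.write(f"      vertex {x} {y} {z}\n")
                f.write("    endloop\n  endfacet\n")
            f.write("endsolid model\n")

    write_facets("virus_model.stl", triangles)   # hypothetical output file name

The machine then slices that triangle surface into thin layers and builds the physical model layer by layer.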

“We made it incredibly easy to use so that people who wouldn’t think about manufacturing are now manufacturing,” says Bailey. For some researchers, the whole process has become so easy that “they think of it no differently than you do when you make a hard copy on your laser printer,” he adds. SDSC’s remote lab has moved out of the realm of science fiction and into the area of everyday office equipment.

While other remote applications are not as far along, their results will be dramatic once the bugs are ironed out, according to Tom DeFanti of the University of Illinois at Chicago and his colleagues. DeFanti and many others are manipulating the computer tools that provide multimedia, interaction, virtual reality, and other applications. The results, he says, will move computers into another realm. DeFanti is one of the main investigators of I-WAY, or the Information Wide Area Year, a demonstration of computer power and networking expertise. For the 1995 Supercomputer Conference in San Diego, he and his colleagues, Rick Stevens of the Argonne National Laboratory and Larry Smarr of the National Center for Supercomputing Applications, linked more than a dozen of the country’s fastest computer centers and visualization environments.

The computer shows were more than exercises in pretty pictures; they demonstrated new ways of digging deeply into the available data. For example, participants in the Virtual Surgery demonstration were able to use the National Medical Library’s Visible Man and pick up a “virtual scalpel” to cut “virtual flesh.” At another exhibit, a researcher demonstrated tele-robotics and tele-presence. While projecting a cyber-image of himself into the conference, the researcher worked from a remote console and controlled a robot who interacted with conference attendees.

Applications such as these are just the beginning, says DeFanti. Eventually the Internet will make possible a broader and more in-depth experience than is currently available. “We’re taking the computer from the two-dimensional ‘desktop’ metaphor and turning it into a three-dimensional ‘shopping mall’ model of interaction,” he says. “We want people to go into a computer and be able to perform multiple tasks just as they do at a mall, a museum, or even a university.”

NSF Directorate for Computer and Information Science and Engineering www.cise.nsf.gov

National Computational Science Alliance www.access.ncsa.uiuc.edu/index.alliance.html

National Partnership for Advanced Computational Infrastructure www.npaci.edu

Pittsburgh Supercomputing Center www.psc.edu

vBNS (very high performance Backbone Network Service) www.vbns.net

Defense Advanced Research Projects Agency www.darpa.mil

25th Anniversary of ARPANET, Charles Babbage Institute at University of Minnesota www.cbi.umn.edu/darpa/darpa.htm

To Learn More

Page 27: America's Investment in the Future

Advanced Materials: the stuff dreams are made of

Page 28: America's Investment in the Future

Materials science and engineering explore the basic structure and properties of matter, down to the molecular, atomic, and even subatomic levels. For fifty years, NSF-supported researchers have been unlocking the potential of materials ranging from ordinary rubber to ultra-high-density magnetic storage media. The results are all around us.

Page 29: America's Investment in the Future

NSF funds a broad array of research to discover new materials and processing methods and enhance knowledge about the structures and functions of materials. A high priority for NSF since the 1950s, materials research has led to countless innovations that now pervade everyday life. Hand-held wireless cellular telephones and oxygen-sensing anti-pollution devices in automobiles number among the breakthroughs. NSF supports both individual investigators and collaborative centers at universities, which bring together materials scientists, including engineers, chemists, physicists, biologists, metallurgists, computer scientists, and other researchers to work on projects whose commercial potential attracts significant funding from industry as well as government. Future discoveries in NSF-supported materials research laboratories will transform life in ways we cannot yet imagine. Semiconductor substrates whose storage capacity is hundreds of times greater than the current industry standard; artificial skin that the body accepts as its own; and the remarkable buckyball, a recently discovered form of carbon with unprecedented strength and hundreds of potential uses ranging from spaceships to pharmaceuticals—all these materials and more will change the way we live and work.

Page 30: America's Investment in the Future

Advanced Materials — 21

From Craft to Science in Two Centuries

What determines a civilization’s ability to move forward? In large measure, it is mastery over materials. The key indicators of progress—military prowess; the ability to produce goods; advances in transportation, agriculture, and the arts—all reflect the degree to which humans have been able to work with materials and put them to productive use.

Humans have progressed from the Stone Age, the Bronze Age, and the Iron Age to the Silicon Age and the Age of Materials Science. Scientists and engineers have achieved mastery over not only silicon, but also over glass, copper, concrete, alloys or combinations of aluminum and exotic metals like titanium, plastics, and wholly new “designer materials” that are configured one atomic layer at a time.

It may seem hard to believe, but it has been only in the last 200 years that we humans have understood elemental science well enough and had the instrumentation necessary to go beyond fabrication—taking a material more or less in its raw form and making something out of it. Once we began to explore the basic structure and properties of materials, a wealth of discoveries ensued:

• Ceramics with distinctive electrical properties that make it possible to miniaturize wireless communications devices ranging from cellular telephones to global positioning technologies.

• A totally new family of materials called organic metals: conductive polymers (compounds assembled like chains, with numerous units linked together to form a whole) that are soluble, can be processed, and whose potential applications include “smart” window coatings with optical and transparency properties that can be changed electrically.

• An optic layer that fits over liquid crystal displays to maintain high contrast even when the display is viewed from an angle—now used in instrument panels of military and commercial aircraft.

• Nonlinear optical crystals of lithium niobate, a unique combination of materials that is ideal for many laser applications.

• Artificial skin that bonds to human tissue so successfully that many burn victims now heal with a fraction of the scarring that once was considered inevitable.

Microprocessors such as this one, which contains more than one million transistors, have led to advances in all areas of science and engineering. NSF support has enabled researchers to create computer chips that are incredibly tiny and can perform complex computations faster than the blink of an eye.

Page 31: America's Investment in the Future

22 — National Science Foundation

NSF-funded research played a pivotal role in all of these and many other innovations. Through support of individual researchers and multidisciplinary centers around the country, NSF is fueling a vast number of diverse projects in materials science and engineering. Undertaken for the purpose of advancing knowledge, many materials science projects also have industry co-sponsors who eagerly anticipate commercialization of the results.

A Never-Ending Search for the New and Useful

Materials research grew out of a union among physicists, chemists, engineers, metallurgists, and other scientists, including biologists. Today, the field has a body of literature and a research agenda of its own. NSF supports both experimental and theoretical research with three primary objectives: to synthesize novel materials with desirable properties, to advance fundamental understanding of the behavior and properties of materials, and to develop new and creative approaches to materials processing.

Materials have been high on NSF’s research agenda since the earliest days. In the 1950s, NSF built on the momentum of the World War II effort by making grants to the University of Akron to help researchers with their work on durable forms of rubber that could withstand elevated temperatures and pressures. Rubber belongs to the class of materials known as polymers, gigantic molecules made up of single units, or monomers, that link together in chains of varying lengths. Other naturally occurring polymers include complex carbohydrates, cellulose, proteins, spider silk, and DNA. For the Akron researchers, it was a smooth transition from rubber to an ever-widening range of synthetic polymers, examples of which are found today in products ranging from clothing and packaging to automobile and aircraft components.

In the center of what is now called “Polymer Valley,” Akron University’s College of Polymer Science and Engineering houses a faculty whose names are legendary in the world of rubber and plastic. Known both for their own work—much of it funded by NSF—and for their students’ contributions to industry, these faculty members include Alan Gent and Joseph P. Kennedy. Gent, an authority on deformation and fracture processes in rubbery, crystalline, and glassy polymers, served on the space shuttle redesign committee after the 1986 Challenger disaster. Kennedy conducted some of the earliest research on vulcanized rubber and has received two American Chemical Society awards for pioneering work in polymer synthesis.

With support from NSF, researchers are attempting to create materials that will improve the manufacturing of cars, furniture, and other items. Bruce Novak, at the University of Massachusetts, is studying thin films such as the one below in an effort to test and produce such novel substances.

Page 32: America's Investment in the Future

Richard E. Smalley of Rice University, a long-time NSF grant recipient, was awarded the Nobel Prize in Chemistry in 1996 for discovery of a new class of carbon structures called fullerenes. In a talk that year before the American Institute of Chemical Engineers, he discussed the discovery and where it might lead.

“When you vaporize carbon, mix it with an inert gas and then let it condense slowly . . . there is a wonderful self-assembly process where the carbon atoms hook in together to make graphene sheets that start to curl around, their incentive being to get rid of the dangling bonds on the edge. With amazingly high probability [they] will close into a geodesic dome composed of some number of hexagons and 12 pentagons.

“It turns out this graphene sheet is pretty remarkable. It has the highest tensile strength of any two-dimensional network we know. It is . . . effectively impermeable under normal chemical conditions. Even though when we look at pictures of fullerenes we see, mostly, just a lot of hexagonal holes; if you try to throw an atom through those holes, it will generally just bounce off . . . So this graphene sheet is really a membrane, a fabric, one atom thick, made of the strongest material we expect will ever be made out of anything, which is also impenetrable. And now we realize that with pentagons and hexagons it can be wrapped continuously into nearly any shape we can imagine in three dimensions. That’s got to be good for something.”

—Richard E. Smalley, “From Balls to Tubes to Ropes: New Materials from Carbon.” Paper presented at the American Institute of Chemical Engineers, South Texas Section, January 1996.
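The “some number of hexagons and 12 pentagons” is no accident; it follows from Euler’s polyhedron formula, a standard result added here for interested readers (it is not part of Smalley’s talk). For a closed cage in which every carbon atom joins exactly three rings, with p pentagons and h hexagons:

    F = p + h,   E = (5p + 6h) / 2,   V = (5p + 6h) / 3

Substituting into Euler’s formula V - E + F = 2 gives

    (5p + 6h)/3 - (5p + 6h)/2 + (p + h) = 2,  which simplifies to  p/6 = 2,  so  p = 12.

The number of hexagons is unconstrained, which is why fullerenes come in so many sizes, but every closed cage needs exactly twelve pentagons to curl the flat sheet shut.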

The Strongest Material Ever

Page 33: America's Investment in the Future

24 — National Science Foundation

Today’s vibrant materials science community began its ascent in 1957. After the launch of Sputnik by the Soviet Union, the Department of Defense lobbied strenuously to make space-related research and technology a national priority. The effort gave birth to Interdisciplinary Laboratories (IDLs) under the Department of Defense’s Advanced Research Projects Agency (DARPA). DARPA took a more integrated approach than did most universities at the time and brought together physicists, chemists, engineers, and metallurgists into collaborative research teams. They were encouraged to cross departmental boundaries and use systems approaches to attack complex problems of materials synthesis or processing. Graduate students trained in the IDLs accepted multidisciplinary research as the norm, which influenced the way they approached their own work.

In 1972, DARPA transferred the management of the IDLs—600 faculty members at twelve universities—to NSF. NSF’s involvement gave even stronger emphasis to the multidisciplinary team approach in the way funding opportunities were defined. “NSF makes awards to the projects that look as if they will have the most impact on science or technology, or both,” explains W. Lance Haworth, executive officer in the Division of Materials Research, which is part of NSF’s Directorate for Mathematical and Physical Sciences. “Those projects typically involve people from several disciplines. The field benefits from this approach—sparks fly at the boundaries.”

One of the best places to see “sparks fly” is at the Data Storage Systems Center, an NSF-funded Engineering Research Center (ERC) at Carnegie Mellon University in Pittsburgh and the largest academic research effort in recording technology in the United States. Engineers, physicists, chemists, and materials science researchers at the Carnegie Mellon Center are working together to increase dramatically the data storage capacity of computer systems. Among their goals, Center researchers aim to demonstrate the feasibility of 100 gigabits (100 billion bits) per square inch recording density for magnetic and optical recording systems by 2003—hundreds of times higher than the current industry standard.

The Center recently made advances toward this goal by synthesizing two new materials. Each has led to the development of high-quality magneto-optic recording media for ultra-high densities. One is an artificially structured material made of very thin layers of platinum and cobalt. The other is a magnetic oxide. Both provide dramatically improved performance over current systems. In a recent breakthrough experiment, researchers achieved recording densities of forty-five gigabits per square inch with platinum and cobalt films.

Other NSF-supported centers have also taken on the challenge of advancing high-density storage media. One is the University of Alabama’s Center for Materials for Information Technology, home to one of the twenty-nine Materials Research Science and Engineering Centers (MRSEC) that NSF supports. The MRSECs stress pioneering materials research, education, and outreach, and they foster multidisciplinary collaborations between academia and industry. One interdisciplinary team at the University of Alabama’s MRSEC is studying the

The strong magnetic properties of superconductors made it possible for researchers to develop magnets for MRI (magnetic resonance imaging) systems. NSF-funded research into superconductivity has enabled advances in MRI technology, which, in turn, have led to more accurate diagnosis of disease.


physical properties of granular films that have shown potential as future low-noise, ultra-high-density magnetic media. A second interdisciplinary team at the Alabama Center is exploring the functional limits of magnetic materials in high-speed switching. The work could lead to the development of hard disk drive heads and storage media that are capable of operating at frequencies approaching a gigahertz.

Other sparks at the boundaries have developed into industry standards. For example, work on semiconductor lasers made the photonics revolution of the last three decades possible. Photonics uses light for signaling and conducting information along a pathway (electronics uses electrons for the same purpose). Researchers at several NSF-funded centers, including the Center for Materials Science and Engineering at the Massachusetts Institute of Technology, are continuing research into photonics, a field that has already produced compact disk players, laser printers, bar code readers, and medical applications, as well as new systems for displaying information.

Triumphs in Everyday Life

NSF supported many of the pioneers whose investigative triumphs led to innovations that are now part of everyday life. One example is Art Heuer of Case Western Reserve University, whose research on transformation toughening in ceramics led to a way of producing strong ceramics capable of operating and surviving in extremely demanding environments. As ceramics cool after firing, their tiny constituent particles expand slightly and cause occasional microcracks. To reduce the risk of cracking, the particles that make up the ceramics must be extremely small—on the order of one micron. Using zirconium dioxide-based ceramics,

Heuer and others were able to prevent cracking by using appropriate processing to control the expansion of the particles during cooling. To the delight of the automotive industry, these tough ceramics, when integrated into catalytic converters, also increased gas mileage.

Many other founders of modern-day materials science have been longtime recipients of NSF support. Alan MacDiarmid of the University of Pennsylvania and Alan Heeger of the University of California at Santa Barbara are considered the fathers of conducting polymers, or synthetic metals. MacDiarmid, a chemist, and Heeger, a physicist, were the first to demonstrate that conjugated, or paired, polymers such as polyacetylene can be “doped,” or intentionally changed to the metallic state. The process of doping involves introducing into a substance an additive or impurity that

Molecular models such as this are helping researchers discover and create the next generation of superstrong materials.

Tomorrow’s Materials: Lighter, Tougher, Faster

Heavy Lifting
We are familiar with composites in recreational applications such as tennis rackets, golf clubs, and sailboat masts. These materials also bring comfort to thousands of people as prosthetic arms and legs that are much lighter than wood or metal versions. At higher performance levels, the success of satellites and stealth aircraft depends on composites.

In aircraft, weight affects every performance factor, and composites offer high load-bearing at minimal weight without deterioration at high or low temperatures. NSF-supported researchers and engineers are developing tough new materials like the resin and fiber composite used in the tail section of the Boeing 777. That composite is lighter than aluminum but far more durable and fatigue-resistant at high altitudes. Use of such advanced composites reduces the weight of an 8,000-pound tail section by 15 percent, which means designers can increase the aircraft’s payload and fuel capacity.

Meanwhile, down on the ground, advanced composites are being evaluated for use in building the U.S. infrastructure for the 21st century. In one test, researchers at the Center for High-Performance Polymeric Adhesives and Composites at Virginia Polytechnic Institute and State University have partnered with the Virginia Transportation Research Council, the state Transportation Department, the town of Blacksburg, and manufacturer Strongwell to test the long-term effectiveness of composites as an alternative to steel in bridges. (The center at Virginia Tech, one of the first Science and Technology Centers (STC) established by NSF, is transitioning to self-sufficiency after eleven years of Foundation support.) Deteriorating steel beams on the Tom’s River bridge, located on a rural road near Virginia Tech, were replaced with structural beams made out of a strong composite. The new bridge was completed in 1997, and since then Virginia Tech researchers have been closely monitoring it to determine how well the composite beams withstand the tests of time, traffic, and weather. Depending on the results of this field test and others that are planned or underway, bridges constructed of composites could become as familiar in the future as tennis rackets and aircraft made of composites are today.

Standing Up Under Stress
Designing composites is one method of fabricating novel materials with special properties. Surface engineering is another. Thermal spray processing—a group of techniques that can propel a range of materials including metals, ceramics, polymers, and composites onto substrates to form a new outer layer—has proven to be a cost-effective method for engineering surfaces that are resistant to corrosion, wear, high or low temperatures, or other stresses. Current applications include the aerospace, marine, and automobile industries, power generation, paper processing and printing, and infrastructure building.

Despite this widespread use of thermal spray processing, the underlying science was little understood until recently. That’s beginning to change, due in part to the work of researchers and students at the Center for Thermal Spray Research, the NSF-supported Materials Research Science and Engineering Center (MRSEC) based at the State University of New York (SUNY) at Stony Brook, who are studying the characteristics of various spray processes, feedstock materials, and resulting spray deposits. Their goals include the development of methodology for selecting source materials and achieving a fundamental understanding of flame-particle interactions and the physical properties of spray deposits. Earlier research by Herbert Herman, the Center’s director and professor of materials science at SUNY-Stony Brook, and his students advanced the use of plasma guns to apply coatings that protect against very high temperatures and corrosive environments. Plasma-sprayed coatings are commonly used on components for aircraft engines and gas turbines and in other areas where materials are required to function under extreme conditions.

Superconductivity We Can Live With
A superconducting material transmits electricity with virtually no energy loss. In a world where every electrical cord steals some of the current passing through it, a room-temperature superconductor could save billions of dollars. Superconducting computers could run 100 times faster than today’s fastest supercomputers.

To compare the normal electrical system with a superconductive one, imagine a ballroom filled with many dancers. In normal material, all of the dancers are moving in different directions at different times, and much of their energy is spent bumping into each other. In a superconductive material, the dancers are synchronized, moving in

Page 36: America's Investment in the Future

unison, and therefore can spend all of their energy on the dance and none on each other. The dancers represent the electrons of each material, chaotic in the normal setting and well-ordered when the material is superconductive. While the entire theory is more complicated, the overall effect is that the electrons in superconductive material move more easily through the system, without wasting energy bumping into each other.

Superconductivity, which occurs in many metals and alloys, isn’t yet in widespread use, however. For most of the 20th century, the phenomenon required very cold temperatures. Superconductivity was first observed by Dutch physicist Heike Kamerlingh Onnes in 1911 when he cooled mercury down to -452°F, a few degrees above absolute zero. Until the mid-1980s, commercial superconductors usually used alloys of the metal niobium and required expensive liquid helium to maintain the temperature of the material near absolute zero. The need for expensive refrigerants and thermal insulation rendered these superconductors impractical for all but a limited number of applications. That began to change in 1986 when Alex Müller and Georg Bednorz, researchers at an IBM Research Laboratory in Switzerland, discovered a new class of ceramic materials that are superconductive at higher temperatures. So far, materials have been known to reach the superconductive state at temperatures as high as -209°F, making it possible to use liquid nitrogen coolant, a less costly alternative to liquid helium.

Since the mid-1980s, much research has focused on these so-called high-temperature superconductors. The new superconducting ceramics are hard and brittle, making them more difficult than metal alloys to form into wires. An interdisciplinary team at the NSF-supported MRSEC on Nanostructured Materials and Interfaces, based at the University of Wisconsin-Madison, is focusing on understanding the properties of the grain boundaries of high temperature superconductors. The center’s research could lead to better materials processing and the development of a new generation of superconductors for high current and high magnetic field technology.

Even as research continues, superconductors are being used in a number of fields. One of the more visible is medicine. Superconductors have strong magnetic characteristics that have been harnessed in the creation of magnets for MRI (magnetic resonance imaging) systems. An MRI takes images of a patient by recording the density of water molecules or sodium ions within the patient and analyzing the sources. When used for brain scans, this technique allows clinicians to identify the origin of focal epilepsy and to pinpoint the location of a tumor before starting surgery. Similar magnetic resonance systems are used in manufacturing to test components for cracks and other defects.

One of the preeminent facilities for researchers and engineers to test superconductivity and conduct other materials research is the National High Magnetic Field Laboratory (NHMFL), a unique laboratory funded by NSF’s Division of Materials Research and the state of Florida and operated as a partnership between Florida State University, the University of Florida, and the Los Alamos National Laboratory in New Mexico. Since it was established in 1990, the NHMFL has made its state-of-the-art magnets available to national users for research in a variety of disciplines including condensed matter physics, chemistry, engineering, geology, and biology, as well as materials science. The NHMFL features several of the world’s most powerful magnets, including a hybrid magnet, developed jointly with the Massachusetts Institute of Technology, that delivers continuous magnetic fields of 45 tesla, which is about one million times the Earth’s magnetic field. The 45 tesla hybrid consists of two very large magnets. A large resistive magnet (electromagnet) sits at the center of a huge superconducting magnet, which forms the outer layer and is the largest such magnet ever built and operated to such high field. The hybrid’s record-setting constant magnetic field strength gives researchers a new scale of magnetic energy to create novel states of matter and probe deeper into electronic and magnetic materials than ever before.
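A quick editorial check of that comparison, using the commonly cited value of roughly 50 microtesla for Earth’s surface field (a figure not given in the original text):

\[
\frac{45\ \mathrm{T}}{5 \times 10^{-5}\ \mathrm{T}} = 9 \times 10^{5} \approx 10^{6},
\]

consistent with “about one million times the Earth’s magnetic field.”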

The NHMFL’s 45 tesla magnet is cooled to within a few degrees of absolute zero using a superfluid helium cryogenic system. The discovery of superfluid helium-3 was made in 1971 by NSF-funded researchers at Cornell University who found that, at extremely low temperatures, the rare isotope helium-3 has three superfluid states, where the motion of atoms becomes less chaotic. This discovery by David Lee, Douglas Osheroff, and Robert Richardson led to greatly increased activity in low temperature physics and furthered studies of superfluidity. Lee, Osheroff, and Richardson received the 1996 Nobel Prize in Physics for their contributions to the field.


produces a specific and deliberate change in the substance itself. Their work stimulated research worldwide on metallic organic polymers; applications include rechargeable batteries, electromagnetic interference shielding, and corrosion inhibition. Heeger, MacDiarmid, and Japanese researcher Hideki Shirakawa were awarded the 2000 Nobel Prize in Chemistry for the discovery and development of conductive polymers. Another longtime NSF grantee is Richard Stein, who established the highly respected Polymer Research Institute at the University of Massachusetts at Amherst and is known for developing unique methods for studying properties of plastic films, fibers, and rubbers.

One of NSF’s best known principal investigators is Richard Smalley of Rice University, who in 1985 discovered a new form of carbon with astounding properties and potential for useful applications. The Buckminsterfullerene, named for the American architect R. Buckminster Fuller, is a hollow cluster of 60 carbon atoms that resembles one of Fuller’s geodesic domes. It is the third known form of pure carbon, the first two being graphite and diamond, and is the most spherical and symmetrical large molecule known to exist. “Buckyballs,” for which Smalley and his colleagues Harold W. Kroto and Robert F. Curl received the Nobel Prize in Chemistry in 1996, are exceedingly rugged and very stable, capable of surviving the temperature extremes of outer space. Numerous applications have been proposed, including optical devices, chemical sensors and chemical separation devices, batteries, and other electrochemical applications such as hydrogen storage media. In addition, medical fields are testing water-soluble buckyballs, with very promising results. The soccer-ball-shaped form of carbon has been found to have the potential to shield nerve cells from many different types of damage including stroke, head trauma, Lou Gehrig’s disease, and possibly Alzheimer’s disease.

Designer Molecules Reach New Heights

Smalley and his colleagues discovered fullerenes serendipitously while exploring the basic structure and properties of carbon. In contrast, other NSF-supported investigators deliberately set out to create novel materials with desirable properties. An example is Samuel Stupp, currently Professor of Materials Science at Northwestern University, whose successes while he was at the University of Illinois at Urbana-Champaign were described by writer Robert Service in the April 18, 1997 issue of Science magazine.

“Living cells are masters of hierarchical building. For much of their molecular architecture, they first string together amino acids into proteins, then assemble proteins into more complex structures. Chemists have been working to imitate this skill, in the hope of making new materials tailored right down to the arrangement of molecules. Researchers at [the University of Illinois] report taking this assembly process to a new level of sophistication, creating molecules that assemble themselves over several size scales, first forming clusters, then sheets, and, ultimately, thick films. Because the building-block molecules are all oriented in the same direction, the films’ properties mirror those

Nobel laureates Robert Curl and Richard Smalley pose with their buckyball models. This superstrong carbon molecule may have applications to chemistry, medicine, and manufacturing.


of the individual molecules, yielding a bottom surface that’s sticky and a top that’s slick. This property could make the films useful for everything from anti-icing coatings on airplane wings to anti-blood-clot linings for artificial blood vessels . . .”

More recently, Stupp and his research team have had success using molecular self-assembly to change the properties of polymers. Their self-assembly method has potential for producing extremely strong polymers and polymers with improved optical properties, and it could impact such diverse fields as the plastics industry, medicine, and optical communications.

Other advances in new materials are coming out of NSF-supported basic research in the field of condensed-matter physics. Researchers at the NSF-funded centers at the University of Chicago and Cornell University are investigating the fundamental physical structure and properties of material when it is placed under extreme conditions—such as low temperature, high pressure, and high magnetic fields. Investigators look at the novel compositions and structures with extraordinary electrical and optical properties, including metals, insulators, semiconductors, crystals, and granular material. They also learn to control that structure—for example, moving electrons around on the surface of the material. Among other applications, this work will be important as engineers work at creating ultra-high performance computer chips.

Other researchers at Michigan State University and Northwestern University are also looking at solid-state chemistry and have synthesized metals with highly efficient thermoelectric properties—that is, the ability to generate electricity when junctions between the metals are maintained at different temperatures. Thermoelectric materials already are used in space applications, but as they improve they may be useful in environmentally friendly refrigeration, thermal suits for diving, and cooling systems for electronic devices.
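As general background (an editorial note, not drawn from the original text): the thermoelectric behavior described here is usually summarized by the Seebeck relation and a dimensionless figure of merit that materials researchers use to compare candidate materials,

\[
V = S\,\Delta T, \qquad ZT = \frac{S^{2}\sigma T}{\kappa},
\]

where \(S\) is the Seebeck coefficient, \(\sigma\) the electrical conductivity, \(\kappa\) the thermal conductivity, and \(T\) the absolute temperature. Efficient thermoelectric materials combine a large \(S\) and \(\sigma\) with a small \(\kappa\).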

The Healing Arts Embrace Materials Science

Recent advances in NSF-supported biomaterials research are hastening the development of innovative healing aids. Researchers at Georgia Institute of Technology, California Institute of Technology, and Massachusetts Institute of Technology (MIT) are working with physicians and biological specialists to develop polymer composites for patching wounds, biocompatible casings for cell transplants, scaffolds that guide and encourage cells to form tissue, bioreactors for large-scale production of therapeutic cells, and experimental and theoretical models that predict behavior of these materials in vivo. Biomaterials have already been developed to block unwanted reactions between transplanted cells and host tissue and to help prevent scarring during healing. Closest to commercialization is a polymeric material, synthesized at MIT, to which biological cells can adhere. Because the human body accepts biological cells while it might reject the overlying synthetic material, this breakthrough makes possible the development of inexpensive multilayer materials that can promote healing, act as artificial skin, or temporarily replace connective tissue until the body can produce natural tissue to complete the healing process.

Another NSF-supported technology for skin replacement, developed by Ioannis V. Yannas of MIT and his colleagues, received FDA approval in 1996. The Yannas technology addresses the challenge of treating severe burns that result in the loss of dermis, a layer about two millimeters thick that lies beneath the epidermis and does not regenerate when damaged. Traditionally, patients with such severe burns receive skin transplants from sites elsewhere on their bodies, a method that results in scarring. The new technology


involves collagen taken from animal tendons. Collagen is part of the structural scaffolding in mammals (analogous to cellulose in plants) that allows tissues to maintain their shape. This collagen is chemically bonded with glycosaminoglycan (GAG) molecules, from animal cartilage, to create a simple model of the extracellular matrix that provides the basis for a new dermis. The collagen-GAG combination “makes a simple chemical analog of the matrices in our own tissues,” Yannas explains. The patient’s own cells synthesize a new dermis at the same time that the scaffold is being broken down. Epidermis then grows naturally over the new dermis, unless the wound area is especially large. Patients end up almost completely free of disfiguring scars. The new skin also grows as the patients do, an important consideration for children who have been burned.

Materials for a Small Planet

A number of NSF-supported investigators are looking for more environmentally benign substitutes for chemically synthesized materials currently in use. Dragline silk from the orb-weaving spider Nephila clavipes is one of the most promising new biomolecular materials, thanks to the silk’s great strength and flexibility—greater even than the lightweight fiber used to reinforce bulletproof

helmets. Also attractive is the environmentally friendly process used to make the silk, which the spider spins from a water-based solution. Intrigued by this spider, which actually makes seven different types of silk, Lynn Jelinski, then at Cornell University and currently chemistry professor and Vice Chancellor for Research and Graduate Studies at Louisiana State University, had a vision that high-performance, renewable, silk-like polymers eventually can be made using the tools of biotechnology. She thinks scientists may be able to synthesize the key spider genes and insert them in plants, which then would express the protein polymers. The resultant materials might be used in products ranging from reinforced tennis rackets to automobile tires.

Discoveries of new materials lead to new questions, the answers to which create opportunities to find still more new materials. An example of this cycle is the work of Donald R. Paul of the University of Texas at Austin, an authority on the ways in which polymers interact when blended. Polymer blends are a powerful way of enhancing toughness or otherwise tailoring the performance of a given material. The information generated by Paul’s research may lead to the development of high-performance polymeric alloys that could be used to replace metal components in automobiles. These lightweight and easy-to-fabricate alloys could help create vehicles that have greater fuel efficiency and produce fewer emissions.

NSF-funded researcher Lynn Jelinski at Cornell University is using spiders’ silk as a model for creating an incredibly strong and resilient polymer that will have a variety of practical applications.


At the same time, materials synthesis research is being used to investigate new metal alloys. To create these alloys, researchers must learn about the chemistry of the alloy, the microstructure basic to the alloy, and its macroscopic behavior. NSF-funded researchers, including those at the University of Alabama at Birmingham, are investigating how to control the alloy composition. Some of the alloys are thin films that will find their way into the electrical industry. Others may be used in newly designed vehicles. These alloys are not only stronger and lighter than their predecessors, but also more resistant to stress and fatigue, producing a more fuel-efficient, longer-lasting vehicle.

In research supported jointly by NSF and the U.S. Environmental Protection Agency, an environmentally benign method of polymer synthesis was discovered using liquid carbon dioxide in place of toxic volatile organic solvents. The work by Joseph DeSimone, professor of chemistry at the University of North Carolina at Chapel Hill and professor of chemical engineering at North Carolina State University, and his graduate students received one of Discover magazine’s 1995 Awards for Technological Innovation. The discovery led DeSimone and his colleagues to patent an environmentally friendly process for dry cleaning clothes that uses carbon dioxide instead of perchloroethylene, a highly toxic organic solvent in wide use throughout the industry. As the lead principal investigator for the NSF-supported Science and Technology Center for Environmentally Responsible Solvents and Processes, DeSimone continues to advance research into environmentally safe solvents. Meanwhile, other work in polymers focuses on finding ways to use plastics in place of silicon as the base material of microcircuits.

“Materials research is pushing the edge of the technologies of a whole array of societal systems,” said NSF Deputy Director Joseph Bordogna in an interview for NSF’s publication, Frontiers. “It’s a very powerful catalyst for innovation. As new materials become available and processable, they will make possible improvements in the quality of life. And that’s the heart of the leadership issue and the competitiveness issue, isn’t it? That’s the future.”

To Learn More

NSF’s Division of Materials Research (DMR)
www.nsf.gov/mps/dmr/start.htm

NSF Materials Research Science and Engineering Centers (MRSEC)
www.nsf.gov/mps/dmr/mrsec.htm

NSF Engineering Research Centers (ERC)
www.eng.nsf./eec/erc.htm

NSF Science and Technology Centers (STC)
www.nsf.gov/od/oia/programs/stc/start.htm

Department of Defense Advanced Research Projects Agency (DARPA)
www.darpa.mil

National High Magnetic Field Laboratory
www.magnet.fsu.edu

The Nobel Prize in Chemistry 1996
www.nobel.se/chemistry/laureates/1996/index.html

The Nobel Prize in Physics 1996
www.nobel.se/physics/laureates/1996/index.html


Education
lessons about learning


Parents, educators, and students aren’t the only ones with an active stake in the nation’s schools. The National Science Foundation understands that discoveries arise from acquired knowledge, and that all citizens—not just scientists and engineers—benefit by learning the scientific and technical basics behind the major achievements of modern civilization.


Education—When it comes to scientific progress, classrooms are just as important as laboratories. That’s why nearly 20 percent of NSF’s budget is devoted to improving students’ grasp of science, mathematics, engineering, and technology—at all levels, pre-kindergarten to postdoctoral. From the agency’s Sputnik-inspired reforms of science and mathematics curricula to today’s basic research into human acquisition of knowledge, NSF has devoted itself to answering two fundamental questions: How do students learn and what should they know?


The Evolution of Education

Reading, writing, and arithmetic. Rote memorization and drills. American children in the first half of the twentieth century were taught according to the philosophy that the mind was a muscle, which could best be strengthened by lectures and the mental equivalent of push-ups. By the 1950s, critics complained that schools had become little more than vocational sorting stations, sending this child into shop class, that child into family life class, and preparing relatively few for the rigors of college.

All that changed in October 1957 with the Soviet Union’s launch into orbit of Sputnik, the first-ever artificial satellite. The Russian achievement served as a wake-up call for Americans who realized they needed to improve U.S. science and mathematics education to compete in a science- and technology-driven world. The space race was on, and only a highly educated group of homegrown scientists and engineers could get Americans to the moon ahead of the Russians. For the first time, education in the United States became a major federal imperative.

The government, perceiving a national crisis, turned to the young National Science Foundation, which had established strong ties to the country’s research universities. With the National Defense Education Act of 1958, Congress called upon NSF to attend to kindergarten through twelfth-grade (K-12) education in mathematics and science. Later, Congress explicitly added social studies to the mandate.

Over the next twenty years, the Foundation spent $500 million on elementary and secondary school curricula and teacher development. Teams of scientists, educators, and teachers worked together to develop new curricula in physics, biology, chemistry, and mathematics. At the same time, universities held hundreds of summer programs to assist teachers in understanding and using the new materials.

Two aspects of the new curricula distinguished them from their predecessors. First, there was an emphasis on basic principles. How do waves form? What keeps molecules from flying apart? What are functions? Second, there was an assumption that students would best learn basic scientific principles by actually performing experiments rather than simply memorizing facts.

In a 1977 survey, NSF found that 41 percent of the nation’s secondary schools were using at least one form of the science curricula developed with NSF funds. In contrast, fewer than 10 percent of schools were using NSF-funded math materials, which many found confusing. Despite the partial success, Congress reined in much of the NSF curriculum effort by the late 1970s. Lawmakers’ objections to a new social science curriculum and a general lack of enthusiasm for major changes in education were largely to blame. The decline continued in the early 1980s when the administration’s goal of a smaller federal government resulted in budget cuts that hit NSF’s education programs particularly hard.

But then the tide turned again. In 1983, a federally commissioned report entitled A Nation at Risk warned of “a rising tide of mediocrity” in the nation’s schools that was serving to erode America’s leadership in the world economy. The report triggered fresh calls for the setting of national or at least state-level education standards and sent NSF back into the K-12 education arena with renewed vigor.


New Approaches for New Times

Today, NSF is once again an influential player in the search for better instructional materials and methods, largely through the efforts of its Directorate for Education and Human Resources. The current programs embody what was learned from the successes and disappointments of earlier years, and also reflect the importance of science and technology to the U.S. economy—and hence, to the country’s workforce and citizenry.

A defining feature of today’s curriculum reform movement is the emphasis on all students. Regardless of whether they intend to pursue science-related careers or even to go to college, all students should receive quality mathematics and science instruction before they leave high school. And at NSF, “all students” means everyone, including girls and women, persons with disabilities, and ethnic minorities—groups that remain underrepresented in the nation’s science and engineering communities.

Of course, what constitutes a good way to learn and teach science and mathematics remains a matter of some debate, as evidenced by the current effort to develop and implement standards. State and local districts now have two sets of national standards to guide them: the 1989 standards put forth by the National Council of Teachers of Mathematics and the 1996 National Science Education Standards established by the National Research Council. Both sets of standards grew out of long processes, including in-depth consultations with the science and mathematics communities, with teachers and educational researchers, and with others concerned about the issue.

NSF-funded curriculum development teams are also drawn from a broad spectrum of the science, mathematics, and educational communities. In their standards-based approaches, these teams are moving beyond the kind of learning-by-doing that asks students to conduct experiments or manipulate mathematical equations with the simple goal of getting an already-determined result—doing things the “right” way to get the “right” answer. In the new “inquiry-based, problem-oriented” curricula, students become participants in discovery by using fact-based knowledge to think through open-ended problems in a variety of ways.

Making Mathematical Connections

Take Connected Mathematics, for example, a middle-school instructional series developed in part with NSF funds. In 1999, these materials were being used in more than 2,200 school districts across the country. Connected Mathematics was judged the best of four—and only four—sets of middle-school mathematics materials receiving an excellent rating from Project 2061, a curriculum reform effort of the American Association for the Advancement of Science. The three other top-rated instructional materials—Mathematics in Context, MathScape, and Middle Grades Math Thematics—were also developed with NSF funds. None of these materials, however, are as yet in wide use.

What’s so different about Connected Mathematics and the other top-rated materials? Ask Linda Walker, a teacher at Cobb Middle School in Tallahassee, Florida, who participated in the development of Connected Mathematics and whose school district implemented the series with the help of an NSF grant.

“When I went to school,” she says, “there was one way to do a mathematics problem—the teacher’s way. He’d show you how to work the problem, repeat it, and move on. With Connected Mathematics, I set up a problem and then let the kids explore for answers. They gather data, share ideas, look for patterns, make conjectures, develop strategies, and write out arguments to support their reasoning. Instead of getting bored, they’re getting excited.”


Excellence in Higher Education

The longest running education program offered by the National Science Foundation is the Graduate Research Fellowship, which provides funds and national recognition to university students working toward careers in science or engineering. In 1952, NSF’s first fully budgeted year, almost half of the agency’s $3.5 million appropriation—$1.5 million—was disbursed in the form of research fellowships to 573 graduate students, 32 of them women.

From the start, awardees considered the NSF fellowships prestigious and career-making. More than one member of the class of 1952 has kept the telegram that brought the news of his or her good fortune. World-famous biologist and Pulitzer Prize-winning author Edward O. Wilson recalls that “the announcements of the first NSF pre-doctoral fellowships fell like a shower of gold on several of my fellow [Harvard] students in the spring of 1952. I was a bit let down because I wasn’t amongst them.” Wilson’s spirits lifted the following Monday when he got his own, albeit belated, notice.

Of the thousands of young scientists who have received fellowships over the years, many have made significant contributions in a wide variety of fields and eighteen have gone on to win Nobel Prizes. Says Donald Holcomb, professor emeritus of physics at Cornell University and a 1952 graduate research fellowship recipient, “I do think it is fair to say that the coincidence of the career spans of me and my contemporaries with the life span of the National Science Foundation created a symbiosis which has profited both us as individuals and American science at large.”

Today graduate research fellows—and, in fact, all science and engineering students—have a large number of superior colleges and universities that they can attend, almost anywhere in the United States. But it wasn’t always so. NSF’s science development programs—better known as the Centers of Excellence—were created in 1964 in response to several national concerns: the growing population of college and university students, the explosion of scientific and engineering knowledge, and the fact that the country’s top-notch research schools were concentrated in only a few regions of the country. Through such programs as the University Science Development Grants, the Departmental Science Development Grants, and Special Science Development Grants, NSF helped degree-granting institutions all around the United States strengthen the quality of their science-related research and education activities during the 1960s and 1970s.

“NSF provided the seed money for the development of institution-wide master plans, and also helped to fund the implementation of those plans,” says Judith Sunley, NSF interim assistant director for Education and Human Resources. “Then the universities took over, providing the funds to maintain excellence over the long haul.”

The first grants were announced by President Lyndon Johnson in 1965. By 1972, when the last science development awards were made, NSF had distributed $233 million in 115 grants to 102 public and private institutions in forty states and the District of Columbia. Institutions used the grants primarily to recruit strong faculties, support postdoctoral scientists and graduate students, acquire sophisticated equipment and materials, and construct, modernize, and renovate laboratories, libraries, and other special facilities for research and teaching.

NSF’s Centers of Excellence program resulted in stronger science and engineering departments across the United States. The program’s impact continues to be felt by succeeding generations of science and engineering students.

(Information on the graduate research fellows’ Class of 1952 is based on material gathered by William A. Blanpied, NSF’s Division of International Programs.)


In one recent eighth-grade class, Walker asked her students to redesign a brand-name cereal box to use less cardboard while putting the same amount of cereal in the same number of boxes on a grocery shelf. There was no single right answer—the goal was just to come up with a more environmentally friendly box design and, as a result of the exercise, learn about the ratio of surface area to volume.
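A rough sketch of the comparison at the heart of the exercise (an editorial illustration in Python, not part of the Connected Mathematics materials; the box dimensions are hypothetical): boxes of equal volume can require quite different amounts of cardboard.

    # Compare the cardboard (surface area) needed by two box shapes
    # that hold the same volume of cereal. All dimensions are hypothetical.
    def surface_area(length, width, height):
        """Total surface area of a closed rectangular box."""
        return 2 * (length * width + width * height + length * height)

    def volume(length, width, height):
        return length * width * height

    original = (7.5, 2.0, 12.0)   # a tall, thin box: 180 cubic inches
    redesign = (6.0, 5.0, 6.0)    # closer to a cube: also 180 cubic inches

    for name, dims in [("original", original), ("redesign", redesign)]:
        print(name, "volume:", volume(*dims), "in^3,",
              "cardboard:", surface_area(*dims), "in^2")

A squarer box of the same volume uses noticeably less cardboard, which is the surface-area-to-volume point the exercise is after.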

Walker says she could have had her students just crunch out formulas, but too much would have been lost in the process. “The importance of a student’s exploration is that you, as the teacher, can see what they’re really understanding,” she says. “Getting a correct answer is only one goal. Are they comfortable with fractions or do they avoid them in their calculations? What do their guesses tell you about what they know and don’t know?”

Science Instruction Changes Course

As for science courses, among the many inquiry-based curricula developed with NSF funds is Active Physics, a course for high school students that creatively organizes physics content. Usually, students study physics in a predictable way: mechanics during the fall term, waves in the winter, then electricity and magnetism in the spring. With Active Physics, students explore concepts in one of six thematic areas, such as medicine or sports.

In one classroom exercise, students draft a mock proposal to NASA under a scenario in which the space agency, as part of its plans for a moon colony, is soliciting ideas for how to encourage exercise among the colonists. The students’ challenge is to invent or modify a sport so that colonists can play it in the meager gravity of the moon’s environment. As a result, students learn about friction’s relationship to weight and discover that there is little friction on the moon. They learn why moon football might put a premium on lifting opponents out of the way rather than trying to push them, and why figure skaters would need larger ice rinks for their quintuple jumps.
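The underlying relationship—stated here in the standard introductory-physics form as an editorial illustration, not as part of the Active Physics materials—is that sliding friction scales with an object’s weight, and weight scales with the local gravitational acceleration:

\[
F_{\text{friction}} = \mu N = \mu m g, \qquad g_{\text{moon}} \approx \tfrac{1}{6}\, g_{\text{earth}},
\]

so a player of the same mass presses down with roughly one-sixth the force on the Moon, and the friction available for pushing, stopping, or turning drops by the same factor.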

“Why don’t kids like mathematics and physics but do like English and social science?” asks Arthur Eisenkraft, science coordinator for the Bedford Public Schools in New York. Eisenkraft developed Active Physics with NSF funds and the help of leading physicists, physics teachers, and science educators. “At least one reason is that something like Grapes of Wrath can spark kids to share their own experience with poverty or hopelessness. They get to contribute to the discussion, really contribute, not just . . .”—raising his hand in imitation of a student with the right answer to a math question—“. . . 4.3. With Active Physics, students never ask me, ‘why are we learning this?’ And my AP [Advanced Placement] kids get just as much out of it as my LD [learning disabled] kids.”

Most widely used middle and high school science textbooks do not yet reflect these new approaches, though a growing body of evidence suggests that they should. The Third International Mathematics and Science Study (TIMSS), conducted in 1995, involved forty-one countries at three grade levels and compared students’ grasp of mathematics and science. U.S. students scored above the international average in both mathematics and science at the fourth-grade level, dropped below average in mathematics at the eighth-grade level, and by twelfth grade were among

These high school scientists are engaged in a hands-on Active Physics exploration. The Active Physics curriculum, developed with the help of NSF funding, helps students to better understand and appreciate the unseen forces that shape our daily lives.


the worst performers in both science and math. In May 1999, however, a study involving NSF-funded curricula and teacher development efforts showed that they seemed to be making a difference. When given the physics portion of the TIMSS test, students who were learning physics with NSF-supported curricula or from teachers trained in NSF-funded projects posted scores significantly higher than U.S. students in the initial TIMSS assessment.

Curriculum reform is a work in progress. However, even the best reformulated instructional materials won’t be enough to sustain real improvement in students’ grasp of science and mathematics. Just ask their teachers.

A More Synergistic Whole

In July 1999, Gerry Wheeler, executive director of the National Science Teachers Association (NSTA) and a long-time veteran of the curriculum reform effort, stood before a crowd of teachers gathered to learn about the latest NSF-sponsored K-12 curricula.

“We’ve been saying the same thing since Sputnik,” he exclaimed. “We need inquiry-based curricula, we need to make thinking citizens of our children. But we also need to do more than just produce good material.” He pounded the lectern once or twice for emphasis, as if to mark time with the teachers’ nodding heads. “What good is the best textbook if teachers aren’t given the time, material, and support they need to prepare themselves to use it?”

One of the things NSF learned from its curriculum reform efforts in the 1960s and 1970s was that more needed to be done to prepare teachers to use new materials. Today that means setting up training opportunities that meet not just for a couple of weeks in the summer but also in the evenings or on the weekends, or even over the Internet—whatever best accommodates the teachers’ own schedules and far-flung locations. Teachers learn not just about the content of the new curriculum, but also the practical aspects of implementing it. This includes everything from new ways to assess student progress (for example, through students’ daily journals) to suggestions for gaining support from parents, colleagues, and school boards. NSF programs also encourage school districts to free up senior teachers already trained in the new curricula to coach others.

Stronger professional development for teachers and improved materials are crucial, but by themselves won’t be enough to make a major difference in the way students learn. What’s needed is a larger vision that addresses all the factors affecting the success of a student’s educational experience. At NSF, a key part of that vision can be summed up in two words: systemic reform.

The idea is simple even if the execution is not—in order for a better set of practices to take hold in a school, everything influencing the school system must be reevaluated, from parental involvement right on up to statewide laws and policies concerning education. NSF launched the Statewide Systemic Initiatives (SSI) program in 1991. In the program’s first three years, NSF provided funds to twenty-five states and Puerto Rico to help them start on systemic reform. Today, seven states and Puerto Rico are participating in a second phase of the SSI program. In addition, modified systemic approaches form the basis of the Rural Systemic Initiatives (RSI) and Urban Systemic Initiatives (USI), and the Local Systemic Change (LSC) component of NSF’s teacher enhancement programs.


These members of the Wilson High School (Rochester, New York) FAIHM team work on a robotics project. FAIHM, funded in part by NSF, is an acronym for FIRST (For Inspiration and Recognition of Science and Technology), Autodesk, Institute for Women and Technology, Hewlett Packard, and MIT. The program is designed to promote interest in science and technology that will lead high school women to careers in engineering.


Through these programs, NSF grants funds to local school systems with well-thought-out plans for how to reform K-12 science and mathematics education at the state, city, or regional level. So far, NSF has spent more than $700 million on such efforts.

How well can systemic reform work? During the 1994-95 school year, the first year that NSF funded the urban systemic program, Chicago’s school system saw significantly more of its students score above the national norm in mathematics on a commonly used assessment called the Iowa Tests of Basic Skills. What’s more, Chicago students’ performance in mathematics has increased in sixty-one out of sixty-two high schools, suggesting that improvement is occurring across the board. Similar results have been achieved in Detroit, where students from a diverse range of public schools performed significantly better on a state standards test after the Detroit Urban Systemic Program implemented sections of the Connected Mathematics curriculum. And in Dallas, the number of students passing science and mathematics Advanced Placement tests has tripled since the start of NSF systemic reform funding.

On the state level, Puerto Rico has raised its students’ achievement in science and mathematics with an innovative pyramid system that brings systemic reform to one school at a time. The NSF-supported effort, which began in 1992, has so far brought standards-based curricula into more than one-quarter of the island’s schools.

“Everybody said it was a clumsy idea because it takes so long,” Manuel Gomez, head of the Puerto Rico SSI, told a reporter in 1998. “But I said, ‘Be patient. It will work if we give it time.’”

Given the complexities, time is a critical factor to the success of any systemic reform initiative—time, and local school systems willing to commit energy and resources long after NSF’s initial support has kick-started reform.

“The underlying belief of systemic reform is that piecemeal attempts, limited by finite projects and inadequate funding, will not change the system, its culture, and its capacity to share what happens in the classroom,” says Daryl Chubin, a senior policy officer with the National Science Board, the governing body of NSF. “Change requires conviction and staying power. Nothing happens quickly.”

Infusing Education with Research

True reform at the system level requires the participation of everyone who cares about improving the way that students learn about science, mathematics, and engineering. And that includes the research community itself. Finding more ways to foster the infusion of research into education is a major NSF goal as the agency heads into the new millennium.

“If we are to succeed in making our education system truly world class,” NSF Director Rita Colwell told the U.S. House of Representatives’ science committee in April 1999, “we must better integrate our research portfolio with the education we support.”

One way NSF has been taking on this challenge is to fund programs that link ongoing research projects with K-12 students through information technologies such as the Internet. A prime example: the Albatross Project.

In states and territories around the country, NSF’s Statewide Systemic Initiatives program is successfully combining curriculum reform—including hands-on activities such as this exhibit at the San Francisco Exploratorium for students—and teacher training to improve student performance in science and mathematics.


Science for Everyone

The good news is that more women and minorities are earning undergraduate and graduate science and engineering degrees—their numbers rose as much as 68 percent from 1985 to 1995, according to recent data from a series of congressionally mandated reports prepared by NSF’s Division of Science Resources Studies.

The bad news is that they and persons with disabilities are still underrepresented when compared with the overall U.S. population of eighteen- to thirty-year-olds.

While NSF as a whole is committed to ensuring that the nation’s scientific and technical workforce is peopled by all those with gifts to contribute, this mandate is the specific mission of NSF’s Division of Human Resource Development, a branch of the Directorate for Education and Human Resources. Why is the crusade for equity allied so closely (though not exclusively) with NSF’s educational aims? Because schools are fulcrums on which a young life can turn.

For example, when Tanya Lewis entered Louisiana’s Grambling State University (GSU) as a freshman, she signed up to participate in an NSF-funded minority scholars program whose goal was to attract undergraduates to the school’s physics and chemistry departments and guide them into graduate school. She struggled with class work and with problems in her personal life; midway through, she decided to take a full semester off. Even then, the mentor who had been assigned to Lewis kept calling.

“I remember sitting in my house and thinking about what it used to be like to get up and go to school everyday and do research,” Lewis says. “I realized that I had a gift, and I missed it. I knew I wanted to spend my time doing research.”

The next semester, with her mentor’s support, she returned to GSU and graduated in 1995. The following fall, she entered graduate school.

Blinded at the age of five in a household accident, Lawrence Scadden, director of NSF’s Program for Persons with Disabilities, knows firsthand the frustrations that confront someone bucking society’s notion of who should be a scientist.

“For far too long we’ve been closing disabled people out of science and math,” says Scadden, who received his doctorate in psychology and has spent thirty years conducting research in human perception. “These attitudes—the myths and the ignorance—have created a major barrier that must be removed.”

Mentors, culturally appropriate role models, networking, quality learning materials, research fellowships, access to skilled teachers and to assistive technologies that can help students overcome impairments—these are the factors included in myriad NSF programs aimed at knocking down barriers of poverty, discrimination, and distrust.


NSF-funded research into new learning technologies, such as the virtual reality demonstration at right, gives students hands-on experience in the scientific process.

Wake Forest University biologist David Anderson is tracking albatrosses that nest on Tern Island, Hawaii, in an effort to understand (among other things) how the availability of food affects the huge seabirds’ extremely slow rate of reproduction. The birds embark on searches for food that last days and even weeks. Do the albatrosses simply fly to relatively close feeding sites and, once there, take plenty of time to gather their food? Or do they travel to remote feeding areas, pick up their food, and return immediately? Supported by NSF, Anderson has worked for years to discover why the trips take so long, using satellites to keep tabs on albatrosses fitted with miniature transmitters.

But early in his research Anderson realized that his project had applications beyond the science of albatross behavior. “It’s a perfect opportunity to engage school-age kids in science,” he says.

So in a collaboration that continues today, Anderson arranges to feed the satellite data via daily emails to middle school classes that sign up for the experiment from all over the United States. Teachers receive software and support material that help them guide their students in making sense of the birds’ movements. A related Web site provides even more information, such as weather systems that could affect flight patterns, basic facts about albatross biology, and material on the history and geography of the Northwest Hawaiian Islands. Mathematical techniques to calculate the birds’ flight distances and speed are clearly explained. The students then analyze the data in terms of the hypotheses about the birds’ food journeys.
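A minimal sketch of the kind of calculation involved (an editorial illustration in Python; the coordinates and time interval below are hypothetical, not actual Albatross Project data): two satellite fixes give a great-circle distance, and dividing by the elapsed time gives an average speed.

    import math

    def great_circle_km(lat1, lon1, lat2, lon2):
        """Distance between two latitude/longitude fixes, in kilometers,
        using the haversine formula on a spherical Earth."""
        r = 6371.0  # mean Earth radius, km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    # Two hypothetical transmitter fixes taken six hours apart.
    fix_a = (23.87, -166.28)   # near Tern Island
    fix_b = (25.10, -163.50)
    hours = 6.0

    distance = great_circle_km(*fix_a, *fix_b)
    print(f"distance: {distance:.0f} km, average speed: {distance / hours:.0f} km/h")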

“Kids need to know that scientists pose a conjecture, or hypothesis, and then collect data to try to prove or disprove the hypothesis,” says Anderson. “This project emphasizes science as a process and a tool to get reliable answers to questions. At the same time, the data help us answer basic questions about declining albatross populations worldwide.” So far the project has filled in many details about albatross behavior, including the fact that the birds can fly for hours, and maybe even days, without flapping their wings, thereby conserving energy on long-distance hunts for food.

Another example of how information technolo-gies are allowing students to perform actualresearch is the NSF-funded Hands On UniverseProject, originally developed in 1991 by astro-physicist Carl Pennypacker of the Space SciencesLaboratory at the University of California at Berkeley.As large telescopes became automated, theybegan generating huge numbers of new imagesthat needed to be analyzed. Pennypacker’s ideawas to get students involved by providing schoolswith image processing software, an archive of astro-nomical images, and related curriculum materials.

In 1995, a couple of astronomy teachers—Hughes Pack of Northfield Mount Hermon Schoolin Northfield, Massachusetts, and Tim Spuck ofOil City Area High School in Oil City, Pennsylvania—teamed up with Jodi Asbell-Clarke of TERC, anonprofit research and development organizationin Cambridge, Massachusetts, to develop a Web-based project that works in conjunction with theHands On Universe (HOU) curriculum. Their HOUAsteroid Search project allows students to down-load recent images via the Internet from an NSF-supported telescope in Chile with the specificaim of looking for previously unidentified asteroids.Over the years, students have found nearly twohundred asteroids that appear never to have beenseen before in the main belt of asteroids circu-lating through the solar system.
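
The search itself comes down to comparing exposures of the same patch of sky taken at different times: fixed stars stay put, while an asteroid shows up as a point of light that shifts between frames. The Python sketch below illustrates that basic idea with a tiny synthetic array standing in for real telescope images; it is not the HOU software, just a simplified picture of the comparison.

    import numpy as np

    def candidate_movers(frame1, frame2, threshold):
        # Pixels bright in one aligned exposure but not the other are candidate
        # moving objects; fixed stars cancel out in the difference image.
        diff = frame1.astype(float) - frame2.astype(float)
        return np.argwhere(np.abs(diff) > threshold)

    # Synthetic 5x5 "star field" with one fixed star and one source that moves.
    first = np.zeros((5, 5))
    second = np.zeros((5, 5))
    first[1, 1] = second[1, 1] = 100.0   # a star, identical in both frames
    first[2, 2] = 80.0                   # the mover in the first exposure
    second[2, 3] = 80.0                  # ...one pixel over in the second

    print(candidate_movers(first, second, threshold=50.0))
    # Flags positions (2, 2) and (2, 3): the two places the moving source appeared.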

A Lifelong Love of Science

The Informal Science Education (ISE) program, created in 1984, is one way that NSF nurtures a lifelong love of science. ISE projects include everything from film and radio to exhibits in museums and technology centers. The idea, says Hyman Field, deputy division director of the Elementary, Secondary, and Informal Education Division, is to “engage everybody from pre-kindergartners to senior citizens in activities outside the formal school system.”

About a third of ISE-supported projects involve radio, television, or film. Two particularly successful shows are aimed at young audiences. The Magic School Bus® began as a series of commercial books published by Scholastic Inc. for children of elementary school age. The series features a wacky science teacher named Ms. Frizzle who takes her class on educational field trips in her magically transformable bus. “Building on [the books],” says Field, “we supported development of a television series—one of the first animated series on the Public Broadcasting System for early elementary school kids.”

The television exposure stimulated fresh outlets for the project. A live, traveling version of The Magic School Bus® now brings fun science activities to schools, malls, and theaters. Related materials, such as videos, CD-ROMs, and teaching guides, are also available.

Older children have benefited from the televised exploits of Bill Nye, the Science Guy. A mechanical engineer who moonlighted as a stand-up comic, Nye first appeared as the Science Guy in 1987 on Almost Live!, a local version of Saturday Night Live in Seattle. Six years later, Nye and two producers had expanded the concept into the outline of a popular science show featuring Nye’s zany but educational demonstrations using inexpensive, safe household items. As with The Magic School Bus®, NSF provided the initial funding that has allowed the Science Guy to take off and succeed.

Then, in 1998, three high school students taking Pack’s astronomy class made an even more exciting discovery: a previously unknown asteroid in the Kuiper Belt, a collection of celestial objects orbiting beyond Neptune thought to be leftovers from the formation of the solar system. At the time of discovery, only about seventy-two such objects had been identified—none, until that point, by anyone other than a professional astronomer. The students—Heather McCurdy, Miriam Gustafson, and George Peterson—had become stargazers of the first order.

“They called me over to take a look at a couple of dots on an image they were analyzing,” recalls Pack of that October afternoon. “They suspected the dots were artifacts, and I agreed with them. But right below those dots was another pair of dots that made the hair on the back of my neck stand up. I recognized the signature of Kuiper Belt objects. But I was a good teacher and just took a deep breath and turned to walk away. Then one of the girls said, ‘Mr. Pack, what about these?’ They told me the dots looked like evidence of an object that was moving, and at a very great distance.”

A week later, with the help of their cohorts in Oil City, the Northfield students had done all the calculations needed to confirm their find.

Says HOU founder Carl Pennypacker, “This is a fantastic piece of science, of education, of discovery. The Northfield students’ discovery has shown that all students from a broad range of backgrounds can make solid, exciting, and inspiring scientific contributions.”

Students aren’t the only ones to benefit from direct experience with scientific research. NSF sponsors a number of programs that temporarily put K-12 teachers “in the field,” with or without their students, while also coaching the teachers on how to transfer their research experience into classroom learning. As a result, NSF-sponsored teachers are working alongside scientists in the forests of Puerto Rico and the floodplains of the Mississippi Delta, at Washington State’s Pacific Northwest National Laboratory and West Virginia’s National Radio Astronomy Observatory. Some are even going to the ends of the Earth itself.

Each year the Teachers Experiencing Antarctica and the Arctic (TEA) program sends between eight and twelve elementary and secondary teachers to research stations at or near the polar ice caps for up to eight weeks. TEA teachers have explored hydrothermal vents around the Antarctic Peninsula, pulled ice cores from the Greenland Ice Sheet, and released weather balloons at South Pole Station. Professional support abounds, both before and after the research trips. Veterans from past TEA expeditions help mentor the new recruits, who also spend time at the home institutions of their scientist-partners where they get a thorough grounding in their particular project. During their expedition or upon their return from the ice, TEA teachers receive professional help in turning their experience into classroom lessons, sharing their knowledge with other teachers back home, and even attending scientific conferences as co-presenters with other members of the polar learning community.

In 1998, as a biology teacher at Mayo High School in Rochester, Minnesota, Elissa Elliott joined a team of researchers studying microbial life at frozen Lake Bonney in the Dry Valleys region of Antarctica. She kept a daily journal, as TEA teachers are encouraged to do, and uploaded her entries along with photos to the TEA Web site maintained by Rice University. That way, her students back in Minnesota could share in her learning and excitement. Elliott was in electronic contact with more than three hundred classrooms and individuals interested in learning about Antarctic science in real time.

Programs such as Teachers Experiencing Antarctica combine research and education in an effort to improve both teacher training and student appreciation and mastery of science, math, engineering, and technology.

A New Formula for Calculus

By the 1960s, three hundred years after Gottfried Leibniz and Isaac Newton independently developed it, calculus had become a standard freshman course for students in the physical sciences and engineering. Faculty began to use grades in those courses to screen potential majors in other scientific disciplines and to weed out the less gifted students, even in majors that scarcely required calculus. That approach drew protests, particularly from students not destined for fields that required advanced training in mathematics. The fact that several colleges took an assembly line approach to the subject, grouping students in large lecture classes taught by teaching assistants, exacerbated the situation. Indeed, a high proportion of the more than half a million students enrolled in calculus courses each semester either failed or could not apply calculus concepts in later courses.

In January 1986, mathematicians from twenty-five influential colleges and universities met at Tulane University under the auspices of the Mathematical Association of America (MAA). There, they discussed better ways to give students a conceptual grasp of calculus. NSF kept in touch with the reform movement, and in October 1987 announced its Calculus Curriculum Development Program, jointly administered by the Divisions of Undergraduate Education and Mathematical Sciences.

Over the next ten years, NSF-supported reform projects eventually led to a significant change in how calculus was taught. Changes include the use of graphing calculators and computers, open-ended projects, extensive writing, more applications, and use of cooperative learning groups. NSF-funded projects have also changed the infrastructure of calculus teaching. Virtually every traditional college-level textbook has been revised in light of the reform movement. The Advanced Placement calculus outline for high school students has been overhauled, and revisions are underway on the Graduate Record Examination’s mathematics section.

“There is no question of the importance the NSF initiative has had in achieving the changes reported to date,” wrote the authors of an MAA report. “The NSF program successfully directed the mathematics community to address the task of reforming the calculus curriculum and provided coherence to those efforts.”

A substitute teacher was filling in for Elliott during her absence but, thanks to TEA’s technical support, “essentially, I was able to hold class from Antarctica,” she says. “My students and I emailed back and forth. They had a ton of questions. So much of the time, we’re teaching what is already known and the sense of discovery just isn’t there. But because I was able to pretty much communicate with them in real time, they could see that science is something that is happening right now. And that does so much more for kids than textbooks do.”

A Revolution in University Culture

As exciting and worthwhile as such programs are, of course, they reach only a small fraction of the teacher workforce. Recognizing that not all teachers can go to the field, NSF is looking for more ways to bring the field to them. One approach is NSF’s Graduate Teaching Fellows in K-12 Education program. Begun in 1999, the program aims to place graduate and advanced undergraduate science, mathematics, and engineering students into K-12 classrooms as resources for teachers and students.

A critical component of the fellowship is pedagogical training for the upper-level science students, so they will know how to transform their cutting-edge knowledge into something that younger students can understand and appreciate. Still, “the intention is not to make teachers out of scientists, although some may decide that’s what they want to do,” says NSF’s Dorothy Stout, who headed up development of the program. Rather, NSF hopes that the teaching fellows will go on to become scientists who, in turn, will act as bridges between the research and education communities by serving as resources for their local school districts.

“We want them to be well-rounded individuals,” says Stout, “who can enhance K-12 classrooms with their specialized backgrounds.”

Or as NSF Director Rita Colwell says, “We cannot expect the task of science and math education to be the responsibility solely of K-12 teachers while scientists, engineers, and graduate students remain busy in their universities and laboratories.”

A natural extension of NSF’s commitment to bringing the research and education communities together is a greater emphasis on the conduct of research into education itself. Says Colwell, “We’ve spent a lot of time focused on teaching and yet we don’t really know how people learn—how effectively a person’s learning can be enhanced, and the differences in how people learn.”

Education research emerged as a field in the 1950s and 1960s. Although it once struggled to gain the level of funding and respect afforded to other areas of scientific inquiry, the field is coming into its own as growing numbers of scientists and educators advocate research to better understand how people learn and think.

Finding out more about how children learn, and figuring out how to implement what is known about the acquisition of knowledge, are huge challenges. Recognizing the importance of this work, the U.S. government announced in April 1999 a unique collaboration among NSF, the National Institutes of Health, and the Department of Education. The goals of the new Interagency Education Research Initiative (IERI) are to meld different kinds of research in how children learn mathematics, science, and reading; to understand the implications of research for the education community, speeding the implementation of research-based instruction; and to expand the appropriate uses of technology in schools.

For example, one project funded by IERI will conduct a cross-cultural study comparing the early development of mathematical concepts and understanding in three- to six-year-old Chinese, Japanese, and American children. The project will also study how different cultures support the children’s early mathematical development in various settings: at home, in child care facilities, and in preschool. The idea is to gain insight into how best to support the growth of children’s mathematical skills prior to elementary school.

Another project funded by IERI will expand the testing of an automated reading tutor for at-risk children. Children read aloud while a computer program “listens” and verbally corrects any mistakes. The program is not fooled by accents and is able to use other cues (thanks to a camera mounted on the computer) to see if the child is paying attention to the task. Preliminary studies have shown that seriously underperforming first- and second-graders who use the automated tutor for three to six months jump almost to their grade level in reading skill. Researchers will also compare the automated tutor to human tutors. It’s expected that students will respond best to human tutors, but by how much? With schools struggling to provide at-risk students with the extra help they need, such technology could be an affordable and effective boon.

A Great Deal of Good

Since 1950, NSF has worked for stronger curricula and enhanced professional development for teachers. The agency has planted the seeds of systemic change and made it possible for researchers to work in partnership with educators to bolster the scientific basis of learning. Despite all that NSF has done over the years in these areas, some may be surprised to discover just how important education is at one of the country’s primary sources of research funding. But NSF’s commitment to the nation’s students has been part of its mission from the very beginning.

In 1954 Daniel Lednicer, a doctoral student in chemistry, received a third year of financial support through NSF’s fledgling Graduate Research Fellowship program. Full of gratitude for the life of learning that NSF was allowing him to pursue (he went on to make important contributions as a research chemist at the National Cancer Institute), Lednicer wrote a letter of thanks to the man who had signed NSF into existence, President Harry Truman. Truman’s plain-spoken reply on October 2, 1954, speaks presciently about NSF’s unique role as a catalyst for scientific knowledge, in the laboratory as well as in the classroom:

Dear Mr. Lednicer:

Your good letter of September 21 was very much appreciated. I always knew that the [National] Science Foundation would do a great amount of good for the country and for the world. It took a terrific fight and three years to get it through Congress, and some smart fellows who thought they knew more than the President of the United States tried to fix it so it would not work.

It is a great pleasure to hear that it is working and I know it will grow into one of our greatest educational foundations.

Sincerely Yours,
Harry S Truman

To Learn More

NSF Directorate for Education and Human Resources
www.ehr.nsf.gov

American Association for the Advancement of Science, Project 2061
www.project2061.org

National Academy Press
www.nap.edu

National Council of Teachers of Mathematics
www.nctm.org

National Science Teachers Association
www.nsta.org

The Albatross Project (at Wake Forest University)
www.wfu.edu/albatross

The Hands On Universe Project (at Lawrence Berkeley National Laboratory)
http://hou.lbl.gov

Teachers Experiencing Antarctica and the Arctic (at Rice University)
http://tea.rice.edu

Manufacturing
the forms of things unknown

Contrary to its image as a fading giant, manufacturing is helping to propel the U.S. economy to new heights of wealth and reward. NSF contributes to manufacturing’s success by investing in innovative research and education.

Manufacturing—the process of converting dreams into objects that enrich lives—is the poetry of the material. Traditionally, the path from an engineer’s imagination to a finished prototype was labyrinthine, involving draftsmen, model makers, rooms full of machine tools, and lots of time. But all that is changing. Over the last two decades, NSF grants have helped to create new processes and systems as well as innovative educational programs that have transformed manufacturing from a venture dominated by smoke-belching factories to the clean and agile enterprises of today and tomorrow.

The Myth of Manufacturing’s Demise

Here at the close of the twentieth century, manufacturing accounts for one-fifth of the nation’s gross domestic product and employs 17 percent of the U.S. workforce, according to the National Science and Technology Council. More significant to the nation’s economic well-being, throughout the 1990s productivity in manufacturing—the ability to produce more goods using less labor—far outstripped productivity in all other sectors of the economy, including the service sector. As the nation’s productivity leader, manufacturing has helped the nation to achieve low unemployment with only modest inflation.

“Other sectors generate the economy’s employment,” says National Association of Manufacturers economist Gordon Richards. “Manufacturing generates its productivity.”

This record of success seems remarkable when compared to the state of manufacturing just twenty-five years ago.

“There was a lot of literature in the mid-seventies that argued quite strongly that the United States was basically going to a service economy,” says Louis Martin-Vega, NSF’s acting assistant director for Engineering and former director of the Directorate’s Division of Design, Manufacture, and Industrial Innovation (DMII). Back then, Americans worried while clean, efficient Japanese factories rolled out streams of products—cars, televisions, VCRs—that were of higher quality and lower cost than those produced in the United States.

Still based on the classical mass-production model pioneered by early automaker Henry Ford, American manufacturing was proving no match for the leaner, more flexible manufacturing techniques that, although first conceived by American thinkers, were being improved upon elsewhere. In order to modernize manufacturing processes and systems, however, U.S. businesses needed to do the kind of research and development (R&D) that was becoming too expensive for any one business to undertake by itself. Government help was required, but in the early 1980s, help was hard to come by: The push was on to beef up defense and shrink the rest of the federal government, all while rampant inflation eroded existing research budgets. The result at NSF, according to Dian Olson Belanger, author of Enabling American Innovation: Engineering and the National Science Foundation, was that “in real purchasing power, 1982 [research] grantees were living with dollars adequate for 1974.”

The NSF-supported Integrated Manufacturing Systems Laboratory (IMSL) at the University of Michigan is developing next-generation manufacturing systems that can be quickly reconfigured to adapt to changing market realities. Research focuses on open architecture controls, reconfigurable machining systems, and sensor-based monitoring systems. Here, graduate students demonstrate their work in the lab during a visit by NSF officials.

By the mid-1980s, the United States was no longer “the unquestioned technological hub of the world,” according to Harvard physicist and Nobel Laureate Sheldon Glashow, but was instead passing “the torch of scientific endeavor” to other nations. “Steel, ships, sewing machines, stereos, and shoes” were “lost industries,” he said. Unless something was done soon, Glashow exclaimed, Americans would be left with “their Big Macs . . . and perhaps, [their] federally subsidized weapons industries.”

Says Martin-Vega of that troubling time, “There was a realization that, well, we’ve lost the electronics business, the automotive industry was hurting, the machine-tool industry was all Germany and Japan, and then it seemed like we were going to have the same fate in the semiconductor industry.”

The potential loss of an industry so crucial in the burgeoning Computer Age frightened public officials and turned federal attention to manufacturing-related research in a new way. In 1987, the government worked with industry to start a research consortium of semiconductor companies known as SEMATECH. The group continues to operate today (having weaned itself from government support) with member companies sharing expenses and risk in key areas of semiconductor technology research.

Within NSF, says Martin-Vega, “The argument for supporting work in manufacturing was made less difficult when you had a situation that could almost be considered a national threat.” Engineering research seeds planted in the early 1970s began to bear fruit. By the mid-1980s, some pivotal scientific foundations for design and manufacturing were in place. To build on them, in 1985 NSF established a separate design and manufacturing division.

NSF helped to move manufacturing from the obituaries to the headlines, which now are more likely to celebrate the “new manufacturing,” with its reliance on information technologies and more malleable, quick-response organizational structures. As the following highlights demonstrate, with some critical assistance from NSF, U.S. manufacturing isn’t dying after all—it’s just changing.

Rapid Prototyping

In the late 1960s, Herbert Voelcker—then an engineering professor at the University of Rochester, now at Cornell University—went on sabbatical and asked himself how to do “interesting things” with the automatic, computer-controlled machine tools that were just beginning to appear on factory floors. In particular, Voelcker wanted to find a way to take the output from a computer design program and use it to program the automatic machine tools.

With funding from NSF, Voelcker tackled the problem first by developing the basic mathematical tools needed to unambiguously describe three-dimensional parts (see the chapter on “Visualization,” p. 88). The result was the early mathematical theory and algorithms of solid modeling that today form the basis of computer programs used to design almost everything mechanical, from toy cars to skyscrapers.
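
At the heart of solid modeling is the ability to answer an unambiguous question about any point in space: is it inside the part or not? The toy Python sketch below is not Voelcker's mathematics, just a minimal illustration of the constructive-solid-geometry style of building shapes from primitives and set operations, with invented dimensions.

    # A toy constructive-solid-geometry (CSG) model: each solid is a membership
    # test, and complex parts are built from primitives with set operations.

    def block(x0, x1, y0, y1, z0, z1):
        return lambda x, y, z: x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1

    def sphere(cx, cy, cz, r):
        return lambda x, y, z: (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r ** 2

    def difference(a, b):
        return lambda x, y, z: a(x, y, z) and not b(x, y, z)

    # A rectangular plate with a spherical pocket machined out of its top face.
    part = difference(block(0, 10, 0, 10, 0, 2), sphere(5, 5, 2, 1.5))

    print(part(1, 1, 1))    # True: solid material here
    print(part(5, 5, 1.5))  # False: inside the pocket

Because the model is unambiguous, the same description can drive a display program, a mass-properties calculation, or the tool paths of an automatic machine tool.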

During the 1970s, Voelcker’s work transformed the way products were designed, but for the most part they were still made the same old way. That is, either a machinist or a computer-controlled machine tool would cut away at a hunk of metal until what remained was the required part, in much the same way as Michelangelo removed chips of marble from a block until all that remained was a statue of David. But then in 1987, University of Texas researcher Carl Deckard came up with a better idea.

Next-Generation Manufacturing

Since 1976, various U.S. presidents have formed interagency councils—with gradually increasing participation from industry—to try to build consensus and identify strategies in certain key areas of the economy, including manufacturing. NSF’s leadership has been critical to these efforts, which most recently took the form of the Next-Generation Manufacturing (NGM) project.

NGM was funded by NSF and other federal agencies but headed by a coordinating council drawn from the manufacturing industries. Starting in 1995, more than 500 industry experts worked together to produce a final 1997 report offering a detailed vision for the future of manufacturing. Today the NGM report forms the basis of a follow-up effort called the Integrated Manufacturing Technology Roadmap (IMTR) project, also funded by NSF and other federal agencies.

“The question that guided us,” says NSF’s Deputy Director Joseph Bordogna, former head of NSF’s Directorate for Engineering and a primary architect of NGM and other efforts to rejuvenate manufacturing in America, “is ‘what principles underlie the ability of a company to continuously change itself in response to the changing marketplace?’ That means figuring out adaptive, decision-making processes and software, as well as manipulating materials and coming up with new machines for the factory floor.”

According to the NGM report, a “next-generation” manufacturer will need to transform itself from a twentieth century-style company—one that functions as a sovereign, profit-making entity—into a twenty-first century company that is more of an extended enterprise with multiple and ever-shifting business partners. As Stephen R. Rosenthal, director of the Center for Enterprise Leadership, describes it, next-generation manufacturers should be companies that stretch from “the supplier’s supplier to the customer’s customer.”

Successful next-generation manufacturers, the NGM report concludes, will have to possess an integrated set of attributes. The company will need to respond quickly to customer needs by rapidly producing customized, inexpensive, and high-quality products. This will require factories that can be quickly reconfigured to adapt to changing production and that can be operated by highly motivated and skilled knowledge workers. Workers organized into teams—both within and outside a company—will become a vital aspect of manufacturing. As participants in extended enterprises, next-generation companies will only undertake that part of the manufacturing process that they can do better than others, something industry calls “adding value.”

Inherent in these requirements are what the NGM project report calls “dilemmas.” These arise from the conflict between the individual company’s needs and those of the extended enterprise. How can knowledge be shared if knowledge is itself a basis for competition? What security can companies offer their skilled employees when the rapidly changing nature of new manufacturing means that firms can’t guarantee lifetime employment? How can the gaining of new knowledge be rewarded in a reward-for-doing environment?

Resolving these dilemmas is an important part of NSF’s vision of the work to be done in the twenty-first century, work in which NSF will play a leading role.

Instead of making a part by cutting away at a larger chunk of material, why not build it up layer by layer? Deckard imagined “printing” three-dimensional models by using laser light to fuse metallic powder into solid prototypes, one layer at a time.
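
Conceptually, layer-by-layer building means slicing a solid model into thin horizontal cross sections and fusing material only where each cross section is solid. The short Python sketch below is a stand-in for that slicing step (an invented example, not Deckard's process) that rasterizes a simple cylindrical part into per-layer fuse maps.

    # Slice a solid model into layers: for each z-height, mark the grid cells
    # where material should be fused. Purely illustrative.

    def solid(x, y, z):
        # Example part: a cylinder of radius 4 units, 10 units tall.
        return x * x + y * y <= 16 and 0 <= z <= 10

    def slice_layers(solid_fn, size=11, layer_height=1.0, layers=10):
        stack = []
        for k in range(layers):
            z = k * layer_height
            layer = [["#" if solid_fn(x - size // 2, y - size // 2, z) else "."
                      for x in range(size)] for y in range(size)]
            stack.append(layer)
        return stack

    for row in slice_layers(solid)[0]:   # print the bottom layer's fuse map
        print("".join(row))

A machine like the one Deckard envisioned would sweep a laser over the marked cells of each layer in turn, lower the bed, spread fresh powder, and repeat.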

Deckard took his idea—considered too speculative by industry—to NSF, which awarded him a $50,000 Small Grant for Exploratory Research (SGER) to pursue what he called “selective laser sintering.” Deckard’s initial results were promising and in the late 1980s his team was awarded one of NSF’s first Strategic Manufacturing (STRATMAN) Initiative grants, given to the kind of interdisciplinary groups often necessary for innovation in the realm of manufacturing.

The result of Voelcker’s and Deckard’s efforts has been an important new industry called “free form fabrication” or “rapid prototyping” that has revolutionized how products are designed and manufactured.

An engineer sits down at a computer and sketches her ideas on screen with a computer-aided design program that allows her to make changes almost as easily as a writer can change a paragraph. When it’s done, the design can then be “printed” on command, almost as easily as a writer can print a draft—except this draft is a precise, three-dimensional object made of metal or plastic.

“To take a computer model and turn it into a physical model without any carving or machining is incredible,” says an analyst who tracks this new industry. “It’s almost like magic when you see that part appear.”

The method can be used to make things that are more than prototypes. “Because you can control it in this incredible way, you can make objects that you just couldn’t think of machining before,” says George Hazelrigg, group leader of DMII’s research programs. “For example, you can make a ship in a bottle.”

More practically, the method has been used to make a surface with lots of tiny hooks that resembles Velcro. These new surfaces are proving to be ideal substrates for growing human tissue. NSF-funded researchers have already grown human skin on these substrates and are looking to grow replacements of other organs as well.

“So these are pretty fundamental things,” Hazelrigg says. “I think it’s fair to say that we played a major role in it.”

Bruce Kramer, acting division director of NSF’s Engineering and Education Centers, is even more definite: “For a majority of successful rapid prototyping technologies, the first dollar into the technology was an NSF dollar.”

NSF helped launch rapid prototyping technologies, which can build new products from a computer-aided design (CAD) model without any carving or machining. Here, a student works with rapid prototyping equipment in the NSF-supported Advanced Design and Manufacturing Laboratory at the University of Maryland. One goal of the lab is to create graphical materials for visually impaired students.

Getting Control

Rapid prototyping may be the wave of the future, but most manufacturing is still done by traditional machine tools—the drills, lathes, mills, and other devices used to carve metal into useful shapes. Machine tools have been around for more than two centuries, only recently changing to keep time with the revolution in computer technology. Through most of the 1980s, computer-controlled machine tools were capable of only a narrow range of preprogrammed tasks, such as drilling holes or cutting metal according to a few basic patterns. For simple designs these controllers were pretty good, but by 1986, when University of California engineering professor Paul Wright applied to NSF for a grant to improve machine tools, the limitations of these so-called closed architecture controllers were becoming apparent.

“Our goal was to build a machine tool that could do two things,” Wright explains. “Number one, be more connected to computer-aided design images so that if you did some fantastic graphics you could actually make the thing later on. Number two, once you started making things on the machine tool, you wanted to be able to measure them in situ with little probes and then maybe change the machine tool paths” to correct any errors.

The idea was to devise a controller that was flexible both in hardware and software, allowing the use of advanced monitoring and control techniques based on the use of sensors. Wright also wanted to standardize the basic system so others could more easily develop new hardware and software options over time.
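
The closed loop Wright describes (cut, probe the part where it sits, then nudge the next tool path to cancel the measured error) can be sketched in a few lines. This is only a conceptual illustration with invented numbers, not the controller software itself.

    # Conceptual sketch of sensor-based tool path correction: after a pass, an
    # in-situ probe measures the feature, and the next command is offset to
    # compensate for the measured error. All values are hypothetical.

    def corrected_command(target, measured, last_command):
        error = target - measured        # how far the last cut fell short (or over)
        return last_command + error      # shift the next commanded depth accordingly

    target_depth = 5.000      # millimeters, from the CAD model
    commanded = 5.000         # what the controller was told to cut
    probed = 4.962            # what the touch probe actually measured (deflection, wear)

    print(f"Next commanded depth: {corrected_command(target_depth, probed, commanded):.3f} mm")
    # -> 5.038 mm, so the following pass lands closer to the CAD dimension.

An open architecture makes this kind of loop possible because third-party probes, sensors, and correction software can all be plugged into the controller rather than locked out of it.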

At first, Wright asked machine tool manufacturers to support his research, but “they thought I was a complete idiot,” he recalls. Wright wanted to use the relatively new Unix operating system, which the machine tool companies thought was daring and unsafe. So Wright and his colleagues turned to NSF. The agency responded, says Wright, with a grant “to open up the machine tool controller box, which was very crude and inaccessible back then. And, in my humble opinion, that has led to a lot of good results.”

Today, Wright’s open architecture controllers are the industry norm and have quite literally changed the shape of manufactured products. That NSF was there when even the ultimate beneficiary—industry—was not, is “why I’m so enthusiastic about NSF,” Wright says.

Supply Chain Management

Rapid prototyping and open architecture controllers are examples of advances in manufacturing processes, but NSF has also been instrumental in helping to modernize manufacturing systems.

In 1927, Henry Ford’s Rouge complex near Detroit began churning out a ceaseless stream of Model A cars. The Rouge facility was perhaps the ultimate expression of mass production and “vertical integration,” in which a company tries to cushion itself from the vagaries of the market by owning or controlling virtually every aspect of its business, from the mines that provide the ore to the factories that make the glass. Raw materials—iron ore, coal, and rubber, all from Ford-owned mines and plantations—came in through one set of gates at the plant while finished cars rolled out the other.

Ford’s vision informed how manufacturing was done for most of the twentieth century, but by the late 1970s the limitations of this approach had started to become obvious, at least to the Japanese. Why make steel if what you do best is make cars? Why be responsible for your own suppliers—and pay to maintain all that inventory—when it’s cheaper to buy from someone else? Bloated, vertically integrated American companies faced a serious challenge from Japanese carmakers who organized their factories along a different, leaner model resulting in cheaper, better cars. Japanese factories—in which each car was built by a small team of workers rather than being pieced together along a rigidly formulated assembly line—were far more efficient when it came time to shift to a new model. An American car plant was like a machine dedicated to building a single type of vehicle. Workers were interchangeable parts of that machine, whose “intelligence” was vested in the machine’s overall design rather than in the workforce. In contrast, Japanese plants depended on the intelligence of their workers, who were encouraged to make any improvements to the manufacturing process that they saw fit.

It took some time, but by the 1980s American manufacturers such as General Motors (GM) had absorbed the Japanese lessons of “lean” manufacturing and were looking to make some improvements of their own. For help, GM turned to Wharton Business School professor Morris Cohen who, with support from NSF, analyzed a critical part of its production system: the process by which GM distributed 600,000 repair parts to more than a thousand dealers.

Cohen’s approach was to see this process as one of many “supply chains” that kept GM up and running. Supply chains form a network of resources, raw materials, components, and finished products that flow in and out of a factory. Using empirical data and mathematical models, Cohen and his colleagues proposed a complete reorganization of GM’s repair parts supply chain.

“We suggested that a high degree of coordination be put in place to connect decisions across the supply chain,” says Cohen. “Today that’s commonplace, but back then the idea was considered radical.”

In fact, the idea was considered so sweeping that GM executives rejected it—not because they disagreed with Cohen’s analysis but rather because the scale of the reorganization was too much for them to contemplate at the time. However, GM was soon to embark on building a new car company called Saturn. GM’s management decided to apply a number of Cohen’s recommendations to the new venture, including the main proposition: centralized communications and coordinated planning among the Saturn dealerships and the company distribution center. Rather than operating in the traditional fashion, as separate entities, the dealerships would be hooked up via satellite to a central computer. By consolidating information and making it available to everyone, management could make optimal parts-ordering decisions, neighboring dealerships could pool resources, and dealers could focus on maximizing customer service without worrying about what inventory they should be stocking. All of these improvements let management accommodate difficult-to-predict parts service demands without holding excessive inventory, while still ensuring that dealers got the parts they needed to repair cars in a timely manner.
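
One reason pooling information (and parts) across dealerships pays off is statistical: aggregate demand is proportionally less variable than any single dealer's demand, so the same service level can be met with far less safety stock. The back-of-the-envelope Python sketch below illustrates the effect with invented numbers; it is not Cohen's model.

    import math

    def safety_stock(demand_std_dev, lead_time_days, z=1.65):
        # Buffer stock above expected demand; z = 1.65 targets roughly 95% service.
        return z * demand_std_dev * math.sqrt(lead_time_days)

    dealers = 100
    std_per_dealer = 4.0   # hypothetical std. dev. of one dealer's daily demand for a part
    lead_time = 9          # days to resupply

    independent = dealers * safety_stock(std_per_dealer, lead_time)
    pooled = safety_stock(std_per_dealer * math.sqrt(dealers), lead_time)

    print(f"Each dealer stocking alone: {independent:.0f} units of buffer")
    print(f"Demand pooled centrally:    {pooled:.0f} units of buffer")
    # Pooling independent demands shrinks the buffer by about sqrt(100) = 10x.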

Cohen’s approach to supply chain management quickly proved a success: Saturns, which are relatively low-cost cars, are routinely ranked among the top ten cars with respect to service. “The other top ten are high-priced imports,” Cohen says.

Only the Agile Survive

Supply chain management may make for leaner manufacturing, but there is also a premium on agility. Agile manufacturers recognize that information technology and globalization have dramatically quickened the rate at which new products must be innovated and brought to market. In such a rapidly shifting marketplace, it’s best to operate not as a vertically integrated giant but rather as part of a loose confederation of affiliates that form and reform relationships depending on changing customer needs. In the 1990s, NSF set up three institutes—at the University of Illinois, Rensselaer Polytechnic Institute (RPI), and the University of Texas at Arlington—to study issues raised by agile manufacturing.

A Brief History

Manufacturing, because it is a multi-faceted endeavor depending on the integration of many ideas, techniques, and processes, draws largely on the skills of engineers, a group that has not always felt entirely welcome at NSF. Vannevar Bush, head of the wartime Office of Scientific Research and Development, wrote a major report for President Harry Truman that led to the establishment of NSF in 1950. In that report, Bush warned that while America was already preeminent in applied research and technology, “with respect to pure science—the discovery of fundamental new knowledge and basic scientific principles—America has occupied a secondary place.”

As a result of this view many came to see engineering, rightly or wrongly, as a quasi-applied science that, says historian Dian Olson Belanger, “was always alien to some degree” within the historically basic science culture of NSF.

This attitude began to change during the post-Sputnik years, continuing through the Apollo moon landing, when engineering gradually assumed a more prominent role at NSF. President Lyndon Johnson amended the NSF charter in 1968 specifically to expand the agency’s mission to include problems directly affecting society. Now “relevance” became the new byword, embodied in the 1969 launch of a new, engineering-dominant program called Interdisciplinary Research Relevant to Problems of Our Society (IRRPOS), which funded projects mostly in the areas of the environment, urban problems, and energy.

IRRPOS gave way in 1971 to a similar but much expanded program called Research Applied to National Needs (RANN). And within RANN, an NSF program officer named Bernard Chern began to fund pioneering research in computer-based modeling, design, and manufacturing and assembly processes.

“It is fair to say that Chern’s early grantees . . . set the character of much of American automation and modeling research for almost a decade,” says Herbert Voelcker, former deputy director of DMII and now an engineering professor at Cornell University. But despite its successes, RANN remained controversial among those concerned that NSF not lose sight of the importance of curiosity-driven research. Still, by the time RANN was abolished in 1977, it had built a substantial beachhead within NSF for problem-oriented and integrative R&D.

In 1981, NSF was reorganized to establish a separate Directorate for Engineering. As part of its mandate to invest in research fundamental to the engineering process, the directorate includes specific programs devoted to design and manufacturing issues. Today such issues are the province of the Division of Design, Manufacture, and Industrial Innovation, whose mission is to develop a science base for design and manufacturing, help make the country’s manufacturing base more competitive, and facilitate research and education with systems relevance.

NSF-funded researchers at the University of California at Berkeley have created an experimental system called CyberCut that allows users to quickly design and manufacture prototypes for mechanical parts via the Internet. An online computer modeling tool links to a computer-controlled milling machine. Here, CyberCut renders art—a human face scanned by lasers.

“Agile manufacturing takes on a slightly different definition depending on whom you talk to,” says Robert Graves, who is a professor in the Decision Sciences and Engineering Systems department at RPI as well as director of the Electronics Agile Manufacturing Research Institute, which studies issues of agile manufacturing as they apply to the electronics industry. “Here in electronics we look at the idea of distributed manufacturing.”

In the distributed manufacturing model, an enterprise consists of a core equipment manufacturer that produces the product and is supported by supply chains of materials manufacturers and services. As an exercise, Graves and his colleagues at RPI set up their own agile “company” to redesign a circuit board used in an Army walkie-talkie. While team members finished the product’s design, companies were found that could potentially supply the parts and assembly services required. But parts listed in the companies’ catalogs weren’t always available or, if they were, might not have been available quickly.

So the team redesigned their circuit board to include other, more readily available parts. This, and the search for new suppliers, took excessive time and required extra resources—circumstances that emulated the realities of traditional design and manufacturing. But the time wasn’t wasted, since the whole point was to identify common manufacturing obstacles and devise ways for the system to become more agile.

In the end, the RPI researchers saw that they could streamline the system by using computers and networks to handle the negotiations between suppliers and designers. The researchers developed software that takes a circuit board design, works out all possible, functionally equivalent variants, and sends out “agents”—self-sufficient computer programs—to the computers of the various parts suppliers. These automated agents carry a “shopping” list of the physical characteristics of some sub-system of the board. List in hand, each automated agent essentially roots around in the suppliers’ computers, making note of such things as how much each supplier would charge for the components on the list and how quickly the supplier can deliver. The agents then carry the information about pricing and availability back to the designer’s computer, which may use the new data to further modify the design and send out yet more agents.
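
Stripped to its essentials, the loop works like this: for each functionally equivalent variant of the board, ask every supplier for price and lead time on the needed parts, then rank the variants that can actually be sourced. The Python sketch below captures that logic with invented catalog data; it is a simplification for illustration, not the RPI agents themselves.

    # Toy version of design-by-agents: query each supplier's "catalog" for the
    # parts a board variant needs, then report cost and lead time per variant.
    # All supplier names, part names, and numbers are invented.

    catalogs = {
        "SupplierA": {"cap_10uF": (0.12, 2), "res_1k": (0.02, 1)},  # part: (price, days)
        "SupplierB": {"cap_22uF": (0.15, 5), "res_1k": (0.03, 2)},
    }

    variants = {
        "design_v1": ["cap_10uF", "res_1k"],
        "design_v2": ["cap_22uF", "res_1k"],
    }

    def quote(parts):
        """Best available price and worst lead time for a parts list, or None."""
        total_cost, lead_time = 0.0, 0
        for part in parts:
            offers = [catalog[part] for catalog in catalogs.values() if part in catalog]
            if not offers:
                return None                 # this variant can't be sourced right now
            price, days = min(offers)       # cheapest offer for this part
            total_cost += price
            lead_time = max(lead_time, days)
        return total_cost, lead_time

    for name, parts in variants.items():
        print(name, quote(parts))
    # The design software can then keep the variant with acceptable cost and delivery.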

RPI researchers using this new system cut the circuit board design process from a typical nine months to a matter of weeks. In 1999, the group spun off a company called ve-design.com to market their newly developed agile system.

Education that Works

The success of supply chain management and agile manufacturing shows that manufacturing cannot be considered primarily in terms of transforming raw materials into finished goods, says Eugene Wong, former director of NSF’s Directorate for Engineering and currently professor emeritus at the University of California at Berkeley. Rather, manufacturing should be thought of as a “system function” that serves as the core of a modern production enterprise.

“In a larger sense,” says Wong, “the distinction between manufacturing and service is not useful. Modern manufacturing encompasses inventory management, logistics, and distribution—activities that are inherently service-oriented.” Wong suggests that this blurring of the manufacturing and service sectors of the economy constitutes a paradigm shift with profound implications for the future. That is why NSF continues to invest not only in the development of new manufacturing processes and systems, but also in new approaches to engineering education. As NSF Deputy Director Joseph Bordogna says, “It’s not just the discovery of new knowledge, but the education of workers in that new knowledge that is the fundamental—and maybe unique—mission of NSF.”

The education of both scientists and engineers has been a goal of NSF since 1950. During the economic turbulence of the 1970s and 1980s, however, it became clear that industry and academia had become estranged from each other in the critical area of manufacturing. Manufacturing-related scientific research at the universities wasn’t making it out into the real world quickly enough, if at all, and companies were complaining that their young engineering hires, while capable of scientifically analyzing a problem, couldn’t produce actual solutions in a timely fashion. So NSF began looking for ways to nurture mutually beneficial partnerships between companies seeking access to cutting-edge research and students and professors looking for practical experience in putting their ideas to work.

In the early 1980s, NSF spearheaded what was then known as the Engineering Faculty Internship Program. The program provided seed grants—to be matched by industry—for faculty members interested in spending time in an industrial environment. A decade later, the internship model was included as part of a broader program aimed at creating opportunities for universities and industries to collaborate on long-term, fundamental research. Eventually the expanded program, called Grant Opportunities for Academic Liaison with Industry (GOALI), spread throughout the whole of NSF.

Research funded through the GOALI program has led to such advances as more efficient chip processing and improvements in hydrocarbon processing, which allow previously unusable heavy oils to be transformed into gasoline and chemical products.

“GOALI enhances research,” says NSF’s GOALI coordinator, Mihail C. Roco. “The program has unlocked a real resource in academic and industrial research. GOALI promotes basic research that can provide enormous economic benefits for the country.”

Another effort by NSF to bridge the gap between industry and academia is the Engineering Research Centers (ERC) program, launched in 1984. The ERC program supports university-based research centers where industry scientists can collaborate with faculty and students on the kind of knotty, systems-level engineering problems that tend to hobble innovation in the long run. Companies get a chance to conduct cutting-edge research with a long-term focus while faculty and students (both graduate and undergraduate) become more market-savvy in their approach to problem-solving. In the end, ERCs shorten the path between technical discovery and the discovery’s application.

A student at the NSF-supported Microelectronics Lab at the University of Arizona adjusts the water flow to an apparatus used to study wafer rinsing, a critical step in the manufacture of semiconductor chips. Current methods use huge amounts of water, an environmental concern. Through the fundamental study of wafer rinsing, NSF-funded researchers have discovered bottlenecks in the process and have created optimal flow cycles to reduce waste.

“The basic goal of the ERC program is to form partnerships within industry to advance next-generation technology and to develop a cadre of students who are much more effective in practice,” explains NSF’s Lynn Preston, ERC program leader. “Because of the sustained support that we can give them, the centers focus and function with a strategic plan.”

ERCs focus on relatively risky, long-term research—the kind that industry, coping with an increasingly competitive marketplace, is often reluctant to chance. “It’s about really big, tough challenges that industries can’t take on their own,” says Preston.

A prime example with regard to manufacturing is the Center for Reconfigurable Machining Systems (RMS) at the University of Michigan in Ann Arbor. Since its establishment as an ERC in 1996, the RMS center has aimed to create a new generation of manufacturing systems that can be quickly designed and reconfigured in response to shifting market realities. Working with about twenty-five industry partners, the students and faculty of the RMS center seek to develop manufacturing systems and machines with changeable structures.

“Most manufacturing systems today have a rigid structure,” says Yoram Koren, the RMS center’s director. “Neither the machines themselves nor the systems they’re a part of can be changed very easily. But with the globalization of trade, product demand is no longer fixed and product changeover becomes faster. Companies need to be able to adjust their product lines, often incrementally, to changing market realities.”

Koren points to the automotive industry as an example. “When gas prices were low, everybody wanted to buy a V-8 [engine],” he says. “Car companies couldn’t make enough V-8 engines to supply demand. Now gas prices are going up and companies are facing the opposite problem.”

One common barrier to change is what’s known as “ramp-up.” Usually, it takes anywhere from several months to three years to ramp up; that is, to begin marketing an optimum volume of flaw-free new products once a new manufacturing system is put into place. A key contributor to the delay is the inherent difficulty in calibrating changes throughout the existing system; a single machine error can propagate and cause serious product quality problems. To address this issue, the students, faculty, and staff at the RMS center have come up with a mathematically based “stream-of-variation” method that Koren says significantly reduces ramp-up time. The center’s industry partners are excited about the prospects for this and other RMS-generated innovations.

“We, and our suppliers, have already benefited from working with University of Michigan researchers to implement scientific methods in our plants,” says Jim Duffy, manager of manufacturing engineering at Chrysler Corporation.

Mark Tomlinson, vice president for engineering at Lamb Technicon (a major machine tool builder), agrees about the potential pay-off for industry. “The ERC for Reconfigurable Machining Systems is providing the vision and inspiration for our next-generation machines,” he says, “as well as supplying the qualified engineers that support our needs.”

Manufacturing the Future

The ERC program is a particularly good example of how NSF brings together the discovery-driven culture of science and the innovation-driven culture of engineering. Manufacturers applaud NSF’s efforts because they recognize that coming up with new systems and products is a much more complex and expensive venture than ever before, and they need the help of university-based researchers in order to build the science base for future advancements.

For example, it takes about a billion dollars to develop a new semiconductor chip capable of the kind of performance required in, say, high-definition television. That level of investment—that level of risk—deters even the most ambitious American companies from doing the kind of pioneering research necessary to keep them globally competitive. NSF’s role as a catalyst for government-industry-academia collaboration is vital for the nation’s economic well-being.

“You need a partnership,” says NSF Deputy Director Joseph Bordogna. “You need new knowledge out of universities and labs, new processes from industry, and a government willing to enable it all through appropriate R&D policy and frontier research and education investment, by and for the citizenry.”

NSF’s efforts to bridge the worlds of industry and academe reflect another truth about modern manufacturing: Knowledge and ideas are the most important raw materials.

“It’s no longer profitable just to ship a piece of metal out the front door,” industry analyst Graham Vickery told Industry Week. “What you’re doing now is shipping some sort of component that requires things like support services, or advice, or design skills, or engineering know-how” in order for the component to be of actual use at the other end.

Finding innovative ways to handle information is now manufacturing’s chief concern. “If you understand that today manufacturing is an enterprise-wide production process,” says Eugene Wong, “you see that information management will assume an increasingly important role, one that may already have transcended the importance of transforming materials into products.”

With NSF’s help, American manufacturers are making the changes necessary to stay competitive in a marketplace increasingly dominated by e-commerce, while at the same time honoring the traditional core of manufacturing’s purpose: the innovation of new technologies and products for an expectant public.

To Learn More

NSF Directorate for Engineering
www.eng.nsf.gov

NSF Division of Design, Manufacture, and Industrial Innovation
www.eng.nsf.gov/dmii

Engineering Research Centers
www.eng.nsf.gov/eec/erc.htm

Engineering Research Center for Reconfigurable Machining Systems (University of Michigan)
http://erc.engin.umich.edu

Electronics Agile Manufacturing Research Institute (Rensselaer Polytechnic Institute)
www.eamri.rpi.edu

Arabidopsis
map makers of the plant kingdom

With NSF support, biologists today are mapping all of the genes of a model organism—identifying the location and function of each gene. They have already made fundamental discoveries that may lead to the development of more beneficial crops and forest products.

Arabidopsis thaliana is a small, flowering mustard plant that has become the subject of intense study by scientists around the world. It has many characteristics of an ideal experimental system—a model organism for elucidating the biology of flowering plants. Recognizing the promise of Arabidopsis, NSF began working with leaders in plant biology in the 1980s to cultivate a spirit of cooperation and to encourage the use of the model plant in research. In 1990, NSF launched a multi-agency, multinational project to identify all of the genes in Arabidopsis by location and function—in other words, to create a genetic road map to flowering plants. The collegial Arabidopsis research community now expects to complete the sequencing of the plant's genome by the end of 2000—several years ahead of schedule. By August 1999, nearly 70 percent of the genome sequences for Arabidopsis had been deposited in the public database, GenBank. Six months later, scientists reported the complete DNA sequences of two of the five chromosomes of Arabidopsis. Major discoveries have been made about the mechanisms by which genes regulate flower development, disease resistance, response to adverse conditions in the environment, and numerous other aspects of plant biochemistry and physiology. Commercial applications under development include trees with accelerated blooming, biodegradable plastics grown in crops, and genetically engineered vegetable oil with reduced polyunsaturated fat.

A Rose Is a Rose Is a Mustard WeedThere are approximately 250,000 dif ferentspecies of flowering plants, all believed to derivefrom a common ancestor. While plants haveadapted to a multitude of terrains, climate condi-tions, and selective breeding efforts over the mil-lennia, the process of evolution ensures that theyremain related in fundamental ways. At the molec-ular level, for example, what causes a rose bushto flower is not terribly different from what occursin a radish plant. Other characteristics also appearto be similar across species, such as the fruit-ripening process and the internal clock that tellsplants when to open their pores in anticipation ofdaylight. In fact, the physiology and biochemistryof plants display such uniformity across speciesthat one can say, without too much exaggeration:When you’ve seen one flowering plant (at the mol-ecular level), you’ve seen them all.

This essential truth has altered the course of study in plant biology, a field once dominated by research into individual crops, such as corn or wheat. Today, plant biology has its own model organism, the flowering mustard plant Arabidopsis thaliana; consequently, research in the field now resembles other types of broad basic research, such as that done on bacteria or animals.

Considered a weed because it is uncultivated and grows in profusion, Arabidopsis nonetheless engages the attention of a global research community. The researchers, and agencies such as NSF that support them, expect that by analyzing the structure and functions of Arabidopsis, they are laying the groundwork for analyzing most other plant species.

The project has greatly accelerated the practical application of basic discoveries in agriculture and forestry. Genetically engineered species are beginning to appear, and many believe they signal the beginning of a revolution in plant breeding.

In one such area of discovery, scientists have identified genes involved with the regulation and structural forms of flowers. Knowledge of these genes has made possible the genetic engineering of plants other than Arabidopsis. For example, aspens normally flower only after they have attained a height of 30 feet, which can take up to twenty years. A genetically transformed aspen, however, flowered in only six months, when it was just 2 inches tall. Commercial tree growers have

Researchers are studying the inherited characteristics of Arabidopsis. The NSF-funded genome research project to map Arabidopsis will yield important information about how flowering plants interact with their environments.

Page 75: America's Investment in the Future


always wanted to control the timing of floral and fruit production, as well as the closely related reproductive cycle. The technology is also being tested in fruit and timber trees.

In another research initiative, health concerns over saturated fat and hydrogenated vegetable oils are motivating a search for edible oils that pose no threat to human health. The pathways by which plants produce edible unsaturated oils have been elucidated and the responsible desaturase genes cloned from Arabidopsis. The Arabidopsis genes were used to identify the corresponding genes in soybeans and other crop plants, whose oils account for approximately one-third of the calories in the American diet. At present, most plant oils are chemically hydrogenated to keep them from turning rancid. The availability of the desaturase genes raises the possibility that nutritionally desirable edible oils can be produced from plants without the need for chemical modification. Agrichemical producers have begun field trials with modified soybeans and other plant species.

Inside the Little Green Factories
Plant breeding became a science around the turn of the twentieth century, thanks to Austrian scientist and mathematician Gregor Mendel. His studies of heredity in peas enabled him to draw conclusions about gene functioning by observing how the characteristics of parents showed up in generations of offspring. While adopting increasingly sophisticated techniques, plant breeders continued to improve crops in traditional ways, crossing the current stock with germplasm containing useful new characteristics. The success of the outcomes depended on the skill and judgment of the breeder in selecting plants to cross.

Scientists, meanwhile, sought to understand the underlying genetic mechanisms that induced plants to express inherited characteristics in certain ways. With advances in plant tissue culture techniques, biologists were able to produce novel hybrids and study them under controlled laboratory conditions. Of particular interest were plant characteristics that might potentially be modified in ways advantageous to humans. One scientist described plants as the “little green factories” that produce food, fibers, housing materials, and many pharmaceuticals, as well as the oxygen necessary for terrestrial life.

This scanning electron micrograph shows an Arabidopsis plant in bloom, highlighting the emergence of the plant’s flower. The image provides details about the flowering process that help researchers follow the earliest events in the development of an Arabidopsis flower.

Page 76: America's Investment in the Future

How to Make a Flower
Elliot M. Meyerowitz of the California Institute of Technology in Pasadena was one of the first molecular biologists to receive an NSF grant to study Arabidopsis genetics. His work on the development of flowers illustrates how the methods of scientific inquiry employed in molecular biology can unlock the secrets of plant life.

Flowers are made up of four concentric whorls. Surrounded by tough, protective structures called sepals, the petals themselves surround the male and female sex organs, respectively called stamens and carpels. Three types of genes control how the whorls develop, and by looking at flowers that lacked some genes, Meyerowitz’s lab discovered that if only type A genes are active, a cell knows to become part of a sepal. With A and B genes switched on, the cell turns into part of a petal. Together, genes B and C direct a cell into a stamen, and C alone, into a carpel.
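This combinatorial logic is easy to restate as a small lookup table. The sketch below is illustrative only: the names and the simple rule it encodes are stand-ins for the gene-combination scheme described above, not a real genetics library.

# Illustrative sketch of the combinatorial rule described above:
# which floral organ a cell becomes, given its active gene classes.
ORGAN_BY_GENES = {
    frozenset({"A"}): "sepal",
    frozenset({"A", "B"}): "petal",
    frozenset({"B", "C"}): "stamen",
    frozenset({"C"}): "carpel",
}

def organ_identity(active_genes):
    """Return the floral organ for a cell with the given active gene classes."""
    return ORGAN_BY_GENES.get(frozenset(active_genes), "unknown")

# The four whorls, from the outside of the flower in:
for whorl, genes in enumerate([{"A"}, {"A", "B"}, {"B", "C"}, {"C"}], start=1):
    print(f"whorl {whorl}: genes {sorted(genes)} -> {organ_identity(genes)}")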

Meyerowitz’s work has broad applicability. Fully 80 percent of the world’s food supply is made up of flowers or flower parts: fruit, grains, or seeds. While genetically engineered flowers may have limited commercial value, the same formulas may one day be used to tailor food crops to the requirements of humankind.

Page 77: America's Investment in the Future


Knowledge about genetics grew rapidly during the 1960s and 1970s, and certain characteristics became recognized as central to all organisms: bacteria, animals, plants, and humans. For example:

• Organs develop and function as they do because of the way different combinations of genes express themselves in the form of proteins produced within cells. The instructions that tell proteins to form a blood cell, a brain cell, or a flower petal are all contained within the genome, and insofar as is now known, in the chemical composition of the deoxyribonucleic acid, or DNA, in a particular gene sequence along a chromosome.

• When a gene or its constituent nucleotides undergoes a sudden random change, known as mutation, the result is an abnormality in the affected cells. Mutations that render an organism better able to cope with its environment are the raw material that natural selection acts on. Many of the successful mutations of an organism’s ancestors, and possibly a mutation or two of its own, are reflected in the organism’s genetic composition, or genome.

• The genome responds to environmental forces, such as the supply of essential nutrients, to produce an organism’s observable characteristics, or phenotype.

• Through recombinant DNA technology, or genetic engineering, it is possible to create new strains of organisms with DNA containing the exact genes desired from different sources.

Barbara McClintock’s work identifying mobile genes in corn, for which she received a Nobel Prize in 1983, provided molecular biologists with the tools necessary for the development of plant transformation. Despite the essential role that plants play in human existence, much less time and energy had gone into studying the genetic functioning of plants than of bacteria or animals—or humans. A major obstacle was the large and unwieldy mass of genetic material found in most crop plants, which were the primary subjects of scientific research.

This obstacle was a big one. A scientist who wants to find the genetic source of a mutation, such as resistance to a particular disease, has to examine the cells where the mutation was expressed and connect the genetic information there back to the DNA. The technologies for identifying and isolating genes, sequencing them, cloning them (making large numbers of exact reproductions), and determining their functions are complex, labor-intensive, and expensive. To apply these techniques to plants, molecular biologists needed a plant whose genome was of manageable size.

This Arabidopsis plant, grown under short-day conditions (eight hours of light/sixteen hours of dark), shows how many leaves are produced during the vegetative phase of the plant’s approximately five-week life cycle. After the transition from the vegetative to reproductive phase, the plant produces flowers.

Page 78: America's Investment in the Future

Golden Age of Discovery

“We are now in a ‘golden age’ of discovery in plant biology. Problems that have been intractable for decades are yielding to the application of modern methods in molecular and cellular biology. The formula for much of this success is conceptually simple: Isolate a mutation that affects the process or structure of interest, clone the gene, find out where and when it is expressed, where the gene product is located, what it does, and what it interacts with, directly or indirectly . . . . Although it is not necessarily easy, any gene that can be marked by a mutation can be cloned. This is a qualitatively different situation from anything that has ever before existed in plant biology.”

—From Arabidopsis, E. Meyerowitz and C. Somerville, eds., Cold Spring Harbor Press, 1994.

Page 79: America's Investment in the Future


Increasingly, they converged on Arabidopsis thaliana, a weed of the mustard family that has one of the smallest genomes of any flowering plant. It is estimated that 20,000 to 25,000 genes are arrayed on only five chromosomes, with little of the puzzling, interminably repetitious DNA that frustrates efforts to study most plants. Arabidopsis is compact, seldom exceeding about a foot in height, and it flourishes under fluorescent lights. All of these characteristics enable scientists to raise it inexpensively in laboratories. During its short life cycle, this mustard weed produces seeds and mutants prodigiously. It can be transformed through the insertion of foreign genes and regenerated from protoplasts, plant cells stripped of their cell walls. For all of its superior properties, Arabidopsis is typical of flowering plants in its morphology, anatomy, growth, development, and environmental responses, a kind of “everyman” of the plant world. In short, Arabidopsis thaliana is a biologist’s dream: a model plant.

NSF Helps Launch the New Biology
Arabidopsis began to intrigue not only plant biologists, but also scientists who formerly specialized in bacteria or fruit flies. As laboratories around the world undertook Arabidopsis projects, the stock of available mutants grew and new techniques were developed for gene cloning. Scientists began making breakthrough discoveries. And NSF undertook to advance Arabidopsis research even more rapidly—first through a series of workshops and then by launching a long-range plan in 1990 for the Multinational Coordinated Arabidopsis thaliana Genome Research Project. The project’s steering committee, made up of scientists from eight countries, announced a collaborative agreement within the international community to pursue the goal of understanding the physiology, biochemistry, and growth and developmental processes of the flowering plant at the molecular level.

“I see the NSF program people as scientific collaborators,” said Chris Somerville, director of the Plant Biology Department at Carnegie Institution of Washington in Stanford, California, and with Elliot Meyerowitz, co-author of the leading research compendium on Arabidopsis. “[NSF] sensed something happening in the community that the individual scientist didn’t necessarily appreciate fully. By bringing a few of us together, they helped us develop our vision. They played a catalytic role. They observed what was going on and made a good judgment about what it meant. Once we began discussing it, we began to see what we could do collectively.”

In the years since the launch of the multinational project, the Arabidopsis research community has become a worldwide network of organizations and individuals. Their continued willingness to share information helps keep the project energized and

Researchers around the world are using the unassuming mustard weed to unlock the secrets of the plant world—secrets with many potential benefits.

Page 80: America's Investment in the Future


the path cleared for new discoveries. With funding from NSF and other federal agencies, as well as governments in other countries, biological resource centers have been established around the world to make seeds of mutant strains—one scientist called them “starter kits”—available to laboratories that want to study them. Between 1992 and the summer of 2000, the Arabidopsis Biological Resource Center at Ohio State University, which shares responsibility with a British center for requests throughout the world, shipped 299,000 seed samples and 94,000 DNA samples. In the spirit of openness and collaboration encouraged by a multinational steering committee, hundreds of Arabidopsis researchers worldwide regularly make deposits of new seed lines and DNA libraries into the centers.

Accelerating the Pace
The U.S. component of the multinational effort to sequence the Arabidopsis genome started as a joint program by NSF, the U.S. Department of Agriculture (USDA), and the Department of Energy (DOE). Building on this effort, in May 1997 the White House Office of Science and Technology Policy (OSTP) established the National Plant Genome Initiative (NPGI) program with the long-term objective to understand the genetic structure and functions in plants important to agriculture, the environment, energy, and health. In the NPGI's first year, NSF, USDA, and DOE provided additional funds to accelerate completion of Arabidopsis sequencing. The international Arabidopsis community now expects to publish the complete sequence of the plant's genome by the end of 2000, four years ahead of the original schedule.

An in situ hybridization process reveals the accumulation of a specific type of RNA—Apetala 1—in a mutant Arabidopsis flower. Looking at the flower in this state provides scientists with details about the development of the flower’s sepals, petals, stamens, and carpels.

Page 81: America's Investment in the Future


In 1999 U.S. and European scientists completed mapping the DNA sequences of two of the five chromosomes of Arabidopsis and published their findings in the December 16, 1999 issue of Nature. The results—the first complete DNA sequence of a plant chromosome—provided new information about chromosome structure, evolution, intracellular signaling, and disease resistance in plants.

Through the NPGI program, NSF, USDA, and DOE also began jointly funding research to sequence the rice genome, enabling U.S. participation in an international collaboration whose goals are set by the International Rice Genome Sequencing Working Group. Most of the world's major food crops (including rice) are grasses, and they share common sets of genes. The relatively small size of the rice genome—430 million base pairs of DNA divided into 12 chromosomes—makes it a model system for understanding the genomic sequences of other major grass crops including corn, wheat, rye, barley, sorghum, sugar cane, and millet. The working group estimates that researchers could complete the sequencing of the rice genome by 2008.

With rice and other plant sequencing efforts underway and with the completion of the Arabidopsis genome sequence tantalizingly close, plant researchers have begun to shift their focus from gene identification to functional genomics—a multidisciplinary approach to develop an understanding of the functions of the plant's genes and how they work together under different conditions. A systematic effort to effectively use the massive amounts of genome data becoming available to determine the functions of all of the genes of Arabidopsis is seen as the next frontier in plant research. Such an effort could be accomplished by 2010, according to a recent estimate, and would lead to an integrated database that would be a blueprint of Arabidopsis through its entire life cycle.

Research to understand the functions of Arabidopsis’s gene sequences is still in the early stages, but breakthroughs with considerable practical applications could come in the areas of:

DISEASE RESISTANCE. Plant breeders have long known that certain varieties of crops are more resistant than others to particular viral, bacterial, or fungal pathogens. Disease resistance is a major goal of most plant-breeding programs, but it has typically been a long process involving crop plants found in natural wild populations. The process has been impeded by the “species barrier,” which, until recently, prevented desirable genes from being passed around—from corn to cauliflower, for example. Arabidopsis researchers have determined the molecular sequences of genes that code for disease resistance and, in addition, the processes by which Arabidopsis and perhaps other plants marshal their defenses against pathogens. This discovery may be particularly useful in triggering resistance to disease in species other than Arabidopsis. In one enticing example, a bacterial pathogen of mammals was also discovered to be an Arabidopsis pathogen. Some of the same factors are required for infection, leading researchers to speculate that evolutionary susceptibility to disease may be accompanied by factors that confer resistance.

CIRCADIAN RHYTHMS. A vast array of processes in plants are regulated in a circadian manner, including daily leaf movements and pore openings, flower-blooming schedules, and photosynthesis cycles. The term “circadian” comes from the Latin words circa, meaning about, and diem, meaning day. It refers to processes that occur approximately once every twenty-four hours, in response to an organism’s internal clock. Plants gain a strong adaptive advantage by being able to anticipate oncoming dawn or dusk, rather than merely responding to the presence or absence of light. They display this ability even as

Page 82: America's Investment in the Future

Communication . . . Fused with the Ideas and Results of Others

NSF’s support of genetics dates back to the earliest days of the agency. One of NSF’s first five grants in the field of genetic biology, as it was originally called, was made in 1952 to Max Delbrück, who came to the United States from Germany in 1937. Trained as a quantum physicist, he gravitated to biology. While working at the California Institute of Technology and at Vanderbilt University, Delbrück organized and inspired a distinguished group of biologists. One member of the group was James Watson, who, along with Francis Crick and Maurice Wilkins, received the Nobel Prize in 1962 for discovering the structure of deoxyribonucleic acid, or DNA.

Delbrück’s contributions to the history of genetics were numerous and evolved from his early interest in bacteriophages, viruses that infect bacteria. A phage can attach itself to a bacterial cell, shuck off its own protein coat, and infiltrate the host cell the way the contents of a syringe enter a vein. Once the phage is inside a cell, its genetic material combines with that of the bacteria, and the phage reproduces itself exactly. These characteristics make phages ideal for the study of biological self-replication and the transfer of bacterial genes between host organisms. Through experiments with phages, Delbrück and a collaborator demonstrated, for the first time, that bacteria undergo mutation. Their work validated the revolutionary idea that genetic principles apply to microorganisms. It also opened the door to genetic analysis of recombination within bacteria.

Delbrück won the Nobel Prize in 1969, as “the man who transformed bacteriophage research from vague empiricism to an exact science.” In his acceptance speech, Delbrück remarked upon the ways in which one scientific discovery leads to another, and he contrasted progress in art with progress in science.

“The books of the great scientists are gathering dust on the shelves of learned libraries. And rightly so. The scientist addresses an infinitesimal audience of fellow composers. His message is not devoid of universality but its universality is disembodied and anonymous. While the artist’s communication is linked forever with its original form, that of the scientist is modified, amplified, fused with the ideas and results of others, and melts into the stream of knowledge and ideas which forms our culture.”

Page 83: America's Investment in the Future

The major multinational effort to understand the intricacies of Arabidopsis has already led to major breakthroughs in engineering disease-resistant plants.


days grow shorter in the fall or longer in the spring. By fusing Arabidopsis genetic material with bioluminescent material from fireflies, researchers have been able to observe a glowing pattern of response that reflects the plants’ internal clocks. This enabled them to find mutants with aberrant responses, which in turn led to identification of a biological clock gene named “toc.” Although influenced by sunlight, “toc” also operates independently, even when the plant is in constant darkness.

ENVIRONMENTAL RESPONSE. Plants respond to a great deal of information from the type of daylight they receive. For example, the changing light throughout the year provides clues about whether it is time to sprout or time to make seeds. When an object blocks the light, plants respond by growing around the object to reach the light. Much of our current information about how plants perceive and respond to light is derived from studies with Arabidopsis. These studies have identified the basic genetic framework of light perception and the complex communications system, called a signal transduction network, through which plants act upon information from their photoreceptors. There has also been significant progress in understanding how plants respond genetically when exposed to stresses in the environment, such as ozone, UV-irradiation, touch, cold, and oxygen deprivation.

PLANT HORMONE RESPONSE. Hormones play a central role in the regulation of plant growth and development. Of particular interest is the fruit-ripening hormone ethylene; growers have long searched for a way to minimize crop spoilage by preventing or delaying ripening in a reversible manner. Studies of Arabidopsis have demonstrated for the first time the mechanisms through which the tissues and cells of plants respond to ethylene. A gene that prevents response to the growth substance ethylene turns out to be comparable to a “never-ripe” gene in tomatoes, a finding that further supports Arabidopsis’s ability to serve as a model for other plants, including crop species.

COMMERCIAL APPLICATIONS. Genetic comparisons between Arabidopsis and crop species are increasing constantly. For example, even though the flowers of Arabidopsis are very different from those of snapdragons, the same genes control flower development in both. This discovery brings scientists closer to understanding and being able to manipulate the development of grains, fruits, and other flower products to one day create more productive crops. The genes that guide the synthesis of oils in Arabidopsis are closely related to those that produce oils in commercial oil crops, a relationship that is already being exploited commercially to produce plants with oils lower in polyunsaturated fats. Arabidopsis has also been the test organism for efforts to produce biodegradable plastics in crop plants. Several large chemical companies have started active research programs based on Arabidopsis research to develop transgenic crops that produce polyhydroxybutyrate (PHB), a biodegradable plastic.

Why Learn about Arabidopsis?
The Multinational Coordinated Arabidopsis thaliana Genome Research Project began as an effort by NSF and leading academic researchers to advance fundamental knowledge about how plants function. When NSF published the multinational committee’s long-range plan in 1990, U.S. government expenditures on Arabidopsis research totaled $7.5 million. In 1993, total expenditures on Arabidopsis research by NSF, USDA, DOE, and the National Institutes of Health (NIH) were $22 million. In 1998, NSF, USDA, and DOE awarded an additional $28.3 million over a three-year period to accelerate the pace of Arabidopsis research. The global effort to understand Arabidopsis involves scientists in more than thirty countries.

Page 84: America's Investment in the Future

Presciently summing up the major applications of biotechnology, a 1995 report from the National Science and Technology Council called Biotechnology for the 21st Century stated: “Through the use of advanced tools such as genetic engineering, biotechnology is expected to have a dramatic effect on the world economy over the next decade. Innovations emerging in the food and pharmaceutical sectors offer only a hint of the enormous potential of biotechnology to provide diverse new products, including disease-resistant plants, ‘natural’ pesticides, environmental remediation technologies, biodegradable plastics, novel therapeutic agents, and chemicals and enzymes that will reduce the cost and improve the efficiency of industrial processes. . . [B]iotechnology. . . may well play as pivotal a role in social and industrial advancement over the next ten to twenty years as did physics and chemistry in the post-World War II period.”


From the beginning, NSF saw the Arabidopsis effort as an opportunity to foster a collegial, highly motivated, scientific community that would advance fundamental knowledge in an effective way. Both within NSF and in other agencies, officials also recognized that the research has important practical applications. Despite the vast productivity of the agricultural sector, most crops grown in the United States produce less than 50 percent of their genetic potential. Plants succumb to disturbances in their environment; in some years, floods, drought, disease, and parasitic attacks cost billions of dollars. Unlike humans, plants cannot be moved to high ground or inoculated against illnesses. The only protection is to grow resistant strains, and many feel that conventional plant breeding cannot accomplish this fast enough. In the developing world in particular, the problem is exacerbated by growing populations that put extraordinary pressures on the ecosystem. Many see biotechnology as the only feasible solution.

Bioengineered plants also figure prominently in the ideal world envisioned by NIH and the health care community, who see potential in plants as a source of improved, less costly pharmaceuticals. The Department of Energy, for its part, envisions a future in which biotechnology improves the quality and quantity of biomass products, such as alternative fuels and chemical feedstocks, and provides a way to engineer plants to clean up contaminated soil at former nuclear weapons production sites.

To Learn More

NSF Directorate for Biological Sciences
www.nsf.gov/bio

The Multinational Coordinated Arabidopsis thaliana Genome Research Project Progress Reports (published by NSF)
www.nsf.gov:80/bio/reports.htm#progress

Gregor Mendel’s work in plant heredity
www.netspace.org/MendelWeb/

Arabidopsis Biological Resource Center at Ohio State University
http://aims.cps.msu.edu/aims/

Nottingham Arabidopsis Stock Centre
http://nasc.life.nott.ac.uk/

The Arabidopsis Information Resource (TAIR)
www.arabidopsis.org

National Institutes of Health
www.nih.gov/

National Science and Technology Council
www.whitehouse.gov/WH/EOP/OSTP/NSTC/html/NSTC_Home.html

Page 85: America's Investment in the Future

Decision Sciences
how the game is played

Page 86: America's Investment in the Future

Early in its existence, NSF started to support research on game theory—the study of individuals’ rational behavior in situations where their actions affect other individuals. Although the research had little practical value at the time, NSF continued to support it during the decades to follow, with substantial returns on the investment. Game theory and related areas of decision science supported by NSF have helped to solve practical problems once thought too complicated to analyze.

Page 87: America's Investment in the Future

Game theory deals with the interactions of small numbers of individuals, such as buyers and sellers. For almost thirty years after its development during World War II, the theory remained an academic exercise, its dense mathematical proofs defying practical applications. Yet NSF stood by leading economists who painstakingly demonstrated how to use game theory to identify winning strategies in virtually any competitive situation. NSF also supported experimental economists who tested theoretical approaches under controlled laboratory conditions, and psychologists whose studies of individual decision making extended understanding of how economically rational individuals behave.

Persistence paid off. In 1994, John F. Nash, Reinhard Selten, and John C. Harsanyi, who first received NSF support in the 1960s, won the Nobel Prize in Economics for “their pioneering analysis of equilibria in non-cooperative games.” The following year, game theory gave the Federal Communications Commission the logical structure for innovative auctions of the airwaves for new telecommunications services. The auctions raised over $7 billion for the U.S. Treasury, and marked a coming of age for this important analytic branch of economics.

In supporting this field, NSF’s goal was to build the power of economics to elucidate and predict events in the real world. The support not only advanced the discipline, but also benefited all individuals in many aspects of daily life.

Page 88: America's Investment in the Future


Decisions, Decisions
We all find ourselves in situations that call for strategic thinking. Business executives plan strategies to gain market share, to respond to their competitors’ actions, to handle relations with employees, and to make career moves. Managers in government think strategically about the likely effects of regulations at home and of diplomatic initiatives abroad. Generals at war develop strategies to deploy troops and weaponry to defeat the enemy while minimizing their own losses. At a more individual level, buyers and sellers at flea markets apply strategies to their bargaining. And parents use strategy on their children, who—of course—behave strategically with their parents.

What is strategy? Essentially, it is anticipating the actions of another individual and acting in ways that advance one’s self-interest. Since the other person also behaves strategically, strategy includes making assumptions about what that individual believes your strategy to be. We usually associate strategy with adversarial situations such as war, but that is much too narrow. In love, we use strategy, often unthinkingly, to win our loved one’s heart without sacrificing our self-esteem or our bank account. Some strategists have objectives, such as racial harmony, that they feel serve everyone’s interest. Probably the most common use of strategy occurs in basic economic transactions, such as buying and selling. However it is applied, the ultimate point of strategy is to achieve objectives.

That is precisely what players of games try to do. Similarities between games and strategic behavior in the economy formed the framework for Theory of Games and Economic Behavior, a book published in 1944 by mathematician John von Neumann and economist Oskar Morgenstern. Their landmark work begins where classical economics leaves off.

The starting point for traditional economics is the equilibrium price, the point at which a seller’s asking price equals the buyer’s bid price. Classical economic theory goes on to analyze the price in terms of outside influences. Von Neumann and Morgenstern, however, went in another direction: They looked at the relationship between the participants.

Exactly how, they asked, do buyers and sellers get to the equilibrium price? In a world of perfect competition, containing so many buyers and sellers that any one individual’s acts are insignificant, marketplace dynamics suffice. But what about economic transactions that involve only a few buyers and sellers? What happens when, for example, MCI offers potential customers a deal on long-distance service, and AT&T responds by offering the public its own new deal? The strategic moves in such economic decision making struck Morgenstern and von Neumann as mathematically indistinguishable from moves in chess, poker, and other games in which some strategies consistently win over others. Their book, a compendium of mathematical theorems embodying many different strategies for winning, was the first rigorously scientific approach to decision making.

Significant as it was, game theory took a long time to catch on. A small group of academics recognized its significance as a research tool. And some military applications appeared in the 1950s, when the Rand Corporation used game theory to anticipate responses of potential enemies to weaponry of various kinds. The world of business, however, regarded game theory as an arcane specialty with little practical potential.

Page 89: America's Investment in the Future


NSF Lends Support
NSF took a different view, and began to support game theory mathematics in the 1950s. “The appeal of game theory always was the beauty of the mathematics and the elegance of the theorems,” explains Daniel H. Newlon, senior program director for economics at NSF. “That was part of the appeal to a science agency.”

A few years later, when NSF began to fund research in the social sciences, leading scholars urged the agency to continue its support for game theory. John Harsanyi of the University of California at Berkeley began receiving NSF grants in the 1960s, as did other major game theorists. “What kept NSF interested in game theory,” says Newlon, “was the drive of people working in this field to understand how people interact, bargain, and make decisions, and to do it in a more rigorous, systematic fashion. For years, the problems were so difficult, given the state of computers and the mathematical tools at people’s disposal, that you didn’t see significant results. Yet NSF hung in there.”

NSF went beyond supporting individual game theorists. It also sponsored conferences that gave game theorists the opportunity to gain visibility for their work. One such event was the annual Stanford Institute Conference in Theoretical Economics—run by game theorist Mordecai Kurz—which NSF began to fund in the mid-1960s. Another NSF-funded meeting at the State University of New York had two goals: to use game theory to advance the frontiers of economic research and to improve the skills of graduate students and junior faculty in economics departments. From time to time, NSF invited proposals for workshops and awarded grants for computers and other needed equipment.

Into the Laboratory
Original work in game theory consisted entirely of models—simplified representations of the underlying logic in economic decision-making situations—which may have contributed to the business world’s reluctance to accept its usefulness. A theory in physics or biochemistry can be tested in a controlled laboratory situation. In real-world decision making, however, conditions are constantly altered as a result of changes in technology, government interventions, organizational restructuring, and other factors. The business world and most economists found it hard to see how reading Theory of Games and Economic Behavior could actually help them win games or make money.

In the early 1960s, Charles R. Plott and his colleagues at the California Institute of Technology started to make game theory into an experimental pursuit. Supported by NSF, his group conducted a series of experiments that helped to answer questions about one facet of game theory: the ideal number of stages in an auction and their overall length. Experimentation, which Plott referred to as “debugging,” became increasingly popular in economics as a complement to field research and theory.

The general idea was to study the operation of rules, such as auction rules, by creating a simple prototype of a process to be employed in a complex environment. To obtain reliable information about how test subjects would choose among various economic alternatives, researchers made the monetary rewards large enough to induce serious, purposeful behavior. Experiments with prototypes alerted planners to behavior that could cause a system to go awry. Having advance warning made it possible to change the rules, or the system for implementing the rules, while it was relatively inexpensive to do so.

Other economists refined and expanded game theory over the years to encompass more of the complex situations that exist in the real world. Finally, in the early 1980s, business schools and

Page 90: America's Investment in the Future

The Fruits of Economic Research Are Everywhere

Starting in the 1960s, with support from NSF, Charles R. Plott of the California Institute of Technology made advances in game theory that paved the way for practical applications three decades later. Here, he outlines the practical relevance of NSF-supported economics research:

“The fruits of economic research are everywhere. Because NSF is the only dedicated source of funding in the United States for basic research in the economic sciences, its impact has been large. We see it in the successful application of game theory to the design of the FCC auctions of licenses for new telecommunications services.

“More broadly, we see the impact of NSF-supported work in some of the most important economic trends of our lifetimes, such as deregulation of airlines and other industries, nongovernmental approaches to environmental protection, and the liberalization of worldwide trade. The recent reexamination of the Consumer Price Index and how it should be measured relies heavily on NSF-sponsored basic research on price indices.

“In economics it is easy to find problems that are not solved, and perhaps are not solvable in any scientific sense. Yet measured in a cost-benefit sense, the achievements of economic research stand against those of any science.”
—Charles R. Plott

Page 91: America's Investment in the Future


Ph.D. programs in economics began to appreciate the power of game theory. By the 1990s, it had all but revolutionized the training of economists and was a standard analytical tool in business schools. In 1994, game theory received the ultimate recognition with the award of Nobel prizes to Nash, Selten, and Harsanyi—three pioneering researchers in the field.

Practical Payoffs
From a financial standpoint, the big payoff for NSF’s long-standing support came in 1995. The Federal Communications Commission (FCC) established a system for using auctions to allocate bands of the electromagnetic spectrum for a new generation of wireless devices that included cellular phones, pagers, and hand-held computers with email capabilities.

Instead of the standard sealed-bid auction, Stanford University game theorist Paul Milgrom, an NSF grantee, recommended open bidding, which allows each bidder to see what the others are offering. Participants could also bid simultaneously on licenses in the fifty-one zones established by the FCC. Game theory’s models of move and countermove predicted that open bidding would reassure bidders who, in trying to avoid the so-called winner’s curse of overpaying, might be excessively cautious. Open bidding would also enable bidders to carry out economically advantageous strategies to consolidate holdings in adjacent territories, although FCC rules guaranteed that no one could obtain a monopoly in any zone. The intended outcome was an optimal solution for all parties. The bidders would get as many licenses as they were willing to pay for, while the U.S. Treasury would earn the maximum possible.
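The mechanics of such a simultaneous ascending auction can be sketched in a few lines of code. The example below is a toy model, not the FCC's actual rules: the license names, bidder values, fixed bid increment, and stopping condition are all invented, and real activity and eligibility requirements are omitted.

from dataclasses import dataclass

@dataclass
class Bidder:
    name: str
    values: dict  # license -> the most this bidder would pay

def run_auction(licenses, bidders, increment=1.0):
    """Toy simultaneous multiple-round auction: every license stays open
    until a full round passes with no new bids."""
    standing = {lic: (None, 0.0) for lic in licenses}   # license -> (current holder, price)
    while True:
        new_bid = False
        for bidder in bidders:
            for lic, (holder, price) in standing.items():
                next_price = price + increment
                # Raise the standing bid only if someone else holds the license
                # and it is still worth at least the next required price.
                if holder != bidder.name and bidder.values.get(lic, 0.0) >= next_price:
                    standing[lic] = (bidder.name, next_price)
                    new_bid = True
        if not new_bid:
            return standing

bidders = [Bidder("A", {"zone-1": 5.0, "zone-2": 3.0}),
           Bidder("B", {"zone-1": 4.0, "zone-2": 6.0})]
print(run_auction(["zone-1", "zone-2"], bidders))

Because every license remains open while any license still attracts bids, a bidder can watch prices across zones and shift money toward the combinations it values most, which is the behavior the open format was designed to encourage.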

In the final accounting, the FCC’s 1995 simultaneous multiple-round auctions raised over $7 billion, setting a new record for the sale of public property. Not only was the decision a landmark in recovering compensation for the private use of a public resource, it also represented a victory for the field of game theory, whose leading scholars had applied what they knew about strategic decision making in recommending an auction design to the FCC.

Game theory has proved its worth in many other practical areas, among them management planning. Alvin E. Roth of the University of Pittsburgh applied game theory to analyze and recommend matching mechanisms for allocating thousands of medical interns among hundreds of hospitals in such a way as to give both the hospitals and the interns the matches they favor the most. He had the broader research aim of understanding how market institutions evolved to determine the distribution of doctors and lawyers.
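Matching mechanisms of this kind are typically built on deferred-acceptance procedures. The sketch below is a minimal, intern-proposing version with invented names and one slot per hospital; the real medical match handles hospital capacities, couples, and many other constraints.

def match_interns(intern_prefs, hospital_prefs):
    """Intern-proposing deferred acceptance with one slot per hospital.
    Assumes every intern ranks every hospital."""
    rank = {h: {i: r for r, i in enumerate(prefs)} for h, prefs in hospital_prefs.items()}
    unmatched = list(intern_prefs)           # interns who still need a position
    next_choice = {i: 0 for i in intern_prefs}
    held = {}                                # hospital -> intern tentatively accepted
    while unmatched:
        intern = unmatched.pop()
        hospital = intern_prefs[intern][next_choice[intern]]
        next_choice[intern] += 1             # next time, this intern tries a lower choice
        current = held.get(hospital)
        if current is None:
            held[hospital] = intern
        elif rank[hospital][intern] < rank[hospital][current]:
            held[hospital] = intern          # hospital prefers the new applicant
            unmatched.append(current)
        else:
            unmatched.append(intern)         # proposal rejected
    return held

interns = {"ann": ["mercy", "city"], "bo": ["city", "mercy"]}
hospitals = {"mercy": ["bo", "ann"], "city": ["bo", "ann"]}
print(match_interns(interns, hospitals))     # {'city': 'bo', 'mercy': 'ann'}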

Polls, Markets, and Allocations
Game theory represents one important facet of decision science. In fact, decision science deals with the entire subject of markets—for goods, services, and ideas, as well as labor. NSF-funded researcher Bob Forsythe, at the University of Iowa, provides one example of efforts in this area: His innovative Iowa Electronic Market, started in 1988, offered speculators a “real-money futures market.” This type of market deals with abstract, but measurable, items. Participants bet on what the price of an abstract item, such as pork bellies, will be days, weeks, or months into the future. Between the time the bet is placed and the point at which it is paid off, the price of pork bellies will be influenced by a succession of economic and political events, including elections and the stock prices of firms in the pork industry.

Forsythe’s “market” actually represented an effort to elicit more accurate information from voters than opinion polls provided. Instead of pork bellies, it focused on the electoral prospects of political candidates. “Market prices” summed up what

Page 92: America's Investment in the Future


players knew, or thought they knew, about a candidate’s true chances of success in an election. Participants would win if the candidates on whom they bet were elected, and would lose if the candidates lost. Plainly, market participants could influence the result by their own votes; they would slightly improve the chances of the candidates they bet on by voting for those candidates, and slightly diminish those chances by voting for the opponents. Nevertheless, in its first ten years, the Iowa Electronic Market predicted election outcomes more accurately than did pollsters.
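The logic of such a winner-take-all contract fits in a few lines. The candidate names and prices below are invented, and the Iowa Electronic Market's actual contract designs are more varied, but the core idea is the same: a contract pays $1 if its candidate wins, so its trading price reads as a probability.

# Hypothetical quotes: prices of $1-if-elected contracts, read as probabilities.
prices = {"candidate_x": 0.62, "candidate_y": 0.38}

def payoff(holdings, winner):
    """Profit or loss on a portfolio of contracts once the election is decided."""
    total = 0.0
    for candidate, shares in holdings.items():
        cost = shares * prices[candidate]
        payout = shares * (1.0 if candidate == winner else 0.0)
        total += payout - cost
    return total

holdings = {"candidate_x": 100}                # buy 100 contracts at $0.62 each
print(payoff(holdings, winner="candidate_x"))  # gain of about $38 if x wins
print(payoff(holdings, winner="candidate_y"))  # loss of about $62 if x loses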

Recently, Forsythe and his colleagues received NSF funding to develop instructional materials that use the electronic market as a laboratory exercise to help undergraduates studying economics better understand market concepts.

In another innovative area, called “smart markets,” computers have become partners in the process of making allocation decisions, such as assignments of airport landing rights and management of gas pipelines and electric distribution systems. Computers process information, coordinate activities, and monitor allocation situations historically thought to be impossible to manage with anything other than a heavily bureaucratic administrative process. For example, the results of NSF-sponsored research have been used to shape a particular type of market that sets pollution limits and allows facilities that generate pollution to trade ‘pollution permits’ among themselves, so long as the overall limit is not exceeded.
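The permit-trading rule in that last example is simple enough to state directly. The sketch below is a toy ledger with made-up facilities and allocations: trades move permits from one facility to another, and because no trade creates new permits, the overall cap is preserved automatically.

CAP = 100                                    # total emission units allowed under the cap
permits = {"plant_a": 60, "plant_b": 40}     # hypothetical initial allocation

def trade(seller, buyer, amount):
    """Transfer permits between facilities; the total in circulation never changes."""
    if permits[seller] < amount:
        raise ValueError("seller does not hold enough permits")
    permits[seller] -= amount
    permits[buyer] += amount
    assert sum(permits.values()) == CAP      # the overall limit cannot be exceeded

trade("plant_a", "plant_b", 15)              # plant_b buys headroom from plant_a
print(permits)                               # {'plant_a': 45, 'plant_b': 55}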

Real-World Decision Making
These examples illustrate the common thread among the diverse projects of economists who build and test models: All of the projects are designed to explain more of what occurs in the real world.

Economic models work well when applied to markets and other institutions, in part because people gathered together in large numbers seem to behave as “rational” decision makers. But individual behaviors, and the behavior of small groups of individuals like the bidders in the FCC

Decision science has broad implications for all sectors of our society. It plays a role in understanding outcomes in financial markets and assessing the role of a centralized matching system in ensuring a stable supply of medical school graduates to hospital residency programs. Among the many questions that NSF-funded decision science researchers are attempting to answer: how people behave in economic environments, how information is distributed within economic institutions, and the influence of expectations and beliefs on decision making.

Page 93: America's Investment in the Future


auctions, often seem inconsistent with rationality. NSF’s Decision, Risk, and Management Science Program, within the Directorate for Social, Behavioral, and Economic Sciences, supports leading scholars in the decision sciences who look at these inconsistencies from another point of view. They try to determine the nature and origins of systematic errors in individual decision making, and use game theory to provide sets of strategies for anticipating and dealing with them.

Systematic errors abound in decisions that involve probability. Most people are not good at estimating the statistical likelihood of events, and their mistakes fall into distinct patterns. Reporting on these tendencies, Paul Slovic, Baruch Fischhoff, and Sarah Lichtenstein of Decision Research in Eugene, Oregon, wrote:

“People greatly overestimate the frequency of deaths from such dramatic, sensational causes as accidents, homicides, cancer, botulism, and tornadoes, and underestimate the frequency of death from unspectacular causes that claim one victim at a time and are common in nonfatal form—diabetes, stroke, tuberculosis, asthma, and emphysema . . . . The errors of estimation we found seemed to reflect the working of a mental shortcut or ‘heuristic’ that people commonly use when they judge the likelihood of risky events.”

The authors explain that people judge an event as likely or frequent if instances of it are easy to imagine or recall. On the other hand, individuals often don’t bother to consider information that is unavailable or incomplete. Every time we make decisions that involve probabilities, we confirm the reality of the phrase “out of sight, out of mind.”

Another area where humans are systematically error-prone involves what economists call utility. We frequently face choices between doing the safe thing and taking a risk. One example is the choice between driving to work on secondary roads or taking the interstate, which usually saves several minutes but can occasionally take an extra half-hour or more because of back-ups. Another is the decision between investing in a safe money market account and taking a flier on a volatile stock.

No two people feel exactly the same about which risks are worth taking. The concept of utility combines several factors in decision making: the range of possible outcomes for a particular choice, the probability associated with each outcome, and an individual’s subjective method of ranking the choices.

Imagine choosing between two tempting opportunities. One is a coin toss—heads you win $1,000, tails you win nothing. The other is a sure thing—an envelope with $500 inside. Do you choose the safe and sure $500 or the 50/50

“A bird in hand is worth two in the bush” describes one action a person might take to minimize risk and maximize utility—the real or perceived ability of a product or service to satisfy a need or desire. Utility theory attempts to define the many factors that influence how people make decisions and to predict how an individual will behave when faced with difficult choices.

Page 94: America's Investment in the Future

All in a Day’s Work
Imagine you are a cab driver. What you earn in a given day varies according to the weather, time of year, conventions in town, and other factors. As a rational person, you want to maximize both your income and your leisure time. To achieve that, you should work more hours when wages are high and fewer hours when wages are low.

What that means is that cab drivers should work more hours on busy days and fewer hours on slow days. Do they? Not at all; they do the opposite.

Colin Camerer of the California Institute of Technology made this discovery when he and his colleagues interviewed a large sample of cab drivers. They found that the cabbies decide how many hours to work by setting a target amount of money they want to make each day. When they reach their target, they stop working. So on busy days, they work fewer hours than on slow days.
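The target-income rule is easy to contrast with the textbook prescription in code. The numbers and functional forms below are invented for illustration, not estimates from Camerer's data: one function stops work at a fixed daily target, the other works longer when the effective hourly wage is higher.

def hours_target_rule(wage_per_hour, daily_target=200.0, max_hours=12.0):
    """Drive just long enough to hit the daily income target."""
    return min(max_hours, daily_target / wage_per_hour)

def hours_wage_responsive(wage_per_hour, base_hours=8.0, sensitivity=0.2):
    """Work somewhat more when the hourly wage is high, less when it is low."""
    return base_hours * (wage_per_hour / 25.0) ** sensitivity

for wage in (15.0, 25.0, 40.0):              # slow, average, and busy days
    print(f"${wage:>4.0f}/hr  target rule: {hours_target_rule(wage):4.1f} h   "
          f"wage-responsive: {hours_wage_responsive(wage):4.1f} h")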

Why? Camerer suggested that working an extra hour simply may not be worth an hour of leisure time; in the language of economics, the marginal utility is too low. On the other hand, it may not be the money as much as cab drivers’ feelings about the money—or, more precisely, how they think they may feel if they depart from their usual working habits. Will a cab driver who works an extra hour or two on a busy day feel later that it wasn’t worth the effort? Will one who knocks off early on a slow day feel guilty about it? Setting a target may be a way to avoid regrets.

NSF has supported Camerer and others in their efforts to explain this and other paradoxes that characterize human economic behavior. From the beginning, decision science research has had the goal of getting a better fix on people’s feelings about wages, leisure, and tradeoffs between them, with implications for labor relations, productivity, and competitiveness across a wide spectrum of industries.

Page 95: America's Investment in the Future

[Chart: the Kahneman-Tversky public health problem in its two framings. Framed as gains, option A saves 200 of 600 people for certain, while option B offers a 33% chance that all 600 live and a 66% chance that none do. Framed as losses, option A means 400 people die for certain, while option B offers a 33% chance that no one dies and a 66% chance that 600 die.]


chance to win $1,000? An economically rational person makes the choice that reflects the highest personal utility. The wealthier that individual is, for instance, and the more he or she likes to gamble, the higher is the utility of the risky 50/50 choice versus the sure thing.
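One standard way to make this concrete is to give the decision maker a concave utility of wealth and ask what sure amount feels equivalent to the gamble. The sketch below assumes logarithmic utility, which is only one convenient choice; under that assumption the 50/50 shot at $1,000 is worth less than the sure $500 at every wealth level, but its value creeps toward $500 as wealth grows.

import math

def certainty_equivalent(wealth, prize=1000.0, p=0.5):
    """Sure gain that a log-utility person values the same as a p chance at `prize`."""
    expected_utility = p * math.log(wealth + prize) + (1 - p) * math.log(wealth)
    return math.exp(expected_utility) - wealth

# How the 50/50 shot at $1,000 compares with the sure $500 at different wealth levels:
for wealth in (1_000, 10_000, 100_000):
    value = certainty_equivalent(wealth)
    print(f"current wealth ${wealth:>7,}: gamble valued at about ${value:,.0f}")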

Questioning Utility Theory
Utility theory postulates that it should not matter how alternatives are presented. Once we know what’s at stake, and the risks involved, we should have enough awareness of ourselves to make the choice that serves us best.

In fact, psychologists Daniel Kahneman of Princeton and the late Amos Tversky of Stanford demonstrated that the way alternatives are framed can make quite a difference in our choices. In one of their most famous studies, they presented people with a choice between two programs that addressed a public health threat to the lives of 600 people. When the outcomes of the programs were described as (a) saving 200 lives for sure, or (b) a one-third chance to save 600 lives and a two-thirds chance to save no one, most respondents preferred the first option. But when the outcomes were presented as (a) 400 people dying for sure, or (b) a two-thirds chance of 600 people dying and a one-third chance that no one would die, most respondents preferred the second option. Of course, the two versions of the problem are the same, because the people who will be saved in one version are the same people who will not die in the other. What happens here is that people are generally risk-averse in choices between sure gains and favorable gambles, and generally risk-seeking in choices between sure losses and unfavorable gambles. “Some propensities,” points out former NSF Program Director Jonathan Leland, “are so ingrained that the trick is to help people understand why their decisions are bad.” No one, it seems, is immune to the power of the well-chosen word.
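A quick calculation confirms that the two framings describe the same prospects. The snippet below simply restates the numbers from the study as given above and converts the loss framing back into survivors.

POPULATION = 600

# Gain framing: lives saved.
saved_sure = 200
saved_gamble = (1 / 3) * 600 + (2 / 3) * 0

# Loss framing: deaths, converted back into survivors.
survivors_sure = POPULATION - 400
survivors_gamble = POPULATION - ((1 / 3) * 0 + (2 / 3) * 600)

print(saved_sure, survivors_sure)        # 200 200 -> the "sure" options are identical
print(saved_gamble, survivors_gamble)    # 200.0 200.0 -> so are the gambles, in expectation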

NSF-funded researchers Daniel Kahneman and the late Amos Tversky were instrumental in the development of rational choice theory. Rational choice theory was first used to explain and predict human behavior in the market; its advocates believe that it helps integrate and explain the widest range of human behavior—including who people vote for, what they buy at the grocery store, and how they will react when faced with a difficult decision about medical treatment.

Page 96: America's Investment in the Future

Why We Make Foolish Decisions
Individuals frequently ensure poor decision making by failing to obtain even the most basic information necessary to make intelligent choices. Take, for example, the NSF-supported research of Howard Kunreuther of the University of Pennsylvania’s Wharton School. He and his colleagues observed that most people living in areas subject to such natural disasters as floods, earthquakes, and hurricanes take no steps to protect themselves. Not only do they not take precautions proven to be cost-effective, such as strapping down their water heaters or bolting their houses to foundations; they also neglect to buy insurance, even when the federal government provides substantial subsidies.

What accounts for such apparently foolish decision making? Financial constraints play a role. But Kunreuther found the main reason to be a belief that the disaster “will not happen here.” His research suggested “that people refuse to attend to or worry about events whose probability is below some threshold.” The expected utility model, he added, “is an inadequate description of the choice process regarding insurance purchases.”

Kunreuther next applied decision science to devising an alternative hypothesis for the behavior. First, he posited, individuals must perceive that a hazard poses a problem for them. Then they search for ways, including the purchase of insurance, to mitigate future losses. Finally, they decide whether to buy coverage. They usually base that decision on simple criteria, such as whether they know anyone with coverage. The research showed that, since people do not base their purchasing decisions on a cost-benefit analysis, premium subsidies alone did not provide the necessary impetus to persuade individuals to buy flood insurance.
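Kunreuther's sequential hypothesis can be written out as a short decision procedure. The threshold and the social cue below are invented placeholders rather than estimates from his studies; the point is only that the rule never consults a cost-benefit calculation, which is why subsidies alone change little.

def buys_insurance(perceived_probability, knows_policyholder, attention_threshold=0.05):
    """Stylized version of the sequential choice process described above."""
    # Step 1: hazards whose perceived probability falls below the threshold are ignored.
    if perceived_probability < attention_threshold:
        return False
    # Step 2: the homeowner searches for ways to mitigate losses (insurance is one).
    # Step 3: the purchase decision rests on a simple cue, not a cost-benefit analysis.
    return knows_policyholder

print(buys_insurance(0.01, knows_policyholder=True))    # False: "it won't happen here"
print(buys_insurance(0.10, knows_policyholder=False))   # False: no neighbor with a policy
print(buys_insurance(0.10, knows_policyholder=True))    # True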

Decision making within organizations is also riddled with systematic bias. One example is the familiar phenomenon of throwing good money after bad. Corporations frequently become trapped in a situation where, instead of abandoning a failing project, they continue to invest money and/or emotion in it, at the expense of alternative projects with higher expected payoffs. With NSF support, M.H. Bazerman of Northwestern University documented these stubborn tendencies in a variety of settings, and proposed corrective measures that organizations can take to counteract them.

Just as it supported game theory from the very early stages, NSF has funded research on the application of psychology to economic decision making from the field’s infancy. That support yielded even faster dividends: Within a few years, the research had given rise to popular books advising managers and others on how to correct for error-prone tendencies and make better decisions. “We know that people bargain and interact, that information is imperfect, that there are coordination problems,” NSF’s Daniel Newlon explains. “NSF’s long-term agenda is to understand these things. Even if they’re too difficult to understand at a given time, you keep plugging away. That’s science.”

To Learn More

NSF Division of Social and Economic Sciences, Economics Program
www.nsf.gov/sbe/ses/econ/start.htm

NSF Decision, Risk, and Management Sciences Program
www.nsf.gov/sbe/ses/drms/start.htm

Consumer Price Index
http://stats.bls.gov/cpihome.htm

Decision Research
www.decisionresearch.org

Iowa Electronic Market
www.biz.uiowa.edu/iem/index.html

The Nobel Foundation, Nobel Prize in Economic Sciences
www.nobel.se/economics/laureates/1994/index.html

Stanford Encyclopedia of Philosophy, Game Theory
http://plato.stanford.edu/entries/game-theory/

Stanford Institute for Theoretical Economics
www.stanford.edu/group/SITE/siteprog.html

Page 97: America's Investment in the Future

Visualization
a way to see the unseen

Page 98: America's Investment in the Future

Computers help us answer questions about matters ranging from the future of the world’s climate to the workings of the body’s cells. Computer output, however, can seem to be little more than mounds of dry data, completely disconnected from the dynamics of the original event. To better connect the answers to the questions, we have come to use visualization techniques such as computer graphics, animation, and virtual reality—all pioneered with NSF support.

Page 99: America's Investment in the Future

Scientists in many disciplines use sophisticated computer techniques to model complex events and visualize phenomena that cannot be observed directly, such as weather patterns, medical conditions, and mathematical relationships. Virtual reality laboratories give scientists the opportunity to immerse themselves in three-dimensional simulations, often with spectacular results such as those in the acclaimed IMAX film Cosmic Voyage. The field of computer visualization has developed rapidly over the past twenty-five years, largely because of NSF support. NSF has encouraged pioneering individual research, the development of supercomputing centers, wider-range applications being explored at science and technology centers, and far-reaching programs such as the Partnerships for Advanced Computational Infrastructure. NSF’s commitment has helped computer visualization grow from its infancy in the 1950s and 1960s to the important scientific and commercial field it is today.


Visualizing Science in Action

A surgeon can repair a human heart, but like all living organs, the heart presents a host of problems to scientists who want to understand it in detail. X-rays, probes, and scans show only a partial picture—a snapshot—while often what is needed is a motion picture of how all the parts interact. In 1993, scientists at New York University came up with a solution. Working at the NSF-funded Pittsburgh Supercomputing Center, they created the first three-dimensional, animated model of a beating heart.

That first successful “heartbeat” required nearly a week of computing time and represented fifteen years of work by mathematicians Charles Peskin and David McQueen. Subsequently, the work has had broad influence in studies of biological and anatomical fluid flow.

Other simulations demonstrate the dynamics of much larger events, such as tornadoes. Scientists at the University of Illinois have traced air motion within and around tornadoes by introducing thousands of weightless particles into the flow. With that information, and with the computing power at the NSF-supported National Center for Supercomputing Applications (NCSA), also at the University of Illinois, they created a model that provides a closer look at updrafts, downdrafts, and strong horizontal changes in wind speed.

Robert Wilhelmson, an atmospheric computer scientist, began modeling storms almost thirty years ago in hopes of better predicting severe occurrences. One of the founders of NCSA, he wanted from the beginning to model how storms evolve. Wilhelmson and his research group have pushed storm visualizations from static two-dimensional images to three-dimensional animations at ever-greater resolution and over longer time spans.

All the sciences use the visual arts in some form or another, depicting everything from molecules to galaxies. Engineering, too, relies on detailed renderings. Over the years, NSF has funded the development of computer visualizations in many fields, while at the same time challenging computer specialists to go back to the basics and learn how to make these visualizations more accurate and more useful as scientific predictors.

Ensuring this accuracy, according to Cornell University’s Don Greenberg, a long-time NSF grantee, entails making sure that computer-generated visualizations obey the laws of physics. Researchers at Cornell, one of five institutions in the NSF Science and Technology Center for Computer Graphics and Visualization, address this issue by developing ways to incorporate the physics of how light behaves and how our eyes perceive it. Their renderings look like architectural photos—studies of light and space. And their research has been used by General Electric Aircraft Engines, Battelle Avionics, Eastman Kodak, and others.

Moving computer visualizations from what is acceptable to what is most useful from a scientific standpoint has taken a lot of work, says Greenberg. “It’s easy to make visualizations believable; Jurassic Park and Star Wars did a fine job of that,” he says. “But they weren’t very accurate. It’s much harder to make them accurate.”

Mathematicians Charles Peskin and David McQueen laid the groundwork for visualizations that will allow physicians to better understand the inner workings of the human heart. The model may lead to more effective diagnosis and treatment of heart defects and diseases.


Worth at Least a Thousand Data Points

The increased ability of scientists and engineers to model complex events is a direct result of NSF’s investment in supercomputing centers. These university-based research facilities, started in the 1980s, gave researchers around the country access to the computational power they needed to tackle important—and difficult—problems. Visualization, while not an explicit goal of the supercomputing centers, quickly emerged as a way to cope with the massive amounts of scientific data that had been pouring out of computers since the 1960s. “We became very good at flipping through stacks of computer printouts,” recalls Richard Hirsh, a specialist in fluid dynamics who is now NSF’s deputy division director for Advanced Computational Infrastructure and Research. “But we realized that, at some point, people needed to see their solutions in order to make sense of them.”

Humans are adept at recognizing patterns, Hirsh says, especially patterns involving motion. One of the early visualization success stories was a model of smog spreading over Southern California, a model so informative and realistic that it helped to influence antipollution legislation in the state. As the cost of computer memory dropped and computer scientists began finding more applications for visualization techniques, the scientific community began to take notice.

The NSF Panel on Graphics, Image Processing, and Workstations published its landmark report Visualization in Scientific Computing in 1987. “ViSC [visualization in scientific computing] is emerging as a major computer-based field,” the panel wrote. “As a tool for applying computers to science, it offers a way to see the unseen . . . [it] promises radical improvements in the human/computer interface.” The NSF report was accompanied by two hours of videotape demonstrating the potential of the new tool.

“Before the publication of the report, the opinions and observations of many well-known and respected computer graphics experts were of little concern to the scientific and computing establishments,” recalls Tom DeFanti, director of the Electronic Visualization Laboratory at the University of Illinois at Chicago and co-editor of the ViSC report. Today, he says, “their comments are sought after—to educate the public, to influence industry research, and to identify new scientific markets.”

NSF earmarked funds for visualization at the supercomputing centers from 1990 to 1994. During that time, application of visualization techniques spread. Since 1997, NSF’s Partnerships for Advanced Computational Infrastructure (PACI) program has stimulated further advances in areas ranging from sophisticated tools for managing, analyzing, and interacting with very large data sets to collaborative visualization tools to enable researchers from far-flung areas to work interactively on a real-time basis. Applications now span the whole of contemporary science. For example:

• Molecular biologists use modeling to depict molecular interaction.

• Astronomers visualize objects that are so far away they cannot be seen clearly with most instruments.

• Medical researchers use computer visualization in many diagnostic techniques, including the Magnetic Resonance Imaging (MRI) system that produces three-dimensional images of the body.

Art and Science: An Alternative to Numbers

DeFanti and his colleague Maxine Brown summarized reasons for the booming popularity of visualization in Advances in Computers (1991):

“Much of modern science can no longer be communicated in print; DNA sequences, molecular models, medical imaging scans, brain maps, simulated flights through a terrain, simulations of fluid flow, and so on all need to be expressed and taught visually . . . . Scientists need an alternative to numbers. A technical reality today and a cognitive imperative tomorrow is the use of images. The ability of scientists to visualize complex computations and simulations is absolutely essential to ensure the integrity of analyses, to provoke insights, and to communicate those insights with others.”


Computer Graphics: A Competitive Edge

“Advances in computer graphics have transformed how we use computers . . . While everyone is familiar with the mouse, multiple ‘windows’ on computer screens, and stunningly realistic images of everything from animated logos in television advertisements to NASA animations of spacecraft flying past Saturn, few people realize that these innovations were spawned by federally sponsored university research.

“[For example] [h]ypertext and hypermedia have their roots in Vannevar Bush’s famous 1945 Atlantic Monthly article ‘As We May Think.’ Bush described how documents might be interlinked in the fashion of human associative memory. These ideas inspired Doug Engelbart at SRI (funded by DARPA) and Andries van Dam of Brown University (funded by NSF) to develop the first hypertext systems in the 1960s. These systems were the forerunners of today’s word-processing programs, including simple what-you-see-is-what-you-get capabilities . . .

“High-quality rendering has caught the public’s eye and is having a vast impact on the entertainment and advertising industries. From Jurassic Park to simulator rides at Disney World and dancing soda cans in TV commercials, the world has been seduced by computer animation, special effects, and photorealistic imagery of virtual environments . . .

“One could continue with many more examples, but the message is clear: federal sponsorship of university research in computer graphics stimulated a major segment of the computing industry, allowing the United States to establish and maintain a competitive edge.”

—Excerpted from Computer Graphics: Ideas and People from America’s Universities Fuel a Multi-billion Dollar Industry by Edward R. McCracken, former chairman and Chief Executive Officer, Silicon Graphics, Inc. © 1995-1997.



Over the years, two basic types of drawing systems have vied for the attention of both developers and users—vector graphics and raster graphics. Vector graphics systems are based on specifying the location of points on an X and Y coordinate system and connecting the points with lines. The basic drawing element of vector graphics is the line, created by an electron beam in the monitor as it moves directly from one set of coordinates to another, lighting up all the points in between. By contrast, the electron beam in the monitor of a raster graphics system scans across the screen, turning on specific picture elements (which came to be called pixels) in a predefined grid format.
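The difference is easy to see in a few lines of code. The sketch below is purely illustrative (the function names and the tiny 16-by-16 grid are invented for this example): a vector representation stores a line as a pair of endpoints, while a raster representation walks the same line and turns on individual pixels in a fixed grid, which is also where jagged edges come from.

```python
# Illustrative sketch: the same line held as vector data versus raster pixels.

def vector_line(x0, y0, x1, y1):
    # A vector system needs only the endpoints; the beam sweeps between them.
    return {"type": "line", "from": (x0, y0), "to": (x1, y1)}

def raster_line(x0, y0, x1, y1, width=16, height=16):
    # A raster system scans a predefined grid; here a simple DDA walk marks
    # each pixel the line passes through.
    grid = [[0] * width for _ in range(height)]
    steps = max(abs(x1 - x0), abs(y1 - y0), 1)
    for i in range(steps + 1):
        x = round(x0 + (x1 - x0) * i / steps)
        y = round(y0 + (y1 - y0) * i / steps)
        grid[y][x] = 1  # turn on one picture element (pixel)
    return grid

if __name__ == "__main__":
    print(vector_line(0, 0, 12, 5))        # two coordinate pairs describe the whole line
    for row in raster_line(0, 0, 12, 5):   # the same line as lit pixels, jaggies and all
        print("".join(".#"[p] for p in row))
```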

While the precision of vector graphics was well suited to mechanical drawing, computer-aided design and manufacturing, and architectural computer graphics, raster graphics opened up possibilities in other areas and brought many more types of people into the world of computer graphics. It was perhaps the use of raster graphics in television advertising, including titles for network specials, that brought the public’s attention to the potential of computer graphics. The low resolution of the television screen and the short viewing time—measured in seconds—called for relatively few calculations and was therefore less expensive in terms of power, speed, and memory. However, there was initial disappointment in the precision of raster graphics; the disappointment was largely offset with anti-aliasing techniques that minimized the disturbing effect of jagged lines and stair-stepped edges. Compression in the number of pixels in the predefined grid also improved the quality of raster images.

But high resolution has a price. A typical full-color computer screen with 1,000 rows and 1,000 columns of pixels requires 24 million bits of memory. That number multiplied by at least 60 is the amount of memory required for the rapid-fire sequencing of frames in a smooth, professional-looking animation. While not as costly as it once was, animation remains an exercise in allocating the supercomputers’ massive resources to achieve the most effective results.
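The arithmetic behind that price is easy to restate. The short sketch below simply recomputes the figures quoted above (1,000 x 1,000 pixels at 24 bits per pixel, and roughly 60 frames for one second of smooth animation); it does not describe any particular machine.

```python
# Back-of-the-envelope framebuffer arithmetic for the figures quoted above.

ROWS, COLS = 1_000, 1_000   # a typical full-color screen: 1,000 rows x 1,000 columns
BITS_PER_PIXEL = 24         # 8 bits each for red, green, and blue
FRAMES = 60                 # rapid-fire frames for about one second of smooth animation

frame_bits = ROWS * COLS * BITS_PER_PIXEL   # 24,000,000 bits for a single frame
animation_bits = frame_bits * FRAMES        # 1,440,000,000 bits for the sequence

print(f"one frame:  {frame_bits:,} bits (about {frame_bits // 8 // 10**6} megabytes)")
print(f"one second: {animation_bits:,} bits (about {animation_bits // 8 // 10**6} megabytes)")
```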

Today, scientific visualization embodies the results that NSF hoped to achieve in funding the supercomputing centers: to find answers to important scientific questions while advancing both the science of computing and the art of using computer resources economically.

In 1992, the four supercomputer centers then supported by NSF (National Center for Supercomputing Applications in Chicago and Urbana-Champaign, Illinois; Pittsburgh Supercomputing Center; Cornell Theory Center; and San Diego Supercomputer Center) formed a collaboration based on the concept of a national MetaCenter for computational science and engineering. The center was envisioned as a growing collection of intellectual and physical resources unlimited by geographical or institutional constraints.

In 1994, the scientific computing division of the National Center for Atmospheric Research in Boulder, Colorado, joined the MetaCenter. The five partners, working with companies of all sizes, sought to speed commercialization of technology developed at the supercomputer centers, including visualization routines.

Through computer mapping of topographical surfaces, mathematicians can test theories of how materials will change when stressed. The imaging is part of the work at the NSF-funded Electronic Visualization Laboratory at the University of Illinois at Chicago.


An early success was Sculpt, a molecular modeling system developed at the San Diego Supercomputer Center. It earned a place on the cover of Science magazine and has now been commercialized by a start-up company.

The concept of a national, high-end computational infrastructure for the U.S. science and engineering community has been greatly expanded since 1997, when the National Science Board, NSF’s governing body, announced PACI as successor to the NSF supercomputing program. PACI supports two partnerships: the National Computational Science Alliance (“the Alliance”) and the National Partnership for Advanced Computational Infrastructure (NPACI). Each partnership consists of a leading edge site—for the Alliance it is the National Center for Supercomputing Applications in Urbana-Champaign, while the San Diego Supercomputer Center is the leading edge site for NPACI—and a large number of other partners. More than sixty institutions from twenty-seven states and the District of Columbia belong to one or both of the partnerships. With access to an interconnected grid of high-performance computing resources, many researchers at these participating institutions are developing state-of-the-art visualization tools and techniques to address multidisciplinary challenges that range from creating roadmaps of the structures and connections within the human brain to producing astronomically accurate, high-resolution animations of distant galaxies.

Staking the Pioneers: The 1960s to the 1990s

The richness of computer visualization today can be traced back to pioneering work, such as Ivan Sutherland’s landmark doctoral dissertation from the early 1960s. As an NSF-supported graduate student at the Massachusetts Institute of Technology (MIT), Sutherland developed a real-time line-drawing system that allowed a person to interact with the computer using a prototype light pen. While the research itself was supported in terms of both funds and computing resources by the Air Force through the MIT Lincoln Laboratory, the NSF fellowship helped make this graduate study possible. Sutherland credits NSF for the support it provided: “I feel good about NSF taking well-deserved credit for supporting my graduate education. Having independent NSF support was crucial to my ability to transfer to MIT from Caltech. MIT had seemed eager to have me in 1959, but was all the more willing to admit me in 1960 as a post-master’s student because I brought NSF support with me.”

Sutherland’s Sketchpad introduced new concepts such as dynamic graphics, visual simulation, and pen tracking in a virtually unlimited coordinate system. The first computer drawing system, DAC-1 (Design Augmented by Computers), had been created in 1959 by General Motors and IBM. With it, the user could input various definitions of the three-dimensional characteristics of an automobile and view the computer-generated model from several perspectives. DAC-1 was unveiled publicly at the 1964 Joint Computer Conference, the same forum Sutherland had used in 1963 to unveil Sketchpad, which had the distinguishing feature of enabling the user to create a design interactively, right on the screen. His achievement was so significant that it took close to a decade for the field to realize all of its contributions.

The NSF-funded National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign has long been a source of innovation in the field of visualization. One product of NCSA-based research is Virtual Director, created by Robert Patterson and Donna Cox. This application provides an easy-to-use method to control camera action for playback or animation recording.



A Panoply of Applications

Computer graphics has entered just about every aspect of modern life. The information age has become an age of images—a new hieroglyphics containing more content and promising greater understanding than ever before. Among the key areas of modern business, industrial, and academic activity that have been revolutionized in the last forty years are those listed here.

ARCHITECTURE & ENGINEERING Building design, space planning, real estate analyses, interior architecture and design, construction management, cost-estimating integrated with design and drafting, procurement, facilities management, furniture and equipment management, and needs forecasting.

BIOMEDICAL APPLICATIONS Surgical and radiation therapy planning, diagnostic aids, prostheses manufacturing, studies of morphology and physiology, molecular modeling, computerized tomography (CT scans), nuclear magnetic resonance (NMR, MRI), and teaching of surgical techniques.

BUSINESS & MANAGEMENT GRAPHICS Decision-making systems, graphical data displays, presentation graphics systems, visual information systems, C3I (command, control, communication, and information systems), financial graphics systems, and business and scientific charts and graphs.

EDUCATION & LEARNING Techniques for developing visual thinking skills and creative abilities in both children and adults; science and mathematics instruction; architecture, engineering, and design instruction; arts instruction; and development of the electronic classroom based on research findings on the cognitive, motivational, and pedagogic effects of computer graphics.

ELECTRIC CAD/CAM Printed wiring board and integrated circuit design, symbol construction, schematic generation, knowledge-based systems in electronic design and simulation, advanced systems for chip and circuit design, circuit analysis, logic simulation, electronic fabrication and assembly, and test set design.

HUMAN FACTORS & USER INTERFACES Advances in the visual presentation of information; graphical software development tools and visible language programming; improvements in screen layout, windows, icons, typography, and animation; and alternative input devices, iconographic menus, improvements in color graphics displays, and graphical user interfaces (GUIs).

MANUFACTURING Computer-aided design (CAD), manufacturing (CAM), and engineering (CAE); computer-integrated manufacturing (CIM), numerical control (NC) in CAD/CAM, robotics, and managing the flow of manufacturing information from design to field services; manufactured parts, buildings, and structures; and integrating CIM with CAD.

MAPPING & CARTOGRAPHY Geographic information systems (GIS) and graphical databases; computer-assisted cartography; engineering mapping applications in transportation and utility fields; computer-assisted map analysis; 3-D mapping techniques; management systems for industrial, office, and utility sites and cross-country facilities such as transmission lines; military and civilian government facilities management; natural and man-made resource mapping; and land planning, land development, and transportation engineering.

PATTERN RECOGNITION & IMAGE PROCESSING Feature selection and extraction; scene matching; video inspection; cartographic identifications; radar-to-optical scene matching; industrial and defense applications; analysis and problem solving in medicine, geology, robotics, and manufacturing; computer vision for automated inspection; and image restoration and enhancement.

PRINTING & PUBLISHING Integration of text and graphics in printed documents, technical documentation created from engineering drawings, technical publishing systems, online documentation systems, page layout software, scanning systems, direct-to-plate printing capabilities, and computer-assisted design systems for publication design and layout.

STATISTICAL GRAPHICS Graphical techniques for rendering large masses of data to increase understanding, graphical techniques for data analysis, graphical display techniques, graphical interaction techniques, and multivariable data analysis.

VIDEO & MULTIMEDIA TECHNOLOGY High-definition television; computer-generated video for entertainment and educational applications; electronic video-conferencing; broadcast applications for news, weather, and sports; and CD-ROM and Web graphics.

VISUAL ARTS & DESIGN Computer graphics applications for graphic design, industrial design, advertising, and interior design; standards based on design principles relating to color, proportion, placement, and orientation of visual elements; image manipulation and distortion for special effects; systems for computer artists involved in drawing, painting, environmental installations, performance, and interactive multi-image systems; computer animation in film, television, advertising, entertainment, education, and research; and digital design of typography.


Computer graphics was still too obscure a field to be a cover story in 1972 when Bernard Chern, who later retired as director of NSF’s Division of Microelectronic Information Processing Systems, began a program to support the development of computer systems for representing objects in three dimensions. Chern assembled a stable of grantees, including many of the country’s leading researchers in automation and modeling.

Among them was Herbert Voelcker, who recalls the state of the technology when he launched the computer modeling program at the University of Rochester: “Major advances in mechanical computer-assisted design were not possible because there were no mathematical and computational means for describing mechanical parts unambiguously . . . There were no accepted scientific foundations, almost no literature, and no acknowledged community of scholars and researchers . . . These early explorations were unsettling, but also challenging because they led us to try to build foundations for an emerging field.”

Voelcker and his team were among the pioneers in computer-assisted design (CAD), which, for most of its history, had relied primarily on wireframe systems. Mimicking manual drafting, these computer programs build either two- or three-dimensional models of objects based on data supplied by users. While useful, the programs frequently result in ambiguous renderings—a rectangle might represent either a flat side or an open space—and are fully capable of producing images that resemble the drawings of M.C. Escher, where continuous edges are a physical impossibility. Solid modeling, on the other hand, is based on the principles of solid geometry and uses unambiguous representations of solids.
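The core idea behind solid modeling can be sketched in a few lines. The example below is only an illustration of the general approach (it is not PADL, and the function names are invented): a solid is represented by an unambiguous point-membership test, and solids are combined with the set operations of constructive solid geometry, so there is no doubt about what is inside and what is outside.

```python
# Illustrative sketch of solid modeling as point-membership tests.

def box(xmin, xmax, ymin, ymax, zmin, zmax):
    return lambda x, y, z: xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax

def cylinder_z(cx, cy, radius, zmin, zmax):
    return lambda x, y, z: (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2 and zmin <= z <= zmax

def union(a, b):
    return lambda x, y, z: a(x, y, z) or b(x, y, z)

def difference(a, b):
    return lambda x, y, z: a(x, y, z) and not b(x, y, z)

# A plate with a drilled hole: unlike a wireframe, the model states unambiguously
# which points are solid material and which are empty space.
plate = difference(box(0, 10, 0, 10, 0, 1), cylinder_z(5, 5, 2, 0, 1))
print(plate(1, 1, 0.5))   # True: inside the plate
print(plate(5, 5, 0.5))   # False: inside the drilled hole
```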

In 1976, Voelcker’s group unveiled one of the earliest prototype systems, called PADL, for Part and Assembly Description Language. For the next two decades, PADL and other solid modeling systems were constrained by heavy computational requirements, but as faster computers have come into their own, PADL descendants are now displacing wireframe modeling and drafting in the mechanical industries.

NSF-funded researchers at the University of Utah are taking computer drafting techniques even further, all the way to what is known as “from art to part.” That is, they are creating a system that generates a finished metal product from a sketch of a mechanical object, bypassing the prototyping stage altogether.

Visualization: Back to the Future

By 1991, the field of computer visualization was exploding. “The field had gotten so big, with so many specialties, that no one could know it all. No single research lab could do it all. Graphics hadn’t just become broad—it was increasingly interdisciplinary,” explains Andries van Dam of Brown University. Van Dam is the current director of NSF’s Science and Technology Center for Computer Graphics and Scientific Visualization, which was established both to help deal with the interdisciplinary needs of the scientists and to expand the basics of computer graphics.

The center is a consortium of research groups from five institutions—Brown University, the California Institute of Technology (Caltech), Cornell University, the University of North Carolina at Chapel Hill, and the University of Utah—all of which have a history of cutting-edge research in computer graphics and visualization.

In addition to collaborating, each university focuses on a different part of graphics and visualization research, studying such fields as novel user interfaces, hardware design for visualization and graphics, the physics of the interaction of light with its environment, and geometric modeling for mechanical design.



Computer Graphics: Into the Marketplace

The advances that built on Ivan Sutherland’s ground-breaking Sketchpad work at MIT would bring computer graphics out of the laboratory, off the military base, and into the commercial marketplace, creating a steadily growing demand for computer-generated images in a variety of fields.

Continuing technical developments and the widespread commercial adoption of the personal computer—both the IBM PC and the Apple computer—helped spur a demand so strong that computer graphics ceased to be an add-on to a computer’s capability and became an integral feature of the computer itself. Today, entire generations are growing up with an exclusively graphics-based experience of computing.

Some of the advancing techniques along this route to the future were:

WIREFRAME DRAWING PROGRAMS AND BOUNDARY REPRESENTATION SYSTEMS depict structures in three dimensions showing all the outlines simultaneously from various perspectives. These programs can recognize surfaces, erase hidden lines, and add shading.

SOLID MODELING SYSTEMS define the interiors, edges, and surfaces of an object.

CONSTRUCTIVE SOLID GEOMETRY SYSTEMS provide a library of preformed shapes that can be combined in additive and subtractive ways to create solid objects.

FRACTAL GEOMETRY uses self-similar forms—where the structure of a small section resembles the structure of the whole—to geometrically simulate the intricacies of nature, such as patterns in tree bark, cracks in the mud of a dry riverbed, or the edges of leaves.

RAY-TRACING ALGORITHMS simulate the effect of light rays bouncing around a scene—illuminating objects, creating reflections, and defining areas of shadow. Ray-tracing often produces strikingly realistic images.

IMAGE MAPPING, also known as texture mapping, is a technique for wrapping two-dimensional patterns and images around three-dimensional models.

SPATIAL TEXTURING uses an automatically created three-dimensional pattern that is defined for a three-dimensional volume rather than a two-dimensional plane. With spatial texturing, also known as solid textures, you can cut a model of a block of wood in half and see the wood grain inside.

ELECTRONIC PAINT SYSTEMS include tools that imitate the use of brush, oil, and canvas and provide a menu of choices for type of paint brush, color hue and intensity, and type of stroke.

IMAGE PROCESSING PROGRAMS enable users to edit and manipulate photographs and other images to create different effects.

ANIMATION introduces the dimension of time and creates the illusion of motion.

VIRTUAL REALITY SYSTEMS create the illusion of real three-dimensional space through the use of three-dimensional graphics and head or body tracking that changes the view when a user moves.


Another of the center’s focuses, explains van Dam, is tele-collaboration. “We are building tools that will make it seem like you’re looking through a glass window and seeing your colleagues in the next room working on objects you’re designing. We want to create an immersive environment. In my lifetime it won’t be quite real, but it will be close enough.”

Visualizing a Virtual Reality

While van Dam and his colleagues are moving people into a virtual design shop, other researchers outside the center are creating virtual realities—computer-driven worlds where everything is interconnected, allowing exploration on a level so extraordinary it approaches science fiction.

In previous studies of the Chesapeake Bay, scientists had to measure the wind, current, salinity, temperature, and fish populations separately. But with a virtual reality model, all the elements come together. Glen Wheless, a physical oceanographer at Old Dominion University, worked with William Sherman, a computer scientist at the National Center for Supercomputing Applications, to create a dynamic model of the Atlantic Ocean’s saline waters converging with fresh water from more than 150 creeks and rivers that flow into the bay.

The model has given scientists new insights into the ways in which fish larvae are transported around the estuary; scientists are learning, for example, that they had previously underestimated the influence of wind, tides, and runoff.

The Chesapeake Bay virtual reality model is different from a computer animation in that it is interactive. Researchers can continually update the data and re-run the model. Computer animations, for all their explanatory power, cannot accommodate this demand; once completed, they are not easily changed.

Virtual environments are presented to the viewer through wide-field displays. Sensors track the viewer’s movements through the data and update the sights and sounds accordingly. The result is a powerful mechanism for gaining insight into large, multidimensional phenomena. The Chesapeake Bay simulation was designed in one of the country’s leading virtual environments for science, CAVE, which was pioneered with NSF support by the Electronic Visualization Lab at the University of Illinois at Chicago. CAVE is an acronym for Cave Automatic Virtual Environment, as well as a reference to “The Simile of the Cave” in Plato’s Republic, which explores the ideas of perception, reality, and illusion through reference to a person facing the back of a cave where shadows are the only basis for understanding real objects.

CAVE is a darkened cubicle measuring 10 by 10 by 9 feet. Sound and three-dimensional images derived from background data are projected onto three walls and the floor. Wearing special glasses, visitors get a sensation of stepping inside the simulation.

CAVE’s technology has been used for many simulations, perhaps the most famous of which is Cosmic Voyage, an IMAX film that made its debut in 1996 at the Smithsonian National Air and Space Museum in Washington, D.C. The museum cosponsored the film project with NSF and Motorola. Cosmic Voyage includes a four-minute segment of research-quality scientific visualization. The segment tells a story that begins shortly after the Big Bang, continues through the expansion of the universe and the formation of galaxies, and ends with the collision of two spiral galaxies.

For more than fifteen years, University of Pittsburgh researcher John Rosenberg and his colleagues have studied how protein-DNA recognition works in the case of a particular protein, Eco RI endonuclease. This detailed model shows that the DNA-Eco RI interaction creates a kink in the DNA’s structure.


The segment is the result of the collaborative efforts of NCSA scientific visualization experts, NSF-supported astronomers, two movie production companies, and numerous high-performance computing machines at multiple centers.

Donna Cox, professor of art and design at the University of Illinois, Urbana-Champaign, choreographed the various parts of the simulation segment. For the camera moves, she worked with staff at the Electronic Visualization Laboratory to create a voice-driven CAVE application called the Virtual Director, a virtual reality method for directing the computer graphics camera for real-time playback or animation recording. Approximately one-half of the sequence—the collision and the merging of two spiral galaxies—is based on a simulation carried out by Chris Mihos and Lars Hernquist of the University of California, Santa Cruz, on the San Diego Supercomputer Center’s CRAY C90 system. As the galaxies merge and then draw apart, tidal forces and galactic rotation cause the galaxies to cast off stars and gas in the form of long, thin “tidal tails.” The compression of interstellar gas into the merged galaxies fuels an intense burst of star formation. Mihos and Hernquist found that increasing the resolution of their simulation led to new science, “particularly,” says Mihos, “the large number of small, condensing gas clouds in the colliding galaxies that could be related to the formation of young, luminous star clusters or small dwarf galaxies, which are seen in many observed galaxy collisions.”

In February 2000, Passport to the Universe debuted at New York’s Hayden Planetarium to critical praise. The digital film, made using Virtual Director software and other high-end computing and visualization resources from both the Alliance and NPACI, combines images of actual astronomical objects with simulations made by cosmology researchers to provide audiences with an unparalleled depiction of intergalactic travel.

Real Support, Real Time, Real Value

While galaxies are merging and drawing apart—at least virtually—there is realism in the value of NSF’s support of the basic scientific explorations that have fueled developments in computer visualization over the past fifty years. As one voice in the complex community of this field, Ivan Sutherland reflects on the value of this support. “I have now reached an age where the Association for Computing Machinery (ACM), the Institute of Electrical and Electronics Engineers, Inc. (IEEE), and the Smithsonian Institution have seen fit to honor me in various ways,” he says. “Such retrospective honors are not nearly so important as the prospective honor NSF did me with an NSF fellowship. The prospective honor made a giant difference in my ability to contribute to society.”

To Learn More

NSF Directorate for Computer and Information Science and Engineering
www.cise.nsf.gov

NSF Science and Technology Center for Graphics and Visualization
www.cs.brown.edu/stc

Caltech Computer Graphics Research
www.gg.caltech.edu

University of North Carolina Graphics and Image Cluster
www.cs.unc.edu/research/graphics/

Cornell Theory Center
www.tc.cornell.edu

Cornell University Vision Group
www.cs.cornell.edu/vision

National Computational Science Alliance
http://access.ncsa.uiuc.edu/index.alliance.html

National Partnership for Advanced Computational Infrastructure
www.npaci.edu

SDSC Tele-manufacturing Facility
San Diego Supercomputer Center
www.sdsc.edu/tmf/

Pittsburgh Supercomputing Center
www.psc.edu

National Center for Atmospheric Research
www.ncar.ucar.edu/

Electronic Visualization Lab at University of Illinois at Chicago
www.evl.uic.edu/


Environment
taking the long view


NSF is supporting research to learn how the diverse parts of our environment—from individual species to ecosystems to global weather patterns—interact to form the world around us. A better understanding of the give-and-take between organisms and the environment is critical to the search for knowledge as well as for a healthy planet.


Although humans have been fascinated by the relationship between organisms and their environment since the days of Aristotle, ecology as a separate scientific discipline is only about a century old. Today the field is closely aligned in many minds with concerns about pollution and species extinction. The National Science Foundation began to make a serious investment in ecological research in the 1960s and in 1980 launched its pioneering Long-Term Ecological Research (LTER) program. Usually, researchers receive grants to conduct three-year studies that ask a relatively narrow range of questions. But with the LTER program, NSF has recognized that real understanding of the complex interplay among plants, animals, and the environment requires a longer and broader view. Currently more than 1,000 researchers are working at twenty-four ecologically distinct LTER sites, where studies often last for decades. The questions these NSF-funded ecologists are posing, and the answers they’re getting, are emblematic of a maturing and vital discipline.


The Big Picture

A temperate coniferous forest teeming with hemlocks, red cedar, and firs. An Arctic tundra dotted with icy lakes and headwater streams. An East Coast city interlaced with deciduous trees, houses, and parks. A tallgrass prairie. A tropical rainforest. A coastal estuary. A fiery desert.

For every ecological domain on Earth, there seems to be an LTER site devoted to unmasking its secrets. Each location hosts an average of eighteen different principal investigators—often affiliated with nearby universities—who head up various studies that last anywhere from the few years it may take a graduate student to complete her thesis to the decades needed to understand the ongoing effects of, say, fire on the prairie. The sites themselves are much larger than the average experimental plot, ranging in size from the 3,000 acres under continuous study at the Harvard Forest LTER in Petersham, Massachusetts, to the 5 million acres that make up the Central Arizona/Phoenix site.

The rationale behind the LTER program is based on conclusions that environmental scientists reached by the end of the 1970s. One conclusion is that changes in many of the most important ecological processes, such as nutrient levels in the soil, occur slowly. Relatively rare events such as flash floods have a major impact on an ecosystem, but they can only be properly studied if researchers have, in effect, anticipated the occurrences with ongoing studies. Another conclusion is that many ecological processes vary greatly from year to year; only a long-term view can discern inherent patterns. Finally, the kind of long-term, multidisciplinary databases established by LTER researchers are critical for providing a context in which shorter-term studies can be understood.

Although each site boasts its own array of studies designed for that particular ecological system, all studies undertaken at an LTER site must address one or more of what ecologist Steward Pickett, project director for the Baltimore LTER, calls “the holy commandments of LTER.” These commandments come in the form of five questions that are fundamental to how any ecosystem functions: What controls the growth of plants? What controls the populations of plants and animals? What happens to the organic matter that plants produce? What controls the flow of nutrients and water in the system? How do disturbances affect the system?

Without periodic fire, the tallgrass prairies of central North America would disappear into a woodland/shrub habitat. At the NSF-funded Konza Prairie LTER site in Kansas, researchers seek to understand the interplay of prairie and fire by subjecting sixty experimental plots to short- and long-term intervals of burning.


While these five themes provide focus to individual LTER studies, they also allow researchers from very different locales to do an “apples-to-apples” comparison of their data so that even larger lessons can be learned. Clues to how an ecosystem functions are more readily apparent when scientists can compare how the same process works across ecologically diverse sites. For example, the LTER program allows researchers to observe how nutrients travel through two different types of grasslands and how grasslands differ from forests in terms of nutrient flow. To help make these kinds of comparisons, representatives from each LTER site meet formally twice a year and also communicate regularly via email and the LTER program’s Web site.

Key to the success of the LTER approach, of course, are long-term funding and large-scale areas. With the proper time and space, “you can do riskier experiments,” says NSF’s LTER program director Scott Collins, “or you can do experiments that take a long time to have an effect, or big experiments that require a lot of space, or ones that need a certain kind of team.”

Long-term studies also provide an increasingly important baseline of how the environment works—a baseline against which crucial management decisions can be measured. “As the sites are studied longer,” Collins says, “their value increases [because] the findings can be applied to policy and conservation issues.”

What follows is a brief tour through just a few of the LTER sites that are fulfilling the promise of long-term, large-scale environmental research. Studies at these sites have unraveled human health problems, helped to clean up the air, changed how forests are managed, exposed the effects of global change, and revealed how cities interact with their surrounding environment.

An Ecological Solution to a Medical Mystery

When young, otherwise healthy people in the remote Four Corners area of Arizona and New Mexico began dying of a mysterious acute respiratory disease in the spring of 1993, people were scared. Those who caught the disease got very sick, very quickly. Eventually twenty people died. At the time, some wondered if the disease was a biological warfare agent, a military experiment gone bad.

The Atlanta-based U.S. Centers for Disease Control and Prevention (CDC) sent scientists to the region to investigate. Tests of the victims’ blood yielded a surprising result: the people had become infected with a previously undetected kind of hantavirus. The hantavirus causes Hantavirus Pulmonary Syndrome, a serious respiratory illness that can be fatal.

Named after the Hantaan River in Korea, hantaviruses were known to spread from rodents to humans but until the Four Corners outbreak, the microbes had only been seen in Asia and Europe. Moving quickly, CDC investigators asked biologists at the University of New Mexico for help in collecting rodents and insects around the homes of people who had gotten sick. A likely suspect soon appeared when the infection popped up in one particular kind of mouse.

“The CDC called us and asked, ‘What mouse is this?’,” says University of New Mexico mammalogist and museum curator Terry Yates, who also serves as co-principal investigator at the NSF-funded Sevilleta LTER site—so-called because the site’s 230,000 acres are located within the Sevilleta National Wildlife Refuge, about an hour south of Albuquerque.


Yates told the CDC that the infected animal was a deer mouse, a close relative of the type of Old World mice that also carry hantaviruses and that transmit the disease through their droppings and urine.

Now the CDC knew what the disease was and how it was transmitted. But the investigators still didn’t know why a disease carried by a common animal like the deer mouse seemed to be cropping up for the first time in North America. For answers, the CDC turned to what Sevilleta researcher Robert Parmenter calls “a bunch of rat trappers” who had been working on matters entirely unrelated to medical science at Sevilleta even before the site was admitted to NSF’s LTER network in 1988.

The major research question at the Sevilleta LTER site was this: How do the Sevilleta’s four major ecosystems (grassland, woodland, desert, and shrub steppe) respond to short-term and long-term fluctuations in climate? One way to address that question was to measure the population fluctuations of plants and animals. Climate changes affect vegetation, which in turn affects the amount and kind of food available to animals. Keeping track of the rodent populations was just one part of a multi-investigator project—but it turned out to be a crucial part of the CDC investigation.

Parmenter, who directs the Sevilleta Field Research Station, recalls being told by the CDC that “I could take all the time I wanted so long as [the rodent report] was ready by next Tuesday.” He and his team of students and fellow professors “were gung-ho excited—working up the data, doing the analyses just as fast as we could.”

Their conclusion? The hantavirus outbreak could be blamed on El Niño, a periodic pattern of change in the global circulation of oceans and atmosphere. Parmenter’s team saw in their long-term data that massive rains associated with the 1991–92 El Niño had substantially boosted plant productivity in the Sevilleta after several years of drought. A banner year for plants was followed by a banner year for rodents. Rodent populations during the fall of 1992 and spring of 1993 surged as much as twenty times higher in some places as compared to previous years. The same phenomenon likely occurred in the nearby Four Corners region. More mice meant that more humans stood a greater chance of exposure to infected rodents as the people moved among their barns and outhouses and did their spring cleaning of cabins and trailers.

Data from the Sevilleta also helped to determine that the deadly hantavirus wasn’t new to New Mexico. Yates and his colleagues tested tissue samples collected from rodents prior to 1993 and detected evidence of hantavirus.

As part of the NSF-supported LTER project in metropolitan Phoenix, Arizona State University graduate student Jennifer Edmonds collects water samples at the Salt River, east of Phoenix. The samples will be tested for nutrients and major ions as part of a project that helps researchers to better understand the relationship between urbanization and ecological conditions.


In other words, the virus had been in rodents all along—it was the change in climatic conditions that triggered the fatal outbreak in humans. Such knowledge may have helped save lives in 1998, when a particularly active El Niño event prompted health authorities to warn residents of the American Southwest to be careful when entering areas favored by mice. The events of 1993 continue to be felt directly at the Sevilleta LTER, which now counts among its studies one that aims to identify the ways in which hantavirus is spread from rodent to rodent.

Yates says, “This is a classic example of basic research done for totally different reasons coming to the rescue when a new problem arises.”

Contributing to a Cleaner World

LTER researchers are both medical and environmental detectives. Using many of the same skills that helped determine the cause of the hantavirus, these scientists are conducting studies that determine how pollution affects ecosystems. The results of these investigations are helping to create a healthier environment.

A case in point is the Hubbard Brook Experimental Forest, home to the longest continually operating ecosystem study in the United States. In 1955, scientists began research on the 8,000-acre site in New Hampshire’s White Mountain National Forest to figure out what makes a forest tick. NSF began funding research at the site in the 1960s; Hubbard Brook joined the LTER network in 1987.

The main research aim at Hubbard Brook is suitably large scale: By measuring all the chemical energy and nutrients that enter and leave this experimental site, researchers hope to learn what makes a forest, a forest.

“The approach we use is called the small watershed approach,” says Charles Driscoll, an environmental engineer at Syracuse University in Syracuse, New York, and a principal investigator for the Hubbard Brook LTER. A watershed is the whole area drained by a particular stream and its tributaries. The watersheds at Hubbard Brook span mountain valleys from ridgeline to ridgeline, encompassing the hillsides and the tributaries that drain into the streams on the valley floor. Researchers learn about the effects of both human and natural disturbances by measuring and comparing the transport of materials, such as water and nutrients, in and out of different watersheds.

The small watershed approach at Hubbard Brook has proven crucial to understanding the effects of acid rain. The term “acid rain” describes precipitation of any kind that contains acids, largely sulfuric and nitric acids. Natural processes release sulfur and nitrogen compounds into the air, where they react with water vapor to form acids. By burning gasoline, coal, and oil, humans are responsible for releasing even greater amounts of sulfur and nitrogen compounds, creating snow and rain that can carry life-stunting levels of acids into waterways and forests. By the 1970s, numerous lakes and streams in the heavily industrialized Northern Hemisphere became inhospitable to fish and other organisms. The link to forest degradation has been harder to prove, but in Europe people have coined a new word—Waldsterben—to describe the kind of “forest death” thought to be caused by too much acid rain.


The Birth of Long-Term Ecological Research

Today most of us take it for granted that the Earth’s diverse systems, from forests, grasslands, and deserts to the oceans and the atmosphere, are interconnected. But in the early 1960s, thinking about the world as a set of interacting systems was a “totally revolutionary concept,” says Joann Roskoski of NSF’s Division of Environmental Biology. At the time, researchers took what the late influential ecologist Tom Callahan called a “critter-by-critter” approach, focusing on single species.

“That’s fine as far as it goes,” Callahan said, “but it doesn’t say much about the bigger picture.”

And the bigger picture is what the 1960s environmental movement was all about. During this decade, NSF helped move ecology to science’s center stage by serving as the primary U.S. representative in the International Biological Program (IBP). The IBP, which was approved by the International Union of Biological Sciences and the International Council of Scientific Unions, was a controversial effort to coordinate a series of ecological projects conducted by scientific communities around the world. The program’s critics charged that the IBP focus was too vague and unwieldy. Amid the controversy, NSF decided that the major aspect of the U.S. program would be large-scale projects featuring new, multidisciplinary research—specifically, systems ecology, the analysis of ecosystems by means of computer modeling, a strikingly new approach at the time. A total of five different “biomes” were studied between 1968 and 1974: western coniferous forests, eastern deciduous forests, grasslands, tundra, and desert.

The IBP helped to consolidate ecosystem ecology; resulted in a permanent increase in funding for the field; stimulated the use of computer modeling in ecology; produced smaller-scale models of ecological systems; and trained a generation of researchers. “If you now look at a lot of the leadership in American ecology today, these folks cut their teeth on IBP,” says the University of Tennessee’s Frank Harris, who was NSF program director for ecosystem studies in 1980.

Still, researchers and policymakers came to realize that huge projects such as the IBP ultimately had only limited applicability to the practical problems of environmental management. Attention began to turn to smaller-scale integrated projects such as the Hubbard Brook Ecosystem Study, which NSF had been funding since 1963, even before IBP. Results from Hubbard Brook, such as being able to predict how forests recover from clear-cutting and the discovery of acid rain in North America, demonstrated the power of taking an ecosystem approach to understanding the environment, but over a longer time scale than was typical of IBP projects.

Six years after the IBP ended, NSF launched its Long-Term Ecological Research (LTER) program, today’s new standard for excellence in environmental science. So successful have LTER researchers been that in 1993 an international LTER program was launched after a meeting hosted by the U.S. LTER network. The international LTER effort now includes seventeen countries (with thirteen more in the wings), all of whom support scientific programs or networks that focus on ecological research over long temporal and large spatial scales.


Acid rain in North America was first documented in 1972 by Gene E. Likens, F. Herbert Bormann, and Noye M. Johnson at Hubbard Brook. Because Hubbard Brook researchers using the small watershed approach had long been monitoring the quality, not just quantity, of precipitation, they could tell that rainwater wasn’t quite what it used to be and that the acid problem was getting worse. Their work was important in the establishment of the National Acid Precipitation Assessment Program and the passage of the landmark Clean Air Act Amendments in 1990, which mandated reductions in sulfur dioxide emissions from power plants.

Although precipitation over the United States is not quite as acidic as it was in 1972, forests are still showing worrisome signs of decline. A 1996 Hubbard Brook study determined at least one reason why: Acid rain ravages the soil’s ability to support plant life.

“A lot of people thought that acid rain changes surface waters, but not the soil,” says Likens, director of the Institute of Ecosystem Studies in Millbrook, New York, and lead author of the 1996 Hubbard Brook study. “This was one of the first studies to clearly demonstrate the substantial effects of acid rain on soil.”

As it turned out, numerous minerals essential to life, including calcium and magnesium, dissolve more readily in highly acidic water. Thirty years of Hubbard Brook data on the chemical composition of soil, rain, and stream water showed that acid rain was and is seriously leaching calcium and magnesium from the forest soil—as rain falls, it reacts with soil minerals and washes them into the streams.

Can anything be done to bolster the soil’s resistance to acid rain? In 1999, Hubbard Brook researchers set out to address this question by sending up helicopters to drop a load of calcium pellets on a 30-acre watershed that, like the rest of the forest, has been depleted of calcium over the years.

“We’re going to look at the trees, the herbaceous plants, how salamanders respond, how microbes respond, and how aquatic organisms respond,” Driscoll says. In a few years, the researchers may be able to report whether calcium enrichment shows any signs of helping to restore damaged soil. Such a finding would be welcome news to New Englanders in the tourism and maple sugar industries, where concern is high about whether calcium levels in the soil have something to do with the notable decline in the region’s sugar maple trees. A full understanding of calcium’s role in the environment will take longer. That’s why Driscoll says the new study—like most Hubbard Brook studies—will continue “not just for a few months, but for fifty years.”

Says Driscoll, “Once we start, we don’t quit.”

Timothy Katz, site manager for the NSF-funded North Temperate Lakes LTER in Wisconsin, samples open-water fishes with a vertical gill net. Among the wealth of long-term data gathered from the lakes is evidence of time lags in how “invaders” affect lake communities. For example, in Sparkling Lake a kind of trout called cisco went extinct sixteen years after smelt found their way in.


Solving the Biocomplexity Puzzle

Studying only one piece of the environment—even one as big as an LTER site—provides only partial understanding of how the world works. Such is the nature of what NSF Director Rita Colwell calls “biocomplexity.” Eventually, all the pieces will need to conjoin in order to solve the puzzle.

One would-be puzzle master is the NSF-funded National Center for Ecological Analysis and Synthesis (NCEAS) at the University of California at Santa Barbara. NSF helped create NCEAS to organize and analyze ecological information from all over the globe, including sites within NSF’s Long-Term Ecological Research (LTER) program. The center does not collect new data itself; instead, NCEAS’ job is to integrate existing information so that the information is more useful for researchers, resource managers, and policymakers who are tackling environmental issues.

“Natural systems are complex, and humans are altering these systems at an unprecedented rate,” says NCEAS Deputy Director Sandy Andelman. “We need to do a better job of harnessing the scientific information that’s relevant to those systems and putting it in a useable form.”

But gathering and integrating such information is a daunting task. There is no central repository in which ecological scientists can store their data. Most studies are conducted by individual researchers or small teams working on specific small, short-term projects. Since each project is slightly different, each data set is slightly different.

“Ecological data come in all kinds of shapes and forms,” Andelman says. She adds that, in ecology, “There is not a strong culture of multi-investigator, integrated planning of research . . . . Ecology and other related disciplines have amassed vast stores of relevant information, but because this information is in so many different forms and formats and many different places, it is not accessible or useful.”

Hence the need for something like NCEAS, which is collaborating with the San Diego Supercomputer Center and the LTER program to come up with the necessary advanced computing tools. NCEAS is also developing a set of desktop computer tools that will allow researchers to enter and catalog their data into the network using standardized data dictionaries. Eventually, researchers thousands of miles apart will be able to look at each other’s data with just a few clicks of the mouse.
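The spirit of those cataloging tools can be pictured in a few lines of code. The sketch below is purely illustrative: the class names, fields, and sample values are hypothetical stand-ins, not NCEAS’s actual data dictionary or software.

    # Hypothetical sketch: describing an ecological data set with a
    # standardized metadata record so that other researchers' software
    # can interpret it without guesswork.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Variable:
        name: str      # column name as it appears in the data file
        meaning: str   # standardized definition drawn from a shared data dictionary
        units: str     # units of measurement, e.g. "mg/L"

    @dataclass
    class DatasetRecord:
        title: str
        site: str
        years: List[int]
        variables: List[Variable] = field(default_factory=list)

    record = DatasetRecord(
        title="Lake calcium concentrations",
        site="North Temperate Lakes LTER",
        years=list(range(1981, 1999)),
        variables=[Variable("ca", "dissolved calcium", "mg/L")],
    )
    print(record.title, "-", len(record.variables), "variable(s) cataloged")

Once every data set carries a record like this, software can search, merge, and compare studies that were never designed to fit together.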

“If people knew that their data could contribute to a larger question, most would happily make a little extra effort to put their data into a more useful format,” Andelman says. “But there hasn’t been that framework in place.” And now, thanks to NSF, there will be.


Page 121: America's Investment in the Future

Counting the Blessings of Biodiversity

In addition to pollution, species extinction ranks high as a concern among those interested in how ecosystems function. According to the fossil record, several thousand plants and animals have disappeared over the last ten million years; during the time dinosaurs were alive, one species disappeared about every one to ten thousand years. But as the human population has grown, so has the rate of extinction—researchers now conservatively estimate that species are dying out at the dramatic rate of one a day.
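A rough calculation, restating only the figures quoted above, shows how far the modern estimate departs from that fossil-record baseline:

\[
\frac{365\ \text{extinctions per year (one per day)}}{10^{-4}\ \text{to}\ 10^{-3}\ \text{extinctions per year (one per 1,000 to 10,000 years)}} \approx 4\times10^{5}\ \text{to}\ 4\times10^{6}.
\]

In other words, the estimated present-day rate is several hundred thousand to a few million times the dinosaur-era background rate.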

The assumption, of course, is that this can’t be good. More than a century ago, Charles Darwin first suggested that more species would make an ecosystem more productive. But researchers have struggled to test the notion rigorously, not just in the lab but in the field. It wasn’t until 1996 that anyone had real evidence that biodiversity—sheer numbers of different species—is critical to the planet’s well-being.

In an experiment that other ecologists have described as “brilliant” and “a first,” University of Minnesota ecologist David Tilman and other researchers at the Cedar Creek Natural History Area—an NSF-funded LTER site since 1982—demonstrated that plant communities with the greatest biodiversity yielded the greatest total plant growth year to year. These plant communities also were much more likely to hang on to essential nutrients that might otherwise have been leached from the soil.

Tilman’s team approached the problem by constructing 147 miniature prairies within a section of the 5,500-acre experimental reserve at Cedar Creek, and planting each one with anywhere from one to twenty-four species. The burning, plowing, and planting were done by the spring of 1994. Then the researchers sat back to see which plots would end up doing best.

Actually, no one sat much. The researchers, aided by an army of undergraduates, have toiled ever since to meticulously weed the 100-square-foot plots of anything that didn’t belong to what each plot was designed to contain, be it brown-eyed susans, bunch clover, or yarrow. A critical aspect of the study was that researchers randomly selected which species went into which plots. This kept the focus on the number rather than the type of species.
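That randomization step can be pictured with a toy sketch; the species pool, plot count, and random seed below are invented for illustration and are not the researchers’ actual protocol.

    # Toy illustration of randomly assigning species to experimental plots.
    import random

    # Hypothetical 24-species pool (three named in the text, the rest placeholders).
    species_pool = ["brown-eyed susan", "bunch clover", "yarrow"] + [
        "species_%d" % i for i in range(4, 25)
    ]

    random.seed(1)  # fixed seed so the example is reproducible
    plots = []
    for plot_id in range(147):
        richness = random.randint(1, 24)           # each plot gets 1 to 24 species
        plots.append(random.sample(species_pool, richness))

    print(len(plots), "plots created; plot 0 holds", len(plots[0]), "species")

Because the species in each plot are drawn at random, any systematic difference between plots can be traced to how many species they hold rather than which ones.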

Why do more species make for a merrier ecosystem? Tilman has found that a diverse plant community uses the available energy resources more efficiently.

“Each species differs from others in a variety of traits,” says Tilman. “Some have high water requirements and grow well during the cool part of the year. Others grow well when it’s really warm and dry. Each one in the system does what it’s good at, if you will, but there’s always something left to be done.” That is, conditions that are less than hospitable to some species will be readily exploited by others, leading to more lush growth overall. These processes, says Tilman, also explain why so many species can coexist in nature.

“It wasn’t until we knew how rapidly species were going extinct that this issue really came to the forefront,” says Tilman. Still, more work needs to be done before biodiversity’s role in a healthy ecosystem can be unequivocally celebrated. That’s why Tilman and other Cedar Creek researchers have added a second experiment to the mix, this time using more than three hundred bigger plots, each about the size of an average suburban backyard. The extra area should allow for a better understanding of how, for example, plots with different numbers of species handle insects and disease.


Page 122: America's Investment in the Future


“Nobody’s ever done what they’ve done,” says Samuel McNaughton, an ecosystem ecologist at Syracuse University in New York. “It’s an enormous amount of work. Tilman would not have been able to do this without NSF funding through the LTER program.”

Keeping Up with Global Change

From a focus on plant communities to a broader look at global climate change, LTER research is revealing how the components of our environment interact.

Louis Pasteur once said that chance favors the prepared mind. So, too, are LTER scientists uniquely prepared to learn from seemingly chance fluctuations in global climate—what LTER program head Scott Collins calls “the surprise years.”

A good illustration of this can be found among the scores of lakes that make up the NSF-funded North Temperate Lakes (NTL) LTER site in Wisconsin. A member of the network since the LTER program’s start in 1980, the NTL site is managed by researchers at the University of Wisconsin at Madison. The NTL LTER includes two field stations: one in the Yahara Lake District of southern Wisconsin and the other—called the Trout Lake Station—in the state’s northern highlands. While the area boasts hundreds of lakes that are amenable to study, the sites’ principal investigators have chosen seven to consistently monitor over the long haul.

If researchers investigate only one lake, they don’t know whether their findings are unique to that lake, says University of Wisconsin limnologist Timothy Kratz, a principal investigator for the NTL LTER. Studying many lakes exposes patterns and commonalities that are visible only when researchers investigate environmental conditions over a broad region. The seven lakes of the NTL LTER were chosen because of their representative variety in size and location.

The number of lakes, their different sizes (ranging from quarter-acre bogs to 3,500-acre behemoths), and their distribution from lower to higher elevations allowed Kathy Webster, then a doctoral student, and other NTL researchers in the late 1980s to conduct one of the first and most informative field studies of how lakes respond to drought.

Year in, year out since 1981, NTL researchers have measured the lakes’ chemical composition, tracking fluctuations in calcium, magnesium, alkalinity, and other factors. These persistent measurements paid off in the late 1980s, when the upper Midwest was hit by a major drought. “We were able to look at our lakes pre-drought, during the drought, and after the drought,” says Kratz. The results were surprising: Although all of the lakes lost water, only those lakes positioned higher in the landscape lost significant amounts of calcium, an essential nutrient for all organisms. The effect was all the more striking because the elevation difference between the highest and lowest study lakes was only about 33 feet.

An aerial view of a biodiversity experiment at the NSF-supported Cedar Creek Natural History Area in Minnesota. Researchers here have shown that plant communities with the largest variety of species exhibit the greatest total plant growth, one sign of a robust environment.

Page 123: America's Investment in the Future


What could explain the different level of calcium loss? Groundwater, suggests Kratz. All of the lakes in the study are fed by groundwater seeping through the rocky soil. This groundwater carries with it an abundance of critical minerals, including calcium. But the drought caused the groundwater table to fall below the higher lakes, essentially shutting off their mineral supply.

In a prolonged drought, says Kratz, lakes at higher elevations might become calcium deficient, causing a cascade of biotic effects. Animals such as snails and crayfish would be in trouble, since they require calcium to make their shells. In turn, fish that eat snails would find it harder to get enough food. The higher lakes might also become more susceptible than their low-lying counterparts to the effects of acid rain, since the calcium and other minerals from groundwater can counteract the deleterious effects of acid precipitation.

If changes in the world’s overall climate result in droughts that become more frequent—as some researchers predict with the advent of global warming—the chemistry of these two types of lakes will start to diverge. Data of the kind gathered at the North Temperate Lakes LTER should help both scientists and policymakers predict and cope with the environmental consequences of global climate change.

“We didn’t know the particular event of interest would be a drought,” Kratz says. “But we had in place a system of measurements that would allow us to analyze the situation—whatever the event was.”

Cityscapes Are Landscapes, Too

Not all LTER sites are located in remote, rural areas. In 1997, NSF added two sites to the network specifically to examine human-dominated ecosystems—in other words, cities. One site is centered in Baltimore, Maryland, the other in Phoenix, Arizona.

The Central Arizona/Phoenix (CAP) site fans out to encompass nearly five million acres of Maricopa County. While much of the site’s study area is urbanized, some portions are still agricultural fields or desert, and there are also a few nature reserves. CAP researchers are in the early stages of laying the groundwork for long-term studies at the site. For one thing, they’re busy identifying two hundred sampling sites that will encompass the city, the urban fringe, and enough spots on the very outer edge to ensure that some portion of the site will remain desert for the next thirty years.

“One of our exciting challenges will be to take those very standard common ecological measures that people use in the forest and desert and everywhere else, and say, well, is there an equivalent way to look at how the city operates?” says Charles Redman, Arizona State University archeologist and co-director of the CAP-LTER. To tackle that challenge, Redman and co-director Nancy Grimm work with a research team that includes ecologists, geographers, remote sensing specialists, sociologists, hydrologists, and urban planners.

As a framework for their foray into the ecology of a city, the researchers are adopting a popular and relatively new ecological perspective that recognizes that rather than being uniform, an ecosystem is patchy, rather like a quilt.

Page 124: America's Investment in the Future

Wanted: A Complete Catalog of Creatures and Plants

Chytrids are not something people generally worry about. Yet this little-known group of fungi made news in 1998 when it was linked with a rash of frog deaths in Australia and Panama.

It had taken frog researchers several years to locate a chytrid specialist capable of identifying the deadly fungus, and even then the experts were surprised. “We didn’t know that any [chytrids] were parasites of vertebrates,” says Martha Powell, a chytrid specialist at the University of Alabama.

Chytrids aren’t alone in being poorly classified. Only about 1.5 million species have been identified so far out of the 13 million or so thought currently to exist (some estimates of the overall number are closer to 30 million). The gargantuan challenge of collecting and describing examples of all these unknown species falls to a steadily shrinking pool of scientists known as systematic biologists. With the advent of high-tech molecular techniques for studying evolutionary relationships, taxonomy—the science of species classification—has come to seem faintly antiquated, even though biological research collections “remain the ultimate source of knowledge about the identity, relationships, and properties of the species with which we share the Earth,” according to Stephen Blackmore, chair of the Systematics Forum in the United Kingdom, who wrote about the problem in 1997 for the journal Science.

But even as “the inescapable need to know more about the diversity of life on Earth remains largely unmet,” wrote Blackmore, “declining funds are limiting the ability of institutes around the world to respond . . . .” As of 1996, there were only about 7,000 systematists in the world, a workforce that Blackmore and others deem “clearly inadequate.”

Says James Rodman, NSF program director for systematics, “There are very few people studying the obscure groups” of species, and many of those experts are beginning to retire.

One way the National Science Foundation is trying to address the problem is through its Partnerships for Enhancing Expertise in Taxonomy (PEET) program. PEET funds systematic biologists working to identify understudied groups like the chytrids. In fact, Powell and her colleagues are now working under a PEET grant to train at least three new Ph.D.s in the systematic biology of chytrids. Besides training the next generation of systematists, PEET projects are also making what is known about these species more widely available through the development of Web-accessible databases that contain information such as identification keys, photographs, distribution maps, and DNA sequences.

“Systematists,” wrote Blackmore, “hold the key to providing knowledge about biodiversity.” Knowing more about how the world functions requires learning more about each of the world’s parts, however small.


Page 125: America's Investment in the Future

For example, patches in a grassland might be recognized as areas that burned last year, areas that burned five years ago, and burned areas where bison are now grazing. Smaller patches exist within the larger patches: The bison might graze more heavily in some sections of the burned area than others, for example. There are patches of wildflowers, patches where bison have wallowed, and patches where manure piles have enriched the soil. Each time ecologists look closely at one type of patch, they can identify a mosaic of smaller patches that make up that larger patch. And if they can figure out what the patches are, how the patches change over time, and how the different types of patches affect one another, they might be able to figure out how the ecosystem functions as a whole.

Anyone who has flown over an urban area and looked at the gridlike mosaic below can imagine how easily cityscapes lend themselves to the hierarchical patch dynamics model. Still, it’s a new approach where cities are concerned, says Jianguo Wu, a landscape ecologist at Arizona State University. And the patches within cities are new to ecologists.

“You can see very large patches—the built-up areas, the agricultural areas, the native desert areas,” he says of the Phoenix site. “But if you zoom in, you see smaller patches. Walk into downtown Phoenix. There are trees, parking lots, concrete. They form a hierarchy of patches with different content, sizes, shapes, and other characteristics.”

CAP researchers have gathered information about how land use in the Phoenix region changed from the early 1900s until today. The team has found that as the area became more urban, the patches became smaller and more regularly shaped. In the new millennium, the scientists want to see how this kind of more orderly patchiness affects ecological processes. For example, researchers would like to know how insects and other small animals move across the landscape, and how storm runoff carries away nutrients across the various patches, whether concrete or soil.

Grimm thinks that the patch dynamics model will help researchers integrate all the information they collect about the rapidly changing Phoenix metropolitan area. The model emphasizes linkages between different levels and types of patches such that researchers can design studies to ask: How might the actions of an individual eventually affect the ecology of a whole community-sized patch? If someone sells an undisturbed piece of desert property to a developer, for example, the ecosystem will change. What kind of development is built—whether there is one house per acre or a series of closely packed townhomes—will differently affect the ecological processes in the adjacent patches of remaining desert.

“Once the land use changes, the ecology changes,” says Wu, adding, “What is really important is the dynamics—the impact of this patchiness on the ecological, physical, hydrological, and socioeconomic processes of the city.”


Page 126: America's Investment in the Future


Long-Term Research: A Model for NSF’s Future

The LTER program has already demonstrated a remarkable return on NSF’s investment. Thanks to NSF-supported research, we now have a better understanding of the complex interplay among plants, animals, people, and the environment.

In February 2000, the National Science Board (NSB), NSF’s policymaking body, released a report urging that NSF expand the LTER program and make the environment a “central focus” of its research portfolio in the twenty-first century.

“Discoveries over the past decade or more have revealed new linkages between the environment and human health,” says Eamon Kelly, chair of the National Science Board. “But just as we are beginning to better understand these linkages, the rate and scale of modifications to the environment are increasing. These alterations will present formidable challenges in the new century—challenges which we are now only minimally equipped to meet.”

Preeminent ecologist Jane Lubchenco of Oregon State University chaired the NSB Task Force on the Environment, which was responsible for the report. “The LTER program is widely viewed as one of the outstanding successes of NSF,” says Lubchenco, “and is the model for federal agencies as well as other countries for superb place-based ecological sciences. [The program is] very lean, very efficient, very productive.”

The LTER program’s success is one reason the task force recommended, among other things, that NSF boost its spending on environmental research by $1 billion over a five-year period beginning in 2001. That kind of financial commitment would make environmental science and engineering one of the agency’s highest priorities.

And none too soon, according to Lubchenco. “We’re changing things faster than we understand them,” she once said in a news interview. “We’re changing the world in ways that it’s never been changed before, at faster rates and over larger scales, and we don’t know the consequences. It’s a massive experiment, and we don’t know the outcome.”

To Learn More

NSF Division of Environmental Biology, Directorate for Biological Sciences
www.nsf.gov/bio/deb/start.htm

NSF Global Change Programs, Directorate for Geosciences
www.nsf.gov/geo/egch/

NSF Partnerships for Enhancing Expertise in Taxonomy (PEET)
www.nhm.ukans.edu/~peet

U.S. Long-Term Ecological Research Network
www.lternet.edu

International Long-Term Ecological Research Network
www.ilternet.edu

Sevilleta LTER Project
http://sevilleta.unm.edu/

Hubbard Brook Ecosystem Study
www.hbrook.sr.unh.edu/

Cedar Creek Natural History Area
www.lter.umn.edu/

North Temperate Lakes LTER
http://limnosun.limnology.wisc.edu/

Central Arizona-Phoenix LTER
http://caplter.asu.edu/

Environmental Science and Engineering for the 21st Century: The Role of the National Science Foundation, National Science Board
www.nsf.gov/cgi-bin/getpub?nsb0022

Page 127: America's Investment in the Future

Astronomy
exploring the expanding universe

Page 128: America's Investment in the Future

Using powerful instruments developed with NSF’s support, investigators are closing in on fundamental truths about the universe. The work of these scientists creates new knowledge about the Sun, leads to the discovery of planets around distant stars, and uncloaks the majestic subtlety of the universe.

Page 129: America's Investment in the Future

Ever since Galileo perfected the telescope and made the stars seem closer to Earth, scientists have been searching the heavens, asking fundamental questions about the universe and our place in it. Today’s astronomers are finding that they don’t have to go far for some of the answers. With major funding from NSF, some researchers are exploring the interior of the Sun by recording and studying sound waves generated near its surface. Others are discovering planets around distant stars and expressing optimism about finding still more, some of which may resemble Earth. With sophisticated equipment and techniques, we humans are finally “seeing” what lurks at the center of the Milky Way, hidden from direct view. We are making profound progress in uncovering the origins of the universe, estimating when it all began, and looking at its structure, including the more than 90 percent of its mass known today as “dark matter.”

Page 130: America's Investment in the Future


New Visions

“All of a sudden, astronomers have turned a big corner and glimpsed in the dim light of distant lampposts a universe more wondrous than they had previously known,” writes John Noble Wilford in the February 9, 1997, issue of the New York Times. “Other worlds are no longer the stuff of dreams and philosophic musings. They are out there, beckoning, with the potential to change forever humanity’s perspective on its place in the universe.”

Wilford is describing research by NSF-funded astronomers Geoffrey W. Marcy and R. Paul Butler of San Francisco State University, who were among the first to discover planets outside our solar system. Wilford’s words highlight the excitement and wonder of research in astronomy.

With these and other recent discoveries, astronomers and astrophysicists are taking a fresh look at the realities and mysteries of the universe. Indeed, all of humankind is learning how immense and complex is the space we inhabit. Yet as we start to understand some of the phenomena around us, many other mysteries arise.

NSF is not alone in funding studies of the skies; much work was done before NSF was established in 1950, and universities and other government agencies have done much since then to advance our understanding. But NSF funding—covering such things as state-of-the-art telescopes, supercomputer sites, and individual researchers—is one of the main reasons we have identified so many pieces of the puzzle that is our universe.

So how do researchers get a handle on something so big? Where do they start? For some astronomers, the answer is close to home.

Voyage to the Center of the Sun

Despite its relative proximity to Earth, the Sun has kept its distance, reluctant to reveal its secrets. Until recently, its inner workings were a mystery of cosmic proportions.

For many years, researchers have known that deep in the Sun’s interior, 600 million tons of hydrogen fuse into helium every second, radiating out the resulting energy. And while the mechanics of this conversion have been described in theory, the Sun’s interior has remained inaccessible. Now, however, the Sun is being “opened,” its internal structures probed, and its inner dynamics surveyed by NSF-supported scientists using the investigative techniques of a branch of astronomy known as helioseismology.

“The Sun is the Rosetta stone for understanding other stars,” explains John Leibacher, an astronomer at the National Optical Astronomy Observatories in Tucson, Arizona, and director of the NSF-funded Global Oscillation Network Group, or GONG. The Rosetta stone is a tablet with an inscription written in Greek, Egyptian hieroglyphics, and Demotic. The stone’s discovery was the key to deciphering ancient Egyptian hieroglyphics and unlocking the secrets of that civilization.

GONG researchers study the Sun by analyzing the sound waves that travel through it. Much as the waves produced by earthquakes and explosions roll through the Earth, these solar sound waves pass through the Sun’s gaseous mass and set its surface pulsating like a drumhead. With six telescopes set up around the Earth collecting data every minute, GONG scientists are learning about the Sun’s structure, dynamics, and magnetic field by measuring and characterizing these pulsations.

NSF-funded researcher Andrea Ghez has discovered the presence of an enormous black hole at the center of our galaxy. Her work has enormous implications for our understanding of how galaxies evolve.

Page 131: America's Investment in the Future


Analysis of data from GONG and other sources shows that current theories about the structure of the Sun need additional work. For example, the convection zone—the region beneath the Sun’s surface where pockets of hot matter rise quickly and mix violently with ambient material—is much larger than originally thought. Furthermore, says Leibacher, the zone ends abruptly. “There is turbulent mixing and then quiet. We can locate the discontinuity with great precision.” Some research teams are probing deeper and examining the Sun’s core; still others are addressing such topics as sunspots—places of depressed temperature on the surface where the Sun’s magnetic field is particularly intense.

New insight into the Sun’s core came in the spring of 2000, when NSF-funded researchers analyzing GONG data announced that they had discovered a solar “heartbeat.” That is, they had found that some layers of gas circulating below the Sun’s surface speed up and slow down in a predictable pattern—about every sixteen months. This pattern appears to be connected to the cycle of eruptions seen on the Sun’s surface.

Such eruptions can cause significant disturbances in Earth’s own magnetic field, wreaking havoc with telecommunications and satellite systems. A major breakthrough in the ability to forecast these so-called solar storms also came in the spring of 2000, when NSF-funded astrophysicists, using ripples on the Sun’s surface to probe its interior, developed a technique to image explosive regions on the far side of the Sun. Such images should provide early warnings of potentially disruptive solar storms before they rotate toward Earth.

As our nearest star, the Sun has always been at the forefront of astrophysics and astronomy. (Astrophysicists study the physics of cosmic objects, while astronomers have a broader job description—they observe and explore all of the universe beyond Earth.) The more we learn about the Sun, the more we understand about the structure and evolution of stars and, by extension, of galaxies and the universe. The Sun also is host to a family of nine planets and myriad asteroids and cometary bodies. As we investigate the richness of outer space, we often look for things that remind us of home.

New Tools, New Discoveries

Much of astronomy involves the search for the barely visible—a category that describes the overwhelming majority of objects in the universe, at least for the time being. One of today’s most effective tools for detecting what cannot be seen is Arecibo Observatory in Puerto Rico. The site is one of the world’s largest and most powerful telescopes for radar and radio astronomy. Operated by Cornell University under a cooperative agreement with NSF, the Arecibo telescope collects extraterrestrial radio waves of almost imperceptible intensity in a 1,000-foot-wide dish. This telescope, used by scientists from around the world, is a dual-purpose instrument. About three-quarters of the time, the telescope detects, receives, amplifies, and records signals produced by distant astronomical objects. The rest of the time, it measures reflected radio signals that were transmitted by its antenna. The signals bounce off objects such as planets, comets, and asteroids, allowing researchers to determine each object’s size and motion.

It was at Arecibo in 1991 that Alexander Wolszczan of Pennsylvania State University discovered the first three planets found outside our solar system. With support from NSF, Wolszczan discovered these planets by timing the radio signals coming from a distant pulsar—a rapidly rotating neutron star—7,000 trillion miles from Earth in the constellation Virgo. He saw small, regular variations in the pulsar’s radio signal and interpreted them as a complicated wobble in the pulsar’s motion induced by planets orbiting the pulsar.

This computer representation depicts one of nearly ten million modes of sound wave oscillations of the Sun, showing receding regions in red tones and approaching regions in blue. Using the NSF-funded Global Oscillation Network Group (GONG) to measure these oscillations, astronomers are learning about the internal structure and dynamics of our nearest star.

Page 132: America's Investment in the Future


Visualizing the Big Picture

While telescopes, spacecraft, and other means of collecting data are critical, not all researchers turn to the heavens for inspiration. Some turn to their computers to take a closer look at the big picture.

The Grand Challenge Cosmology Consortium (GC3) is a collaboration of cosmologists, astrophysicists, and computer scientists who are modeling the birth and early infancy of the universe. Consortium members use high-performance computers at the NSF-supported Supercomputer Centers—the precursor of the current Partnerships for Advanced Computational Infrastructure program—to create a three-dimensional model of the formation of galaxies and large-scale structures in the early universe. The consortium uses some of the most powerful supercomputers available to perform the billions of calculations required to figure out how the universe came to be.

In an effort to understand the role of dark matter in galaxy cluster formation, Michael Norman and Gregory Bryan carried out a simulation at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign in 1994. The simulation produced a model that accurately predicted the number and arrangement of galaxy clusters. The prediction was confirmed by recent observations by an orbiting X-ray satellite. While the simulation did not capture exactly the measurable ratio of luminous gas to dark matter, efforts are underway to improve the model’s resolution. “Everyone is motivated to find out what dark matter is,” says Norman, “but there is nothing definitive yet.”

Perhaps the most intriguing aspect of the GC3 is its ability to simulate situations never seen by humans. In a public display of computer simulation, members of GC3 teamed up with IMAX to create the 1996 film Cosmic Voyage, a short feature that was nominated for an Academy Award. From the safety of their theater seats, audiences can view the life of the universe, from its explosive, Big Bang birth, to the current hubbub of galaxy life. The film also includes a startling animation of what would happen if two spiral galaxies—like the Milky Way and neighboring Andromeda Galaxy—were to collide.


Page 133: America's Investment in the Future


Two of the planets are similar in mass to the Earth, while the mass of the third is about equal to that of our moon. It is unlikely that any of these newly discovered planets support life, because the tiny pulsar around which they orbit constantly bombards them with deadly electromagnetic radiation. Wolszczan’s work helps astronomers understand how planets are formed, and his discovery of planets around an object as exotic as a pulsar suggests that planets may be far more common than astronomers had previously thought.

In 1995, four years after Wolszczan’s discovery, two Swiss astronomers announced that they had found a fourth new planet, orbiting a star similar to the Sun. Two American astronomers, Geoffrey Marcy and Paul Butler, confirmed the discovery and, the following year, announced that their NSF-supported work culminated in the discovery of another two planets orbiting sun-like stars. Using an array of advanced technologies and sophisticated analytic techniques, Marcy, Butler, and other astronomers have since discovered more extrasolar planets. An especially astonishing find came in 1999, when two independent NSF-supported teams discovered the first multi-planet system—other than our own—orbiting a single star. At least three planets were found by Marcy, Butler, and others to be circling the star Upsilon Andromedae, making it the first solar system ever seen to mimic our own.

By August 2000, the number of extrasolar planets had topped fifty, and more such sightings were expected. Based on the discovery of these planets, it seems as if the Milky Way is rife with stars supporting planetary systems. But what of the Galaxy itself? Is it a calm stellar metropolis, or are there more mysteries to uncover?

At the Center of the Milky Way

In the mid-eighteenth century, philosopher Immanuel Kant suggested that the Sun and its planets are embedded in a thin disk of stars. Gazing at the diffuse band of light we call the Milky Way on a dark night supports Kant’s bold statement. But understanding the nature and appearance of our galaxy is no small feat, for we live within a disk of obscuring gas and dust.

Our Sun is part of a large disk made up of stars and large clouds of molecular and atomic gas in motion around the Galaxy’s center. Our solar system orbits this center, located about 30,000 light-years away from Earth, at 500,000 miles per hour. It takes our solar system two hundred million years to make a single orbit of the galaxy.

Astronomers can infer the shape and appearance of our galaxy from elaborate observations, and, as a result, have created maps of our galaxy. Yet parts of the Milky Way remain hidden—blocked by light-years of obscuring material (gas and dust) spread between the stars.

Andrea Ghez is working to penetrate the mysteries of this interstellar material. Ghez is an astronomer at the University of California at Los Angeles and an NSF Young Investigator, a national award given to outstanding faculty at the beginning of their careers. Her observations of the central regions of the Milky Way have permitted her to examine its very heart. Ghez, like many others, theorized that the Galaxy’s core is the home of a supermassive black hole. “Although the notion has been around for more than two decades, it has been difficult to prove that [a black hole] exists,” says Ghez. Now it appears her observations offer that proof.

Using one of the two W. M. Keck 10-meter telescopes on Mauna Kea in Hawaii, Ghez looked at the innermost regions of the Galaxy’s core. For three years, she studied the motions of ninety stars.

As Wolszczan, Marcy, Butler, and others continue their search for new planets, other astronomers have found evidence of the most powerful magnetic field ever seen in the universe. They found it by observing the “afterglow” of subatomic particles ejected from a magnetar. This neutron star, illustrated above, has a magnetic field billions of times stronger than any on Earth and one hundred times stronger than any other previously known in the universe.

Page 134: America's Investment in the Future


Detecting Planets Around Other Stars

In the vastness of the universe, are we humans alone? The answer depends on whether there are other planets that are endowed with the warm climate, diverse chemicals, and stable oceans that provided the conditions for biological evolution to proceed here on Earth.

We and other astronomers recently took an important step toward addressing some of these questions when we reported finding that planets do exist outside our own solar system . . . . Already the properties of these extrasolar planets have defied expectations, upsetting existing theories about how planets form.

It took a long time to find extrasolar planets because detecting them from Earth is extraordinarily difficult. Unlike stars, which, like our Sun, glow brightly from the nuclear reactions occurring within, planets shine primarily by light that is reflected off them from their host star.

Astronomers gained the means to find planets around other stars with a clever new technique that involves searching for a telltale wobble in the motion of a star. When planets orbit a star, they exert a gravitational force of attraction on it. The force on the star causes it to be pulled around in a small circle or oval in space. The circle or oval is simply a miniature replica of the planet’s orbital path. Two embracing dancers similarly pull each other around in circles due to the attractive forces they exert on each other. This wobble of a star gives away the presence of an orbiting planet, even though the planet cannot be seen directly.

However, this stellar wobble is very difficult to detect from far away. A new technique has proven to be extraordinarily successful. The key is the Doppler effect—the change in the appearance of light waves and other types of waves from an object that is moving away from or toward a viewer. When a star wobbles toward Earth, its light appears from Earth to be shifted more toward the blue, or shorter, wavelength of the visible light spectrum than it would have if the star had not moved toward Earth. When the star wobbles away from Earth, the opposite effect occurs. The wavelengths are stretched. Light from the star appears to be shifted toward the red, or longer wavelength, end of the spectrum in a phenomenon known as red shift. Astronomers can determine the velocity of a star from the Doppler shift because the Doppler shift is proportional to the speed with which the star approaches or recedes from a viewer on Earth.

—Geoffrey W. Marcy and R. Paul Butler, NSF-funded astronomers at San Francisco State University and the University of California at Berkeley, respectively. Excerpted from “Detecting Planets Around Other Stars,” reprinted with permission. Encarta, May 1997
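For readers who want that proportionality in symbols: to first order, for speeds far below that of light, the fractional shift in wavelength equals the star’s velocity along the line of sight divided by the speed of light,

\[
\frac{\Delta \lambda}{\lambda} = \frac{v_r}{c},
\]

so a measured shift of one part in ten million corresponds to a stellar wobble of roughly 30 meters per second, which is why planet hunting demands extraordinarily precise spectroscopy.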

Page 135: America's Investment in the Future


While scientists already knew that those stars nearest the center of the Galaxy move quickly in their orbits, Ghez was astonished to discover that the stars nearest the center of the Milky Way were moving at speeds as high as 3 million miles per hour. Only a very large assembly of superconcentrated mass inside the stars’ orbits could whip them around at those speeds. “The high density we observe at the very center of the Milky Way exceeds that inferred for any other galaxy, and leads us to conclude that our galaxy harbors a black hole with a mass 2.6 million times that of the Sun,” Ghez notes.

Astronomers do not think that a supermassive black hole at the center of a galaxy is unique to the Milky Way. Rather, it appears to be quite typical of the almost innumerable galaxies in the observable universe. The fact that black holes may be the rule rather than the exception makes it even more important that we continue to study them.

The Origins of the Universe

By observing galaxies formed billions of years ago, astronomers have been able to paint an increasingly detailed picture of how the universe evolved. According to the widely accepted Big Bang theory, our universe was born in an explosive moment approximately fifteen billion years ago. All of the universe’s matter and energy—even the fabric of space itself—was compressed into an infinitesimally small volume and then began expanding at an incredible rate. Within minutes, the universe had grown to the size of the solar system and cooled enough so that equal numbers of protons, neutrons, and the simplest atomic nuclei had formed.

After several hundred thousand years of expansion and cooling, neutral atoms—atoms with equal numbers of protons and electrons—were able to form and separate out as distinct entities. Still later, immense gas clouds coalesced to form primitive galaxies and, from them, stars. Our own solar system formed relatively recently—about five billion years ago—when the universe was two-thirds its present size.

In April 2000, an international team of cosmologists, supported in part by NSF, released the first detailed images of the universe in its infancy. The images reveal the structure that existed in the universe when it was a tiny fraction of its current age and one thousand times smaller and hotter than today.

The Gemini 8-meter telescope on Mauna Kea, Hawaii, is one of the new tools astronomers are using to search for the barely visible. This telescope, along with other NSF-funded observatories—including Arecibo in Puerto Rico and the Very Large Array in New Mexico—enables astronomers to discover and explain the origins of the universe.

Page 136: America's Investment in the Future


The project, dubbed BOOMERANG (Balloon Observations of Millimetric Extragalactic Radiation and Geophysics), captured the images using an extremely sensitive telescope suspended from a balloon that circumnavigated the Antarctic in late 1998. The BOOMERANG images were the first to bring into sharp focus the faint glow of microwave radiation, called the cosmic microwave background, that filled the embryonic universe soon after the Big Bang. Analysis of the images already has shed light on the nature of matter and energy, and indicates that space is “flat.”

The roots of the Big Bang theory reach back to 1929, the year Edwin Hubble and his assistant Milton Humason discovered that the universe is expanding. Between 1912 and 1928, astronomer Vesto Slipher used a technique called photographic spectroscopy—the measurement of light spread out into bands by using prisms or diffraction gratings—to examine a number of diffuse, fuzzy patches. Eventually, Hubble used these measurements, referred to as spectra, to show that the patches were actually separate galaxies. Slipher, who did his work at Lowell Observatory in Flagstaff, Arizona, found that in the vast majority of his measurements the spectral lines appeared at longer, or redder, wavelengths. From this he inferred that the galaxies exhibiting such “red shifts” were moving away from Earth, a conclusion he based on the Doppler effect. This effect, discovered by Austrian mathematician and physicist Christian Doppler in 1842, arises from the relative motion between a source and an observer. This relative motion affects wavelengths and frequencies. Shifts in frequency are what make ambulance sirens and train whistles sound higher-pitched as they approach and lower-pitched as they move away.

Hubble took these findings and eventually determined the distances to many of Slipher’s galaxies. What he found was amazing: The galaxies were definitely moving away from Earth, but, the more distant the galaxy, the faster it retreated. Furthermore, Hubble and Humason discovered that the ratio of a galaxy’s speed (as inferred from the amount of red shift) to its distance seemed to be about the same for all of the galaxies they observed. Because velocity appeared proportional to distance, Hubble reasoned, all that remained was to calculate that ratio—the ratio now referred to as the Hubble Constant.
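In modern notation, that proportionality is Hubble’s law,

\[
v = H_0\, d,
\]

where v is a galaxy’s recession velocity, d its distance, and H_0 the Hubble Constant. As a worked example, using a value near the range astronomers were converging on at the time, about 70 kilometers per second per megaparsec, a galaxy 100 megaparsecs away would recede at roughly 7,000 kilometers per second.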

And what is the value of the Hubble Constant? After seventy years of increasingly precise measurements of extragalactic velocities and distances, astronomers are at last closing in on this elusive number.

Wendy Freedman is one of the scientists working to define the Hubble Constant. As head of an international team at the Carnegie Observatories in Pasadena, California, Freedman surveys the heavens using the Hubble Space Telescope to measure distances to other galaxies. With grants from NSF, she is building on the legacy of Henrietta Leavitt, who discovered in the early 1900s that the absolute brightness of a Cepheid variable star is related to the time it takes the star to pulsate (its period). Scientists can measure the period of a Cepheid in a distant galaxy and measure its apparent brightness. Since they know the period, they know what the absolute brightness should be. The distance from Earth to the Cepheid variable star is inferred from the difference between absolute and apparent brightness. Freedman and her colleagues are using this method to determine distances to other galaxies. With these Cepheid distances, Freedman’s group calibrates other distance-determination methods to reach even more far-flung galaxies. This information, in turn, enables them to estimate the Hubble Constant.
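One standard way to express that brightness comparison (stated here in the usual textbook form) is the distance modulus: if m is the Cepheid’s apparent magnitude and M the absolute magnitude implied by its period, then

\[
d = 10^{\,(m - M + 5)/5}\ \text{parsecs}.
\]

For instance, a Cepheid that appears twenty magnitudes fainter than its absolute magnitude lies about \(10^{5}\) parsecs, or roughly 330,000 light-years, away.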

Researchers closing in on a definitive value for the Hubble Constant are doing so in the midst of other exciting developments within astronomy. In 1998, two independent teams of astronomers, both with NSF support, concluded that the expansion of the universe is accelerating.

Radio telescopes from the NSF-funded Very Large Array in New Mexico are helping astronomers to map our universe.

Page 137: America's Investment in the Future


Their unexpected findings electrified the scientific community with the suggestion that some unknown force was driving the universe to expand at an ever increasing rate. Earlier evidence had supported another possibility, that the gravitational attraction among galaxies would eventually slow the universe’s growth. In its annual survey of the news, Science magazine named the accelerating universe as the science discovery of the year in 1998.

Jeremy Mould, director of Mount Stromlo and Siding Spring Observatories in Canberra, Australia, has studied another aspect of the expansion of the universe. Scientists generally assume that everything in the universe is moving uniformly away from everything else at a rate given by the Hubble Constant. Mould is interested in departures from this uniform Hubble flow. These motions are known as peculiar velocities of galaxies. Starting in 1992, Mould and his colleague John Huchra of the Harvard Smithsonian Center for Astrophysics used an NSF grant to study peculiar velocities of galaxies by creating a model of the universe and its velocity field that included, among other things, galaxy clusters. These galaxies in clusters were accelerated by the gravitational field of all the galaxies in the locality. All other things being equal, a high-density universe produces large changes in velocity. This means that measurements of peculiar velocities of galaxies can be used to map the distribution of matter in the universe. Mould and Huchra’s model has seeded major efforts to collect measurements of the actual density of the universe so as to map its mass distribution directly.

In the modular universe—where stars are organized into galaxies, galaxies into clusters, clusters into superclusters—studies of galaxies, such as those conducted by Mould, give us clues to the organization of larger structures. To appreciate Mould’s contribution to our understanding of these organizing principles, consider that a rich galaxy cluster can contain thousands of galaxies, and each galaxy can contain tens of billions to hundreds of billions of stars. Astronomers now estimate that there are tens of billions of galaxies in the observable universe. Large, diffuse groupings of galaxies emerging from the empty grandeur of the universe show us how the universe is put together—and perhaps even how it all came to be.

Only one of those extragalactic islands of stars—the Andromeda Galaxy—is faintly visible to the naked eye from the Northern Hemisphere, while two small satellite galaxies of the Milky Way—the Large and Small Magellanic Clouds—can be seen from Earth’s Southern Hemisphere. Telescopes augmented with various technologies have enabled astronomers—notably NSF grantee Gregory Bothun of the University of Oregon—to discover galaxies that, because of their extreme diffuseness, went undetected until the 1980s. These “low-surface-brightness” galaxies effectively are masked by the noise of the night sky, making their detection a painstaking process. More than one thousand of these very diffuse galaxies have been discovered in the past decade, but this is only the beginning. “Remarkably, these galaxies may be as numerous as all other galaxies combined,” says Bothun. “In other words, up to 50 percent of the general galaxy population of the universe has been missed, and this has important implications with respect to where matter is located in the universe.”

This galaxy in the constellation Cygnus is nearly 20 million light-years from Earth. Galaxies such as this one are helping astronomers understand the expansion of the universe, its density, and organization.

Page 138: America's Investment in the Future

Why Cosmology?

Cosmologists work to understand how the universe came into being, why it looks as it does now, and what the future holds. They make astronomical observations that probe billions of years into the past, to the edge of the knowable universe. They seek the bases of scientific understanding, using the tools of modern physics, and fashion theories that provide unified and testable models of the evolution of the universe from its creation to the present, and into the future . . . .

Do physics and cosmology offer a plausible description of creation? As cosmologists and physicists push the boundary of our understanding of the universe ever closer to its beginning, one has to wonder whether the creation event itself is explainable by physics as we know it, or can ever know it . . . .

Clearly, these questions are at the heart of humankind’s quest to understand our place in the cosmos. They involve some of the most fundamental unanswered questions of physical science. But why, in a time of great national needs and budget deficits, should the U.S. taxpayer support such seemingly impractical research . . . ?

In fact, far from being impractical, cosmological research produces important benefits for the nation and the world. . . . [I]t has unique technical spinoffs. Forefront research in cosmology drives developments in instrumentation for the collection, manipulation, and detection of radiation at radio, infrared, visible, ultraviolet, X-ray, and gamma-ray wavelengths. The understanding and application of such types of radiation are the foundation for many important technologies, such as radar, communications, remote sensing, radiology, and many more . . . .

Our cosmology—every culture’s cosmology—serves as an ethical foundation stone, rarely acknowledged but vital to the long-term survival of our culture . . . . For example, the notion of Earth as a limitless, indestructible home for humanity is vanishing as we realize that we live on a tiny spaceship of limited resources in a hostile environment. How can our species make the best of that? Cosmological time scales also offer a sobering perspective for viewing human behavior. Nature seems to be offering us millions, perhaps billions, of years of habitation on Earth. How can we increase the chances that humans can survive for a significant fraction of that time? Cosmology can turn humanity’s thoughts outward and forward, to chart the backdrop against which the possible futures of our species can be measured. This is not irrelevant knowledge; it is vital.

—Excerpted from Cosmology: A Research Briefing. Reprinted with permission of the National Research Council, National Academy of Sciences.


Page 139: America's Investment in the Future


The Hunt for Dark Matter

Even with all of the galaxies that Bothun and others expect to find, researchers still say much of the matter in the universe is unaccounted for.

According to the Big Bang theory, the nuclei of simple atoms such as hydrogen and helium would have started forming when the universe was about one second old. These processes yielded certain well-specified abundances of the elements deuterium (hydrogen with an extra neutron), helium, and lithium. Extensive observations and experiments appear to confirm the theory’s predictions within specified uncertainties, provided one of two assumptions is made: (1) the total density of the universe is insufficient to keep it from expanding forever, or (2) the dominant mass component of the universe is not ordinary matter. Theorists who favor the second assumption need to find more mass in the universe, so they must infer a mass component that is not ordinary matter.

Part of the evidence for the second theory was compiled by Vera Rubin, an astronomer at the Carnegie Institution of Washington who received NSF funding to study orbital speeds of gas around the centers of galaxies. Rubin found that these rotational speeds do not diminish near the galaxies’ edges. This was a profound discovery, because scientists previously imagined that objects in a galaxy would orbit the center in the same way the planets in our solar system orbit the Sun. In our solar system, planets nearer the Sun orbit much faster than do those farther away (Pluto’s orbital speed is about one-tenth that of Mercury). But stars in the outer arms of the Milky Way spiral do not orbit slowly, as expected; they move as fast as the ones near the center.

What compels the material in the Milky Way’s outer reaches to move so fast? It is the gravitational attraction of matter that we cannot see, at any wavelength. Whatever this matter is, there is much of it. In order to have such a strong gravitational pull, the invisible substance must be five to ten times more massive than the matter we can see. Astronomers now estimate that 90 to 99 percent of the total mass of the universe is this dark matter—it’s out there, and we can see its gravitational effects, but no one knows what it is.
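The underlying argument is a standard Newtonian one: for material moving at speed v on a roughly circular orbit of radius r, the mass enclosed within the orbit must be about

\[
M(r) \approx \frac{v^{2} r}{G},
\]

where G is the gravitational constant. If the orbital speed v stays flat while r keeps increasing, the enclosed mass must keep growing in proportion to r, even in regions where the visible starlight has nearly run out.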

At one of NSF’s Science and Technology Centers, the Center for Particle Astrophysics at the University of California at Berkeley, investigators are exploring a theory that dark matter consists of subatomic particles dubbed WIMPs, or “weakly interacting massive particles.” These heavy particles generally pass undetected through ordinary matter. Center researchers Bernard Sadoulet and Walter Stockwell have devised an experiment in which a large crystal is cooled to almost absolute zero. This cooling restricts the movements of crystal atoms, permitting any heat generated by an interaction between a WIMP and the atoms to be recorded by monitoring instruments. A similar WIMP-detection project is under way in Antarctica, where the NSF-supported Antarctic Muon and Neutrino Detector Array (AMANDA) project uses the Antarctic ice sheet as the detector.

In the spring of 2000, NSF-supported astrophysicists made the first observations of an effect predicted by Einstein that may prove crucial in the measurement of dark matter.

Dark matter makes up most of the universe, but no one knows how much of it there is. Researchers use computer simulations such as this one to test different ratios of cold and hot (dark) matter in an attempt to learn more about the components of our universe.

Page 140: America's Investment in the Future


Einstein argued that gravity bends light. The researchers studied light from 145,000 very distant galaxies for evidence of distortion produced by the gravitational pull of dark matter, an effect called cosmic shear. By analyzing the cosmic shear in thousands of galaxies, the researchers were able to determine the distribution of dark matter over large regions of the sky.

Cosmic shear “measures the structure of dark matter in the universe in a way that no other observational measurement can,” says Anthony Tyson of Bell Labs, one of the report’s authors. “We now have a powerful tool to test the foundations of cosmology.”

Shedding Light on Cosmic Voids
Even with more than 90 percent of its mass dark, the universe has revealed enough secrets to permit initial efforts at mapping its large-scale structure. Improved technologies have enabled astronomers to detect red shifts and infer velocities and distances for many thousands of galaxies. New research projects will plumb the secrets of nearly one million more. And yet, we have much more to learn from the hundreds of billions of galaxies still unexplored.

Helping in the exploration is an ingenious method, developed with help from NSF, that is commonly used to estimate distances to and map the locations of remote galaxies. R. Brent Tully of the University of Hawaii and his colleague at the NSF-funded National Radio Astronomy Observatory, J. Richard Fisher, discovered that the brighter a galaxy—that is, the larger or more massive it is, after correcting for distance—the faster it rotates. Using this relationship, scientists can measure the rotation speed of a galaxy. Once that is known, they know how bright the galaxy should be. Comparing this with its apparent brightness allows scientists to estimate the galaxy's distance. The Tully-Fisher method, when properly calibrated using Cepheid variable stars, is proving to be an essential tool for mapping the universe.
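The chain of reasoning (rotation speed to intrinsic brightness, intrinsic versus apparent brightness to distance) can be illustrated with a short, hypothetical calculation. The sketch below is not a calibrated Tully-Fisher relation; its constants are placeholders chosen only to show how the inverse-square law turns the comparison into a distance estimate.

```python
# Illustrative sketch (not from the NSF text) of the distance logic behind the
# Tully-Fisher method: rotation speed gives an estimate of intrinsic luminosity,
# and comparing that with apparent brightness gives distance via the
# inverse-square law. The calibration constants here are made-up placeholders;
# real surveys calibrate against galaxies with Cepheid-based distances.

import math

def intrinsic_luminosity(v_rot_km_s, L0=1.0e10, v0=200.0, slope=4.0):
    """Assumed power-law calibration: L ~ L0 * (v/v0)^slope (solar luminosities)."""
    return L0 * (v_rot_km_s / v0) ** slope

def distance_mpc(v_rot_km_s, apparent_flux, flux_at_1_mpc_per_Lsun=2.5e-12):
    """Invert the inverse-square law: flux = L * f1 / d^2, so d = sqrt(L * f1 / flux)."""
    L = intrinsic_luminosity(v_rot_km_s)
    return math.sqrt(L * flux_at_1_mpc_per_Lsun / apparent_flux)

# Example: a galaxy rotating at 250 km/s that appears one-quarter as bright as an
# otherwise identical galaxy is inferred to lie twice as far away.
print(distance_mpc(250.0, 4.0e-4))
print(distance_mpc(250.0, 1.0e-4))   # quarter the flux -> double the distance
```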

In early 2000, researchers announced the discovery of a previously unknown quasar that qualified as the oldest ever found—indeed, as among the earliest structures to form in the universe. Quasars are extremely luminous bodies that emit up to ten thousand times the energy of the Milky Way. Eventually our maps will include everything we know about the universe—its newly revealed planets, the inner workings of the stars, distant nebulae, and mysterious black holes. With our map in hand and our new understanding of how the universe began and continues to grow, we humans will have a better chance to understand our place in the vast cosmos.

Researchers at the Electronic Visualization Laboratory at the University of Illinois at Chicago used data provided by astronomers to create this image of our universe.

To Learn More

Arecibo Observatory
www.naic.edu/aboutao.htm

National Optical Astronomy Observatories
www.noao.edu

National Center for Supercomputing Applications Multimedia Online Expo: Science for the Millennium
Whispers from the Cosmos
www.ncsa.uiuc.edu/Cyberia/Bima/BimaHome.html

Cosmos in a Computer
www.ncsa.uiuc.edu/Cyberia/Cosmos/CosmosCompHome.html

Center for Particle Astrophysics
http://cfpa.berkeley.edu/home.html

NSF Division of Astronomical Sciences
www.nsf.gov/mps/ast

Global Oscillation Network Group (GONG)
www.gong.noao.edu/index.html

Grand Challenge Cosmology Consortium (GC3)
http://zeus.ncsa.uiuc.edu:8080/GC3_Home_Page.html

National Radio Astronomy Observatory
www.nrao.edu

Carnegie Observatories
www.ociw.edu

Carnegie Institution of Washington
www.ciw.edu

Harvard-Smithsonian Center for Astrophysics
http://sao-www.harvard.edu


Science on the Edge

arctic and antarctic discoveries


The polar regions provide unique natural laboratories for the study of complex scientific questions, ranging from human origins in the New World to the expansion of the universe.


People have studied the polar regions for centuries. The extreme cold and stark beauty of the Arctic and Antarctic capture the imaginations of explorers, naturalists, and armchair travelers. In the latter half of the twentieth century, NSF-funded scientists discovered that the Arctic and the Antarctic have much to teach us about our Earth and its atmosphere, oceans, and climate. For example, cores drilled from the great ice sheets of Greenland and Antarctica tell a story of global climate changes throughout history. During NSF’s lifetime, the extreme environments of the Arctic and Antarctic have become learning environments.


A Surprising Abundance of Life
Both the Arctic and Antarctic seem beyond life: icy, treeless, hostile places. Yet these polar regions host a surprising abundance of life, ranging from the microbial to the awe-inspiring, from bacteria to bowhead whales.

Important differences mark North and South. The North Pole lies in the middle of an ocean surrounded by land, while the South Pole rises from the center of a continent surrounded by an ocean. In the Arctic, human habitation stretches back for thousands of years. The Inuit and other indigenous peoples in the Arctic continue to carry out age-old traditions while adopting modern technology for subsistence hunting and fishing. The Antarctic has no “native” human populations but hosts a visiting population of scientists and support personnel every year. Human migration and methods of interacting with the environment form important research topics for NSF-supported social scientists who work in the Arctic, while in the Antarctic, researchers focus on the effects of isolated and confined environments on the people who work there.

The poles were still poorly understood places when scientists the world over organized a special effort called the International Geophysical Year (IGY) to study the Earth and Sun on an unprecedented scale. The IGY, which ran from July 1957 to December 1958, was modeled on two previous International Polar Years and brought NSF firmly into the realm of polar science.

During the First Polar Year (1882–83), scientists and explorers journeyed to the icy margins of the Earth to collect data on weather patterns, the Earth’s magnetic force, and other polar phenomena that affected navigation and shipping in the era of expanding commerce and industrial development. In all, the First Polar Year inspired fifteen expeditions (twelve to the Arctic and three to the Antarctic) by eleven nations. Along the way, researchers established twelve research stations.

By the Second Polar Year (1932–33), new fields of science had evolved, such as ionospheric physics, which peers into the outer layer of Earth’s atmosphere. Scientists at the time turned to the polar regions to study the aurora phenomena—known in the Northern Hemisphere as the “northern lights”—and their relation to magnetic variations, cosmic radiation, and radio wave disturbances. How did the Sun, the atmosphere, and the Earth’s interior interact at the poles? Could scientists learn how to anticipate the magnetic storms that sometimes disrupt radio-based communications? Data collected during the Second Polar Year contributed to new meteorological maps for the Northern Hemisphere and verified the effects of magnetic storms on radio waves.

The United States has supported research at the Amundsen-Scott South Pole Station continuously since 1956. The current station, completed in 1975, is being redeveloped to meet the changing needs of the U.S. science community. Today’s research at the South Pole includes astronomy, astrophysics, and atmospheric monitoring—e.g., ozone depletion and greenhouse gas concentrations. To the left in the picture is the geodesic dome that currently houses the main station buildings. On the right, a ski-equipped Hercules airplane waits on the South Pole skiway.


Still, scientists lacked a complete picture of how ice, atmosphere, land, and oceans worked together at the poles as a system of cause and effect.

Technological advancements in rockets, satellites, and instrumentation during the 1940s and 1950s allowed more and better measurements in the remote Arctic and Antarctic. By the time of the 1957–58 IGY, researchers were free to explore the ocean floor as well as the upper atmosphere: they could use nuclear-powered submarines to plunge under the ice cap and discover new ocean ridges, and they could launch satellites to make remote geophysical measurements. For the first time, the polar regions became year-round research platforms available for widespread international cooperation. Furthermore, everyday

citizens became involved in scientific observations. People in the far north and the far south recorded their own aurora sightings and temperature readings, information that was funneled to scientists. Sixty-seven countries participated in the IGY, including the United States and the Soviet Union. Despite Cold War tensions between East and West, the world was engaged in cooperative, coordinated science at the poles and in other parts of the world.

The IGY set the stage for polar research at NSF in two ways. First, scientists came to think of the poles as natural laboratories in which to capture and integrate diverse data about “the heavens and the earth.” Second, polar research became a cooperative international undertaking. Following the IGY, the twelve countries that had established some sixty research stations in Antarctica concluded a treaty to use Antarctica for peaceful purposes only, to freely exchange scientific information, to prohibit nuclear explosions and the disposal of radioactive wastes, and to maintain free access to any area on the continent. By 1999, the Antarctic Treaty had forty-four parties, representing two-thirds of the world’s human population; other agreements were made, too, including a protocol for improved environmental protection of Antarctica.

The 1990s also saw cooperation blossom up north. In 1996, the eight Arctic nation-states established the Arctic Council—the result of a process of negotiations aimed at protecting the Arctic environment while also allowing for vital research.

Near Antarctica’s coast lie the McMurdo Dry Valleys, a long-term ecological research site funded by the NSF. The bizarre rock formation shown here, called a “ventifact,” was sculpted by wind-blown particles of ice and stone. Ventifacts are common to dry, windy places like Antarctica, the high deserts, and Mars. In fact, the Dry Valleys served as practice terrain prior to NASA’s launch of the Viking probe to Mars.


Ice Cores Hold Earth’s Climate
As ice forms, gases and other materials are trapped in the layers that build up over time. This makes the polar regions time machines. With more than 500,000 years of snow and ice accumulation, the ice sheets are ideal places for paleoclimatologists to set up their tubular drills and extract cores—long cylinders of sediment and rock—in order to read the history captured therein.

Working in the center of Antarctica’s ice sheet, near the Russian research base of Vostok, a group of researchers from the United States, Russia, and France has extracted the world’s deepest core. As a result, the scientists have differentiated more than four ice ages, or about 400,000 years of history.

What researchers are discovering is that Earth’s climate is not stable, and never has been. Ice ages are punctuated by interglacial periods of relative warmth, such as the one marking the close of the twentieth century. The interglacial periods have been marked by sudden shifts in temperature, wind patterns, and sea levels.

“Some of these rapid changes occur in two decades,” says Paul Mayewski, a glaciologist from the University of New Hampshire and a thirty-year veteran of NSF-funded research in Antarctica. “Some [of the pattern changes] actually start in less than two years.” While he finds these dramatic shifts surprising, he also notes that Antarctic cores are in sync with the climate data found in the ice cores from Greenland.

Mayewski and his colleagues learn about these changes by examining the chemical indicators, such as sea salt, within the extracted ice cores. High sea salt levels signal increased storminess and stronger winds. In addition, measurements of oxygen isotopes in the ice reveal cooling during periods of increased sea salt. Other tests probe for indicators of wind patterns, volcanic activity, and sea level.
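A rough way to picture how such proxies are read, using made-up thresholds rather than Mayewski's actual analysis, is a simple rule that pairs the sea-salt and oxygen-isotope signals for each layer of a core:

```python
# Rough interpretive sketch (my own simplification, not the researchers' method):
# the passage pairs two proxies read from each core layer -- sea salt as a marker
# of storminess and wind strength, and oxygen-isotope ratios as a marker of
# temperature. The cutoff values below are illustrative placeholders only.

def interpret_layer(sea_salt_ppb, delta_o18):
    """Return a qualitative reading of one ice-core layer from two proxies."""
    windy = sea_salt_ppb > 60        # assumed cutoff for "stormier than usual"
    cold = delta_o18 < -36.0         # more negative delta-18O -> colder snowfall
    if windy and cold:
        return "stormy and cold (glacial-style conditions)"
    if not windy and not cold:
        return "calm and relatively warm (interglacial-style conditions)"
    return "mixed signal -- check other indicators (volcanic ash, sea level)"

# Hypothetical (sea salt in parts per billion, delta-18O in per mil) layer readings.
for layer in [(85, -38.2), (40, -34.5), (70, -35.0)]:
    print(layer, "->", interpret_layer(*layer))
```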

However, the researchers still don’t know what caused the rapid climate pattern changes evidenced in the ice cores.

“We need to understand how these changes work in order to make a better assessment of natural climatic change,” Mayewski says, “and a better assessment of the human impact on the future climate.”


Human Migration and Local Knowledge
Scientists are only the most recent human arrivals to the poles—people have lived in the Arctic for thousands of years, and the region offered the first migratory route for humans moving into North America. At least twelve thousand years ago, and possibly earlier, newcomers to North America are thought to have crossed to present-day Alaska from northeast Asia via Beringia, a vast plain—now submerged—that once connected the two land masses. Until recently, scientists believed that the newcomers entered the present-day Yukon Territory after crossing the land bridge, then headed south through an inland route. But recently, a science team funded by NSF offered evidence in support of another theory, which suggests that rather than going inland, the newcomers used watercraft along the southern margin of Beringia and southward along the northwest coast of North America. This may have enabled humans to enter

the southern areas of the Americas prior to the melting of the continental glaciers. In 1997, NSF-funded researchers excavated a cave on Prince of Wales Island, Alaska, and found parts of a human jaw and pelvis dating to between 9,200 and 9,800 years ago, the oldest human bones ever found in Alaska. Isotope analysis of the bones showed that the person had subsisted on a marine diet. These first peoples would have had plenty of fish and other marine resources to eat as they moved in skin-covered boats along the Pacific Coast south to Peru and Chile during the last ice age.

While the story of the first people to arrive in North America continues to unfold, another side of the story tells of the close collaboration between scientists and contemporary indigenous communities in the Arctic. During their excavations of the cave on Prince of Wales Island, the archaeologists sought and obtained the approval and collaboration of the tribal governments in Alaska. Alaska Native interns work on the site and present research papers at archaeology meetings. Tribal councils discuss news of scientific discoveries. This relationship of mutual trust and learning exemplifies the Principles for the Conduct of Research in the Arctic, a set of guidelines based on the ethical responsibility of researchers working in the North to consult, listen to, and involve the people of the North.

The Principles, adopted in 1990 by the NSF-chaired Interagency Arctic Research Policy Committee, echo the wish of Arctic peoples that science involve them as partners. After all, science consists in part of good, systematic observation, a critical element in the long-term survival of indigenous peoples who have for generations carved out economies and cultures in a challenging environment.

Inupiat whalers wait by the sea ice edge near Barrow, Alaska. When a bowhead approaches, they will slip their seal skin umiaq into the water, anticipating where the whale will surface. According to the Principles for the Conduct of Research in the Arctic, a set of interagency guidelines adopted under the leadership of NSF, Arctic peoples are partners in research. As experts of survival in the North, indigenous elders have already helped scientists understand, for example, how the beaver population can affect whale migration patterns.


What’s more, indigenous peoples have developed time-tested technologies, such as toggle harpoons and skin boats, well suited for the North. NSF-supported research teams, including Native elders and social scientists, have tapped into this locally held knowledge of the Arctic environment to enrich ecological models and to document oral traditions.

For example, from 1995 to 1997, researchers conducting an NSF-funded study of beluga whale ecology in the Bering Sea asked Native whalers and elders to analyze patterns of whale migration. Surprisingly, the elders began to talk not only about belugas—the white whales with the “smirk”—but also about beavers. As the beaver population rises, more streams leading to the bay are dammed, spawning habitat for the salmon disappears, and fewer fish are available as prey for the belugas. Thus, the belugas may start to bypass the river mouth during their migrations.

The Importance of Sea Ice
Another important topic for researchers is sea ice. Polar sea ice undergoes tremendous changes every year. During the winter, the Arctic ice pack grows to the size of the United States; in the summer, half of the ice disappears. On the other side of the globe, ice covers nearly 98 percent of the Antarctic continent and averages one mile in thickness. The sea ice surrounding Antarctica changes in size depending on the season, ranging from roughly 4 million square miles in February (the Antarctic summer) to 19 million in August. So huge is the Antarctic ice pack that it accounts for 90 percent of the world’s ice and 70 percent of its fresh water.

Like Doing Research on the Moon
NSF began managing the entire U.S. Antarctic Program (USAP) in 1970, providing not only research facilitation but also overall logistics management. The program maintains three year-round research stations and two research vessels capable of navigating through ice, as well as laboratories, telescopes, and other major instruments positioned across the continent. Besides developing the U.S. scientific agenda in cooperation with researchers, NSF provides for the health, safety, and overall well-being of 3,500 American scientists and support personnel, most of whom arrive and depart between October and February, when the continent is readily accessible by plane or ship.

Due to the continent’s remote location and the fact that all provisions, building materials, fuel, equipment, and instruments must be brought in by ship or cargo plane, scientists say the experience of working in Antarctica is “like doing research on the moon.” Each year, NSF improves the connections between the USAP and the rest of the world, making the region a little less isolated. All of the research stations now have Internet connections, and NSF hopes to extend the program’s telecommunications capabilities so that one day scientists can operate equipment remotely and view real-time displays of data from their home institutions.


Given the amount of water that sea ice alternately puts into or pulls out of the ocean and the atmosphere, sea ice variability plays a major role in global climate change. During the International Geophysical Year, scientists from the United States and the Soviet Union spent entire winters on ice islands in the Arctic, measuring depth, salinity, temperature, and other factors to model the extreme variability of sea ice. Forty years later, NSF-funded researchers repeated much of the work done in the IGY but this time with modern means, greatly improving our understanding of sea ice variability and its connections to climate change.

From the fall of 1997 through the fall of 1998, the Canadian icebreaker Des Groseillers was frozen into the Arctic ice pack for scientific studies related to a multinational project known as SHEBA, or Surface Heat Budget of the Arctic. NSF, the U.S. Office of Naval Research, and the Japanese government cooperated in funding this massive study of heat flow among the water, ice, and air of the northernmost Arctic. For a full year, researchers documented how ice, clouds, snow, and the ocean interact and exchange energy. SHEBA researchers are now off the ice and back in the laboratories, integrating and analyzing the vast amount of data they collected, but they have already reported a number of surprises.

One unexpected finding concerned the salinity of the water. When the scientists first arrived at the Arctic ice pack in October 1997, they discovered that the water was much fresher than it had been when the same area was analyzed twenty years earlier. They concluded that the melting of the ice pack during the summer of 1997 caused the water to be proportionally less salty. Such a change can have serious consequences for marine life as well as for how ocean water circulates and interacts with the atmosphere. And while melting sea ice does not itself raise sea level appreciably, the warming that drives it also melts land-based ice sheets and glaciers, raising worldwide sea levels with potentially significant effects for coastal cities and towns.

All of which, of course, raises questions about the nature of the warm weather associated with sea ice melting. Over the last one hundred years, overall global climate has warmed, on average, by about 0.9˚F, with the Arctic leading the way. Temperatures at the North Pole have risen nearly 3.6˚F per decade in the last thirty years, significantly faster than in other regions of the world.


Two researchers walk from the icebreaker Des Groseillers toward their lab, one of more than a dozen huts and tents on the ice where scientists conducted their work. The ship served as a floating field camp for the NSF-funded SHEBA experiment, an international study of heat flow in the Arctic. Frozen into the ice in the Beaufort Sea on October 2, 1997, the ship and its researchers drifted with the Arctic ice for a full year, traveling nearly 2,700 miles.


Why the Ozone Hole? A Writer Explains
Why did the ozone hole develop over Antarctica, and not over Detroit or some other manufacturing center where chlorofluorocarbons, or CFCs, are released prodigiously? The reasons are explained by Rebecca L. Johnson, who participated in NSF’s Antarctic Artists and Writers Program in 1991, 1994, and 1997.

In winter, the stratosphere above the Antarctic continent gets colder than it does anywhere else on Earth. Temperatures frequently drop below -112˚F. Antarctica is also one of the windiest places on Earth. In May and June, strong winds in the stratosphere begin to blow clockwise around the continent. These howling stratospheric winds gradually form an enormous ring of moving air, called the Antarctic polar vortex, that swirls around and around, far above the frozen land . . . .

During the winter, temperatures inside the Antarctic polar vortex fall so low that water vapor and several other types of molecules in the stratosphere condense into extremely small icy particles. These icy particles, in turn, make up polar stratospheric clouds (PSCs). When the sun sets in the Antarctic around the end of March each year, its disappearance marks the beginning of a long, dark winter. Once the last rays of sunlight have faded away, temperatures on land and in the air fall very quickly.

In the stratosphere, high-altitude winds that create the polar vortex begin to blow around the continent.

Isolated from warmer air outside the vortex, the air inside gets colder and colder. Eventually, it is cold enough for PSCs to form. And that is when the trouble really begins.

Drifting around inside the polar vortex are reservoir molecules that have bonded with chlorine atoms and in so doing prevented them—so far—from attacking ozone. When PSCs form above Antarctica, chlorine reservoir molecules bind to the icy particles that make up the clouds. Once this happens, complex chemical reactions begin to take place that result in molecules of chlorine gas (Cl2) being released from the reservoirs. In this form, however, chlorine doesn’t attack ozone. It just collects inside the vortex. All through the long, dark winter, especially during July and August, the chemical reactions taking place on the surfaces of the PSC particles continue, and more and more Cl2 builds up inside the vortex. At this point, the stage is set for ozone destruction. All that is needed is a trigger to get the process going.

That trigger comes in late August, when the sun begins to rise. As the first rays of spring sunlight strike the stratosphere high over the frozen continent, conditions change very rapidly. The UV rays coming from the sun strike the Cl2 molecules inside the vortex. The molecules break apart, releasing billions of chlorine atoms that begin an attack on ozone molecules. The result is massive ozone destruction.

Before long, so much ozone is destroyed inside the vortex that an ozone hole is formed.

Ozone destruction continues—and the hole remains—until conditions in the stratosphere above Antarctica change. This change usually begins in early October, when the continent and the air above it finally begin to warm up. Warmer temperatures in the stratosphere melt the icy particles that make up PSCs. The PSCs disappear, and the reservoir molecules that were bound to the icy particles are released. Free at last, the reservoir molecules bind chlorine atoms once again, and ozone destruction stops.

By early November, the strong stratospheric winds circling Antarctica die down, and the polar vortex breaks up. As it does, ozone-rich air from outside the vortex flows in, and much of the ozone that was destroyed is replaced. In a sense, the hole in the ozone layer fills in. Usually by the end of November, the amount of ozone in the stratosphere over Antarctica has almost returned to normal. The next winter, however, the cycle will begin again.

From Investigating the Ozone Hole by Rebecca L. Johnson. © 1993 by Lerner Publications Company. Used by permission of the publisher. All rights reserved. Ms. Johnson is also the author of Science on the Ice: An Antarctic Journal (1995) and Braving the Frozen Frontier: Women Working in Antarctica (1997).



The Antarctic is warming up as well. Ice shelves from the western side of the Antarctic Peninsula have been shrinking; according to some reports, the 502-square-mile Wordie Ice Shelf disappeared completely between 1966 and 1989.

NSF-funded scientists who participated in the Scientific Ice Expeditions (SCICEX) program have confirmed an acceleration in sea ice shrinkage. The SCICEX program provided the opportunity to use U.S. Navy submarines for Arctic research during the 1990s. Data from the SCICEX cruises demonstrate that the Arctic sea ice cover is showing signs of diminished extent and seasonal duration. What’s more, ice observed in the 1990s was more than three feet thinner than ice measured two to four decades earlier. Together, the SHEBA and SCICEX projects have

revealed a major climatic factor—shrinking sea ice—that is now being incorporated into forecasts of global climate variability. If the ice pack continues to decrease in coverage and thickness, researchers suggest the possibility of a nearly ice-free Arctic—an area that has been covered by ice for at least three million years—and a vastly changed world.

What is the source of the warming trend? Part of the challenge in answering this question is learning how to separate the effects of human activity (such as the introduction into the atmosphere of “greenhouse” gases like carbon dioxide) from warming and cooling cycles that occur naturally. In the polar regions, average temperatures have fluctuated on time scales ranging from tens of thousands to one hundred thousand years. Further study of ice and sediment cores will provide a more detailed picture of ice sheet behavior during warmer intervals of Earth’s history. Because Earth was warmer in the distant geologic past, studies of this complex period should shed light on the future effects of global warming.

The dim, late-summer sun brightens the Ferrar Glacier in the Transantarctic Mountains of Antarctica. What can still seem like the most remote and forbidding region on Earth has become a model of ongoing international scientific cooperation, with the National Science Foundation playing a lead role.


Studying Extremes Above and Below
Ice is not the only substance of interest at the poles. These extreme environments offer windows into realms yet to be explored. The universe, for example. How did the universe evolve? Will the universe continue to expand? Astronomers use a year-round observatory at the South Pole to answer these questions, taking advantage of the Pole’s natural features: the dark, dry, and cold environment makes for easier detection of infrared wavelengths and small particles. Infrared and submillimeter radio telescopes at the South Pole detect wavelengths obscured at most other observing sites. NSF-funded researchers use the Antarctic ice sheet to capture invisible, subatomic particles called neutrinos in order to gain insight into violent astrophysical events such as black hole collapses and supernova explosions.

Another territory ripe for exploration can be found deep below the ice. Thousands of feet under the Antarctic surface, below the Russian-run research station known as Vostok, lies Lake Vostok. The subglacial lake, roughly the size of Lake Ontario, has been isolated from Earth’s ecosystem for millions of years. Cut off from the rest of the Earth, Lake Vostok may be home to ancient species of microbes that have been able to survive in this extreme environment. As part of a joint U.S., French, and Russian research project, Russian teams have drilled down into the ice covering the lake and extracted the world’s longest, deepest ice core. They stopped drilling at about 395 feet above the ice-water interface to prevent possible contamination of the underlying lake by kerosene-based drilling fluid.

The upper 9,800 feet of the ice core provide a continuous paleoclimatic record of the last 400,000 years. The record shows that there have been four complete climatic cycles, including four ice age or glacial periods associated with the development of large ice sheets over the Northern Hemisphere, and four warmer interglacial periods.

In 1999, NSF-funded researchers sailed aboard a U.S. Navy nuclear submarine (the USS Hawkbill, shown here poking through the ice) to map the oceanic ridges and basins beneath the Arctic ice cap and to study ocean currents that may have an effect on global climate. The project, called Scientific Ice Expedition (SCICEX) ’99, was the fifth in a series of annual missions, all taking advantage of sophisticated scientific instruments aboard highly maneuverable warships.

In addition, NSF-funded scientists discovered that the core contains bacterial forms, showing that microbes existed under the ice and probably still thrive in the lake. Supporting this theory is a July 2000 report by a separate team of NSF-funded researchers that they have discovered metabolically active bacteria surviving in South Pole snow.

How do such “extremophiles” survive? Where do they get their energy—from geothermal activity? Studying the microbes and their unique and isolated environment will tell scientists more about whether life may be able to exist in harsh conditions elsewhere in the solar system. Indeed, Lake Vostok appears to resemble conditions on Jupiter’s frozen moon Europa. Scientists and engineers are now working on methods to sample the subglacial lake while preventing contamination.


Moving up in scale from microbes, biologists continue to discover important adaptations among larger extremophiles. In the late 1960s, physiologist Arthur L. DeVries discovered with the help of NSF funds that Antarctic notothenioid fish are protected from subzero temperatures by antifreeze glycoproteins in their blood. Continuing studies to unravel the workings of fish antifreeze could have profound implications in a number of areas—from human organ transplantation to agriculture and beyond. As it happens, Arctic cod have similar glycoproteins. These proteins bind to ice crystals and keep them from growing. Yet NSF-funded studies in the 1990s revealed that the Arctic cod and Antarctic notothenioid actually belong to two different orders of fish that diverged in evolution some forty million years ago. This is a striking case of convergent evolution in polar environments: the fish took different routes toward the identical solution of how to stay alive in ice water.

Ozone Hole over Antarctica
Life at the margins may be extreme, but it is also fragile. The British Antarctic Survey’s first documentation of the Antarctic ozone hole in 1985 and subsequent NSF-funded study of the phenomenon alerted the world to the danger of chlorofluorocarbons, or CFCs. That research team, led by 1999 National Medal of Science winner Susan Solomon, conducted observations that have significantly advanced our understanding of the global ozone layer and changed the direction of ozone research.

Stratospheric ozone protects against ultraviolet radiation. The breakdown of this ozone layer by CFC molecules can have harmful effects on a range of life forms, from bacteria to humans. The long, cold, dark Antarctic winters allow the formation of polar stratospheric clouds, the particles of which form an ideal surface for ozone destruction. The returning sunlight provides energy to start the complex chemical reaction that results in the depletion of ozone. The ozone hole above Antarctica typically lasts about four months, from mid-August to late November.

During this period, increased intensity of ultraviolet radiation has been correlated with extensive DNA damage in the eggs and larvae of Antarctic fish. Embryos of limpets, starfish, and other invertebrates do not grow properly. Other species have developed defenses. The Antarctic pearlwort, a mosslike plant on rocky islands, produces flavonoid pigments that make it more tolerant of ultraviolet radiation.

In the northern polar regions, ozone levels in the early 1990s measured 10 percent lower than those estimated in the late 1970s. The Arctic does experience ozone depletion, but to a lesser degree than the Antarctic. Unlike in the Antarctic, large-scale weather systems disturb the wind flow in the Arctic and prevent the stratosphere from getting as cold. Therefore fewer stratospheric clouds form to provide surfaces for the production of ozone-depleting compounds.

Antarctica as pictured by the spacecraft Galileo on its way to Jupiter. The picture was taken in early December, a time of year when the ozone hole over the South Pole is small to nonexistent. During the cold Antarctic winters—April through August—icy stratospheric particles form, which interact with atmospheric chemicals to foster ozone destruction upon the return of sunlight in late August. The ozone hole grows until October, when warmer temperatures begin to melt the icy particles. Because stratospheric ozone protects living things against harmful radiation, NSF-funded researchers at the Pole are working hard to better understand the causes and effects of ozone depletion.


Some clouds do form, however, and allow the chemical reactions that deplete ozone. Ozone depletion has a direct effect on human inhabitants, but research has only just begun on the effects of increased ultraviolet radiation on terrestrial and aquatic ecosystems and on societies and settlements in the Arctic.

The good news is that countries around the world have agreed to ban the manufacture of CFCs through the Montreal Protocol. The contributions of Antarctic researchers led to swift policy action, and because of that the ozone layer should recover in the future. In the meantime, however, NSF-funded research continues to monitor the levels of CFCs still lingering in the atmosphere. The polar regions will continue to play an important role as early warning systems for the rest of the globe.

Knowledge of the Whole
Knowledge of life in extreme environments helps us to understand not only how life may have begun on Earth, but also what we may find beyond our own planet. Records from ice and sediment cores reveal past climate patterns, helping scientists to anticipate future scenarios and maybe allowing policymakers to make more informed decisions. Following ethical principles in partnership with Arctic communities brings researchers to a deeper understanding of their own scientific methods while enabling them to listen to local knowledge and oral traditions.

What will happen to the sea ice in the Arctic and the massive glaciers in the Antarctic? How will ecosystems adapt to the rapid changes observed over the last few years? Data captured at the poles show that the Earth is a total system where cause and effect know no north or south. The Arctic and Antarctic both register the effects of, and have their own influence on, global circulation patterns in the ocean and atmosphere.

To Learn More

NSF U.S. Antarctic Program
www.nsf.gov/od/opp/antarct/usap.htm

Antarctic Muon and Neutrino Detector Array (AMANDA)
http://amanda.berkeley.edu

Center for Astrophysical Research in Antarctica
http://astro.uchicago.edu/cara/home.html

McMurdo Dry Valleys Long-Term Ecological Research (LTER)
http://huey.colorado.edu

Scientific Committee on Antarctic Research
www.scar.org

Scientific Committee on Antarctic Research: Global Change and the Antarctic
www.antcrc.utas.edu.au/scar/

NSF Arctic Program
www.nsf.gov/od/opp/arctic/

Arctic Research Consortium of the United States
www.arcus.org

International Arctic Environment Data Directory
www.grida.no/prog/polar/add/

Alaska Native Knowledge Network
www.ankn.uaf.edu

Center for Global Change and Arctic System Research
www.cgc.uaf.edu

SHEBA Home Page
http://sheba.apl.washington.edu

NSF has enabled science to reach the most remote and seemingly forbidding regions on Earth, only to discover that these regions may hold the key to a global understanding. As scientists make discoveries at the ice’s edge, they join earlier generations of hunters, explorers, and navigators in a time-honored quest for knowledge of the extreme, leading to knowledge of the whole.


Disasters & Hazard Mitigation

living more safely on a restless planet


Nature is continually reshaping our world with volatile and often catastrophic power. NSF-funded research is helping to improve our understanding of the causes and effects of natural disasters while also making the world safer. Together, scientists and engineers are drawing from a wide variety of disciplines to mitigate the hazards—and answer the scientific questions—posed by nature’s most energetic events.


Natural disasters are complex and often global in their effects. That is why the hallmark of NSF's long involvement in disaster research has been to encourage the exchange of ideas across national boundaries as well as scientific disciplines. To find answers in this high-stakes field, NSF programs marshal a wide range of researchers, including atmospheric scientists, engineers, geologists, sociologists, economists, seismologists, biologists, political scientists, and others. Their work takes them to wherever nature is in turmoil—earthquakes in Japan, volcanoes in the Philippines, hurricanes in the Atlantic, and floods on America’s Great Plains. The resulting discoveries about the inner workings and human risk associated with nature’s most extreme events are making both warning and mitigation increasingly possible.


The Forces Underlying the Fury
The economic cost of natural disasters in the United States has averaged as much as $1 billion a week since 1989—and is expected to rise, according to a 1999 NSF-supported study. Because natural disasters can have such brutal consequences, it’s easy to think of them in terms of human misery that, somehow, must be mitigated. But society cannot mitigate what it does not understand. Natural disasters are, after all, a part of nature, and though human activities can influence the impact of extreme events, researchers must first learn as much as possible about the basic physical forces underlying the fury. At NSF, most of the research into natural disasters and their mitigation takes place within the Directorate for Geosciences, the Directorate for Engineering, and the Directorate for Social, Behavioral, and Economic Sciences.

Take, for example, earthquakes and volcanoes. Almost from its inception, NSF has been a critical player in the global effort to understand and cope with these giant Earth-altering forces. NSF funded a series of explorations during 1957–58—dubbed the International Geophysical Year—and again in the 1960s. These explorations confirmed a wild idea that scientists had begun to suspect was true: the Earth’s seafloors, rather than being continuous like the rind of a melon, were actually disparate pieces that, at least in some places, were slowly moving away from each other. These findings pushed geophysicists toward the modern theory of plate tectonics. Researchers now know that the upper part of Earth’s crust is broken up into a number of rigid sections, or plates, and that these plates float atop soft, hot rock kept partially molten by the planet’s unimaginably hot interior. As the plates drift, they not only separate but also collide and slide past each other, forming valleys and mountain ranges. Occasionally, some of the molten rock breaks through—and a volcano is born. When two plates grind past each other, the shuddering friction generates earthquakes.

Of the one million or so earthquakes that rattle the planet each year, only a few—about one each week—are large enough to grab our attention. Predicting when and where the next “big one” will take place is still far from a certainty. Short-term forecasts are sometimes pegged to swarms of smaller quakes that may signal mounting stress at a fault. Or a sudden change in underground water temperature or composition may be significant: this type of signal led to the successful evacuation of a million people before a major earthquake struck near the city of Haicheng, China, in 1975—the first earthquake to be scientifically foretold.

NSF-funded researchers are making headway on the difficult question of earthquake prediction by narrowing their focus to specific regions of the world. Because the behavior of seismic waves is so strongly affected by the different kinds of soil and geological structures through which the waves must travel, the effects of an earthquake can vary widely from place to place, even along the same fault. A soft-soil area such as a lakebed, for example, will shake more than a rocky hill. Knowing this, scientists and engineers at the NSF-sponsored Southern California Earthquake Center in Los Angeles have reassessed the consequences of earthquakes along faults in the surrounding region. The scientists were able to simulate the anticipated effects of future local quakes by using sophisticated computer models of the Los Angeles basin that accounted for fault geometry and motion, sediment composition, and other factors that can reflect, prolong, or amplify quaking motion.

With funding from the National Science Foundation, scientists have made significant advances in the accuracy of storm prediction since the first tornado forecast in 1948.


Such modeling, supplemented with data from new digital seismic recorders capable of sensing a broad range of actual earthquake vibrations, can help researchers and residents of quake-prone areas to anticipate—at least in a general way—when and where the next big temblor will hit and what damage may result.

Even as local efforts to understand earthquake activity improve, scientists are finding new ways to take another look at the big picture. In June 1999, NSF-funded researchers joined an international team headed to the east coast of Japan to establish long-term seafloor observatories in one of the world’s busiest earthquake zones: the so-called Japan Trench, where two of Earth’s biggest tectonic plates are colliding. The international team of scientists drilled holes about one kilometer deep into the ocean floor along the trench, which itself is two kilometers underwater. They then installed instruments at the bottom of these boreholes to monitor the amount of seismic activity there. Robotically controlled vehicles similar to those used to investigate the sunken Titanic will periodically travel to and from the seafloor observatories and help provide scientists with long-term observations of one of the planet’s most active quake regions.

Another way that NSF is helping researchers gather data close to the moment of seismic activity is through its funding of the Earthquake Engineering Research Institute (EERI) in Oakland, California. Besides encouraging regular communication among engineers, geoscientists, architects, planners, public officials, and social scientists concerned about natural disasters, EERI quickly assembles and deploys teams of researchers on fact-finding missions in the wake of earthquakes—anywhere in the world—soon after they occur.

Reducing the Risk
Though researchers cannot yet precisely predict the timing and location of earthquakes, NSF has long recognized that more can be done to minimize—or mitigate—the damage that quakes can cause. Toward that end, in 1977 Congress passed the Earthquake Hazards Reduction Act, which put NSF in charge of a substantial part of earthquake mitigation research efforts in the United States. Earthquake-related studies, especially with regard to structural and geotechnical engineering, now make up the bulk of NSF’s natural disasters research under the guidance of the Natural Hazards Reduction Program in the Directorate for Engineering.

Why engineering? Because most of the immediate deaths from earthquakes occur when buildings collapse, and the huge economic losses associated with the biggest quakes stem from damage to the structures and infrastructures that make up cities and towns. In 1997, NSF officially charged three earthquake centers with a major portion of the responsibility for conducting and coordinating earthquake engineering research in the United States. The centers, each constituting a consortium of public and private institutions, are based at the University of California at Berkeley, the University of Illinois at Urbana-Champaign, and the State University of New York at Buffalo.

The NSF-funded earthquake centers are models of cooperation, including not only geoscientists and engineers but also economists, sociologists, political scientists, and contributors from a host of other disciplines. The Buffalo center, for example, recently studied the potential economic impact of an earthquake in the Memphis, Tennessee, area near the epicenter of several major quakes that struck in 1811–12. Participants in the study included researchers from the University of Delaware’s Disaster Research Center, who examined economic, political, and social elements of the hazard. The Delaware researchers have also studied the impact that the Loma Prieta earthquake (1989) and Hurricane Andrew (1992) had on businesses in the Santa Cruz and Miami areas, respectively.


Climate Change—Disaster in Slow Motion

Before 1950, climatologists spent most of their time describing and comparing the current-day climates of different regions. Even the relatively recent climatic past remained a mystery to them, and interactions between the atmosphere and the oceans that researchers now know drive global climate change were too complex to study with the mathematical tools at hand. But then came the computer revolution, funded to a significant degree by NSF, and today much of nature’s turbulence, past and present, is available for study.

With the advent of NSF-sponsored supercomputers, climatologists began building models of atmospheric change that now embrace millions of years of oceanic, atmospheric, biological, geological, and solar processes. For example, by the late 1980s NSF-supported researchers at the University of Washington were able to reconstruct the wide extremes of temperatures that existed 250 million years ago within the giant supercontinent of Pangaea.

In 1999, climate modelers at the NSF-funded National Center for Atmospheric Research in Boulder, Colorado, managed to accurately simulate a century of known climate history. The scientists then carried these simulations a century into the future. Their model suggests that if carbon dioxide emissions continue to rise at their current pace, there will likely be a boost in global temperatures as well as a 40 percent jump in winter rain and snow within the southwest region and Great Plains of the United States. The model also shows that the warming effect would be more severe in the United States than in Europe or Asia.

While global warming might not rival earthquakes and hurricanes for dramatic immediacy, such gradual but significant climate changes can indeed have disastrous consequences for human society. As ice caps melt, sea levels will rise, threatening coastal habitation and commerce. Warmer temperatures will also radically alter when, where, and whether farmers can grow certain crops. Climate models that can predict such events with a fair degree of certainty—and perhaps suggest what can be done to minimize their impact—will make an invaluable contribution to the field of natural hazards research. Accustomed to the urgency of saving lives, natural disaster researchers now face the challenge of preserving a way of life, as well.


Kathleen Tierney, a sociologist at the University of Delaware and a co-principal investigator for the Buffalo earthquake consortium, says the few previous studies of long-term disaster impacts focused on individuals and families rather than on businesses. The new Delaware research should help both policymakers and business owners better understand the economic impacts of disasters and devise more effective ways of coping with them.

While understanding the economic impact of disasters is important, the heart of the earthquake centers’ mission is to design safer buildings. In 1967, the University of California at Berkeley center installed what is still the nation’s largest “shake table.” The twenty-foot-by-twenty-foot platform reproduces the seismic waves of various earthquakes, allowing engineers to test model structures. After the Loma Prieta quake, NSF funded an upgrade of the table from two- to three-dimensional wave motions; additional digital controls and sensors will soon allow offsite researchers to monitor experiments at the shake table in real time via a computer network.

Ultimately, says William Anderson, senior advisor in NSF’s Division of Civil and Mechanical Systems, the research community may be able to conceptually link geophysical and geotechnical research—such as computer models of faults and soil liquefaction—to the engineering simulations of building parts, creating a unified, integrated mathematical model of disaster.

The need for such research has been underscored numerous times in the latter part of the twentieth century. Early in the morning on January 17, 1994, southern California suddenly heaved and swayed. Deep beneath the town of Northridge, less than 25 miles from downtown Los Angeles, one giant chunk of the Earth's crust slipped over another, jolting the people and structures above with a 6.7 magnitude earthquake. (On the logarithmic Richter scale, 7.0 constitutes a major earthquake. Although the Richter scale has no upper limit, the largest known shocks have had magnitudes in the 8.8 to 8.9 range.) More than twelve thousand buildings were shaken so hard they collapsed or sustained serious damage, while many of the region’s vital freeways and bridges disintegrated or were rendered impassable. Sixty people died, and Californians suffered more than $25 billion in economic losses.
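Because the magnitude scale is logarithmic, seemingly small differences in magnitude correspond to large differences in shaking and released energy. The snippet below uses the standard textbook relations (a factor of ten in ground motion and roughly 10^1.5 in energy per whole magnitude step); these are general seismology conventions, not figures from the NSF-funded studies described here.

```python
# Quick numerical aside (not from the NSF text): on a logarithmic magnitude
# scale, each whole-number step corresponds to a tenfold jump in recorded
# ground motion and, by the standard Gutenberg-Richter energy relation,
# roughly a 31.6-fold (10^1.5) jump in released energy.

def amplitude_ratio(m1, m2):
    """Ratio of ground-motion amplitudes between magnitudes m1 and m2."""
    return 10 ** (m1 - m2)

def energy_ratio(m1, m2):
    """Approximate ratio of radiated energies between magnitudes m1 and m2."""
    return 10 ** (1.5 * (m1 - m2))

# Compare the 6.7 Northridge quake with a 'major' 7.0 event and an 8.8 event.
for m in (7.0, 8.8):
    print(f"M{m} vs M6.7: amplitude x{amplitude_ratio(m, 6.7):.1f}, "
          f"energy x{energy_ratio(m, 6.7):.0f}")
```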

One year later and halfway around the world, the city of Kobe, Japan, endured its first catastrophic earthquake in a century, a 6.9 magnitude temblor. More than six thousand people died and almost two hundred thousand buildings were destroyed or damaged. Fires spread across the city while helpless firefighters failed to draw a drop of water from the shattered pipes. Besides the horrific loss of life, the devastation in Kobe cost between $100 billion and $200 billion.

The widespread destruction from these disasters has been especially alarming to experts because both cities sit atop a seismically active coastal region known as the Pacific Rim, which is capable of producing earthquakes of even greater violence. Close inspection of the rubble from both earthquake sites revealed one of the main contributing factors to the devastation: buildings with steel frames exhibited cracks at the welded joints between columns and beams.

This geological model of the 1994 Northridge earthquake was created by researchers at the NSF-funded Earthquake Engineering Research Center at the University of California at Berkeley. This three-dimensional view of the 6.7 magnitude earthquake gives scientists a better understanding of the geological forces behind earthquakes.


Experts had expected old masonry and reinforced-concrete structures to crumble, but steel-framed buildings were supposed to be relatively safe. In Kobe, the steel frames failed catastrophically: more than one in eight simply collapsed. In Northridge, more than two-thirds of the multistory steel-framed buildings suffered damage.

Immediately after these disasters, NSF-sponsored researchers put new emphasis on developing better connection designs. In five short years, researchers have learned to reduce stresses on welds by altering the joints in the frames, in some cases by perforating or trimming the projecting rims (i.e., flanges) of the steel I-beams. These safer construction techniques have been included in new building code recommendations issued by the Federal Emergency Management Agency for all U.S. buildings in earthquake-prone regions.

NSF-funded researchers are finding many other ways to make buildings safer during earthquakes. Shih Chih Liu, program director of NSF’s infrastructure and information systems program, says new high-performance concrete uses ash or small steel bars for better tensile strength and corrosion resistance. It took smart thinking to concoct the new concrete, but other work is aimed at making the buildings themselves “smart.” NSF-funded engineering professor Deborah Chung at the State University of New York at Buffalo recently invented a smart concrete that acts as a sensor capable of monitoring its own response to stress. The concrete contains short carbon fibers that lower the concrete’s tendency to resist the flow of electricity (a quality that researchers call “resistivity”). Deformations to the material—as can occur during earthquakes—cause resistivity to rise, a change that can be gauged by a simple electrical contact with the concrete. The greater the signal, the greater the presumed damage.
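In spirit, the monitoring rule is straightforward: compare the measured resistivity against an undamaged baseline and treat a larger rise as a sign of greater damage. The sketch below is a hypothetical illustration of that idea, not Chung's actual instrumentation software, and its thresholds are invented.

```python
# Hedged illustration (not the actual smart-concrete algorithm): the passage
# implies a simple monitoring rule -- track how far the concrete's electrical
# resistivity has risen above its undamaged baseline, and treat larger rises
# as a sign of greater damage. The threshold values below are made up.

def damage_indicator(baseline_ohm_m, measured_ohm_m):
    """Fractional rise in resistivity relative to the undamaged baseline."""
    return (measured_ohm_m - baseline_ohm_m) / baseline_ohm_m

def classify(rise):
    """Map a fractional resistivity rise to a coarse, illustrative damage label."""
    if rise < 0.05:
        return "no significant change"
    if rise < 0.25:
        return "possible minor cracking -- inspect"
    return "likely structural damage -- inspect immediately"

baseline = 40.0                       # assumed undamaged resistivity, ohm-meters
for reading in (40.5, 47.0, 62.0):    # hypothetical post-quake readings
    rise = damage_indicator(baseline, reading)
    print(f"reading {reading:5.1f} ohm-m: rise {rise:5.1%} -> {classify(rise)}")
```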

NSF-funded engineers have also developed systems such as swinging counterweights, which dampen the oscillations of buildings, and slippery foundations that are shaped like ball bearings in a bowl—the bearings allow the structure’s footings to shift sideways nearly independently of the structure above.

Other NSF-supported advances include the development of smart shock absorbers for buildings, bridges, and other structures. As the structure shakes or sways, electrical signals from motion sensors in the structure cause a special fluid in the shock absorbers to become thicker or thinner (ranging from the consistency of light oil to one more like pudding), depending on what’s needed to slow or speed the movement of the shock absorbers’ pistons.

How well these efforts translate into saved lives and minimized economic losses depends on how widely they are shared. In the new millennium, NSF plans to develop the Network for Earthquake Engineering Simulation (NEES)—a kind of overarching cybersystem for earthquake engineering experimental research. Through NEES, researchers around the world will be able to remotely access a complete system of laboratory and field experimentation facilities, of which there are currently more than thirty in the United States alone.

The 1995 earthquake that devastated Kobe, Japan, destroyed a section of the Nishinomiya-ko Bridge. The Kobe earthquake demonstrated the vital importance of NSF-funded research into “smart” materials and other earthquake-resistant construction techniques.


Hot Heads
Volcanoes are close cousins of earthquakes, arising as they do from the same powerful motions of the planet’s tectonic plates. Despite their fiery reputation for chaos and destruction, however, only about sixty volcanoes erupt each year, usually with more bravado than brawn. What’s more, most volcanoes are on the ocean floor, where plate boundaries are converging or spreading over “hot spots”—large subterranean pools of magma.

This is not to say that volcanoes pose no peril. Over the last three hundred years more than 260,000 people have died from volcanic activity. The 1991 eruption of the Philippines’ Mount Pinatubo killed more than three hundred people and devastated the area’s economy. When Mount St. Helens blew its stack in the state of Washington in 1980, 57 people died, nearly 7,000 big game animals were killed, more than 12 million salmon perished, forests were devastated, and the economy took a nearly $1 billion hit.

All of this reinforces the need for better ways to model and predict volcanic activity. One way to both study and monitor a volcano is to place a gas-sensing device called COSPEC along the volcano’s flanks. COSPEC (for “correlation spectrometer”) measures how much sulfur dioxide gas is escaping from the volcano’s interior. A jump in the amount of sulfur dioxide suggests an imminent eruption. Still, a few hours’ or, at most, a few days’ warning is the best that scientists can manage with current knowledge and technology. And sometimes, of course, there is no discernible warning at all.
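
In essence, a COSPEC record is a time series of sulfur dioxide emission rates, and the warning sign is a sudden jump relative to recent readings. The sketch below shows that idea in miniature; the window size, jump factor, and sample values are illustrative assumptions, not criteria actually used by volcanologists.

```python
# Illustrative sketch: flag a sudden jump in sulfur dioxide emission readings,
# in the spirit of COSPEC monitoring. Window, factor, and data are assumptions.

def so2_alert(readings_tonnes_per_day: list, window: int = 5,
              jump_factor: float = 2.0) -> bool:
    """Return True if the latest reading is at least `jump_factor` times
    the average of the preceding `window` readings."""
    if len(readings_tonnes_per_day) < window + 1:
        return False
    recent = readings_tonnes_per_day[-(window + 1):-1]
    baseline = sum(recent) / window
    return readings_tonnes_per_day[-1] >= jump_factor * baseline

if __name__ == "__main__":
    quiet = [300, 320, 310, 290, 305, 315]      # hypothetical daily readings
    restless = [300, 320, 310, 290, 305, 900]
    print(so2_alert(quiet), so2_alert(restless))   # False True
```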

In 1993, nine members of a scientific expedition who were taking gas samples died when a sudden spasm of molten rock and ash erupted from the crater of a volcano called Galeras in Colombia. The tragedy prompted one of the survivors, Stanley Williams of Arizona State University, to organize a conference that would enable scientists to standardize their methods and make data consistent from one volcano observatory to another. The 1997 NSF-funded conference brought together virtually every scientist then working with COSPEC—some twenty-five volcanologists from fourteen countries. Williams has also developed a remote-access instrument called GASPEC that measures another early-warning gas, carbon dioxide.

Other volcano-monitoring efforts funded in part by NSF include a network of seismometers (instruments that measure ground vibrations caused by earthquakes) and an array of Earth-orbiting satellites called the Global Positioning System (GPS). The GPS can alert scientists to volcano-related ground deformations at the millimeter scale—deformations that might signal an imminent eruption.

On May 18, 1980, a 5.1 magnitude earthquake shook Mount St. Helens. The bulge and surrounding area slid away in a gigantic rockslide and debris avalanche, releasing pressure and triggering a major pumice and ash eruption of the volcano. Thirteen hundred feet of the peak collapsed. As a result, 24 square miles of valley were filled by a debris avalanche; 250 square miles of recreation, timber, and private lands were damaged by a lateral blast; and an estimated 200 million cubic yards of materials were deposited into the river channels.

Page 164: America's Investment in the Future

How’s the Weather Up There?

In the spring of 1989, six million people in Canada, Sweden, and the United States lost electric power for up to nine hours thanks to stormy weather—not on Earth, but on the Sun. During particularly vigorous solar storms, billions of tons of plasma erupt from the Sun's gaseous outer layer (called the corona), speed toward Earth at hundreds of miles per second, and disrupt the Earth's magnetic field.

Although they also produce spectacularly beautiful auroras—those colorful atmospheric streamers known as the northern lights—“coronal mass ejections” constitute a poorly understood natural hazard of growing concern to the scientists at NSF’s National Center for Atmospheric Research (NCAR). That’s because the ejections are associated with features on the Sun known as sunspots, whose activity follows an eleven-year cycle. And not only is the most recent sunspot cycle expected to reach its maximum activity in the year 2000, but overall, these so-called solar maximums have become twice as powerful as they were in the early 1900s. With our civilization’s well-being tied ever more closely to power stations and satellite-based communication systems, the atmospheric disturbances triggered by solar storms pose a potentially significant threat.

From 1980 to 1989, the NSF-funded Solar Maximum Mission satellite collected the most detailed data yet on coronal mass ejections. NCAR researchers used this data to develop a new suite of observation tools that work in space and on the ground. For example, a special electronic camera called CHIP (for “chromospheric helium imaging photometer”) perches on the volcanic flanks of Hawaii’s Mauna Loa and snaps highly detailed pictures of the solar disk and corona every three minutes. These pictures are frequent enough to provide scientists with a movie loop of ejections as they develop and burst forth. Other satellite-borne instruments—some launched and retrieved by the space shuttle Discovery to escape distortions caused by Earth’s dusty atmosphere—mine the Sun’s radiation for clues about its magnetic behavior. Along with piecing together the basic science behind solar storms, these instruments should help scientists do a better job of predicting the next serious bout of bad space weather.

Page 165: America's Investment in the Future


Stormy Weather

While earthquakes and volcanoes capture much of the public’s imagination, weather-related disasters can wreak far more economic havoc. According to the Worldwatch Institute, 1998 set a new record for global economic losses related to extreme weather—$89 billion, a 48 percent increase over the previous record of $60 billion in 1996 and far more than the accumulated losses for the entire decade of the 1980s.

A major player in the world’s efforts to learn about and live with extreme weather is the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, funded by NSF’s Division of Atmospheric Sciences. Central to NCAR’s activities is the use of supercomputers to develop large-scale simulations of atmospheric and ocean dynamics. These models help to explain the formation of tornadoes, windstorms, and hurricanes, as well as more mundane climatic events. For example, in the late 1970s, NCAR researcher Joseph Klemp, working with Robert Wilhelmson of the NSF-funded supercomputing center at the University of Illinois, developed the first successful model of the most dangerous of all thunderstorms, the “supercell” storm. In a thunderstorm, air moves up and down in a turbulent mix. A single-cell storm means that there is just one updraft/downdraft component, which generally produces only moderately severe weather. A multicell storm can kick out the occasional tornado, but sometimes a main, intensely rotating updraft develops within a multicell storm and transforms it into a supercell storm capable of producing the most devastating weather, complete with violent tornadoes, raging winds, hail, and flooding.

The model developed by Klemp and Wilhelmson confirmed other researchers’ observations that this special, rotating brand of thunderstorm could develop by splitting into two separate storm cells. According to their simulation, the southern storm in the pair was the most likely to concentrate its powers to make a tornado. Meteorological modelers have since improved these simulations to the point where researchers can study the ability of rotations midway up in a thunderstorm to develop tornado-like whirls at the ground. Such work, coupled with NSF-sponsored ground reconnaissance of tornadoes, may eventually solve the mystery of how tornadoes are born, which, in turn, could lead to better warning systems.

Warning systems can save lives, but whether or not a building survives a tornado’s onslaught depends largely on how it was constructed. Since the 1970s, scientists and engineers at the NSF-funded Texas Tech (University) Institute for Disaster Research have been picking through the aftermath of tornadoes’ fury for clues about what predisposes a structure to survival. When the researchers first began their work, it was common for emergency preparedness manuals to recommend that, during a tornado, building residents open their windows so that pressure inside the building could equalize with the low-pressure interior of the approaching twister. But after much dogged detective work, the Texas Tech researchers were surprised to learn that rather than exploding from unequal pressure, the walls of homes destroyed by tornadoes appeared to flatten when winds pried up the roof, just as aerodynamic forces will lift up an airplane wing. Wind was also discovered to contribute to structural damage by blowing debris from poorly built homes into homes that were otherwise sound.

The key to survivable housing—at least in all but the worst cases of tornadoes—turns out to be roofs that are firmly anchored to walls and walls that are firmly anchored to foundations.

Researchers at the NSF-funded National Center for Atmospheric Research in Boulder, Colorado, use computer models to learn more about tornadoes, hurricanes, and other weather events. These models enable atmospheric scientists to more accurately predict when and where severe storms will hit. Greater forecasting accuracy can save lives and minimize property damage.

Page 166: America's Investment in the Future

The Human Factor

Most hurricanes kill and destroy with a surge of seawater. Not Hurricane Andrew—the monster of 1992—a storm that led to at least fifty deaths and more than $30 billion in property damage in southern Florida. Sufficient warning enabled people to evacuate from dangerous beaches even as the worst of the storm miraculously skirted downtown Miami. But Andrew pummeled south Florida with particularly intense air currents that leveled well-built homes and demolished Homestead Air Force Base. The damage from the winds was more severe than expected, given that the region’s building codes had been considered among the best in the country. As it turned out, however, enforcement of those codes had grown lax during the region’s recent building boom.

All the science-based predictions and warnings in the world will not mitigate a natural disaster made more devastating by human folly. Ironically, improved hazard warnings in the United States may be one of the factors encouraging more and more people to move to homes on the earthquake- and hurricane-prone coasts. As noted in a 1999 report by the National Research Council’s Board on Natural Disasters, 80 percent of Florida’s population now lives within 22 miles of the beach—a fivefold increase since 1950. A steady rise in the migration to cities has also made more people more vulnerable to the effects of natural disasters as they live amidst aging infrastructures increasingly susceptible to the slightest ill wind or tremor. Urban growth also translates into more pavement and less exposed soil, which forces rain to run off rather than soak into the ground and tends to increase flood damage. All of this means that the sharply upward trend in the costs of natural disasters is attributable not so much to the occurrence of more hazards but rather to human choices that place more of our structures and possessions at risk.

Sometimes, too, steps taken with the best of intentions to limit the dangers of natural hazards can turn out to amplify the problem. The intense rains and flooding that occurred in 1993 along the Mississippi River provide an example. The levees and dikes that had been built along the river to protect communities from the occasional mid-level flood allowed more development in the area and also effectively eliminated the flood plain, exacerbating the damage caused by the unusually massive surge of water in 1993.

“Disasters by Design: A Reassessment of Natural Hazards in the United States,” a five-year-long NSF-funded study, was released in the spring of 1999. The study compiled the thoughts of 132 experts on how communities can better prepare themselves by evaluating potential local threats up to two hundred years in the future, determining acceptable losses, and then planning for them.

Says study leader Dennis Mileti of the University of Colorado’s Natural Hazards Research and Applications Information Center, “We need to change the culture to think about designing communities for our great-grandchildren’s children’s children.”

Page 167: America's Investment in the Future


Wide eaves along the roofline, which can act as handles for powerful winds, should be avoided. And the researchers found that weak points in the structure, such as garage doors and opened windows, actually increase the risk of damage by inviting in winds that will blow down the opposing walls, exposing people to injury from breaking glass and flying wreckage. The advice might be simple—shut your windows during a tornado rather than open them—but it is rooted in the long investigation of complex physical forces.

Trustworthy Tools

Our ability to understand tornadoes and other natural forces is only as good as the tools researchers have to study them. One highlight in this regard is Doppler radar, developed in the mid-1970s with the help of NSF funds. Since the 1960s, meteorologists’ ability to predict violent weather patterns has depended largely on two kinds of technology: an array of orbiting space satellites that observe Earth’s environment on a scale not previously possible, and ground-based radar technology. Radar peers inside clouds for clues about their potential for severe weather by sending out electromagnetic pulses that bounce off particles and return with valuable information about the location and intensity of precipitation. Most weather radars send out signals with relatively short wavelengths that, while offering a precise picture of a cloud’s interior, can be absorbed by the very particles they’re supposed to measure. On the other hand, Doppler radar uses longer wavelengths, so that even distant weather systems will appear on the radar screen with accurately rendered intensity. What’s more, Doppler radar provides additional information (such as the velocity at which precipitation is moving) that is critical to short-term forecasting.
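
The velocity measurement rests on the standard radar Doppler relation: a target moving along the beam shifts the returned frequency by f_d = 2v/λ, so the radial speed can be recovered from the measured shift. The wavelength and shift in the sketch below are assumed values, not readings from any particular radar.

```python
# Illustrative sketch of the radar Doppler relation f_d = 2 * v / wavelength,
# rearranged to recover radial velocity from a measured frequency shift.
# The wavelength and shift used in the example are assumed values.

def radial_velocity_m_s(doppler_shift_hz: float, wavelength_m: float) -> float:
    """Radial velocity of precipitation from the measured Doppler shift."""
    return doppler_shift_hz * wavelength_m / 2.0

if __name__ == "__main__":
    wavelength = 0.10   # 10 cm, typical of longer-wavelength weather radar
    shift = 500.0       # hypothetical measured frequency shift, Hz
    print(radial_velocity_m_s(shift, wavelength))   # 25.0 m/s (toward the radar for a positive shift)
```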

In the last decade, the National Weather Service has installed Doppler radar systems at fixed locations across the country, improving meteorologists’ ability to issue timely flash flood and severe thunderstorm warnings and cutting by more than 60 percent the number of tornadoes that strike without public notice. Recently NSF-funded scientists have also begun experimenting with more-advanced mobile Doppler instruments mounted on flatbed trucks, which allow the hardier breed of researcher to chase down storms for even more precise readings.

With Doppler radar, NCAR scientists helped a University of Chicago wind expert, the late T. Theodore Fujita, to confirm in the 1980s a whole new atmospheric hazard—the microburst. Microbursts are concentrated blasts of downdrafts from thunderstorms that have been responsible for airplane crashes killing more than five hundred people in the United States. And in 1999, NCAR researchers began testing whether Doppler radar systems installed on airplanes can detect so-called convective turbulence, associated with storms and clouds, which can rip sections off small planes and injure crew and passengers.

Another significant new observing technology developed at NCAR is a probe employed by hurricane-hunting aircraft. The probes are dropped from government planes into offshore hurricanes to profile previously hard-to-measure factors such as low-level winds, pressures, and temperatures around the storm's eye. Data from these probes have greatly improved the National Weather Service’s ability to predict the course and intensity of hurricanes. Hurricanes develop over the warm tropical oceans and have sustained winds in excess of 75 miles per hour. One hundred years ago, coastal residents generally had less than a day’s warning before a hurricane struck. Today, thanks to satellites and radar, these same residents know days in advance that a hurricane is maturing and moving their way.

El Niño Bears Unwanted Gifts

Hurricanes are dramatic examples of how the atmosphere and the oceans interact to drive the course of Earth’s climate in sometimes perilous ways. Another example is El Niño, a weak warm current of water that appears for several weeks each Christmas off the coast of Ecuador and Peru.

Page 168: America's Investment in the Future


Every three to five years, however, this otherwise mild-mannered current becomes a real “hazard spawner,” says NCAR senior scientist Michael Glantz, by growing in size and strength and lasting for many months. Unusual weather conditions result as tropical monsoons that normally center over Indonesia shift eastward, influencing atmospheric wind patterns around the world. Massive fish kills, droughts, heavy rains: These are just some of the gifts that a robust El Niño can bear.

After a particularly devastating El Niño event in 1982–83, researchers vowed not to be caught off guard again. NSF coordinated a global scientific effort to set up a network of ocean-drifting, data-gathering buoys in the Pacific Ocean. In the spring of 1997, the investment paid off when the instruments began recording abnormally high temperatures off the coast of Peru, giving scientists and policymakers their first inkling of an El Niño event that would turn out to be the most devastating in fifty years. Supplemented with satellite observations, the advance warning from the buoys allowed farmers in Central and South America to steel themselves for record-breaking drought and Californians to fix their roofs before the onset of an unprecedented rainy season that also caused life-threatening floods and mudslides. Now NCAR researchers are incorporating what they’ve learned about this massive El Niño event into supercomputer-based climate models designed to simulate atmospheric circulation changes over the course of decades and even centuries. And in May 1999, NCAR began working with the United Nations Environment Programme to conduct a nineteen-month study of the impact of the 1997–98 El Niño, with the goal of developing programs to help countries better prepare themselves for the day when El Niño makes a muscular comeback.
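
At its simplest, the buoys' early warning amounts to comparing current sea-surface temperatures with the long-term average for the season and watching for a sustained warm anomaly. The climatology, threshold, and readings in the sketch below are invented for illustration and are not the criteria forecasters actually use.

```python
# Illustrative sketch: flag a sustained warm sea-surface-temperature anomaly,
# in the spirit of the Pacific buoy array's El Nino warning. The climatology,
# threshold, and readings are invented for illustration only.

def warm_anomaly_alert(sst_c: list, climatology_c: float,
                       threshold_c: float = 0.5, run_length: int = 3) -> bool:
    """Return True if the last `run_length` readings all exceed the
    long-term average by at least `threshold_c` degrees Celsius."""
    if len(sst_c) < run_length:
        return False
    return all(t - climatology_c >= threshold_c for t in sst_c[-run_length:])

if __name__ == "__main__":
    climatology = 26.0                       # assumed long-term mean for the season
    buoy = [26.1, 26.3, 26.8, 27.1, 27.4]    # hypothetical monthly readings
    print(warm_anomaly_alert(buoy, climatology))   # True
```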

A Safer Future

In recognition of the rising dangers and costs associated with natural disasters around the world, the United Nations declared the 1990s the International Decade for Natural Disaster Reduction. At the close of the decade, NSF could look back on fifty years of sponsored programs whose aims have been—and continue to be—to better understand and prepare for the kinds of extreme natural events that can prove disastrous for human communities. While the world can never be absolutely safe, its human inhabitants can at least rest easier in the knowledge that nature’s violence holds less sway in an age where scientists and engineers are working so closely together to mitigate what cannot be controlled.

To Learn More

NSF Directorate for Engineering
www.eng.nsf.gov

NSF Directorate for Geosciences
www.geo.nsf.gov/

NSF Directorate for Social, Behavioral, and Economic Sciences
www.nsf.gov/sbe

Network for Earthquake Engineering Simulation (NEES)
www.eng.nsf.gov/nees

Earthquake Engineering Research Institute (EERI)
www.eeri.org

Federal Emergency Management Agency
www.fema.gov

Mid-America Earthquake Center (at the University of Illinois at Urbana-Champaign)
http://mae.ce.uiuc.edu

Multidisciplinary Center for Earthquake Engineering Research (at the State University of New York at Buffalo)
http://mceer.buffalo.edu

National Center for Atmospheric Research (NCAR)
www.ncar.ucar.edu/ncar/

National Information Service for Earthquake Engineering (at the University of California at Berkeley)
http://nisee.ce.berkeley.edu

National Weather Service
www.nws.noaa.gov

Pacific Earthquake Engineering Research Center
http://peer.berkeley.edu

Southern California Earthquake Center
www.scec.org

U.S. Geological Survey Hazards Research
www.usgs.gov/themes/hazard.html

U.S. Geological Survey Cascades Volcano Observatory
http://vulcan.wr.usgs.gov/home.html

University of Oklahoma Center for Analysis and Prediction of Storms (CAPS)
www.caps.ou.edu

Page 169: America's Investment in the Future

About the Photographs

The National Science Foundation is profoundly grateful to photographer and NSF grantee Felice Frankel for her contribution to America’s Investment in the Future. Her photographs are compelling visual metaphors for the scientific and technological advances celebrated in this book. While they are not literal representations of the research described, Frankel’s images enable the Foundation to communicate the dramatic impact of the basic research it advances.

Frankel is an artist-in-residence and research scientist at the Massachusetts Institute of Technology. Her first NSF grant was awarded in 1997 for “Envisioning Science,” a project in which she works with students and researchers to raise the standards in scientific imaging and visual expression of data. She is writing a handbook for scientists on how to communicate their research through accurate and compelling images. MIT Press will publish Envisioning Science in 2001. Frankel and colleagues from MIT and around the country are also establishing an initiative to promote new collaborations among researchers, imaging experts, and science writers. The initiative will begin with the Image and Meaning: Envisioning and Communicating Science and Technology conference from June 13-16, 2001 (http://web.mit.edu/i-m/). The conference is partially funded by NSF in partnership with various corporations.

Frankel has been a Guggenheim fellow and a Loeb Scholar at Harvard University, and has received grants from the National Endowment for the Arts, the Camille and Henry Dreyfus Foundation, and the Graham Foundation. In 1997, Frankel co-authored On the Surface of Things: Images of the Extraordinary in Science with George Whitesides, National Medal of Science winner and Mallinckrodt Professor of Chemistry at Harvard University. Frankel, who began her career as a landscape photographer, also wrote the award-winning Modern Landscape Architecture: Redefining the Garden.


Page 170: America's Investment in the Future


Internet
This extreme close-up of a computer monitor screen symbolizes the transforming power of the Internet, which NSF was instrumental in building. The interlocking pixels suggest the complex network of processors, packets, switches, and wires that make up the global network of networks.

Advanced Materials
Miniaturized wireless communication devices, nonlinear optical crystals, artificial skin, and thin metals are just some of the discoveries made possible by NSF’s longstanding support of materials research. The freestanding origami-like microstructures in this photograph were formed by printing a pattern of thin metal on glass capillaries.

From the laboratory of Rebecca Jackman, Scott Brittain, and George Whitesides, Department of Chemistry and Chemical Biology, Harvard University.

Education
NSF-funded education projects and information technologies capture the imagination of students and spark their interest in science, mathematics, engineering, and technology. This scanning electron micrograph of a CD-ROM, taken at MIT’s Microsystems Technology Laboratories, reveals the dots and empty spaces (the 1s and 0s) of the disk’s binary code. The result: everything from the music of Mozart to the adventures of The Magic School Bus®.

Micrograph taken with the help of postdoctoral fellow Albert Folch, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology.

Manufacturing
A microrotor, the blades of which are depicted here, is just one example of MEMS, or microelectromechanical systems—machines built at diameters less than a human hair. Part of a microscale revolution in instrumentation design, MEMS has become a multibillion-dollar industry thanks in large part to early, basic research funded by NSF.

Research from the laboratory of Martin Schmidt, Stephen Senturia, and Chuang-Chia Lin, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology.

Arabidopsis
Through the eye of the photographer, even the mustard weed, Arabidopsis thaliana, becomes a work of art. With support from NSF, plant biologists the world over are cooperating in research to create a genetic map of Arabidopsis and thereby unlock the mysteries of all flowering plants.

From the laboratory of Gerald Fink, Whitehead Institute, Massachusetts Institute of Technology.

Decision Sciences
A femtosecond is one quadrillionth of a second. In this photo, femtosecond laser pulses have created micron-sized holes in quartz. NSF-funded mathematicians have systematically studied the complex decision-making pathways that lead to winning strategies—femtoseconds of thought captured in elegant, timeless theorems.

From the laboratory of Eric Mazur and Eli Glezer, Harvard University.

Page 171: America's Investment in the Future

Disasters & Hazard Mitigation
This photograph shows cracking of a silicon chip that was previously deposited by a plasma-enhanced chemical vapor. By understanding the conditions that cause such cracking, researchers gain new insight into silicon microfabrication processes required to create high-powered micromachines. Just as this laboratory “disaster” leads to a greater understanding of matter and manufacturing, so NSF-funded research helps us better understand and mitigate the effects of natural disasters.

Research from the laboratory of Martin Schmidt and Arturo Ayon, Massachusetts Institute of Technology.


Visualization
Computer visualization techniques pioneered by NSF-funded researchers reveal otherwise hidden details of complex events—the evolution of a storm, the beating of a heart. For this image, Frankel photographed a printed version of the ferrofluid pictured on the cover. The pattern serves as another visualization of the ferrofluid’s gryphon-like nature.

Environment
What better symbol is there of the environment’s enduring, yet fragile, nature than the butterfly? Nightly, the tropical morpho butterfly folds its wings to display camouflaging browns and grays. But sunlight striking the morpho’s wings at just the right angle reflects a brilliant, mate-attracting blue, as captured here by Frankel. The environment in all its mysterious beauty remains a vital focus of NSF-sponsored research.

Astronomy
The human race has long yearned to explore and understand worlds beyond our own. This close-up of a telescope lens, so suggestive of a planet’s curving horizon, celebrates the many avenues of astronomical research, including the support of observatories worldwide, made possible with support from NSF.

Lens courtesy of Philip and Phylis Morrison, Massachusetts Institute of Technology.

Science on the Edge
Scientists and engineers endure harsh conditions at the Earth’s icy Poles to find abundant answers to many questions, from global climate change to the cosmic origins of life. Residing in relative comfort, Frankel shot this image of ice crystals condensing and growing on her windowpane in winter.

Page 172: America's Investment in the Future


Photo Captions and Credits

Foreword
Page v: CBS Inc. (top)
Page v: News Office/Woods Hole Oceanographic Institution

Introduction
Page 1: Sam Kittner
Page 3: Anwar Huq

Internet
Page 6: NSF funding played a role in the development of the fiber optic technology that powers today’s Internet. Photographer: Ken Reid/FPG International
Page 7: Greg Foss/Pittsburgh Supercomputing Center
Page 9: Photo illustration by Adam Saynuk
Page 10: Frank Summers/Princeton University, National Center for Supercomputing Applications
Page 11: D. Cox and B. Patterson/National Center for Supercomputing Applications
Page 13: Photo illustration by Adam Saynuk
Page 15: PhotoDisc
Page 16: Electronic Visualization Laboratory/University of Illinois at Chicago

Advanced Materials
Page 20: PhotoDisc
Page 21: NSF Collection
Page 22: Bruce Novak/University of Massachusetts
Page 23: The strong, flexible buckyball molecule. NSF Collection
Page 24: PhotoDisc
Page 25: San Diego Supercomputer Center, NSF Collection
Pgs. 26-27: PhotoDisc
Page 28: Peter Howard/Rice University
Page 30: NSF Collection

Education
Page 34: Visitors go Beyond Numbers and investigate the stretches, curves, and angles of mathematics at the New York Hall of Science. The traveling exhibit was developed at the Maryland Science Center. Peter Howard
Page 37: PhotoDisc
Page 38: It’s About Time Publishing, Inc.
Page 39: Diane Soderholm/MIT
Page 40: San Francisco Exploratorium
Page 41: PhotoDisc
Page 42: Bill Winn/University of Washington
Page 43: Scholastic’s The Magic School Bus®, the Emmy award-winning animated science adventure series for children based on Scholastic’s best-selling book series of the same name. Scholastic and The Magic School Bus® are trademarks of Scholastic Inc. © Scholastic Inc. All rights reserved.
Page 44: Teachers Experiencing Antarctica/Rice University
Page 45: PhotoDisc

Manufacturing
Page 50: At the NSF-supported Laboratory for Manufacturing and Productivity at MIT, researchers have developed a three-dimensional printing (3DP) process for the rapid and flexible production of parts and tools. 3DP works by building parts in layers from a computer (CAD) model. In the part shown here, a surface texture was defined in CAD and then mapped onto different solids. Such surface textures can be used to enhance heat transfer or create a prescribed surface roughness, among other things. Emanuel M. Sachs/MIT

The National Science Foundation is grateful to the many scientists, engineers, and institutions that have submitted photographs of their work to NSF. Over the years, these contributions have enabled NSF to build a collection of images (the NSF Collection) that enables the Foundation to better communicate about the research and education programs it funds.

Page 173: America's Investment in the Future


Page 51: Rodney Hill/Engineering Research Center, University of Michigan
Page 53: NSF Collection
Page 54: John Consoli/University of Maryland
Page 57: NSF supported a study of molten glass. The glass was pulled into microthin optical fibers that can carry 1,000 times more information than an electrical wire. Photo courtesy of Corning Glass Works
Page 58: CyberCut Research Team/University of California at Berkeley
Page 60: Ron LeBlanc/University of Arizona

Arabidopsis
Page 64: Felice Frankel/Laboratory of Gerald Fink, Whitehead Institute, Massachusetts Institute of Technology
Page 65: Martin Yanofsky/University of California at San Diego
Page 66: Martin Yanofsky/University of California at San Diego
Page 67: Close-up of an Arabidopsis flower. Martin Yanofsky/University of California at San Diego
Page 68: Martin Yanofsky/University of California at San Diego
Page 69: Close-up of Arabidopsis cells. Martin Yanofsky/University of California at San Diego
Page 70: Martin Yanofsky/University of California at San Diego
Page 71: Martin Yanofsky/University of California at San Diego
Page 73: Another cellular close-up of Arabidopsis. Martin Yanofsky/University of California at San Diego
Page 74: Martin Yanofsky/University of California at San Diego

Decision Sciences
Page 78: PhotoDisc
Page 81: PhotoDisc
Page 83: Dale Glasgow
Page 84: Dale Glasgow
Page 85: PhotoDisc
Page 86: Dale Glasgow

Visualization
Page 90: PhotoDisc
Page 91: Charles Peskin and David McQueen/Pittsburgh Supercomputing Center
Page 93: PhotoDisc
Page 94: Electronic Visualization Laboratory/University of Illinois at Chicago
Page 95: Bill Wiegand/University of Illinois at Urbana-Champaign
Pgs. 96-97: This image from a CAVE virtual reality roller coaster ride lets viewers design and ride their roller coaster. Visualization courtesy of Jason Leigh/Electronic Visualization Laboratory, University of Illinois at Chicago
Page 99: The mathematics of fractals has given scientists and engineers the ability to visualize and understand complex systems. NSF Collection
Page 100: John Rosenberg/Pittsburgh Supercomputing Center

Environment
Page 104: T.W. Pietsch
Page 105: Alan K. Knapp/Kansas State University
Page 107: Diane Hopp/Central Arizona – Phoenix LTER Project, Arizona State University
Page 109: NSF Collection
Page 110: North Temperate Lakes/University of Wisconsin LTER
Page 111: A view from the top of the crane at the Wind River Canopy Research Facility on the Olympic Peninsula in Washington. The crane is used to help study forest canopy organisms and interactions. Jerry Franklin
Page 113: G. David Tilman/Cedar Creek Natural History Area, University of Minnesota
Page 115: Close-up of a chytrid, a little known group of fungi linked with frog deaths in Australia and Panama. Martha J. Powell/University of Alabama

Page 174: America's Investment in the Future


Astronomy
Page 120: NSF Collection
Page 121: National Radio Astronomy Observatory
Page 122: Association of Universities for Research in Astronomy, Inc. (AURA). All rights reserved.
Page 123: The head-on collision of two neutron stars. This is an extract from a more complete analysis of the changes in pressure and density that occur from the collision and eventual coalescence of two stars that have reached the final phase in their evolution. Charles Evans, California Institute of Technology; visualization by Ray Idaszak and Donna Cox, Illinois Supercomputer Center.
Page 124: Dr. Robert Mallozzi/University of Alabama in Huntsville and Marshall Space Flight Center
Page 125: NSF Collection
Page 126: Tom Sebring/AURA/Gemini Observatory/NOAO/NSF
Page 127: NSF Collection
Page 128: WIYN/NOAO/NSF
Page 129: NSF Collection
Page 130: Gregory Bryan and Michael Norman/National Center for Supercomputing Applications
Page 131: Trina Roy and Jon Goldman/Electronic Visualization Laboratory, University of Illinois at Chicago

Science on the Edge
Page 134: PhotoDisc
Page 135: Office of Polar Programs, National Science Foundation
Page 136: Scott Borg/National Science Foundation, Office of Polar Programs
Page 137: Office of Polar Programs, National Science Foundation
Page 138: Henry Huntington
Page 140: Sandra Hines/University of Washington
Page 141: Antarctica as photographed from the spacecraft Galileo. NASA
Page 142: Ferrar Glacier, Stuart Klipper
Page 143: USS Hawkbill, 1998 Commander Submarine Force, U.S. Pacific Fleet
Page 144: NASA

Disasters & Hazard Mitigation
Page 148: The remains of trees leveled by the eruption of Mount St. Helens. David E. Wieprecht/USGS
Page 149: University Corporation for Atmospheric Research, Inc.
Page 151: A sand dune on Hog Island, one of NSF’s Long-Term Ecological Research Program sites. Bruce Hayden/University of Virginia/NSF
Page 152: Northridge Collection/Earthquake Engineering Research Center/University of California, Berkeley
Page 153: Christopher R. Thewalt/Great Hanshin Bridge Collection/Earthquake Engineering Research Center/University of California, Berkeley
Page 154: Austin Post/U.S. Geological Survey
Page 155: An x-ray satellite image of a solar storm. NASA/NSSDC
Page 156: Pittsburgh Supercomputing Center
Page 157: PhotoDisc

About the Photographs
Page 160: Fabrik Studios Ltd.

Acknowledgments
Page 170: Photo illustration by Adam Saynuk
Page 171: Emanuel M. Sachs/MIT
Page 172: Martin Yanofsky/University of California at San Diego
Page 173: NSF Collection at the U.S. Geological Survey

About the National Science Foundation
Page 173: ©ImageCatcher News/Christy Bowe 2000

Page 175: America's Investment in the Future


Acknowledgments

The credit for the discoveries highlighted in America’s Investment in the Future belongs to the thousands of scientists, engineers, educators, universities, and research centers that the National Science Foundation has supported since 1950. Just as advances in science and engineering are the result of collaboration, so, too, is this book celebrating the Foundation’s first fifty years.

America’s Investment in the Future was developed by NSF’s Office of Legislative and Public Affairs (OLPA), under the guidance of Acting Director Michael Sieverts. Ellen Weir, acting head of OLPA’s Communications Resources Section, is the project director.

The book reflects the vision of former OLPA Director Julia A. Moore, currently a public policy scholar at the Woodrow Wilson International Center for Scholars. Stacy Springer, former head of the Communications Resources Section, oversaw the project through much of its development.

NSF is grateful to Low + Associates, Inc. for its communications expertise. Terry Savage directed a talented team of writers, editors, and designers that included Cindy Lollar, Adam Saynuk, Chris Leonard, Susan Lopez Mele, Scott Allison, and Christine Enright Henke.

The Foundation thanks Mike Cialdella of The Foundry for his print management expertise, and also acknowledges the excellent prepress and printing services of Hoechstetter Printing.

NSF expresses its thanks both to the authors and editors who researched and wrote about the scientists, engineers, teachers, and others who, with support from NSF, made discoveries that have changed the way we live, and to the many reviewers, who ensure that what we say is true.

Internet
AUTHORS: Suzanne Harris and Amy Hansen
EDITORS: Amy Hansen and Ellen Weir
REVIEWERS

Mike Bailey, San Diego Supercomputer Center

William Bainbridge, National Science Foundation

Jerome Daen, National Science Foundation

Tom DeFanti, University of Illinois

Tom Finholt, University of Michigan

Tom Garritano, National Science Foundation

Charles Goodrich, University of Maryland

Ellen Hoffman, Merit Network

Jack Johnson, Scripps Research Institute

Larry Landweber, University of Wisconsin

Mark Luker, EDUCAUSE

David Mills, University of Delaware

George Strawn, National Science Foundation

Ken Weiss, Pennsylvania State University

Stephen Wolff, Cisco Systems

Paul Young, Cisco Systems


Advanced Materials
AUTHOR: Suzanne Harris
EDITORS: Amy Hansen and Ellen Weir
REVIEWERS

Norbert M. Bikales, National Science Foundation

Robert Curl, Rice University

Alan Gent, University of Akron

W. Lance Haworth, National Science Foundation

Alan Heeger, University of California at Santa Barbara

Art Heuer, Case Western Reserve University

Lynn Jelinsky, Cornell University

Joseph P. Kennedy, University of Akron

Harold Kroto, Sussex University

David Lee, Cornell University

Andrew J. Lovinger, National Science Foundation

Alan MacDiarmid, University of Pennsylvania

Douglas Osheroff, Stanford University

Lynn Preston, National Science Foundation

Donald Paul, University of Texas

Robert Richardson, Cornell University

Page 176: America's Investment in the Future


Education
AUTHOR AND EDITOR: Cindy Lollar
REVIEWERS

David Anderson, Wake Forest University

William Blanpied, National Science Foundation

John Bradley, National Science Foundation

Jane Butler Kahle, National Science Foundation

Roosevelt Calbert, National Science Foundation (retired)

John Cherniavsky, National Science Foundation

Daryl Chubin, National Science Board Office

Susan Duby, National Science Foundation

Arthur Eisenkraft, Bedford Public Schools

Elissa Elliott, TEA teacher

Hughes Pack, Northfield Mount Hermon School

Lynn Preston, National Science Foundation

Lawrence Scadden, National Science Foundation

Susan Snyder, National Science Foundation

Dorothy Stout, National Science Foundation

Jane Stutsman, National Science Foundation

Wayne Sukow, National Science Foundation

Judy Sunley, National Science Foundation

Linda Walker, Cobb Middle School

Gerry Wheeler, National Science Teachers Association

Arabidopsis
AUTHOR: Suzanne Harris
EDITORS: Amy Hansen and Ellen Weir
REVIEWERS

Machi Dilworth, National Science Foundation

David Meinke, Oklahoma State University

Elliot Meyerowitz, California Institute of Technology

DeLill Nasser, National Science Foundation

Chris Somerville, Carnegie Institution of Washington

Manufacturing
AUTHORS: Bruce Schechter and Cindy Lollar
EDITOR: Cindy Lollar
REVIEWERS

Joseph Bordogna, National Science Foundation

Morris Cohen, Wharton School, University of Pennsylvania

Robert Graves, Rensselaer Polytechnic Institute

George Hazelrigg, National Science Foundation

Bruce Kramer, National Science Foundation

Louis Martin-Vega, National Science Foundation

Lynn Preston, National Science Foundation

Mihail C. Roco, National Science Foundation

Herbert Voelcker, Cornell University

Eugene Wong, University of California at Berkeley

Paul Wright, University of California at Berkeley

Advanced Materials (continued)
Richard Smalley, Rice University

Richard Stein, University of Massachusetts

Ulrich Strom, National Science Foundation

Samuel Stupp, University of Illinois at Urbana-Champaign

Thomas A. Weber, National Science Foundation

Ioannis Yannas, Massachusetts Institute of Technology

Page 177: America's Investment in the Future


Visualization
AUTHORS: Sheila Donoghue and Suzanne Harris
EDITORS: Amy Hansen and Ellen Weir
REVIEWERS

Tom DeFanti, University of Illinois

Don Greenberg, Cornell University

Richard S. Hirsh, National Science Foundation

Anne Morgan Spalter, Brown University

Andries van Dam, Brown University

Environment
AUTHORS: Mari Jensen and Cindy Lollar
EDITOR: Cindy Lollar
REVIEWERS

Sandy J. Andelman, University of California at Santa Barbara

Scott Collins, National Science Foundation

Charles Driscoll, Syracuse University

Cheryl Dybas, National Science Foundation

Penelope Firth, National Science Foundation

Nancy Grimm, CAP LTER, Arizona State University

W. Franklin Harris, University of Tennessee at Knoxville

Timothy K. Kratz, University of Wisconsin at Madison

Gene Likens, Institute of Ecosystem Studies

Jane Lubchenco, Oregon State University

Robert Parmenter, Sevilleta LTER, University of New Mexico

Steward Pickett, Baltimore LTER

Martha J. Powell, University of Alabama in Tuscaloosa

Charles L. Redman, Arizona State University

Joann Roskoski, National Science Foundation

James Rodman, National Science Foundation

David Tilman, University of Minnesota

Jianguo Wu, Arizona State University West

Terry Yates, Sevilleta LTER, University of New Mexico

Decision Sciences
AUTHORS: Suzanne Harris and Peter Gwynne
EDITOR: Terry Savage
ILLUSTRATIONS: Dale Glasgow
REVIEWERS

Colin Camerer, California Institute of Technology

Catherine Eckel, National Science Foundation

Daniel Kahneman, Princeton University

Howard Kunreuther, Wharton School, University of Pennsylvania

Jonathan Leland, National Science Foundation

Paul Milgrom, Stanford University

Dan Newlon, National Science Foundation

Charlie Plott, California Institute of Technology

Al Roth, University of Pittsburgh

Page 178: America's Investment in the Future


Disasters & Hazard Mitigation
AUTHOR: Jeff Rosenfeld
EDITOR: Cindy Lollar
REVIEWERS

William Anderson, National Science Foundation

Anatta, University Corporation for Atmospheric Research

Deborah Chung, University of Buffalo

Jay Fein, National Science Foundation

Michael Glantz, National Center for Atmospheric Research

Donald Goralski, Multidisciplinary Center for Earthquake Engineering, University of Buffalo

Tom Henyey, Southern California Earthquake Center, University of Southern California

Michael Knolker, National Center for Atmospheric Research

Shih Chi Liu, National Science Foundation

Stephen Mahin, University of California at Berkeley

Kishor Mehta, Texas Tech University

Vanessa Richardson, National Science Foundation

Kathleen Tierney, University of Delaware

Lucy Warner, National Center for Atmospheric Research

James Whitcomb, National Science Foundation

Stan Williams, Arizona State University

Stephen Zebiak, Lamont-Doherty Earth Observatory, Columbia University

Astronomy
AUTHORS: James White, Astronomical Society of the Pacific, and Suzanne Harris
EDITOR: Terry Savage
REVIEWERS

Morris Aizenman, National Science Foundation

Gregory D. Bothun, University of Oregon

Gregory Bryan, National Center for Supercomputing Applications

R. Paul Butler, San Francisco State University

J. Richard Fisher, National Radio Astronomy Observatory

Wendy Freedman, Carnegie Observatories

Andrea Ghez, University of California at Los Angeles

John Leibacher, National Optical Astronomy Observatories; Director of Global Oscillation Network Group (GONG)

Geoffrey Marcy, San Francisco State University

Jeremy Mould, Mount Stromlo and Siding Spring Observatories

Michael Norman, National Center for Supercomputing Applications

Vera Rubin, Carnegie Institution of Washington

Bernard Sadoulet, University of California at Berkeley

Walter Stockwell, University of California at Berkeley

R. Brent Tully, University of Hawaii

Alexander Wolszczan, Pennsylvania State University

Science on the Edge
AUTHORS: Guy Guthridge, Faye Korsmo, and Suzanne Harris
EDITOR: Cindy Lollar
REVIEWERS

Erick Chiang, National Science Foundation

Karl Erb, National Science Foundation

John Lynch, National Science Foundation

Lynn Simarski, National Science Foundation

Paul Young, University of Washington

Page 179: America's Investment in the Future


National Science Board Members

Dr. John A. Armstrong
IBM Vice President for Science & Technology (retired)

Dr. Nina V. Fedoroff
Willaman Professor of Life Sciences and Director, Life Sciences Consortium and Biotechnology Institute, Pennsylvania State University

Dr. Pamela A. Ferguson
Professor of Mathematics, Grinnell College, Grinnell, Iowa

Dr. Mary K. Gaillard
Professor of Physics, Theory Group, Lawrence Berkeley National Laboratory

Dr. M.R.C. Greenwood
Chancellor, University of California, Santa Cruz

Dr. Stanley V. Jaskolski
Chief Technology Officer and Vice President, Technical Management, Eaton Corporation, Cleveland, Ohio

Dr. Anita K. Jones, Vice Chair
Lawrence R. Quarles Professor of Engineering and Applied Science, University of Virginia

Dr. Eamon M. Kelly, Chair
President Emeritus and Professor, Payson Center for International Development & Technology Transfer, Tulane University

Dr. George M. Langford
Professor, Department of Biological Science, Dartmouth College

Dr. Jane Lubchenco
Wayne and Gladys Valley Professor of Marine Biology and Distinguished Professor of Zoology, Oregon State University

Dr. Joseph A. Miller, Jr.
Senior Vice President for R&D and Chief Technology Officer, E.I. du Pont de Nemours & Company, Experimental Station

Dr. Diana S. Natalicio
President, University of Texas at El Paso

Dr. Robert C. Richardson
Vice Provost for Research and Professor of Physics, Cornell University

Dr. Michael G. Rossmann
Hanley Professor of Biological Sciences, Purdue University

Dr. Vera C. Rubin
Research Staff, Astronomy, Department of Terrestrial Magnetism, Carnegie Institution of Washington

Dr. Maxine Savitz
General Manager, Technology Partnerships, Honeywell

Dr. Luis Sequeira
J.C. Walker Professor Emeritus, Departments of Bacteriology and Plant Pathology, University of Wisconsin at Madison

Dr. Daniel Simberloff
Nancy Gore Hunger Professor of Environmental Science, University of Tennessee

Dr. Bob H. Suzuki
President, California State Polytechnic University

Dr. Richard Tapia
Noah Harding Professor of Computational & Applied Mathematics, Rice University

Dr. Chang-Lin Tien
NEC Distinguished Professor of Engineering, University of California at Berkeley

Dr. Warren M. Washington
Senior Scientist and Section Head, National Center for Atmospheric Research

Dr. John A. White, Jr.
Chancellor, University of Arkansas at Fayetteville

Dr. Mark S. Wrighton*
Chancellor, Washington University

Dr. Rita R. Colwell
(Member Ex Officio and Chair, Executive Committee)
Director, National Science Foundation

Dr. Marta Cehelsky
Executive Officer

*NSB nominee pending U.S. Senate confirmation.

Page 180: America's Investment in the Future


National Science Foundation Executive Staff

Dr. Rita R. Colwell, Director
Dr. Joseph Bordogna, Deputy Director
Office of the Director

Dr. Mary E. Clutter, Assistant Director
Directorate for Biological Sciences

Dr. Ruzena Bajcsy, Assistant Director
Directorate for Computer and Information Sciences and Engineering

Dr. Judith S. Sunley, Interim Assistant Director
Directorate for Education and Human Resources

Dr. Louis A. Martin-Vega, Acting Assistant Director
Directorate for Engineering

Dr. Margaret S. Leinen, Assistant Director
Directorate for Geosciences

Dr. Robert A. Eisenstein, Assistant Director
Directorate for Mathematical and Physical Sciences

Dr. Norman M. Bradburn, Assistant Director
Directorate for Social, Behavioral, and Economic Sciences

Dr. Karl A. Erb, Director
Office of Polar Programs

Ms. Ana A. Ortiz, Equal Opportunity Coordinator

Mr. Lawrence Rudolph, Esq., General Counsel

Dr. Christine C. Boesz, Inspector General

Dr. Nathaniel G. Pitts, Director
Office of Integrative Activities

Mr. Michael C. Sieverts, Acting Director
Office of Legislative and Public Affairs

Mr. Thomas N. Cooley, Director
Office of Budget, Finance, and Award Management

Ms. Linda P. Massaro, Director
Office of Information and Resource Management

Page 181: America's Investment in the Future


About the National Science Foundation

NSF is an independent federal agency created by the National Science Foundation Act of 1950, as amended. Its aim is to promote and advance progress in science and engineering research and education in the United States. The idea of such a foundation was an outgrowth of the important contributions made by science and technology during World War II. Among federal agencies that provide funds for basic research, only NSF is responsible for strengthening the overall health of U.S. science and engineering across all fields. In contrast, other agencies support inquiry focused on a specific mission such as defense or energy. The NSF focus on basic research supports these and other missions, as well as the advance of fundamental knowledge for humankind. NSF leads the nation’s efforts to achieve excellence in science, mathematics, engineering, and technology education at all levels. The Foundation is committed to ensuring that the United States has a strong cadre of scientists, engineers, and science educators; a workforce that is scientifically and mathematically literate; and a public that fully understands basic concepts of science, engineering, and technology.

NSF funds research and education in science and engineering through grants, contracts, and cooperative agreements to about 1,600 colleges, universities, K-12 schools, academic consortia, nonprofit institutions, small businesses, and other research institutions in all parts of the United States. NSF is one of the federal government’s most cost-effective agencies. Its internal operations consume about 4 percent of its total budget, leaving more than 96 percent for investment in merit-reviewed research and education projects. In the 1999 fiscal year, NSF invested $2.8 billion in research and $614.7 million in education activities. While NSF’s budget accounts for only about 3 percent of the total federal expenditure on research, the Foundation provides half of the federal support to academic institutions for nonmedical basic research. Not only does NSF-sponsored research result in new knowledge and technologies, it also helps to educate future generations of scientists, engineers, educators, and other technically trained professionals.

Page 182: America's Investment in the Future


Through its investments—in future generations, in merit-reviewed research and education projects, and in the extensive distribution of new knowledge—NSF is committed to enhancing the nation’s capacity for achieving excellence in all fields of science and engineering, thereby ensuring new sources of prosperity and opportunity for all Americans.

NSF welcomes proposals from all qualified scientists, engineers, and educators. The Foundation strongly encourages women, minorities, and persons with disabilities to compete fully in its programs. In accordance with federal statutes, regulations, and NSF policies, no person on grounds of race, color, age, sex, national origin, or disability shall be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving financial assistance from NSF (unless otherwise specified in the eligibility requirements for a particular program).

Facilitation Awards for Scientists and Engineers with Disabilities (FASED) provide funding for special assistance or equipment to enable persons with disabilities (investigators and other staff, including student research assistants) to work on NSF-supported projects. See the program announcement or contact the program coordinator at (703) 292-6865.

NSF has Telephonic Device for the Deaf (TDD) and Federal Relay Service (FRS) capabilities that enable individuals with hearing impairments to communicate with the Foundation regarding NSF programs, employment, or general information. TDD may be accessed at (703) 292-5090 or through FRS at 800-877-8339.

The National Science Foundation is committed to making all of the information we publish easy to understand. If you have a suggestion about how to improve the clarity of this document or other NSF-published materials, please contact us at [email protected].

Page 183: America's Investment in the Future
Page 184: America's Investment in the Future
Page 185: America's Investment in the Future

4201 Wilson Blvd.
Arlington, VA 22230
703-292-5111
TDD 703-292-5090

www.nsf.gov
NSF 00-50

Design: Low + Associates Inc.