Virtual Reality and Technologies for Combat Simulation


Virtual Reality and Technologies for Combat Simulation

September 1994

OTA-BP-ISS-136
NTIS order #PB95-109658

GPO stock #052-003-01397-5

Recommended Citation: U.S. Congress, Office of Technology Assessment, Virtual Reality and Technologies for Combat Simulation--Background Paper, OTA-BP-ISS-136 (Washington, DC: U.S. Government Printing Office, September 1994).

Foreword

Virtual reality (VR) is the popular name for an absorbing, interactive, computer-mediated experience in which a person perceives a synthetic (i.e., simulated) environment by means of special human-computer interface equipment and interacts with simulated objects in that environment as if they were real. Several persons can see one another and interact in a shared synthetic environment, such as a synthetic battlefield. For over a decade the Department of Defense (DOD) has been developing and expanding virtual battlefields to be used both for training and to develop combat systems and doctrine. This background paper describes applications of synthetic-environment technologies in simulating combat. It traces technology development from the 1929 Link Trainer through the SAGE air defense system, the first head-mounted display, and the Defense Advanced Research Projects Agency’s SIMNET simulator networking project.

Synthetic-environment technology is dual-use. Research funded by DOD seeded the field; now there is a large commercial market, and DOD is actively exploiting the dynamism and efficiency of that market. Advances in synthetic-environment technologies such as computer image generation are reducing the costs of cockpit simulators and facilitating other applications. This paper describes technical challenges and discusses issues of validation, standardization, scalability, flexibility, effectiveness, cost-effectiveness, and infrastructure.

This background paper is the first of several publications of the Office of Technology Assessment’s (OTA’s) assessment of combat modeling and simulation, which was requested by Representatives Ronald V. Dellums (Chairman) and Floyd Spence (Ranking Minority Member) of the House Committee on Armed Services, Senators Sam Nunn (Chairman) and Strom Thurmond (Ranking Minority Member) of the Senate Committee on Armed Services, and Senators Jeff Bingaman (Chairman) and Bob Smith (Ranking Minority Member) of its Subcommittee on Defense Technology, Acquisition, and Industrial Base.

In undertaking this assessment, OTA sought the contributions of a wide spectrum of knowledgeable individuals and organizations. OTA gratefully acknowledges their contributions of time and intellectual effort. OTA also appreciates the help and cooperation of officials of the Department of Defense and the Department of Energy. As with all OTA publications, the content of this background paper is the sole responsibility of the Office of Technology Assessment and does not necessarily represent the views of our advisors or reviewers.

ROGER C. HERDMAN
Director


Advisory Panel

George Rathjens, Chair
Massachusetts Institute of Technology

Donald Blumenthal
Lawrence Livermore National Laboratory

Jerome Bracken
Yale University

Edward C. Brady
Strategic Perspectives, Inc.

David R. Cheriton
Stanford University

Paul K. Davis
The RAND Corp.

Col. T. N. Dupuy
The Dupuy Institute

John Englund
Analytic Services, Inc.

Joseph P. Fearey
National Aeronautics and Space Administration, Jet Propulsion Laboratory

Amoretta M. Hoeber
AMH Consulting

John D. Kettelle

Frank Lanza
LORAL Corp.

Creve Maples
Sandia National Laboratory

Jed Marti
The RAND Corp.

Duncan Miller
MIT Lincoln Laboratory

Stuart H. Starr
The MITRE Corp.

Larry Stone
Metron, Inc.

Jack Thorpe
Science Applications International Corp.

Verena S. Vomastic
Electrospace Systems, Inc.

Jordan Weisman
Virtual World Entertainment

Note: OTA appreciates and is grateful for the valuable assistance and thoughtful critiques provided by the advisory panel members. The panel does not, however, necessarily approve, disapprove, or endorse this background paper. OTA assumes full responsibility for the background paper and the accuracy of its contents.


Project Staff

Peter Blair
Assistant Director, OTA
Industry, Commerce, & International Security Division

Alan Shaw
Program Manager
International Security and Space Program

PRINCIPAL STAFF

Brian McCue, Project Director
Michael Callaham, Senior Analyst
Darian Unger, Intern

ADMINISTRATIVE STAFF

Jacqueline Robinson-Boykin, Office Administrator
Ellis Lewis, Administrative Secretary

Contents

Focus: High-Tech Combat Simulation
Why Congress Cares
Milestones in Simulation Technology
    Link Trainers (1929)
    The Whirlwind Computer (1946) and the SAGE Air Defense System (1958)
    The "Sword of Damocles" Head-Mounted Display (1966)
Virtual Reality
    Types of VR Systems
    Cockpit Simulators
    Image Generators
    Manipulation and Control Devices
    Position Tracking
    Stereo Vision
    Head-Mounted Displays
    Haptic Transducers
    Degrees of Virtual Reality
    Shared Virtual Reality
Observations
    Synthetic Environment Technology Is Dual-Use
    Cost-Effectiveness Is Increasing
Challenges
    High-Density, Color Flat-Panel Displays
    Fast Head Tracking
    Wideband Networks With Low Latency
    Multilevel Network Security
    Automating Object and World Description for Scene Generators
    Simulating Infantry and Noncombatants
Issues
    Validation
    Standardization
    Scalability
    Flexibility
    Effectiveness
    Cost-Effectiveness
    Infrastructure
References and Bibliography
A. Acknowledgments
B. Glossary of Acronyms

Virtual Reality and Technologies for Combat Simulation

Virtual reality (VR) is the popular name for an absorbing, interactive, computer-mediated experience in which a person, or persons, perceives and interacts through sensors and effecters with a synthetic (i.e., simulated) environment, and with simulated objects in it, as if it were real. The experience is characterized by inattention to its artificiality. The experience is provided by sensory stimuli generated by special human-computer interface (HCI) systems in response to the user’s movements or speech, which are detected by other interface systems and interpreted by computer.

HCI systems have become the visible symbols of VR. The most distinctive system is the head-mounted display (HMD), which monitors the position and orientation of the wearer’s head and displays a view (and might generate sounds) of an artificial world from that perspective. Another distinctive system is the glove-input device, which monitors the position and orientation of the wearer’s hand and the flexure of each joint, so that the display can show the wearer’s hand, and the computer can interpret hand gestures as commands1 or sense whether the wearer is attempting to push a button or flip a switch on a control panel that the wearer sees in a display. An advantage of these HCI devices is their versatility. In principle, a simulator equipped with an HMD and a glove-input device could serve, in successive sessions, as a fighter cockpit simulator, a ship’s bridge simulator, and a tank turret simulator—if appropriate software and databases were available.

1 For example, a pointing gesture may mean “fly in this direction!”
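The display loop such an HMD implies is simple in outline: read the head pose, point the synthetic camera, draw the frame. The sketch below is a minimal illustration of that loop; the tracker and renderer interfaces are hypothetical stand-ins, not any particular system described in this paper.

```python
# Minimal sketch of an HMD display loop. The read_head_pose(), set_camera(),
# and render_scene() interfaces are hypothetical stand-ins for whatever a
# real head tracker and image generator would provide.

def run_hmd_loop(tracker, renderer, world):
    """Continuously redraw the virtual world from wherever the wearer looks."""
    while True:
        # Six degrees of freedom: position (x, y, z) and orientation
        # (roll, pitch, yaw) of the wearer's head.
        position, orientation = tracker.read_head_pose()

        # Place the synthetic camera at the measured head pose so the
        # displayed view changes as the wearer turns his or her head.
        renderer.set_camera(position, orientation)

        # Generate one frame of the artificial world for the HMD's displays.
        renderer.render_scene(world)
```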


However, in none of these applications does the combination of an HMD and a glove-input device yet offer the realism provided by the best instrumented cockpit mock-up with out-the-window computer image generation. Virtual (or synthetic) environment technology—the technology of VR—evolved from aircrew training devices (ATDs) such as the Link Trainer, which was introduced in 1929. Cockpit and cab simulators still provide the most realistic, absorbing experiences and are used for entertainment as well as training and research—just as Link Trainers were.

“Desktop VR” systems use more familiar HCI equipment—a monitor, a keyboard, and a mouse. These HCI systems are important because they are so pervasive—not only in the civil sector (homes, schools, business, government), but also in the military. Of all forms of VR, desktop VR may have the greatest near-term impact on improving American education and productivity. It may enable most Americans to visualize, manipulate, and interact with computers and extremely complex data in a simple, natural way. This could be useful in military applications such as acquisition management and intelligence analysis.

In opening the 1991 Senate hearing on VR, Senator Albert Gore said “virtual reality promises to revolutionize the way we use computers. . . . It has the potential to change our lives in dozens of ways [70].”2 A witness, Thomas A. Furness, III, a pioneer of military applications of VR, testified:

Instead of using highly coded, highly complex interfaces like you have to program a VCR, or the old type of fighter cockpit, we want to go back to the basics of how humans work, realizing that we are three-dimensional beings. We see things and hear things and touch things in three dimensions [65].

This background paper describes demonstrated and potential applications of VR technologies to combat simulation for training and for development of combat systems and doctrine. It traces the development of VR technology from the 1929 Link Trainer, an aircrew training device (“flight simulator”); through digital flight simulation; through the Semi-Automatic Ground Environment (SAGE) air defense system, which immersed radar operators and weapon controllers in virtual air combat; to the first HMD, begun in 1966 with the aim of immersing the wearer in a virtual visual environment.

VR technology is dual-use. Research funded by the Department of Defense (DOD) seeded the field; now there is a large commercial market, and DOD is actively exploiting its dynamism and efficiency. As system performance improves and prices decline, many new military and commercial applications are becoming cost-effective. Advances in particular VR technologies, such as computer image generation, are applicable in cockpit simulators and other applications that purists do not call “VR.”

Hardware challenges include the development of high-density, high-brightness, high-contrast color displays (especially lightweight flat-panel displays); fast head trackers; and wideband networks with high availability and multilevel security. Automating object and world description for scene generators and simulating infantry and noncombatants are also technical challenges. This paper describes these challenges and discusses issues of validation, standardization, scalability, flexibility, effectiveness, cost-effectiveness, and infrastructure.

2 Numbers in square brackets cite references listed in the section “References and Bibliography” at the end of this document.

FOCUS: HIGH-TECH COMBAT SIMULATION

This background paper is the first of several publications planned to document the results of the Office of Technology Assessment’s (OTA’s) assessment of combat modeling and simulation. OTA’s assessment considers simulation of military activities other than combat, such as logistics, only incidentally, and it does not investigate engineering simulations used in the development of military equipment or simulations used in scientific research for military applications. Such simulations are important, and their results are used in the design, acquisition, and operation of combat simulations. However, they are beyond the scope of OTA’s assessment. This background paper focuses on high-tech human-machine simulation—i.e., simulation using computer hardware, software, and human-computer interface technologies that provide qualitatively new capabilities or improvements in quantitative measures of performance. It focuses particularly on their application to virtual reality, which poses the greatest demands on performance.

DOD Directive 5000.59 defines simulation as:

. . . a method for implementing a model over time. Also, a technique for testing, analysis, or training in which real-world systems are used, or where real-world and conceptual systems are reproduced by a model.

The directive defines model as:

. . . a physical, mathematical, or otherwise logical representation of a system, entity, phenomenon, or process [191].

The phrase “over time” in the definition of simulation is important: a simulation represents a system, entity, phenomenon, or process that changes with time; it implements a dynamic model and predicts (although it may not display) a history of intermediate conditions (“states”) that are expected to occur over the span of time between the initial condition and the final time. Not all models are dynamic; some combat models predict which side will win a battle, or how many soldiers and vehicles will survive, based on the numbers and types of units committed to battle when it begins, without predicting how the situation will change during the battle. Simulations, in contrast, predict how an initial situation will change during a battle. Some simulations do this automatically. Others use human participants to make decisions and plans, issue orders, operate sensors, or pull triggers; these were formerly called man-in-the-loop simulations and are now called human-machine simulations or, if no computer is involved, human-centered simulations in accordance with Institute of Electrical and Electronic Engineers (IEEE) Standard 610.3 [96]. Human-machine or human-centered simulations of combat are sometimes called wargames.
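The distinction can be made concrete with a toy example. The sketch below steps a simple Lanchester-style attrition model forward in time; the model and its coefficients are chosen purely for illustration (the directive does not prescribe any particular model). A static model would map the initial strengths directly to an outcome; the simulation instead produces the history of intermediate states.

```python
# Toy time-stepped combat simulation: a Lanchester-style attrition model,
# used here only to illustrate "implementing a model over time." The
# lethality coefficients are arbitrary illustrative values.

def simulate_battle(blue, red, blue_lethality=0.06, red_lethality=0.05, dt=1.0):
    """Step the dynamic model forward, recording every intermediate state."""
    history = [(0.0, blue, red)]
    t = 0.0
    while blue > 0 and red > 0:
        blue_losses = red_lethality * red * dt   # attrition suffered by blue
        red_losses = blue_lethality * blue * dt  # attrition suffered by red
        blue = max(blue - blue_losses, 0.0)
        red = max(red - red_losses, 0.0)
        t += dt
        history.append((t, blue, red))
    return history

# Print every tenth state; a static model would report only the last line.
for t, blue, red in simulate_battle(1000.0, 1200.0)[::10]:
    print(f"t={t:6.1f}  blue={blue:7.1f}  red={red:7.1f}")
```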

Simulation of combat by moving paper counters (or more elaborate models) representing soldiers, companies, or battalions over map boards or sand tables is still done for recreation, but in professional military education such games have been largely superseded by computer-mediated wargames, in which computers act as fast, reliable data banks and as referees.3 Current trends include:

1. the networking of such computers in growing numbers to simulate—and to involve as participants—larger forces over larger areas in greater detail (a sketch of this state-exchange idea follows this list);

2. continued growth in the information storage capacity and processing speed per processor, per computer, and per dollar; and

3. the use of ever more sophisticated HCI technology, such as HMDs with head trackers and digital three-dimensional (3-D) sound synthesizers.
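Trend 1 rests on a simple pattern: each simulator periodically broadcasts the state of the entities it simulates and listens for the states broadcast by its peers. The sketch below illustrates the pattern with a deliberately simplified, hypothetical packet layout; it is not the IEEE 1278 (DIS) protocol data unit format noted later in this paper.

```python
# Simplified sketch of networked-simulator state exchange: broadcast your
# entity's state, collect peers' states. The packet layout here is a
# hypothetical illustration, NOT the IEEE 1278 (DIS) PDU format.
import socket
import struct
import time

PORT = 5500                      # assumed exercise port
LAYOUT = "!Id6d"                 # entity id; timestamp; x, y, z, roll, pitch, yaw

def make_socket():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.bind(("", PORT))
    sock.setblocking(False)
    return sock

def broadcast_state(sock, entity_id, position, orientation):
    """Announce one entity's current state to every simulator on the net."""
    packet = struct.pack(LAYOUT, entity_id, time.time(), *position, *orientation)
    sock.sendto(packet, ("255.255.255.255", PORT))

def collect_peer_states(sock, states):
    """Drain pending packets, keeping the newest state for each entity.
    (A host also hears its own broadcasts; real protocols filter these.)"""
    try:
        while True:
            packet, _ = sock.recvfrom(1024)
            entity_id, stamp, x, y, z, roll, pitch, yaw = struct.unpack(LAYOUT, packet)
            states[entity_id] = (stamp, (x, y, z), (roll, pitch, yaw))
    except BlockingIOError:
        pass
    return states
```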

The newest of these technologies were introduced almost three decades ago, but continuing advances in microelectronics and other technologies have made them useful, affordable, and cost-effective for some training tasks. Further advances promise to make computer-mediated simulation cost-effective for additional training tasks.

Human-machine simulation of combat is used not only for training but also for research—e.g., to assess the effectiveness of proposed tactics or weapons. However, it would be prohibitively expensive to assess the effect of, say, a proposed corps surveillance system under 1,000 different conditions by putting a corps’s personnel in simulators for 1,000 exercises. Such analytical tasks are usually accomplished by computer-based (or, synonymously, machine-centered) simulation, which simulates human behavior along with other dynamic processes. This allows the simulation to run much faster than real time, so the research may be completed quickly and inexpensively; it requires neither human participants nor HCI systems. However, the validity of the results depends on the validity of the model of human behavior that is implemented by the simulation. This is not a new issue [161], and it is not a hardware issue; it will be treated in depth in the final report of this assessment.

3 Cf. [153], p. 152, which shows U.S. Army forces engaged in a sand-table exercise during Operation Desert Storm.

WHY CONGRESS CARES

Congress authorizes and appropriates funds for development, testing, evaluation, procurement, operation, and maintenance of simulation equipment and therefore has a direct interest in how those funds are spent. In addition to the funds authorized and appropriated specifically for simulation technology development programs and procurement of simulators, an indeterminate fraction of what DOD spends for operations and maintenance is spent on combat simulation, as is an indeterminate fraction of what DOD spends for research, development, testing, and evaluation (RDT&E). How much is spent depends partly on the expansiveness of the definition one uses. For example, General Paul Gorman, USA-Ret., Chairman of DOD’s Simulation Policy Study, testified in 1992 that “all tactical training short of combat itself is a simulation” [72]. The 1993 report of the Defense Science Board Task Force on Simulation, Readiness, and Prototyping went a step further by stating “everything is simulation except combat” [196]. However, these expansive definitions are not used in most estimates of spending. According to one estimate, DOD spends about $2.5 billion per year on simulation equipment and services [170], to say nothing of the pro rata share of the pay of military and other federal employees engaged in simulation activities.

The widespread and increasing use of simulation to evaluate developmental systems at acquisition milestones is of potentially greater significance. Decisions for all acquisition category I programs are based partly on cost and operational effectiveness analyses (COEAs), which use models, and often simulations, to predict how candidate systems would perform in service—e.g., combat. Much is staked on the validity of those models and simulations. Some force employment decisions are also informed by simulation results.

DOD’s Advanced Research Projects Agency (ARPA) is undertaking to allow simulation and virtual reality to play an even greater role in system acquisition (as well as training, logistics, and operations) with its Simulation-Based Design (SBD) program:

SBD will integrate computer and information science technology developments such as advanced visualization techniques, high bandwidth networks, human/computer interaction, and massive database management to develop a revolutionary design environment for the acquisition of complex systems [181].

In 1993 ARPA awarded a $1.4-million contract for research on a Simulation-Based Design System (SBDS) [175] and later solicited proposals to develop specific technologies (including virtual environment technologies) required for an SBD system to completely integrate ship design, acquisition, and support processes [178]. This year (1994), ARPA solicited proposals to “develop and demonstrate a Simulation-Based Design . . . system.” ARPA anticipates that:

. . . the requirements for a SBD system, having the capabilities stated above, would include: . . . an open, scalable architecture with a distributed computing structure; . . . multilevel security; . . . multiple personnel immersion capability in the synthetic environment, where participants may be physically remote; . . . synthetic environment graphical display rates of 30 frames per second and polygonal representations of models as large as 10 million polygons displayed from a database of greater than 10 billion polygon models; . . . network capable of greater than 1 gigabit/sec; . . . Analysis capabilities including models with as many as one million degrees of freedom [181].
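A back-of-envelope calculation, using only the figures quoted above (the arithmetic is added here for illustration and is not ARPA’s), suggests the scale of the rendering demand:

```python
# Back-of-envelope arithmetic on the SBD requirements quoted above.
frames_per_second = 30
polygons_per_model = 10_000_000       # largest model to be displayed
database_polygons = 10_000_000_000    # polygons in the underlying database

rate = frames_per_second * polygons_per_model
print(f"Sustained rendering rate: {rate:.1e} polygons/second")                      # 3.0e+08
print(f"Share of database in view: {polygons_per_model / database_polygons:.1%}")   # 0.1%
```

For comparison, the VPX graphics card described later in this paper renders about 300,000 polygons per second, roughly a factor of 1,000 short of the sustained rate these requirements imply.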

If the effort succeeds in revolutionizing the design and acquisition of complex military systems such as ships, its fiscal impact could overshadow its direct cost.4

Members and committees of Congress, including the requesters of this assessment, have shown persistent keen interest in the potential of advanced simulation technology, not only for combat simulation but for other purposes as well. At a 1992 hearing on advanced simulation technology [173], members of the Senate Committee on Armed Services questioned DOD witnesses about:

■ the limits of substitutability of virtual simulation for live simulation (e.g., training exercises),
■ the prospect for modeling political and economic processes to make simulators for crisis management training,
■ the use of simulators for command and staff training and for better training of National Guard units,
■ the funding requirements for a rational investment strategy, and
■ the potential for technology transfer to the civil sector, particularly the Federal Aviation Administration (FAA), the National Aeronautics and Space Administration (NASA), and education.

Similar questions were raised at the 1991 hearing on virtual reality before the Subcommittee on Science, Technology, and Space of the Senate Committee on Commerce, Science, and Transportation [172]. Questions and testimony concerned the dual-use applications and benefits of this very rapidly developing technology, requirements for U.S. competitiveness in the virtual reality marketplace, and potential benefits of applications of the technology in other sectors such as manufacturing, medicine, education, and entertainment, as well as defense.

Human-computer interaction is one of six “priorities . . . identified by the NSTC (National Science and Technology Council) Committees as needing new or additional emphasis in FY 1996” to help achieve the Administration’s goal of harnessing information technology—one of the six FY 1996 research and development goals articulated by the President’s Science Advisor and the Director of the Office of Management and Budget in their May 6, 1994, Memorandum for the Heads of Executive Departments and Agencies [69]. It said, “Development and use of the following should be advanced: virtual reality; simulation; flat-panel displays; video and high-definition systems; 3-D sound, speech interfaces, and vision.”

MILESTONES IN SIMULATION TECHNOLOGY

Many techniques and technologies used in state-of-the-art computer-based simulation and human-computer simulation have been around for decades (see table 1-1). In this section we review Link Trainers, introduced in 1929; the Whirlwind computer, begun in 1946 for digital flight simulation and eventually used as a technology demonstrator for the SAGE air defense system; and the first head-mounted display (HMD) for computer-generated imagery, begun in 1966. Many new technologies have been introduced since then, but the most obvious novelty is the detail and speed with which virtual visual scenes may be synthesized digitally and displayed. The use of digital, rather than analog, optical, or mechanical, techniques facilitates the networking of simulators, which is very recent and significant.

4 The program was funded for $3 million in FY 93 and for $9.074 million in FY 94. The President’s Budget contains funding for the program through FY 98: $15.77 million in FY 95, $15.28 million in FY 96, $16.11 million in FY 97, and $20.15 million in FY 98 [182].


Table 1-1. Milestones in Simulation Technology

1929: First Link Trainer.

1931: Navy buys a fully instrumented Link Trainer for $1,500.

1934: Army buys six Model A Link Trainers for $3,400 each.

1943: Development of the Electronic Numerical Integrator and Calculator (ENIAC) is begun at the University of Pennsylvania to calculate artillery range tables by simulating the flight of artillery shells.

1945: ENIAC performs calculations for hydrogen bomb development.

WW II: Airplane Stability Control Analyzer (ASCA) is begun at Massachusetts Institute of Technology (MIT) for the Navy, intended for engineering simulation, experiential prototyping, and training.

1946: Jay Forrester begins building a digital computer, later named Whirlwind, for ASCA. Whirlwind was later used for the Air Force’s Cape Cod System, a technology demonstrator for the Semi-Automatic Ground Environment (SAGE) air-defense system.

1946: ENIAC is publicly announced.

1949: Link Co. presents the Air Force with a sales brochure citing experiments comparing effectiveness of flight simulation to flight training and estimating savings in dollars, man-hours, and lives.

1949: Link Co. wins an Air Force contract to develop the first simulator for a jet, the F-80.

1949: Digital Radar Relay (DRR) transmits data from the Microwave Early Warning (MEW) radar at Hanscom Field to the Air Force Cambridge Research Center by telephone line using modems at 1,300 bits per second.

1950: MEW radar sends data to the Whirlwind computer using DRR.

1953: Cape Cod System is used in live simulations, directing Air Defense Command and Air Research and Development Command interceptors against Strategic Air Command bombers acting as hostile aircraft. Cape Cod System is also used in virtual simulations, with computer-generated simulated targets presented to surveillance operators and weapon controllers.

1958: First Air Force SAGE center is operational.

1950s: SAGE air defense system is used for mixed live and virtual simulation.

1965: Ivan Sutherland presents his seminal conference paper, “The Ultimate Display,” articulating the vision of virtual reality.

1966: Ivan Sutherland begins developing the first head-mounted display (the “sword of Damocles”) at MIT’s Lincoln Laboratory and continues at the University of Utah.

1968: Evans and Sutherland, Inc., is founded to create electronic “scene generators.”

1972: General Electric builds the Advanced Development Model (ADM) for the U.S. Navy to measure the effectiveness of computer image generation for pilot training.

ca. 1973: Computers rendered scenes composed of 200 to 400 polygons at 20 frames per second. This was adequate for pilot training—e.g., simulating night approaches to landing on an aircraft carrier. Studies showed that pilot performance suffered at slower frame rates.

1976: Myron Krueger demonstrates VIDEOPLACE (two-dimensional) and coins the term “artificial reality.”

1980: McDonnell-Douglas Aircraft Corp. develops a helmet with magnetic head tracking and see-through HMD for its Virtual Image Takeoff and Landing (VITAL) F-18 aircraft cockpit simulator.

1982: Thomas Furness III demonstrates the Visually Coupled Airborne Systems Simulator (VCASS) at the Air Force’s Wright Laboratories. The HMD used a Polhemus six-dimensional magnetic tracker and custom 1-in.-diameter monochrome cathode-ray tubes (CRTs) with 2,000 scan lines, which provided more detail than anything commercially available even today.

1983: Defense Advanced Research Projects Agency (DARPA) begins the Simulator Network (SIMNET) project.

1984: Michael MacGreevy and Stephen Ellis of NASA’s Ames Research Center build the Virtual Visual Environment Display (ViVED, pronounced vivid), the first $2,000 head-mounted display.

1986: The Air Force funds phase 2 of VCASS: “Super Cockpit.” Voice command recognition and voice response plus a three-dimensional synthetic audio environment are added.

1987: SIMNET has about 250 simulators fielded.

1991: University of North Carolina computer renders scenes at 1 million untextured polygons per second or 200,000 textured polygons per second.

1992: Virtual Cockpit project begins at the U.S. Air Force Wright Laboratories.

1993: Institute of Electrical and Electronic Engineers issues IEEE Standard 1278-1993 (DIS protocol).

1994: Kaiser Electro-Optics, Inc., offers the SIMEYE™ 60 HMD, providing 1,024 interlaced color scan lines, for $165,000.

1994: 1,000 entities (vehicle simulators, etc.) are netted for Synthetic Theater of War (STOW) exercise.

1997: Synthetic Theater of War goal is 100,000 entities.

| Link Trainers (1929)

The development of Link Trainers led to digital flight simulation and then to distributed human-computer digital combat simulation. The marketing of Link Trainers is also of interest, because it used cost-effectiveness analyses based on experiments to gauge the transfer of learning in aircrew training devices (ATDs) to proficiency in the cockpit.

Lloyd Kelly’s book, The Pilot Maker [110], contains many passages about the substitutability of Link Trainer time for flight time, for example:

Ed [Edwin A. Link] taught his brother George to fly with only forty-two minutes of actual flight time, after George had taken a concentrated six-hour course in the trainer.

Casey [Charles S. Jones] was deeply impressed with the young man (Edwin A. Link) from Binghamton, who would teach people to fly up to first solo for eighty-five dollars. It was costing Casey at that time over two hundred dollars to accomplish the same task.

Kelly also quotes the New York Herald Tribune as printing in 1933:

This system [the Link Trainer] does not purport to turn out a finished “blind” flier without the necessity of actual time in the air at the controls and behind the instrument board of a real airplane, but its sponsors do claim that it will shorten the essential air training by more than 50 per cent. They maintain that 15 hours of the new style “hangar flying” and five hours in the school’s blind flying training plane will result in a proficiency equivalent to that obtained by 25 hours’ blind flying instruction on an all-airplane basis, which has been the average time required to qualify America’s airline pilots in this highly specialized field.
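The later training-transfer literature formalized such claims as a transfer effectiveness ratio (TER): flight hours saved per hour spent in the trainer. Applying that measure, purely for illustration, to the Tribune’s figures:

```python
def transfer_effectiveness_ratio(flight_only_hours, flight_hours_with_trainer,
                                 trainer_hours):
    """Flight hours saved per trainer hour: the TER of the
    training-transfer literature."""
    return (flight_only_hours - flight_hours_with_trainer) / trainer_hours

# The 1933 claim quoted above: 15 trainer hours plus 5 airplane hours were
# said to equal 25 hours of all-airplane "blind" flying instruction.
ter = transfer_effectiveness_ratio(25, 5, 15)
print(f"TER = {ter:.2f} flight hours saved per trainer hour")  # TER = 1.33
```

By this measure the Tribune’s claim amounts to about 1.3 flight hours saved per trainer hour; Flexman’s experiments, discussed below, put the figure at no less than 0.5 and, for slow-flight training, about 1.0.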

The first fully instrumented Link Trainer was sold to the Navy in 1931 for $1,500. By mid-1932 it was still the only fully instrumented trainer that had been sold, still the only one sold to the military, and one of only three sold to the “aviation industry”: one had been sold to the Pioneer Instrument company to demonstrate the turn and bank indicator, the airspeed indicator, and the magnetic compass; another had been sold to a museum. But by mid-1932, almost 50 had been sold, at cost ($300 to $500), for use in amusement parks. This demand trend contrasts with that for non-cockpit “virtual reality” systems (described below), which were first developed largely by the government for military and space-related use, with entertainment-related demand now growing rapidly, about three decades later.5

When the Post Office canceled all airmail contracts in 1934 and asked the Army Air Corps to fly the mail, the Army soon discovered it would need to train its pilots to fly on instruments and recognized the value of the Link Trainer as a training aid, but had no money to purchase them. Edwin Link and “Casey” Jones lobbied Congressman Howard McSwain, Chairman of the House Military Affairs Committee, who promised cooperation, with the result that an emergency appropriation was passed by both houses of Congress and signed by the President on March 28, 1934. Less than three months later, on June 23, the Army took delivery of its first six Link Trainers, Model A’s, for which it paid $3,400 each. Sales to Japan, the Soviet Union, and other countries followed, and many new types of trainers were developed and sold to the Army and the Navy during World War II.

These were later estimated to have saved the Army Air Force “at least 524 lives, $129,613,105, and 30,692,263 man-hours . . . in one year,” and to have saved the Navy “a potential total savings of $1,241,281,400 in one year,” according to “a report . . . to the Subcommittee of the Committee on Appropriations of the United States House of Representatives” quoted by Kelly.

Nevertheless, there was a postwar slump in ATD sales, as well as competition. Researcher Stanley Roscoe later wrote that “in 1949 Edwin A. Link was building more canoes than flight trainers” [58]. The fortunes of the company turned around in 1949, when the Link Co. won the Air Force contract to build a simulator6 (the C-11) for the F-80 jet fighter. Not only was this the first simulator of a jet airplane, it also used analog electronic computer technology.7 Roscoe attributed the company’s success to a sales brochure written by Paul Dittman, a graduate student, summarizing the results of experiments in the effectiveness of transfer of training from simulator to airplane.

Details of these experiments, which were conducted by Ralph Flexman, were not published until 1972. Flexman wrote: “On all exercises an hour of training in the Link was equivalent to no less than a half hour in the aircraft, and for slow-flight training an hour in the Link equaled an hour in the air” [58]. This does not imply that simulator training could replace all flight training on a two-for-one basis. The marginal substitutability of simulator training for flight training may (and probably does) vary with the ratio of simulator training time to flight training time at which substitutability is assessed, the total amount of training, and the particular tasks that are taught in the simulator.

5 The Department of Commerce’s Industrial Outlook 1994 states that, as of 1993, most virtual reality applications were entertainment-oriented [174]. However, this accounting counts applications (i.e., types of computer programs), not sales volume in dollars, and it appears to exclude cockpit simulator applications, for which there is a large non-entertainment demand from the military and the airlines. A recent forecast by 4th Wave, Inc., projects worldwide nongovernmental demand for virtual reality hardware, software, and services to grow from $115.8 million in 1994 to $569.9 million in 1998. Consumer sales, public play, and entertainment equipment purchases by operators of public play installations are expected to amount to 57.2% of the 1994 total and 69.1% of the 1998 total [1,120]. Aircrew training devices were not counted as VR systems.

6 Around this time, Curtiss-Wright sought unsuccessfully to trademark the words “simulation” and “simulator,” and Link Aviation sought to protect the term “Link Trainer,” after competitors began establishing “link trainer” departments [110]. General Precision Equipment Corporation acquired Link Aviation in 1954 and merged with The Singer Company in 1968 [ibid.] to form Singer-General Precision, Inc., which registered the trademark “LINK Trainer” [139] in a further effort to protect product identity.

7 Three years earlier, Jay Forrester had begun building a digital computer for the Navy’s Airplane Stability Control Analyzer.

Summarizing the research, Roscoe wrote [58]:

The measurement of [training] transfer is a complex business. . . . [S]imulator training in one flight maneuver transfers not only to its airborne counterpart; it transfers to other similar maneuvers performed either in the simulator or in the airplane, as does training in the airplane itself. Furthermore, training in one aspect of the overall flight task, say verbal communication, may appear to transfer to another quite different aspect, say motor coordination, simply because the early mastery of the first may thereafter allow the student to concentrate more on the second. . . .

One consistent result was that instrument flight training produced less transfer . . . than did contact flight training [i.e., by visual reference to the ground and horizon] . . . .

In general, procedural tasks, such as starting the airplane, resulted in more effective transfer . . . than did psychomotor skills, such as level turns. . . . Apparently higher transfer occurs with procedural tasks than with psychomotor tasks because the former are less adversely affected by the imperfect simulation of such dynamic factors as physical motion, visual and kinesthetic cues, and control pressures.

. . . [O]pportunity for the transfer of new learning diminishes as a pilot masters more and more elements common to various tasks. . . .

Further research is needed. This perennial plea of the scientific community applies to the whole field of educational strategy, not only to the training of pilots. It is strikingly notable, however, that vast sums are invested in new and innovative pilot training devices and programs in the virtual absence of experiments providing quantitative estimates of relative transfer effectiveness attributable to the variable characteristics of the devices and training strategies employed.

Surprisingly, simulated instrument flight training was found to produce less transfer than simulated contact flight training. Because it is easier for a simulator to realistically simulate instrument conditions (e.g., zero visibility) than the visual scene that a pilot would see in clear weather, one would expect the opposite to be true. Roscoe speculated that simulated instrument-flight training produced less transfer than did simulated contact-flight training because pilots generally learn contact flying before learning instrument flying. Hence by the time a pilot begins instrument training, he or she has already learned the basic flying skills, which are also used in instrument flying. Thus the phenomenon is consistent with the other conclusion that “opportunity for the transfer of new learning diminishes as a pilot masters more and more elements common to various tasks.” That is, in economic jargon, there are diminishing returns to scale as one invests more in training, either in a simulator or a cockpit.

The quantitative data that these conclusions summarize were collected only after the simulators were built. It is difficult to imagine how transfer effectiveness (or savings or averted cost) could have been estimated objectively or measured accurately before both a simulator and the system that it simulates were available for experimentation. The advent of electronic (especially digital) simulators has made it possible to vary simulator parameters, such as display detail and frame rate, in order to see how decreasing them would degrade learning and performance of specific partial tasks (e.g., approach to landing) in the simulator and transfer of proficiency to actual flight operations.

| The Whirlwind Computer (1946) and the SAGE Air Defense System (1958)

The use of the ENIAC in 1945 to simulate aspects of a thermonuclear detonation and, later, to simulate the flight of artillery shells [87] marked the first uses of an electronic digital computer to simulate details of combat. However, the first use of a digital computer to simulate combat as it would be perceived by one or more combatants probably occurred in the development of the Air Force’s Semi-Automatic Ground Environment (SAGE) air-defense system. The SAGE system evolved from a Navy flight simulator development program, the Airplane Stability Control Analyzer (ASCA) program, which began during World War II. In 1946, Jay Forrester, of the Massachusetts Institute of Technology (MIT), persuaded the Navy to use a digital computer instead of an analog computer for ASCA, and he began building one, called Whirlwind, at MIT. In 1949 the Soviet Union detonated an atomic device, and the U.S. Department of Defense became more concerned about improving air defenses. MIT, a leading developer of radar systems in World War II, became involved, as did Whirlwind. In 1950 digital radar data were transmitted from the Microwave Early Warning (MEW) radar at Hanscom Field in Bedford, Massachusetts, to Whirlwind at MIT in Cambridge, Massachusetts.

By 1953, the Cape Cod System, a scaled-down technology demonstrator for SAGE, was demonstrating what is now called virtual human-machine distributed interactive simulation, in which radar operators and weapon controllers reacted to simulated targets presented to them as real targets would be in war. This was relatively easy to do, the human-computer interface being a radar screen. Presenting a tank gunner or a fighter pilot with a detailed, rapidly changing visual scene required further decades of advances in computer image generation (CIG) technology.

By 1955, the Air Force had contracted with the RAND Corp. to develop the System Training Program (STP) for SAGE. As John F. “Jake” Jacobs, an architect of SAGE, recounts:

One of the features of SAGE was the extensive use of simulation programs and operational testing aids; the STP program created simulated airplanes by signals entered into the system at the radar sites. These and other simulated inputs were integrated so as to create a simulated scenario against which the operators could direct intercepts and be scored on their performance. Dave Israel pushed for a battle simulation program with the [SAGE] direction center and internal to the FSQ-7 [digital computer]. In this system, an internal program simulated airplane signals that could be mixed with live signals generated by real aircraft. The simulation was an extremely useful feature and was coordinated with the STP design.

The SAGE system later pioneered the networking of two collocated computers, for reliability, and eventually the networking of remote computers for distributed computing, including distributed interactive live and virtual simulation.

| The “Sword of Damocles” Head-Mounted Display (1966)

Development and integration of an assortment of technologies to provide (or approximate) the experience now called virtual reality is more recent than the development of Link Trainers and the SAGE system. Much of the vision of virtual reality was articulated by Ivan Sutherland of DOD’s Advanced Research Projects Agency (ARPA) in a conference paper, “The Ultimate Display,” read and published in 1965. He proposed the use of eye trackers and many other innovations:

Machines to sense and interpret eye motion data can and will be built. It remains to be seen whether we can use a language of glances to control a computer. An interesting experiment will be to make the display presentation depend on where we look.

The following year, at MIT’s Lincoln Laboratory, he began developing the first head-tracker-controlled head-mounted display (HMD), now the trademark, if not a sine qua non, of virtual reality. Later in 1966 he joined the University of Utah, where he continued refining the hardware and software until it was functional on New Year’s Day in 1970 [41] (cf. [17]). The heavy HMD was suspended from the ceiling by a gimballed mechanical apparatus equipped with electromechanical sensors that determined the direction in which the “wearer” was looking, so that the computer could generate the image of the virtual world that the wearer should see. The assembly’s precarious mounting over the user’s head led to its being called the “sword of Damocles.” The best modern HMDs are much lighter. They, and the CIG hardware and software that feed them imagery, provide imagery vastly more detailed and realistic than the wire-frame imagery displayed by Sutherland’s first HMD.

VIRTUAL REALITY

Much of the material in this section (up to Observations) that is not otherwise attributed has been excerpted or paraphrased from a popular electronic text by Jerry Isdale [98].

The term virtual reality (VR) means different things to different people. To some, VR is a specific collection of technologies—a head-mounted display, a glove-input device, and an audio synthesizer. Some other people stretch the term to include books, movies, or unaided fantasy. We view VR as an interactive, usually computer-mediated, experience characterized by a suspension of (or inattention to) disbelief in the unreality of the experience.

A good film or novel may be so engaging as to cause most viewers or readers to suspend disbelief in the unreality of the vicarious experience, but the experience is neither interactive nor computer-mediated and hence is not usually considered VR. However, flying an analog (rather than digital) airplane cockpit simulator that generated imagery by moving a video camera over a model board could be VR. A recent NASA report offered this definition:

Virtual reality is the human experience of perceiving and interacting through sensors and effecters with a synthetic (simulated) environment, and with simulated objects in it, as if they were real [137].

The book Silicon Mirage [8] defines “virtual reality” as “a way for humans to visualize, manipulate and interact with computers and extremely complex data.”

Here visualization refers to the computer generating visual, auditory, or other sensory outputs to the user of a world within the computer.

This world may be a computer-aided design (CAD) model, a scientific simulation, or a view into a database. The user can interact with the world and directly manipulate objects within the world. Some worlds are animated by other processes, perhaps physical simulations, or simple animation scripts. Interaction with the virtual world with at least near-real-time control of the viewpoint is usually considered an essential element of virtual reality. Some definitions require the view of the world to be three-dimensional; they exclude Lucas Film’s Habitat and Club Caribe [149], Fujitsu Habitat [149], the DIASPAR™ Virtual Reality Network [42], and the online surrogate travel demonstration [146] at the National Institute for Standards and Technology, all of which use two-dimensional (nonperspective) graphical presentations,8 but other definitions would include them. There is more than a spectrum of virtual reality—there are several dimensions of virtual reality, and experiences have varying degrees of each.

Some people object to the term “virtual reality,” saying it is an oxymoron. Other terms that have been used are Synthetic Environments, Cyberspace, Artificial Reality, Simulator Technology, etc. VR is the most common. It has caught the attention of the media.

The applications being developed for VR span a wide spectrum, from games to architectural and business planning. Many applications are worlds that are very similar to our own, like CAD or architectural modeling. Some applications, such as scientific simulators, telepresence systems, and air traffic control systems, provide views from an advantageous perspective not possible with the real world. Other applications are very different from anything most people have ever directly experienced before. These latter applications may be the hardest and most interesting: visualizing the ebb and flow of the world’s financial markets, navigating a large corporate information base, etc.

8 The DIASPAR™ Virtual Reality Network also offers a three-dimensional virtual world, the Polygon Playground, that can be shared by two users.

| Types of VR Systems

VR systems may be categorized according to their user interfaces. This section describes some of the common modes used in VR systems.

Cockpit (or cab) simulators are the oldest, most mature, and—for many important military missions—most realistic VR systems. They evolved from the Link Trainer of 1929 and the Navy’s Airplane Stability Control Analyzer (ASCA) of World War II.

Some systems use a conventional computer monitor to display the visual world. This is often called desktop VR and has been called a window on a world (WOW) [98]. This concept traces its lineage back to Ivan Sutherland’s 1965 paper, “The Ultimate Display” [162].

A variation of the WOW approach merges a video input of the user’s silhouette with a two-dimensional computer graphic. The user watches a monitor that shows his body’s interaction with the world. Myron Krueger has been a champion of this form of VR since the late 1960s. At least one commercial system, the Mandala system, uses this approach. This system is based on a Commodore Amiga with some added hardware and software. A version of the Mandala is used by the cable TV channel Nickelodeon for a game show (Nick Arcade) to put the contestants into what appears to be a large video game.

The ultimate VR systems completely immerse the user’s personal viewpoint inside the virtual world. These “immersive” VR systems are often equipped with a head-mounted display (HMD). This is a helmet or a face mask that holds the visual and auditory displays. The helmet may be free-ranging, tethered, or it might be attached to a boom.

One type of immersive system uses multiple large projection displays to create a room in which the viewer(s) stand. An early implementation was called “The Closet Cathedral” for the ability to create the impression of an immense environment within a small physical space. A more recent implementation is the CAVE Automatic Virtual Environment (CAVE) at the Electronic Visualization Laboratory of the University of Illinois at Chicago [32]; it has been so highly publicized that the term “cave” is now being used to describe the generic technology. A futuristic extrapolation of the technology is the “Holodeck” used in the television series “Star Trek: The Next Generation.”

Telepresence is a variation on visualizing complete computer-generated worlds. This technology links remote sensors in the real world with the senses of a human operator. The remote sensors might be located on a robot, or they might be on the ends of waldo-like tools. (A waldo is a hand-like tool that mimics the motions of a user’s hand. The term was coined by Robert A. Heinlein in his 1942 science-fiction story “Waldo” [82].) Firefighters use remotely operated vehicles to handle some dangerous conditions. Surgeons are using very small instruments on fiber-optic cables to do surgery without cutting a major hole in their patients. The cable carries light to the surgical site and guides light reflected by tissue there to a video camera outside the body. The image is displayed on a video monitor which the surgeon observes while manipulating the instrument [3,179].

Robots equipped with telepresence systems have already changed the way deep sea and volcanic exploration is done. NASA plans to use telerobotics for space exploration. There is currently a joint U.S.-Russian project researching telepresence for space rover exploration. Military uses of telepresence include guidance of the camera-equipped GBU-15 bombs that provided stunning TV scenes from the Persian Gulf war, unmanned aerial vehicles (also used in the Gulf War), and unmanned space and undersea vehicles.

Merging the telepresence and virtual reality systems gives the mixed reality or seamless simulation systems. Here the computer-generated inputs are merged with telepresence inputs and/or the user’s view of the real world. A surgeon’s view of a brain surgery is overlaid with images from earlier computed axial tomography (CAT) scans and real-time ultrasound. A fighter pilot sees computer-generated maps and data displays inside his fancy helmet visor or on cockpit displays.

The phrase “fish tank virtual reality” has been used to describe a Canadian VR system that combined a stereoscopic monitor display using liquid-crystal shutter glasses with a mechanical head tracker. The resulting system is superior to simple stereo-WOW systems due to the motion parallax effects introduced by the head tracker.

| Cockpit Simulators

Cockpit and cab simulators use a compartment—a mock-up, in engineering terminology—to enclose the user, whose out-the-window view of the virtual world is provided by either a directly viewed display device, such as a cathode ray tube (CRT) monitor or a liquid-crystal display (LCD), or a projection screen on which imagery is projected from the front or rear. Many types of projection systems have been used, including projection CRTs and liquid-crystal light valves. Potentially applicable is the technology of the Digital Micromirror Device (DMD) developed by Texas Instruments with Defense Advanced Research Projects Agency (DARPA) funding [154].

Cockpit simulators have a history dating back to the Link Trainer introduced in 1929. The cockpit is often mounted on a motion platform that can give the illusion of a much larger range of motion. Bridge simulators, such as the Link CART (Collision Avoidance Radar Trainer) [110], are a variant that simulate the steering, and response, of a ship as it would be experienced from the bridge.

Cab simulators are used in driving simulators for locomotives,9 earth-moving equipment,10 trucks, tanks, and BattleMechs, the fictional walking tanks featured in Virtual World Entertainment’s BattleTech Center in Chicago (which opened in 1990 [140]) and its 13 other Virtual World centers—six in the United States and seven in Japan. Each center has about 24 cockpit simulators (of several types—not just BattleMechs), which are being networked by CyLink™ long-haul links over Integrated Services Digital Network (ISDN) lines.11

Experts consider the best cockpit simulators to be more realistic and immersive than the best HMD-based systems, in the sense of engaging a subject’s attention and suspending her or his awareness of simulation. Dave Evans of Evans and Sutherland, a pioneer in computer image generation for cockpit simulators as well as HMDs, has said that one pilot flying a Boeing 747 airplane simulator fainted when his virtual airplane hit a ditch on landing. Professor Frederick Brooks, quoting him, said that nothing so dramatic had happened to a trainee wearing an HMD [17]. HMDs do, nevertheless, provide realistic and sometimes threatening experiences. Some psychologists are using HMDs to give acrophobia subjects the impression of being in a high location, in order to identify sensory cues that produce physiologically measurable stress and design desensitization therapy [88,115,116].

A 1993 Air Force Institute of Technology Master’s thesis rated domed aircraft cockpit simulators higher than HMDs in each of five “immersion” attributes (field of view, panorama, perspective, body representation, and intrusion) as well as in visual acuity, a “visualization” attribute [53]. Similarly, a recent Center for Naval Analyses memorandum concluded that HMD technology was not yet sufficiently mature for application to strike training and mission rehearsal [210].

9 The SD-45 locomotive simulator produced by the Link division of General Precision Equipment Corporation for the Santa Fe Railroad in the late 1960s had a number of interesting features, including a hydraulic motion-base platform, digitally controlled electronically synthesized stereophonic sound (along with mechanically generated sound), and scenery projected from 16-mm film. “A unique lens mounted before the front window provided images up to infinity with realistic depth of field. This lens forced the viewer to refocus his eyes as he would in the real world when looking at varying distances” [110].

10 Caterpillar has developed a simulator that includes a real steering wheel, gearshift, levers, pedals, and other controls but relies on an HMD for visual presentation [41].

11 Virtual World Entertainment is not using the DIS protocol [97], because it wants the network to pass details of collisions (e.g., crashing into a cliff) that are valued by patrons for entertainment but for which DIS protocol data units (PDUs) have not been defined.

| Image Generators

A number of specialized types of hardware devices have been developed or used for virtual reality applications. One of the most computer-time-consuming tasks in a VR system is the generation of the images. Fast computer graphics opens a very large range of applications aside from VR, so there has been a market demand for hardware acceleration for a long while.

The simulator market has spawned several companies that build special-purpose computers and software for real-time image generation. These computers often cost several hundreds of thousands of dollars. The manufacturers include Evans and Sutherland, Inc.; the General Electric Co. (GE); Division, Ltd.; and Silicon Graphics, Inc. (SGI). Evans and Sutherland, Inc., was founded in 1968 to create electronic scene generators; it has sold scene generators for aircrew training devices to the airlines and the military and currently markets the ESIG-2000 graphics system for VR applications (including cab and cockpit simulators). SGI graphics workstations are some of the most common processors found in VR laboratories, including DOD simulators. SGI boxes range in price from under $10,000 to over $100,000. The latest is the scalable SGI Onyx Reality Engine.

There are currently several vendors selling image generator cards for personal computers. The cards range in price from about $2,000 up to $10,000. Many of them use one or more Intel i860 microprocessors. A recent offering in this class, by Division, Ltd., is the Merlin graphics card designed for IBM PC/AT-compatible computers.12

There are two versions of the Merlin graphics card; both are a variant of Division’s VPX graphics card (described below). One version of Merlin has the same architecture and performance as the VPX. The other version of Merlin is a depopulated board—i.e., it has fewer i860 microprocessors and hence is slower than the VPX and will cost less, although Division has not announced pricing as this background paper goes to press. Like the VPX, the Merlin card supports a variety of video formats, but with Merlin, software allows the user to select the desired format. VPX cards had 10 video output connectors—one per format—and the user would select the desired format by connecting a video cable to the appropriate connector. A fully populated Merlin card draws 19 amperes of current and generates more heat than can be safely dissipated in a typical PC/AT enclosure, so Division offers a separate enclosure with chassis and power supply for Merlin cards.

The VPX card, announced on Feb. 1, 1994 [46], uses the Pixel-Planes 5 rendering architecture developed by Professor Henry Fuchs at the University of North Carolina at Chapel Hill with funding from the National Science Foundation and later DARPA [172]. The Pixel-Planes architecture uses an array of processors (8,192 on the VPX board) to compute pixel data in parallel, with each processor dedicated to one or more pixels on the screen. This provides significant performance advantages, particularly where complex lighting and texture mapping is required on each pixel. Designed for Division's PROVISION 100 VR system and the original equipment manufacturer (OEM) market, the VPX is a single card that

12 Division also sells the PC dVIEW and MAC dVIEW image-rendering cards.


is compatible with the extended industry-standard architecture (EISA) bus. The card is capable of rendering up to 300,000 photo-textured,13 Gouraud-shaded,14 specularly lit polygons15 and 300,000 spheres per second16 at over 160 million pixels per second in a variety of formats, with a maximum resolution of 1,024x768 pixels in synchronized stereo using multiple cards [44]. OEM prices for a single VPX board started at £12,000 when they were introduced [46]. Prices in the United States now start at about $21,000 [143].
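The per-pixel parallelism of the Pixel-Planes approach can be illustrated in miniature. In the sketch below (a simplification, with a NumPy array standing in for the board's processor array; the names are illustrative, not Division's programming interface), every pixel "processor" evaluates the same linear edge expressions for its own screen address, so an entire polygon is decided in one parallel step:

    # Sketch of the Pixel-Planes idea: evaluate linear edge functions at
    # every pixel "processor" simultaneously. NumPy stands in for the
    # 8,192-processor array; all names here are illustrative.
    import numpy as np

    W, H = 640, 480
    xs, ys = np.meshgrid(np.arange(W), np.arange(H))  # each pixel's (x, y)

    def inside_triangle(v0, v1, v2):
        """Boolean mask of pixels inside a triangle given in consistent winding order."""
        def edge(a, b):
            # Linear expression A*x + B*y + C, evaluated at all pixels at
            # once, is >= 0 on one side of the directed edge a->b.
            A, B = b[1] - a[1], a[0] - b[0]
            C = -(A * a[0] + B * a[1])
            return A * xs + B * ys + C >= 0
        return edge(v0, v1) & edge(v1, v2) & edge(v2, v0)

    mask = inside_triangle((100, 50), (300, 400), (500, 120))
    print(mask.sum(), "pixels covered")  # all pixels decided "simultaneously"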

| Manipulation and Control Devices
A VR system must provide a means for its user to move around in the virtual world to explore; to select objects to be manipulated; and to grasp, carry, paint, throw, or drop them. The simplest control hardware is a conventional mouse, trackball, or joystick. Although these devices are normally used to control the position of a cursor in a two-dimensional (2-D) space (a monitor screen), creative programming can use them for six-dimensional (6-D) controls.17
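One common trick of this kind, sketched below under illustrative assumptions (the key bindings and axis names are hypothetical), is to let a held modifier key select which pair of the six axes the device's two deltas drive:

    # Sketch of "creative programming" that maps a 2-D mouse onto six
    # degrees of freedom: a modifier key selects which pair of axes the
    # mouse's (dx, dy) deltas control. Key bindings are illustrative.
    MODE_MAP = {
        None:    ("x", "y"),         # no modifier: translate in X and Y
        "shift": ("z", "yaw"),       # shift: depth and heading
        "ctrl":  ("pitch", "roll"),  # ctrl: remaining rotations
    }

    pose = {"x": 0.0, "y": 0.0, "z": 0.0, "pitch": 0.0, "roll": 0.0, "yaw": 0.0}

    def apply_mouse_delta(pose, dx, dy, modifier=None):
        """Route the 2-D deltas to the axis pair chosen by the modifier key."""
        axis_h, axis_v = MODE_MAP[modifier]
        pose[axis_h] += dx
        pose[axis_v] += dy
        return pose

    apply_mouse_delta(pose, 5, -3)           # moves x and y
    apply_mouse_delta(pose, 2, 1, "shift")   # moves z and yaw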

There are a number of 3-D and 6-D mice, trackballs, and joysticks on the market. These have extra buttons and wheels that are used to control not just the translation of a cursor in the two dimensions of the screen (X and Y), but its apparent depth (Z) and rotation about all three axes (X, Y, and Z). The Global Devices 6-D Controller is one such 6-D joystick. It looks like a racquetball mounted on a short stick. A user can pull and twist the ball in addition to the left/right and forward/back of a normal joystick. Other 3-D and 6-D mice, joysticks, and force balls are available from Logitech, Mouse Systems Corp., and others.

A less common but more distinctive VR device is the instrumented glove. The use of a glove to manipulate objects in a synthetic environment was pioneered by Thomas Zimmerman at the Atari Research Center. In 1983, after leaving Atari, he began collaborating with Jaron Lanier, improving the design. Together they founded VPL Research, Inc., in 1985 and sold their first product, the DataGlove, to NASA. VPL was granted a U.S. patent for the DataGlove and licensed Mattel to manufacture and sell an inexpensive version, called the PowerGlove, for use with home video games. In 1992 VPL sold the patent to Thomson-CSF of France [174].

The DataGlove is outfitted with sensors on the fingers as well as an overall position/orientation tracker. There are a number of different types of sensors that can be used. VPL Research, Inc., made several types of DataGloves, mostly using fiber optic sensors for finger bends and magnetic trackers for overall position. For a short time Mattel manufactured, under license, a related glove

13 In computer graphics, a texture describes the pigmentation, reflectivity, transmissivity, and smoothness of a surface. This allows the creator of an object file (a digital description of an object or scene to be rendered) to specify very concisely that a surface is to look like polished aluminum, burlap, finished maple, or whatever. The rendering software chooses the color and brightness of each point in the picture (each picture element, or pixel) to achieve the specified effect. Photo-texturing, also called image-map texturing, refers to the use of a digital image to generate a texture for an object surface. A related process, color-mapping, uses a digital image to generate the color or pigment pattern on an object's surface.

14 In Gouraud shading, a color value is calculated for each vertex of a polygon based on the polygon's orientation, the orientation of each polygon that shares the vertex, the pigmentation specified for the part of the object represented by the polygon, and the color of the light source. The color values for interior pixels of a polygon are interpolated from the color values of its vertices in order to yield a smooth shading effect. Rendering an object in this fashion takes longer than simply making all pixels of a polygon the same color, which is called flat shading.
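The interpolation step described in footnote 14 can be sketched directly. In the following minimal example (the function names are illustrative), each interior pixel receives the barycentric blend of the three already-lit vertex colors:

    # Sketch of the interpolation step in Gouraud shading (footnote 14):
    # interior pixels get the barycentric blend of the three vertex colors.
    def barycentric(p, a, b, c):
        """Barycentric weights of point p in triangle (a, b, c)."""
        det = (b[1] - c[1]) * (a[0] - c[0]) + (c[0] - b[0]) * (a[1] - c[1])
        wa = ((b[1] - c[1]) * (p[0] - c[0]) + (c[0] - b[0]) * (p[1] - c[1])) / det
        wb = ((c[1] - a[1]) * (p[0] - c[0]) + (a[0] - c[0]) * (p[1] - c[1])) / det
        return wa, wb, 1.0 - wa - wb

    def gouraud_pixel(p, verts, vert_colors):
        """Blend the per-vertex colors (already lit) at pixel p."""
        wa, wb, wc = barycentric(p, *verts)
        return tuple(wa * ca + wb * cb + wc * cc
                     for ca, cb, cc in zip(*vert_colors))

    verts = [(0, 0), (10, 0), (0, 10)]
    colors = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]  # R, G, B corners
    print(gouraud_pixel((3, 3), verts, colors))  # -> (0.4, 0.3, 0.3), a smooth mixture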

15 A polygon is a plane figure bounded by straight line segments. Most modeling programs used to create object files specify the forms of objects in terms of polygons such as triangles; some also specify curvilinear shapes.

16 Polygons per second (or poly-Hz) is a useful measure of performance for image-generating systems, because more complicated scenes (having more polygons) typically take longer to render, and the resulting frame rate (frames per second) is lower. The poly-Hz metric adjusts for scene complexity in characterizing rendering speed.
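A worked example of the relationship in footnote 16, with illustrative numbers:

    # A card rated at 300,000 polygons per second can sustain 30 frames
    # per second only if each frame's scene stays under 10,000 polygons.
    poly_rate = 300_000        # polygons per second (poly-Hz)
    scene_polygons = 10_000    # complexity of one frame
    print(poly_rate / scene_polygons, "frames per second")  # -> 30.0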

17 The more accurate phrase "six degrees of freedom" (6-DOF) is sometimes used instead. It refers to movement in three directions (for example, forward, right, and up) and rotation about three axes (for example, pitch, roll, and yaw).


input device, the PowerGlove, for use with the Nintendo game system. The PowerGlove is easily adapted to interface to a personal computer. It provides some limited hand location and finger position data, using strain gauges for finger bends and ultrasonic position sensors. These data are used to display the user's hand correctly in the virtual world, and the computer can also be programmed to recognize several hand gestures and interpret them as commands, such as "fly forward." The gloves are getting rare, but some can still be found at Toys 'R' Us and other discount stores.
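Gesture recognition of the kind described here can be sketched as a simple table lookup. In this hypothetical example (the gesture table, sensor scaling, and threshold are illustrative assumptions, not the PowerGlove's actual interface), thresholded finger-bend readings are matched against a small set of hand shapes, each interpreted as a command:

    # Sketch of glove gesture recognition: thresholded finger-bend
    # readings select a command. Table and scaling are illustrative.
    GESTURES = {
        # (thumb, index, middle, ring): True means the finger is bent
        (False, False, True, True):   "fly forward",  # pointing hand
        (True, True, True, True):     "stop",         # closed fist
        (False, False, False, False): "hover",        # open hand
    }

    def classify(bend_readings, threshold=0.5):
        """Map raw 0..1 bend-sensor values to a command, if any."""
        shape = tuple(b > threshold for b in bend_readings)
        return GESTURES.get(shape)  # None when no gesture matches

    print(classify([0.1, 0.2, 0.9, 0.8]))  # -> "fly forward"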

The concept of an instrumented glove has been extended to other body parts. Full body suits with position and bend sensors have been used for capturing motion for character animation, control of music synthesizers, etc., in addition to VR applications.

Some boom-mounted stereoscopic displays, such as the Binocular Omni-Orientation Monitor (BOOM, described below), require the user to grasp and steer the display unit with one or two hands, which makes it difficult to use a glove input device or a joystick to select and manipulate objects.18 Buttons on the BOOM handgrips can be used to perform a limited set of software- and user-determined functions, but for convenience some systems, such as the Multi-Dimensional User-Oriented Synthetic Environment (MUSE™) [30,111,138] and the Virtual Interactive Environment Workspace (VIEWS™) [5] developed at Sandia National Laboratory, have been given a capability to recognize commands spoken by the user.

It is awkward for a BOOM or HMD user who is completely immersed in a virtual world (e.g., repairing a virtual jet engine in a training course) to enter commands to the computer system (e.g., requests for maintenance manual pages) by means of a keyboard. There are several potential solutions to this problem. One is to have the user type on a virtual keyboard using a glove input device.

At present, glove input device gesture recognition is not discriminating enough to approach touch-typing speed and accuracy, and most glove input devices do not provide haptic feedback. An alternative is to use a chording keyboard, such as the KAT (Keyboard-Alternative Technology) keyboard by Robicon Systems, Inc., that the user holds or wears in the palm of one hand. It has a button for each finger and one for the thumb [140]. It provides haptic feedback, and a wearer can use it while walking or suspended in a harness. Yet another approach is to use voice command recognition. This is the most natural approach and works very well if the system is "trained" to recognize the user's voice; performance can be quite good even without any training.
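The principle of a chording keyboard like the one described above is that combinations of a few keys, pressed together, select characters; five keys yield 31 nonempty chords. A minimal sketch, with a hypothetical chord table (not the actual KAT layout):

    # Sketch of chording-keyboard decoding: each 5-bit chord (four
    # fingers plus thumb) selects one character. Table is illustrative.
    CHORDS = {
        0b00001: "a",   # index finger alone
        0b00010: "b",   # middle finger alone
        0b00011: "c",   # index + middle together
        0b10000: " ",   # thumb alone -> space
    }

    def decode(keys_down):
        """keys_down: iterable of key indices (0..4) pressed together."""
        chord = 0
        for k in keys_down:
            chord |= 1 << k
        return CHORDS.get(chord, "?")

    print(decode([0]))     # -> "a"
    print(decode([0, 1]))  # -> "c"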

| Position Tracking
One key element for interaction with a virtual world is a means of tracking the position of a real-world object, such as a head or hand. There are numerous methods for position tracking and control. Ideally, a technology should provide three measures for position coordinates (X, Y, Z) and three measures of orientation (roll, pitch, yaw). One of the biggest problems for position tracking is latency, or the time required to make the measurements and preprocess them before input to the simulation engine.

Mechanical armatures can be used to provide fast and very accurate tracking. Such armatures may look like a desk lamp (for basic position/orientation) or they may be highly complex exoskeletons (for more detailed positions). The drawbacks of mechanical sensors are the encumbrance of the device and its restrictions on motion. EXOS, Inc., builds one such exoskeleton for hand control. It also provides force feedback. Shooting Star system makes a low-cost armature system for head tracking. Fake Space Labs and LEEP Systems, Inc., make much more expensive and

18 Other boom-mounted stereoscopic displays are strapped to the user's head and leave the user's hands free to gesture or type on a virtual keyboard. Such displays are not head-supported but are sometimes called head-mounted.


elaborate armature systems for use with their display systems.

Ultrasonic sensors can be used to track position and orientation. A set of emitters and receivers is used, with known relationships between the emitters and between the receivers. The emitters are pulsed in sequence, and the time lag to each receiver is measured. Triangulation gives the position. Drawbacks to ultrasonics are low resolution, long lag times, and interference from echoes and other noises in the environment. Logitech and Transition State are two companies that provide ultrasonic tracking systems.
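The position fix itself reduces to simple algebra: each measured time of flight gives a range to a receiver at a known location, and differencing the range equations yields a linear system for the emitter position. A minimal sketch, assuming four receivers at illustrative locations:

    # Sketch of position fixing from ultrasonic time-of-flight: ranges to
    # four known receivers give a linear system for the emitter position.
    import numpy as np

    SPEED_OF_SOUND = 343.0  # meters per second, at room temperature

    receivers = np.array([[0.0, 0.0, 0.0],
                          [1.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0],
                          [0.0, 0.0, 1.0]])

    def locate(times_of_flight):
        d = SPEED_OF_SOUND * np.asarray(times_of_flight)  # ranges (meters)
        r0, d0 = receivers[0], d[0]
        # |p - ri|^2 - |p - r0|^2 = di^2 - d0^2 is linear in p:
        A = 2.0 * (receivers[1:] - r0)
        b = (np.sum(receivers[1:] ** 2, axis=1) - np.sum(r0 ** 2)
             + d0 ** 2 - d[1:] ** 2)
        p, *_ = np.linalg.lstsq(A, b, rcond=None)
        return p

    true_p = np.array([0.3, 0.4, 0.2])
    tofs = np.linalg.norm(receivers - true_p, axis=1) / SPEED_OF_SOUND
    print(locate(tofs))  # recovers approximately [0.3, 0.4, 0.2]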

Magnetic trackers use sets of coils that are pulsed to produce magnetic fields. The magnetic sensors determine the strength and angles of the fields. Limitations of these trackers are a high latency for the measurement and processing, range limitations, and interference from ferrous materials within the fields. However, magnetic trackers seem to be one of the preferred methods. The two primary companies selling magnetic trackers are Polhemus and Ascension.

Optical position tracking systems have been developed. One method uses an array of infrared light-emitting diodes (LEDs) on the ceiling and a head-mounted camera. The LEDs are pulsed in sequence, and the camera's image is processed to detect the flashes. Two problems with this method are limited space (grid size) and lack of full motion (rotations). Another optical method uses a number of video cameras to capture simultaneous images that are correlated by high-speed computers to track objects. Processing time (and the cost of fast computers) is a major limiting factor here. One company selling an optical tracker is Origin Instruments.

Inertial trackers have been developed that are small and accurate enough for VR use. However, these devices generally only provide rotational measurements. They are also not accurate for slow position changes.

| Stereo Vision
Stereo vision is often included in a VR system. This is accomplished by creating two different images of the world, one for each eye. The images are computed with the viewpoints offset by the equivalent distance between the eyes. There are a large number of technologies for presenting these two images. One of the simplest is to present the images side by side and ask the viewer to cross her or his eyes while viewing them. There are techniques for assisting the viewer in this unnatural task. The images can be projected through differently polarized filters, with corresponding filters placed in front of the eyes.
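The viewpoint offset is straightforward to compute: the two eye positions are displaced half the interpupillary distance to either side of the head position along the head's "right" vector. A minimal sketch (the render() calls are placeholders for whatever image generator is in use, and the IPD value is a typical figure, not a specification):

    # Sketch of the viewpoint offset used to compute a stereo pair.
    import numpy as np

    IPD = 0.065  # interpupillary distance in meters (a typical value)

    def eye_viewpoints(head_pos, right_vec):
        right = np.asarray(right_vec, dtype=float)
        right /= np.linalg.norm(right)
        offset = (IPD / 2.0) * right
        return head_pos - offset, head_pos + offset  # (left eye, right eye)

    head = np.array([0.0, 1.7, 0.0])  # standing viewer
    left, right_eye = eye_viewpoints(head, [1.0, 0.0, 0.0])
    # left_image  = render(scene, viewpoint=left)       # placeholder
    # right_image = render(scene, viewpoint=right_eye)  # placeholder
    print(left, right_eye)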

Another technique is to present anaglyphic images (anaglyphs), which are false-color images analogous to double exposures. Each anaglyph is a superposition of a red image intended to be seen by one eye and a blue image intended to be seen by the other eye. The anaglyph is designed to be viewed with special eyeglasses having a red filter for one eye and a blue filter for the other eye; the viewer sees a monochromatic image with apparent depth.
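Composing an anaglyph as just described amounts to routing each eye's view to one color channel. A minimal sketch, assuming NumPy image arrays:

    # Sketch of anaglyph composition: the red channel of the output comes
    # from the left-eye image and the blue channel from the right-eye
    # image, so red/blue glasses route each image to one eye.
    import numpy as np

    def make_anaglyph(left_rgb, right_rgb):
        """left_rgb, right_rgb: HxWx3 arrays for the two eye views."""
        gray_l = left_rgb.mean(axis=2)   # luminance of each view
        gray_r = right_rgb.mean(axis=2)
        out = np.zeros_like(left_rgb)
        out[:, :, 0] = gray_l            # red channel <- left eye
        out[:, :, 2] = gray_r            # blue channel <- right eye
        return out

    left = np.random.rand(480, 640, 3)
    right = np.random.rand(480, 640, 3)
    print(make_anaglyph(left, right).shape)  # (480, 640, 3)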

The two images can be displayed sequentially on a conventional monitor or projection display. Liquid crystal (LC) shutter glasses are then used to shut off alternate eyes in synchronization with the display. When the brain receives the images in rapid enough succession, it fuses the images into a single scene and perceives depth. A fairly high display swapping rate (at least 60 Hz) is required to avoid perceived flicker. A number of companies made low-cost LC shutter glasses for use with TVs (Sega, Nintendo, Toshiba, etc.). Software and diagrams for circuits to interface these glasses to a computer are available on many computer bulletin-board systems (BBSs) and Internet FTP19 sites. However, locating the glasses themselves is getting difficult, as none are still being made or sold for their original use. StereoGraphics

19 File Transfer Protocol: a standardized communications procedure for transferring a data file from one computer to another over the Internet.


Corp. sells a model, called CrystalEyes, that weighs only 85 g. It is considered to represent the "high end" of the spectrum of shutter glasses, both in price and in quality [98,158]. One reviewer, however, described them as "incredibly fragile" as well as "very nice."

Another method for creating stereo imagery on a computer is to use one of several split-screen methods. These divide the monitor into two parts and display left and right images at the same time. One method places the images side by side and conventionally oriented. It may not use the full screen or may otherwise alter the normal display aspect ratio. A special hood viewer is placed against the monitor, which helps position the eyes correctly and may contain a divider so each eye sees only its own image. Most of these hoods, such as the Fresnel stereopticon viewer included (along with software) with the book Virtual Reality Creations [158], use Fresnel lenses to enhance the viewing. An alternative split-screen method orients the images so the top of each points out the side of the monitor. A special hood containing mirrors is used to correctly orient the images. Simsalabim Systems sells a unit of this type, the Cyberscope, for under $200 [156]. Split-screen or dual-screen methods are used in head-mounted displays, which are discussed in the next section.

| Head-Mounted Displays
One hardware device closely associated with VR is the head-mounted display (or device), which uses some sort of helmet or goggles to place a small video display in front of each eye, with special optics to focus and stretch the perceived field of view. Most HMDs use two displays to provide stereoscopic imaging. Others use a single larger display, sacrificing stereoscopic vision in order to provide higher resolution.

HMDs attracted public attention in 1991, when W Industries, Ltd. (now called Virtual Entertainment, Inc., or VEI) sent its networked Stand-Up Virtuality arcade machines, equipped with the Visette HMD (introduced late in 1990 [73]), on a tour of the United States [140]. VEI recently introduced its second-generation Stand-Up Virtuality machine [73].

Most lower priced HMDs (those costing $3,000 to $10,000) use LCDs. Others mount small CRTs, such as those found in camcorders, alongside the head. The price of low-end liquid-crystal HMDs is decreasing, but their resolution is inadequate for many applications. One of the least expensive, listed at $750, is "The Alternative" HMD, which weighs about 6 pounds and provides a monoscopic view using a 3.8-inch-diameter LCD with an effective resolution of 240 pixels horizontally by 146 pixels vertically. The actual LCD resolution is greater, but a grouping of three monochrome pixels, one for each of three primary colors, is necessary to produce one color pixel. The field of view is approximately 70 degrees horizontally by 60 degrees vertically. For comparison, an NTSC20 color television displays 525 lines (or pixels) vertically, with a field of view that depends on the distance from the viewer to the screen but is typically less than 60 degrees vertically. Thus the low-end LCD provides a less detailed (more grainy or blurred) picture than does an ordinary television, but over a larger field of view.

There are many higher priced liquid-crystal HMDs on the market. An example, one of three winners of a CyberEdge Journal 1993 Product of the Year Award in the hardware category, is the VIM™ personal viewer™ developed by Kaiser Electro-Optics, Inc., of Carlsbad, California. The Model 500pv VIM™ personal viewer™ uses full-color multiple active matrix LCDs, currently manufactured in Japan, with 232,300 color elements (77,433 color groups) providing a 50-degree (horizontal) field of view with a 4:3 aspect

20 National Television Standards Convention. This standard, proposed by the National Television System Committee, specifies details of television signal format, such as scan lines per frame, frames per second, and interlacing of scans, that U.S. television broadcasters have used for over four decades.


ratio. It accepts two channels of NTSC TV video input. It weighs 24.5 ounces and lists for $2,975. Kaiser's Model 1000pv VIM™ personal viewer™ is similar but has twice as many pixels, provides a 30-degree (vertical) by 100-degree (horizontal) field of view, and uses a custom video format. It weighs 26.5 ounces and lists for $9,975 [104,86].

The more expensive HMDs ($60,000 and up) use special CRTs. Examples are the Color SIMEYE™ HMD systems produced by Kaiser Electro-Optics, Inc. The SIMEYE™ 40 HMD is capable of displaying full stereo imagery (i.e., with 100-percent overlap of the images from the two CRTs) over a 40-degree circular field of view, or of providing a 40-degree (vertical) by 60-degree (horizontal) field of view with full stereo imagery in the central 20 degrees. It supports several video formats, including 1,280 by 1,024 pixels 2:1 interlaced (in color, or noninterlaced monochrome) at a frame rate of 50 or 60 Hz. It weighs 4.5 pounds and is listed at $145,000 (or $128,000 each, in quantity).

The SIMEYE™ 60 HMD uses a different optical relay system to display full stereo imagery over a 60-degree circular field of view, or it can provide a 60-degree by 80-degree field of view with full stereo imagery in the central 40 degrees, or a 60-degree by 100-degree field of view with full stereo imagery in the central 20 degrees. It weighs 5.1 pounds and is listed at $165,000 ($146,000 each in quantity). Kaiser will lease either SIMEYE™ model at $13,000 per month for one year or $7,500 per month for two years. Both SIMEYE™ models are see-through, i.e., they superimpose images of virtual objects on the actual visible background [103]. SIMEYE™ HMDs have been purchased by the U.S. Air Force, the U.S. Navy, NASA, the Australian Defence Force, and others [86].

Another type of HMD, called a fiber-optic HMD (FOHMD), uses a bundle (viz., cable) of optical fibers to pipe the images (e.g., one for each eye) from nonhead-mounted displays, either CRTs or LCDs. The Army Research Institute at Fort Rucker, Alabama, is experimenting with a FOHMD developed by CAE Electronics, Ltd., of Montreal, Canada. The Naval Air Systems Command (PMA-205) is investigating the effectiveness of FOHMD technology for naval applications [210].

One of the most ambitious concepts for an HMD is the Virtual Retinal Display (VRD) mentioned by Thomas Furness in the 1991 Senate hearing on virtual reality [65]. As of mid-1993, the University of Washington's Human Interface Technology Laboratory had built a bench-mounted technology demonstrator producing a 500 by 800-pixel monochromatic image using a single diode laser, an acousto-optical horizontal scanner, and an electromechanical vertical scanner. Planned improvements include use of an array of multicolored diode lasers (or incoherent light-emitting diodes, in a different version) to generate a color image with almost 12 million pixels, and different scanning methods that promise a reduction in size and weight [113,126].

An HMD requires a head position and (especially) orientation tracker, so that the graphics processor can determine the field of view to be rendered. A tracking device can be mounted on the helmet (or head). Alternatively, the display can be mounted on a boom for support and tracking.

There is some concern about potential health hazards from HMDs. There have been a number of anecdotal reports of HMDs and other stereoscopic displays causing stress, but there have been few good clinical studies of the hypothesis. One of them, a study by the Edinburgh Virtual Environment Laboratory of the Department of Psychology of the University of Edinburgh, found that stereoscopic HMDs caused stress in many subjects. The basic test was to put 20 young adults on a stationary bicycle and let them cycle around a virtual rural road setting using an HMD (a VPL LX EyePhone and a second HMD equipped with LEEP optics). After 10 minutes of light exercise, the subjects were tested. An article in the lay press summarizing the results said, "The results were alarming: measures of distance vision, binocular fusion and convergence displayed clear signs of binocular stress in a significant number of the subjects. Over half the subjects also reported symptoms of such stress, such as blurred vision" [79].


The article described the primary reason for the stress: the difference between the image focal depth and the depth indicated by disparity. Normally, when a person's eyes look at a close object they focus (accommodate) close and also rotate inward (converge). When they accommodate on a far object, the eyes look more nearly in the same direction. A stereoscopic display, however, varies the disparity to convey depth without changing the effective focal plane, which is set by the optics. The eyes strain to decouple the accommodation and convergence signals. The article discussed some potential solutions but noted that most of them are difficult to implement. It speculated that monoscopic HMDs, which were not tested, might avoid the problems. The article did not discuss non-HMD stereoscopic devices.

Olaf H. Kelle of the University of Wuppertal in Germany reported that only 10 percent of his users showed signs of eye strain [98]. His system has a three-meter focal length, which seems to be a comfortable viewing distance. Others have noted that long-duration monitor use often leads to the user staring or not blinking. Video display terminal users are often cautioned to look away from the screen occasionally in order to adjust their focal depth, and to blink.

John Nagle provided the following list of other potential problems with HMDs: electric shock or burn, tripping over real objects, simulator sickness (disorientation caused by conflicting motion signals from eyes and inner ear), eye strain, and induced post-HMD accidents: with "some flight simulators, usually those for military fighter aircraft, it has been found necessary to forbid simulator users to fly or drive for a period of time after flying the simulator" [98]. In some cases "flashbacks" occurring some hours after spending hours in a simulator have nearly caused fatal automobile accidents. "We have gone too fast technologically," opined researcher Robert Kennedy, who has studied simulator sickness for 15 years [101].

| Haptic Transducers
Sensations in the skin (pressure, warmth, and so on) are haptic perceptions. In the context of virtual reality, haptics refers to the design of clothing or exoskeletons that not only sense motions of body parts (e.g., fingers) but also provide tactile and force feedback for haptic perception of a virtual world. If users are to feel virtual objects, haptic transducers are needed to turn the electrical signals generated by a computer into palpable stimuli.

Haptics is less mature than are stereo image and 3-D sound generation, partly because haptic perception is less important than visual perception and audio perception for most tasks, partly because haptic perception is less well understood, and partly because some transduction tasks that may be imaginable and desirable are simply quite difficult to achieve. Others are quite simple. For example, treadmills have been used for years in walk-throughs of virtual architectural prototypes; walking on the treadmill makes the scene presented in a walker's HMD change and also gives the walker a realistic sensation of exertion. Similarly, stationary bicycles have been used in arcade systems to allow a customer to pedal through a virtual world displayed on a screen (or to pilot a virtual pterodactyl).

Semiconductor thermoelectric (TE) heat pumps are being incorporated into gloves to allow wearers to feel the temperature of virtual objects [28,56]. Shape-memory metal wires have been incorporated into gloves and electrically actuated to provide limited force feedback. Larger, heavier electromechanical actuators, in some cases on exoskeletons, provide more forceful feedback [160]. One of the first such devices was the GROPE-II virtual molecule manipulator developed at the University of North Carolina and completed in 1971. Its effectiveness was severely limited by the computational power available at that time. Its successor, GROPE-III, begun in 1986, was better and proved to be a useful research tool for chemists. Specifically, the haptic feedback that it provided reduced the time required for chemists to perform certain subtasks necessary to find the best (i.e., lowest potential energy) site for docking a virtual drug molecule to a virtual enzyme molecule [16]. Overall elapsed time was not significantly reduced, but the subjects were generally able to find better (i.e., lower potential energy) docking sites than were found by a common computational technique that does not require visual or haptic human-computer interaction.

Today haptic interface technology is still little used; most visitors to virtual worlds cannot feel and wield virtual instruments, tools, and weapons. If technological advances make haptic interfaces lighter, smaller, less expensive, and more realistic, they could see wider use in military applications, including aircrew training, telesurgery, and assessment of the design of habitable spaces on ships.

Analysts at the Center for Naval Analyses have described the need for tactile feedback in aircrew training devices as follows [210]:

. . . in a virtual cockpit, an image of the user's hand would be displayed (on either a CRT or an HMD) along with a virtual control panel. The user interacts with buttons and switches on the control panel by moving his hand to manipulate them.

When a virtual switch has been flipped, the user should be given some form of feedback or indication. This can take the form of visual, auditory, or tactile cues. But unless the system provides tactile feedback, the user will not feel any resistance as the object is touched. Without tactile cues, the user may be uncertain whether contact has been made with the switch.
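The substitution of cues described in the quotation can be sketched simply: the system detects hand-switch contact and, lacking a haptic channel, issues a visual or auditory cue instead. All names in the following sketch are illustrative placeholders:

    # Sketch of contact detection with substitute (visual/auditory) cues.
    def hand_touches_switch(hand_pos, switch_center, half_size=0.02):
        """Axis-aligned proximity test between fingertip and switch (meters)."""
        return all(abs(h - s) <= half_size
                   for h, s in zip(hand_pos, switch_center))

    def update(hand_pos, switch):
        if hand_touches_switch(hand_pos, switch["center"]):
            switch["state"] = not switch["state"]
            play_click_sound()   # auditory cue (placeholder)
            highlight(switch)    # visual cue (placeholder)
            # with a haptic glove, a tactile pulse would be commanded here

    def play_click_sound(): pass  # stubs so the sketch runs
    def highlight(switch): pass

    sw = {"center": (0.10, 0.20, 0.30), "state": False}
    update((0.11, 0.20, 0.29), sw)
    print(sw["state"])  # -> True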

There is a similar need in surgical simulators and systems for minimally invasive surgery and telesurgery. Researchers at the Georgia Institute of Technology are trying to simulate surgery on the human eye with force feedback. Some researchers speculate that force feedback may be useful in certain types of endoscopic oncological surgery,21 because cancerous tissue is often detected by touch [3]. A solicitation issued by DOD's Advanced Research Projects Agency in January 1994 [179] announced the creation of a program in Advanced Biomedical Technology, stating that:

The purpose of this effort is to develop and transition these technologies [remote sensing, secure broad bandwidth communications, teleoperation, advanced imaging modalities, virtual reality and simulation] to the medical arena to reduce the battlefield casualty rate by an order of magnitude through early intervention. A primary objective is to demonstrate that critical medical information can be acquired by body-worn, noninvasive sensors and rapid therapeutic intervention can be applied through telemedicine.

ARPA solicited proposals in three technology areas. In the area of Remote Therapeutic Intervention, ARPA sought "a capability to perform accurate surgical procedures from a central workstation to a mobile remote operative site (which contains all requisite logistical support)":

The ideal workstation would have 3-D video, at least 6 degree of freedom (DOF) master, and sensory input. The remote site must have 3-D camera, 6 DOF manipulator slave, and tactile, force feedback and audio sensors. Manipulator latency must be less than 100 milliseconds. The interface should be intuitive and transparent, and support multiple digital imaging technologies.22

A more recent ARPA solicitation [180,181,182] sought proposals to develop and demonstrate technology for "simulation-based design"

21 In endoscopic surgery, a surgeon guides an optical-fiber cable through an orifice or incision in the patient. Some fibers carry light from a light source to illuminate the cavity or tissue just beyond the end of the cable; other fibers carry reflected light back to a video camera outside the patient, which sends its video signal to a monitor watched by the surgeon. The tip of the same cable, or a separate one, may have a surgical instrument that is manipulated by the surgeon from outside the body.

22 ARPA also solicited proposals in the area of medical simulation, saying "The next critical advancement in medical education appears to require a virtual environment (cadaver) of human anatomy that permits education in basic and abnormal (e.g., military wounds) anatomy. Such a simulation should have realistic interaction capable of supporting surgical simulation for training. There is also interest in a virtual environment for individual simulated soldiers and medics which is compatible with SIMNET for use in medical forces planning and training. These various components must be individually useful while also integratable into a single large medical support system which uses ATM/SONET technology in a gigabit-rate network."


(SBD) of complex systems such as naval ships. One of many possible applications of VR to this task is the experiential prototyping [118] of ships to assess the design of habitable spaces. Bumping into pipes, stanchions, bulkheads, and so forth is a real hazard that cannot be assessed readily by VR without force feedback. A VR system can detect whether a user bumps her head into a virtual pipe even if the user cannot feel it, but if the user cannot feel it, the user will not learn the adaptive behavior that, on a real ship or a mockup, would tend to reduce the incidence of such collisions.

Haptics remains a challenging research area, and a potentially dangerous one. Too much force feedback by an exoskeleton could crush a wearer's hand. It may be a long time before a pilot wearing an HMD and a force-feedback suit and gloves will be able to reach up and feel his virtual canopy, or the effort of moving his virtual stick or throttle. When it does become feasible, it may not be as cost-effective as simply building a cockpit mockup for the pilot to sit in, the approach used in Link trainers and even earlier aircrew training devices [139, fig. 3-1].

| Degrees of Virtual Reality
Although there is a bewildering variety of virtual reality systems that differ from one another in details of hardware and software, connoisseurs like to sort them into a few categories describing the degree or level of VR that they provide. The price tag is often a good index, but other levels have been defined. The following are categories defined by Jerry Isdale; other writers have proposed different categories.

An entry-level VR system takes a stock personal computer or workstation and implements a WOW system. The system may be based on an IBM clone (MS-DOS/Windows) machine or an Apple Macintosh, or perhaps a Commodore Amiga. The DOS-type machines (IBM PC clones) are the most prevalent. There are Macintosh-based systems, but few very fast-rendering ones. Whatever the computer, it includes a graphic display; a 2-D input device like a mouse, trackball, or joystick; the keyboard; a hard disk; and memory. The term desktop VR is synonymous with entry-level VR.

Although desktop VR systems rank lowest in performance, they are by far the most common type of VR system. The promise of virtual reality for making desktop computers user-friendly, and thereby increasing their effectiveness for education and other uses, including military uses, will depend to an important degree on the quality, variety, and cost of desktop VR software.

Stepping up to basic VR requires adding some basic interaction and display enhancements. Such enhancements would include a stereographic viewer (LCD shutter glasses) and an input/control device such as the Mattel PowerGlove and/or a multidimensional (3-D or 6-D) mouse or joystick.

The next step up, advanced VR, adds a rendering accelerator and/or frame buffer and possibly other parallel processors for input handling, etc. The simplest enhancement in this area is a faster display card. Accelerator cards for PC-class machines can make a dramatic improvement in the rendering performance of a desktop VR system. More sophisticated and costly image processors use an array of processors to achieve stunning realism in real-time animation. An advanced VR system might also add a sound card to provide mono, stereo, or true 3-D audio output. With suitable software, some sound cards can recognize a user's spoken commands to fly, turn left, stop, grasp an object, and so forth, and translate them into electronic commands that the virtual environment system can respond to. This is especially useful for immersive VR systems, the next higher (and highest) degree of virtual reality.

An immersive (or immersion) VR system adds some type of immersive display system, such as an HMD, a BOOM, or multiple large projection displays (as in the CAVE). A command- and data-entry system other than an ordinary keyboard is a very useful adjunct to immersive systems; a voice command recognition system or a single-handheld chording keyboard are options. An immersive VR system might also add some form of haptic interaction mechanism so that the user can feel virtual objects. Some writers further classify


immersive systems as garage VR systems, if they are constructed in a labor-intensive manner from relatively inexpensive components and subsystems, or high-end VR systems, if they use expensive commercial equipment.

| Shared Virtual Reality
Jaron Lanier, then-president of VPL Research, Inc., emphasized at the 1991 Senate hearing on VR that "it is important to remember that virtual reality is a shared social experience, very much like the everyday, physical world" [118]. VPL Research, founded in 1985, pioneered the development of software and equipment, such as its EyePhone HMD, DataGlove glove input device, and DataSuit (a full-body version of the DataGlove), with which several persons could experience a virtual world in which they could see and hear one another. Earlier, DOD projects from SAGE to SIMNET allowed several persons to experience a virtual world in which each could see others' actions on a radar screen (SAGE) or simulated periscope (SIMNET).

Increasingly, similar networking is being done for entertainment, both in arcades [73] and from home. Enthusiasts on different continents may dial one another to play Capture (capture the flag) in virtual armored vehicles [73], Energy Duel (a dogfight), or Cyberball [206]. One can dial a host computer to play cat-and-mouse (wearing a cat or mouse face) with another player in a three-dimensional "Polygon Playground" [42] or drive a virtual tank and fight (and talk to) others in the Parallel Universe multi-player demonstration offered by the Virtual Universe Corporation of Alberta, Canada. Office workers whose workstations are linked by local-area networks (LANs) team up after hours to fight Former Humans, Former Human Sergeants ("much meaner, and tougher"), Imps, Demons, Spectres, and the like on a virtual military base on Phobos, in the popular computer game DOOM (by id Software). Others link up by LAN or Internet to dogfight in virtual fighter planes (in Airwarrior).

All this requires networks, and networking requires standards. Networked DOD simulations, from those of SAGE to those of SIMNET, have used the public telephone network. Today, a variety of other networks are also used, including ad-hoc LANs, the Internet,23 and the Defense Simulation Internet (DSI).

The DSI was originally known as the Terrestrial Wide Band Network (TWBNet), which was part of the North Atlantic Treaty Organization's (NATO's) Distributed Wargaming System (DWS), which was sponsored by DARPA and by NATO's Supreme Allied Commander-Europe (SACEUR). TWBNet was orphaned briefly after the Cold War ended but was later adopted by NATO's Warrior Preparation Center (WPC), DARPA's SIMNET project, DOD's Joint Warfighting Center (JWFC), and the Walker Simulation Center in the Republic of Korea. The DSI has grown to over 100 nodes and is now being managed jointly by DOD's Advanced Research Projects Agency (ARPA), which developed the technology, and the Defense Information Systems Agency (DISA), which will be responsible for the DSI when it is fully operational.

The DSI (together with other networks) has linked heterogeneous simulators in exercises that have included engagements of simulated tanks, other armored vehicles, ships, helicopters, fixed-wing aircraft, cruise missiles, and teams of Marines equipped with laser designators to designate targets for Air Force aircraft equipped with laser-guided bombs. Simulators physically located in the continental United States, Germany, and the Republic of Korea have participated in the same exercise, operating on the same virtual battlefield. The DSI is used for video teleconferencing (VTC)

23 The Internet is a collection of many networks that exchange digital information using the TCP/IP suite of protocols, including the Transmission Control Protocol (TCP), the Internet Protocol (IP), the File Transfer Protocol (FTP), the Simple Mail Transfer Protocol (SMTP), and others.


in support of traditional wargaming, as well as for distributed interactive simulation.24

The SIMNET program defined a communications protocol (a specification of electronic message types and formats, and other matters) to be used by vehicle simulators so they could interoperate over a network and share the same virtual battlefield. The SIMNET protocol is still used by some simulators, but it has been superseded by IEEE Standard 1278-1993, which is based on the SIMNET protocol but defines additional message types to permit simulation of additional types of vehicles and phenomena.
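The essence of such a protocol is a fixed binary message layout that every simulator can pack and unpack. The toy layout below is illustrative only; it is not the actual IEEE 1278 protocol data unit format, and the field choices are assumptions made for the sketch:

    # Simplified sketch of a protocol data unit carrying an entity's
    # identity and state, so heterogeneous simulators can interoperate.
    import struct

    ENTITY_STATE_FORMAT = "!HHH ddd fff"  # big-endian: IDs, position, velocity

    def pack_entity_state(site, application, entity, position, velocity):
        return struct.pack(ENTITY_STATE_FORMAT, site, application, entity,
                           *position, *velocity)

    def unpack_entity_state(data):
        fields = struct.unpack(ENTITY_STATE_FORMAT, data)
        return {"entity_id": fields[0:3],
                "position": fields[3:6],
                "velocity": fields[6:9]}

    pdu = pack_entity_state(1, 7, 42, (1200.0, 450.0, 10.0), (5.0, 0.0, 0.0))
    print(unpack_entity_state(pdu))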

OBSERVATIONS

| Synthetic Environment Technology Is Dual-Use
Much of the hardware and software used by DOD for combat simulation, including synthetic environment technology, has commercial application. DOD appears to be using or adapting commercial equipment wherever possible, in order to reduce cost and acquisition time, and it has participated in the deliberations of nongovernment standards bodies, such as the Institute of Electrical and Electronics Engineers (IEEE), to steer industry standards in directions responsive to DOD's needs. IEEE Standard 1278-1993, for distributed interactive simulation communication protocols, is one product of that process.

DOD will have to pay for the development of some simulator software and hardware for which there is no commercial market, and for the validation of the software and hardware that it uses. However, the location-based entertainment (i.e., arcade) industry provides a surprisingly large and rapidly growing commercial market for distributed interactive combat simulators. Some of the products being developed for this market are very sophisticated, realistic, and expensive. An example is the Hornet-1 F/A-18 flight simulator developed by Magic Edge. Its user sits in a cockpit in a capsule on a three-degree-of-freedom motion base that can roll the capsule 60 degrees left or right, pitch the nose up 45 degrees or down 25 degrees, and provide 32 inches of vertical motion. The user looks out the canopy at a projection screen on which a state-of-the-art, high-end computer image generation system projects a scene of sky, ground, and up to five other F/A-18s controlled by the pilots of other Hornet-1 simulators, while a four-channel surround-sound synthesizer provides a background of engine whine and tactical radio traffic.

To be commercially successful, the Hornet-1 need only be sufficiently realistic, or at least entertaining, to draw a crowd. It need not have the fidelity required to train Navy pilots to land safely on aircraft carrier decks. Future Navy ATDs may use similar hardware and software, but DOD must bear the cost of determining whether they faithfully represent critical aspects of reality (i.e., DOD must validate the simulation) and deciding whether they are acceptable for the intended purpose (i.e., DOD must accredit the simulators). Most likely, in order to be accredited, some of the software (and perhaps hardware as well) may need to be designed to DOD specifications.

DOD will also have to pay for the communications infrastructure or service (or combination of infrastructure and service) required to link simulators with low latency. The Distributed Interactive Simulation (DIS) Steering Committee has reported that the primary bottleneck has been local-area networking: the connection of simulators to others nearby and to wide-area network nodes [43].

The DSI is being expanded to meet the need for wide-area networking. Access to the DSI costs $150,000 to $300,000 per node per year [43]. DOD has used, and will continue to need to use, the rapidly growing commercial, academic, and nonmilitary government networks, such as those that are part of the Internet, for wide-area networking.

24 SIMNET, DIS, and the DSI will be discussed in greater detail in a forthcoming OTA background paper.


One reviewer of a draft of this paper stated that the biggest problem on the DSI has been the encryption of data for transmission over wide-area networks; he argued that DOD should rely as much as possible on commercial technology and focus its spending on development of technology addressing needs peculiar to DOD, such as security and DIS protocol support for use with commercial multicast equipment: hardware and software that examines the addresses of digital messages (e.g., simulation data) and makes copies and re-addresses them to send to particular computers (e.g., the processors of simulators participating in a particular simulation).

The Naval Postgraduate School Network (NPSNET) Research Group has been experimenting with multi-vehicle combat simulations in which data are formatted into SIMNET or DIS Protocol Data Units (PDUs) [97] and multicast to participating simulators over the Multicast Backbone (MBONE or MBone), an experimental virtual network operating on portions of the physical Internet [23], or using MBONE protocols [214]. The MBONE is being developed to transmit live audio and video (e.g., from a rock concert) in digital form to a large audience on the Internet. It uses special procedures to insure that the digital messages (packets) containing the audio and video signals will be delivered faster than, say, those containing parts of e-mail messages. These features are just what is needed for networked multi-vehicle combat simulation, and it costs little to set up an MBONE router (software is free and can run on a personal computer) and connect it to the MBONE.
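The attraction of multicast for this traffic is that one datagram addressed to a group reaches every subscribed simulator, so the sender need not know the receivers. A minimal sketch using standard IP multicast sockets (the group address, port, and payload are illustrative assumptions):

    # Sketch of multicasting simulation traffic, in the spirit of the
    # NPSNET experiments.
    import socket

    GROUP, PORT = "224.2.0.1", 6994
    TTL = 16  # how many router hops the datagram may cross

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, TTL)
    sock.sendto(b"entity-state PDU bytes here", (GROUP, PORT))

    # A receiving simulator joins the group instead of polling each peer:
    recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    recv.bind(("", PORT))
    membership = socket.inet_aton(GROUP) + socket.inet_aton("0.0.0.0")
    recv.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)
    # data, sender = recv.recvfrom(2048)  # would block until a PDU arrives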

Internet access providers are now debating the merits of charging for data transfer (rather than for access privilege, as is now done) in dollars per bit, depending on the availability required and the latency that can be tolerated. If such a pricing scheme is adopted, DOD might be able to pay for the connectivity, bandwidth, and availability that it requires when it needs it, for example, during exercises. DOD will also continue to need some dedicated links, but perhaps fewer as a proportion of the whole.

| Cost-Effectiveness Is Increasing
The scale (i.e., the number of transistors) of microprocessors continues to increase roughly in accordance with Moore's law, i.e., it doubles every two years. The information storage capacity of memory chips shows a similar trend, and CD-ROMs and other media are now commonly used for high-capacity archival storage. Flat-panel display chips, such as LCDs, active-matrix LCDs, and micromechanical mirror arrays, show a similar trend. These trends and others (including parallel processing, networking, and production of software to exploit the new hardware to full advantage) allow new simulators and simulator networks to simulate processes with unprecedented realism and speed.
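A worked example of the doubling trend (the base figure is illustrative, roughly that of an early-1990s microprocessor):

    # A quantity that doubles every two years grows by 2**(t/2) in t years.
    def projected(value_now, years, doubling_period=2.0):
        return value_now * 2 ** (years / doubling_period)

    # e.g., a 3-million-transistor chip in 1994, projected six years ahead:
    print(int(projected(3_000_000, 6)))  # -> 24,000,000 transistors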

Intel envisions that by 2000 it will be able to produce a "Micro 2000" processor chip with four dies (pieces of silicon), each containing 80 to 100 million transistors switching on and off 250 million times per second (250 MHz). The multiprocessor chip will be able to execute over two billion instructions per second. Newer, higher-density chips cost more than earlier chips, but the cost per unit of performance (instructions per second for a processor, or storage capacity in bits for a memory chip) has been decreasing.

As a result, the average cost of a fully immersive virtual reality system (one with head tracking, a head-mounted display, and "three-dimensional" sound) declined from about $100,000 in early 1991 to about $50,000 two years later. As the price halved, sales increased tenfold, from 50 to 500, probably the result of both the price elasticity of demand and the contagiousness of technology adoption. An industry forecast predicts that average cost will decline to about $10,000 in 1998, when sales will reach 16,000 systems. This volume (if not the revenue: $160 million) will be dwarfed by the volume of "desktop" virtual reality systems that provide interactivity with simulated 3-D objects using, in some cases, only ordinary office workstations running special software, some of which is distributed free


on the Internet and on dial-up computer bulletin-board systems.

This trend in virtual reality systems is reflected in other computer-intensive simulators as well. Technological advances have allowed prices to fall while performance increases. Many proposed applications of high-tech simulation technology that were not heretofore cost-effective now are or soon will be, but such questions as "which?" and "when?" must be answered on a case-by-case basis.

CHALLENGES

| High-Density, Color Flat-Panel Displays
A widely recognized challenge for the development of some types of virtual simulators is the development of high-density, color, flat-panel displays for use in head-mounted displays (HMDs) and other applications. "High-density" in this context refers to the number of picture elements (individual dots, called pixels or pels) per inch or per square inch. In some applications, e.g., wall-mounted television screens, a detailed picture may be displayed by using a large display screen. In order for an HMD to display a similarly detailed picture, the picture elements must be closer together, so the display is small enough and light enough to wear.

LCDs currently used in HMDs provide only about a fourth of the visual acuity obtainable in domed flight simulators, which remain the technology of choice for tactical aircrew training devices. The displays must be lightweight to be useful (or competitive) for HMDs, and they must be less expensive if they are to find wider application. The best LCDs for HMDs are currently made in Japan; American economic competitiveness would benefit if U.S. industry could manufacture a competitive product, a point made often in the 1991 hearings on virtual reality. Since then, DOD has funded a number of projects intended to stimulate domestic production of world-class flat-panel displays for use in military systems, as well as in commercial systems, which would reduce the costs to DOD for purchasing similar or identical displays.25

A recent Center for Naval Analyses memorandum to the Navy's Director for Air Warfare suggested [210]:

The best approach for the Navy would be to leverage the developments in virtual environment technologies being pursued by the entertainment industry. Once that technology—particularly the technology for head-mounted displays—matures, applications for aircrew training should be considered.

| Fast Head Tracking
When an HMD wearer turns his or her head quickly, it takes a fraction of a second for the display to catch up and show the scene as it should appear from the new direction of gaze. The wearer sees the scene pan to catch up after head rotation has stopped. This effect is called swimming. It not only destroys (or inhibits) the illusion that the HMD wearer is in a virtual world, it can also cause simulator sickness, which is related to motion sickness.

Some of the delay, called latency, may be the result of slow image rendering by the CIG system. However, when very fast CIG systems are used, the latency of the head tracking device becomes the limiting factor. Several technologies, described above, have been used for head tracking; magnetic tracking systems have been prominent in high-end VR systems.
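A rough worked example of why latency produces swimming: during the lag, the head keeps turning, so the scene is rendered for a stale gaze direction, and the angular error is approximately the head's angular rate times the latency (the figures below are illustrative):

    # Angular "swim" error is roughly head rate times end-to-end latency.
    def swim_error_degrees(head_rate_deg_per_s, latency_ms):
        return head_rate_deg_per_s * (latency_ms / 1000.0)

    # A brisk 100 deg/s head turn with 50 ms of end-to-end latency:
    print(swim_error_degrees(100.0, 50.0), "degrees of lag")  # -> 5.0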

Optical trackers now in development are a promising alternative to the magnetic trackers widely used with HMDs. One type of optical tracker uses one or more TV cameras on the HMD to transmit images of the scene (possibly including pulsing infrared light-emitting diodes mounted on the ceiling) to a video processor that calculates the position and orientation of the camera by analyzing the positions of the infrared LEDs in

25 These initiatives will be described in an appendix on flat-panel displays in the volume of case studies, to be published in 1995, that will supplement OTA's final report on its assessment of civil-military integration [169].


the TV image it produces. By adding "offsets," the processor also calculates the position and orientation of the wearer's head (which may be needed for correct body representation in a shared virtual world) and the position and orientation of the wearer's eyes (the points of view from which the stereoscopic images should be rendered). Optical tracking can be very fast, but an optical tracker is heavier than a magnetic tracker (this may change) and requires more computation, making it expensive. It allows the wearer freedom of motion within a large volume, but if it requires an array of light-emitting diodes, the cameras' view of them cannot be obstructed. Research on this challenge continues.

| Wideband Networks With Low Latency
Another challenge is that of building the data communications infrastructure required: a network connecting all the simulators (as well as any real vehicles, weapons, and other systems) that are to participate in a simulation, capable of transferring large quantities of information at aggregate rates that, using the DIS protocol, must grow at least in proportion to the number of participating entities (simulators, etc.). The network need not be dedicated to combat simulation; it can be shared with other users and used for other tasks, but it would be wasteful if a large simulation were to be interrupted by, say, a digital movie being transferred over a link of the network for leisure-time viewing. As was noted above, under Synthetic Environment Technology Is Dual-Use, nondedicated networks will continue to complement the Defense Simulation Internet. Wider use (and improvement) of multicasting protocols can reduce the network bandwidth required for simulations (and other uses), and proposed Internet subnetwork usage-pricing schemes may allow DOD to pay for the low latency it needs for simulations.
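The proportional growth of the aggregate rate can be made concrete with a back-of-the-envelope calculation. The PDU size and update rate below are illustrative assumptions, not figures from the DIS standard:

    # Each entity periodically broadcasts state updates, so aggregate
    # offered load grows in proportion to the number of entities.
    PDU_BYTES = 144        # assumed size of one entity-state PDU
    UPDATES_PER_SEC = 5.0  # assumed average update rate per entity

    def aggregate_bits_per_second(entities):
        return entities * UPDATES_PER_SEC * PDU_BYTES * 8

    for n in (100, 1_000, 10_000):
        print(n, "entities:", aggregate_bits_per_second(n) / 1e6, "Mbit/s")
    # 10,000 entities -> 57.6 Mbit/s, far beyond a 1990s local-area network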

| Multilevel Network Security
Yet another challenge is that of ensuring that each human or computer participating in a simulation cannot acquire secret information used in the simulation without proper authorization. In distributed interactive simulations, multilevel security is the preferred operating mode. For example, a participating U.S. Army tank driver usually does not need and is not authorized to obtain all of the information used by a participating U.S. Army intelligence analyst, and the simulator used by a participating Luftwaffe (German Air Force) fighter pilot does not need and generally is not authorized to obtain all of the information that might be generated by a participating U.S. Air Force fighter simulator. In the former case, security is enforced by a variety of techniques, including manual procedures that DOD would like to automate. In the latter case, much of the problem is intrinsic and not amenable to automation. If a foreign pilot is not authorized to know the dash speed of an Air Force fighter at full military power, it should not be flown at that speed within range of his radar, either in a live exercise or in simulation.

The end-to-end encryption systems presently used to provide security for the communication links used by distributed interactive simulations are imposing severe limits on the amount of data that can be put through them, according to the DIS Steering Committee [43]. A challenge is to minimize the impact of these constraints by improving the performance of the encryption systems or finding alternatives to them that provide adequate levels of protection, but with greater throughput. The challenges and opportunities of multilevel secure use arise in many applications and are by no means unique to combat simulation.

| Automating Object and World Description for Scene Generators
Another challenge is automating the tedious process of generating data files that image-generating computers need to display terrain, other backgrounds, objects, and scenes. In 1991 testimony, Fred Brooks, Professor of Computer Science at the University of North Carolina at Chapel Hill, said: "The house model that you saw there with the pianos in it has about 30,000 polygons and 20 textures in its present version. We have been working on that model for years" [17]. Clearly,


rapid experiential prototyping will require better software tools for object model development. There are similar problems in the development of virtual objects and environments for military training and mission preview.

There has been progress, but the process remains time-consuming and hence costly. For example, in April 1994 OTA staff rode a virtual "magic carpet" over simulated terrain of a world hotspot. The terrain database had been prepared earlier to help policymakers visualize the terrain, but in order for them to drive virtual vehicles over the terrain, roads, initially "painted" onto the terrain from photographic data by an automated process, had to be "cut" into the grade by a manual process that required a day per five kilometers.

Some DOD simulators are now using PowerScene rendering software developed by Cambridge Research Associates of McLean, Virginia. It can render 3-D images (i.e., 2-D images from any specified point of view) from overhead stereoscopic images (e.g., aerial photos). This obviates hand-modeling of cultural features but requires overhead stereoscopic images of the area to be modeled in whatever resolution is needed. A scene rendered from SPOT satellite imagery, which has 10-m resolution, would not show all the detail visible to a pilot flying lower than 8,000 feet [210].

The issue of acquiring, formatting, storing, and distributing global terrain (and, more generally, environmental) data is a policy issue discussed below in the section "Infrastructure." Here we focus on the problem of rapidly fusing such data with other data (e.g., technical intelligence on enemy weapon system platforms) to create a virtual environment for a simulation. A particularly challenging aspect of this problem is the need to realistically model changes in terrain and structures (buildings, bridges, etc.) that would be caused by the combat activities being simulated, for example, small-arms fire, artillery fire, and aerial bombardment. That is, "dynamic terrain" must be simulated. This need is being addressed by the Army Research Laboratory's (ARL's) Variable Resolution Terrain (VRT) project [188]:

The Variable Resolution Terrain (VRT) model was developed at the U.S. Army Research Laboratory to address the shortcomings of Defense Mapping Agency (DMA) terrain data. DMA terrain data lack the high degrees of detail and flexibility required by combat models and simulations for individual soldiers. Using the VRT model, high-fidelity geotypical terrain can be rapidly generated by computer. The level of detail of an area of terrain can be computed to any resolution. VRT-generated terrain is dynamic. Terrain events, such as cratering from artillery, which occur during a combat simulation can be quickly reflected. VRT-generated terrain is also user definable. A limitless number of terrain surfaces, ranging from mountainous areas to coastal plains, can be specified by a user. . . .

The VRT model will be applied at ARL as part of a real-time combat simulation environment for individual soldiers. This implementation will be accomplished using a client-server distributed computing paradigm. The paradigm will consist of a heterogeneous cluster of high-performance RISC-based workstations and a variety of parallel and vector supercomputers. In this arrangement, a single workstation (the master server) will act as the primary terrain controller. It will manage the operation of all other computers involved in the terrain generation process. This will allow for maximum flexibility, permitting changes in the available computing power in response to specific terrain computational requirements. Application of the VRT model in this manner is necessary due to the computational demands of the model. Its implementation will produce high-detail terrain which is responsive to rapidly changing battlefield conditions in a real-time environment. This will enhance the realism of combat simulations.
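The core idea of dynamic terrain can be shown with a small sketch. The fragment below assumes a simple height-grid representation and a bowl-shaped crater; the grid size, post spacing, and crater parameters are invented for illustration and are not details of the VRT model itself.

```python
# Minimal sketch of "dynamic terrain": a height grid updated in place when a
# simulated artillery round impacts. Grid size, crater shape, and depths are
# illustrative assumptions, not details of the ARL VRT model.
import math

N = 64                                      # N x N elevation posts (assumed)
height = [[100.0] * N for _ in range(N)]    # flat terrain at 100 m elevation

def apply_crater(grid, cx, cy, radius=3, depth=2.0):
    """Lower posts within `radius` cells of (cx, cy) by a bowl-shaped amount."""
    for x in range(max(0, cx - radius), min(N, cx + radius + 1)):
        for y in range(max(0, cy - radius), min(N, cy + radius + 1)):
            d = math.hypot(x - cx, y - cy)
            if d <= radius:
                # deepest at the center, tapering to zero at the rim
                grid[x][y] -= depth * (1.0 - d / radius)

apply_crater(height, 32, 32)
print(height[32][32])   # 98.0: two meters lower than before the impact
```

The hard part, which this sketch omits, is propagating such changes consistently to every image generator and simulation on the net in real time.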

| Simulating Infantry and Noncombatants

Another challenge is simulating individual infantry troops and noncombatants. Infantry troops are critical to the success of most phases of land combat, and noncombatants are present on many battlefields, especially in urban combat and low-intensity conflicts. Mission rehearsal systems have been developed for special operations forces and security personnel, but dismounted infantry troops (or units) are not simulated in networked simulations that use the DIS protocol [97], a deficiency that critics argue makes the outcomes "predicted" by SIMNET exercises of questionable relevance to likely combat, especially the low-intensity conflict in which U.S. forces are most likely to be involved in the near future.

Infantry fireteams (units consisting of a few individuals) have been simulated collectively in SIMNET simulations [109] but in a stereotyped manner; the representations are not controlled by the body movements of individual participants on a one-to-one basis.26 For some purposes, modeling only fireteams and mounted infantry (in armored vehicles) is a useful and economical approximation. For training of dismounted infantry troops or investigating their performance, it is not. DOD and the Army are conducting research to simulate individuals (infantry troops, medics, and others) in netted or stand-alone simulations.

Notable work in this area is being done by the Army's Integrated Unit Simulation System (IUSS) project and the Navy's Individual Port (I-Port) project. Applications include the testing of systems carried and used by individuals (communications equipment, armor, etc.) in complex and stress-inducing environments and the injection of complex individual behavior into a virtual exercise.

IUSS is a very high-resolution simulation of dismounted soldier interaction in a battlefield environment. Its resolution ranges from the individual soldier to company-level groupings of soldiers. IUSS is used to study the effects that changes in soldiers' personal equipment, conditioning, and capabilities have on the accomplishment of small-unit missions. IUSS provides the ability to perform real-time analysis of soldier, unit, and equipment performance both during and after simulation execution, using commercially available software.

The Dismounted Battlespace Battle Laboratory has sponsored the addition of DIS compatibility to IUSS [43].

I-Port inserts an individual into a virtual exercise by applying sensors to the human body that can determine the direction, velocity, and orientation of that body; converting the information to DIS-like protocol data units (PDUs); and transmitting them. The virtual world is presented to the individual via a head-mounted display or projected displays on surrounding screens. Three I-Ports were demonstrated in February 1994 at the 1994 Individual Combatant Modeling and Simulation Symposium held at the U.S. Army Infantry School at Fort Benning, Georgia. The soldiers in the I-Ports demonstrated the capability to walk and run through the virtual environment, give overhead hand and arm signals, assume a prone position, throw a grenade, dismount from a Bradley fighting vehicle, and point and fire a rifle.
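The conversion step can be pictured as packing sensor-derived state into a fixed-format datagram and sending it onto the net. The sketch below is a loose illustration of that idea; the message layout, field choices, posture codes, and port number are assumptions for this example and do not conform to the actual DIS PDU formats.

```python
# Illustrative sketch of the I-Port idea: package body-sensor readings as a
# fixed-format binary message loosely resembling (not conforming to) a DIS
# entity-state PDU. The layout and field choices here are assumptions.
import struct
import socket

def make_body_pdu(entity_id, position, velocity, orientation, posture):
    """Pack sensor-derived state: a 32-bit id, three floats each of position,
    velocity, and orientation (radians), and a one-byte posture code
    (e.g., 0 = standing, 1 = prone), all in network byte order."""
    return struct.pack("!I9fB", entity_id, *position, *velocity,
                       *orientation, posture)

# A prone soldier moving east at 1.5 m/s, facing north-ish.
pdu = make_body_pdu(7, (10.0, 20.0, 0.0), (1.5, 0.0, 0.0),
                    (0.0, 0.0, 1.57), 1)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Loopback here for illustration; a real exercise would use a broadcast or
# multicast address on the simulation network.
sock.sendto(pdu, ("127.0.0.1", 3000))
```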

The demonstration was a joint effort of the Naval Postgraduate School NPSNET Research Group, SARCOS Research Corporation, Inc., the University of Pennsylvania, and the University of Utah. SARCOS and the University of Utah provided the mobility platform and the upper-body instrumentation for the demonstration. The measurements from the upper-body suit and mobility platform were analyzed and processed by the Jack software developed by the University of Pennsylvania. All computer graphics displays, including the three-walled walk-in synthetic environment, were generated by the NPSNET Research Group, using a modified version of NPSNET-IV, a low-cost, student-written, real-time networked vehicle simulator that runs on commercial, off-the-shelf workstations (Silicon Graphics, Inc., IRIS computers). NPSNET-IV uses SIMNET databases and SIMNET and DIS networking protocols [214], and one version uses multicasting [12].

26 OTA observed an example in the May 1994 Multi-Service Distributed Training Testbed (MDT2) exercise, which was conducted on the DSI. A simulator for a man-portable ground-based laser target designator team participated in the simulation from the Naval Research and Development Activity in San Diego, California [55].


The Army Research Laboratory, Simulation Technology Division, Simulation Methodology Branch is also addressing the need for synthetic-environments technology for the individual soldier [187]:

A virtual environment has been developed for research into embedding real and computer-generated people into DIS. The environment, internally called Land Warrior, provides a 3-D view of the virtual world for real players instrumented with body position sensors. The environment has three modes: stealth, simulator, and remote control. In stealth mode, a user can view the virtual environment passively. In simulator mode, a user at a monitor and a keyboard can interact with other entities in the environment. In remote control mode, other simulation programs such as the Stair-Stepper can view the virtual environment, move around in the environment, and interact with other simulation entities by firing a weapon. Short-term development will provide a feedback mechanism to simulations from the virtual terrain, allowing adjustment of stair-stepper resistance as a function of terrain shape. In addition, methods of modifying the virtual environment database in real time will be studied to allow the display of damage due to small-arms fire. The environment includes models to compute the movement of vehicles and people. Other models can be inserted easily into the environment for manipulating other objects.

ARPA has expressed "interest in a virtual environment for individual simulated soldiers and medics which is compatible with SIMNET for use in medical forces planning and training" in its solicitation [179] for proposals to develop advanced biomedical technology for dual-use application. One potential use for such a system would be to estimate scenario-dependent mortality rates, which are needed as input by theater-level medical operations models such as the Medical Regulating Model (MRM) used at the Wargaming Department of the Naval War College for support of seminar war games. Estimation of casualties for operational planning might require more validity than is required for seminar war games, but this is a question of accreditation.

| Validation

DOD Directive (DODD) 5000.59 defines validation as "the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model." Department of the Army Pamphlet 5-11 uses an almost identical definition for validation and applies it to simulations as well as models. It also defines specific steps in the process of validating a model or simulation: data validation, structural validation, and output validation. Output validation is arguably the most difficult of the three, because the outputs (i.e., predictions) of a model are unlikely to be valid if the model is based on invalid data. Similarly, outputs are unlikely to be valid if the model is based on invalid structural assumptions, although some models—essentially fitted curves—have few structural assumptions. Output validation is also the type of validation that directly reflects the quality and reliability of model-derived or simulation-derived information presented to Congress—for example, estimates of the cost and operational effectiveness of proposed weapon systems or of likely casualties if U.S. forces are committed to some armed conflict.
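As a concrete, purely hypothetical illustration of output validation, one might compare a model's predictions against live-exercise results using an agreed error measure. The numbers below, and the choice of root-mean-square error, are illustrative assumptions, not a prescribed DOD procedure.

```python
# A minimal sketch of output validation under assumed data: compare a model's
# predicted attrition (vehicles lost per trial) with live-exercise results,
# using root-mean-square error as one possible measure of agreement.
import math

predicted = [12.0, 9.5, 14.0, 11.0]   # model outputs for four scenarios
observed  = [10.0, 11.0, 15.5, 9.0]   # live-exercise results (hypothetical)

rmse = math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                 / len(predicted))
# Whether this error is acceptable depends on the intended use of the model,
# which is the point of the DODD 5000.59 definition quoted above.
print(f"RMSE = {rmse:.2f} vehicles")
```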

For two decades the validity of combat models used by DOD has been questioned—for example, in [161,199,57,200,38]. However, the last of these critiques [38] was prepared before DOD, at the request of Congress [165,171], established the Defense Modeling and Simulation Office [190,195] and issued DODD 5000.59, both of which were intended to improve the practice and management of modeling and simulation within DOD, including verification, validation, and accreditation (VV&A) of models and simulations. These actions have had visible beneficial effects—especially in promoting joint development (or confederation) and use of models and simulations. However, verification requires man-hours, and validation requires data, much of which will have to be collected in exercises and on test ranges, which is costly. It remains to be seen whether verification, accreditation, and (especially) validation will be funded well enough to be well done and to change the existing culture, in which validation is not considered mandatory and is often waived with the justification that it would be too difficult, expensive, or time-consuming. OTA will address these matters in its final report on military modeling and simulation. In this section, we simply note some philosophical issues.

A tank simulator connected to the Defense Simulation Internet may provide a valid simulation of combat from the perspective of its crew, if the purpose is to develop coordination in driving, target acquisition, and gunnery. This could be checked by live simulation using real tanks, as was done in the Army's Close Combat Tactical Trainer (CCTT) Program [72] and as is being done in the Army Materiel Systems Analysis Activity's Anti-Armor Advanced Technology Development (A2ATD) Program [24]. There have also been efforts to validate both a virtual DIS and a constructive simulation (Janus) on the basis of combat data collected during the Gulf War [26].

The simulator may not provide a valid simulation of combat from the perspective of the loader (a crew member), if the purpose is to develop his coordination and upper-body strength so that he can correctly select and quickly load 55-pound rounds in a tank sprinting over rough terrain [72]. That shortcoming could probably be remedied at considerable expense by developing a moving base for the tank simulator, and whether it had been remedied could be checked by live simulation. But how can one tell whether an exercise, or the average of 100 exercises conducted with 100,000 netted simulators, would predict the chance of winning the next large campaign, wherever it may be? Even if one limits consideration to a handful of strategic scenarios, there are so many possible operational variables to consider that one cannot test the validity of the simulations under all conditions of interest. Analysts have noted that "the reason for using simulation paradoxically precludes any complete validation (because we cannot afford to create the same conditions in real life)" [59]. Essentially the same thing was said in a 1992 workshop on validation [68]:

In extremely complex and difficult modeling situations, the requirements for comparing real world results and model results may be difficult, if not impossible, to meet. . . . Ironically enough, it is this inability to replicate (or even to understand) the real world that drives one to the use of a model in the first place.

Subjective judgment must be used in the output validation of some combat models and simulations [68,37], just as it has been in nuclear reactor safety assessments and space launch reliability estimates [6]. There are rigorous methods for making subjective uncertainty explicit, quantifying it, combining estimates of different subjects, and updating the combined estimate in a logical manner when relevant data become available [6]. One analyst who advocates such methods for validation notes that "this validation process is unquestionably subjective, but not capriciously so" [37].
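One family of such methods pools expert judgments into a combined prior probability and then updates it by Bayes' rule as data arrive. The sketch below uses an equal-weight linear opinion pool and invented numbers; it illustrates the style of reasoning referred to in [6] and [37], not any specific accredited procedure.

```python
# A minimal sketch of formal treatment of subjective uncertainty: pool several
# experts' subjective probabilities that a model is adequate for its intended
# use, then update the pooled estimate with Bayes' rule when a relevant test
# result arrives. All numbers are illustrative.
def pooled_prior(expert_probs):
    """Simple linear opinion pool: equal-weight average of expert estimates."""
    return sum(expert_probs) / len(expert_probs)

def bayes_update(prior, p_data_if_valid, p_data_if_invalid):
    """Posterior P(model valid | observed data) via Bayes' rule."""
    num = p_data_if_valid * prior
    return num / (num + p_data_if_invalid * (1.0 - prior))

prior = pooled_prior([0.6, 0.7, 0.5])   # three experts' judgments
# A field test agrees with the model; such agreement is assumed here to be
# four times likelier if the model is valid than if it is not.
posterior = bayes_update(prior, 0.8, 0.2)
print(f"prior {prior:.2f} -> posterior {posterior:.2f}")   # 0.60 -> 0.86
```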

Simulation, computer animation, and virtual reality are now being used as evidence in criminal and personal-injury trials, and their validity is at issue [71,52]. The armed services' security forces (the Army's Military Police, the Navy's Shore Patrol, and the Air Force's Security Police) and investigative units (the Army's Criminal Investigation Division, the Naval Investigative Service, and the Air Force's Office of Special Investigations) may consider using such forensic simulations and will be interested in their validity, including the admissibility of their results as evidence in court. Conversely, civil investigators might benefit from DOD-developed validation methods and examples.

| Standardization

Standardization is a perpetual issue, because technology is dynamic. This is particularly true of computer networking and distributed simulation.

DOD participated in the deliberations of the IEEE Standards Board, a nongovernment standards body (NGSB), to influence the drafting and adoption of IEEE Std 1278-1993, the industry standard for protocols for distributed interactive simulation applications. This approach has many advantages over simply issuing a military standard; it facilitates civil-military integration, the potential benefits of which are discussed in OTA's report on civil-military integration [170]. The October 20, 1993, revision of Office of Management and Budget (OMB) Circular A-119 encouraged federal agencies to participate in NGSB deliberations and allowed federal employees to participate on job time. Of course, this participation requires funding, but it can be money well spent.

Because DIS technology is so dynamic, there are already calls for revising or superseding IEEE Std 1278-1993. Critics argue that simulations compliant with "the DIS protocol" (IEEE Standard 1278) cannot be scaled to accommodate the 100,000 entities (e.g., vehicle simulators) that ARPA hopes to have netted together in its Synthetic Theater of War (STOW) program by 1997. Others propose that the DIS standard be revised to accommodate the exchange of additional kinds of data, such as economic data, that are not now modeled. These issues of scalability and flexibility are discussed below. Issues of network synchronization and multilevel security will also be mentioned.

| Scalability

The DIS protocol requires every entity in a simulation to broadcast certain changes (e.g., in its velocity) to every other entity in the simulation ("on the net"). In most cases information about an airplane's change in heading may be needed only by entities within a 100-km radius in the simulated world. However, if someone wants to connect an over-the-horizon (OTH) radar simulator to the net to see how much it would influence combat, it would need heading and other information from aircraft halfway around the simulated world. In drafting the DIS protocol, it was simpler to require broadcasting of certain information than to draft special clauses for all the combinations of fighters, helicopters, OTH radars, and other entities that one can imagine. The approach has worked well to date, but that does not imply that it will work at all if the number of entities in the simulation is increased 100-fold.

If 1,000 entities are on the net, and if each makes one reportable change per minute, then each of the 1,000 entities will be receiving 999 (roughly 1,000) messages per minute. If the number of entities on the net is increased to 100,000, then each of 100,000 entities must receive and process 100,000 messages per minute. The aggregate (over all entities) computational capacity devoted to reading the messages must increase 10,000 times. The aggregate capacity devoted to other simulation tasks, such as image generation, may increase by a much smaller factor. Today, the computational resources devoted to scene rendering dominate those devoted to message processing, but if the number of simulators were increased by a sufficiently large factor (which OTA has not estimated) and no other changes were made, messaging computation (and cost) would dominate computation for other simulation tasks. The aggregate network communications capacity ("bandwidth") servicing the entities would likewise increase in proportion to the square of the number of entities, and the marginal cost of adding one more simulator to the network would begin to increase approximately in proportion to the square of the number of simulators already on the network. Thus, there is a limit on the number of entities that can be netted economically, and there is a limit on the number that can be netted at all, although technological advances are constantly increasing these limits.
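The arithmetic behind this argument is easy to reproduce. The short calculation below, using the one-update-per-minute rate assumed in the text, shows the roughly 10,000-fold growth in aggregate message traffic:

```python
# Arithmetic behind the scaling argument in the text: with broadcast, the
# aggregate message-processing load grows as the square of the entity count.
def aggregate_messages_per_minute(n_entities, updates_per_entity=1):
    """Each update is broadcast to the n - 1 other entities on the net."""
    return n_entities * updates_per_entity * (n_entities - 1)

small = aggregate_messages_per_minute(1_000)     # 999,000 (~10**6)
large = aggregate_messages_per_minute(100_000)   # ~10**10
print(large / small)   # ~10,010: the roughly 10,000-fold increase cited above
```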

There is also a problem of nonscalability in configuration control: the software for each type of entity (e.g., vehicle or weapon) must be modified when one new type of entity is allowed into the simulation [147].

Developing a scalable DIS protocol is a challenge to DOD and industry. Alternatives to data broadcasting that have been investigated recently include 1) the Event Distribution Protocol (EDP) developed by the MITRE Corp. [213] as a modification to the Aggregate Level Simulation Protocol (ALSP) (which is not DIS-compliant but uses broadcasting), and 2) the Distribution List Algorithm developed by NASA's Jet Propulsion Laboratory and the MITRE Corp. [159]. The developers of the Distribution List Algorithm claim that it also has a more logically rigorous method of synchronizing entities, which, they argue, is needed for "analytic studies" but is not as critical for the real-time training and systems-acquisition decision-aiding uses to which DIS has been applied.

| Flexibility

Some critics have urged that the successor to the DIS protocol be flexible enough to allow the modeling and distributed simulation of a variety of processes other than those related to weapon and sensor platforms. Weather, epidemics, and economic (especially industrial) activity are processes that some would like to see modeled [147]; all can be relevant to combat. For example, the success of World War II's Operation Overlord hinged on the weather on D-Day [18]; the risk that Iraqis might use biological weapons and cause an epidemic affected planning, logistics, and operations in Operation Desert Shield and Operation Desert Storm [153], as did the rapid development, production, and fielding of GBU-28 earth-penetrating bombs and optical beacons ("Budd Lights" and "DARPA Lights" [166]) for distinguishing friend from foe. Such interactions between operational need and industrial response cannot now be simulated in real time (or faster) except in refereed games. ARPA's Simulation-Based Design program is undertaking to facilitate simulation of selected industrial activities and their effects on combat capabilities.

The flexibility to simulate diplomatic and domestic political processes might also be valuable for training those who would manage those dimensions of the national response to a crisis. Refereed political-military exercises have been used for this purpose for decades. They are of questionable validity as predictive models but are nevertheless considered useful for training.

After the 1992 Senate Armed Services Committee hearing on advanced simulation technology, Senator Strom Thurmond asked witness General Paul Gorman, USA-Ret., Chairman of DOD's Simulation Policy Study, to answer the following question for the record [173]:

. . . I would imagine that if you could model political and economic factors, you could make simulators for crisis management training as well. Have you recommended any analysis in this area?

General Gorman replied [72, p. 728]:

. . . My experience makes [me] a strong supporter of simulation as a way of preparing for future crises. That same experience leaves me doubtful that we will ever be able to adequately model political and economic factors. . . . Modeling war is comparatively simple. Battle outcomes, after all, are rooted in fairly discrete physical phenomena, and we are now well versed on how to allow humans realistically to interact with simulations of these phenomena. . . .

Gorman's pessimism concerning crisis games echoes the skepticism expressed by RAND analyst Robert A. Levine in 1964 [123]:

. . . if gaming is used, any research conclusions should stand and be defended entirely on the basis of non-game "real-world" evidence.

This is another way of saying that if crisis games are to be used as predictive models, then they should be validated—just as any other type of model should be. A difficulty with output validation of crisis games is that there is seldom more than one real crisis evolving from a particular initial scenario against which game results may be compared.27

The Advanced Research Projects Agency is funding the development and demonstration of human-computer interaction (HCI) technology applicable to crisis management, health care, and education and training. "The vision of the HCI program is to greatly increase the ease and naturalness of human interactions with complex computer-based systems, and to reduce the difficulty and risk associated with user-system interface development," according to the solicitation for proposals [177]. The solicitation does not mention virtual reality by name, but VR technology could contribute to or benefit from the program.

27 The same is true of battles and campaigns. In some cases, there are more data for validating platform engagement models.

| Effectiveness

In the 1992 SASC hearing on advanced simulation technology, Senator Thurmond asked General Paul Gorman an important question about the effectiveness of virtual simulation [173, p. 718]:

Although I support the use of simulators for training our service personnel, I realize that it cannot constitute the sole training method. What percentage of training, using a tank crew as an example, can effectively be accomplished through the use of simulators?

General Gorman replied that the question requires "a long and complicated answer," but he gave a short one. He said that [networked virtual] simulators are "very good at showing tactical relationships." He cited one example of a training task that simulators are not very good at: providing an environment in which the loader, a member of a tank's crew, can learn to load 55-pound shells while the tank turret is moving as it would in combat.

OTA believes that that particular problem could be solved, at considerable expense, by building motion-base tank simulators using the technology developed for aircraft cockpit simulators, even before the first Link Trainer [139]. However, as noted above, providing haptic feedback to dismounted infantry soldiers remains a challenge. So does providing the "long and complicated answer" that General Gorman said Senator Thurmond needed. The answer must begin by rephrasing the general question as a list of specific questions, because the answer depends on many variables, such as the specific tasks to be learned. If a list of tasks were in hand, it might be possible (but would be costly) to answer the simpler, focused question: "Could simulators substitute for x percent of these training hours (or dollars)?" A closely related but more difficult question is: "Would it be cost-effective to use simulators to substitute for x percent of these training hours?"

| Cost-Effectiveness

Assessing cost-effectiveness requires assessing effectiveness (e.g., measuring training transfer effectiveness) as well as measuring, estimating, or forecasting cost data. Hence it is even more difficult than assessing effectiveness. Just as one cannot describe the effectiveness of simulators or simulation in general, neither can one describe the cost-effectiveness of simulators or simulation in general. It is necessary to ask about the cost-effectiveness of a particular simulator or simulation for a particular purpose, and to do this again and again, once for each combination of simulator and purpose.

As part of its assessment of combat modeling and simulation, OTA is searching for analyses of simulator and simulation cost-effectiveness. OTA will report its findings in its final report on this assessment, which is due in 1995.

| Infrastructure

Some of the technical challenges noted above—the development of better LCDs or other flat-panel displays for HMDs, and the construction of a wideband secure computer network—are issues of industrial and civil infrastructure with implications extending far beyond combat simulation, even to national economic competitiveness. They have been recognized as such by the Administration and by the committees of Congress with jurisdiction over commerce, defense, and other affected activities. Some funding has been provided to address these issues—directly, or by stimulating industry—through programs such as those now included in the Defense Technology Conversion Council's Technology Reinvestment Project (TRP). Progress and needs are reviewed annually, and funding is provided through the Department of Commerce (National Institute of Standards and Technology), the Department of Defense (Advanced Research Projects Agency), the Department of Energy, the National Aeronautics and Space Administration, and the National Science Foundation.

Collection and organization of terrain and cultural data into databases in formats that can be readily used in virtual simulation is another issue with implications extending beyond the scope of virtual military simulation [167,168]. Regarding its importance in naval applications of VR to strike training and mission rehearsal, a Center for Naval Analyses memorandum recently recommended to the Navy's Director for Air Warfare [210]:

Because the realism of the simulators used for aircrew training and mission rehearsal is driven by the fidelity of the underlying databases, Navy requirements should focus on the specification of database characteristics and close coordination with the Defense Mapping Agency to ensure timely availability of the required products.

References and Bibliography

The following works (not all of which are cited in the text or the footnotes of this background paper) are alphabetized by author and by date for each author. The works include paper documents, videocassettes, online resources, and other types of information resources. The acronym URL stands for Uniform Resource Locator, a specification—in a standard format used by the World Wide Web [12,77]—for resources available via the Internet. Unless otherwise noted, a citation containing a URL refers only to an online resource; a paper version may not be published.

1. 4th Wave, Inc., "Virtual Reality Market Estimates," Multimedia Monitor, July 1994, p. 28.

2. Adam, J. A., "Virtual Reality Is for Real," IEEE Spectrum, Special Report: Virtual Reality, October 1993, pp. 22-29.

3. Adam, J. A., "Medical Electronics," IEEE Spectrum, January 1994, pp. 70-73.

4. Alluisi, E. A., "The Development of Technology for Collective Training: SIMNET, a Case History," Human Factors, vol. 33, no. 3, 1991, pp. 343-362.

5. Andrews, A., and Maples, C., "VIEWS™: The Virtual Interactive Environment Workspace," unpublished brochure, July 1, 1994.

6. Apostolakis, G., "The Concept of Probability in Safety Assessments of Technological Systems," Science, vol. 250, Dec. 7, 1990, pp. 1359-1364.

7. "Arcade System Sources," PCVR Magazine, no. 15, May/June 1994, p. 9.

8. Aukstakalnis, S., and Blatner, D., Silicon Mirage, S.F. Roth (ed.) (Berkeley, CA: Peachpit Press, 1992).

9. Bartle, R., "Interactive Multi-User Computer Games," December 1990; (c) MUSE Ltd., British Telecom plc. URL=ftp://cogsci.uwo.ca/pub/vr/papers/mudreport.txt.

10. Baumann, J., "Military Applications of Virtual Reality," Encyclopedia of Virtual Environments, as of Aug. 16, 1994. URL=http://gimble.cs.umd.edu/vrtp/eve-articles/11.G.Military.html.

11. Bickers, C., "UTD: High Fidelity at Low Cost," Jane's Defence Weekly, Aug. 27, 1994, p. 24.

12. Boutell, T., et al., "World Wide Web Frequently Asked Questions," Sept. 2, 1994. URL=http://sunsite.unc.edu/boutell/faq/www-faq.html.

13. Bricken, M., "Building the VSX Demonstration: Operations with Virtual Aircraft in Virtual Space," HITL-M-90-9 (Seattle, WA: University of Washington, Human Interface Technology Laboratory, 1990).

14. Bricken, M., "Gender Issues in Virtual Reality Technology," presentation, HITL-P-91-6 (Seattle, WA: University of Washington, Human Interface Technology Laboratory, 1991).

15. Brill, L.M., "Virtual Reality," Funworld, July 1994, pp. 18ff.

16. Brooks, F., Ouh-Young, M., Batter, J., and Kilpatrick, P., "Project GROPE—Haptic Displays for Scientific Visualization," Computer Graphics, vol. 24, no. 4, August 1990, pp. 177-185.


17. Brooks, F. P., testimony, in U.S. Congress, Senate Committee on Commerce, Science, and Transportation, Subcommittee on Science, Technology, and Space, New Developments in Computer Technology: Virtual Reality, hearing, Serial No. 102-553, May 8, 1991.

18. Brown, A.C., Bodyguard of Lies (New York, NY: Harper and Row, 1975).

19. Burdea, G., and Coiffet, P., Virtual Reality Technology (New York, NY: John Wiley & Sons, 1994).

20. Burgess, D., "3-D Sound FTP Site," as of Aug. 16, 1994. URL=ftp://multimedia.cc.gatech.edu/papers/3Daudio/README.html.

21. Cantele, J., et al., "Silicon Graphics' SILICON SURF Home Page," June 22, 1994. URL=http://www.sgi.com/.

22. Cartwright, G.F., "Virtual or Real? The Mind in Cyberspace," The Futurist, March-April 1994, pp. 22-26.

23. Casner, S., "Frequently Asked Questions (FAQ) on the Multicast Backbone (MBONE)," Aug. 15, 1994. URL=ftp://venera.isi.edu/mbone/faq.txt.

24. Ceranowicz, A., "Modular Semi-Automated Forces," Proceedings of Elecsim 1994: Electronic Conference on Constructive Training Simulation, R.D. Smith (ed.) (conducted on the Internet, April 11 - May 27, 1994). URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.ZIP, May 27, 1994. URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.sea.hqx, June 8, 1994.

25. Chien, Y.T., and Jenkins, J. (eds.), Virtual Reality Technology (Washington, DC: U.S. Government Printing Office, 1994).

26. Christenson, W.M., and Zirkle, R.A., 73 Easting Battle Replication, JANUS Combat Simulation, IDA P-2270 (Alexandria, VA: The Institute for Defense Analyses, September 1992).

27. Clark, S., "A Review of VR Resources on the Internet," pp. 47-53 in Proceedings of the First U.K. Virtual Reality Special Interest Conference, Held on the 14th March 1994 at Nottingham University. URL=ftp://eta.lut.ac.uk/Public/vrsig94/Proceedings.zip. Also available as URL=ftp://eta.lut.ac.uk/Public/vrsig94/Proceedings.txt and URL=ftp://eta.lut.ac.uk/Public/vrsig94/Proceedings.ps.

28. CM Research, "Thermal Feedback," PCVR Magazine, no. 14, March/April 1994, pp. 14-18.

29. Cook, N., "VR: Even Better than the Real Thing," Jane's Defence Weekly, vol. 21, no. 8, February 26, 1994, pp. 24-25.

30. Cramer, J.G., "News from CyberSpace: VR and Hypertext," Analog Science Fiction and Fact, vol. 114, July 1994, pp. 217-221.

31. Criterion Software Ltd., "DOSTANK," a RenderWare™ software demonstration program, Aug. 5, 1994. URL=ftp://192.88.128.32/pub/renderware/dos/demo/dostank.zip.

32. Cruz-Neira, C., et al., Scientists in Wonderland: A Report on Visualization Applications in the CAVE Virtual Reality Environment, as of Aug. 17, 1994. URL=http://www.ncsa.uiuc.edu/EVL/docs/cave/vrpaper/report.html

33. CyberEdge Journal (quarterly), vol. 1, no. 1, Special edition: Military & Aerospace (Sausalito, CA: The Delaney Companies, fall 1993).

34. Cynamon, J.J., "The Basics," Simulation Validation Workshop Proceedings (SIMVAL II), A.E. Ritchie (ed.) (Alexandria, VA: Military Operations Research Society, April 1992).

35. Davis, P. K., "Some Lessons Learned from Building Red Agents in the RAND Strategy Assessment System (RSAS)," Human Behavior and Performance as Essential Ingredients in Realistic Modeling of Combat—MORIMOC II (Alexandria, VA: Military Operations Research Society, February 1989).

36. Davis, P. K., A Framework for Verification, Validation, and Accreditation, report R-4249-ACQ (Santa Monica, CA: The RAND Corp., 1992).

37. Davis, P. K., "A Framework for Verification, Validation, and Accreditation," Simulation Validation Workshop Proceedings (SIMVAL II), A.E. Ritchie (ed.) (Alexandria, VA: Military Operations Research Society, April 1992).

38. Davis, P.K., and Blumenthal, D., The Base of Sand Problem: A White Paper on the State of Military Combat Modeling, RAND Note N-3148-OSD/DARPA (Santa Monica, CA: The RAND Corporation, 1991).

39. DeFanti, T. A., Sandin, D. J., and Cruz-Neira, C., "A 'Room' with a 'View,'" IEEE Spectrum, October 1993, pp. 30-33.

40. Delaney, B., "A Survey of Head Mounted Displays," AI Expert, Virtual Reality '93: Fall Special Report, September 1993, pp. 21-24.

41. Delaney, B., "Virtual Reality Lands the Job," NewMedia, ISSN 1060-7188, vol. 4, no. 8, August 1994, pp. 40-45.

42. DIASPAR™ Virtual Reality Network, computer bulletin board system (BBS), dial (714) 376-1200 for a 1,200-baud connection or (714) 376-1234 for a 9,600-baud connection. Also accessible by telnet. URL=telnet://diaspar.com.

43. DIS Steering Committee, The DIS Vision, A Map to the Future of Distributed Simulation, version 1, IST-SP-94-01 (Orlando, FL: The University of Central Florida, Institute for Simulation and Training, May 1994).

44. Division, Inc., "Review," (brochure), 1993.

45. Dunn, W. H. (U.S. Army Modeling and Simulation Management Office), "VV&A Implementation Initiatives in Army & DoD," briefing charts, University of Alabama in Huntsville, Mar. 22, 1994.

46. duPont, P., electronic mail message, with attachments, to Toni Emerson, Apr. 12, 1994, archived in: URL=ftp://stein2.u.washington.edu/public/virtual-worlds/faq/commercial/Division-press-releases.

47. Emerson, T., Virtual Reality Update, vol. 2, no. 2, March/April 1994. URL=ftp://stein2.u.washington.edu/public/VirtualReality/HITL/Bibliographies/VRU/vru-v2n2.txt.

48. Emerson, T., "Virtual Audio Bibliography," technical report B-94-2 (Seattle, WA: University of Washington, Human Interface Technology Laboratory, 1994). URL=ftp://stein2.u.washington.edu/public/VirtualReality/HITL/Bibliographies/Virtual-Audio-Bibliography.txt.

49. Emerson, T., "Virtual Interface Technology Bibliography, Selected Citations from the Literature," revised 1994. URL=ftp://stein2.u.washington.edu/public/VirtualReality/HITL/papers/tech-reports/emerson-B-93-2.txt

50. Emerson, T., "Information Resources in Virtual Reality (IRVR)," Apr. 6, 1994. URL=ftp://stein2.u.washington.edu/public/VirtualReality/HITL/papers/tech-reports/irvr.txt.

51. Emerson, T., Virtual Reality Update, vol. 2, no. 3, September 1994. URL=ftp://140.142.56.2/public/VirtualReality/HITL/Bibliographies/VRU/vru-v2n3.txt.

52. Emmett, A., "Simulations on Trial," Technology Review, May/June 1994, pp. 30-36.

53. Erichsen, M. N., Weapon System Sensor Integration for a DIS-Compatible Virtual Cockpit, thesis, Air Force Institute of Technology, AFIT/GCS/ENG/93-07, December 1993. DTIC AD-A274 088.

54. "Europeans, Japanese Invest Heavily in MEMS Research," Aviation Week and Space Technology, March 1, 1993, p. 39.

55. Farrow, S., "Multi-Service Distributed Training Testbed (MDT2)," Distributed Interactive Simulation (DIS) Joint Newsletter, vol. 2, no. 5 (Orlando, FL: Naval Air Warfare Center, Training Systems Division, July 1994).

56. "Feedback Devices," PCVR Magazine, no. 14, March/April 1994, p. 19.

57. Finlay, P. N., and Wilson, J. M., "The Paucity of Model Validation in Operational Research Projects," OR, the journal of the Operational Research Society, vol. 39, no. 4, 1987, pp. 303-308.

58. Flexman, R.E., Roscoe, S. N., Williams, A.C., Jr., and Williges, B. H., Studies in Pilot Training: The Anatomy of Transfer (Urbana, IL: Engineering Publications Office, College of Engineering, University of Illinois at Urbana-Champaign, 1972).

59. Freedman, J. E., and Starr, S. H., "Use of Simulation in the Evaluation of the IFFN Process," in North Atlantic Treaty Organization (NATO) Advisory Group for Aerospace Research and Development (AGARD) Conference Proceedings No. 268 (AGARD-CP-268), presented at a Meeting of the Avionics Panel, Paris, France, 15-19 October 1979.

60. Furness, T., "The Super Cockpit and Human Factors Challenges," HITL-M-86-1 (Seattle, WA: University of Washington, Human Interface Technology Laboratory, 1986).

61. Furness, T., "Designing in Virtual Space," HITL-R-87-1 (Seattle, WA: University of Washington, Human Interface Technology Laboratory, 1987).

62. Furness, T., "Harnessing Virtual Space," HITL-M-88-1 (Seattle, WA: University of Washington, Human Interface Technology Laboratory, 1988).

63. Furness, T., "Configuring Virtual Space for the Super Cockpit," HITL-M-89-1 (Seattle, WA: University of Washington, Human Interface Technology Laboratory, 1989).

64. Furness, T., "Creating Better Virtual Worlds," HITL-M-89-3 (Seattle, WA: University of Washington, Human Interface Technology Laboratory, 1989).

65. Furness, T., in U.S. Congress, Senate Committee on Commerce, Science, and Transportation, Subcommittee on Science, Technology, and Space, New Developments in Computer Technology: Virtual Reality, hearing, Serial No. 102-553, May 8, 1991, pp. 23-48.

66. Gaver, D., "Face Validation and Face Validity," Simulation Validation Workshop Proceedings (SIMVAL II), A.E. Ritchie (ed.) (Alexandria, VA: Military Operations Research Society, April 1992).

67. Georgia Institute of Technology, Graphics, Visualization, and Usability Center, Virtual Environments Group, "GVU Center Virtual Environments Group, Home Page," as of Sept. 20, 1994. URL=http://www.cc.gatech.edu/gvu/virtual/VirtualEnvironments.html.

68. Giadrosich, D., "Validating Models and Simulations," Simulation Validation Workshop Proceedings (SIMVAL II), A.E. Ritchie (ed.) (Alexandria, VA: Military Operations Research Society, April 1992).

69. Gibbons, J. H., President's Science Advisor, and Panetta, L. E., Director, Office of Management and Budget, Memorandum for the Heads of Executive Departments and Agencies, Subject: "FY 1996 Research and Development (R&D) Priorities," May 6, 1994.

70. Gore, A., opening remarks, in U.S. Congress, Senate Committee on Commerce, Science, and Transportation, Subcommittee on Science, Technology, and Space, New Developments in Computer Technology: Virtual Reality, hearing, Serial No. 102-553, May 8, 1991, p. 1.

71. Gore, R. A., "Reality or Virtual Reality? The Use of Interactive, Three-Dimensional Computer Simulations at Trial," Rutgers Computer & Technology Law Journal, vol. 19, 1993, pp. 459-493.

72. Gorman, P., testimony, in U.S. Congress, Senate Committee on Armed Services, Department of Defense Authorization for Appropriations for Fiscal Year 1993 and the Future Years Defense Program, hearings, Serial No. 102-883, part 1, May 21, 1992, pp. 673-730.

73. Gradecki, J., "Virtual Reality Arcade Systems," PCVR Magazine, no. 15, May/June 1994, pp. 5-8.

74. Grimes, G., "Digital Data Entry Glove Interface," U.S. Patent No. 4,414,537, filed Sept. 15, 1981.

75. Grimes, J., "Virtual Reality Goes Commercial with a Blast," IEEE Computer Graphics and Applications, March 1992, pp. 4-7.

76. Hamit, F., Virtual Reality and the Exploration of Cyberspace (Carmel, IN: Sams Publishing, 1993).

77. Hayes, B., "The World Wide Web," American Scientist, vol. 82, September-October 1994, pp. 416-420.

78. HCI Launching Pad (links to Human-Computer Interface Bibliography Project, et al.), as of Aug. 16, 1994. URL=http://hydra.bgsu.edu/HCI/.

79. "Head Mounted Display Study; What's Wrong with Your Head Mounted Display?", CyberEdge Journal, supplement to issue #17, September-October 1993.

80. Heeter, C., "BattleTech Demographics," IEEE Computer Graphics and Applications, March 1992, pp. 4-7.

81. Heim, M., The Metaphysics of Virtual Reality (New York, NY: Oxford University Press, Inc., 1993).

82. Heinlein, R. A., "Waldo," Astounding Science Fiction, August 1942.

83. Helsel, S., "BattleTech Center," Virtual Reality Report (Westport, CT: Meckler Corp., 1991).

84. Henderson, D., "The Multidimensional Space of Validation," Simulation Validation Workshop Proceedings (SIMVAL II), A.E. Ritchie (ed.) (Alexandria, VA: Military Operations Research Society, April 1992).

85. Henry, D., "The Design of TopoSeattle," technical report R-92-4, 1992. URL=ftp://stein2.u.washington.edu/public/VirtualReality/HITL/papers/tech-reports/henry-R-92-4.ps.

86. Hepburn, F., Kaiser Electro-Optics, Inc., personal communication (fax), Aug. 18, 1994.

87. Hodges, A., Alan Turing: The Enigma (New York, NY: Simon and Schuster, 1983).

88. Hodges, L. F., et al., "Virtual Environments, Phobia Project," as of Sept. 19, 1994. URL=http://www.cc.gatech.edu/gvu/virtual/Phobia.html.

89. Hoffman, H., and Lopez, L., "Mouse Tactors," PCVR Magazine, no. 14, March/April 1994, pp. 11-13.

90. Hollands, R., "The Virtual Reality Primer," PCVR Magazine, no. 15, May/June 1994, p. 17.

91. Holzer, R., "Panel: Simulation Can Aid Joint Forces," Defense News, August 1-7, 1994, p. 22.

92. Holzer, R., "U.S. Debates Live Fire vs. Simulation," Defense News, August 1-7, 1994, pp. 22-23.

93. Howe, D. (ed.), "The Free On-line Dictionary of Computing," as of Sept. 14, 1994. URL=http://wombat.doc.ic.ac.uk/.

94. "Human Interface Technology Laboratory Bibliography, Virtual World Database," Jan. 28, 1992. URL=ftp://stein.u.washington.edu/public/VirtualReality/HITL/Bibliographies/furness-bib.txt.

95. Institute of Electrical and Electronic Engineers, IEEE Standard Glossary of Modeling and Simulation Terminology, IEEE Standard 610.3-1989 (Piscataway, NJ: IEEE Customer Service Center, 1989).

96. Institute of Electrical and Electronic Engineers, IEEE Standard Glossary of Computer Graphics Terminology, IEEE Standard 610.6-1991 (Piscataway, NJ: IEEE Customer Service Center, 1991).

97. Institute of Electrical and Electronic Engineers, IEEE Standard for Information Technology - Protocols for Distributed Interactive Simulation Applications, Entity Information and Interaction, IEEE Standard 1278-1993 (Piscataway, NJ: IEEE Customer Service Center, 1993).

98. Isdale, J., "What Is Virtual Reality? A Homebrew Introduction and Information Resource List," Oct. 8, 1993. URL=ftp://ftp.u.washington.edu/public/virtual-worlds/papers/whatisvr.txt.

99. Jacobs, J.F., The SAGE Air Defense Systems: A Personal History (Bedford, MA: MITRE Corp., 1986).

100. Jacobson, R., "Applying the Virtual Worlds Paradigm to Mapping and Surveying Data," Virtual Reality World, September/October 1994, pp. 61-69.

101. Jenks, A., "Could the Surgeon General Warn: 'VR Is Hazardous to Your Health'?," Washington Technology, vol. 9, no. 8, July 28, 1994, pp. 1, 28, and 30.

102. Joly, M., "Virtual World Entertainment," PCVR Magazine, no. 15, May/June 1994, pp. 14-16.

103. Kaiser Electro-Optics, Inc., "SIM EYE™," brochure, Oct. 11, 1993.

104. Kaiser Electro-Optics, Inc., "VIM™ personal viewer™," brochure, Apr. 1, 1994.

105. Kalawsky, R. S., The Science of Virtual Reality and Virtual Environments (Reading, MA: Addison-Wesley, 1993).

106. Karnow, C. E. A., "Implementing the First Amendment in Cyberspace," Virtual Reality World (supplement to Multimedia Review), vol. 1, no. 2, Summer 1993, pp. J-N.

107. Karnow, C.E.A., "The Uneasy Treaty of Technology and Law," Virtual Reality Special Report, vol. 1, no. 1, 1994, pp. 33-38.

108. Karnow, C.E.A., "The Electronic Persona: A New Legal Entity," Virtual Reality World, vol. 2, no. 1, January 1994, pp. 37-40.

109. Karr, C. R., and Root, E., "Eagle/DIS Interface," Proceedings of Elecsim 1994: Electronic Conference on Constructive Training Simulation, R.D. Smith (ed.) (conducted on the Internet, April 11 - May 27, 1994). URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.ZIP, May 27, 1994. URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.sea.hqx, June 8, 1994.

110. Kelly, L., The Pilot Maker (New York, NY: Grosset and Dunlap, 1970).

111. Kercheval, H., "MUSE Gives Virtual Reality a Humanistic New Twist," Sandia Lab News, vol. 46, no. 10 (Albuquerque, NM: Sandia National Laboratories, May 13, 1994), pp. 1, 5, and 6.

112. Kitfield, J., "Trading Bullets for Bytes," Government Executive, June 1994, pp. 18-25.

113. Kollin, J. S., "The Virtual Retinal Display," July 27, 1993. URL=ftp://stein2.u.washington.edu/public/VirtualReality/HITL/papers/General/VirtualRetinalDisplay.ps.

114. Kooper, R., "Meta Virtual Environments," as of Aug. 15, 1994. URL=http://www.cc.gatech.edu/gvu/people/Masters/Rob.Kooper/Meta.VR.html

115. Kooper, R., "Virtual Environments, Acrophobia," as of Aug. 15, 1994. URL=http://www.cc.gatech.edu/gvu/virtual/Acrophobia.html. URL=http://www.cc.gatech.edu/gvu/virtual/mpeg/elevator.mpg.

116. Lamson, R. J., "Virtual Therapy of Anxiety Disorders," CyberEdge Journal, issue no. 20, vol. 4, no. 2, March/April 1994, p. 1ff.

117. Lanier, J. Z., and Zimmerman, T. G., "Computer Data Entry and Manipulation Apparatus and Method," U.S. Patent No. 4,988,981, filed Feb. 28, 1989, issued Jan. 29, 1991.

118. Lanier, J. Z., testimony, in U.S. Congress, Senate Committee on Commerce, Science, and Transportation, Subcommittee on Science, Technology, and Space, New Developments in Computer Technology: Virtual Reality, hearing, Serial No. 102-553, May 8, 1991.

119. Larijani, L. C., The Virtual Reality Primer (New York, NY: McGraw-Hill, 1993).

120. Latta, J., 4th Wave, Inc., facsimile transmission, Sept. 9, 1994.

121. Lederman, S.J., and Taylor, M. M., "Fingertip Force, Surface Geometry, and the Perception of Roughness by Active Touch," Perception and Psychophysics, vol. 12, no. 5, 1972, pp. 401-408.

122. Lederman, S.J., and Klatzky, R.L., "Hand Movements: A Window into Haptic Object Recognition," Cognitive Psychology, vol. 19, no. 3, 1987, pp. 342-368.

123. Levine, R., Schelling, T., and Jones, W., Crisis Games 27 Years Later: Plus C'Est Déjà Vu, P-7719 (Santa Monica, CA: RAND, 1991).

124. Lewis, R. O. (General Research Corp., Huntsville, AL), "Position Paper for the 10th Verification, Validation, and Accreditation (VV&A) Process for Distributed Interactive Simulations (DIS)," 1994.

125. Loeffler, C. E., and Anderson, T. (eds.), The Virtual Reality Casebook (New York, NY: Van Nostrand Reinhold, July 1994).

126. Lytle, D., "A Radical New Approach to Virtual Reality," Photonics Spectra, August 1994, p. 48.

127. Mathews, J., and Anselmo, J., "Even Good Simulation Has Its Limitations," Aviation Week and Space Technology, Sept. 12, 1994, pp. 69 and 71.

128. McDonough, J., "Doorways to the Virtual Battlefield," Virtual Reality '92 Conference Proceedings: Virtual Reality Becomes a Business (Westport, CT: Meckler Corp., 1992), pp. 104-114.

129. McVey, S., "STOW: SIMNET Was Just the Beginning," CyberEdge Journal Special Edition: Military & Aerospace, vol. 1, no. 1, Fall 1993, pp. 1 and 3.

130. McVey, S., "The Battle of 73 Easting, 4D History Books," CyberEdge Journal Special Edition: Military & Aerospace, vol. 1, no. 1, Fall 1993, pp. 1-2.

131. McVey, S., "VR in Weapon Systems Development," CyberEdge Journal Special Edition: Military & Aerospace, vol. 1, no. 1, Fall 1993, p. 2.

132. "Medicine Meets Virtual Reality," as of Aug. 16, 1994. URL=http://bitmed.ucsd.edu/text-repository/mmvr.html

133. Metzger, J., "Verification," Simulation Validation Workshop Proceedings (SIMVAL II), A.E. Ritchie (ed.) (Alexandria, VA: Military Operations Research Society, April 1992).

134. Miller, T., and Milliner, R., "Toy Scouts," as of Aug. 15, 1994. URL=http://www.vsl.ist.ucf.edu/projects/scouts/scouts.html.

135. Nelson, T. H., Computer Lib: You Can and Must Understand Computers Now, and Dream Machines (texts printed together back-to-back and inverted), 1st ed. (Chicago, IL: [available from] Hugo's Book Service, c1974).

136. Nelson, T. H., Computer Lib; Dream Machines, rev. ed. (Redmond, WA: Tempus Books of Microsoft Press, 1987).

137. Null, C. H., and Jenkins, J. P., NASA Virtual Environment Research, Applications, and Technology (Washington, DC: National Aeronautics and Space Administration, 1993).

138. MUSE Demonstration, version 1.0, Sept. 16, 1993 (VHS videocassette), Division 1415, Sandia National Laboratory, P.O. Box 5800, Albuquerque, NM 87185, 1993.

139. Parrish, L., Space-Flight Simulation Technology, 1st ed. (Indianapolis, IN: H. W. Sams, 1969).

140. Pimentel, K., and Teixeira, K., Virtual Reality: Through the New Looking Glass (New York, NY: Windcrest Books, McGraw-Hill, Inc., 1992).

141. Prothero, J., "A Survey of Interface Goodness Measures," technical report R-94-1, 1994. URL=ftp://stein2.u.washington.edu/public/VirtualReality/HITL/papers/tech-reports/prothero-.ps.

142. Pulkka, A. K., "Sci.Virtual-Worlds Meta-FAQ (Frequently Asked Questions)," version 94.01, 1994. URL=ftp://stein2.u.washington.edu/public/virtual-worlds/Meta-FAQ.

143. Ramsdale, C., Division, Inc., personal communication, Aug. 22, 1994.

144. Ressler, S., "Movie Samples," April 11, 1994, as of Aug. 15, 1994. URL=http://nemo.ncsl.nist.gov/~sressler/OVRTmovies.html.

145. Ressler, S., "Open Virtual Reality Testbed," April 26, 1994. URL=http://nemo.ncsl.nist.gov/~sressler/OVRThome.html.

146. Ressler, S., "Surrogate Travel and MOSAIC," April 11, 1994. URL=http://nemo.ncsl.nist.gov/~sressler/projects/nav/sur/navSurr.html.

147. Reynolds, P.F., Jr., "DISorientation," Proceedings of Elecsim 1994: Electronic Conference on Constructive Training Simulation, R.D. Smith (ed.) (conducted on the Internet, April 11 - May 27, 1994). URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.ZIP, May 27, 1994. URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.sea.hqx, June 8, 1994.

148. Rheingold, H., Virtual Reality (New York, NY: Simon & Schuster, 1991).

149. Rheingold, H., The Virtual Community: Homesteading on the Electronic Frontier (Reading, MA: Addison-Wesley, 1993).

150. Richmond, A., "WebStars: Virtual Reality," as of Aug. 15, 1994. URL=http://guinan.gsfc.nasa.gov/WebStm#VR.html.

151. Ritchie, A.E. (ed.), Simulation Validation Workshop Proceedings (SIMVAL II) (Alexandria, VA: Military Operations Research Society, April 1992).

152. Roland, E., Kelleher, E., and Brandt, K., "Modeling Coalition Warfare: Multi-Sided Simulation Design," Proceedings of Elecsim 1994: Electronic Conference on Constructive Training Simulation, R.D. Smith (ed.) (conducted on the Internet, April 11 - May 27, 1994). URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.ZIP, May 27, 1994. URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.sea.hqx, June 8, 1994.

153. Scales, Brigadier General Robert S., Jr., U.S. Army, Certain Victory: The US Army in the Gulf War (Washington, DC: Office of the Chief of Staff, United States Army, 1993).

154. Scott, W. B., "Micro-Machines Hold Promise for Aerospace," Aviation Week and Space Technology, March 1, 1993, pp. 36, 37, and 39.

155. Seglie, E., and Sanders, P., "Accreditation," Simulation Validation Workshop Proceedings (SIMVAL II), A.E. Ritchie (ed.) (Alexandria, VA: Military Operations Research Society, April 1992).

156. Simsalabim Systems (Berkeley, CA), CyberScope demonstration software for IBM PC/AT, 1994. URL=ftp://cogsci.uwo.ca from the directory /pub/vr/cyberscope-demo/cyberfhn.exe. See also URL=ftp://cogsci.uwo.ca from the directory /pub/vr/cyberscope-demo/readme.1st.

157. Smith, R.D. (ed.), Proceedings of Elecsim 1994: Electronic Conference on Constructive Training Simulation (conducted on the Internet, April 11 - May 27, 1994). URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.ZIP, May 27, 1994. URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.sea.hqx, June 8, 1994.

158. Stampe, D., Roehl, B., and Eagan, J., Virtual Reality Creations: Explore, Manipulate and Create Virtual Worlds on Your PC (Corte Madera, CA: Waite Group Press, 1993).

159. Steinman, J. S., and Wieland, F., "Parallel Proximity Detection and the Distribution List Algorithm," Proceedings of Elecsim 1994: Electronic Conference on Constructive Training Simulation, R.D. Smith (ed.) (conducted on the Internet, April 11 - May 27, 1994). URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.ZIP, May 27, 1994. URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.sea.hqx, June 8, 1994.

160. Stipp, D., "'Phantom' Simulates Wielding a Scalpel, Tossing a Ball," Wall Street Journal, Aug. 23, 1994, pp. B1 and B6.

161. Stockfisch, J. A., Models, Data, and War: A Critique of the Study of Conventional Forces, RAND Report R-1526-PR (Santa Monica, CA: The RAND Corporation, 1975).

162. Sutherland, I.E., "The Ultimate Display," Information Processing, 1965; Proceedings of IFIP Congress 65, vol. 2, Kalenich, W.A. (ed.) (Washington, DC: Spartan Books, 1965).

163. Task Group on Virtual Reality, Virtual Reality Assessment, A Report of the Task Group on Virtual Reality to the High Performance Computing and Communications and Information Technology (HPCCIT) Subcommittee of the Information and Communications Research and Development Committee of the National Science and Technology Council (NSTC), Apr. 15, 1994.

164. The Economist, March 5, 1994, supplement: Manufacturing Technology Survey, "Living in the Material World," pp. 11-13.

165. U.S. Congress, House of Representatives, Con-ference Report on H.R. 4739, National Defense

166.

167.

168.

169.

170.

171.

172.

173.

174.

175.

Authorization Act for Fiscal Year 1991, Confer-ence Report, Congressional Record 139( 146, Pt.2): H12167, Oct. 23, 1990.U.S. Congress, Office of Technology Assess-ment, Who Goes There: Friend or Foe?, OTA-ISC-537 (Washington, DC: U.S. GovernmentPrinting Office, June 1993).U.S. Congress, Office of Technology Assess-ment, The Future of Remote Sensing ji-omSpace: Civilian Satellite Systems and Applica-tions, OTA-ISC-558 (Washington, DC: U.S.Government Printing Office, July 1993).U.S. Congress, Office of Technology Assess-ment, Data Format Standards for Civilian Re-mote Sensing Satellites, OTA-BP-ISC-114(Washington, DC: U.S. Government PrintingOffice, July 1993).U.S. Congress, Office of Technology Assess-ment, Assessing the Potential for Civil-MilitaryIntegration: Technologies, Processes, and Prac-tices, OTA-ISS-611 (Washington, DC: U.S.Government Printing Office, in press).U.S. Congress, Office of Technology Assess-ment, Distributed Interactive Combat Simula-tion, Background Paper (Washington, DC: U.S.Government Printing Office, in press).U.S. Congress, Senate Committee on Appropri-ations, Senate Report SR101-521, pp. 154-155.U.S. Congress, Senate Committee on Com-merce, Science, and Transportation, Subcom-mittee on Science, Technology, and Space, NewDevelopments in Computer Technology: VirtualReality, hearing, Serial No. 102-553, May 8,1991.U.S. Congress, Senate Committee on ArmedServices, Department of Defense Authorizationfor Appropriations for Fiscal Year 1993 and theFuture Years Defense Program, hearings, SerialNo. 102-883, part 1, May 21,1992, pp. 673-730.U.S. Department of Commerce, Industrial Out-look 1994 (Pittsburgh, PA: Superintendent ofDocuments, P.O. Box 371954, 1994), chapter27.URL=ftp://l4l .211 .190. 102/ebb/industry/outlook/indout27 .txt.U.S. Department of Defense, Advanced Re-search Projects Agency, “Simulation-Based De-sign System (SBDS),” announcement ofcontract award, Broad Agency Announcement


176. U.S. Department of Defense, Advanced Research Projects Agency, "Simulation-Based Design Workshop," Special Notice SN93-04, Commerce Business Daily, Issue PSA-0841, May 7, 1993.

177. U.S. Department of Defense, Advanced Research Projects Agency, "Research In Human-Computer Interaction," solicitation, Broad Agency Announcement BAA 93-42, Commerce Business Daily, September 20, 1993.

178. U.S. Department of Defense, Advanced Research Projects Agency, "Simulation-Based Design Enabling Technologies," solicitation, Broad Agency Announcement BAA 93-45, Commerce Business Daily, Issue PSA-0942, September 30, 1993.

179. U.S. Department of Defense, Advanced Research Projects Agency, "Advanced Biomedical Technology," solicitation, Broad Agency Announcement BAA 94-14, Commerce Business Daily, January 27, 1994.

180. U.S. Department of Defense, Advanced Research Projects Agency, "SBD Prototype Development," solicitation, Broad Agency Announcement BAA 94-39, Commerce Business Daily, Issue PSA-1107, June 1, 1994.

181. U.S. Department of Defense, Advanced Research Projects Agency, "BAA Proposer Information Packet," [for BAA 94-39], ARPA listserver, list "baa94-39", file "pip", as of Aug. 16, 1994.

182. U.S. Department of Defense, Advanced Research Projects Agency, "Frequently Asked Questions," [about BAA 94-39], ARPA listserver, list "baa94-39", file "faq," as of Aug. 16, 1994.

183. U.S. Department of Defense, Department of the Army, Headquarters, Army Model and Simulation Management Program, Army Regulation 5-11, 10 June 1992.

184. U.S. Department of Defense, Department of the Army, Headquarters, Verification, Validation, and Accreditation of Army Models and Simulations, Department of the Army Pamphlet 5-11 (DA PAM 5-11), 15 October 1993.

185. U.S. Department of Defense, Department of the Army, Army Model and Simulation Master Plan, May 1994.

186. U.S. Department of Defense, Department of the Army, Army Research Institute for the Behavioral and Social Sciences, ARI MacHTTP Server, as of Aug. 14, 1994. URL=http://alex-immersion.army.mil/vr.html.

187. U.S. Department of Defense, Department of the Army, Army Research Laboratory, Simulation Technology Division, Simulation Methodology Branch, "Synthetic Environments Technology for the Individual Soldier," as of Aug. 14, 1994. URL=http://info.arl.army.mil/ACIS/STD/SMB/VR/vr.html.

188. U.S. Department of Defense, Department of the Army, Army Research Laboratory, Simulation Technology Division, Simulation Methodology Branch, "STD Variable Resolution Terrain," as of Aug. 22, 1994. URL=http://info.arl.army.mil/ACIS/STD/SMB/VRT/vrt.html.

189. U.S. Department of Defense, "Defense Acquisition Management Policies and Procedures," Department of Defense Instruction Number 5000.2, February 23, 1991.

190. U.S. Department of Defense, "Modeling and Simulation Management Plan," Deputy Secretary of Defense Memorandum, June 21, 1991.

191. U.S. Department of Defense, "DoD Modeling and Simulation (M&S) Management," Department of Defense Directive Number 5000.59, January 4, 1994.

192. U.S. Department of Defense, "Online Modeling & Simulation Information System," DMSO Node, as of Sept. 22, 1994. URL=gopher://dmso.dtic.dla.mil:4350.

193. U.S. Department of Defense, "Online Modeling & Simulation Information System," TWSTIAC Node, as of Sept. 22, 1994. URL=gopher://yvette.iac.ist.ucf.edu:70.

194. U.S. Department of Defense, Office of the Chairman, The Joint Chiefs of Staff, Department of Defense Dictionary of Military and Associated Terms, Joint Pub 1-02 (Washington, DC: U.S. Government Printing Office, 23 March 1994).

195. U.S. Department of Defense, Office of the Under Secretary of Defense for Acquisition, Establishment of the Defense Modeling and Simulation Office, Memorandum, July 22, 1991.


196. U.S. Department of Defense, Office of the Under Secretary of Defense for Acquisition, Defense Science Board, Report of the Defense Science Board Task Force on Simulation, Readiness, and Prototyping: Impact of Advanced Distributed Simulation on Readiness, Training, and Prototyping, January 1993, slide 6.

197. U.S. Department of Defense, Office of the Under Secretary of Defense for Research and Engineering, Defense Modeling and Simulation Office, briefing to OTA staff, 20 May 1994.

198. "U.S. Forms Simulation Office," Defense News, August 1-7, 1994, p. 22.

199. U.S. General Accounting Office, Models, Data, and War: A Critique of the Foundation for Defense Analyses, PAD-80-21 (Gaithersburg, MD: U.S. General Accounting Office, March 12, 1980).

200. U.S. General Accounting Office, DoD Simulations: Improved Assessment Procedures Would Increase the Credibility of Results, PEMD-88-3 (Washington, DC: U.S. General Accounting Office, December 1987).

201. U.S. National Aeronautics and Space Administration, Johnson Space Center, Software Technology Branch, "STB Virtual Reality Lab," Aug. 11, 1994. URL=http://www.jsc.nasa.gov/~ml/vr/vr.html.

202. U.S. National Aeronautics and Space Administration, Johnson Space Center, Flight Crew Support Division, Graphics Research and Analysis Facility, "Virtual Reality at the GRAF," as of Aug. 15, 1994. URL=http://139.169.134.181/Homepages/GRAF-Lab.

203. U.S. National Aeronautics and Space Administration, Johnson Space Center, Software Technology Branch, "HST Training System," Aug. 11, 1994. URL=http://www.jsc.nasa.gov/~ml/Hubble/hubble.html.

204. U.S. Senate Hearing on Virtual Reality, PART ONE (VHS videocassette), R. Miller (ed.) (Palo Alto, CA: Pure Grain Film & Video, 1991).

205. U.S. Senate Hearing on Virtual Reality, PART TWO (VHS videocassette), R. Miller (ed.) (Palo Alto, CA: Pure Grain Film & Video, 1991).

206. Virtual Entertainment, Inc., "VR Slingshot," as of Sept. 22, 1994. URL=http://www.cts.com/~vrman. URL=ftp://ftp.cts.com/pub/vrman/vrs.exe.

207. "Virtual Reality: Tools, Trends and Applications," IEEE Spectrum, October 1993, pp. 22-39.

208. Virtual Worlds: How People Are Using Virtual Reality to Change the Real World ... for Better or Worse (VHS videocassette), R. Miller (ed.) (Palo Alto, CA: Pure Grain Film & Video, 1993).

209. Vomastic, V.S., "F/A-18 Single-Seat Versus Dual-Seat Crew Simulation," Phalanx (journal of the Military Operations Research Society), September 1989, pp. 19-23.

210. Vomastic, V.S., and Shea, D.P., Joint Strike Assessment--Platform Effectiveness (Task 3): Potential Applications of Virtual Reality Technologies for Strike Training and Mission Rehearsal, CIM 336 (Alexandria, VA: Center for Naval Analyses, January 1994).

211. Voss, L.D., A Revolution in Simulation: Distributed Interaction in the '90s and Beyond (Arlington, VA: Pasha Publications, Inc., 1993).

212. Williams, M.L., and Sikora, J.J., "Overview," Simulation Validation Workshop Proceedings (SIMVAL II), A.E. Ritchie (ed.) (Alexandria, VA: Military Operations Research Society, April 1992).

213. Wilson, A.L., and Weatherly, R.M., "New Traffic Reduction and Management Tools for ALSP Confederations," Proceedings of Elecsim 1994: Electronic Conference on Constructive Training Simulation, R.D. Smith (ed.) (conducted on the Internet, April 11-May 27, 1994). URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.ZIP, May 27, 1994. URL=ftp://ftp.mystech.com/pub/elecsim/ELECS194.sea.hqx, June 8, 1994.

214. Zyda, M., "The NPSNET Research Group," as of Sept. 16, 1994. URL=ftp://taurus.cs.nps.navy.mil/pub/NPSNET-MOSAIC/npsnet-mosaic.html.

Note added in proof: The National Academy of Sciences has announced the release of National Research Council, Committee on Virtual Reality Research and Development, Virtual Reality: Scientific and Technological Challenges (Anne Mavor, Study Director) (Washington, DC: National Academy Press, 1994). A press release summarizing the committee's conclusions and recommendations is available on the Internet by ftp, gopher, and World Wide Web: National Academy of Sciences, "Substantial Technology Gap Exists Between What Is Virtual, What Is Reality," Sept. 20, 1994. URL=http://xerxes.nas.edu:70/1/onpi/pr/virtual.

Appendix A: Acknowledgments

OTA gratefully acknowledges the assistance of the following individuals and organizations. The individuals and organizations listed do not necessarily approve, disapprove, or endorse this background paper; OTA assumes full responsibility for the background paper and the accuracy of its contents.

Frederick Brooks
University of North Carolina at Chapel Hill

Andrew L. Barrington
Cambridge Research Associates

Robert L. Clover
Institute for Defense Analyses

Marc De Groot
Immersive Systems, Inc.

Frank Hepburn
Kaiser Electro-Optics, Inc.

Thomas J. Hickey
Cambridge Research Associates

Jerry Isdale
Illusion, Inc.

Margaret Loper
University of Central Florida

Robert J. Mills, Jr.
Simulation Technologies, Inc.

Craig Ramsdale
Division, Inc.

Riley Repko
The Repko Group

Steve Seidensticker
Science Applications International Corp.

Dennis P. Shea
Center for Naval Analyses

Dave Stampe
University of Toronto

Al Stevens
LORAL Corp.

Ken Wimmer
Science Applications International Corp.



U.S. Department of Defense

Defense Modeling and Simulation Office
   Lieutenant Colonel Jerry Wiedewitsch, USA
   Gary Yerace

Department of the Air Force
Office of the Deputy Chief of Staff, Plans and Operations
   Directorate of Air Force Modeling, Simulation, and Analysis
      Colonel Carl M. Upson, USAF
      Ted Harshberger

Department of the Army
Office of the Deputy Under Secretary of the Army (Operations Research)
   U.S. Army Model Improvement and Study Management Agency
   Army Model and Simulation Management Office
      Colonel David E. Hardin
Army Materiel Command
   U.S. Army Materiel Systems Analysis Activity
      Wilbert Brooks

Department of the Navy
Office of the Deputy Chief of Naval Operations (Resources, Warfare Requirements, and Assessment)
   Assessment Division, Modeling and Analysis Section
      George Phillips

U.S. Marine Corps
Marine Corps Combat Development Command
   Marine Corps Modeling and Simulation Management Office
      Major Richard Powell, USMC

The Joint Staff
Operational Plans and Interoperability Directorate (J-7)
   Joint Warfighting Center
      Captain Stanley F. Bloyer, USN
      Commander Raoul Reese, USN
Force Structure, Resources, and Assessment Directorate (J-8)
   Robert H. Halayko

U.S. Department of Energy
Sandia National Laboratories
   Advanced Manufacturing Initiatives Department
      Arlan Andrews

Appendix B: Glossary of Acronyms

3-D      Three-Dimensional
6-D      Six-Dimensional
A2ATD    Anti-Armor Advanced Technology Demonstration
ADM      Advanced Development Model
ALSP     Aggregate Level Simulation Protocol
ARPA     Advanced Research Projects Agency (formerly DARPA, originally ARPA)
ATD      1. Aircrew Training Device; 2. Advanced Technology Demonstration
ATM      Asynchronous Transfer Mode
BOOM     Binocular Omni-Orientation Monitor
CAD      Computer-Aided Design
CART     Collision Avoidance Radar Trainer
CAT      Computed Axial Tomography
CAVE     CAVE Automatic Virtual Environment (a recursive acronym)
CCTT     Close Combat Tactical Trainer
CD-ROM   Compact Disk Read-Only Memory
CIG      Computer Image Generator
COEA     Cost and Operational Effectiveness Analysis
CORBA    Common Object Request Broker Architecture
CRT      Cathode-Ray Tube
DARPA    Defense Advanced Research Projects Agency (originally and now ARPA)
DIS      Distributed Interactive Simulation
DISA     Defense Information Systems Agency
DMD      Digital Micromirror Device
DOD      Department of Defense
DODD     Department of Defense Directive
DOS      Disk Operating System
DRR      Digital Radar Relay
DSI      Defense Simulation Internet
DWS      Distributed Wargaming System
EDP      Event Distribution Protocol
EISA     Extended Industry-Standard Architecture
ENIAC    Electronic Numerical Integrator and Calculator (or Computer)
FAA      Federal Aviation Administration
FOHMD    Fiber-Optic Head-Mounted Display
FTP      File Transfer Protocol
GE       General Electric [Company]
GRIP     Graphical Interaction with Proteins
HCI      Human-Computer Interface
HMD      Head-Mounted Display
HPCCIT   High-Performance Computing and Communications and Information Technology


IBM      International Business Machines (Corporation)
IEEE     Institute of Electrical and Electronics Engineers
IFFN     Identification, Friend or Foe or Neutral
I-Port   Individual Port
ISDN     Integrated Services Digital Network
IUSS     Integrated Unit Simulation System
JWFC     Joint Warfighting Center
KAT      Keyboard-Alternative Technology
LAN      Local Area Network
LC       Liquid Crystal
LCD      Liquid Crystal Display
LED      Light-Emitting Diode
MBONE    Multicast Backbone
MDT2     Multi-Service Distributed Training Testbed
MEW      Microwave Early Warning
MIT      Massachusetts Institute of Technology
MOSAIC   Models and Simulations, Army Integrated Catalog
MRM      Medical Regulating Model
MS-DOS   Microsoft Disk Operating System
MUSE     Multi-dimensional User-oriented Synthetic Environment
NASA     National Aeronautics and Space Administration
NATO     North Atlantic Treaty Organization
NGSB     Non-Government Standards Body (or Bodies)
NPS      Naval Postgraduate School
NPSNET   Naval Postgraduate School Network
NSTC     National Science and Technology Council
NTSC     1. National Television System Committee; 2. National Television Standards Convention; 3. Naval Training Systems Center
OEM      Original Equipment Manufacturer
OMG      Object Management Group
OTA      Office of Technology Assessment
OTH      Over-the-Horizon
PC       Personal Computer
PDU      Protocol Data Unit

RADAR    Radio Detection and Ranging (radar)
RDT&E    Research, Development, Testing, and Evaluation
RISC     Reduced Instruction Set Computer
RSAS     RAND Strategy Assessment System
SACEUR   [NATO's] Supreme Allied Commander-Europe
SAGE     Semi-Automatic Ground Environment
SBB      Synthetic Battle Bridge
SBD      Simulation-Based Design
SBDS     Simulation-Based Design System
SGI      Silicon Graphics, Incorporated
SIMNET   Simulator Networking
SMTP     Simple Mail Transfer Protocol
SONET    Synchronous Optical Network
SPOT     Système pour l'Observation de la Terre
STEP     Standard for Exchange of Product Model Data
STOW     Synthetic Theater of War
STP      System Training Program
TCP      Transmission Control Protocol
TCP/IP   Transmission Control Protocol/Internet Protocol
TE       Thermo-Electric
TRP      Technology Reinvestment Program
TV       Television
TWBNet   Terrestrial Wide Band Network
URL      Uniform Resource Locator
U.S.     United States
USA      United States Army
USAF     United States Air Force
UTD      Unit Training Device
VCASS    Visually Coupled Airborne Systems Simulator
VCR      Video Cassette Recorder
VEI      Virtual Entertainment, Incorporated
VHS      Video Home System
VIEWS™   Virtual Interactive Environment Workspace
VIM™     Vision Immersion
VITAL    Virtual Image Takeoff and Landing
ViVED    Virtual Visual Environment Display (pronounced vivid)
VPL      1. Virtual Programming Language; 2. VPL Research, Inc.

VR       Virtual Reality
VRD      Virtual Retinal Display
VRT      Variable Resolution Terrain
VTC      Video Teleconferencing
VV&A     Verification, Validation, and Accreditation
WOW      Window on a World
WPC      Warrior Preparation Center
WWW      World Wide Web

U.S. GOVERNMENT PRINTING OFFICE: 1994 - 301-804 - 814/27414