Virtual Explorer: Interactive Virtual Environment for Education

Kevin L. Dean
kld@chem.ucsd.edu
Xylar S. Asay-Davis
Evan M. Finn
Tim Foley
Jeremy A. Friesner
Yo Imai
Bret J. Naylor
Sarah R. Wustner
University of California, San Diego
La Jolla, CA 92093-0339

Scott S. Fisher
Telepresence Media
San Francisco, CA USA

Kent R. Wilson
Department of Chemistry and Biochemistry
University of California, San Diego

Presence, Vol. 9, No. 6, December 2000, 505–523
© 2001 by the Massachusetts Institute of Technology

Abstract
The Virtual Explorer project of the Senses Bureau at the University of California,
San Diego, focuses on creating immersive, highly interactive environments for education and scientific visualization that are designed to be not only educational but also exciting, playful, and enjoyable. We have created an integrated model system on
human immunology to demonstrate the application of virtual reality to education,
and we’ve also developed a modular software framework to facilitate the further
extension of the Virtual Explorer model to other fields. The system has been installed internationally in numerous science museums, and more than 7,000 individuals have participated in demonstrations. The complete source code—which runs on
a variety of Silicon Graphics computers—is available on CD-ROM from the authors.
1. Overview and Purpose
The Senses Bureau is an undergraduate research group with a thirty-year
history of innovation in computer graphics and multimedia technology for education and scientific visualization. We at the Bureau believe that virtual reality
(VR) has excellent potential as an educational medium to supplement conventional techniques because it provides both greater interactivity and the
ability to create a convincing sense of immersion in the computer-generated
environment that is beyond what is possible with conventional textbook- and
blackboard-based educational approaches.
Many topics in science education involve processes that occur simultaneously on multiple time and length scales that are difficult to accurately represent, perceive, and visualize with traditional static media. Examples can be
found in complex fields such as immunology, astronomy, relativistic dynamics,
quantum mechanics, and rainforest ecology. We wanted to create a system that
would be suitable for a diverse target audience that includes several types of
educational venues, such as high school, college, and university institutions,
museums and other public places, and independent student use. Although we
do not feel that 3-D graphics technology can entirely replace conventional
classroom teaching techniques, we are convinced that properly implemented
virtual environments can serve as valuable supplemental teaching and learning
resources to augment and reinforce traditional methods.
The Virtual Explorer project employs a two-tiered approach to demonstrating VR's potential for scientific visualization, as well as to creating interactive
virtual environments for education. First, we’ve developed a proof-of-concept
Figure 1. Concept view of the Virtual Explorer theater in the
development lab.
Virtual Explorer system to demonstrate and study the
potential applications and benefits of an integrated VR
installation in an educational arena. This prototype installation currently runs our example module, which
focuses on human immunology (Figure 1). Second, we
have created a modular software framework and toolkit
for the further development of virtual reality for education based on the Virtual Explorer model. We envision
numerous applications for the Virtual Explorer as a visualization tool in diverse scientific fields and hope that
this toolkit (which is available from the authors in full
source code version for a wide variety of Silicon Graphics computers) will provide others with the means to
expand upon our work.
2. Background
In the past thirty years, many research and commercial efforts have investigated the application of new
media technologies to education. In particular, the development of computer-based interaction with educational material has enabled the development of learning
environments that can be personalized to better match
individual vocabularies, styles, and specific needs. More
recently, advances in interactive computer graphics have
enabled the development of user-interface technologies
that can immerse a student in these interactive learning
environments. It seems that the capabilities of these new
technologies facilitate learning through a process of self-paced exploration and discovery, in contrast to the more
traditional approach of instruction and memorization.
Through the interactive exploration of immersive environments, a student can engage in a curriculum that is
based on learning by doing, as well as encountering
subject matter in contexts that are more meaningful.
Several attempts to develop immersive learning environments predate the use of computational technologies, and two of the most memorable specifically relate
to the human body. A surviving example is the walkthrough scale model of the human heart at the Franklin
Institute science museum in Philadelphia, Pennsylvania.
Since the 1950s, visitors have been able to explore the giant chambers
of the heart surrounded by a soundtrack of booming
heartbeats. Later, in the 1970s, neurosurgeon Joseph
Bogen and artist David Macaulay developed a detailed
proposal for a thirty-story replica of the human brain as
a new museum for San Jose, California. Bogen intended
the structure as an important learning environment for
medical students studying neuroanatomy as well as for
the general public (Bogen, 1972).
For many decades, interactive real-time graphics have
been used for training applications that require the acquisition of specific skill sets for unique missions or purposes (such as the control of a variety of aircraft, automobiles, or ships). But their use for more-general
educational applications was not explored until the
recent development of lower-cost hardware platforms
and powerful software tools. Recent research efforts that
examine the use of virtual environment technology in
education include:
• Science Space—This joint research effort between George Mason University and NASA's Johnson Space Center is developing a series of "virtual reality microworlds" for teaching science concepts and skills through the use of an interactive virtual laboratory configuration. Current modules include
NewtonWorld, MaxwellWorld, and PaulingWorld
(Dede, 1996; Salzman, 1999).
Figure 2. Students can interact with the immune system in multiple environments (left to right: blood vessel, cell surface, lymph node).
• Zengo Sayu—This immersive, interactive virtual
environment is designed to teach Japanese prepositions to students who have no prior knowledge of
the Japanese language. In one configuration, students can hear digitized speech samples representing the Japanese names of many virtual objects and
their relative spatial location when touched by the
user in the virtual environment. The system was
developed at the Human Interface Laboratory at
the University of Washington (Rose, 1996).
• Anatomic VisualizeR—This interactive, immersive,
virtual environment for teaching anatomy at the
university level was developed at the Learning Resources Center in the School of Medicine, University of California, San Diego (Hoffman, 1999).
3. The Mission
The Virtual Explorer allows students to interactively explore the immune system at both the cellular
and molecular scales, and at more familiar time and
length scales, while still retaining a sense of overall systemic scale. Students are free to explore realistic virtual
environments that include blood vessels, cell surfaces,
and lymph nodes, while carrying out detailed missions
and several series of assigned tasks (Figure 2). We seek
to provide students not only with a means to explore
the structure, appearance, and function of various components of the immune system, but also with a tool for
gaining an understanding of the interactions among
these components.
We present immunology in a rich, game-like environment that features compelling visual and interactive qualities and that has been designed to be attractive to students
who have been raised in an age of computer games and
music videos. An entertaining background story, whose
plot is set on an isolated spacecraft, captures the user’s
imagination with a fantastic setting and expands the mission beyond its immunological content (Figure 3).
After selecting the immunology module (Figure 4),
the student is presented with a brief computer-animated
movie that describes an ill-fated mission into deep
space. Returning with samples of a dangerous off-world
bacteria, the transport ship USS Archon suffers an explosion caused by an unnoticed fuel leak in the propulsion system. This explosion allows the bacteria to escape
and to contaminate the ship’s air supply, resulting in the
infection of the pilot, the ship’s sole crew member. He
possesses only minimal medical knowledge, and the
ship’s supply of antibiotics has proven useless against
this foreign pathogen. Being an accomplished engineer,
however, the pilot has been able to modify the remote-controlled nanobots that are normally used for repairing
the ship’s computers for operation within his own body
(Figure 5). Online references, a helpful “ship’s computer” character, and virtual tools are available to assist
the student-pilot in completing the mission. For example, the nanobot is equipped with several tools that aid
the pilot in carrying out this unique mission, including
a remote probe that allows the pilot to explore cell surfaces at the molecular scale, a vacuum for collecting bacterial samples, and monoclonal antibody-based protein dye jets for identifying different types of white blood cells (Figure 6).
Figure 3. A solo space mission gone terribly wrong: Background story for the Virtual Explorer’s immunology module.
Figure 4. The immunology module. Diverse scientific disciplines ranging from astronomy to quantum mechanics are also candidates for the
Virtual Explorer.
Figure 5. Remote-controlled nanobots. These nanobots provide a
vehicle allowing students to interact with the immune system at
microscopic scales.
Additionally, the nanobot's outer hull can be dynamically modified so that it can emulate cell surfaces and functionality. Fortunately, in addition to its quirky personality,
the ship’s computer is equipped with an extensive database
on human immunology, thus allowing it to offer guidance
during the mission and to recommend a course of action
to the pilot. The pilot must use the nanobot to identify
and explore the site of infection, emulate the function of
the damaged component of the immune system, and initiate a successful immune response. The mission’s level of
difficulty, the overall sense of urgency, and the video game-like appeal are all heightened by challenges such as finite resources (for example, the number of times the protein dye jets can be fired), damage incurred by the nanobot ship (from collisions, bacterial toxin, and phagocytic cells), and the amount of time allowed to complete each task (Figure 7).
Figure 6. Virtual tools. Such tools, including a bacterial sample collection vacuum shown here, assist students in performing assigned tasks.
Although the “ship’s computer” character functions
in an advisory capacity, offering verbal and textual support to guide student-pilots through the various missions, the ultimate course remains under the student’s
control. Help screens, which appear in the plane of the
Figure 7. An optional display, keeping the user updated about the nanobot status (left to right: hull structural integrity, protein dye jets
remaining, current viewing scale, and time remaining for current task).
Figure 8. An example of the help screens providing students with more-detailed information about each cell or protein they encounter.
screen upon user command, contain information that is
essential to understanding the tasks to be performed,
including visual simulations, as well as information
about cells and proteins encountered in the simulation
(Figure 8).
Full-motion video animation provides outlines both
of the relevant immunology and of the specific tasks
from a third-person perspective, providing crucial support for students in understanding their intended roles
(Figure 9).
Additionally, students can pause the simulation at any
time to access database information and simulation controls through a simple pop-up menu system (Figure 10).
In this manner, mission outlines, help screens, and
animated mission briefings can be reviewed throughout the simulation. Added text and spoken support serve to augment the visual cues that are provided in mission briefings and help screens. For those students who continue to have difficulty, a "hint" functionality is also available, which provides explicit instructions for the task at hand and becomes increasingly specific as the
student continues to have difficulty and requests additional help. It can be reviewed as needed for assistance in completing the mission.
Figure 9. Full-motion video animation complementing audio and textual instructions in introducing students to assigned tasks.
Overall, this multifaceted
help system has played a key role in making the simulation accessible and relevant to a broad target audience.
It provides students with sufficient information to make the Virtual Explorer accessible to inexperienced users, yet without sacrificing the challenge that retains the interest of more-advanced users.
Figure 10. The familiar pop-up menu system, providing easy access to nanobot functions for novice users.
The Virtual Explorer’s immunology module currently
contains two interactive missions (Figure 11). Following
the brief introductory movie, the user is given a training
mission in which the user can explore and observe the
site of a bacterial infection and must collect a bacterial
specimen for analysis (Figures 12 and 13).
This first mission introduces the user to the look and
feel of the virtual environment and also allows familiarization with the controls. Students are also challenged
with phagocytic components of the innate immune system (such as neutrophils) and must master appropriate
piloting skills to complete this mission. Upon completing this mission, the student can decide to emulate one
of several white blood cells (currently, only the helper T
cell is available) and he or she must use the nanobot to
fulfill this character's role in an immune response. In the
“Helper T Cell Mission,” we present a compromised
immune system that the student can "repair" by piloting a small nanobot ship in such a way as to fulfill the
role of a helper T cell in a humoral immune response.
The inherent complexity of the immune system, however, makes it impossible for one mission to touch upon
the entire range of material and issues that are presented
to students in an immunology course. Eventually, we
hope that others will go beyond this work and add missions that detail the involvement of other components
of the immune system which can be explored through
the individual viewpoints of those components. Ideally,
such future missions (such as “killer T cell” or “neutrophil” missions) would expand upon the helper T cell
mission’s focus and include additional facets of immunology, such as the innate and cell-mediated immune
responses.
Mission outlines were scripted to maximize user interaction and freedom, while still providing sufficient support to guide even those users with no immunology
background. Missions are divided into individual tasks,
thus establishing a series of mini-goals which are presented to the user in a scavenger-hunt fashion.
Preliminary user feedback revealed that clear mission
outlines must not only be presented before each task (to
provide clear instructions for that task) but must also be
continually available for review during task execution.
Although the mission outlines and help screens have
been made clear and simple, the virtual environments
have also been carefully constructed to show as much
relevant detail as possible. Although much of the simulation’s visual detail is not referenced in the mission outlines (Figure 14), we have found that providing visual
accuracy is essential to avoid misleading users who have
limited immunology backgrounds and to maintain the
simulation’s relevance for more-experienced users. A
detailed Website provides additional scientific information about each of the models in a glossary format.
4. Educational Content
We chose immunology—one of the most complex subjects studied by students of biology and medicine—as the subject for the first module because it presents unique visualization challenges. Its processes occur
simultaneously in diverse locations of the body and often on time and length scales that, although too small
to be directly perceptible, still vary over several orders of
magnitude.
Figure 11. The immunology module, allowing the student to select from missions that emulate the roles of key players in the immune system, as well as an introductory training mission.
Figure 12. Detail from the training mission. A shard of glass creates an opportunity for bacteria to enter the body.
Consequently, the study of basic immunology presents several common conceptual pitfalls,
which we feel can be clarified with properly implemented interactive virtual environments. The compartmentalization of instructional material that is required for the efficient organization of a textbook makes it difficult for students to gain an overall "road map" of the immune response while still retaining a sense of the details of each microenvironment. Thus, processes and microenvironments are usually studied individually so that each can be explored in detail, but the systemic relationship among these details often remains difficult to
conceptualize.
One common misunderstanding that interactive 3-D
graphics are particularly well suited to clarify is the concept of relative scale. Textbooks and other static teaching materials are inherently limited in their abilities to
simultaneously show microscopic details and the larger
macroscopic systems within which they operate.
Consequently, textbooks and the like are often unable
to clearly represent the vast scale differences that are key to
immunology (Figure 15). For instance, immunology texts
often utilize schematic diagrams that depict cell surface
proteins that are oversized and underpopulated by several
orders of magnitude. Although these diagrams are useful
Figure 13. Results summaries, concluding each task with an update on the current status of the immune system and providing an overview of the next task.
Figure 14. Text outlines of each task, augmented by full-motion video and available to students for review throughout each mission.
for conveying cell-protein identity and for suggesting the
mediation of cell-to-cell interactions through these proteins, students are unable to gain a sense of how much
smaller surface proteins are than typical cells. Additionally,
the implications in many diagrams that cell-to-cell interactions can be mediated by single surface proteins are inherently misleading (Figure 16).
The concept of relative concentration provides additional conceptual challenges that are similar to those
encountered in the exploration of relative scale. Students are often required to memorize lists of average
concentrations, but, without a visual representation of
these numbers, it is very difficult to understand the implications of ratios, which also can vary by several orders
of magnitude (Figure 17).
For example, in healthy individuals, red blood cells
outnumber white blood cells by a ratio of almost 700 to
1. Similarly, IgM and IgD surface receptors are typically
several times as abundant as MHC Class I and Class II
proteins on the surfaces of mature B cells.
Interactive 3-D graphics can provide students with a
visual model that helps them gain a basic understanding
of the relative frequency of occurrence of different components. Certain components, however, are so rare that
we are required to exaggerate measured concentrations
in our VR presentation simply to include even a few
specimens. For example, the relative concentrations of
monocytes and granulocytes in the bloodstream are so
low that they could appear to be virtually nonexistent
among the many red blood cells. The representation of
important constituents with vanishingly small concentrations requires that we include a few specimens in the
Figure 15. Differing scales. Depicting scales that differ by several orders of magnitude is a task well suited to interactive computer graphics (left to right: blood vessel at 2,000× magnification, cell surface at 1,000,000× magnification).
Figure 16. Surface proteins. These proteins allow for recognition
and signaling between cells and are often misrepresented by
immunology textbooks in both scale and population.
simulation to remind the student of their essential roles.
Although we would have preferred to have shown exact
concentrations, we were limited by available computational power.
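To make this trade-off concrete, the sketch below shows one way such a population budget could be allocated: instance counts follow the relative concentrations, but any constituent flagged as essential is clamped to a small minimum so that at least a few specimens appear. The function, type names, and ratios are illustrative assumptions, not code or data taken from the Virtual Explorer source.

```cpp
#include <algorithm>
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical illustration (not from the Virtual Explorer source): divide a
// fixed rendering budget among cell types according to relative concentration,
// but never let an "essential" type fall below a small visible minimum.
struct CellType {
    std::string name;
    double relativeConcentration;  // e.g., red blood cells outnumber white ~700:1
    bool essential;                // must appear at least a few times on screen
};

std::vector<int> allocateInstances(const std::vector<CellType>& types,
                                   int budget, int minEssential = 3) {
    double total = 0.0;
    for (const CellType& t : types) total += t.relativeConcentration;

    std::vector<int> counts;
    for (const CellType& t : types) {
        int n = static_cast<int>(budget * t.relativeConcentration / total);
        if (t.essential)
            n = std::max(n, minEssential);  // exaggerate vanishingly rare types
        counts.push_back(n);
    }
    return counts;
}

int main() {
    // Ratios are rough and for illustration only.
    std::vector<CellType> blood = {
        {"red blood cell", 700.0, false},
        {"neutrophil",       0.6, true},
        {"lymphocyte",       0.3, true},
        {"monocyte",        0.05, true},
    };
    std::vector<int> counts = allocateInstances(blood, 2000);
    for (size_t i = 0; i < blood.size(); ++i)
        std::printf("%-16s %d instances\n", blood[i].name.c_str(), counts[i]);
    return 0;
}
```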
Another area that is particularly enhanced by interactive 3-D graphics is the description of shape and structure. The characteristic shapes of cells, proteins, and
receptors have critical implications for binding, function, and identification. Structural differences between
MHC Class I and Class II, for example, are critical in
determining the nature of the immune response. Also,
lymphocytes are very difficult to distinguish visually,
although such discrimination is often critical to the understanding of an immune response.
“Virtual dyes”—which simulate the binding of monoclonal antibody dyes to the surface proteins of these
cells—allow the students to quickly identify subsets of B
and T cells in their native environment (Figure 18). Additionally, static teaching materials such as textbooks
often fail to remind students of the dynamics of the systems being studied. Cell surfaces, for example, are
highly uid and dynamic in nature, and surface proteins
are often free to migrate and diffuse across the surface.
A complete immune response involves a complex series of steps and interactions (Figure 19). For example,
the immune response to a bacterial infection might involve immediate inammation at the site of infection
and lymphocyte activation in some subset of the lymph
nodes or spleen, which is then followed by an antibody
and complement response, and so on. One common
misconception involves the locations of the immune
response: the primary adaptive immune response is actually mediated in the lymph node, rather than at the site
of infection (Figure 20). Because the processes in an
immune response occur at several different locations in
the body and involve important processes at several different length scales, the interactive visual simulation of
these processes is a potentially unique aid to understanding.
Figure 17. Virtual Explorer's depiction of the bloodstream, helping to clarify issues of relative cell size and population.
Figure 18. Protein dye jets, allowing students to visually identify different types of white blood cells based on their surface protein characteristics.
We therefore believe that immunology's visualization challenges make it especially well suited to demonstrate the benefits of interactive 3-D graphics for
education.
5. Hardware Configuration
The Virtual Explorer is currently running on a
four-processor Silicon Graphics Power Onyx. This
level of performance allows us to render in real time
six independent video signals which are split by an
MCO board to drive three contiguous displays in stereo, while still supporting well-populated virtual environments and fast frame rates. Rapid advancement in
computer hardware leads us to believe that this level
of computer graphics performance will be available at
the educational and consumer levels in the near future. In parallel, we have developed a version of our
system for the Silicon Graphics O2 workstation (a $5,000-$10,000 platform), as well as for various other Silicon Graphics workstations. The flexibility of the software framework has allowed us to easily adapt the Virtual Explorer for most Silicon Graphics IRIX-based hardware systems and their supported user input devices. (See Figure 21.)
Figure 19. Full-motion video animation, supplementing the interactive real-time graphics to demonstrate tasks to be performed as well as to give students a more comprehensive look at an immune response (left to right: the nanobot facilitates an immune response by emulating a helper T cell, shown here docking with a B cell; a complement cascade helps to carry out the final stage of an immune response).
Figure 20. Lymph nodes. Although often misunderstood or unfamiliar to students, lymph nodes take center stage as the foci of adaptive immune responses.
The Virtual Explorer installation in our lab is enclosed in a small soundproof theater (approximately 4 m
by 6 m) and employs three 52 in. rear-projection, consumer-grade television screens arranged at 120 deg. angles, creating a large window into the virtual environment. (See Figure 22.)
The graphics are driven by a four-processor Silicon
Graphics Power Onyx, with RealityEngine2 graphics and two RM4 raster managers.
Figure 21. The Virtual Explorer software in our most expansive version, running on a four-processor Silicon Graphics Power Onyx, which controls the interactive 3-D graphics and coordinates the simulation. Six-channel video output from the Power Onyx drives three large-screen displays that form a wraparound viewport into the virtual world (Figure 22). Four-channel spatialized sound is generated by a sound server running on an SGI Indigo2 Extreme, which communicates with the Onyx through TCP/IP. User input from a force-feedback joystick is processed through a Windows PC which also communicates with the Onyx via TCP/IP. (See Figure 27.) Another version runs on an individual single-processor SGI computer.
The Onyx uses an MCO
board to split the video signal into six independent
channels, and stereoscopic multiplexers combine these
channels into the three field-sequential stereo channels
that are displayed on the three large TV screens. Depending upon the available graphics hardware and the
level of processor performance, the software can also
support several other combinations of stereo and mono
video channels. (See Figure 23.)
Field-sequential stereo LCD shutter glasses (Figure
24), which are synchronized to the video field frequency
with two infrared transmitters, allow multiple students
to experience the virtual environment simultaneously.
Although we experimented with several stereo video
systems, we ultimately selected the VRex Mux-1 multiplexer system because of its support of the NTSC video
standard and its relatively low cost. Initially, we also
considered using a head-mounted display, but preferred
the greater versatility, comfort, and ability to handle
large numbers of users that our current large-screen system provides. It presently accommodates approximately
fifteen observers, and this capacity is theoretically limited only by the range of the infrared transmitters (approximately 10 ft. to 12 ft.) and the size of the viewing room.
Figure 22. Three large-screen, rear-projection monitors, creating a wraparound viewport into the virtual world.
6. User Interface
Depending upon the requirements of the physical
installation, the Virtual Explorer system can accommodate multiple user input devices. To be effective, the
interface paradigm must be easily understandable, especially by nontechnical users. We believe that acceptable
user input devices must provide a familiar interface that
is relatively simple and easily recognized so that students
can focus on interacting with the simulation and not on
mastering the controls (Figure 25).
We are currently using a CH Products force-feedback
flightstick and throttle, which—in addition to providing
an interface that is already found in many computer
video games—also provides the level of control necessary to successfully navigate in a dynamic three-dimensional environment (Figure 26). Force-feedback capabilities allow properties of the environment (such as
viscosity) to be tactually communicated to the user, and
enhance the user's experience of immersion in the virtual environment by reflecting ship collisions, speed,
and acceleration.
Figure 23. The Onyx generating six-channel video (RGBS), which is processed through RGBS-to-composite video encoders (CV-233). Stereoscopic multiplexers (VR-MUX 1) interlace left- and right-eye images for each of three screens, which are displayed on large, rear-projection displays. Infrared transmitters, which are connected to each of the outside monitors, synchronize stereo shutter glasses to the 60 Hz video field frequency.
Figure 24. Field-sequential stereo shutter glasses, providing a full three-dimensional experience.
Although joystick control is not very processor intensive, the scarcity of joystick-type input devices for SGI computers led us to choose this system,
which is driven by a Windows NT PC communicating
with the Onyx via TCP/IP (Figure 27). Additionally,
Virtual Explorer also supports the Nintendo 64 controller (connected directly to an SGI serial port with an
adapter box) and Microsoft’s Sidewinder ForceFeedback
Pro Joystick.
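As a rough illustration of that arrangement, the sketch below packs joystick state into a small packet and streams it to the simulation host over a TCP connection (compare Figure 27). The packet layout, address, port, and update rate are invented for the example, and the sketch is written against ordinary BSD sockets rather than the actual Windows NT driver code, which the paper does not reproduce.

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>

// Hypothetical packet layout and addresses -- not the actual Virtual Explorer
// wire format. A real sender would also serialize fields explicitly instead of
// shipping a raw struct, to avoid padding and byte-order surprises.
struct JoystickPacket {
    float x, y, throttle;  // normalized axes
    uint32_t buttons;      // one bit per button
};

int main() {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) { perror("socket"); return 1; }

    sockaddr_in host = {};
    host.sin_family = AF_INET;
    host.sin_port = htons(7000);                         // example port
    inet_pton(AF_INET, "192.168.1.10", &host.sin_addr);  // example address of the Onyx

    if (connect(fd, reinterpret_cast<sockaddr*>(&host), sizeof(host)) < 0) {
        perror("connect");
        return 1;
    }

    for (;;) {
        JoystickPacket p = {};
        // In a real sender these values would come from the joystick driver;
        // they are left at zero here only to keep the sketch self-contained.
        if (send(fd, &p, sizeof(p), 0) != static_cast<ssize_t>(sizeof(p))) break;
        usleep(16000);  // roughly 60 updates per second
    }
    close(fd);
    return 0;
}
```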
Figure 25. Stereo shutter glasses and large-screen displays combine with a familiar force-feedback joystick and throttle to provide an interactive and immersive learning experience.
Navigating the nanobots has proven to be the most challenging issue for users with limited computer gaming
experience. Although we’ve found that a certain degree of
difficulty in navigation is essential in maintaining excitement for experienced users, it was also clear that inexperienced users must be able to control the most basic functions of the craft simply to complete the assigned missions.
Figure 26. ForceFX force-feedback joystick and throttle from CH Products provide a flightstick-style navigation interface.
Mechanisms for obtaining additional help and instructions had to be made easily understandable and
readily identifiable. Creating a simple hardware-software interface that was easy to learn and operate—yet that still provided access to the many controls required by the user during the simulation—proved to be one of the more persistent design challenges that we encountered. Many users find it difficult to remember the functions of many relatively nondescript buttons (such as may exist when each
button controls a separate function).
In an early attempt to deal with this problem, we
added a speaker-independent, speech-recognition feature to the software. This feature was supposed to assume the burden of controlling many nanobot auxiliary
functions. Based upon commercially available speech-recognition software, the software listens for verbal
commands such as “computer, start engines,” and relays
the appropriate signal to the simulation. We quickly discovered several problems, however, which convinced us
to pursue other solutions. The main problem was the
noisy environment within which Virtual Explorer typically runs; the system we tested requires that the environment be virtually free of ambient background noise.
Virtual Explorer, however, generates substantial background audio (engine hum, blood-flow pulse, and the
like), which made the speech recognition substantially
less accurate and essentially incompatible.
Ultimately, a much more modest solution proved
most successful in providing students with the option of
a simplified user interface while still maintaining the
same level of user control. The Virtual Explorer software contains a menu-based control system (similar to
familiar PC GUIs) that can be used in place of the joystick buttons to access online help and to control nanobot auxiliary functions. Users who are more comfortable
with this interface can use it instead of the joystick buttons, although the joystick is still used for navigation.
Audio in Virtual Explorer is carefully designed to enhance the user’s sense of immersion, as well as to allow
students to better orient themselves within the virtual environment. Background music (based on the ProTracker
standard) aids students in distinguishing among different
scales and environments. Students can also identify spatial
relationships between the “ship” and the objects in the
virtual environment by 3-D sound, and thereby benefit
from a heightened sense of immersion and overall enhanced awareness of the dynamics of the environment.
Our audio system supports multiple sound file formats and
multiple independent audio channels (based on hardware
capabilities), which allow for both global (mono) and localized sound effects. We have created our own spatialized
audio algorithm which allows us to successfully mimic 3-D
audio, including simple panning, localization, and Doppler
shift effects. The audio system can be controlled either by
the same computer as the main simulation or a secondary
IRIX-based system that is connected to the graphics hardware via TCP/IP. Currently, the audio server is running
on a Silicon Graphics Indigo2, because our Onyx lacks
sound output. Four independent audio channels provide
quadraphonic sound and drive four high- and midrange
speaker systems, two directly driven bass speaker systems,
and two powered long-excursion subwoofers for visceral
effects.
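The paper does not reproduce the spatialization algorithm itself, so the sketch below only illustrates the kind of per-source, per-frame computation it describes: constant-power panning from the listener-relative azimuth, a crude inverse-distance rolloff for localization, and a Doppler pitch factor from the radial velocity. The names and constants are assumptions made for the example, not the Virtual Explorer audio server code.

```cpp
#include <cmath>
#include <cstdio>

// Generic sketch of panning, distance attenuation, and Doppler shift for one
// sound source, computed each frame. Illustrative only; not the Virtual
// Explorer audio server code.
struct SourceGains {
    float left, right;  // stereo pan shown here; quad panning works the same way
    float pitch;        // playback-rate multiplier for the Doppler shift
};

SourceGains computeGains(float srcX, float srcZ,  // source position, listener at origin facing -Z
                         float radialVelocity,    // m/s, positive = source receding
                         float speedOfSound = 343.0f) {
    const float kPi = 3.14159265f;
    float distance = std::sqrt(srcX * srcX + srcZ * srcZ);
    float azimuth  = std::atan2(srcX, -srcZ);       // 0 = straight ahead

    float pan = 0.5f * (1.0f + std::sin(azimuth));  // 0 = hard left, 1 = hard right
    float attenuation = 1.0f / (1.0f + distance);   // crude inverse-distance rolloff

    SourceGains g;
    g.left  = attenuation * std::cos(pan * kPi / 2.0f);  // constant-power pan
    g.right = attenuation * std::sin(pan * kPi / 2.0f);
    g.pitch = speedOfSound / (speedOfSound + radialVelocity);  // classic Doppler factor
    return g;
}

int main() {
    SourceGains g = computeGains(2.0f, -5.0f, -10.0f);  // ahead and to the right, approaching
    std::printf("L %.2f  R %.2f  pitch %.2f\n", g.left, g.right, g.pitch);
    return 0;
}
```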
7. Software Design
The Virtual Explorer software is written in C++,
based upon the IRIS Performer toolkit. Although we
considered other development options such as
OpenGL, Open Inventor, VRML, and proprietary packages such as World ToolKit, we ultimately chose Performer for several reasons: it allows us to freely redistribute the generated code, it provides a high-level graphics API while still allowing direct access to GL and lower-level rendering details, and it supports multiprocessing.
Figure 27. User input from a Windows PC and audio output to an SGI Indigo2 Extreme, linked to the Onyx by Ethernet and communicating with the Virtual Explorer software through TCP/IP.
Figure 28. Four-channel audio, generated by an audio server running on a Silicon Graphics Indigo2 Extreme that communicates with the Onyx through TCP/IP over an Ethernet connection. Front and rear audio signals are processed through separate amplifiers (AVR-10), resulting in effective spatialized sound. Four satellite speakers, two passive subwoofers, and two powered subwoofers provide a wide dynamic range.
We constructed the immunology module within the
Virtual Explorer software framework, which is constructed on top of Performer. This should facilitate easier and quicker development of additional missions,
modules, and educational worlds.
The basic graphics-rendering pipeline for Virtual Explorer is subdivided into six threads of execution, based
upon Performer’s multiprocessing framework: application, cull, draw, database, intersection (object collision
detection), and user I/O. The six threads can run on
one to four of the available processors, depending upon
machine configuration. The application thread controls
the high-level simulation, including mission progress,
object motions, and simple dynamics calculation (such
as the translational and angular momentum of the ship
and other objects). The database, user I/O, and intersection threads run asynchronously from the application
thread to maintain a constant and acceptable frame rate.
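For readers unfamiliar with Performer's process model, the fragment below sketches how such a split is typically requested at startup. It is reconstructed from the general OpenGL Performer C API (as used from C++) rather than taken from the Virtual Explorer source, so the exact flags and callback registration should be checked against the Performer documentation.

```cpp
#include <Performer/pf.h>

// Sketch of a Performer startup that requests separate application, cull, and
// draw processes plus forked database and intersection processes.
// Reconstructed from the general OpenGL Performer C API, not copied from the
// Virtual Explorer source; verify the constants against the Performer man pages.
int main() {
    pfInit();

    // The process split must be requested before pfConfig(); Performer then
    // maps these logical processes onto however many CPUs are available.
    pfMultiprocess(PFMP_APP_CULL_DRAW | PFMP_FORK_DBASE | PFMP_FORK_ISECT);
    pfConfig();

    // Scene graph, channels, and the database/intersection callbacks
    // (pfDBaseFunc, pfIsectFunc) would be set up here.

    int running = 1;
    while (running) {
        pfSync();   // wait for the next frame boundary
        // ...update mission state, object motions, and simple dynamics;
        //    set running = 0 when the mission ends...
        pfFrame();  // kick off cull/draw and the asynchronous processes
    }

    pfExit();
    return 0;
}
```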
Virtual Explorer contains three basic scene types:
blood vessel (which is essentially linear), cell surface (essentially planar), and lymph node (volume-oriented).
(See Figure 2.) Variables such as clip-plane depth, fog
effect, global lighting characteristics, database paging
parameters, and motion models for the ship can be adjusted to differentiate between individual scenes. Scenes
are created based on a specified combination of fixed
geometry and procedural scene generation.
Each scene has specific information about fixed geometry, such as the shell of the lymph node, the nanobot extraction needle, or the shape and position of the
blood vessel. Additional scenery is created quasi-randomly and cached when the application is launched,
based on variables such as cell population and average
concentrations. This cached scenery can be dynamically
rearranged during the simulation. Earlier versions of the
software included actual dynamic generation of scenery
during the simulation, but that technique proved to be
too processor intensive to maintain a sufficient level of graphics performance. A voxel-based paging scheme dynamically reconfigures and pages cached geometry as needed during the simulation, allowing large scenes with large amounts of geometry to be simulated without sacrificing graphics performance and frame rate. Although the overall complexity varies significantly between scenes, most scenes contain between 3,000 and
8,000 textured polygons per frame. The RealityEngine2
allows us to maintain steady six-channel video with a
frame rate of approximately 20 Hz.
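A minimal sketch of such a scheme is shown below, under class and function names invented for illustration (they are not those of the Virtual Explorer source): scenery instances generated at launch are bucketed into voxels, and each frame only the voxels within a small radius of the nanobot are treated as active and left attached to the scene.

```cpp
#include <cmath>
#include <map>
#include <tuple>
#include <vector>

// Hypothetical voxel-paging sketch; the names are illustrative, not those used
// in the Virtual Explorer source. Scenery instances are generated and cached
// per voxel at startup; each frame, only voxels near the viewer stay attached
// to the scene ("paged in").
struct Vec3 { float x, y, z; };

struct VoxelKey {
    int ix, iy, iz;
    bool operator<(const VoxelKey& o) const {
        return std::tie(ix, iy, iz) < std::tie(o.ix, o.iy, o.iz);
    }
};

class VoxelPager {
public:
    explicit VoxelPager(float voxelSize) : size_(voxelSize) {}

    // Called once at launch: quasi-random placement driven by average
    // concentrations, bucketed into voxels.
    void cacheInstance(int instanceId, const Vec3& position) {
        cache_[keyFor(position)].push_back(instanceId);
    }

    // Called each frame: the instances that should currently be attached to
    // the scene graph, i.e. those within `radius` voxels of the nanobot.
    std::vector<int> activeInstances(const Vec3& ship, int radius) const {
        std::vector<int> active;
        VoxelKey c = keyFor(ship);
        for (int dx = -radius; dx <= radius; ++dx)
            for (int dy = -radius; dy <= radius; ++dy)
                for (int dz = -radius; dz <= radius; ++dz) {
                    VoxelKey k = {c.ix + dx, c.iy + dy, c.iz + dz};
                    auto it = cache_.find(k);
                    if (it != cache_.end())
                        active.insert(active.end(), it->second.begin(), it->second.end());
                }
        return active;
    }

private:
    VoxelKey keyFor(const Vec3& p) const {
        return {static_cast<int>(std::floor(p.x / size_)),
                static_cast<int>(std::floor(p.y / size_)),
                static_cast<int>(std::floor(p.z / size_))};
    }

    float size_;
    std::map<VoxelKey, std::vector<int> > cache_;
};

int main() { return 0; }  // structure only; renderer integration is omitted
```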
The simulation contains biologically accurate scale
models of over thirty different cells and proteins that are important to the study of immunology.
Figure 29. Electric Garden at SIGGRAPH '97.
Cells have been
modeled at the scale of 1:2,000 and proteins at
1:1,000,000, which is consistent with the two viewing
scales available to the user. We have created these models and defined their interactions based upon available microscopy images, x-ray crystallography, and NMR structures, as well as other structural data. Each model typically contains five geometric levels of detail and has an associated information file with the defining characteristics that are used by the simulation. Additionally,
each model is accompanied by a help screen containing
information of interest to the student (Figure 8). Techniques such as object sequences (which allow for morphing models) and dynamic texture shifting (which allows for protein “dyeing”) show biological
characteristics and improve the interaction between the
user and the individual objects in the simulation.
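One plausible shape for such a per-model record, together with a distance-based level-of-detail pick, is sketched below; the field names, thresholds, and layout are invented for illustration and do not reproduce the Virtual Explorer information-file format.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Illustrative per-model record; the field names and thresholds are invented
// and do not reproduce the Virtual Explorer information-file format.
struct ModelInfo {
    std::string name;              // e.g., "helper T cell"
    double scale;                  // 1:2,000 for cells, 1:1,000,000 for proteins
    std::vector<float> lodRanges;  // switch-out distances for ~5 geometric LODs
    std::string helpScreen;        // text shown when the student asks for help
    bool dyeable;                  // supports the protein "dyeing" texture shift
};

// Pick the geometric level of detail for the current viewing distance:
// index 0 is the most detailed mesh, larger indices are progressively coarser.
int selectLOD(const ModelInfo& m, float distance) {
    for (size_t i = 0; i < m.lodRanges.size(); ++i)
        if (distance < m.lodRanges[i])
            return static_cast<int>(i);
    return static_cast<int>(m.lodRanges.size());  // beyond the last range: coarsest
}

int main() {
    ModelInfo tCell = {"helper T cell", 1.0 / 2000.0,
                       {5.0f, 15.0f, 40.0f, 100.0f, 250.0f},
                       "CD4+ lymphocyte that coordinates the humoral response", true};
    std::printf("LOD at 30 units: %d\n", selectLOD(tCell, 30.0f));
    return 0;
}
```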
8. Conclusions
The response from the educational, scientific, and
computer graphics communities has been very positive.
More than 7,000 people have already participated in
demonstrations (Figure 29). We are distributing the
complete source code and installer scripts for a variety of
Silicon Graphics computers, with illustrated instruction
manuals included, as a CD-ROM. Several science and
technology museums have licensed Virtual Explorer for permanent exhibits, and it has already been installed in the Heinz Nixdorf MuseumsForum (Figure 30) in Paderborn, Germany (for which we wrote a German version of the text and audio track) and the Tech Museum of Innovation (Figure 31) in San Jose, California.
Figure 30. Software Theater at the Heinz Nixdorf MuseumsForum in Paderborn, Germany.
Other
installations are in the planning stages. Future directions
for study may include characterization of the educational benefits of interactive three-dimensional virtual
environments, like Virtual Explorer, over interactive, yet
non-immersive, two-dimensional systems.
Further information on the system and how to obtain
a video demonstration of Virtual Explorer (as well as the
CD-ROMs of the source code and instruction manuals)
can be obtained from the Virtual Explorer Website at
www-wilson.ucsd.edu/ve/.
Acknowledgments
We would like to thank the following individuals for their invaluable contributions to the Virtual Explorer project: April
Apperson (adviser for immunology), School of Medicine, University of California, San Diego (La Jolla, CA); Jon Christensen (former project director), Painted Word, Inc. (Cambridge, MA); Glen D. Fraser (adviser for interactive 3-D
graphics), Montreal, Quebec, Canada; David Goodsell (adviser for cellular and molecular visualization), Scripps Research
Institute (La Jolla, CA);
Figure 31. Life Tech Theater at the Tech Museum of Innovation in San Jose, California.
Mizuko Ito (adviser for educational
interface), Institute for Research on Learning (Menlo Park,
CA) and Stanford University (Stanford, CA); Teresa Larsen
(adviser for biology and computer animation), Scripps Research Institute (La Jolla, CA); Barbara Sawrey (adviser for
multimedia education and visualization), Department of
Chemistry and Biochemistry, UCSD (La Jolla, CA); Gabriele
Wienhausen (adviser for multimedia education and visualization), Department of Biology, University of California, San
Diego (La Jolla, CA); and Michael Zyda (adviser for interactive 3-D graphics), Department of Computer Science, Naval
Postgraduate School (Monterey, CA).
References
Bogen, J. E. (1972). A giant walk-through brain. Bulletin of
the Los Angeles Neurological Society, 37(3).
Dean, K. L., Asay-Davis, X. S., Finn, E. M., Friesner, J. A.,
Naylor, B. J., Wustner, S. R., Fisher, S. S., & Wilson, K. R.
(1998). Virtual Explorer: Creating interactive 3D virtual
environments for education. In M. T. Bolas, S. S. Fisher,
and J. O. Merritt (Eds.), Stereoscopic Displays and Virtual
Reality Systems V, Proceedings of SPIE—the International
Society for Optical Engineering, 3295 (p. 429), Bellingham,
WA.
Dean, K., Asay-Davis, X., Finn, E., Friesner, J., Naylor, B.,
Wustner, S., Fisher, S., & Wilson, K. (1997). Electric garden: The Virtual Explorer. Computer Graphics, 31(4), 16-17, 81.
Dean, K. L., Finn, E. M., Friesner, J. A., Naylor, B. J., Wustner, S. R., Wilson, K. R., & Fisher, S. S. (1997). Electric
garden: Virtual Explorer. In R. Hopkins (Ed.), Visual Proceedings: The Art and Interdisciplinary Programs of
SIGGRAPH 97 (p. 110), New York: Association for Computing Machinery.
Dede, C., Salzman, M. C., & Loftin, R. B. (1996). Science
space: Virtual realities for learning complex and abstract
scientiŽc concepts. In Proc. IEEE Virtual Reality Annual
International Symposium (pp. 246-253).
Hoffman, H. M., & Murray, M. (1999). Anatomic VisualizeR: Realizing the vision of a VR-based learning environment. In Medicine Meets Virtual Reality, The Convergence of
Physical and Informational Technologies: Options for a New
Era in Healthcare (pp. 134-140), IOS Press.
Kuby, J. (1997). Immunology (3rd ed.). New York: W. H.
Freeman and Company.
Rose, H., & Billinghurst, M. (1996). Zengo Sayu: An immersive educational environment for learning Japanese (Technical report). Seattle: University of Washington, Human
Interface Laboratory of the Washington Technology
Center.
Salzman, M. C., Dede, C., Loftin, R. B., & Chen, J. (1999).
A model for understanding how virtual reality aids complex
conceptual learning. Presence: Teleoperators and Virtual Environments, 8(3), 293-316.