2004 SBIR Phase I Technical Proposal

Simulation-Based Lunar Telerobotics Design, Acquisition and Training Platform for Virtual Exploration

in the topic

X5.02 Surface Exploration and Expeditions, Virtual Exploration
Proposal #9853

 

 

 

TABLE OF CONTENTS

PART   DESCRIPTION
       Proposal Cover
       Proposal Summary
1      Table of Contents
2      Identification and Significance of the Innovation
3      Technical Objectives
4      Work Plan
       4.1 Technical Approach
       4.2 Task Descriptions
       4.3 Meeting the Technical Objectives
       4.4 Task Labor Categories and Schedules
5      Related Research/Research and Development
6      Key Personnel and Bibliography of Directly Related Work
7      Relationship with Phase II or Future R/R&D
8      Company Information and Facilities
9      Subcontracts and Consultants
10     Potential Applications
       10.1 Potential NASA Applications
       10.2 Potential Non-NASA Commercial Applications
11     Similar Proposals and Awards
       Proposal Budget

Part 2 - Identification and Significance of the Innovation

 

The Solicitation

 

2.1 Innovation: The declared new exploration vision and the strategic role of a new Telerobotics Simulator and Physical Test Fixture

The solicitation for this topic calls for innovation in the way that people will interact with both physical and virtual data sets using multi-sensory displays and interfaces (including force-feedback) to support richly endowed situational awareness and telerobotics.

 

The exploration systems called for in the President's new vision for space [1] are in alignment with the goals of this topic. For example, from the recommendations of the President's (Aldridge) Commission report [2] it is clear that a return to the Moon will require an extensive period of imaging and surface telerobotic operations to select and prepare a site for a human crew on long duration (90 days or longer) stays. The new exploration initiative calls for in-situ resource utilization (ISRU) of lunar materials. Prior to any extensive mining and smelting of lunar ores or extraction of fuels, 3He or water, the very first ISRU application will focus on creating a safe living environment for crews. A leading health hazard on the lunar surface, as in space, will be exposure to radiation, especially during peak sunspot cycles or in the event of a radiation spike from an unpredictable Coronal Mass Ejection (CME). Radiation shielding solutions for pressurized lunar habitats therefore must be highly effective and capable of being enhanced on short notice. A leading solution for extendible shielding is the layering of lunar regolith above a pressurized habitat or the burying of a habitat at sufficient depth to provide protection. In either case, there will be a need for an industrial grade excavator using a bucket wheel, a front end loader or another type of vehicle to be teleoperated on the lunar surface.

 

Given the criticality of reaching a high technology readiness level (TRL) for lunar base site preparation and shielding early in the development of Lunar exploration options, we propose the following systems-within-systems pathway approach to develop this capability:

 

1. Development of a telerobotics simulator supporting both a virtual and a physical test fixture which, as its first application, will allow the design, prototyping and testing of lunar excavators.

2. In subsequent phases, this test fixture will be employed in the following development spirals:

a. An all-virtual fixture with a simulated regolith environment and an excavator operating under analog dynamics.

b. The virtual test fixture driving a small scale physical excavator prototype article in a "sandbox" laboratory environment.

c. The virtual test fixture tele-operating a full scale excavator prototype in a terrestrial desert field site (such as the Nevada Test Site).

d. At a high TRL, the virtual test fixture would be a fully equipped telerobotic, latency-managing interface to actual flight hardware on the lunar surface, perhaps including several generations of excavators.

e. During lunar surface operations, the test fixture would be employed jointly by Mission Control Centers (MCC) and lunar crews to operate excavator equipment.

 

Figures 1 and 2 below illustrate how these development spirals permit one 3D simulation platform to support a range of TRLs from in-lab prototyping to full scale vehicles in terrestrial desert tests and finally as a mission operations tool.

Figure 1: Spiral Development pathway for 3D simulation-based mission support

Figure 2: Corresponding levels of development of a Lunar telerobotics development cycle

Focus of this Phase I proposal

 

During Phase I we will concentrate on the lowest TRLs of this capability by developing a virtual test fixture that performs a high caliber 3D dynamic reproduction of an actual lunar bucket wheel excavator prototype developed at the Colorado School of Mines. This test fixture will support both visual interfaces and traditional workstation-based interactivity from DigitalSpace, and a haptic force feedback interface integrated from Stanford University's Biocomputation Center. From NASA's Ames Research Center (ARC) we will employ the Brahms agent technology and the SimStation Procedures Module CAD representation and interaction, developed at ARC in collaboration with NASA JSC and Raytheon.

 

The long term vision

 

To stay on course at the beginning of a development cycle, a long term vision is necessary. We have developed the following story vision piece to support the long term goals of our proposed Simulation-Based Lunar Telerobotics Design, Acquisition and Training Platform for Virtual Exploration:

 

It is 2019 and late night in Houston but a brilliant day at Mare Smythii on the Moon. Lunar mission operators at NASA-JSC are easing the first piece of heavy equipment down the ramp and the haptic operator receives a satisfying force feedback "bump" as the big excavator encounters its first piece of lunar surface. The real time topography system flickers to life as the stereoscopic cameras on board the excavator produce a 360-degree model of the regolith terrain in all directions, indicating surface and subsurface anomalies including a boulder the size of a house buried by 3 meters of regolith.

At NASA's Ames Research Center in California, the excavator operator dons a lightweight immersive display and gets into the driver's seat. Having trained for years to master the highly sensitive haptic cab and accommodate the 3 second round trip signal delay, the operator is ready to "sense" the lunar surface herself as she drives the excavator forward. She has weeks of careful work ahead, piling lunar soil thickly on top of pressurized hab modules automatically landed at the chosen site for the base. The human crew is only a year away from entering the modules, and a lot of extra shielding is needed to keep them safe from high sunspot and coronal mass ejection radiation spikes.

As she shifts her virtual excavator's bucket down and piles up the first load of regolith, terabytes of data drive the particle system model presented in her heads up display. That model tells her the likely volume and makeup of the load based on the stereo cameras' real time analysis combined with input from the radar depth profiling system. Satisfied with the virtual excavator's performance in the model, she pulls the lever for the actual lunar excavator and switches to vehicle video to observe it at work on that first load. By load 99 the operator is as in tune with the excavator, the feel of the lunar terrain and gravity, and the job site as if she had been standing there at Mare Smythii with a shovel.

 

Prior Projects by DigitalSpace supporting this work

DigitalSpace has nine years of experience in the development and deployment of 3D simulation, training, design and multi user collaborative systems. Table 1 below details four recent projects which provide a strong foundation on which the work of this proposal will proceed. DigitalSpace's closely related prior work and partnership with many of the experts listed in this proposal lowers the risk to the project's successful completion. Project 1 below enabled us to develop lunar terrain and vehicle simulations from Boeing studies [3]; project 2 allowed us to develop experience with Stanford University's Biocomputation Center and their Spring haptic teleoperations environment [4]; project 3 permitted us to develop simulations of agent-driven human and robotic operations for the Brahms team at NASA ARC [5,6,7]; and project 4 permitted us to gain experience building virtual simulators for NASA ARC, JSC and the Neutral Buoyancy Laboratory in the telerobotic operation of a major upcoming repair to the ISS [8,9].

Table 1: DigitalSpace projects that form the background for this Phase I proposal

1. Simulation and animation of concepts from Boeing for Constellation/CEV and long duration lunar facilities.

Supporting prior DigitalSpace work: Support of Boeing CREATE workshop and follow-up work on Space Exploration Online (Module 1: Lunar Base), a Chairman's Innovation Initiative. (2004)



2. Simulation template and applications for crew health and safety on long duration missions for ISS and Constellation/CEV.

Supporting prior DigitalSpace work: VAST simulation for JSC and Lockheed Martin space medicine, use of CHeCS rack on ISS by crew performing emergency medical procedures, for refinement of procedures. (2004)

3. Simulation template and applications for the design of human-robotic systems and practices to support long duration surface facilities on the Moon or Mars.

Supporting prior DigitalSpace work: BrahmsVE/FMARS, MDRS/Mobile Agents with ARC and JSC. (2000-2004)

4. Simulation template and applications for rapidly developed, low cost just-in-time virtual training for in-flight ISS and future Constellation/CEV crews.

Supporting prior DigitalSpace work: SimStation SimEVA application - STS-114 CMG change-out crew refresher simulation for Raytheon and JSC/Neutral Buoyancy Laboratory. (2003-2004)

Part 3 - Technical Objectives

 

3.1 The Objective

This project will carry out the following objective, thereby providing a proof of concept for a key capability in support of NASA's new exploration mission:

This project will result in the creation of a simulator of an existing prototype remotely controlled lunar bucket wheel excavator, the testing of that virtual robotic vehicle inside a virtual lunar regolith simulant sandbox, and the final application of the combined virtual environment with haptic force feedback devices and multi-modal immersive displays.

Having achieved the above objective, DigitalSpace will be in a strong position to propose a Phase II project which would tie the Phase I virtual environment and its haptic force feedback devices and immersive displays to the physical bucket wheel excavator, creating a closed loop virtual and physical simulator of a tele-operated robotic lunar surface materials handling vehicle. Iterating this design would provide NASA the ability to advance the TRL of lunar surface operations vehicles and work practices and meet a cornerstone objective set by the Office of Exploration Systems (OExS).

Combining three technology elements:

Element 1: prototype Bucket Wheel Excavator

The Colorado School of Mines Center for Commercial Applications of Combustion in Space (CCACS) is one of the leading lunar ISRU centers in the world. CCACS built a prototype small bucket wheel excavator at approximately the scale of the rovers carried to Mars on the Mars Exploration Rover mission [10]. Testing of this prototype in a physical lunar simulant sandbox permitted the collection of data on forces exerted and power requirements for excavation, providing data on which more efficient designs can be based. This rover was able to excavate approximately one rover mass of material per hour. The rover and its bucket wheel assembly are pictured in figures 3 and 4 below.

Figure 3: CCACS Bucket Wheel Excavator

Figure 4: Detail of Bucket Wheel (CCACS)

Element 2: 3D virtual environments

Figures 5 and 6 below depict real-time computer graphic renderings from DigitalSpace's 2003-04 project with the VisOpps team at NASA ARC and JSC to experiment with realistic reconstructions of a surface robot vehicle, in this case the Mars Exploration Rover (MER). DigitalSpace employed its platform SimSpace to model MER with close attention to the rover's drive train dynamics. A simulated Martian terrain with analog surface, gravity and day/night cycles was implemented. The vehicle is "drivable" in real time and its contact with the synthetic surface was simulated through a sophisticated physics engine. NASA engineers felt that the model expressed a reasonable fit with actual MER dynamics on the Martian surface. Through this project, DigitalSpace has established a high degree of expertise and confidence in its ability to model surface vehicle operations [11].

Figure 5: DigitalSpace's virtual Mars Exploration Rover

Figure 6: Detail of MER rover showing physics engine and Rocker-Bogie suspension working on simulated rocky terrain

Element 3: Haptic force feedback interfaces

Low-level teleoperation permits the user to directly control the motions and contact forces of a remote manipulator in real time. The most common application of this technique is in construction equipment such as excavators, in which the operator controls the velocity of the joints of a machine to perform a physical task. Typical construction equipment does not provide sensory force feedback directly to the hand, which would be needed for highly efficient control of high latency tele-operations. Considerable engineering effort must be applied to reproduce the sensory feedback information which allows accurate and efficient control. Both teleoperation and the associated virtual environment visualization/training need this rich and self-consistent sensory feedback.

In both teleoperation and virtual environment applications of haptics, a loop is closed between the human operator's motion "inputs" and forces applied by the haptic device. In teleoperation this loop is closed via a communication link, robot manipulator, and the environment. In virtual environments, the loop is closed via a computer simulation [12]. Stanford University's National Biocomputation Center has developed a series of virtual gloveboxes and haptic virtual/physical force feedback applications for tele-surgery [13,14,15]. Based on Biocomputation's long history working with NASA ARC and JSC and its expertise and willingness to employ its Spring platform for this Phase I project, DigitalSpace has invited them to participate in this Phase I prototype project. Figures 7 and 8 below illustrate Stanford Biocomputation systems in action.
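The virtual environment side of this loop can be illustrated with a minimal sketch, assuming a simple penetration-based spring-damper contact model. The constants, function name and flat-surface model here are illustrative only, not taken from Spring or any actual device SDK:

```python
# Minimal sketch of a virtual-environment haptic loop: each cycle, the
# operator's probe position is read, penetration into a virtual surface is
# computed, and a spring-damper force is commanded back to the haptic device.
# All names and constants are illustrative, not from an actual device SDK.

SURFACE_HEIGHT = 0.0   # z-coordinate of the virtual regolith surface (m)
STIFFNESS = 800.0      # virtual spring constant (N/m)
DAMPING = 5.0          # damping coefficient (N*s/m)

def haptic_force(probe_z, probe_vz):
    """Force to render when the probe penetrates the surface (positive = up)."""
    penetration = SURFACE_HEIGHT - probe_z
    if penetration <= 0.0:
        return 0.0                      # no contact, no force
    force = STIFFNESS * penetration - DAMPING * probe_vz
    return max(force, 0.0)              # the surface pushes, never pulls

# One simulated cycle: probe 2 mm below the surface, moving down at 1 cm/s
force = haptic_force(-0.002, -0.01)
```

In a real haptic loop a function like this would run at a high fixed rate (commonly around 1 kHz), with positions read from, and forces written to, the device each cycle.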

Figure 7: Bimanual laparoscopic haptics device developed with Immersion and SUMMIT at Stanford

Figure 8: Tele-surgery hysteroscopy simulator at Stanford Biocomputation

New elements to be constructed

 

To achieve the integration and features described above we will be building a new series of integrated modules based on DigitalSpace's SimSpace platform and Stanford University Biocomputation Center's Spring platform.

Figure 9: New integration and module builds for this Phase I proposal

The existing and new components that will be used and integrated in this project are described next and depicted in figure 9 above.

Physical Excavator Model: The documentation, test results, and expert guidance from CCACS at the Colorado School of Mines will provide the background for this model, which informs the virtual telerobotic excavator.

SimSpace Virtual Telerobotic Excavator: Implementation of the virtual excavator as a dynamic 3D model will employ DigitalSpace's SimSpace platform. The model will adopt the following additional component technologies:

o Virtual Simulant Sandbox: A particle system or deformable surface model will emulate the sandbox used by CCACS in the original testing of the physical excavator.

o Analog Teleoperator Interface: DigitalSpace will design a 2D interface that will simulate telerobotic operations, including the multisecond latency that would be experienced during lunar surface operations.

o SimStation Procedures Module: This module, derived from the SimStation project with NASA ARC [8,9], will be employed to manage the CAD components and behavior of the virtual vehicle.

o Brahms Agents: The Brahms and BrahmsVE platform developed with NASA ARC [5,6,7] will be used to embody an agent for the robotic vehicle and another representing the operator. Brahms is currently used for the Mobile Agents projects testing human/robotic operations in terrestrial desert locations, so it is judged a low risk, capable platform for this project.

Spring Haptic Operator Model: The Stanford Biocomputation Center's Spring system will be utilized to generate a first step haptic force feedback implementation. Spring is also able to drive immersive (parabolic) displays, which will give an added dimension to this Phase I project.

The SimSpace and Spring implementations are both informed by the physical vehicle and by each other. The validation of future physical vehicle designs by the simulators and their operator interfaces is a major goal of this project and is represented by the dashed feedback line in figure 9. This kind of feedback is often termed a "closed loop simulation" and provides a tight coupling between physical vehicle design and performance and the virtual vehicle realizations and interactivity [16, 17].

Architecture of the components

A more detailed treatment of both the SimSpace architecture and the Spring platform is depicted in the following two figures.

Figure 10: the SimSpace architecture in 2004

As figure 10 illustrates, SimSpace combines 3rd party Foundation elements (Brahms, SimStation, shared models and any other pluggable system) within a common Simulation layer (SimSpace, including the BrahmsVE interface, OWorld Agent Information Broker (OWIB) and OWorld engine) and drives a flexible Presentation layer (commercial or open source 3D engines), enabling a much larger range of Applications than would be possible with a monolithic system. Figure 11 illustrates the Spring platform architecture from the Stanford Biocomputation Center.
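The layering just described can be sketched in miniature. This is an illustrative structure only (the class names are hypothetical, not actual SimSpace code), showing how pluggable Foundation elements and interchangeable Presentation back ends meet in a common Simulation layer:

```python
# Illustrative sketch (not actual SimSpace code) of the layered, pluggable
# design: Foundation components plug into a common Simulation layer, which
# drives any of several interchangeable Presentation back ends.

class FoundationPlugin:            # e.g. Brahms agents, SimStation CAD data
    def tick(self, state): ...

class Presentation:                # e.g. a commercial or open-source 3D engine
    def render(self, state): ...

class SimulationLayer:
    def __init__(self, plugins, presentation):
        self.plugins = plugins
        self.presentation = presentation
        self.state = {}

    def step(self):
        for plugin in self.plugins:          # each Foundation element updates shared state
            plugin.tick(self.state)
        self.presentation.render(self.state) # hand the result to the display layer
```

Because the Simulation layer only sees the two small interfaces, a 3D engine (or a Foundation element) can be swapped without touching the rest of the system, which is the point of the non-monolithic design described above.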

Figure 11: the Spring platform from the Stanford Biocomputation Center

Additional capabilities to be implemented

The solicitation calls for a number of areas to be addressed by responses, and this project addresses several of them, including the following:

We will implement a multisecond communications delay mode in the virtual vehicle environment to heuristically aid in the development of predictive interfaces for real lunar telerobotics missions.
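A minimal sketch of such a delay mode follows, assuming a fixed one-way latency applied to a queue of operator commands. The class name and delay value are illustrative (Earth-Moon one-way light time is roughly 1.3 seconds; a practical value would also include ground processing delays):

```python
import collections

# Sketch of a multisecond-delay mode: operator commands are queued and only
# released to the (virtual) vehicle after a fixed one-way delay, mimicking
# Earth-Moon signal latency. Names and values are illustrative.

ONE_WAY_DELAY = 1.5  # seconds; assumed value

class DelayedCommandLink:
    def __init__(self, delay=ONE_WAY_DELAY):
        self.delay = delay
        self.queue = collections.deque()   # holds (release_time, command) pairs

    def send(self, command, now):
        """Operator issues a command at simulation time `now` (seconds)."""
        self.queue.append((now + self.delay, command))

    def receive(self, now):
        """Return the commands that have 'arrived' at the vehicle by time `now`."""
        arrived = []
        while self.queue and self.queue[0][0] <= now:
            arrived.append(self.queue.popleft()[1])
        return arrived
```

Running vehicle telemetry back through a second such link gives the full round trip, which is what a predictive display must compensate for.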

We will be employing 2D (through heads-up display list and dialogue UI modalities) and 3D (real time vehicle and scene rendering) multimodal displays. The Spring system version will enable us to create a view on a parabolic display.

Typically, low-level performance simulation of virtual vehicles generates a very large amount of data which needs to be post-processed later. Our back end support for the project will include a high performance SQL database to assemble the scene and record data output in a highly compressed format. Data from each test will be presented in real time through the interface; key features data mined on the fly will include actual kinematics data, modeled power needs of the vehicle and effective mass of regolith moved.

This data will be captured in the database and made available for more advanced processing via professional packages such as VisualDOC (Vanderplaats R&D), which allows the generation of a kinematic simulation to engineering CAD/CAM specifications.
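As a sketch of this back end idea, the following uses an in-memory SQLite database with an assumed schema (the column names and sample values are hypothetical): the mined key features are stored as plain columns for real-time queries, while the bulky per-frame kinematic state is compressed into a BLOB for later post-processing:

```python
import json
import sqlite3
import zlib

# Sketch of the telemetry back end (assumed schema, not from the proposal):
# mined features go into plain columns; the full kinematic frame is
# JSON-encoded and zlib-compressed into a BLOB.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE telemetry (
    t REAL,             -- simulation time (s)
    power_w REAL,       -- modeled vehicle power draw (W)
    mass_moved_kg REAL, -- cumulative regolith mass moved (kg)
    frame BLOB          -- compressed full kinematic state
)""")

def record(t, power_w, mass_moved_kg, kinematics):
    """Store one simulator frame: mined features plus the compressed raw state."""
    blob = zlib.compress(json.dumps(kinematics).encode())
    conn.execute("INSERT INTO telemetry VALUES (?, ?, ?, ?)",
                 (t, power_w, mass_moved_kg, blob))

record(0.1, 420.0, 3.2, {"bucket_angle": 0.45, "wheel_rpm": 6.0})

# Real-time query against the mined columns; decompress only when needed
row = conn.execute("SELECT power_w, frame FROM telemetry").fetchone()
full_frame = json.loads(zlib.decompress(row[1]))
```

The plain columns support on-the-fly queries during a test run, while the compressed frames can be exported wholesale to an external analysis package afterwards.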

The physics engine employed in the project is a high performance physical simulator able to create both the simulated physics of vehicle-to-regolith contact and the particle-system-like behavior of actual regolith being affected by the vehicle's actions in the simulator. If the need arises, the project may apply for use of the 128 processor SGI Origin 3800 shared memory supercomputer at the Biocomputation Center.
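One simple way to emulate regolith removal is shown below as a deformable heightfield rather than a full particle system. The grid size, cell size and density are assumed values for illustration, not CCACS data:

```python
# Deformable-heightfield sketch of a virtual simulant sandbox: the bucket
# lowers the cell it overlaps, and the removed volume is converted to mass so
# effective regolith moved can be reported. All constants are assumed values.

CELL = 0.05        # cell edge length (m)
DENSITY = 1500.0   # assumed simulant bulk density (kg/m^3)

def excavate(heights, i, j, dig_depth):
    """Lower cell (i, j) by up to dig_depth (m); return mass removed (kg)."""
    removed = min(dig_depth, heights[i][j])   # cannot dig below the box floor
    heights[i][j] -= removed
    return removed * CELL * CELL * DENSITY

# A 4x4 sandbox holding 0.2 m of simulant everywhere; take one 5 cm bite
sandbox = [[0.2] * 4 for _ in range(4)]
mass = excavate(sandbox, 1, 2, 0.05)
```

A heightfield is cheap enough to update per frame and per cell, while a particle system would be reserved for the material actually in motion around the bucket wheel.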

As the project will produce a 3D immersive display with 2D elements and involve a first stage implementation of a force feedback interface, we will be able to report on issues brought up by evaluating the situational awareness successes and shortcomings of the simulation. In addition, the human sensory augmentation can be compared with the operation and controls of the physical excavator prototype at the Colorado School of Mines.

Construction of virtual test fixture, test, measure, and report

 

DigitalSpace will implement the virtual environment to permit collaborative review by the domain experts listed in part 6.2 below. Issues to be examined include performance evaluation, scaling (between operator and robot), control (how stable the control is, and kinematic effects), mechanization (stiffness and torsion, control by master and slave), and kinematics (degrees of freedom used, constraints) [18].

 

Part 4 - Work Plan

 

The project will commence with extensive consultation with expert advisors at CCACS, Stanford, Caterpillar, Bechtel and several NASA centers, as well as other contractors and university collaborators. Thereafter the following staged implementation will be undertaken:

1. Produce a 3D virtual model of the existing CCACS prototype lunar analog bucket wheel excavator. Validate the virtual excavator with CCACS.

2. Create a virtual simulant "sandbox" in which to test the virtual excavator. Endow the virtual sandbox with basic physical properties. Validate the sandbox with CCACS.

3. Build a simple control interface and test operating the virtual excavator in the sandbox. Match the control interface to likely haptic force feedback controls.

4. Iterate the virtual prototype to match the properties observed by CCACS in their original testing of the physical excavator in its simulant sandbox.

5. Export the refined virtual model to the Spring system and work with the Stanford Biocomputation Center to create a first generation haptic force feedback version of the virtual excavator and simulant sandbox.

6. Test the haptic-enabled version in an immersive (parabolic) display interface and generate data/metrics to compare simulated/haptic and observed physical performance.

7. Produce reports and recommendations for Phase II.

Throughout the second and third portions of the project, all consulted teams will be able to view progress and the final simulation through releases of the SimSpace platform and project content. At the end of the project, a full report of the successes and limitations of the simulation will be delivered to all consulting groups and their input sought for possible Phase II project proposals, if Phase II is to be solicited.

 

4.1 Project Work Plan by Task

Table 2 below provides our projected allocation of hours by labor category for the major tasks in this Phase I work plan and corresponds to our proposed budget.

 

Table 2: Work Plan

 

TASK   DESCRIPTION                             PI    PM    SME   SE    CD    TG
1      Expert interviews                       20    20    5     0     0     0
2      Model 3D version of CCACS rover         5     20    0     10    80    0
3      Create virtual simulant sandbox         5     20    0     40    60    0
4      Build control interface                 5     20    0     40    20    0
5      Iterate simulator vs prototype rover    5     20    0     20    20    10
6      Refine simulator                        10    20    0     30    20    0
7      Export simulator content to Spring      10    20    0     40    40    0
8      Test and iterate Spring version         10    20    0     10    10    10
9      Share environments with advisors        10    50    15    10    0     0
10     Finalization of environments            10    20    0     20    10    5
11     Project report generation               10    35    0     0     0     0
Total  All tasks                               100   265   20    220   260   25

Total of all hours: 890, where the roles are defined as:

PI  = Principal Investigator
PM  = Program Manager
SME = Subject Matter Experts
SE  = Software Engineer
CD  = Content Developer
TG  = Test Group

4.2 Project Reference Website

The Project Reference Website will be a center for ongoing progress and resources surrounding the project, from the interview phase to the simulation evaluation.

 

4.3 Project Work Schedule

This section describes the work schedule for the Phase I effort (see Table 3 below). DigitalSpace work is to be coordinated from its corporate offices located near Santa Cruz, California. DigitalSpace development and testing teams are located at several places around the United States and internationally. This schedule assumes a six-month project duration.

 

Table 3: Work Schedule

 

Month and Phases                               1     2     3     4     5     6
Expert Interviews & architecture
Content: 3D rover, virtual sand box, control   +s    sd    d
Testing and iterative development                    d     dt    t     d     dt
Mid Point Project Review                                   s+
Port to Spring/Haptic version                                    d     d
Testing and iterative development                                td    td
Final testing & sharing with advisors                                        ts
Final Phase I Report                                                         *

Where:

s = Specification and/or Design & Documentation

d = Software and Content Development

t = Software Testing

+ = Status Report

* = Final Report

 

Part 5 - Related R/R&D

 

DigitalSpace has been engaged in the development of virtual world platforms for eight years and has successfully completed a number of major projects (see Part 8.1 below). The PI and members of DigitalSpace have contributed numerous publications to a variety of scientific and technical journals [19]. In addition to the work with USRA/RIACS and NASA described in Part 2 above, we have collaborated with numerous universities and companies.

5.1 NASA/RIACS Collaboration

The SimStation virtual vehicles project at NASA ARC and JSC will be joined by the Brahms project at ARC and JSC to provide basic technology support. Team leads from both of these efforts will be involved.

5.2 External Cooperation

Domain experts such as Mike Duke from the Colorado School of Mines, specialists from the Bechtel and Caterpillar companies, and Kevin Montgomery from Stanford University will be supporting this project. In addition, we will be cooperating with Dr. Don Brutzman and his group at the Naval Postgraduate School/MOVES Institute on the Extensible Modeling and Simulation Framework (XMSF), an XML-based cross platform framework for simulation.

 

5.3 Bibliography of Directly Related Work

 

[1] Office of Exploration Systems on the web at: http://exploration.nasa.gov

[2] President's Commission on Moon, Mars and beyond, report on the web at: http://www.moontomars.org/

[3] Boeing CREATE workshop on lunar base design/utilization, Houston, April 30-May 1, 2004. Results of DigitalSpace work on the web at: http://www.digitalspace.com/projects/lunarbase/index.html

[4] VAST project with Lockheed-Martin and MedOps/NASA JSC and Stanford Biocomputation, Summer 2004, on the web at: http://vast.stanford.edu

[5] BrahmsVE MDRS/FMARS projects by DigitalSpace on the web at: http://www.digitalspace.com/projects/fmars/

[6] Sierhuis, M. 2001. Modeling and Simulating Work Practice; Brahms: A multiagent modeling and simulation language for work system analysis and design. Ph.D. thesis, Social Science and Informatics (SWI), University of Amsterdam, SIKS Dissertation Series No. 2001-10, Amsterdam, The Netherlands, ISBN 90-6464-849-2.

[7] Sierhuis, M.; Bradshaw, J.M.; Acquisti, A.; Hoof, R.v.; Jeffers, R.; and Uszok, A. Human-Agent Teamwork and Adjustable Autonomy in Practice, in Proceedings of The 7th International Symposium on Artificial Intelligence,Robotics and Automation in Space (i-SAIRAS), Nara, Japan, 2003.

[8] Shirley, M., Cochrane, T., SimStation: A Knowledge-Integrating Virtual Vehicle, Virtual Iron Bird Workshop, NASA Ames Research Center, March 31, 2004. Available on the web at:

http://ic.arc.nasa.gov/vib/day1/papers/Shirley_Cochrane.pdf

[9] Damer, B. et al, Final Report, SBIR I: BrahmsVE: Platform for Design and Test of Large Scale Multi-Agent Human-Centric Mission Concepts, DigitalSpace Documents, July 2004. On the web at: http://www.digitalspace.com/projects/eva-sims/

 

[10] Muff, T., Johnson, L., King, R., and Duke, M.B., (2004), "A Prototype Bucket Wheel Excavator for the Moon, Mars and Phobos," Proceedings of the 2004 Space Technology and Applications International Forum (STAIF-2004), Albuquerque, New Mexico, February 8-11, 2004.

 

[11] Mars Exploration Rover Mars surface modeling projects on the Web at: http://www.driveonmars.com

[12] Hannaford, B., "Teleoperation," The Haptic Community Website, University of Washington, available at: http://haptic.mech.nwu.edu/HapticResearch.html

 

[13] Mazzella, F; Montgomery, K; Latombe, JC; "The Forcegrid: A Buffer Structure for Haptic Interaction with Virtual Elastic Objects", International Conference on Robotics and Automation (ICRA 2002), Washington DC, May 13-15, 2002.

[14] Montgomery, K; Bruyns, C; Brown, J; Thonier, G; Tellier, A; Latombe, JC; "Spring: A General Framework for Collaborative, Real-Time Surgical Simulation", Medicine Meets Virtual Reality (MMVR02), Newport Beach, CA, January 23-26, 2001.

[15] Montgomery, K; Stephanides, M; Schendel, S; Ross, M; "User Interface Paradigms for VR-based Surgical Planning: Lessons Learned Over a Decade of Research", Technical Report, Biocomputation Center, Stanford University.

[16] Dorais, G., Bonasso, R. P., Kortenkamp, D., Pell, B. & Schreckenghost, D. (1999). Adjustable autonomy for human-centered autonomous systems on Mars. Proceedings of the AAAI Spring Symposium on Agents with Adjustable Autonomy. AAAI Technical Report SS- 99-06. Menlo Park, CA: AAAI Press.

[17] Barney Pell, Edward B. Gamble, Erann Gat, Ron Keesing, James Kurien, William Millar, Christian Plaunt, Brian C. Williams: A Hybrid Procedural/Deductive Executive for Autonomous Spacecraft. Autonomous Agents and Multi-Agent Systems 2(1): 7-22 (1999)

[18] Rochlis, J., Clark, J.P. and Goza, M., "Space Station Telerobotics: Designing a Human-Robot Interface", AIAA Conference on Space Station Utilization, Kennedy Space Center, October 2001.

 

[19] DigitalSpace Home Page and Resources:

http://www.digitalspace.com and publications at http://www.digitalspace.com/papers

[20] Virtual Glovebox project described on web at: http://biovis.arc.nasa.gov/vislab/vgx.htm

Part 6: Key Personnel and Bibliography of Directly Related Work

 

6.1�� Management and technical staff members

The following brief resumes introduce management/technical staff members for the proposed project. DigitalSpace certifies that Bruce Damer, the Principal Investigator, has his primary employment at DigitalSpace at the time of award and during the conduct of the project.

 

Name: Bruce Damer (PI)

Years of Experience: 23

Position: CEO

Education: Bachelor of Science in Computer Science (University of Victoria, Canada, 1984); MSEE (University of Southern California, 1986)

Assignment: Mr. Damer will be the Principal Investigator for the SBIR Phase I effort. He will coordinate all interaction between DigitalSpace and its collaborators, NASA and other participants, and be responsible for all staffing, technical design, reporting and documentation. Mr. Damer will devote a minimum of 100 hours per month of his time to this project.

Experience: Mr. Damer is a recognized expert on avatars and shared online graphical virtual spaces, having created much of the early literature, conferences and awareness of the medium. He is a visiting scholar at the University of Washington Human Interface Technology Lab and a staff member of the San Francisco State Multimedia Studies Program.

 

Name: Stuart Gold

Years of Experience: 29

Position: Chief Architect (communities platform)

Education: Royal Institute of British Architects

Assignment: Stuart Gold will serve as Program Manager for the project, structure the technology components and architecture for the BrahmsVE platform, coordinate the 3D modeling teams, and provide database and real-time community tools infrastructure support on the project as well as the XML based interfaces with Brahms.

Experience: Mr. Gold is a pioneer of online systems, starting with his work on transaction processing for Prestel in the 1970s and continuing most recently with his leadership in the design and delivery of online virtual worlds including: TheU Virtual University Architecture Competition, International Health Insurance Virtual Headquarters, and the Avatars98-2001 online events. Mr. Gold is also the chief architect of the DigitalSpace communities platform, implementing XML and JS based community tools for use by all DigitalSpace projects.

Name: Galen Brandt (Program Marketing, Phases I and II)

Position: New business development, DigitalSpace

Experience: 26 years, including creating market strategies for Dun and Bradstreet, SUNY Fashion Institute of Technology, DoToLearn and others.

Assignment: Content and market development for Phases I and II. Mrs. Brandt has assisted in reviewing all aspects of proposals and documentation for the project, including the user guides, and has also assisted with team communications and with marketing concept and content development.

 

Name: Dave Rassmussen (TE)

Position: Member of the 3D Design Studio, DigitalSpace

Experience: 9 years of experience in virtual world design; skills: 3DS Max, Java, Active Worlds, Adobe Atmosphere, PHP/MySQL database development

Assignment: Directing the team performing 3D modeling, animation and testing

 

Name: Merryn Nielson (Lead CD)

Position: Member of the 3D Design Studio, DigitalSpace

Experience: 9 years of experience in virtual world design; skills: 3DS Max, Java, Active Worlds, Adobe Atmosphere

Assignment: Web design on the project, 3D worlds, avatar design, testing

Name: Peter Newman (SE)

Position: Developer in C++, JS, PHP, HTML; 3D Design Studio, DigitalSpace

Assignment: Programmer of OWorld engine extensions.

Name: Ryan Norkus (CD)

Position: Graphic artist, 3D modeler and animator, 3D Design Studio, DigitalSpace

Assignment: Focusing on the automation of animated sequences

6.2 NASA and Non-NASA Advisors (and Domain Experts)

 

In the spirit of the recent H&RT BAA and the new exploration initiative, DigitalSpace will formally involve experts from several organizations to enable this Phase I project:

Individuals at NASA Centers and universities:

Dr. William Clancey, Brahms Team, NASA ARC

Dr. Maarten Sierhuis, Brahms team, NASA ARC

Dr. Mark Shirley, SimStation team, NASA ARC

Tom Cochrane, SimStation team, NASA ARC/Raytheon

Dr. Mike Sims, NASA ARC

Dr. Geoff Briggs, Scientific Director, Center for Mars Exploration, NASA ARC

Dr. Tom Furness III, HIT Lab University of Washington

Dr. Daniel Thalmann, director, MIRALab, Geneva, Switzerland

Dr. Don Brutzman, Naval Postgraduate School, MOVES Institute

Cooperating Institutions

Colorado School of Mines (CCACS)

Mike Duke is recognized as one of the world's leading experts in lunar resource utilization. His team at the Center for Commercial Applications of Combustion in Space (CCACS), including Brad Blair, Gary Rodriguez and others, developed the prototype lunar bucket wheel excavator and continues to promote lunar teleoperation and space resource utilization. CCACS is currently engaged in a lunar construction study for NASA with Bechtel; this study will inform the project.

Colorado School of Mines Center for Automation, Robotics, and Distributed Intelligence (CARDI)

A second group at the Colorado School of Mines (CARDI), led by Bill Hoff, will advise this project and evaluate our work products in the following areas:

  1. 3D sensing and modeling of terrain
  2. Soil-manipulator dynamics
  3. Estimating dynamical properties of vehicles or manipulators, based on sensed data
  4. Immersive user interfaces such as CAVEs
  5. Building and integrating actual robot vehicles

Stanford University's National Center for Biocomputation

Since 1998, Stanford University's National Center for Biocomputation has been a leader in the growing field of telemedicine. Working closely with NASA ARC, the Biocomputation Center developed key technologies in multimodal sensory display interfaces, including the Virtual Glovebox [20], and has recently engaged in crew medical health and safety work for ISS crews. Kevin Montgomery, Technical Director of the Center, will provide key support to the project by bringing the virtual vehicle simulator into the Spring haptics-supported environment.

Part 7: Relationship with Phase II or other Future R/R&D

 

DigitalSpace has made a multi-year commitment to the development of the vision we share with the NASA, RIACS, Colorado School of Mines, and Stanford team members who have made this effort possible. Briefly stated, our joint mission is to create the world's most comprehensive, graphically realistic, collaborative work practice, mission planning and operations development environment. Successful completion of this proposed SBIR Phase I will set the stage for the application of a closed-loop, haptically enhanced virtual environment for telerobotics. We expect to engage multiple NASA and outside customers for this capability within a Phase II SBIR to qualify the platform for full Phase III commercialization. At the start of Phase III, DigitalSpace plans to finance its initial operation with customer revenues or venture capital; if no venture capital is obtained, the principals will self-finance the venture during Phase III.

 

Part 8: Company Information and Facilities

 

DigitalSpace Corporation was incorporated in the state of California on August 24, 1995. DigitalSpace is a company organized to innovate and commercialize in the multi-user virtual worlds and virtual communities market. The company's business has expanded to include virtual environments for universities, government, non-profits and other projects. DigitalSpace's headquarters is near Santa Cruz, California, where it leases office space in a two-story building at 221 Ancient Oaks Way, Boulder Creek, California 95006. Additional DigitalSpace US team members have satellite offices in Phoenix, Arizona and Seattle, Washington. See our web site at http://www.digitalspace.com for a portfolio of projects and clients.

 

Equipment used

All DigitalSpace team members have at least one personal computer connected to the Internet (most have IBM PCs, Pentium 3-4 class) with all the software needed for 3D modeling, website design and programming.

 

Part 9: Subcontracts and Consultants

 

DigitalSpace will employ a number of consultants and contractors on this project. Facsimiles of their signed letters of commitment and availability are included in Table 4 below (DM3D Studios personnel are described in Part 6, Key Personnel, above, as are our Domain Experts):

Table 4: Consultant and Contractor commitment letters

 

Part 10: Commercial Applications Potential

 

10.1 Potential NASA Commercial Applications

Surface operations on the Moon or Mars, teleoperated servicing robotics on long-duration space station and crew exploration vehicle flights, and a whole class of crew training and data-rich control environments demand a new generation of haptically enhanced virtual environments. Key NASA applications enabled by an advanced closed-loop virtual environment telerobotics simulator include:

  • Teleoperated surface operations on the Moon and Mars for exploration, site preparation and in-situ resource utilization (shielding, materials, fuel and water extraction).
  • Teleoperated servicing for ISS, orbiting scientific observatories (such as Hubble) and long-duration Crew Exploration Vehicles (CEV).
  • Teleoperated robotics for IVA (such as ISS/PSA).
  • Teleoperation for enhanced utility and safety at launch facilities.
  • Telerobotics training for all classes of remote manipulator systems.
  • Design and prototyping of many types of vehicles through the use of the simulator for simulation-based design and acquisition.

 

Workstation/Internet-based versions of this environment can be offered as an education and public outreach (EPO) module for schools and the home. A telerobotics experience for Space Camp, or school- or Internet-based teleoperation for young students and high school/college students, will encourage careers in space engineering.

10.2 Potential Non-NASA Commercial Applications

Any remotely controlled vehicle application in mining, construction, hazardous waste handling, military operations or other commercial domains requires high-fidelity virtual environments for developing viable interaction scenarios, training operators and supporting day-to-day production. This project is an important step toward making closed-loop, highly responsive telerobotic operations a commercial reality. The need for teleoperation in the terrestrial mining and construction industries alone could create a multi-billion-dollar annual business in a new generation of safer, more effectively managed machinery operations.

 

Online games � educational and entertainment applications

Robot "wars" are among the most popular forms of entertainment in the popular media, and robot game competitions are some of the finest learning events for K-12 and college engineering students and faculty. Massively multiplayer online games are attracting substantial investment and commercial interest. This work will provide a competent platform for sourcing a successful reality-based, robotics-focused multiplayer online game, both as a learning tool and as a pay-per-play tournament environment.

Defense design, training and operations applications

The military will be using semi- and fully autonomous agents working closely to support troops and command in surveillance and combat missions throughout the 21st century; we therefore expect a great deal of interest in a product in this space. We are already in contact with the Naval Postgraduate School MOVES Institute about cooperating on, and adopting, a new XML-based standard in simulation communications for virtual presence in the control of remote robotic vehicles.

Industrial design, training and operations applications

Applications range from factory-floor robotic automation to security systems and construction: complex environments where humans work in cooperation with mobile agents or other autonomous machine systems.

 

Emergency first responder teleoperations

Chemical, biological or nuclear hazards will all require first responders to consider using teleoperated machines to enter affected areas first. This work could directly inform this type of technology for occupational safety.

Part 11: Similar Proposals and Awards

 

DigitalSpace Corporation has received past support enabling the creation of the fundamental building blocks for the currently proposed work: a Phase I STTR in 2000 (Contract #NAS2-01019), SBIR Phase I/II awards in 2002 (Contracts NAS2-03134, NNA04AA17C), an SBIR award in 2003 (Contract NNA04AA32C), and support through RIACS. DigitalSpace is submitting no other proposals similar to this 2004 SBIR Phase I proposal.