


Virtual Reality Environments for

Integrated Systems Health Management of Rocket Engine Tests

George D. Lecakes Jr.1, Jonathan A. Morris2, John L. Schmalzel2, and

Shreekanth Mandayam1

1Department of Electrical and Computer Engineering, Rowan University, 201 Mullica Hill Road,

Glassboro, NJ 08028, USA

2NASA-Stennis Space Center, Stennis Space Center, MS 39529, USA

Abstract - Integrated Systems Health Management (ISHM) comprises the processes for managing erroneous conditions that a system may encounter during its operational life, either by designing out failures early or by defending against and mitigating failures that do occur. A successful ISHM implementation consists of four components: data sensors, computations, data sinks, and visualization modules. In this paper we explore the use of virtual reality (VR) platforms as candidates for developing ISHM visualization modules. VR allows a complete and spatially accurate 3D model of a system to be displayed in real time. It provides a medium for improved data assimilation and analysis through its core tenets of immersion, interaction and navigation. Furthermore, VR allows graphical, functional and measurement data to be integrated in the same platform, enabling the development of subsequent risk-analysis modules. The research objectives of this paper are to create a detailed visual model of a multi-sensor rocket engine test facility inside a VR platform, and to demonstrate the capability of the VR platform to integrate graphical, measurement and health data in an immersive, navigable and interactive manner. A human-based performance evaluation of the VR platform is also presented. These research objectives are addressed using an example of a multi-sensor portable rocket-engine test stand at NASA Stennis Space Center's E-3 test facility.

Index Terms - Virtual measurement systems, human-computer interface, automated test & diagnostic systems

INTRODUCTION

Complex structures, such as those used in monitoring and testing spacecraft components, require a multitude of sensors to determine any irregularities or malfunctions. Intelligent sensors have become critical components of complex test systems [1]. These sensors have the capability not only to provide raw data, but also to provide "information" by indicating the reliability of the measurements and their effect on system health at a component level. The effect of this added information is a voluminous increase in the total data gathered by a sensor system. If an operator is required to perceive the state of a complex system in a timely manner, novel methods must be developed for sifting through such enormous data sets. Virtual reality (VR) platforms are ideal candidates for performing this task: a virtual world allows the user to experience a complex system that is gathering a multitude of data, along with its underlying health parameters, in an immersive, interactive and navigable environment [2].

Previous work has shown that visualization makes it possible to measure and explore areas that are classically difficult or impossible to realize accurately [3]. Furthermore, studies have indicated that human perception can be enhanced by a 3-D representation of data, which can provide additional perspective, display meaningful patterns, improve grouping and augment the ability to form relationships between data sets [4]. Visualization techniques have already been employed to combine 3-D pipeline geographic information with spatially accurate models, creating a system that allows the user to review large datasets describing anomalies in the pipeline [5]. VR allows for the creation of entire virtual worlds that can be as expansive and detailed as the technology permits [6], and it has already been used extensively as a method for advanced scientific visualization [7]. Virtual reality provides the user with the psychological experience of being surrounded by a synthetic environment; through hardware, software, tracking and visual displays, the user is immersed in a virtual world [8, 9].

VR's applications are many and extend into areas as diverse as pilot training, ergonomic design, psychiatric studies and visual interaction with microscopic structures at scales humans could never otherwise experience [10]. VR has enabled novel methods for understanding the deformation of vehicles through visualization of finite-element analysis of a car's impact [11]. In geology, VR can be used to link details of individual rock samples over vast distances in an easily navigable virtual world capable of depicting entire rock outcroppings [12]. The benefits of training surgeons with VR tools have been demonstrated, showing the power of VR and visualization in providing hands-on experience [13]. In the realm of human response, VR has become a powerful tool for determining the causes of public-speaking fears: through the use of a virtual audience, parameters such as interest and body language were varied and the corresponding reactions of the speaker were noted [14].

The type of data to be visualized allows for various representations that can be used to convey meaning to the user. These different methods all try to achieve the same goal: to uncover some trend or nuance that provides a marketable benefit to the user [15]. Applying these ideas to medicine has enhanced the understanding of complex systems [16], and the hope is that similar advances can be made in developing a virtual measurement system that adds value to automated test and diagnostic systems.

The previous work described in the preceding paragraphs demonstrates that VR has firmly established itself as a tool providing significant value in data analysis and visualization applications. The research work described in this paper is motivated by this fact, and extends these methods into the novel area of rocket engine tests.

In order to create a virtual world representing a complex test and measurement system, we envisage incorporating three forms of data into the visualization [17]: graphical data, measurement data and functional data, as shown in Figure 1.


Fig. 1. Graphical, measurement and functional data are combined with user input to create a dynamic and interactive virtual environment.

Graphical data is composed of geometric information required for creating the various 3D objects that populate the world. These objects are usually created through the process of modeling and can represent objects such as buildings, furniture, structures, etc. Measurement data is the raw information from various sensors. Finally, functional data is composed of anything created through analysis such as system health information, simulations, and mathematical models.

The graphical data can be created through two methods: either automated capture of real objects in 3D using cameras and scanners, or manual modeling in a 3D software package. Graphical objects can be composed of polygons, non-uniform rational B-splines (NURBS) or subdivision surfaces; however, most data are converted to polygons because they are simpler to handle and nearly all applications support them. These objects can be coupled with materials, which dictate how the object interacts with light to create the illusion that it is composed of different forms of matter (plastics, metals, woods, glass, etc.).

Measurement and functional data can be visualized through a variety of methods depending on the type of data [18]. Scalar 2D information can use contour and height-field planes to indicate the change in value over an area. Datasets composed of 3D spatial information can use voxels (volume pixels) to display cubes that represent a value at a particular location. Various techniques, such as transparency and color changes, can be used to reduce occlusion (the blocking of elements) in the user's view. In addition, by thresholding values and removing elements that fall outside a range of interest, it is possible to display only information that the user deems relevant; an example would be setting a range of density values to isolate a specific structure in the human body, be it bone or tissue. Vector data can be visualized through cone or arrow plots, which orient themselves in the direction of the data. Other visualizations, such as stream-surfaces and stream-lines, capture the change in values over an area through a 3D surface and are very useful in fluid analysis.
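
As an illustration of the thresholding idea, the following minimal Python sketch (not from the original implementation; the array and density limits are synthetic placeholders) masks a 3D scalar field so that only voxels inside a user-chosen range remain visible.

```python
# Illustrative sketch: isolate a value range in a 3-D scalar field by
# thresholding, so only voxels the user deems relevant remain visible.
import numpy as np

def threshold_voxels(volume, lo, hi):
    """Return a boolean mask of voxels whose values fall inside [lo, hi]."""
    return (volume >= lo) & (volume <= hi)

# Example: a synthetic 64^3 scalar field; keep only a hypothetical range.
volume = np.random.uniform(0.0, 1.0, size=(64, 64, 64))
mask = threshold_voxels(volume, 0.7, 0.9)
visible_voxels = np.argwhere(mask)   # (x, y, z) indices to render as cubes
print(len(visible_voxels), "voxels pass the threshold")
```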

In this paper, VR is deployed inside an Integrated Systems Health Management (ISHM) environment for monitoring the performance of a rocket engine test stand, specifically the portable engine test stand at NASA-Stennis Space Center’s E-3 test facility. Typically, sensor measurement data from rocket engine tests are recorded and stored as spreadsheets. We demonstrate the potential of VR for analyzing sensor data meaningfully in an integrated, interactive and immersive manner, either synchronously or asynchronously.

Objectives

The overall goal of this research project is to demonstrate the advantage of using VR visualizations for integrating multiple sensor measurements inside ISHM environments. In this paper, our research objectives are focused on these specific tasks:

1. To create a detailed visual model of a multi-sensor rocket engine test facility inside a VR platform;

2. To demonstrate the capability of the VR platform to integrate graphical, measurement and health data in an immersive, navigable and interactive manner.

Approach

The overall approach in meeting the two specific objectives for a VR implementation of an ISHM environment of a rocket engine test stand is shown in Figure 2 [19]. A detailed description of each of the stages follows.


Fig. 2. The generalized approach for creating a VR environment for sensor-data integration.

1 Creating the Virtual Environment

In VR, a world is a synthetic environment made up of 3D models. These models provide visual representations of real-world objects [20]. The level of detail of these objects is directly tied to the immersive sense of a scene; if handled poorly, it can result in an immediate feeling of separation from the environment and a loss of presence [14, 21]. A visually pleasing and accurate environment will make users feel as if they exist inside the world.

2 Integrating Sensor Measurements

The VR environment is required to complement the existing ISHM framework depicted in Figure 3.


Fig. 3. The current ISHM flow chart.

Sensors relay information to a data acquisition system, from which it is transmitted over UDP to virtual intelligent sensor software that performs basic health analysis [22]. This information is then sent to the ISHM model, where detailed system health and root-cause analysis is performed. This information is stored in a database and can then be accessed by the virtual reality system. In addition, several network pathways are available for the rendering software to communicate directly with the ISHM model to acquire system health information.
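
As a rough illustration of the UDP hand-off in this pipeline, the sketch below assumes a hypothetical plain-text packet format ("sensor_id,timestamp,value") and a placeholder port; the actual DAQ and ISHM packet layouts are not described in this paper.

```python
# Hedged sketch of receiving sensor readings over UDP and caching the latest
# value per sensor for the renderer to poll. Packet format is an assumption.
import socket

HOST, PORT = "0.0.0.0", 5005   # hypothetical listening address and port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((HOST, PORT))

latest = {}   # sensor_id -> (timestamp, value), read by the visualization loop

while True:
    packet, _addr = sock.recvfrom(1024)
    sensor_id, timestamp, value = packet.decode().split(",")
    latest[sensor_id] = (float(timestamp), float(value))
```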

The three data types in the VR environment are now described specifically with respect to the rocket test stand application that is the focus of this paper. Graphical data consists of the 3D model representation of the Methane Thruster Testbed Program (MTTP) trailer. Measurement data is the raw data obtained from a sensor [20]; this consists of valve position information, temperature, pressure, velocity, etc. Functional components are composed of data that has been transformed into meaningful information or data obtained through analysis [16]. For example, system health information would be considered functional data: information about the state of a subsystem would be analyzed to determine the system's level of operation. Raw data alone does not always yield insightful information; through the use of various algorithms, it is possible to extract important features that a program can use to implement decision making.

3 Creating Multiple, User-Specifiable Visualizations

One of the key components of virtual reality is the ability to immerse a user. It has already been shown that when displaying three-dimensional data, VR increases comprehension [23, 24]. In order to attain this immersion, any incoming data must be represented in a way that is quickly and easily understood. Information should be visualized in a manner so that the navigator is not bombarded with confusing or cluttered imagery.

By using attributes such as color, size and shape, intuitive visualization methods can be employed that assist in the understanding of data; this is referred to as an iconic display technique [25]. By changing these parameters in accordance with incoming data, these visualizations can provide an intuitive understanding of the information. By providing multiple preset visualizations, users have the power to change how they view the information based on individual preference. This makes for a virtual experience tailored to the specific user based on their visualization preferences. This ability to prefer particular visualizations and remove information from view is referred to as interactive filtering [25].
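
To make the interactive-filtering idea concrete, the following hedged Python sketch (the preset names and sensor labels are illustrative, not the authors' code) maps sensor types to user-selectable visualization styles and hides readings outside a user-chosen range.

```python
# Illustrative sketch of visualization presets plus interactive filtering.
PRESETS = {
    "icons":  {"temperature": "snowflake/droplet/flame", "pressure": "sphere"},
    "meters": {"temperature": "thermometer",             "pressure": "gauge"},
}

def visible_readings(readings, lo, hi):
    """Keep only readings inside [lo, hi]; the rest are filtered from view."""
    return {name: v for name, v in readings.items() if lo <= v <= hi}

style = PRESETS["icons"]["temperature"]           # user-selected preset
shown = visible_readings({"T1": 21.5, "T2": 480.0}, 0.0, 100.0)
print(style, shown)                               # only T1 remains visible
```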

Results

The following virtual world was created to implement the described objectives. A Fakespace ImmersaDesk R2 was used as the projection system. Autodesk Maya 8.5 and Autodesk 3ds Max 8 were both used to create the 3D geometry. WorldViz Vizard R3 rendered the virtual environment.

A. Virtual Environment

To create a virtual environment, three sets of information must be combined. Figure 4 outlines the data and tasks required for each data type to create the VR environment. The first task is gathering the necessary data to create the graphical assets.

For graphical data, AutoCAD drawings of a custom rocket test stand were provided by Stennis Space Center in Mississippi, USA. These were flat two-dimensional drawings of the original plans for various pipelines and general structural dimensions. However, the existing structure had been modified and various components exchanged, so these drawings could not be used. Instead, a more traditional method of modeling the trailer from reference images was employed. With over fifty digital photographs of the test stand taken from various angles and positions, enough spatial information was captured to model the structure manually.

The trailer model was created in Autodesk Maya 8.5. The model was then broken into component parts and exported as object (obj) files to Autodesk 3ds Max. This program was used because its export feature produces models fully compatible with the rendering program. The models were converted into .osg files; in this format, the geometric and texture detail could be easily imported into the WorldViz Vizard rendering software. The final model is shown in Figure 5.
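
A minimal sketch of the final loading step is given below, assuming the Vizard 3 Python API (viz.go, viz.add) and a hypothetical file name for the exported trailer geometry; it is not the authors' actual scene script.

```python
# Hedged sketch: load the converted .osg geometry into a Vizard scene.
import viz

viz.go()                               # start the Vizard renderer
trailer = viz.add('mttp_trailer.osg')  # hypothetical file name for the model
trailer.setPosition(0, 0, 0)           # place the trailer at the world origin
```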

This 3-D virtual representation of the test stand is important because it allows for a complete spatial representation of the entire assembly of sensors and pipeways. A spatially accurate model in three dimensions can provide insight into anomalies that would be lost in a two-dimensional setting. An example would be a leak above two temperature sensors positioned vertically over one another; this spatial relationship would be lost in a 2D display, and only an individual with intimate knowledge of the structure could point it out. With the spatially accurate 3D model, however, any operator would have the spatial information and the potential to identify the cause of the anomaly.

B. Sensor Data Integration

Data from test runs on the E-3 portable engine test stand were provided by Stennis Space Center in the form of twenty-one separate trials. The data were provided as Microsoft Excel spreadsheets containing over thirty individual sensor recordings, each with timestamp information, for a thirty-second trial run of the engine. The data were converted to comma-separated-value (CSV) format and loaded into memory through Vizard's built-in Python scripting environment, where they were stored internally as a two-dimensional array. Each column corresponded to a different sensor and each row to a reading at a particular instant in time. The data then had to be registered, or tagged, to a particular visualization; the visualizations for each valve were registered to their particular columns. The data array was then accessed by each object to determine the current reading as time progressed in the simulation. Since each sensor reading had a corresponding time stamp, the ability to stop, start and scroll through time can be enabled in future versions. This would provide the operator with the ability to replay important events as well as slow down the simulation speed during complex moments.
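
The data-handling step can be sketched as follows, assuming a hypothetical CSV layout with a header row and one column per sensor; the actual E-3 file layout and column names are not reproduced here.

```python
# Hedged sketch: load one trial into a 2-D array and register columns to sensors.
import csv

def load_trial(path):
    """Load one trial into (column_names, rows-as-floats)."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        rows = [[float(x) for x in row] for row in reader]
    return header, rows

header, rows = load_trial("trial_01.csv")          # hypothetical file name
column = {name: i for i, name in enumerate(header)}  # sensor name -> column index

def reading(rows, column, sensor, t_index):
    """Value of `sensor` at row `t_index` (one row per time stamp)."""
    return rows[t_index][column[sensor]]

# Each visualization is "registered" to its column and polls it as time advances.
print(reading(rows, column, header[1], 0))
```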

The data provided only contained the states of valves and the readings of sensors; no root-cause analysis was contained within the documents. That analysis is managed by the G2 system currently employed by NASA for root-cause identification. Currently, integration of the root-cause analysis has resulted in a UDP network interface from G2 to an intermediary program (LabVIEW) and finally to Vizard. This early implementation has made it possible for strings of information to be sent to Vizard to designate failure pathways as well as the root cause of system faults. Future work will integrate these two programs more tightly so that pertinent data can be requested based on the user's requirements. This would allow Vizard to access the full suite of root-cause analysis algorithms inside G2 to provide the state of the system and any health problems that may arise.

C. Visual Representation

While there were over thirty features contained within the provided sensor information, nearly all could be classified as valve state, valve command, valve feedback, pressure, temperature, strain or current. The data were either ranges of values, percentages, or binary on/off states.

In order to convey meaning, a visual system was designed that immediately indicated what was occurring at each valve based on incoming sensor data. Color, object size, and mental relationships were utilized in the visualizations. Figure 6 shows three of the visualizations at the front of the MTTP trailer. The standard relationship of red meaning closed and green meaning open was implemented in conveying the state of each valve. At each valve, two diamonds labeled state and command would change color from green to red. This indicated whether the valve was commanded closed (red) or open and if the valve itself was obeying the command.

If the valve was in a transition period between open and closed, the diamond for valve state would turn yellow. It would be simple for an operator to notice state inconsistencies because contrasting green and red colors would be displayed. Below the select few valves outfitted with feedback sensors were three transparent bars. These bars would begin to enlarge and pulse yellow as the feedback percentage, detected by an intelligent sensor, increased; as feedback increased, so would the size of the three bars. In one scenario, the command and state were in disagreement, yet the simulation ran perfectly; on closer inspection, the feedback on two sensors would drastically increase during brief intervals. From the feedback visualization an operator noticed that the valve sensors were faulty and needed replacement. Coupling these visualizations together allows an operator not only to see the data but also to understand and remediate anomalous conditions.
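
The valve color logic described above can be summarized by the following illustrative sketch (the state names are assumptions, not the authors' code); it yields contrasting colors whenever command and state disagree.

```python
# Hedged sketch of the valve command/state diamonds: green/red for open/closed,
# yellow while the valve is in transit, and a mismatch flag for the operator.
GREEN, RED, YELLOW = "green", "red", "yellow"

def valve_colors(commanded_open, state):
    """state is 'open', 'closed' or 'transition' (illustrative labels)."""
    command_color = GREEN if commanded_open else RED
    if state == "transition":
        state_color = YELLOW
    else:
        state_color = GREEN if state == "open" else RED
    mismatch = state != "transition" and (state == "open") != commanded_open
    return command_color, state_color, mismatch

print(valve_colors(True, "closed"))   # -> ('green', 'red', True): contrasting colors
```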

For temperature data, two visualization methods were made available to the user. The first option was a thermometer, which featured an arrow that moved from top to bottom on a rectangle depending on the current temperature. The value was mapped to a relative range using the lowest and highest values encountered for that particular sensor; this was done because various locations on the trailer experience different temperatures, so no single fixed scale would suffice. The thermometer was colored with a gradient from blue to red to indicate cold or hot. A more general approach of splitting the temperature range into cold, mild and hot was also implemented; in each of these ranges, the visualization would change from a snowflake to a water droplet and finally a flame. This visualization was created for when a user wants to assimilate data quickly without reading the height of an arrow against a color gradient, and was especially useful when monitoring several visualizations simultaneously while zoomed out to view the entire trailer.
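
A minimal sketch of the relative-range mapping and the icon variant is shown below; the per-sensor minimum/maximum values and the one-third/two-thirds band boundaries are assumptions made for illustration, not values taken from the paper.

```python
# Hedged sketch: normalize a reading against the extremes seen so far for that
# sensor, then pick a coarse cold/mild/hot icon for at-a-glance monitoring.
def relative_level(value, seen_min, seen_max):
    """Map a reading onto [0, 1] using that sensor's observed extremes."""
    if seen_max == seen_min:
        return 0.0
    return (value - seen_min) / (seen_max - seen_min)

def temperature_icon(level):
    """Coarse icon used when zoomed out over many visualizations."""
    if level < 1.0 / 3.0:
        return "snowflake"
    if level < 2.0 / 3.0:
        return "water droplet"
    return "flame"

level = relative_level(310.0, 280.0, 400.0)
print(level, temperature_icon(level))   # -> 0.25 snowflake
```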


Fig. 5. The virtual representation of the MTTP trailer.

For pressure, a gauge was created with a gradient texture from green to red placed on a geometric plane. A needle rotated around the gauge, indicating low pressure over the green zone and high pressure over the red. Another visualization approach available to the user was a sphere within a sphere, which can be seen in Figure 7. A large orange translucent outer sphere represented the maximum pressure at a valve, and a blue sphere within it represented the current pressure. The blue sphere would increase in size and, at maximum pressure, would fill the orange sphere. This method of visualization was primarily used to understand the general pressure over an entire area at once, rather than inspecting gauges at particular points.
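
The sphere-within-a-sphere cue reduces to a simple ratio, as in the sketch below; the clamping and default outer radius are illustrative choices rather than details from the paper.

```python
# Hedged sketch: scale the inner (current-pressure) sphere so it fills the
# translucent outer (maximum-pressure) sphere at full pressure.
def inner_sphere_scale(pressure, max_pressure, outer_radius=1.0):
    """Radius of the inner sphere, clamped to the outer radius."""
    ratio = max(0.0, min(1.0, pressure / max_pressure))
    return outer_radius * ratio

print(inner_sphere_scale(350.0, 500.0))   # -> 0.7 of the outer radius
```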

Strain was displayed through a gradient model similar to the thermometer example with a color range from green to red and an arrow moving vertically over the gradient. Strain was also visualized by three bars that would progressively grow in separation as the strain increased.

Electric current was implemented as a lightning bolt that would pulse when flow was occurring. The size of the bolt indicated the relative amount of current flowing through the area. No other visualization was needed for current since this was only used in this model to show when the ignition switch was being activated to ignite the trailer.


Fig. 6. Current, temperature, pressure and strain visualizations and numerical values at the front of the MTTP rocket test trailer.

An overhead map of the model was located in the upper left with a red dot signifying the location of the user. This made it easy for inexperienced users to determine their location in the virtual scene. This will also be important for future work where test stands will span thousands of meters in size. An operator could easily get lost in such an immense virtual world, making a map a very useful feature. A series of buttons on the left of the screen allows for quick selection of varying visualization methods. The operator can also turn off the numeric values shown below the visualizations with a series of buttons on the left interface. On the bottom, navigation buttons provide the user with a quick way to move to a top, front, rear, left or right perspective. Finally, on the right a series of three circles indicate the shutdown state that the MTTP test trailer is currently in. The three states were advance to shutdown, normal shutdown or emergency shutdown. Each of these GUI elements can be seen within Figure 2.
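
The overhead-map marker amounts to a linear mapping from the user's world-space position to map pixel coordinates, as in the following sketch; the world bounds and map size are hypothetical placeholders.

```python
# Hedged sketch: place the red "you are here" dot on the overhead map by
# mapping the user's horizontal (x, z) position into map pixel coordinates.
def map_marker(x, z, world_min=(-10.0, -10.0), world_max=(10.0, 10.0),
               map_size=(128, 128)):
    """Return pixel coordinates of the user marker on the overhead map."""
    u = (x - world_min[0]) / (world_max[0] - world_min[0])
    v = (z - world_min[1]) / (world_max[1] - world_min[1])
    return int(u * map_size[0]), int(v * map_size[1])

print(map_marker(2.5, -5.0))   # -> (80, 32)
```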


Fig. 7. View of pressure, sensor and temperature visualizations across the MTTP VR model.

Several of the visualization methods described in this paper would need to be adapted to the specific situation in order to yield the desired effect if the test system were scaled up. Were this setup to include several thousand sensors, it would be difficult for an operator to visually understand the readings of thousands of thermocouples. The meter visualization is most beneficial when the user is zooming in on only a select few objects. When many sensor points are present, the use of single icons and color can help prevent confusion and information overload. The visualization icons (e.g., snowflake, water and fire) each depict a different symbol with its own distinctive color (e.g., white, blue and red). The same effect is evident in the valve-state visualizations, where green and red are employed to indicate open and closed states both for the actual state of the valve and for how it is commanded. The user only has to look for contrasting red against green to know there is a possible problem with a valve, greatly simplifying the task of searching for anomalous readings.

Scaling the system upwards also has an effect on the required computational burden. As the number of sensors increases so will the required time to update each object with new information. Methods would have to be developed to only process user-specified sections of the test data. Also, one could eliminate visualizations of particular components or sections of a test stand, or merge several components into a single visualization. By reducing many components of a similar type or common area into a single visualization, the number of updates and checks can be reduced and the system performance enhanced.
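
One possible form of the merging strategy is sketched below (the grouping keys and summary statistics are illustrative assumptions): sensors sharing a common area are collapsed into a single summary value, so only one visualization needs updating per group.

```python
# Hedged sketch: merge many sensors of a common area into one summary
# visualization to reduce per-frame updates as the system scales up.
from collections import defaultdict

def merge_by_area(readings):
    """readings: list of (area, sensor_id, value) -> {area: (mean, max)}."""
    groups = defaultdict(list)
    for area, _sensor_id, value in readings:
        groups[area].append(value)
    return {area: (sum(v) / len(v), max(v)) for area, v in groups.items()}

summary = merge_by_area([("front", "T1", 21.0), ("front", "T2", 25.0),
                         ("rear",  "T3", 19.0)])
print(summary)   # one visualization update per area instead of one per sensor
```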

D. Performance Evaluation

Inasmuch as an extensive body of previous research attests to the effectiveness of VR in improving users' ability to interpret data from complex systems, a human-factors study deploying the visualization platform in the field should form part of the overall research project. Although this has not been a major focus of this research work, we have conducted a preliminary study to obtain a human-based performance evaluation.

A questionnaire was presented to test-engineers at NASA-Stennis Space Center, requiring a comparison of the procedures that they currently employ for performing ISHM of rocket engine tests, with the VR platform that we have designed. The survey requested responses to two sets of questions – one for the users and another for the developers of the VR environment. The survey respondents were asked to rate specific categories on a scale of 1 (poor) to 5 (excellent). The user questionnaire focused on the user’s navigation, interaction and immersion experiences. The developer section of the questionnaire concentrated on the versatility and memory requirements of the software. The respondents were asked to rate not only the features corresponding to the performance of the visualization environment, but also the importance of the specific feature. In addition, the respondents were asked to identify the methods they currently employ for performing ISHM.

Table 1 summarizes the average responses of the users and developers, comparing the VR environment presented in this paper with the procedures currently used at NASA-Stennis Space Center. The responses indicate that the features provided by a VR environment, namely navigation, interaction and immersion, are rated as having significant importance, with the most important features being those corresponding to interaction. The VR environment clearly outperforms the current procedure in the navigation and immersion categories, whereas in the interaction category the performance of the VR environment is equal to or greater than the current process. From the developers' point of view, the VR environment clearly exceeds the performance of the current procedure in development time and flexibility, and is comparable in compatibility, memory requirements and execution time.

These human-based performance evaluation results give us confidence that the VR platform provides value in integrating graphical, measurement and health data in an immersive, navigable and interactive manner.

Conclusion

In this paper the use of virtual reality as a visualization interface for ISHM has been demonstrated. Data were visualized through several visualization methods, allowing users to select their own preferences. Depending on the requirements, the visualizations can be changed to provide information either quickly and generally or more precisely and exactly. The interface also provided the user with several options for navigating the environment, as well as a map to easily determine relative location. Additional visualizations have been developed to allow future 3D scalar fields of simulation data to be overlaid on the graphic model. VR as a visualization interface allows for the following benefits in ISHM:

1. The VR environment allows for visualization of multiple sensor data sets that might only have been previously viewed as two-dimensional spreadsheets, allowing for more complete and easier understanding of what is occurring.

2. VR enables a visualization method that includes the spatial relationship of the 3D models, a feature that can be lost with other two-dimensional approaches.

Future implementations include integrating streaming data from a server. Currently, the demonstration loads the data from a precompiled CSV data file at the start of the simulation. By connecting the virtual world directly to a server, information can be drawn on demand. In addition, data analysis can be performed outside of the VR world, freeing up system resources; more complex root-cause analysis can be performed on a separate computer and the results channeled to the virtual world visualizations.
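
A hedged sketch of such on-demand retrieval is given below; the TCP transport, JSON request/response format, host and port are all assumptions for illustration, not the authors' protocol.

```python
# Hedged sketch: the renderer requests only the readings it currently needs
# from a hypothetical data server instead of preloading a CSV file.
import json
import socket

def fetch_readings(sensors, host="127.0.0.1", port=6000):
    """Request the latest values for the named sensors from the data server."""
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(json.dumps({"sensors": sensors}).encode() + b"\n")
        reply = conn.makefile().readline()
    return json.loads(reply)   # e.g. {"PT-101": 342.7, "TC-014": 288.1}

# Only the sensors currently visualized are requested, keeping the load small.
# readings = fetch_readings(["PT-101", "TC-014"])
```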

The primary focus in implementing VR as a visual interface for ISHM systems is to provide the best graphical medium to the user. The basic three tenets of VR: immersion, navigation and interaction, allow for a visually detailed and accurate environment. We anticipate continuing this research work and conducting more field tests with operators to obtain more comprehensive measures of the evaluation of system performance. Furthermore, as part of future work, a variety of test-stands will be modeled and the generalization ability of the visualization methods described in this paper will be investigated.

Acknowledgment

This work was supported in part by NASA John C. Stennis Space Center under Order No. NNS06AC08P. The authors thank the following Rowan University students for their assistance in creating the VR environment: Steven Latman, Kevin Garrison and KJ Brown.

References

[1] J. Schmalzel, F. Figueroa, J. Morris, S. Mandayam, and R. Polikar, "An architecture for intelligent systems based on smart sensors," IEEE Transactions on Instrumentation and Measurement, Vol. 54, No. 4, pp. 1612-1616, August 2005.

[2] L.G. Shapiro and G.C. Stockman, Computer Vision, Prentice Hall, 2001.

[3] D.M. Germans, H.J.W. Spoelder, L. Renambot, H.E. Bal, S. van Daatselaar, and P. van der Stelt, "Measuring in Virtual Reality: A Case Study in Dentistry," IEEE Transactions on Instrumentation and Measurement, Vol. 57, No. 6, pp. 1117-1183, June 2008.

[4] A. Saddik, A.S.M.M. Rahman, and M.A. Hossain, "Suitability of searching and representing multimedia learning resources in a 3-D virtual gaming environment," IEEE Transactions on Instrumentation and Measurement, Vol. 57, No. 9, pp. 1830-1839, September 2008.

[5] S.O. Koo, H.D. Kwon, C.G. Yoon, W.S. Seo, and S.K. Jung, "Visualization for a Multi-Sensor Data Analysis," International Conference on Computer Graphics, Imaging and Visualization, pp. 57-63, July 2006.

[6] T. Hong, A Virtual Environment for Displaying Gas Transmission Pipeline NDE Data, M.S. Thesis, Iowa State University, 1997.

[7] A. van Dam, D.H. Laidlaw, and R.M. Simpson, "Experiments in immersive virtual reality for scientific visualization," Computers & Graphics, Vol. 26, pp. 535-555, 2002.

[8] C. Ware, Information Visualization: Perception for Design, 2nd ed., San Francisco: Morgan Kaufmann, 2004.

[9] A. van Dam, A. Forsberg, D. Laidlaw, J. LaViola, and R. Simpson, "Immersive VR for scientific visualization: a progress report," IEEE Computer Graphics and Applications, November-December 2000.

[10] F.P. Brooks, "What's real about virtual reality?," IEEE Computer Graphics and Applications, Vol. 19, No. 6, pp. 16-27, November/December 1999.

[11] M. Schulz, T. Reuding, and T. Ertl, "Analyzing engineering simulations in a virtual environment," IEEE Computer Graphics and Applications, Vol. 18, No. 6, pp. 46-52, November/December 1998.

[12] J.B. Thurmond, P.A. Drzewiecki, and X. Xu, "Building simple multiscale visualizations of outcrop geology using virtual reality modeling language (VRML)," Computers & Geosciences, Vol. 31, No. 7, pp. 913-919, March 2005.

[13] B. Reitinger, A. Bornik, R. Beichel, and D. Schmalstieg, "Liver surgery planning using virtual reality," IEEE Computer Graphics and Applications, Vol. 26, No. 6, pp. 36-47, November/December 2006.

[14] M. Slater, D. Pertaub, and A. Steed, "Projects in VR: Public Speaking in Virtual Reality: Facing an audience of avatars," IEEE Computer Graphics and Applications, March/April 1999.

[15] S. Eick, "Visual discovery and analysis," IEEE Transactions on Visualization and Computer Graphics, Vol. 6, No. 1, pp. 44-58, January 2000.

[16] S. Papson, An Investigation of Multi-Dimensional Evolutionary Algorithms for Virtual Reality Scenario Development, M.S. Thesis, Rowan University, 2004.

[17] S. Papson, J.A. Oagaro, M.T. Kim, R. Polikar, J.C. Chen, J.L. Schmalzel, and S. Mandayam, "A virtual reality environment for multi-sensor data integration," Proceedings of the ISA/IEEE Sensors for Industry Conference, New Orleans, LA, January 2004.

[18] H. Wright, Introduction to Scientific Visualization, Springer Science+Business Media, 2007.

[19] G. Lecakes Jr., J.A. Morris, J.L. Schmalzel, and S. Mandayam, "Virtual reality platforms for integrated systems health management in a portable rocket engine test stand," IEEE International Instrumentation and Measurement Technology Conference, Victoria, Vancouver Island, British Columbia, Canada, May 2008.

[20] M.J. Pierce, Scientific Visualization of Gas Transmission Pipeline NDE Data, M.S. Thesis, Iowa State University, 1999.

[21] M.J. Schuemie, P. van der Straaten, M. Krijn, and C.A.P.G. van der Mast, "Research on presence in virtual reality: a survey," CyberPsychology & Behavior, Vol. 4, No. 2, pp. 183-201, 2001.

[22] J. Schmalzel, F. Figueroa, and S. Mandayam, "The role of intelligent sensors in integrated systems health management," IEEE International Instrumentation and Measurement Technology Conference, Victoria, Vancouver Island, British Columbia, Canada, May 2008.

[23] N.S. Lee, J.H. Park, and K.S. Park, "Reality and human performance in a virtual world," International Journal of Industrial Ergonomics, Vol. 18, pp. 187-191, 1996.

[24] C. Ware and G. Franck, "Evaluating stereo and motion cues for visualizing information nets in three dimensions," ACM Transactions on Graphics, Vol. 15, No. 2, pp. 121-140, April 1996.

[25] D.A. Keim, "Information Visualization and Visual Data Mining," IEEE Transactions on Visualization and Computer Graphics, Vol. 8, No. 1, pp. 1-8, January-March 2002.


Fig. 4. The data and tasks required for each data type to create the VR environment.

Table 1: Human-based performance evaluation questionnaire.

| Category | Question | Importance (avg.) | Current Procedure (avg.) | ISHM Visualization (avg.) |
| NAVIGATION | Ease of navigation to a specific time/location/sensor reading | 4.5 | 2.5 | 4.0 |
| | Multiple methods of navigation | 3.0 | 3.0 | 3.5 |
| INTERACTION | Ease of selecting the source of sensor readings, components and health | 5.0 | 3.5 | 4.0 |
| | Ease of querying data from sensor readings, components and health | 5.0 | 4.0 | 4.0 |
| | Ease of identifying the source of sensor readings, components and health | 5.0 | 4.5 | 4.5 |
| IMMERSION | Ability to view sensor readings, components, health | 4.0 | 3.5 | 4.0 |
| | Model representation faithful to reality | 3.5 | 2.5 | 4.0 |
| | Identifying the location/time of an anomalous/benign condition | 5.0 | 4.0 | 4.5 |
| VERSATILITY | Development time | 2.5 | 1.5 | 3.5 |
| | Compatible file formats | 3.5 | 3.0 | 3.0 |
| | Flexibility to develop models | 4.0 | 2.5 | 4.5 |
| MEMORY | System memory requirements | 3.5 | 3.0 | 3.5 |
| | Execution time (software package) | 4.0 | 3.0 | 3.5 |
