Touch control comes to VR in 3D interpretation

May 1, 2000
There has been a growing disconnect between the computing needs of geoscientists and interpreters, and the capabilities of their computer tools. Subsurface data is inherently three-dimensional (3D) and growing more complex all the time. Yet the HCI (human-computer interface) typically used for interpreting this data - mouse, keyboard and video display - is overwhelmingly two-dimensional (2D) and less than intuitive.

There are, of course, many software tools for performing interpretation. While most of these offer some 3D viewing of data, they are primarily 2D applications. During most of the work of data segmentation and modification, the user is constrained to working in 2D, relying on 2D software techniques that use a standard mouse to select and edit data entities.

This results in an extremely time-consuming (and often boring) process of constantly switching modes - from 2D for interaction to 3D for visualization and confirmation. A significant percentage of an interpreter's time can be spent just switching back and forth, rather than in actual interpretation.

Haptic interaction

The first challenge was to create and control a 3D cursor. We chose the Phantom™ haptic interface from SensAble Technologies. This tool allows users to touch and manipulate virtual objects in the computer, with the touch properties of those objects passed through the Phantom to the user. The tool has two key capabilities important to 3D interpretation - absolute control of 3D cursor motion in a workspace to within thousandths of an inch, and the ability to present data attributes as touch properties to the user. By enabling realistic 3D-touch interaction, it allows a computer user to interact with the virtual world as they would with the real world.
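
How data attributes become touch is easiest to see in the haptic servo loop itself. The sketch below is a minimal, hypothetical illustration - it does not use the actual SensAble SDK, and the device-I/O calls are stand-ins - showing a generic high-rate loop that samples the stylus position, looks up the local voxel amplitude, and converts it into a resistive force so that higher-amplitude regions feel "thicker" under the stylus.

    // Minimal sketch of a haptic servo loop: voxel amplitude -> resistive force.
    // The device calls are hypothetical placeholders, not the Phantom SDK.
    #include <array>
    #include <cstdio>

    struct Vec3 { double x, y, z; };

    // Tiny scalar volume standing in for seismic amplitude data (values in [0,1]).
    struct Volume {
        static const int N = 8;
        std::array<double, N * N * N> v{};
        double sample(const Vec3& p) const {           // p expected in [0,1)^3
            int i = int(p.x * N), j = int(p.y * N), k = int(p.z * N);
            if (i < 0 || j < 0 || k < 0 || i >= N || j >= N || k >= N) return 0.0;
            return v[(k * N + j) * N + i];
        }
    };

    // Hypothetical device I/O; a real application calls the vendor SDK here.
    Vec3 readStylusPosition() { return {0.5, 0.5, 0.5}; }
    void sendForce(const Vec3& f) { std::printf("force: %.3f %.3f %.3f\n", f.x, f.y, f.z); }

    // Map local amplitude to a viscous (velocity-opposing) force so that
    // high-amplitude regions resist stylus motion more strongly.
    Vec3 computeForce(const Volume& vol, const Vec3& pos, const Vec3& vel) {
        double damping = 0.2 + 2.0 * vol.sample(pos);  // gain tuned by feel
        return {-damping * vel.x, -damping * vel.y, -damping * vel.z};
    }

    int main() {
        Volume vol;
        vol.v.fill(0.7);                               // placeholder data
        Vec3 prev = readStylusPosition();
        const double dt = 0.001;                       // ~1 kHz haptic update rate
        for (int tick = 0; tick < 5; ++tick) {         // a real loop runs until exit
            Vec3 pos = readStylusPosition();
            Vec3 vel = {(pos.x - prev.x) / dt, (pos.y - prev.y) / dt, (pos.z - prev.z) / dt};
            sendForce(computeForce(vol, pos, vel));
            prev = pos;
        }
    }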

The second challenge was to create a viewing environment that provides depth perception and enables the user to navigate in real time. True interaction is only possible when the view updates in less than a second - that is, in real time. Based on our experience with other 3D modeling applications, we chose an IBM Intellistation Z-Pro, a Windows NT workstation with dual Pentium processors and an advanced graphics accelerator from Intergraph. Together with CrystalEyes stereo shutter glasses and the VGL software library developed by Volume Graphics GmbH, we were confident that this $10,000 desktop system could deliver real-time 3D stereo, something unheard of 2-3 years ago.
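
The depth cue itself comes from rendering the scene twice per frame - once for each eye - into separate left and right back buffers that the shutter glasses alternate between at the display's refresh rate. The fragment below illustrates that quad-buffered stereo pattern with plain OpenGL and GLUT; it is only a generic sketch of the technique, not the VGL-based rendering the application actually used.

    // Generic quad-buffered stereo loop (OpenGL/GLUT), for illustration only.
    #include <GL/glut.h>

    static void drawScene(double eyeOffset) {
        glLoadIdentity();
        glTranslated(eyeOffset, 0.0, 0.0);     // offset the scene to emulate this eye's viewpoint
        glutWireCube(0.8);                     // stand-in for the rendered volume
    }

    static void display() {
        const double halfEyeSeparation = 0.03; // tuned for comfortable depth
        glDrawBuffer(GL_BACK_LEFT);            // left-eye image
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        drawScene(+halfEyeSeparation);
        glDrawBuffer(GL_BACK_RIGHT);           // right-eye image
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        drawScene(-halfEyeSeparation);
        glutSwapBuffers();                     // the shutter glasses sync to the swap
    }

    int main(int argc, char** argv) {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
        glutCreateWindow("stereo sketch");
        glMatrixMode(GL_MODELVIEW);
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }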

Putting it all together

The remaining challenge was to build a software application to tie these pieces together and enable a user to perform 3D interpretation. To this end, we developed a utility we call VoxelNotepad™ (VNP). As the name suggests, VNP plugs into existing environments, extending them to provide fast, intuitive 3D interaction with volumetric data. The program provides interactive 3D stereoscopic visualization and haptically enhanced selection, evaluation, and modification of volumetric data. Its main features are: import and export of volumetric geoscientific data; interactive visualization of volumetric data in stereo; an interactive "feel" for the volumetric data through a haptic interface device; direct picking of individual voxels; and direct, interactive modification of data values.
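
At bottom, direct picking and editing reduce to mapping the 3D cursor position into volume indices and reading or writing the voxel found there. The sketch below shows one plausible way to structure that; the class and function names are illustrative assumptions, not the actual VNP internals.

    // Illustrative voxel container with pick-and-edit operations driven by a
    // 3D cursor position in world coordinates. Hypothetical names, not VNP code.
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    struct VoxelVolume {
        int nx, ny, nz;
        double origin[3];            // world position of voxel (0,0,0)
        double spacing;              // voxel size in world units (assumed isotropic)
        std::vector<std::uint8_t> data;

        VoxelVolume(int x, int y, int z, double sp)
            : nx(x), ny(y), nz(z), origin{0, 0, 0}, spacing(sp),
              data(size_t(x) * y * z, 0) {}

        // Convert a world-space cursor position to integer voxel indices.
        bool worldToIndex(const double p[3], int idx[3]) const {
            for (int a = 0; a < 3; ++a)
                idx[a] = int((p[a] - origin[a]) / spacing);
            return idx[0] >= 0 && idx[0] < nx && idx[1] >= 0 && idx[1] < ny &&
                   idx[2] >= 0 && idx[2] < nz;
        }

        // Pick: return the value under the cursor (0 if outside the volume).
        std::uint8_t pick(const double p[3]) const {
            int i[3];
            if (!worldToIndex(p, i)) return 0;
            return data[(size_t(i[2]) * ny + i[1]) * nx + i[0]];
        }

        // Edit: overwrite the single voxel under the cursor with a new value.
        void paint(const double p[3], std::uint8_t value) {
            int i[3];
            if (worldToIndex(p, i))
                data[(size_t(i[2]) * ny + i[1]) * nx + i[0]] = value;
        }
    };

    int main() {
        VoxelVolume vol(64, 64, 64, 1.0);
        double cursor[3] = {10.5, 20.5, 30.5};   // stylus tip in world coordinates
        vol.paint(cursor, 255);                  // mark the voxel under the stylus
        std::printf("picked value: %d\n", vol.pick(cursor));
        return 0;
    }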

In the near future, the program will be extended to handle multi-attribute data sets; to model other entities used in exploration, such as well bores, faults and horizons; and to extend its modification and selection capabilities with the ability to "lock" the data values of selected regions of the volume.