The Volume Haptics Toolkit (VHTK) extends the H3D API into the domain of volumetric data. It provides the scene-graph nodes needed to generate both advanced haptic feedback from volumetric data and scientific visual renderings. The haptic algorithms provided by the toolkit are primarily designed to give natural representations of the information in scientific volumetric data and to guide the exploration process. The methods are, however, very general and can also be used for other purposes, for example to generate guidance fields in games or other tasks.
The toolkit is based on the results of four years of research on haptics for scientific visualization. These results include the development of haptic primitives, which the toolkit uses both as a comprehensive abstraction layer for implementing haptic interaction and as an effective means of stable haptic feedback calculation. The research has also identified important aspects of, and outlined guidelines for, the successful implementation of effective haptic interaction in scientific data exploration. These aspects and guidelines have been carefully adhered to in the development of VHTK, making it easy to build both prototypes and advanced applications.
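To make the primitive abstraction concrete, the following is a minimal sketch of what a haptic primitive might look like. The class names, fields, and protocol are illustrative assumptions, not VHTK's actual API: each primitive maps a probe position to a force contribution, so higher layers can compose feedback without knowing how each contribution is computed.

```python
# Hypothetical sketch of a haptic-primitive abstraction layer.
# Names (HapticPrimitive, DirectedForce, PlaneConstraint) are
# illustrative only and do not reflect VHTK's real classes.
from dataclasses import dataclass
from typing import Protocol, Tuple

Vec3 = Tuple[float, float, float]

class HapticPrimitive(Protocol):
    def evaluate(self, position: Vec3) -> Vec3:
        """Return this primitive's force contribution (N) at the probe position."""
        ...

@dataclass
class DirectedForce:
    """Constant guidance force, e.g. pulling the probe toward a feature."""
    direction: Vec3   # unit vector
    magnitude: float  # newtons

    def evaluate(self, position: Vec3) -> Vec3:
        return tuple(self.magnitude * d for d in self.direction)

@dataclass
class PlaneConstraint:
    """Penalty-based plane through the origin with normal `normal`:
    resists penetration with stiffness k (N/m)."""
    normal: Vec3  # unit vector
    k: float      # stiffness, N/m

    def evaluate(self, position: Vec3) -> Vec3:
        # Signed penetration depth below the plane.
        depth = -sum(p * n for p, n in zip(position, self.normal))
        if depth <= 0.0:  # probe is above the plane: no force
            return (0.0, 0.0, 0.0)
        return tuple(self.k * depth * n for n in self.normal)
```

Because each primitive is a simple, well-behaved function of the probe position, contributions can be summed in the haptic loop without the instabilities that ad hoc force computations tend to introduce.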
The special features of this toolkit, introduced through the implementation of these research results, include:
To enable the interactive construction of the haptic environment needed for effectively designing the form of interaction, the toolkit is built on the concept of a configurable data-flow pipeline. In this model, shown in the figure, the features of interest are extracted from the data in real time. Haptic transfer functions are used to extract material information from the data, such as friction or hardness. The feature and material information are used to control haptic primitives. The primitives, specifying the form and nature of the haptic feedback, are collected by a force mapper that calculates the final force feedback, which is, in turn, sent to the haptic device, typically at a rate of 1 kHz. Both haptic and visual components can be interactively manipulated at run-time through the event system of H3D.
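The pipeline stages above can be sketched in a few lines. This is an illustrative assumption of how the stages fit together, not VHTK's implementation: a haptic transfer function maps a sampled data value to a material property, primitive contributions are derived from the data at the probe, and a force mapper sums them into the force sent to the device each haptic frame (~1 kHz).

```python
# Illustrative data-flow pipeline: data -> transfer function ->
# primitive contributions -> force mapper -> device force.
# All function names and constants here are hypothetical.
from typing import Callable, Tuple

Vec3 = Tuple[float, float, float]
ScalarField = Callable[[Vec3], float]  # the volumetric data, sampled at a point

def transfer_function(value: float) -> float:
    """Haptic transfer function: map a data value to a drag
    coefficient (N*s/m). Here: more drag in denser regions."""
    return 5.0 * max(0.0, min(1.0, value))

def gradient(field: ScalarField, p: Vec3, h: float = 1e-3) -> Vec3:
    """Central-difference gradient, used to derive a guidance direction."""
    g = []
    for i in range(3):
        dp, dm = list(p), list(p)
        dp[i] += h
        dm[i] -= h
        g.append((field(tuple(dp)) - field(tuple(dm))) / (2.0 * h))
    return tuple(g)

def force_mapper(field: ScalarField, pos: Vec3, vel: Vec3) -> Vec3:
    """One haptic frame: extract material at the probe, build
    primitive contributions, and sum them into the output force."""
    drag = transfer_function(field(pos))         # material from the data
    viscous = tuple(-drag * v for v in vel)      # drag primitive
    g = gradient(field, pos)
    guidance = tuple(-0.5 * gi for gi in g)      # pull toward lower values
    return tuple(a + b for a, b in zip(viscous, guidance))
```

In a real application this function would run in the dedicated haptic thread at about 1 kHz, while the scene graph and visual rendering are updated at frame rate through the H3D event system.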
See the documentation for more details.