Virtual reality modelling language: freely available cross-platform visualization technique for 3-D visualization of the inner ear

  • Paul Hans (a1), Alan Jackson (a2), James E. Gillespie (a2) and Richard T. Ramsden (a1)

Abstract

This was a study of the use of virtual reality modelling language (VRML) for cross-platform interactive three-dimensional (3-D) visualization of high-resolution magnetic resonance (MR) images of the inner ear in the assessment of cochlear implant candidates.

A retrospective case review was conducted of cochlear implant candidates who underwent pre-operative high-resolution MR studies to determine their suitability for implantation. 3-D visualizations of the inner ear structures were created from the MR scans using surface rendering and exported as portable VRML files.

Case studies are presented to illustrate different points of interest. VRML reconstructions aided the interpretation of two-dimensional (2-D) source images in a variety of inner ear abnormalities.

VRML is an internationally recognized standard for cross-platform 3-D visualization. It provides the implanting surgeon with a portable 3-D representation of the inner ear, aiding interpretation of the complex cross-sectional anatomy of these structures and guiding both the selection of patients for implantation and the implantation technique. Detailed imaging studies of the temporal bone can also help elucidate the mechanisms behind inner ear malformations, with VRML reconstructions providing an easily interpreted representation of the deformities.
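VRML files are plain-text scene descriptions that any VRML-capable viewer can display, which is what makes the reconstructions portable across platforms. As a purely illustrative sketch (not taken from the study's reconstructions), a minimal VRML 97 file looks like this:

#VRML V2.0 utf8
# Minimal VRML 97 scene: one surface-rendered shape.
# A real inner-ear model would replace the Sphere with an
# IndexedFaceSet whose vertices come from the segmented MR data.
Shape {
  appearance Appearance {
    material Material { diffuseColor 0.9 0.8 0.7 }
  }
  geometry Sphere { radius 1.0 }
}

The first line is the mandatory VRML 2.0 header; the rest of the file is a scene graph of nodes, so a surface-rendered anatomical model is simply a larger file of the same form.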
