Fancy a touch of my organs, mate?
How many of you have actually looked at your own x-ray only to scratch your head, wondering what the heck it is and which part of the body it came from? Doctors are much better readers of such x-rays, and they often enlist computerized image analysis to extract information from these images. Such analysis makes it a whole lot easier to determine the size of organs before surgery. Unfortunately, the surgeon cannot ‘touch’ or ‘feel’ the organ in question – but all of that is about to change thanks to a PhD candidate from Uppsala University, Sweden, who has developed new technology that makes it a snap to diagnose and plan the treatment of cancer. Haptics technology is involved in the development of these new interactive methods, ‘where the mouse and keyboard are replaced by a pen-like three-dimensional mouse that enables the user to feel the virtual organs.’
The desktop display is equipped with a PHANToM Desktop haptic device, with the haptic pen hooked up to a specially constructed workstation capable of displaying stereo graphics. According to the Uppsala University news release, ‘Doctors will soon be able to feel organs via a display screen.’
Vidholm’s thesis provides further detail: “Modern medical imaging techniques provide an increasing amount of high-dimensional and high-resolution image data that need to be visualized, analyzed, and interpreted for diagnostic and treatment planning purposes. As a consequence, efficient ways of exploring these images are needed. In order to work with specific patient cases, it is necessary to be able to work directly with the medical image volumes and generate the relevant 3D structures directly as they are needed for visualization and analysis. This requires efficient tools for segmentation, i.e., separation of objects from each other and from the background. Segmentation is hard to automate due to, e.g., high shape variability of organs and limited contrast between tissues. Manual segmentation, on the other hand, is tedious and error-prone. An approach combining the merits from automatic and manual methods is semi-automatic segmentation, where the user interactively provides input to the methods. For complex medical image volumes, the interactive part can be highly 3D oriented and is therefore dependent on the user interface.”
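To make the abstract’s idea of semi-automatic segmentation concrete, here is a toy sketch of one classic technique in that family: seeded region growing, where the user’s only interactive input is a seed point placed inside the organ of interest and the algorithm does the rest. To be clear, this is not Vidholm’s actual method – the function name, the tiny test image, and the `tolerance` parameter are all invented for this illustration.

```python
import numpy as np
from collections import deque

def region_grow(volume, seed, tolerance):
    """Grow a region from a user-placed seed: a voxel joins the region if it
    touches the region and its intensity is within `tolerance` of the seed's."""
    seed_value = volume[seed]
    mask = np.zeros(volume.shape, dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        point = queue.popleft()
        # Visit the 2 * ndim axis-aligned neighbours (6-connectivity in 3D).
        for axis in range(volume.ndim):
            for step in (-1, 1):
                neighbour = list(point)
                neighbour[axis] += step
                if not (0 <= neighbour[axis] < volume.shape[axis]):
                    continue  # outside the image
                neighbour = tuple(neighbour)
                if not mask[neighbour] and abs(volume[neighbour] - seed_value) <= tolerance:
                    mask[neighbour] = True
                    queue.append(neighbour)
    return mask

# Tiny 2D "image": a bright 2x2 "organ" in the top-left corner.
image = np.array([[100, 100, 20],
                  [100, 100, 20],
                  [ 20,  20, 20]], dtype=float)
mask = region_grow(image, seed=(0, 0), tolerance=10)
print(mask.sum())  # prints 4: the four bright voxels grown from the seed
```

The same function works unchanged on a 3D CT or MRI volume, since the neighbour loop runs over `volume.ndim` axes – which hints at why the thesis stresses the user interface: in 3D, even the simple act of placing that seed point accurately is hard with a flat mouse, and far more natural with a haptic pen.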