Simulation and Interaction Methods for Intuitive Deformation of Volume Images
Date
2019
Publisher
Universidad Rey Juan Carlos
Abstract
The areas of simulation and visualization in computer graphics offer technical solutions
that can aid medical applications in multiple ways, such as visualizing and exploring
volume images to detect tumors or other diseases, or simulating the deformation of organs
to train surgeons or plan surgical interventions. However, enabling the simulation of
deformations on a medical image of a patient requires many pre-processing operations,
which are time-consuming. These operations include image segmentation, meshing of
anatomical elements, or assignment of mechanical parameters and boundary conditions.
Depending on the organ to be deformed, different parameters and settings must be specified,
which requires additional workload and a per-case adaptation of the simulations and
the pre-processing.
This thesis starts from the grand goal of empowering radiologists and surgeons with
a 3D environment where they can easily and intuitively interact with a patient’s volume
image. We build on important constraints, such as clinical experts' lack of familiarity
with simulation methods, and the need for tools with a gentle learning curve and no
pre-processing requirements. For instance, surgeons should be able to load the medical
image of a patient, and directly interact with the organs present in this image, without
complex segmentation or case-dependent parameter setting. The simulation tool should
allow surgeons to move organs using a computer mouse, or even their fingers directly
on a touchscreen. By avoiding tedious and complex processes such as segmentation, we
can reduce the cost and complexity of using simulation tools, and extend their usability
to more patient cases. Rapid setup of medical simulation is a radically new goal with
respect to most previous research.
Building on the aforementioned goals and constraints, we have developed novel deformable
models that allow simple and interactive manipulation of volumetric medical
images, and can therefore enable the creation of simple tools for exploration and planning
of surgical interventions. Our novel models and methods can be classified as follows:
First, we have designed an interactive algorithm to deform 3D images without segmentation
through a corotational coarsening method, which uses a coarse and a fine mesh.
The fine mesh captures the properties of the volume at a fine level, and the coarse mesh
allows very fast simulations. Each coarse tetrahedron represents the material of the underlying fine tetrahedra. The meshes can be either regular or irregular, although we demonstrate
the method on regular meshes for simplicity. The coarsening method enables the
simulation of the tetrahedral mesh, and then we apply a rasterization method to deform
the full volume from the coarse nodes.
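The mapping from coarse nodes to the full volume can be illustrated with a minimal sketch (not the thesis's actual rasterization method): each sample point inside a coarse tetrahedron is expressed in barycentric coordinates of that tetrahedron at rest, and the same weights applied to the deformed node positions give its deformed location. Function names and the point-wise loop are our own illustrative choices.

```python
import numpy as np

def barycentric_coords(p, tet):
    """Barycentric coordinates of point p w.r.t. a tetrahedron (4x3 array of vertices)."""
    # Solve the 3x3 linear system defined by the tetrahedron's edge vectors.
    T = np.column_stack((tet[1] - tet[0], tet[2] - tet[0], tet[3] - tet[0]))
    b = np.linalg.solve(T, p - tet[0])
    return np.array([1.0 - b.sum(), b[0], b[1], b[2]])

def deform_points(points, rest_tet, deformed_tet):
    """Carry sample points through the deformation of one coarse tetrahedron."""
    out = np.empty_like(points)
    for i, p in enumerate(points):
        w = barycentric_coords(p, rest_tet)
        out[i] = w @ deformed_tet  # same weights, deformed node positions
    return out
```

Because the weights sum to one, a rigid translation of the coarse nodes translates every interior sample identically, which is the behavior one expects from such an interpolation scheme.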
Second, we add accurate boundary conditions at arbitrary locations within the volume.
The previous step assumes that forces are applied at coarse nodes, but we design additional
methods that demonstrate how to apply external forces at fine nodes without the
need to solve the simulation system at the fine level. We take into account fixed and moving
fine nodes inside each coarse tetrahedron, and we develop a corotational coarsening
formulation where fixed fine nodes directly affect the behaviour of coarse nodes.
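How prescribed nodes influence the remaining degrees of freedom can be sketched with standard static condensation, a simplified stand-in for the thesis's corotational coarsening formulation: fixed displacements are eliminated from the stiffness system, and their contribution moves to the right-hand side felt by the free nodes. The function name and matrix layout are illustrative assumptions.

```python
import numpy as np

def solve_with_fixed_nodes(K, f, fixed_idx, fixed_u):
    """Solve K u = f with prescribed displacements at fixed_idx.

    Partitions the system into free/fixed DOFs; fixed displacements
    contribute -K_fc @ u_c to the free right-hand side.
    """
    fixed_idx = np.asarray(fixed_idx)
    fixed_u = np.asarray(fixed_u, dtype=float)
    free = np.setdiff1d(np.arange(K.shape[0]), fixed_idx)
    rhs = f[free] - K[np.ix_(free, fixed_idx)] @ fixed_u
    u = np.zeros(K.shape[0])
    u[fixed_idx] = fixed_u
    u[free] = np.linalg.solve(K[np.ix_(free, free)], rhs)
    return u
```

The solution satisfies the equilibrium equations exactly at the free DOFs while honoring the prescribed values, mirroring how fixed fine nodes can shape the response of the coarse system.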
Finally, we create an interaction method that allows a person to move 3D organs using
fingers on a 2D touchscreen. Motion is imposed using the fingers, which gives the user the
sensation of grasping and rotating anatomy directly. We demonstrate this novel interaction
method on a volume image that contains the backbone, kidneys, veins, the stomach
and other soft anatomy in the abdomen region. The sensation of direct manipulation
is provided by keeping the same anatomical elements under the fingers at all times as the
interaction evolves.
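A minimal sketch of how two finger contacts can drive such an interaction (a simplified 2D stand-in, not the thesis's actual constraint formulation): the rigid rotation and translation that carry the initial finger positions toward the current ones give the motion to impose on the grasped anatomy, so the same points stay under the fingers. All names here are our own.

```python
import numpy as np

def rigid_from_two_fingers(p0, p1, q0, q1):
    """2D rigid transform (rotation R, translation t) mapping initial finger
    anchors p0, p1 toward current finger positions q0, q1 (no scaling)."""
    # Rotation: angle between the two inter-finger vectors.
    a, b = p1 - p0, q1 - q0
    th = np.arctan2(b[1], b[0]) - np.arctan2(a[1], a[0])
    R = np.array([[np.cos(th), -np.sin(th)],
                  [np.sin(th),  np.cos(th)]])
    # Translation: align the midpoints of the finger pairs.
    t = 0.5 * (q0 + q1) - R @ (0.5 * (p0 + p1))
    return R, t
```

When the fingers move rigidly (same spacing), the transform reproduces both contacts exactly; when spacing changes, it distributes the error symmetrically around the midpoint.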
In conclusion, we have developed a variety of simulation and visualization methods
that allow non-technical users to deform and manipulate anatomical elements in medical
volume images in an intuitive way, directly on a touchscreen.
Description
Doctoral thesis defended at Universidad Rey Juan Carlos, Madrid, in 2019. Thesis director: Miguel Ángel Otaduy
Except where otherwise noted, this item's license is described as Attribution-NonCommercial-NoDerivatives 4.0 International