This mixed simulation teaching tool uses augmented reality to provide training in blind and guided medical procedures. In the United States, more than 250,000 deaths per year are due to medical error, making it the third leading cause of death. Many of these deaths were attributed to substandard care and could have been prevented if healthcare professionals had received proper training. Many medical procedures require placing an instrument, such as a needle, inside a target while avoiding accidental contact with or puncture of surrounding organs or tissues. For such procedures, verbal and written instruction, while necessary and worthwhile, cannot take the place of hands-on training. Researchers at the University of Florida have developed a mixed simulation system that allows medical training instructors to visualize and consistently score trainee performance. The simulation integrates sensors and augmented reality principles with physical and virtual medical image models, enabling students to rehearse important medical procedures and to self-debrief without endangering human lives.
Mixed simulators with anatomically correct physical and virtual components that combine real-time 3D visualization with tracked instruments, recording and playback, and automated and consistent scoring algorithms to facilitate training of clinicians in procedural skills
This mixed simulation technology collocates anatomically authentic virtual and physical 3D objects that represent the part of the human body that is of interest. The simulation has already been applied successfully to three procedures: central venous access (upper torso and neck), regional anesthesia (spine), and ventriculostomy (brain). In all three applications, a six-degrees-of-freedom (6DOF) sensor smaller than a grain of rice is secured inside the needle bore near the needle tip so that, as the trainee directly manipulates and steers the needle, the needle tip position can be tracked relative to both the physical and virtual components representing the human body. Real-time 3D visualization allows trainees and instructors to observe and critique technique and strategy. Because of the needle tip tracking, previously unavailable metrics facilitate the implementation of automated and consistent scoring algorithms. These scoring algorithms open the possibility of self-debriefing when experts are not available to provide feedback. CT and MRI scans of individual humans, along with 3D files of discrete objects representing different organs and tissues, provide the basis for the physical and virtual models. Feeding the 3D files to a 3D printer (rapid prototyping machine) creates the physical parts of the mixed simulation. The system integrates readily available commercial off-the-shelf components into turnkey simulation systems (setup time of about seven minutes) that are compact and lightweight (meeting airline checked-luggage requirements).
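To make the tracking step concrete, the following is a minimal sketch of how a needle tip position could be derived from a 6DOF sensor reading; the local needle-axis convention, the 10 mm tip offset, and all function names are illustrative assumptions, not details of the UF system.

```python
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z),     2*(x*y - w*z),     2*(x*z + w*y)],
        [    2*(x*y + w*z), 1 - 2*(x*x + z*z),     2*(y*z - w*x)],
        [    2*(x*z - w*y),     2*(y*z + w*x), 1 - 2*(x*x + y*y)],
    ])

def needle_tip_position(sensor_pos, sensor_quat, tip_offset_mm=10.0):
    """The sensor sits inside the needle bore near the tip; the tip lies a
    fixed distance further along the needle's local +z axis (an assumed
    convention, as is the 10 mm offset)."""
    R = quat_to_matrix(sensor_quat)
    needle_axis = R @ np.array([0.0, 0.0, 1.0])  # needle direction in world frame
    return np.asarray(sensor_pos) + tip_offset_mm * needle_axis

# Example: sensor at the origin, needle pointing straight along +z
tip = needle_tip_position([0.0, 0.0, 0.0], [1.0, 0.0, 0.0, 0.0])
print(tip)  # -> [ 0.  0. 10.]
```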
This 3D-visualization precision biopsy system visually assists urologists in accurately sampling a desired location within the prostate, whether that location is a template location (freehand systematic prostate biopsy, sPBx) or a region of interest or target (targeted biopsy). Treatment cannot begin without a positive diagnosis of prostate cancer from pathology examination of biopsied samples, and an earlier diagnosis offers more options for treatment or cure. Traditional transrectal ultrasound (TRUS)-guided freehand systematic prostate biopsy, and to a lesser extent targeted PBx via MRI/TRUS fusion, can miss cancerous lesions that are present, undesirably producing a false negative diagnosis. A false negative delays initial treatment for prostate cancer by more than six months if a repeat biopsy is performed (and samples the lesion); otherwise the cancer is missed entirely, complicating treatment and greatly diminishing patient outcomes. Prostate biopsy is spatially challenging because the commonly used TRUS image is 2D, and properly interpreting it requires mental rotations by the clinician.
Researchers at the University of Florida have developed a 3D, six-degrees-of-freedom (6DOF) tracking, guidance, and visualization system to increase precision and ease of use during TRUS-guided prostate biopsy via the transrectal or transperineal route. The system enables urologists to capture tissue samples accurately from the entire prostate so that potentially malignant tissue is not missed. If malignancy is present, this ensures an earlier diagnosis of prostate cancer than current technology allows, giving the patient better options for treatment or cure.
An intuitive, 3D-visualization guidance system for freehand systematic or targeted prostate biopsy that assists urologists and provides real-time feedback in accurately placing biopsy cores at the desired locations via the transrectal or transperineal route
This biopsy system displays a virtual TRUS probe and biopsy needle through real-time 6DOF tracking and 3D visualization. A prostate catheter with a tracking sensor is anchored in the prostatic urethra to track the prostate's location and orientation in real time. User-interface software depicts the prostate and the locations of the TRUS probe and biopsy needle to guide the urologist during the procedure. An echogenic liquid filling the catheter ensures clear visualization of the catheter during ultrasonography, which helps prevent unintended sampling of the catheter or tracking sensor. The urologist can then manually trace the outlines of multiple cross-sections of the prostate and input them into the software to create a segmented 3D prostate that displays the locations to be sampled.
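As a rough illustration of the guidance geometry this implies, the sketch below re-expresses a tracked point in a prostate-fixed frame defined by the catheter sensor and reports how far a planned core location lies from the needle's projected path; the frame convention and helper names are hypothetical, not the system's actual interface.

```python
import numpy as np

def to_prostate_frame(point_world, prostate_pos, prostate_rot):
    """Re-express a world-frame point in the prostate-fixed frame defined by
    the catheter sensor (position vector and 3x3 rotation matrix)."""
    return prostate_rot.T @ (np.asarray(point_world) - np.asarray(prostate_pos))

def distance_to_target(tip, direction, target):
    """Perpendicular distance from a planned core location to the line the
    needle would follow if advanced along its current axis."""
    d = np.asarray(direction, dtype=float) / np.linalg.norm(direction)
    v = np.asarray(target, dtype=float) - np.asarray(tip, dtype=float)
    return np.linalg.norm(v - np.dot(v, d) * d)

# Example: needle aimed along +z, planned core 2 mm off the projected path
print(distance_to_target([0, 0, 0], [0, 0, 1], [2.0, 0.0, 30.0]))  # -> 2.0
```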
Facilitates the Training of Clinicians in Procedural and Cognitive Skills Using 3D Color Visualization
This mixed simulator technology is especially suited for training in blind and guided medical procedures. The teaching tool uses augmented reality principles, similar to the yellow first-down line on televised football games, and has the potential to improve patient safety. The technology seamlessly integrates numerous exponential technologies, such as medical imaging, miniature six-degrees-of-freedom (6DOF) sensors smaller than a grain of rice, and 3D printers, to collocate (superpose, overlay, or underlay) physical and virtual systems that are anatomically correct replicas of individual humans. The Institute of Medicine has estimated that medical errors kill more people than motor vehicle accidents, breast cancer, and AIDS combined. Many of these adverse events were attributed to substandard care and could have been prevented if healthcare professionals had been properly trained. Verbal and written instruction, while necessary and worthwhile, cannot take the place of hands-on training; practicing on human patients, however, is unconscionable. This simulator allows instructors to visualize and consistently score trainee performance, while students can rehearse and self-debrief without endangering human lives.
Many medical procedures require placing an instrument, such as a needle, inside a target such as a vein while avoiding accidental contact with or puncture of surrounding organs or tissues. The mixed simulation technology developed by UF researchers collocates anatomically authentic virtual and physical 3D objects that represent the part of the human body that is of interest. The technology has already been applied successfully to three procedures: central venous access (upper torso and neck), regional anesthesia (spine), and ventriculostomy (brain). In all three applications, a 6DOF sensor smaller than a grain of rice is fixed inside the needle bore near the needle tip so that, as the trainee directly manipulates and steers the needle, the needle tip position can be tracked relative to both the physical and virtual components representing the human body. Real-time 3D visualization can be turned on to allow trainees and instructors to observe and critique technique and strategy. The needle tip tracking provides previously unavailable metrics that facilitate the implementation of automated and consistent scoring algorithms, which open the possibility of self-debriefing when experts are not available to provide feedback. Individual humans were scanned using CT and MRI, and 3D files consisting of discrete objects representing different organs and tissues were created. The physical parts of the mixed simulation are created by feeding the 3D files to a 3D printer. The technology integrates readily available commercial off-the-shelf components into turnkey simulation systems (setup time of about seven minutes) that are compact and lightweight (meeting airline checked-luggage requirements).
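The scoring mentioned above could, for illustration, combine metrics such as excess path length, final distance from the target, and entries into forbidden structures. The toy rubric below shows the idea; all weights, thresholds, and names are invented for this sketch and are not the simulator's actual algorithm.

```python
import numpy as np

def score_attempt(tip_path, target, forbidden_zones, zone_radius_mm=5.0):
    """Score a recorded needle-tip trajectory (N x 3 array of positions).
    Weights and penalties below are illustrative, not the actual rubric."""
    path = np.asarray(tip_path, dtype=float)
    path_len = np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1))
    straight = np.linalg.norm(path[-1] - path[0])
    excess_mm = path_len - straight                       # wandering penalty
    miss_mm = np.linalg.norm(path[-1] - np.asarray(target))
    punctures = sum(                                      # forbidden-zone entries
        np.any(np.linalg.norm(path - np.asarray(z), axis=1) < zone_radius_mm)
        for z in forbidden_zones
    )
    return max(0.0, 100.0 - 2.0*excess_mm - 5.0*miss_mm - 25.0*punctures)

# Example: a slightly wandering but on-target attempt scores near 100
path = [[0, 0, 0], [0, 1, 10], [0, 0, 20]]
print(score_attempt(path, target=[0, 0, 20], forbidden_zones=[[15, 0, 10]]))
```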
This computer-guided bracket positioning device and indexing tool employs tracking sensors for more precise, efficient, and effective orthodontic bracket placement. In the United States alone, more than 4 million people wear braces. These patients invest thousands of dollars in orthodontic procedures to align their smiles and prevent complications from crooked teeth or misaligned jaws. Brackets are the most important element of orthodontic treatment, and bracket placement its most important procedural element. Available placement procedures are labor-intensive, time-consuming, operator-dependent, and not reliably accurate. Patients require approximately 28 brackets for a full set of braces, and each bracket plays a role in rotating and aligning the teeth and jaw. Researchers at the University of Florida have developed a computer-guided bracket positioning device and indexing tool with tracking sensors to guide bracket placement. Using these tools, an orthodontist or assistant can plan bracket placement on a virtual or physical model of a patient's teeth, then use sensors in a bracket-positioning tool to place individual brackets precisely in their planned optimal positions.
Reliable treatment planning and placement of orthodontic braces
This computer-guided bracket positioning device and indexing tool with tracking sensors enables precise planning and application of orthodontic brackets. An assistant uses a bracket positioning tool and an indexing tool to fit brackets onto a physical or virtual model of a patient's teeth (the patient does not need to be present) and saves these bracket placement coordinates, with six degrees of freedom, to a corresponding model within the computing system. During the actual placement, an orthodontist or assistant applies each bracket individually to the predetermined coordinates, with computer guidance and sensor feedback. The sensors indicate (via audio, visual, or haptic feedback) when each bracket is precisely in its preplanned installation position, enhancing the precision, efficiency, and reliability of bracket application.
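A minimal sketch of the pose-matching feedback loop such a tool implies: compare the bracket's live 6DOF pose with the saved plan and trigger the cue when positional and angular errors fall within tolerance. The 0.5 mm and 2 degree tolerances below are placeholder assumptions, not the device's specifications.

```python
import numpy as np

def pose_error(live_pos, live_rot, plan_pos, plan_rot):
    """Positional error (mm) and angular error (degrees) between the live
    bracket pose and its planned pose (positions + 3x3 rotation matrices)."""
    d_mm = np.linalg.norm(np.asarray(live_pos) - np.asarray(plan_pos))
    # Angle of the relative rotation, recovered from its trace
    cos_a = (np.trace(np.asarray(plan_rot).T @ np.asarray(live_rot)) - 1.0) / 2.0
    angle_deg = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return d_mm, angle_deg

def in_position(live_pos, live_rot, plan_pos, plan_rot,
                tol_mm=0.5, tol_deg=2.0):
    """True when the bracket is within the assumed placement tolerances,
    i.e. the moment to trigger the audio/visual/haptic cue."""
    d, a = pose_error(live_pos, live_rot, plan_pos, plan_rot)
    return d <= tol_mm and a <= tol_deg

I = np.eye(3)
print(in_position([0.1, 0.0, 0.2], I, [0.0, 0.0, 0.0], I))  # -> True
```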
This ultrasound-guided needle insertion simulator is a non-invasive medical tool that interacts with actual, live ultrasound images. More realistic, applicable, and accessible simulations of ultrasound-guided needle insertions improve the learning experience, greatly reducing improper use of ultrasound and the number of needle passes and movements inside the patient. Frequent needle passes and movements put the patient at risk of mild complications, such as discomfort, as well as more serious complications, such as pneumothorax. Ultrasound is becoming the standard of care for many medical procedures: the global portable ultrasound market is estimated to grow at an annual rate of 8.74 percent through 2019, and in 2014 more ultrasound systems were sold than CT, MRI, and X-ray systems combined. Procedures that utilize ultrasound range from fetal imaging to safe insertion and guidance of a needle inside a patient. Researchers at the University of Florida have developed an ultrasound-guided medical tool insertion simulator that allows clinicians to practice ultrasound-guided needle insertions directly on a patient's anatomy before the actual procedure, making it suitable for both training and clinical use. Available simulators reference a library of pre-scanned, pre-recorded ultrasound images or use ultrasound machines to scan objects that are not the actual patient; these "fictitious" ultrasound images are less realistic than actual, live images and do not account for anatomical differences and anomalies. This improvement has the potential to meet the increasing demand for effective ultrasound medical tools.
Ultrasound-guided needle insertion simulators for accurate training and clinical use
This ultrasound-guided needle insertion simulator uses a retractable needle, or a needle emulator created with a mobile smart device, to interact with live ultrasound images in a non-invasive format. The simulator addresses both teaching ultrasound to medical students and residents and allowing clinicians to practice ultrasound-guided needle insertions immediately before the actual procedure. Improper use of ultrasound and lack of knowledge can increase the number of needle passes and movements inside a patient, putting patients at risk of complications ranging from mild, such as discomfort, to serious, such as pneumothorax. The value of the simulator lies in its use of live ultrasound on real anatomy, its use of highly accessible devices already owned by the user, and its portability as a laptop or smart-device app.
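Because the simulator overlays a virtual needle on live ultrasound, its core geometry reduces to intersecting the tracked needle's line with the ultrasound image plane. The sketch below computes that intersection under an assumed common tracker frame; the names and setup are illustrative, not the product's API.

```python
import numpy as np

def needle_plane_intersection(tip, direction, plane_point, plane_normal):
    """Where the needle's line crosses the ultrasound image plane, or None
    if the needle is parallel to the plane. All inputs are 3-vectors in a
    common tracker frame (an assumed setup)."""
    d = np.asarray(direction, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    denom = np.dot(n, d)
    if abs(denom) < 1e-9:
        return None                      # needle parallel to the image plane
    t = np.dot(n, np.asarray(plane_point) - np.asarray(tip)) / denom
    return np.asarray(tip) + t * d       # point to render on the live image

# Example: needle along +z hits the z = 50 mm image plane at (0, 0, 50)
print(needle_plane_intersection([0, 0, 0], [0, 0, 1], [0, 0, 50], [0, 0, 1]))
```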