Research Terms
The University of Central Florida invention comprises tactile-visual systems and methods for social interactions between isolated patients (for example, those with COVID-19) and remote visitors such as loved ones, family members, friends, or volunteers. A primary goal is to provide the isolated patient and the remote visitors with a visual interaction augmented by touch—a perception of being touched for the isolated patient and a perception of touching for the remote visitors. For example, a loved one might be able to virtually stroke the patient’s arm or head, or even squeeze the patient’s hand. A simple realization might include tactile transducer “strips” placed on the patient, with two-way video via touch-sensitive tablets, where touching the visual image of the strips on the tablet results in tactile sensations on the patient’s skin.
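The tablet-to-strip interaction described above can be sketched as a simple nearest-neighbor lookup: a touch on the video image is mapped to the closest tactile transducer strip. This is a minimal illustrative sketch, not the invention's disclosed implementation; the strip layout, coordinate system, and distance threshold are all assumptions.

```python
# Hypothetical sketch: map a touch point on the tablet's video image to the
# nearest tactile transducer strip on the patient. Strip positions, the image
# coordinate system, and the distance threshold are illustrative assumptions.
from dataclasses import dataclass
import math

@dataclass
class Strip:
    strip_id: str
    x: float  # strip center in image coordinates (pixels)
    y: float

def nearest_strip(strips, touch_x, touch_y, max_dist=60.0):
    """Return the strip closest to the touch, or None if none is near enough."""
    best, best_d = None, max_dist
    for s in strips:
        d = math.hypot(s.x - touch_x, s.y - touch_y)
        if d < best_d:
            best, best_d = s, d
    return best

strips = [Strip("forearm", 120, 300), Strip("hand", 200, 420), Strip("head", 90, 80)]
hit = nearest_strip(strips, 195, 410)  # a touch near the hand strip
```

In a full system, the selected strip's identifier would be sent over the video link to drive the corresponding transducer on the patient's skin.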
A more sophisticated realization could use the Physical-Virtual Patient Bed (PVPB), developed under NSF Award #1564065, to serve as a remote physical, visual, and tactile surrogate for the isolated patient. The remote visitors would be able to see, hear, and touch the PVPB. The isolated patient would see the remote visitors via video and feel their touch interactions via the tactile transducer strips on their arms and head (for example). These interactive video, voice, and touch interactions could provide additional comfort for the isolated patient and the remote visitors. Further embodiments and enhancements include 3D depth and viewing for visitors and patients, with possible 3D free space interaction. For example, visitors wearing augmented reality head-mounted displays could reach out and touch a virtual version of the patient, and the patient would feel tactile sensations. The systems and methods are usable in any conditions giving rise to isolation, including isolation due to geographical distance.
The University of Central Florida invention is an approach that applies virtual overlays to manikins, enabling realistic movement and a diversity of races, ethnicities, and disorders while still allowing physical contact. The innovation can enhance any physical full-body patient simulator, manikin/mannequin, figure, doll, or other patient representation by using any video see-through head-mounted display (VST-HMD) or similar augmented reality (AR), virtual reality (VR), or mixed reality (MR) system to augment the appearance of the physical patient representation. It can also simulate physical movements, aspects, and behaviors that the physical patient representation cannot produce on its own, while still affording the ability to touch the representation. For the healthcare domain, these enhancements offer several benefits, such as improved learning, training, mentoring, practice, and/or case planning. Other areas, including biology education, clothing retail, and any service or activity involving human representations, may also benefit from the invention.
Partnering Opportunity
The research team is seeking partners for licensing and/or research collaboration.
Stage of Development
Prototype available.
The University of Central Florida invention enables students to safely train for and practice the skills needed in various environments, such as laboratories, industrial areas, and hospitals. An advanced augmented reality (AR) apparatus, the invention integrates various technologies to provide an enhanced user experience through visual, tactile, auditory, olfactory, and other sensory feedback mechanisms. The system uses video see-through augmented reality (VST-AR) capabilities, leveraging hardware and software components to achieve a realistic and interactive environment for learning and training in areas such as biochemistry, forensic chemistry, geochemistry, and physics-related studies (optics, thermodynamics, and acoustics).
Corporal entities (glassware replicas made of plastic or other polymers) afford the physical qualities and motor-skill practice associated with handling objects and equipment such as standard chemistry lab glassware. At the same time, AR technologies enable users to perform and safely experience simulated procedures like those in a laboratory, including chemical reactions, even unsafe ones.
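The safe-simulation idea can be sketched as a small rule table: the system "mixes" virtual reagents and reports the outcome, hazardous or not, with no physical risk. The reagents, products, and hazard labels below are illustrative assumptions, not content from the invention's disclosure.

```python
# Hypothetical sketch: a tiny rule table for simulated reactions, letting a
# learner combine virtual reagents and observe outcomes (including hazardous
# ones) safely. All reagents and outcomes are illustrative placeholders.
REACTIONS = {
    frozenset({"HCl", "NaOH"}): {"product": "NaCl + H2O", "hazard": "exothermic"},
    frozenset({"H2O2", "KI"}): {"product": "O2 foam", "hazard": "rapid gas release"},
}

def mix(reagent_a, reagent_b):
    """Return the simulated outcome of combining two virtual reagents."""
    return REACTIONS.get(frozenset({reagent_a, reagent_b}),
                         {"product": "no visible reaction", "hazard": None})

result = mix("NaOH", "HCl")  # order-independent lookup via frozenset
```

A real system would drive the AR rendering (color change, foam, vapor) from the returned outcome rather than printing it.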
Additionally, the invention addresses temporal and geographic constraints associated with traditional lab and training systems, enabling students and trainees to engage in interactive and immersive learning experiences regardless of their physical location or time zone. This flexibility allows for continuous and accessible education, reducing the need for centralized, time-bound, and location-specific resources while maintaining the quality and safety of hands-on training.
Technical details: The UCF invention features a VST-AR device that captures images of a real-world environment and displays augmented versions on a screen. In one example application, the system includes a corporal entity, such as an empty laboratory vessel, with a unique identification marker to provide real-time spatial data for generating visual augmentations of virtual matter within the vessel. A marker can be a QR code, RFID tag, or near-field communication (NFC) chip affixed to the vessel surface.
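The marker-based lookup can be sketched as a registry keyed by the decoded marker ID: once the system knows which vessel it is looking at, it can generate the matching visual augmentation of virtual matter inside it. The registry entries, field names, and `augmentation_for` helper below are illustrative assumptions, not the invention's actual data model.

```python
# Hypothetical sketch: a decoded marker ID (e.g. from a QR code) keys into a
# registry describing the vessel, so the renderer knows what virtual matter
# to draw inside it. IDs and field names are illustrative placeholders.
VESSEL_REGISTRY = {
    "UCF-BEAKER-250": {"shape": "beaker", "capacity_ml": 250},
    "UCF-FLASK-100": {"shape": "erlenmeyer", "capacity_ml": 100},
}

def augmentation_for(marker_id, fill_ml, liquid="virtual_reagent"):
    """Build a render description for the virtual matter inside the vessel."""
    vessel = VESSEL_REGISTRY.get(marker_id)
    if vessel is None:
        return None  # unknown marker: render nothing
    fill = min(fill_ml, vessel["capacity_ml"])  # clamp to vessel capacity
    return {"shape": vessel["shape"],
            "fill_fraction": fill / vessel["capacity_ml"],
            "liquid": liquid}

aug = augmentation_for("UCF-BEAKER-250", 100)  # 100 ml in a 250 ml beaker
```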
The apparatus integrates various sensors, including an inertial measurement unit (IMU), capacitive tactile sensors, and a thermal diode, to dynamically adjust visual augmentations based on user interactions. Additional sensory enhancements include an olfactory output fan, an eccentric rotating mass (ERM) motor, and auditory outputs. Wireless connectivity via IEEE 802.11 (Wi-Fi) and IEEE 802.15.1 (Bluetooth) ensures seamless communication between components.
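One common way to turn raw IMU readings into a stable orientation signal for adjusting augmentations is a complementary filter, which blends the short-term accuracy of integrated gyroscope rates with the long-term stability of an accelerometer tilt estimate. This is a generic IMU technique offered as a sketch, not the invention's specific sensor-fusion method; the blend constant and sample rate are assumptions.

```python
# Hypothetical sketch: a complementary filter fusing gyroscope rate with an
# accelerometer tilt estimate, the kind of IMU processing that could drive
# dynamic adjustment of a visual augmentation. Constants are illustrative.
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend integrated gyro rate (short-term) with accel angle (long-term)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

angle = 0.0
for _ in range(100):
    # stationary device: gyro reads 0 deg/s, accelerometer says tilt is 10 deg
    angle = complementary_filter(angle, 0.0, 10.0, dt=0.01)
# the estimate drifts toward the accelerometer's 10-degree reading
```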
Real-time tracking allows the AR system to adjust digital augmentations, ensuring that they remain accurately overlaid on the physical object. This dynamic interaction is essential for applications requiring precise alignment, such as virtual training environments, interactive educational tools, and industrial maintenance systems.
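The alignment step can be illustrated in two dimensions: each frame, the tracked object's current pose (translation plus rotation) is applied to the overlay's local geometry so the augmentation stays registered to the physical object. A real system would use a full 6-DoF pose and camera projection; this 2D version is a simplified sketch under that assumption.

```python
# Hypothetical sketch: keep a 2D overlay aligned with a tracked object by
# applying the object's current pose (translation + rotation) to the
# overlay's local corner points every frame. Poses here are illustrative.
import math

def apply_pose(points, tx, ty, theta):
    """Rotate local points by theta (radians), then translate by (tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + tx, s * x + c * y + ty) for x, y in points]

overlay_corners = [(-1, -1), (1, -1), (1, 1), (-1, 1)]
# object currently tracked at (100, 50), rotated 90 degrees
frame = apply_pose(overlay_corners, 100, 50, math.pi / 2)
```

Recomputing this transform from the latest tracking data each frame is what keeps the digital augmentation locked to the moving physical object.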