|01-2024||Context-aware Opportunistic Sensing for Indoor Navigation Environment (COSINE)|
|11-2023||DARPA Perceptually-enabled Task Guidance (PTG) program|
|12-2022||Mixed Reality Infrastructure Inspections|
|09-2022||Retraining Built Environment R|
|03-2022||Human-Swarm Interaction for th|
|03-2022||LaViola-IPA with VA|
|06-2021||Soldier Perception Laboratory|
|12-2020||RF: Peraton User Testing|
|08-2020||NRI: Collaborative Research: Sketching Geometry and Physics Informed Inference for Mobile Robot Manipulation in Cluttered Scenes|
|05-2019||Interactive Visualization in Support of Decision Making Under Uncertainty|
|04-2019||Combat Casualty Care Augmented Reality Intelligent Training System - C3ARESYS Phase II|
|08-2018||ARI: Improving Augmented Reality Technologies for Training and Education|
|05-2017||Exploring the Benefits of 3D Spatial IDEs|
|04-2016||CAREER: Mathematical Sketching: Pen-based Tools for Conceptual Understanding in Mathematics and Physics|
|04-2016||REU Supplement to CAREER: Mathematical Sketching: Pen-based Tools for Conceptual Understanding in Mathematics and Physics|
|09-2015||Physics Based Multi-Touch Movement Interface Creation for 3D Modeling and Simulation, Phase II|
|07-2015||SHF: Large: A Working Set Approach to Integrated Development Environments|
|06-2014||Major: Enhancing Creativity and Authoring in STEM Education-Based Virtual Worlds through Concept-Oriented Design|
|06-2014||REU Supplement to Major: Enhancing Creativity and Authoring in STEM Education-Based Virtual Worlds through Concept-Oriented Design|
|03-2014||IPA: Healthcare Informatics, Implementation, Long Term Care and Aging|
|12-2013||Feasibility for the development of a physics, navigation, and meta gestures API for training, simulations, and entertainment|
|10-2013||Dynamic 3D Stereo Visualization of Physics Concepts through a Hybrid Stylus Interface|
|09-2013||IPA: Extending Smart Home Technology for Cognitively Impaired Veterans to Delay Institutionalization (Part II)|
|08-2013||Naturalistic Operator Interface for Immersive Environments|
|01-2013||VR and Gaming Project Exploration|
|01-2013||Personalized Self-Efficacy Virtual Environment Recovery Experience (PERSEVERE)|
|12-2012||Realistic Full Body Interfaces for Locomotion and Communication in 3D Virtual Environments|
|09-2011||Prototyping Tools for Unobtrusive Mood Assessment|
Researchers at the University of Central Florida have developed a motion tracking method that does not depend on the external components conventional approaches rely upon, such as environmental markers used as reference points or pre-loaded models of the physical environment. The method uses four cameras, such as digital video cameras, arranged in a square so that each captures a series of images from its own quadrant of view. The images are used to compute successive positions and orientations of the object carrying the cameras, reducing the complexity of computing the object's three-dimensional motion. Applications include video game controllers; human-computer interaction input devices for scrolling, pointing, and tracking; and input devices for interacting with virtual environments.
The UCF method is an inside-out, vision-based tracking system built on cameras arranged in an orthogonal configuration of two opposing pairs. The camera arrangement moves with the object being tracked and can be mounted on a mobile platform for robotic applications such as tracking, localization, and mapping. A computing device receives a series of images from each camera and calculates successive positions for the object; the opposing-pair arrangement, in conjunction with polar correlation of optical flow, simplifies the computation of the object's three-dimensional motion.
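To illustrate why opposing camera pairs simplify motion estimation, consider a sketch of the underlying geometry (this is an illustrative decomposition, not the patented algorithm, and the function name and equal-known-depth assumption are ours): for two back-to-back cameras, a yaw rotation of the rig produces identical horizontal flow in each camera's own image coordinates, while a sideways translation produces opposite flow scaled by scene depth, so a sum and a difference separate the two.

```python
def decompose_pair(flow_a, flow_b, depth=1.0):
    """Separate yaw rotation from lateral translation using the mean
    horizontal optical flow (in radians/s) of two opposing cameras.

    Simplifying assumptions (illustrative only): both cameras see the
    scene at the same known depth, and flow is measured in each
    camera's own image coordinates.
    """
    # Rotation appears with the same sign in both images.
    yaw_rate = 0.5 * (flow_a + flow_b)
    # Translation appears with opposite signs, attenuated by depth.
    lateral_velocity = 0.5 * (flow_a - flow_b) * depth
    return yaw_rate, lateral_velocity
```

With a full orthogonal arrangement of two such pairs, the same sum-and-difference idea can be applied along both axes, which is one reason the four-camera configuration reduces the complexity of recovering three-dimensional motion.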
UCF researchers have developed a method that quickly generates realistic synthetic data, enabling gesture recognizers to significantly improve their accuracy. The new method, called Stochastic Resampling (SR), is computationally efficient, has minimal coding overhead, and does not require expert knowledge to implement. SR-generated synthetic samples also outperform those of competitive, state-of-the-art methods, namely Perlin Noise and the Sigma-Lognormal Model, in some cases reducing mean recognition errors by more than 70 percent.
SR selects random points along a 2D trajectory and scales the spaces between them to create realistic variations of a given sample. For example, given a hand-drawn circle represented as a time series of K points, SR resamples the series into a fixed number of N points along the series' path, with non-uniform path distances between points. The direction vector between each consecutive pair of points is then extracted and normalized to unit length. Next, the normalized direction vectors are concatenated to create a new set of N points, with the origin of the first vector at the center of the coordinate system. The resulting series can then be translated, scaled, skewed, rotated, and so forth, as necessary.
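The steps above can be sketched as follows (a minimal NumPy rendition; the function name, the `variance` parameter controlling how uneven the random spacing is, and the default values are our illustrative choices, not the published ones):

```python
import numpy as np

def stochastic_resample(points, n=32, variance=0.25, rng=None):
    """Create one synthetic variation of a 2D gesture path.

    points: (K, 2) array of x, y samples of the original stroke.
    n: number of resampled points.
    """
    rng = np.random.default_rng() if rng is None else rng
    pts = np.asarray(points, dtype=float)

    # Cumulative arc length along the original path.
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg)])
    total = cum[-1]

    # Draw n-1 random, non-uniform interval lengths that sum to the
    # total path length (the "stochastic" part of the resampling).
    intervals = rng.uniform(1.0 - variance, 1.0 + variance, n - 1)
    intervals = intervals / intervals.sum() * total
    targets = np.concatenate([[0.0], np.cumsum(intervals)])

    # Resample by linear interpolation at the random arc-length targets.
    xs = np.interp(targets, cum, pts[:, 0])
    ys = np.interp(targets, cum, pts[:, 1])
    resampled = np.stack([xs, ys], axis=1)

    # Normalize each between-point direction vector to unit length...
    dirs = np.diff(resampled, axis=0)
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

    # ...and concatenate them from the origin to form the new stroke,
    # which can afterwards be translated, scaled, or rotated as needed.
    return np.concatenate([[[0.0, 0.0]], np.cumsum(dirs, axis=0)])
```

Because the output is built from unit-length direction vectors, each synthetic stroke has uniform step size but randomly perturbed shape, which is what yields the realistic variations described above.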
Researchers at the University of Central Florida have invented an innovative, multi-sensory, interactive training system that realistically mimics wounds and provides constant, dynamic feedback to medical trainees as they treat the wounds. Almost like a real-life video game, the Tactile-Visual Wound (TVW) Simulation Unit portrays the look, feel, and even the smell of different types of human wounds (such as a puncture, stab, slice, or tear). It also tracks and analyzes a trainee's treatment responses and provides corrective instructions.
The TVW invention is a multi-sensory wound simulation unit. By combining several technologies, the invention provides an immersive experience for trainees. A TVW unit can include augmented reality software and a headset; sensors; actuators and markers integrated into a medical manikin; and a computer processor. An alternative configuration uses interactive moulage components affixed to a real person instead of a manikin. When activated, the unit's AR system continuously tracks the TVW, estimates the deformation of the wound over time, and monitors its response to treatment. For example, a trainee might see (via the AR glasses or headset) a projection that shows blood flowing out of the manikin's wound and vital signs "dropping." When the trainee applies pressure to the wound, sensors detect the action and wirelessly relay the data to the AR system. In response, the AR system renders (via computer graphics) an appropriate dynamic view of the blood loss slowing, and the physiological simulation reflects stabilized vitals. Real-time depth or other models of the trainee's hands, medical devices, and so on, can also affect the simulated visuals that the AR rendering system generates.
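The pressure-to-visuals feedback loop described above can be sketched as a simple simulation step (a toy model of our own devising; the starting volume, damping constant, and vitals formula are illustrative stand-ins, not values from the TVW system):

```python
from dataclasses import dataclass

@dataclass
class WoundState:
    """Hypothetical simulated wound, with illustrative defaults."""
    blood_volume_ml: float = 5000.0   # assumed starting blood volume
    bleed_rate_ml_s: float = 8.0      # assumed untreated bleed rate

def update(state, pressure_n, dt):
    """Advance the simulated wound by one timestep.

    pressure_n stands in for the force the manikin's sensors would
    relay wirelessly to the AR system; the 0.5 damping constant is
    purely illustrative.
    """
    # Applied pressure slows the simulated blood loss.
    effective_rate = state.bleed_rate_ml_s / (1.0 + 0.5 * pressure_n)
    state.blood_volume_ml -= effective_rate * dt
    # A crude vitals proxy the AR renderer could display.
    systolic_bp = 120.0 * (state.blood_volume_ml / 5000.0)
    return state, systolic_bp
```

In a real unit, the output of such a step would drive both the rendered blood-flow graphics and the displayed vital signs, closing the loop between the trainee's physical actions and the augmented view.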
The University of Central Florida invention, Code Park, is a novel tool for visualizing codebases in a 3D game-like environment. Code Park aims to make understanding an existing codebase engaging and fun, especially for novice users such as students. The UCF tool lays out the codebase in a 3D park-like environment, with each class in the codebase represented as a 3D room-like structure. Constituent parts of the class (variables, member functions, and so on) are laid out on the walls, resembling a syntax-aware "wallpaper." Users can interact with the codebase through an overview mode and a first-person viewer mode, and can also edit, compile, and run code in the environment. See the informational video: https://www.youtube.com/watch?v=LUiy1M9hUKU&t=3s.
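The class-to-room mapping can be illustrated with a toy layout pass (a stand-in of our own, not Code Park's actual implementation; the function name, grid layout, and room dimensions are all assumptions):

```python
import ast

def layout_rooms(source, room_size=10.0, gap=2.0):
    """Map each class in Python `source` to a 3D room on a grid.

    A toy stand-in for a Code Park-style layout step: one room per
    class, with the class's member functions recorded as the
    "wallpaper" text for its walls. Room size and spacing are
    illustrative values.
    """
    tree = ast.parse(source)
    classes = [n for n in ast.walk(tree) if isinstance(n, ast.ClassDef)]
    rooms = []
    for i, node in enumerate(classes):
        members = [b.name for b in node.body
                   if isinstance(b, (ast.FunctionDef, ast.AsyncFunctionDef))]
        rooms.append({
            "class": node.name,
            "wallpaper": members,  # member functions shown on the walls
            "position": (i % 4 * (room_size + gap),   # 4 rooms per row
                         0.0,
                         i // 4 * (room_size + gap)),
        })
    return rooms
```

A renderer could then place a room mesh at each position and texture its walls with the listed members, giving the park-like overview the tool describes.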
Partnering Opportunity: The research team is seeking partners for licensing, research collaboration, or both.