Research Terms
The taxonomy editor software promotes the efficient, seamless transfer of information between software programs by providing a graphical interface for assigning taxonomic relationships and conversions within a data set or native schema. It is an organizational tool that harnesses multiple functions and proprietary algorithms to automate creating, modifying, and exporting a taxonomy, ultimately enabling efficient data transfer to any design software. Bridge Information Modeling (BrIM) is an important trend in the highway transportation industry, in which a variety of technologies and software are used in all phases of the bridge lifecycle. However, most available software consists of stand-alone applications that do not exchange data efficiently with other programs. This lack of interoperability hinders the efficient, seamless transfer of information needed for bridge construction.
Researchers at the University of Florida have created a taxonomy editor software tool that automates building a taxonomy convertible into a software schema, such as the Industry Foundation Classes (IFC) standard, so that end users can design building projects with consistent semantics and logic regardless of which software they use.
A streamlined system for data input to create an organized taxonomy for bridge and other building applications
Multiple functions and proprietary algorithms automate the manual tasks associated with creating, modifying, and exporting a bridge taxonomy. The tool works with two input/output documents: a data set, essentially a dictionary of components used to populate the taxonomy, and the taxonomy itself. The resulting taxonomy can be exported to HTML and .xlsx formats, which can then be imported into other ontology programs or schemas, such as IFC. Software developers could use these ontologies as a common language, so any version of the software would yield the same output to the builder. The technology is not limited to buildings; it can be extended to essentially any application requiring taxonomies.
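The data-set-to-taxonomy workflow above can be sketched in a few lines. This is a minimal illustration, assuming a flat list of (component, category) records as the "data set" and a nested HTML list as the export target; the function names and data shapes are hypothetical, not the taxonomy editor's actual API.

```python
def build_taxonomy(data_set):
    """Group flat component records (the 'data set') into a taxonomy mapping."""
    taxonomy = {}
    for component, category in data_set:
        taxonomy.setdefault(category, []).append(component)
    return taxonomy

def export_html(taxonomy):
    """Render the taxonomy as a nested HTML list, one of the tool's export formats."""
    parts = ["<ul>"]
    for category in sorted(taxonomy):
        parts.append(f"<li>{category}<ul>")
        parts.extend(f"<li>{c}</li>" for c in sorted(taxonomy[category]))
        parts.append("</ul></li>")
    parts.append("</ul>")
    return "".join(parts)

# Hypothetical bridge components grouped into superstructure/substructure
data_set = [("Girder", "Superstructure"), ("Deck", "Superstructure"),
            ("Pier", "Substructure"), ("Abutment", "Substructure")]
tax = build_taxonomy(data_set)
html = export_html(tax)
```

A real data set would carry richer component attributes, and the export step would map categories onto IFC entity names rather than plain HTML tags, but the grouping-then-serialization pattern is the same.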
This multi-access edge computing system connects and coordinates signals from multiple smart devices to provide real-time monitoring of remote construction work sites. The U.S. construction industry, valued at $2.1 trillion, is one of the largest markets in the nation. However, it is also one of the least productive and operates in a high-risk environment. A major barrier construction projects face is limited access to high-speed internet and computing, which constrains on-site processing capabilities. Because work sites are dynamic and rapidly changing, hazard assessment is nearly impossible, and site monitoring and real-time communication are severely limited. The industry currently relies heavily on manual, time-consuming, and labor-intensive processes, yet it is an environment ripe for automation. Emerging information and communication technologies offer an opportunity to address these industry challenges and more.
Researchers at the University of Florida have developed a multi-access edge computing system that manages signals to and from various heterogeneous sensors, software, and technologies using an Internet of Things (IoT) architecture. The system, IoT-ACRES (IoT-Applied Construction Research and Education Services), uses machine learning to learn from new data and update upstream models. This enables safety risk analysis, site monitoring, and real-time updates on any aspect of a job site, joining the physical and virtual worlds by bringing computation and data together at the source.
Multi-access edge computing system using the Internet of Things (IoT) incorporates various heterogeneous sensors, technology, software, and AI to provide real-time construction site monitoring and feedback
This multi-access edge computing system utilizes the Internet of Things (IoT) to incorporate a multitude of heterogeneous solutions and provide construction site tracking. It comprises sensors plus local, edge, and cloud computing systems. The sensors collect data from the site and transmit it to the local computing system, which processes it and passes the results to the edge computing system. The edge computing system acts as an intermediary, processing data at an edge location of the local network at the construction site before sending the information to the cloud. The cloud computing system uses cloud servers and machine learning models to collect and compile the data processed by the edge computing systems. Integrating data from localization systems with other internet-enabled sources can provide valuable information on real-time safety risks and system optimization.
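The sensor-to-cloud pipeline described above can be sketched as three processing stages. This is an illustrative sketch only, assuming simple dictionary-shaped readings and an averaging-plus-threshold risk check; the stage functions, field names, and threshold are assumptions for illustration, not the IoT-ACRES implementation.

```python
def local_processing(raw_readings):
    """Local system: drop obviously invalid sensor readings before forwarding."""
    return [r for r in raw_readings if r["value"] is not None]

def edge_processing(readings):
    """Edge system: aggregate per-sensor averages near the site to cut upload volume."""
    grouped = {}
    for r in readings:
        grouped.setdefault(r["sensor"], []).append(r["value"])
    return {sensor: sum(vals) / len(vals) for sensor, vals in grouped.items()}

def cloud_processing(summaries, threshold=50.0):
    """Cloud system: compile edge summaries and flag potential safety risks."""
    return {sensor: ("ALERT" if avg > threshold else "ok")
            for sensor, avg in summaries.items()}

# Hypothetical noise and dust readings from a job site
raw = [{"sensor": "noise", "value": 80.0}, {"sensor": "noise", "value": 90.0},
       {"sensor": "dust", "value": 10.0}, {"sensor": "dust", "value": None}]
alerts = cloud_processing(edge_processing(local_processing(raw)))
```

The design point the sketch captures is that each stage reduces the data it forwards: the local system filters, the edge system summarizes, and only compact summaries reach the cloud, which matters when site connectivity is limited.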