Human Factors and Simulation Laboratory
The OU HFS Lab bridges human factors, operations research, and data analytics to study human performance in complex human-integrated systems.
NSF Project
Explore the NSF CAREER project on smart learning in multi-person virtual reality using multimodal analysis.
FAA Projects
Browse FAA-supported work on tower control, en route scanning, training, and workload analysis.
Welcome to Kang's lab
Researchers at the Human Factors and Simulation Laboratory study how people perceive, decide, learn, and perform inside complex socio-technical systems. The lab combines human factors, simulation, and computational analysis to discover patterns that can improve training, interface design, and operational safety.
Our projects often connect eye tracking, haptic interaction, brain activity, visualization, and predictive modeling. Application areas include air traffic control, virtual reality learning, weather forecasting, healthcare systems, marketing, and offshore operations.
- Human factors
- Operations research
- Data analytics
- Eye tracking
- Virtual reality
- Multimodal analysis
Research support
- National Science Foundation
- Federal Aviation Administration
- National Academy of Sciences
- Microsoft Corporation
We develop automated or semi-automated algorithms to characterize and cluster human behaviors, especially from spatio-temporal data such as eye movement networks.
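As a minimal illustration of this idea (a sketch under assumptions, not the lab's actual algorithm), one common way to make scanpaths comparable is to represent each fixation sequence as a transition matrix over areas of interest (AOIs) and then measure distances between matrices, so that similar viewing strategies end up close together. The AOI names and helper functions below are hypothetical:

```python
def transition_matrix(scanpath, aois):
    """Row-normalized AOI-to-AOI transition matrix from one fixation sequence."""
    idx = {a: i for i, a in enumerate(aois)}
    n = len(aois)
    counts = [[0.0] * n for _ in range(n)]
    # Count consecutive fixation pairs (src -> dst).
    for src, dst in zip(scanpath, scanpath[1:]):
        counts[idx[src]][idx[dst]] += 1
    # Normalize each row to transition probabilities.
    for row in counts:
        total = sum(row)
        if total:
            row[:] = [c / total for c in row]
    return counts

def behavior_distance(m1, m2):
    """Euclidean distance between two flattened transition matrices."""
    return sum((a - b) ** 2
               for r1, r2 in zip(m1, m2)
               for a, b in zip(r1, r2)) ** 0.5

# Hypothetical AOIs and scanpaths for two participants.
aois = ["radar", "strip", "comms"]
p1 = transition_matrix(["radar", "strip", "radar", "comms", "radar"], aois)
p2 = transition_matrix(["radar", "comms", "comms", "strip"], aois)
d = behavior_distance(p1, p2)
```

The resulting distances can feed any standard clustering method (e.g., k-means or hierarchical clustering) to group participants by scanning strategy.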
The lab is a strong fit for students interested in programming, statistics, experimentation, data analysis, and human-centered system design.
News
- 1 Oct 2021: Aerospace Paper Published on Eye Movements and Pilot Fatigue. In October 2021, the paper “Multimodal analysis of eye movements and fatigue in a simulated glass cockpit environment” was published in Aerospace. The study examines how multimodal measures can help characterize fatigue and interaction patterns in cockpit tasks, extending the lab’s work...
- 1 Jul 2021: Aerospace Paper Published on Expert En Route Controller Strategies. In July 2021, the lab published the paper “Visual search and conflict mitigation strategies used by expert En Route air traffic controllers” in Aerospace. The paper highlights how expert controllers search complex radar displays and manage conflict mitigation in dynamic environments, continuing...
- 15 Aug 2020: NSF CAREER Project on Multi-person VR Smart Learning Begins. On August 15, 2020, Dr. Ziho Kang’s NSF CAREER project officially began. The project, titled “Non-text-based smart learning in fully immersive multi-person virtual reality using multimodal analysis of physiological measures,” studies how eye movement characteristics, haptic interactions, and brain activities can be...
NSF CAREER Award
Dr. Ziho Kang received an NSF CAREER award for research on non-text-based smart learning in fully immersive multi-person virtual reality using near real-time multimodal analysis of physiological measures.
The project studies how eye movement characteristics, haptic interactions, and brain activities can be analyzed in near real time to predict engagement and support adaptive scaffolding in immersive learning environments.