Human Factors and Simulation Laboratory

The OU HFS Lab bridges human factors, operations research, and data analytics to study human performance in complex human-integrated systems.

NSF Project

Explore the NSF CAREER project on smart learning in multi-person virtual reality and multimodal analysis.

View NSF project
FAA Projects

Browse FAA-supported work on tower control, en route scanning, training, and workload analysis.

View FAA projects
Overview

Welcome to Dr. Kang's lab

Researchers at the Human Factors and Simulation Laboratory study how people perceive, decide, learn, and perform inside complex socio-technical systems. The lab combines human factors, simulation, and computational analysis to discover patterns that can improve training, interface design, and operational safety.

Our projects often connect eye tracking, haptic interaction, brain activity, visualization, and predictive modeling. Application areas include air traffic control, virtual reality learning, weather forecasting, healthcare systems, marketing, and offshore operations.

  • Human factors
  • Operations research
  • Data analytics
  • Eye tracking
  • Virtual reality
  • Multimodal analysis
Funding

Research support

  • National Science Foundation
  • Federal Aviation Administration
  • National Academy of Sciences
  • Microsoft Corporation
What Makes the Lab Distinct

We develop automated or semi-automated algorithms to characterize and cluster human behaviors, especially from spatio-temporal data such as eye movement networks.
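As a minimal sketch of this idea (the scanpath data, function names, and the choice of k-means here are all hypothetical illustrations, not the lab's actual pipeline), one common way to cluster eye movement behavior is to encode each fixation sequence over areas of interest (AOIs) as a transition matrix, then group participants whose matrices are similar:

```python
import numpy as np

def transition_matrix(scanpath, n_aois):
    """Row-normalized AOI-to-AOI transition frequencies for one fixation sequence."""
    m = np.zeros((n_aois, n_aois))
    for a, b in zip(scanpath[:-1], scanpath[1:]):
        m[a, b] += 1
    row_sums = m.sum(axis=1, keepdims=True)
    # Avoid division by zero for AOIs that were never a transition source.
    return np.divide(m, row_sums, out=np.zeros_like(m), where=row_sums > 0)

def kmeans(X, k, iters=50):
    """Tiny k-means with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Hypothetical scanpaths over three AOIs (0-2), e.g. radar sectors on a display.
scanpaths = [
    [0, 1, 0, 1, 0, 1],   # oscillates between AOIs 0 and 1
    [0, 1, 0, 1, 1, 0],
    [2, 2, 2, 1, 2, 2],   # dwells mostly on AOI 2
    [2, 2, 1, 2, 2, 2],
]
X = np.array([transition_matrix(s, 3).ravel() for s in scanpaths])
labels = kmeans(X, k=2)   # groups the two oscillating and the two dwelling scanpaths
```

Real eye movement network analyses typically use richer representations (dwell times, saccade amplitudes, graph metrics), but the transition-matrix encoding above captures the spatio-temporal structure that simple clustering can act on.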

Student Preparation

The lab is a strong fit for students interested in programming, statistics, experimentation, data analysis, and human-centered system design.

News

  • 1 Oct 2021
    Aerospace Paper Published on Eye Movements and Pilot Fatigue
    In October 2021, the paper “Multimodal analysis of eye movements and fatigue in a simulated glass cockpit environment” was published in Aerospace. The study examines how multimodal measures can help characterize fatigue and interaction patterns in cockpit tasks, extending the lab’s work...
  • 1 Jul 2021
    Aerospace Paper Published on Expert En Route Controller Strategies
    In July 2021, the lab published the paper “Visual search and conflict mitigation strategies used by expert En Route air traffic controllers” in Aerospace. The paper highlights how expert controllers search complex radar displays and manage conflict mitigation in dynamic environments, continuing...
  • 15 Aug 2020
    NSF CAREER Project on Multi-person VR Smart Learning Begins
    On August 15, 2020, Dr. Ziho Kang’s NSF CAREER project officially began. The project, titled “Non-text-based smart learning in fully immersive multi-person virtual reality using multimodal analysis of physiological measures,” studies how eye movement characteristics, haptic interactions, and brain activities can be...
Dr. Ziho Kang
Featured Project

NSF CAREER Award

Dr. Ziho Kang received an NSF CAREER award for research on non-text-based smart learning in fully immersive multi-person virtual reality using near real-time multimodal analysis of physiological measures.

The project studies how eye movement characteristics, haptic interactions, and brain activities can be analyzed in near real time to predict engagement and support adaptive scaffolding in immersive learning environments.

See funding history or read more about research areas.