How Multimodal Sensing Is Redefining Workplace Safety Analysis
New research shows how touch and vision strengthen ergonomic risk assessment
A new study from tactile-sensing specialist PPS and Purdue University marks a shift in how safety teams evaluate manual lifting tasks, presenting a data-rich alternative to long-standing observational methods. Published in October 2025 on ScienceDirect, the research demonstrates how combining tactile and visual inputs captures the subtleties of human movement with greater precision than posture analysis alone.
Tactile sensing has steadily expanded across robotics, healthcare and automation, yet its application in ergonomics has remained limited. Purdue’s study brings the technology directly into the safety domain. Using PPS’s TactileGlove, which records detailed pressure distribution through the hand, alongside a computer-vision posture model, the research team created a unified dataset that reveals how force shifts as a worker lifts, carries or adjusts to changing load height.
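The article does not describe the team's data pipeline in detail, but the core idea of fusing two asynchronous streams into one dataset can be sketched in a few lines of Python. Everything below is illustrative: the sampling rates, the 65-taxel pressure layout, the trunk-flexion signal, and the function names are placeholders standing in for TactileGlove output and a posture model, not the study's actual formats.

```python
# Illustrative sketch: align hypothetical tactile-pressure frames with
# pose-estimation frames by timestamp so each instant carries both modalities.
# All shapes, rates, and signals here are assumptions, not the study's pipeline.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tactile stream: 100 Hz, 65 pressure taxels per frame.
tactile_t = np.arange(0.0, 5.0, 0.01)                  # timestamps (s)
tactile_p = rng.random((tactile_t.size, 65)) * 50.0    # pressure (kPa, placeholder)

# Hypothetical vision stream: 30 Hz trunk-flexion angle from a posture model.
pose_t = np.arange(0.0, 5.0, 1.0 / 30.0)
trunk_flexion_deg = 20.0 + 15.0 * np.sin(pose_t)       # placeholder angles

def fuse(tactile_t, tactile_p, pose_t, pose_angle):
    """Pair each pose frame with the nearest-in-time tactile frame and
    summarize hand loading as total and peak pressure at that instant."""
    idx = np.clip(np.searchsorted(tactile_t, pose_t), 1, tactile_t.size - 1)
    left, right = tactile_t[idx - 1], tactile_t[idx]
    idx = np.where(pose_t - left <= right - pose_t, idx - 1, idx)  # pick nearer frame
    total_pressure = tactile_p[idx].sum(axis=1)    # overall grip-load proxy
    peak_pressure = tactile_p[idx].max(axis=1)     # localized loading proxy
    return np.column_stack([pose_t, pose_angle, total_pressure, peak_pressure])

fused = fuse(tactile_t, tactile_p, pose_t, trunk_flexion_deg)
# Each row: [time_s, trunk_flexion_deg, total_pressure_kPa, peak_pressure_kPa]
print(fused[:3])
```

A unified record of this kind is what lets analysts see how hand pressure rises and falls against posture over the course of a lift, rather than scoring the two in isolation.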
“For decades, observational checklists and self-reported discomfort have been the main tools for evaluating injury risk,” explains Denny Yu, co-author and Purdue University researcher. “These methods record what can be seen, not what is felt: they measure body angles and load weights but overlook how pressure flows through the hands, wrists and forearms as a task unfolds. Tactile sensing can show the micro-interactions between human touch and physical effort.”