Valence
Smart Industry
Virtual Reality
Cognitive ergonomics
Human-Robot collaboration
How can we measure and improve the (cognitive) ergonomics of human-robot collaboration during assembly work?
With this project we aim to develop:
1. An ergonomic model (digital twin) of human-cobot interaction for industrial (assembly) contexts
2. Adaptive, self-learning cobot control based on this ergonomic model and fueled by interaction data from different contexts (see the sketch after this list):
   - Automatically generated/captured (video) data from different operators and contexts
   - A predictive control model focusing on HRI assembly tasks
3. Remote training/simulation/control for operators and (future) assembly lines
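To make aim 2 more concrete, the sketch below shows one possible shape of such an adaptive loop: an ergonomic model estimates the operator's cognitive load from interaction data, and the cobot adjusts its working pace accordingly. All names (InteractionSample, ErgonomicModel, CobotController), the hand-tuned weights, and the 0-1 load score are hypothetical illustrations under assumed data fields, not project deliverables.

```python
from dataclasses import dataclass

@dataclass
class InteractionSample:
    """One time step of captured interaction data (hypothetical fields)."""
    gaze_switches_per_min: float   # frequent target switching may signal hesitation
    mean_grasp_delay_s: float      # slow reach-to-grasp may signal doubt or fatigue
    errors_last_min: int           # assembly errors as a risk/load proxy

class ErgonomicModel:
    """Placeholder for the ergonomic 'digital twin': maps interaction data
    to an estimated cognitive load score in [0, 1]."""
    def estimate_load(self, s: InteractionSample) -> float:
        # Hand-tuned linear proxy purely for illustration; the real model
        # would be learned from multi-operator, multi-context data.
        score = (0.02 * s.gaze_switches_per_min
                 + 0.3 * s.mean_grasp_delay_s
                 + 0.1 * s.errors_last_min)
        return max(0.0, min(1.0, score))

class CobotController:
    """Adapts the cobot's pace to the estimated operator load."""
    def __init__(self, base_speed: float = 1.0):
        self.speed = base_speed

    def adapt(self, load: float) -> None:
        # Slow down when the operator appears overloaded, speed up otherwise.
        self.speed = max(0.2, 1.0 - 0.8 * load)

model, cobot = ErgonomicModel(), CobotController()
sample = InteractionSample(gaze_switches_per_min=25, mean_grasp_delay_s=0.8, errors_last_min=1)
load = model.estimate_load(sample)
cobot.adapt(load)
print(f"estimated load: {load:.2f}, cobot speed factor: {cobot.speed:.2f}")
```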
At imec-mict-UGent we focus on using our ExperienceDNA framework to determine cognitive/psychological markers, exploring HMD eye tracking, hand tracking and object interactions to measure hesitation, doubt, load, fatigue and risk behavior. In addition, we aim to track these markers during human-robot collaboration to build models of the cognitive interaction between robot and human.
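As an illustration of what such a marker could look like computationally, the sketch below derives a simple hesitation proxy from HMD gaze and grasp timestamps: the longer an operator fixates a part before actually grasping it, the higher the score. The function name, normalization threshold and data layout are assumptions for illustration only, not the ExperienceDNA API.

```python
import numpy as np

def hesitation_proxy(fixation_onsets_s, grasp_times_s, max_latency_s=3.0):
    """Hypothetical marker: mean gaze-to-grasp latency, normalized to [0, 1].

    fixation_onsets_s : times at which the operator first fixated each part
    grasp_times_s     : times at which the same parts were actually grasped
    """
    latencies = np.asarray(grasp_times_s) - np.asarray(fixation_onsets_s)
    latencies = np.clip(latencies, 0.0, max_latency_s)   # discard implausible values
    return float(latencies.mean() / max_latency_s)       # 0 = immediate grasp, 1 = maximal hesitation

# Example: three parts, grasped 0.4 s, 1.9 s and 0.7 s after first fixation
print(hesitation_proxy([10.0, 15.2, 21.0], [10.4, 17.1, 21.7]))  # ~0.33
```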
Partners: BruBotics, IDLab Antwerp
Duration: 2 years (extendable)
Contact: Jelle Saldien