
Recent Awards


Maria Kyrarini of the Electrical and Computer Engineering Department has received a $442,383 award from the National Science Foundation to support her project "NSF-SNSF: VR-HRC: Virtual Reality-based Multi-Human-Multi-Robot Collaboration in Industrial Environments."

Collaborative robots (cobots) are being deployed in industrial environments to support workers in repetitive or heavy-lifting tasks. Cobots typically ship with graphical user interfaces (GUIs) and simulation software, making them easier for non-experts to program. While this makes programming a single robot straightforward, it may not be intuitive for users to program multiple cobots that must collaborate with workers in a factory setting.

Consider an automotive factory with an assembly line. Multiple robots may be required for the assembly process, and some of them may work alongside humans, helping to lift a windscreen or handing over tools and other parts as needed. The humans who teach the cobots their tasks will need to understand the process in three-dimensional space, while the cobots will need to understand what the humans do around the workspace and adapt accordingly. In this project, we will introduce an innovative virtual reality (VR) framework that facilitates collaboration between humans and cobots.

This groundbreaking system will empower non-expert users to interact with cobots using hand gestures, eye gaze, and speech in a physics-based VR simulation environment, thereby simplifying the process of teaching them industrial tasks. To address the cognitive demands of teaching multiple cobots, we will develop intent-detection methods that translate multimodal human data and VR environment data into cobot actions. Furthermore, during collaborative tasks, the human can demonstrate the expected behavior to the cobot, enhancing the system's usability. Generative AI approaches will also be exploited, merging robot and human data to train general robot behaviors. Such models will be deployed to the robot for human-like task execution, and they can be fine-tuned for partially different operating conditions or similar new tasks. Human-in-the-loop approaches (e.g., active preference learning) will also be used to refine these behaviors based on human feedback.
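To give a flavor of the active preference learning mentioned above, the following is a minimal, purely illustrative sketch (not the project's actual method): a human is shown pairs of candidate robot trajectories and picks the preferred one, a linear reward model is fit to those pairwise choices with a Bradley-Terry (logistic) update, and the next query is chosen actively as the pair the model is least certain about. The two-dimensional feature space, the hidden "human" preference, and all function names are assumptions made up for this toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

def preference_prob(w, fa, fb):
    """P(human prefers trajectory a over b) under a Bradley-Terry model."""
    return 1.0 / (1.0 + np.exp(-(w @ (fa - fb))))

def update(w, fa, fb, preferred_a, lr=0.5):
    """One logistic-regression gradient step on a single pairwise preference."""
    y = 1.0 if preferred_a else 0.0
    p = preference_prob(w, fa, fb)
    return w + lr * (y - p) * (fa - fb)

def most_informative_pair(w, candidates):
    """Active query: pick the pair whose outcome the current model is least sure of."""
    best, best_gap = None, np.inf
    for i in range(len(candidates)):
        for j in range(i + 1, len(candidates)):
            gap = abs(preference_prob(w, candidates[i], candidates[j]) - 0.5)
            if gap < best_gap:
                best, best_gap = (i, j), gap
    return best

# Toy simulation: the simulated "human" secretly prefers trajectories with
# low values of both features (e.g., jerk and path length).
true_w = np.array([-1.0, -2.0])   # hidden human preference weights
w = np.zeros(2)                   # learned reward weights
for _ in range(30):
    candidates = rng.normal(size=(6, 2))   # feature vectors of 6 sampled trajectories
    i, j = most_informative_pair(w, candidates)
    human_prefers_i = true_w @ candidates[i] > true_w @ candidates[j]
    w = update(w, candidates[i], candidates[j], human_prefers_i)

print(w)  # after a few queries, both weights should have the hidden negative sign
```

The active-query step is what keeps the number of questions asked of the human small, which matters when feedback is collected from workers during real collaborative tasks.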

Past Awards