TY - GEN
T1 - OCOsense Glasses – Monitoring Facial Gestures and Expressions for Augmented Human-Computer Interaction
T2 - OCOsense Glasses for Monitoring Facial Gestures and Expressions
AU - Gjoreski, Hristijan
AU - Mavridou, Ifigeneia
AU - Archer, James William
AU - Cleal, Andrew
AU - Stankoski, Simon
AU - Kiprijanovska, Ivana
AU - Fatoorechi, Mohsen
AU - Walas, Piotr
AU - Broulidakis, John
AU - Gjoreski, Martin
AU - Nduka, Charles
PY - 2023/4
Y1 - 2023/4
N2 - The paper presents the OCOsense™ smart glasses system, which recognizes and monitors facial gestures and expressions by using non-contact optomyographic OCO™ sensors and an IMU placed inside the frames of the glasses. The glasses stream the sensor data via Bluetooth to a mobile device, where data-fusion algorithms are applied to recognize facial gestures and expressions in real time. The recognized gestures and expressions are then used as input to interact with the mobile device. We will demonstrate how the system is used in practice: a participant will wear the OCOsense™ glasses and will interact with the mobile device by performing facial gestures and expressions. Three use cases will be presented: video control, call control, and game control. We believe that the OCOsense™ glasses are the next generation in wearables, which will allow for a better understanding of the user's context and emotional state and will enable numerous ways to interact with smart devices and computer systems, even within Augmented and Extended Reality environments. Future versions of the system can be used in a variety of domains, including affective computing, remote mental-health monitoring, and hands-free human-computer interaction, thus improving the accessibility and inclusivity of future technologies.
AB - The paper presents the OCOsense™ smart glasses system, which recognizes and monitors facial gestures and expressions by using non-contact optomyographic OCO™ sensors and an IMU placed inside the frames of the glasses. The glasses stream the sensor data via Bluetooth to a mobile device, where data-fusion algorithms are applied to recognize facial gestures and expressions in real time. The recognized gestures and expressions are then used as input to interact with the mobile device. We will demonstrate how the system is used in practice: a participant will wear the OCOsense™ glasses and will interact with the mobile device by performing facial gestures and expressions. Three use cases will be presented: video control, call control, and game control. We believe that the OCOsense™ glasses are the next generation in wearables, which will allow for a better understanding of the user's context and emotional state and will enable numerous ways to interact with smart devices and computer systems, even within Augmented and Extended Reality environments. Future versions of the system can be used in a variety of domains, including affective computing, remote mental-health monitoring, and hands-free human-computer interaction, thus improving the accessibility and inclusivity of future technologies.
U2 - 10.1145/3544549.3583918
DO - 10.1145/3544549.3583918
M3 - Conference contribution
SP - 1
EP - 4
BT - CHI EA '23: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems
ER -