Advancing remote healthcare using humanoid and affective systems

Utkarsh Tripathi, Rittvik Saran J., Vinay Chamola*, Alireza Jolfaei, Ananthakrishna Chintanpalli

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

15 Citations (Scopus)

Abstract

Social distancing and remote work have become more prevalent in the post-COVID world, and demand for remote healthcare sessions has grown alongside them. Although a growing number of such sessions now use online platforms as the medium of communication, critical signals such as the participant's affective state, along with other feedback opportunities, are lost during the transmission of this digital information. This paper presents a solution that leverages a brain-computer interface (BCI) system for affective feedback and a humanoid robot for teaching effectively during remote sessions. The solution uses a Kinect as the sensing mechanism for the trainer and state-of-the-art deep learning algorithms at the back end to infer the emotional state of the trainee. The training poses (from the humanoid's camera feed and the Kinect) are estimated using AlphaPose and compared using inverse kinematics. To ascertain the trainee's state (high valence and arousal vs. low valence and arousal), a Capsule Network was used, giving an average classification accuracy of 90.4% with a low average inference time of 14.3 ms on the publicly available DREAMER and AMIGOS datasets. The system also allows real-time communication through the humanoid, making the experience even more distinctive for the trainee.
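The abstract does not detail the Capsule Network architecture, but the core mechanism of any capsule layer is dynamic routing-by-agreement with the "squash" nonlinearity (Sabour et al.). The sketch below is a generic NumPy illustration of that routine, not the paper's implementation; the array shapes and iteration count are assumptions chosen for clarity.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squash nonlinearity: shrinks short vectors toward 0 and long
    # vectors toward (but never reaching) unit length, so a capsule's
    # output norm can be read as a probability.
    sq = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq / (1.0 + sq)) * s / np.sqrt(sq + eps)

def dynamic_routing(u_hat, n_iter=3):
    # u_hat: (num_in, num_out, dim_out) prediction vectors from the
    # lower capsule layer. Returns (num_out, dim_out) output capsules.
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))          # routing logits
    for _ in range(n_iter):
        # Coupling coefficients: softmax over output capsules per input.
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)
        s = (c[..., None] * u_hat).sum(axis=0)   # weighted vote sum
        v = squash(s)
        # Increase logits where predictions agree with the output.
        b = b + (u_hat * v[None]).sum(axis=-1)
    return v
```

In a valence/arousal classifier of the kind the abstract describes, the final layer would hold one capsule per class, and the class whose output vector has the largest norm would be the prediction.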


Original language: English
Pages (from-to): 17606-17614
Number of pages: 9
Journal: IEEE Sensors Journal
Volume: 22
Issue number: 18
DOIs
Publication status: Published - 15 Sept 2022
