TrackEd: An emotion tracking tool for e-meeting platforms

Jamie McGrath, Nonso Nnamoko

Research output: Contribution to journal › Article (journal) › peer-review

2 Citations (Scopus)
306 Downloads (Pure)

Abstract

Online classrooms have seen an unprecedented spike in demand since the COVID-19 pandemic. This presents a challenge for educators, who can no longer observe students’ body language as easily as in a face-to-face setting. This paper presents TrackEd, a tool based on emotion classification that detects body language in the form of facially expressed emotions. The tool extracts image frames of attendees from a live online meeting stream and tracks their facial emotions. This is facilitated by a convolutional neural network model trained on 40,254 images categorised into seven facial expressions. The complete source code is freely available on GitHub under the name ‘TrackEd’.
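The pipeline described above (frame capture, face localisation, seven-class CNN emotion classification) can be illustrated with a minimal Python sketch. This is not the TrackEd implementation itself: the model file name, the 48×48 greyscale input size, the label order, and the use of OpenCV with a Haar cascade and a Keras model are all assumptions made for illustration.

```python
# Minimal sketch of the described pipeline: grab frames from a video stream,
# crop detected faces, and classify each face into one of seven emotions with
# a pre-trained CNN. Model path, input size and label order are assumptions,
# not taken from the TrackEd source code.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Assumed label order for the seven facial expression classes.
EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

model = load_model("emotion_cnn.h5")  # hypothetical CNN trained on 7 classes
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # 0 = webcam; a captured meeting window would replace this
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(grey, 1.3, 5):
        # Crop, resize and normalise the face region for the CNN.
        face = cv2.resize(grey[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
        label = EMOTIONS[int(np.argmax(probs))]
        # Overlay the predicted emotion on the frame.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("Emotion tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

A production tool would replace the webcam capture with frames taken from the e-meeting platform and would typically aggregate the per-frame predictions over time rather than acting on single frames.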
Original language: English
Article number: 100560
Pages (from-to): 1-5
Journal: Software Impacts
Volume: 17
Early online date: 5 Aug 2023
DOIs
Publication status: Published - 11 Aug 2023

Keywords

  • E-meeting platform
  • Affective computing
  • Facial expression recognition
  • Deep learning
  • Online classroom
  • Emotion detection
  • Artificial intelligence

Research Centres

  • Data and Complex Systems Research Centre

Research Groups

  • SustainNET
