Abstract
Online classrooms have seen an unprecedented spike in demand since the COVID-19 pandemic. This presents a challenge for educators, who can no longer observe students’ body language as easily as in a face-to-face setting. This paper presents TrackEd, an emotion-classification tool that detects body language in the form of facially expressed emotions. The tool extracts image frames of attendees from a live online meeting stream and tracks their facial emotions. This is facilitated by a convolutional neural network model trained on 40,254 images categorised into seven facial expressions. The complete source code is freely available on GitHub under the name ‘TrackEd’.
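As a rough illustration of the pipeline the abstract describes (frame extraction from a video stream followed by CNN-based facial expression classification), the sketch below shows a minimal capture-and-classify loop in Python using OpenCV and Keras. The model file name, the 48×48 grayscale input size, and the seven FER-style emotion labels are assumptions made for illustration and are not taken from the TrackEd source code.

```python
# Minimal sketch of a frame-capture-and-classify loop; model path, input
# size and label order are assumptions, not the TrackEd implementation.
import cv2
import numpy as np
import tensorflow as tf

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]  # assumed label order

model = tf.keras.models.load_model("emotion_cnn.h5")  # hypothetical trained CNN
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

capture = cv2.VideoCapture(0)  # stand-in for the live meeting stream
while capture.isOpened():
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect each attendee's face and classify its expression
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
        print(EMOTIONS[int(np.argmax(probs))])  # per-face emotion label
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
capture.release()
```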
| Original language | English |
|---|---|
| Article number | 100560 |
| Pages (from-to) | 1-5 |
| Journal | Software Impacts |
| Volume | 17 |
| Early online date | 5 Aug 2023 |
| DOIs | |
| Publication status | Published - 11 Aug 2023 |
Keywords
- E-meeting platform
- Affective computing
- Facial expression recognition
- Deep learning
- Online classroom
- Emotion detection
- Artificial intelligence
Research Centres
- Data and Complex Systems Research Centre
Research Groups
- SustainNET