In Affective Computing, different modalities, such as speech, facial expressions, physiological signals, smartphone usage patterns, and their combinations, are applied to detect the affective states of a user. In this project, we explore different challenges of developing a smartphone-based emotion detection application, such as modality selection, the influence of the Experience Sampling Method (ESM) on emotion detection, and simplifying self-report collection.
Keystroke analysis, i.e. the study of typing behavior on desktop computers, has been found to be an effective modality for emotion detection because of its reliability, non-intrusiveness, and low resource overhead. As smartphones proliferate, typing behavior on smartphones presents an equally powerful modality for emotion detection. It has the added advantage of enabling in-situ experiments with better coverage than experiments using desktop computer keyboards. As a part of this project, we explore the efficacy of smartphone typing to detect multiple affective states. We design, develop, and evaluate an Android-based application, TapSense, to determine multiple emotion states from typing activity on smartphones. It has been developed by researchers from the Complex Network Research Group (CNeRG) of IIT Kharagpur, India, in collaboration with researchers from Georgia Southern University, USA.
PUBLICATIONS
Surjya Ghosh, Shivam Goenka, Niloy Ganguly, Bivas Mitra and Pradipta De; Representation Learning for Emotion Recognition from Smartphone Keyboard Interactions, 8th International Conference on Affective Computing & Intelligent Interaction (ACII 2019), Cambridge, UK, Sep 2019. (Accepted)
Surjya Ghosh, Kaustubh Hiware, Niloy Ganguly, Bivas Mitra and Pradipta De; Emotion Detection from Touch Interactions during Text Entry on Smartphones, International Journal of Human-Computer Studies, Elsevier, 2019. (Impact Factor: 2.3) [Article]
Surjya Ghosh, Niloy Ganguly, Bivas Mitra and Pradipta De; Designing An Experience Sampling Method for Smartphone based Emotion Detection, IEEE Transactions on Affective Computing, 2019. [Early Access]
Surjya Ghosh, Kaustubh Hiware, Niloy Ganguly, Bivas Mitra and Pradipta De; Does Emotion Influence the Use of Auto-suggest during Smartphone Typing?, 24th International Conference on Intelligent User Interfaces (ACM IUI 2019), Los Angeles, USA, Mar 2019. [PDF] [PPT]
Surjya Ghosh, Sumit Sahu, Niloy Ganguly, Bivas Mitra and Pradipta De; EmoKey: An Emotion-aware Smartphone Keyboard for Mental Health Monitoring, 11th International Conference on Communication Systems and Networks (COMSNETS 2019), Bangalore, India, Jan 2019 (Poster). [PDF] [Poster]
Surjya Ghosh, Niloy Ganguly, Bivas Mitra and Pradipta De; Effectiveness of Deep Neural Network Model in Typing-based Emotion Detection on Smartphones, 24th Annual International Conference on Mobile Computing and Networking (Mobicom 2018), New Delhi, India, Oct 2018 (Poster). [PDF] [Poster]
Surjya Ghosh, Niloy Ganguly, Bivas Mitra and Pradipta De; Evaluating Effectiveness of Smartphone Typing as an Indicator of User Emotion, 7th International Conference on Affective Computing and Intelligent Interaction (ACII 2017), San Antonio, Texas, USA, Oct 2017. [PDF] [PPT]
Surjya Ghosh, Niloy Ganguly, Bivas Mitra and Pradipta De; TapSense: Combining Self-Report Patterns and Typing Characteristics for Smartphone based Emotion Detection, MobileHCI, Vienna, Austria, Sep 2017. [PDF] [PPT]
Surjya Ghosh; Emotion-aware Computing using Smartphone, COMSNETS 2017 (PhD Forum), Bangalore, India. [PDF]
Surjya Ghosh, Niloy Ganguly, Bivas Mitra and Pradipta De; Towards Designing an Intelligent Experience Sampling Method for Emotion Detection, IEEE CCNC 2017, Las Vegas, USA. [PDF] [PPT]
Surjya Ghosh, Vatsalya Chauhan, Niloy Ganguly, Bivas Mitra and Pradipta De; Impact of Experience Sampling Methods on Tap Pattern based Emotion Recognition, 4th ACM Workshop on Mobile Systems for Computational Social Science - MCSS (Ubicomp.15) Osaka, Japan. [PDF] [PPT]
RESULTS
We performed an online survey via Facebook among 120 participants of different age groups (18 to 50 years). The survey revealed several interesting findings about individual typing patterns on smartphones. It shows that 56% of the participants spend more than half an hour daily typing on their smartphone, and 27% spend at least an hour daily. It also reveals that, among the typing-intensive applications, WhatsApp is the most used, while texting and browsing are used rarely.
We also find that different typing cues, such as typing speed, typing mistakes, usage of special characters, and typing duration, correlate well with changes in emotion states.
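The typing cues above can be sketched as per-session features computed from timestamped keystrokes. This is a minimal illustration, not TapSense's actual feature set; the event encoding (a `(timestamp_seconds, character)` pair, with backspace marked as `"<BKSP>"`) and the feature names are assumptions made for the example.

```python
from statistics import mean

def session_features(events):
    """Illustrative per-session typing features from a list of
    (timestamp_seconds, character) keystroke events."""
    chars = [c for _, c in events]
    times = [t for t, _ in events]
    duration = times[-1] - times[0] if len(times) > 1 else 0.0
    gaps = [b - a for a, b in zip(times, times[1:])]
    return {
        # characters typed per second over the session
        "typing_speed": (len(chars) / duration) if duration else 0.0,
        # backspaces as a crude proxy for typing mistakes
        "mistake_rate": chars.count("<BKSP>") / len(chars),
        # punctuation/symbols as "special character" usage
        "special_char_ratio": sum(not c.isalnum() and c != "<BKSP>"
                                  for c in chars) / len(chars),
        # mean pause between consecutive keystrokes
        "mean_inter_key_delay": mean(gaps) if gaps else 0.0,
    }

# tiny hypothetical session: "hi", one correction, then "!"
features = session_features(
    [(0.0, "h"), (0.2, "i"), (0.5, "<BKSP>"), (0.8, "!")]
)
```

Features like these would then be fed, per session, to a classifier trained against the self-reported emotion labels.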
We conducted a 3-week in-the-wild study involving 22 participants (20 male, 2 female). During this study, we collected close to 135 hours of typing data spanning 2,700 typing sessions. We analyze the collected data to find out how accurately TapSense can determine multiple emotion states without much resource overhead.
We obtain an average accuracy (AUCROC) of 84% (standard deviation 6%), while the maximum AUCROC is 94%. The emotion states are identified with precision between 67% and 75%, and recall between 57% and 80%. We observe that the relaxed state is identified with the highest precision, followed by the stressed, happy, and sad states. Similarly, we observe the highest recall for the relaxed state, followed by the happy, sad, and stressed states.
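Per-emotion precision and recall of the kind reported above can be computed from predicted versus self-reported labels. The sketch below uses plain Python and made-up labels purely to show the calculation; it is not the study's evaluation code or data.

```python
def per_class_precision_recall(y_true, y_pred):
    """Precision and recall for each emotion label.
    Returns {label: (precision, recall)}."""
    classes = sorted(set(y_true) | set(y_pred))
    scores = {}
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        scores[c] = (precision, recall)
    return scores

# hypothetical labels for four typing sessions
truth = ["happy", "sad", "happy", "relaxed"]
preds = ["happy", "happy", "sad", "relaxed"]
scores = per_class_precision_recall(truth, preds)
```

Averaging these per-class scores (or the per-participant AUCROC values) gives summary numbers comparable to those quoted above.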
We measure the energy consumption of TapSense using a Moto G2 (Android 6.0) smartphone. We charge the phone fully and monitor how quickly the battery drains completely with TapSense turned on and off. The results reveal that in both cases, not only the depletion time but also the depletion rate is similar.
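The depletion-rate comparison can be sketched as follows. The log format (`(hours_elapsed, battery_percent)` samples) and the numbers are hypothetical; the actual study simply observed full-charge-to-empty runs on the device.

```python
def depletion_rate(samples):
    """Average battery depletion rate in percent per hour, from
    chronologically ordered (hours_elapsed, battery_percent) samples."""
    (t0, b0), (t1, b1) = samples[0], samples[-1]
    return (b0 - b1) / (t1 - t0)

# hypothetical battery logs with TapSense on vs off
on_log = [(0.0, 100), (5.0, 60), (10.0, 20)]
off_log = [(0.0, 100), (5.0, 61), (10.0, 22)]

rate_on = depletion_rate(on_log)    # percent per hour, app running
rate_off = depletion_rate(off_log)  # percent per hour, app off
```

A small gap between the two rates, as in this toy data, is the kind of evidence behind the low-overhead claim.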
Post-study Participant Feedback
We conducted a post-study survey following the Post-Study System Usability Questionnaire (PSSUQ) to gauge the system's effectiveness from a usability perspective. 75% of the participants marked TapSense as non-intrusive, while 20% ranked it as moderately intrusive. 61% of the participants reported the lack of a swipe-typing (Swype) facility as an inconvenience. Only 24% of the participants expressed discomfort with the level of privacy, and less than 10% were concerned about energy consumption.
LINKS
APK (Link) Please note that this version of the APK captures typing data and stores it locally on the phone; it does not perform any prediction. It is a slightly modified version of the TapSense application that also displays statistics on the user's recorded self-reports.
Installation Guideline and Demo (Manual) Please refer to this document for help with installation; it also gives a demo of the application. You may find some discrepancies in the app names and screenshots.
Feasibility of Typing for Emotion Detection (Survey Form) We performed an online survey to assess the feasibility of smartphone typing for emotion detection. We distributed this survey form through Facebook as well as through personal emails.
CONTACT
Email address: surjya [DOT] ghosh [AT] gmail [DOT] com
Date modified: Dec 14, 2018