A Research Space for
Earable Computing

ABOUT

Sensory earables are increasingly becoming a mainstream compute platform, with the promise to fundamentally transform personal-scale human sensing applications. Over the past few years, a number of research efforts in the ubiquitous computing domain have sought to achieve useful, engaging, and sometimes ambitious behavioural analytics with sensory earables, including studies of the human face; of emotion and stress; continuous monitoring of cardiovascular function; oxygen consumption and blood flow; and tracking of eating episodes as well as dietary and swallowing activities. At the same time, we have started to see commercial efforts such as Bragi's The Dash, Bose SoundSport, Jabra Elite Sport, and Sony Xperia Ear offering music experiences augmented with sensory services, including fitness tracking, real-time translation, and conversational agents. Naturally, earables are becoming an intense interdisciplinary area of study with many and diverse applications spanning HCI, empathic communication, behavioural science, health and wellbeing, entertainment, education, and security.

This site, a voluntary research service to the community, is intended to catalyse advancements in sensory earable technology by building an academic platform that brings together researchers, practitioners, and design experts from academia and industry to discuss, share, and shape this exciting new area of earable computing.

As a launchpad, we have developed an Open Earable Platform, eSense, and shared it with 60+ academic institutions around the globe to accelerate research in this space. This site is the nucleus of that effort and will offer details of the different projects, associated papers, and datasets as they are developed by this community, for the community.

The site is also the entry point of the EarComp workshop, our annual meeting to discuss progress and set a clear direction for the research community in this space.

eSense OVERVIEW

eSense is a multi-sensory earable platform for personal-scale behavioural analytics research. It is a True Wireless Stereo (TWS) earbud augmented with a six-axis inertial measurement unit, a microphone, and dual-mode Bluetooth (Bluetooth Classic and Bluetooth Low Energy). eSense is built around a custom-designed 15 × 15 × 3 mm PCB comprising a Qualcomm CSR8670, a dual-mode Bluetooth audio system-on-chip (SoC) with a microphone per earbud; an InvenSense MPU6500 six-axis inertial measurement unit (IMU) combining a three-axis accelerometer and a three-axis gyroscope; a two-state button; a circular LED; and the associated power-regulation and battery-charging circuitry. There is no internal storage or real-time clock. Each earbud is powered by an ultra-thin 40 mAh LiPo battery, weighs 20 g, and measures 18 × 20 × 20 mm. The carrier case is equipped with its own battery, enabling recharging of the eSense earbuds on the go (up to three full charges).
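As a rough illustration of working with data from the IMU described above, the sketch below converts raw 16-bit samples into physical units. It assumes the MPU6500's default full-scale ranges (±2 g for the accelerometer, ±250 deg/s for the gyroscope); the actual eSense configuration may differ, so treat the scale constants and the `raw_to_units` helper as illustrative rather than part of the eSense API.

```python
# Convert raw 16-bit IMU samples from an MPU6500-class sensor into
# physical units. The scale factors assume the InvenSense defaults
# (accelerometer +/-2 g, gyroscope +/-250 deg/s); eSense may be
# configured with different full-scale ranges.

ACC_LSB_PER_G = 16384.0   # +/-2 g full scale  -> 16384 LSB per g
GYRO_LSB_PER_DPS = 131.0  # +/-250 dps full scale -> 131 LSB per deg/s

def raw_to_units(raw_acc, raw_gyro):
    """Map raw (x, y, z) integer triples to (g, deg/s) float triples."""
    acc = tuple(v / ACC_LSB_PER_G for v in raw_acc)
    gyro = tuple(v / GYRO_LSB_PER_DPS for v in raw_gyro)
    return acc, gyro

# An earbud at rest should read roughly 1 g along the gravity axis.
acc, gyro = raw_to_units((0, 0, 16384), (131, 0, -262))
# acc -> (0.0, 0.0, 1.0) g; gyro -> (1.0, 0.0, -2.0) deg/s
```

In practice the raw samples arrive over the BLE transport documented in the BLE specification below; only the scaling step is shown here.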

Please check the IEEE Pervasive Computing article on eSense for more details.

If you use eSense in your research project, we would appreciate it if you cited the following two papers.

[1] Fahim Kawsar, Chulhong Min, Akhil Mathur, and Alessandro Montanari. "Earables for Personal-scale Behaviour Analytics". IEEE Pervasive Computing, Volume 17, Issue 3, 2018.
[2] Chulhong Min, Akhil Mathur, and Fahim Kawsar. "Exploring Audio and Kinetic Sensing on Earable Devices". In WearSys 2018, the ACM Workshop on Wearable Systems and Applications, co-located with the 16th ACM International Conference on Mobile Systems, Applications, and Services (MobiSys 2018), June 2018, Munich, Germany.

eSense RESOURCES

  • User Manual - Learn how to quickly get started with eSense.
    User Manual [PDF]
     
  • BLE Specification - Detailed structure of BLE transport and data commands
    Doc [PDF]
     
  • Data Visualiser - A Node.js app to visualise and configure eSense sensory streams in real time (with manual)
    Source Code | Documentation [PDF]
     
  • Android Library - A lightweight Android library to build mobile apps with eSense (with manual)
    Source Code | Documentation [PDF]
     
eSense COMMUNITY RESOURCES

  • Flutter Plugin - The eSense Flutter plugin for both Android and iOS.
    Installation Doc - credit: CACHET Team, DTU
     
  • iOS Library - iOS library for eSense
    Source Code
    Credit: Yuuki Nishiyama, University of Tokyo
     
  • eSense iOS app - eSense client for iOS
    Download
    Credit: Yuuki Nishiyama, University of Tokyo
     
  • eSense Android app - eSense client for Android
    Source Code
    Credit: Md. Shafiqul Islam, Kyushu Institute of Technology
     
Publications, Projects, Models and Datasets

    60+ academic research groups from all over the globe are contributing to this research effort. The following is a living list of research papers that have used eSense or cited eSense.


    Nokia Bell Lab's Publications


    1. Earables for Personal-scale Behaviour Analytics
      Fahim Kawsar, Chulhong Min, Akhil Mathur, Alessandro Montanari, Nokia Bell Labs
      IEEE Pervasive 2018
       
    2. Exploring Audio and Kinetic Sensing on Earable Devices
      Chulhong Min, Akhil Mathur, Fahim Kawsar, Nokia Bell Labs
      ACM WearSys 2018
       
    3. eSense: Open Earable Platform for Human Sensing
      Fahim Kawsar, Chulhong Min, Akhil Mathur, Alessandro Montanari, Utku Gunay Acer, Marc Van den Broeck, Nokia Bell Labs
      ACM MobiSys 2018, ACM UbiComp 2018, ACM SenSys 2018 [Demo]
       
    4. Cross-modal approach for conversational well-being monitoring with multi-sensory earables
      Chulhong Min, Alessandro Montanari, Akhil Mathur, Seungchul Lee and Fahim Kawsar, Nokia Bell Labs
      ACM WellComp 2018
       
    5. Audio-Kinetic Model for Automatic Dietary Monitoring with Earable Devices
      Chulhong Min, Akhil Mathur, Fahim Kawsar, Nokia Bell Labs
      ACM MobiSys 2018 [Poster]
       
    6. Automatic Smile and Frown Recognition with Kinetic Earables
      Seungchul Lee, Chulhong Min, Akhil Mathur, Alessandro Montanari, Youngjae Chang, Junehwa Song, Fahim Kawsar, Nokia Bell Labs and KAIST
      Augmented Human 2019
       
    7. Situation-Aware Conversational Agent with Kinetic Earables
      Shin Katayama, Akhil Mathur, Tadashi Okoshi, Jin Nakazawa, Fahim Kawsar, Keio University and Nokia Bell Labs
      ACM MobiSys 2019 [Demo]
       
    8. Situation-Aware Emotion Regulation of Conversational Agents with Kinetic Earables
      Shin Katayama, Akhil Mathur, Marc Van den Broeck, Tadashi Okoshi, Jin Nakazawa, Fahim Kawsar, Keio University and Nokia Bell Labs
      ACM ACII 2019
       
    9. A closer look at quality-aware runtime assessment of sensing models in multi-device environments
      Chulhong Min, Alessandro Montanari, Akhil Mathur, Fahim Kawsar, Nokia Bell Labs
      ACM SenSys 2019
       
    10. An early characterisation of wearing variability on motion signals for wearables
      Chulhong Min, Akhil Mathur, Alessandro Montanari, Fahim Kawsar, Nokia Bell Labs
      ACM ISWC 2019
       
    11. The city as a personal assistant
      Utku Günay Acer, Marc Van den Broeck, Fahim Kawsar, Nokia Bell Labs
      ACM UPA 2019
       
    12. A Systematic Study of Unsupervised Domain Adaptation for Robust Human-Activity Recognition
      Youngjae Chang, Akhil Mathur, Anton Isopoussu, Junehwa Song, Fahim Kawsar, Nokia Bell Labs and KAIST
      ACM IMWUT 2020
       


    Research Community's Publications


    Beyond Bell Labs' own work, we sincerely acknowledge the research groups across the world using our platform to advance the state of the art. We are proud to list the following published papers that have used or cited eSense.


    1. ExerSense: Physical Exercise Recognition and Counting Algorithm from Wearables Robust to Positioning
      Shun Ishii, Anna Yokokubo, Mika Luimula, Guillaume Lopez
      Aoyama Gakuin University
      Sensors 2021
       
    2. Mapping Vicon Motion Tracking to 6-Axis IMU Data for Wearable Activity Recognition
      Lloyd Pellatt, Alex Dewar, Andy Philippides, Daniel Roggen
      University of Sussex
      Activity and Behavior Computing 2020
       
    3. Exploring Human Activities Using eSense Earable Device
      Md Shafiqul Islam, Tahera Hossain, Md Atiqur Rahman Ahad, Sozo Inoue
      Kyushu Institute of Technology
      Activity and Behavior Computing 2020
       
    4. ExerSense: Real-Time Physical Exercise Segmentation, Classification, and Counting Algorithm Using an IMU Sensor
      Shun Ishii, Kizito Nkurikiyeyezu, Anna Yokokubo, Guillaume Lopez
      Aoyama Gakuin University
      Activity and Behavior Computing 2020
       
    5. AI-enabled Prediction of eSports Player Performance Using the Data from Heterogeneous Sensors
      Anton Smerdov, Evgeny Burnaev, Andrey Somov
      Skolkovo Institute of Science and Technology
      arXiv 2020
       
    6. EarphoneTrack: involving earphones into the ecosystem of acoustic motion tracking
      Gaoshuai Cao, Kuang Yuan, Jie Xiong, Panlong Yang, Yubo Yan, Hao Zhou, Xiang-Yang Li
      University of Science and Technology of China
      ACM SenSys 2020
       
    7. Towards recognizing perceived level of understanding for online lectures using earables
      Dongwoo Kim, Chulhong Min, Seungwoo Kang
      KOREATECH
      ACM SenSys 2020 [Poster]
       
    8. Motion Coupling of Earable Devices in Camera View
      Christopher Clarke, Peter Ehrich, Hans Gellersen
      Lancaster University
      MUM 2020
       
    9. Attracktion: Field Evaluation of Multi-Track Audio as Unobtrusive Cues for Pedestrian Navigation
      Florian Heller, Jelco Adamczyk, Kris Luyten
      Hasselt University
      MobileHCI'20
       
    10. Opportunities to Share Collective Human Hearing
      Risa Kimura, Tatsuo Nakajima
      Waseda University
      NordiCHI'20
       
    11. Towards a characterisation of emotional intent during scripted scenes using in-ear movement sensors
      Sabrina A Frohn, Jeevan S Matharu, Jamie A Ward
      University of London
      ISWC 2020
       
    12. Design space and usability of earable prototyping
      Tobias Röddiger, Michael Beigl, Anja Exler
      Karlsruhe Institute of Technology
      ISWC 2020
       
    13. Collectively Sharing People’s Visual and Auditory Capabilities: Exploring Opportunities and Pitfalls
      Risa Kimura, Tatsuo Nakajima
      Waseda University
      SN Computer Science 2020
       
    14. Interactive Auditory Mediated Reality: Towards User-defined Personal Soundscapes
      Gabriel Haas, Evgeny Stemasov, Michael Rietzler, Enrico Rukzio
      Ulm University
      DIS 2020
       
    15. Gathering People’s Happy Moments from Collective Human Eyes and Ears for a Wellbeing and Mindful Society
      Risa Kimura, Tatsuo Nakajima
      Waseda University
      HCII 2021
       
    16. Devices and Application Tools for Activity Recognition: Sensor Deployment and Primary Concerns
      Md. Atiqur Rahman Ahad, Anindya Das Antar, Masud Ahmed
      University of Dhaka
      IoT Sensor-Based Activity Recognition 2020
       
    17. Sensor-Based Benchmark Datasets: Comparison and Analysis
      Md Atiqur Rahman Ahad, Anindya Das Antar, Masud Ahmed
      University of Dhaka
      IoT Sensor-Based Activity Recognition 2020
       
    18. Augmenting TV Viewing using Acoustically Transparent Auditory Headsets
      Mark McGill, Florian Mathis, Julie Williamson, Mohamed Khamis
      University of Glasgow
      ACM IMX 2020
       
    19. Fast and scalable in-memory deep multitask learning via neural weight virtualization
      Seulki Lee, Shahriar M Nirjon
      University of North Carolina at Chapel Hill
      ACM MobiSys 2020
       
    20. A comparison between audio and IMU data to detect chewing events based on an earable device
      Roya Lotfi, George Tzanetakis, Rasit Eskicioglu and Pourang Irani
      University of Manitoba
      AH 2020
       
    21. Acoustic Transparency and the Changing Soundscape of Auditory Mixed Reality
      Mark McGill, Stephen Brewster, David McGookin, Graham Wilson
      University of Glasgow
      ACM CHI 2020
       
    22. As you are, so shall you move your head: a system-level analysis between head movements and corresponding traits and emotions
      Sharmin Akther Purabi, Rayhan Rashed, Mirajul Islam, Nahiyan Uddin, Mahmuda Naznin, and A. B. M. Alim Al Islam
      BUET
      NSysS 2019
       
    23. Enhanced gesture sensing using battery-less wearable motion trackers
      Huy Vu Tran
      Singapore Management University
      PhD Dissertation 2019
       
    24. Esports Athletes and Players: A Comparative Study
      Nikita Khromov, Alexander Korotin, Andrey Lange, Anton Stepanov, Evgeny Burnaev, Andrey Somov
      Skolkovo Institute of Science and Technology
      IEEE Pervasive Computing 2019
       
    25. Wearable Sensing Technology for Capturing and Sharing Emotional Experience of Running
      Tao Bi
      University College London
      ACIIW 2019
       
    26. EarEcho: Using Ear Canal Echo for Wearable Authentication
      Yang Gao, Wei Wang, Vir V. Phoha, Wei Sun, Zhanpeng Jin
      University of Buffalo and Syracuse University
      ACM IMWUT 2019
       
    27. The CAMS eSense Framework: Enabling Earable Computing for mHealth Apps and Digital Phenotyping
      Jakob E. Bardram
      Technical University of Denmark
      EarComp'19
       
    28. Using the eSense Wearable Earbud as a Light-Weight Robot Arm Controller
      Henry Odoemelem, Alexander Hölzemann, Kristof Van Laerhoven
      University of Siegen
      EarComp 2019
       
    29. STEAR: Robust Step Counting from Earables
      Jay Prakash, Zhijian Yang, Yu-Lin Wei, Romit Roy Choudhury
      University of Illinois at Urbana-Champaign (UIUC) and Singapore University of Technology and Design (SUTD)
      EarComp 2019
       
    30. Towards Respiration Rate Monitoring Using an In-Ear Headphone Inertial Measurement Unit
      Tobias Röddiger, Daniel Wolffram, David Laubenstein, Matthias Budde and Michael Beigl
      Karlsruhe Institute of Technology
      EarComp 2019
       
    31. Head Motion Tracking Through in-Ear Wearables
      Andrea Ferlini, Alessandro Montanari, Cecilia Mascolo and Robert Harle
      University of Cambridge, and Nokia Bell Labs
      EarComp 2019
       
    32. Can Earables Support Effective User Engagement during Weight-Based Gym Exercises?
      Meeralakshmi Radhakrishnan and Archan Misra
      Singapore Management University
      EarComp 2019
       
    33. EStep: Earables as opportunity for physio-analytics
      Jay Prakash, Zhijian Yang, Yu-Lin Wei and Romit Roy Choudhury
      University of Illinois at Urbana-Champaign (UIUC)
      EarComp 2019
       
    34. Towards In-Ear Inertial Jaw Clenching Detection
      Siddharth Rupavatharam and Marco Gruteser
      Rutgers University
      EarComp 2019
       
    35. Veers: a Case Study of Acoustical Manipulation in Walking without Sight both on Subtle and Overt Conditions
      Kohei Matsumura and Kazushi Okada
      Ritsumeikan University
      EarComp 2019
       
    36. Using an in-ear device to annotate activity data across multiple wearable sensors
      Alexander Hölzemann, Henry Odoemelem and Kristof Van Laerhoven
      University of Siegen
      EarComp 2019
       
    37. A data sharing platform for earables research
      Jovan Powar and Alastair R Beresford
      University of Cambridge
      EarComp 2019
       
    38. Human activity recognition using earable device
      Tahera Hossain, Md Shafiqul Islam, Md Atiqur Rahman Ahad, Sozo Inoue
      Kyushu Institute of Technology, University of Dhaka
      UbiComp/ISWC'19
       
    39. A survey on gait recognition
      Changsheng Wan, Li Wang, Vir V. Phoha
      Southeast University
      ACM Computing Surveys 2018
       
    40. Devising and evaluating wearable technology for social dynamics monitoring
      Alessandro Montanari
      University of Cambridge
      PhD Dissertation 2018
       

    UNIVERSITY RESEARCH GROUPS USING eSense

    60+ academic research groups from all over the globe are contributing to this research effort. We sincerely acknowledge their contributions.

    Contact Us


     
    The Pervasive Systems team at Nokia Bell Labs Cambridge maintains this research space voluntarily as a community service.
    Please get in touch with us to contribute to advancing earable computing research.

    Credits: Alessandro Montanari, Chulhong Min, Akhil Mathur, Utku Acer, Marc Van den Broeck, Fahim Kawsar