Department of Geography Geographic Information Visualization and Analysis (GIVA)

Emotion Tracking: Facial Expressions and mobile Electrodermal Activity (mEDA)

The GIVA research group uses a wrist-worn system to record galvanic skin responses (GSR) in mobile indoor/outdoor studies (Empatica E4), and facial expression recognition software for online settings (iMotions Affectiva).

The images below show Dr. Sara Lanini-Maggi.

Empatica E4 wristband
A participant explores a virtual city during an online navigation study. Her emotional state is tracked using an Empatica E4 wristband measuring EDA.

Technical Specifications

Empatica E4 specifications of the electrodermal activity (EDA) sensor

  • Sampling frequency: 4 Hz.
  • Resolution: 1 digit ~ 900 picosiemens.
  • Range: 0.01 μSiemens – 100 μSiemens.
  • Alternating current (8 Hz frequency) with a maximum peak-to-peak value of 100 μAmps (at 100 μSiemens).
  • Electrodes:
    • Placement on the ventral (inner) wrist.
    • Snap-on silver (Ag) plated with metallic core.
    • Electrode longevity: 4–6 months.
  • Device storage capacity exceeds 60 recording hours.
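To make the sensor specifications above concrete, the following sketch shows what they imply numerically: converting raw digit counts to microsiemens at the stated resolution of ~900 picosiemens per digit, clipping to the 0.01–100 μS range, and generating timestamps at 4 Hz. This is an illustration of the specs only, not Empatica's own conversion code; the E4 software already exports values in microsiemens.

```python
# Illustrative sketch of the E4 EDA specs (assumed conversion, not Empatica code).
RESOLUTION_S = 900e-12   # 1 digit ~ 900 picosiemens
FS_HZ = 4                # sampling frequency: 4 Hz

def digits_to_microsiemens(raw_digits):
    """Convert raw digit counts to microsiemens, clipped to the sensor range."""
    out = []
    for d in raw_digits:
        us = d * RESOLUTION_S * 1e6        # siemens -> microsiemens
        us = min(max(us, 0.01), 100.0)     # sensor range: 0.01-100 microsiemens
        out.append(us)
    return out

def sample_times(n, t0=0.0):
    """Timestamps in seconds for n samples at 4 Hz."""
    return [t0 + i / FS_HZ for i in range(n)]
```

For example, 1000 raw digits would correspond to about 0.9 μS, and consecutive samples are 0.25 s apart.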

Figure: Empatica E4 specifications (Empatica S.r.l., https://support.empatica.com/hc/en-us/articles/202581999-E4-wristband-technical-specifications, accessed 28.03.2022). More information: https://empatica.app.box.com/v/E4-User-Manual (accessed 28.03.2022).

Empatica E4 software


Figure: Empatica E4 software (Empatica S.r.l., https://www.empatica.com/research/e4/, accessed 28.03.2022).

  • A desktop platform (E4 manager) to import data collected with the Empatica E4 via USB and transfer it to the Empatica cloud platform.
  • A dashboard (E4 connect) to view and manage participants’ EDA data on the Empatica cloud platform. It is also possible to download raw data in CSV format for easy processing and analysis in third-party applications. In addition to EDA, the following participant data can also be obtained: Blood Volume Pulse (BVP), Acceleration, Heart Rate (HR), and Temperature.

Figure: Empatica dashboard showing one participant's psycho-physiological data streams collected during a navigation task (EDA = electrodermal activity; BVP = blood volume pulse, a heart rate metric; plus accelerometer data), viewed with E4 connect.

iMotions Affectiva AFFDEX

  • Affectiva AFFDEX uses automated facial coding metrics to recognize 18 facial expressions, equivalent to the Action Units described by the Facial Action Coding System (FACS). The algorithm detects significant facial expressions that occur at approximately the same time and derives from them 7 core emotions: joy, anger, fear, disgust, contempt, sadness, and surprise. It also summarizes the overall expressed emotional response with two scores: engagement (i.e., a participant's expressiveness) and valence (i.e., the positiveness or negativeness of a participant's emotional experience).

  • Affectiva AFFDEX allows you to:

    • Get insights into the consistency of facial expressions elicited by a given sensory stimulus, and into the underlying valence and intensity of the emotional response.
    • Collect data in the lab or online through the iMotions Online Data Collection (ODC) tool.
    • Get real-time feedback on calibration quality for the highest measurement accuracy.
    • Visualize raw and aggregated data both in the moment and after data collection.
    • Get a statistical report of the data, such as the number of participants who showed a given emotional response (e.g., joy or sadness) in a certain time window.
    • Synchronize and manage multiple data streams from any biometric sensor in combination with facial expressions, such as eye-tracking and EEG data.
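One of the capabilities above, counting how many participants showed a given emotional response in a time window, can be sketched on exported per-frame data. The record layout below (participant ID, timestamp, per-emotion scores on AFFDEX's 0–100 scale) is a simplified stand-in for illustration, not iMotions' actual export schema.

```python
def participants_with_emotion(frames, emotion, t_start, t_end, threshold=50.0):
    """Count participants whose score for `emotion` exceeds `threshold`
    at least once within the window [t_start, t_end] (times in seconds).

    `frames` is an iterable of dicts such as
    {"participant": "P01", "t": 12.3, "joy": 87.0, ...} -- a hypothetical,
    simplified per-frame record, not the real iMotions export format.
    """
    hits = set()
    for row in frames:
        if t_start <= row["t"] <= t_end and row.get(emotion, 0.0) > threshold:
            hits.add(row["participant"])
    return len(hits)
```

A participant is counted once no matter how many frames exceed the threshold, which matches the "number of participants who showed a given response" summary rather than a frame count.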
Photo: Sara Lanini-Maggi

Figure: A screenshot of the iMotions software, showing how the Affectiva algorithm classifies a user's facial expressions in real time while she is navigating through a virtual city online.