pyEDA: An Open-Source and Versatile Feature Extraction Python Toolkit for Electrodermal Activity
eScholarship
Open Access Publications from the University of California


UC Irvine Electronic Theses and Dissertations


  • Author(s): Aqajari, Seyed Amir Hossein
  • Advisor(s): Rahmani, Amir M.
  • et al.
Abstract

Electrodermal Activity (EDA), also known as Galvanic Skin Response (GSR), measures changes in perspiration by detecting changes in the electrical conductivity of the skin. Changes in perspiration are one example of a physiological response to a stimulus such as stress, emotion, or pain. Previous studies have shown that EDA is one of the leading indicators of such stimuli. However, the EDA signal itself is not trivial to analyze. To detect different stimuli in human subjects, a variety of features are extracted from EDA signals, such as the number of peaks and the maximum peak amplitude, which demonstrates the prevalence of this signal in biomedical as well as ubiquitous and wearable computing research. In this paper, we present an open-source Python toolkit for EDA signal preprocessing and for statistical and automatic feature extraction. To the best of our knowledge, this is the first effort to develop a versatile and generic tool to extract any number of automatic features from EDA signals. Our toolkit is evaluated using different machine learning algorithms applied to the publicly available Wearable Stress and Affect Detection (WESAD) dataset. Our results show that our proposed pipeline outperforms the state-of-the-art accuracy on the same dataset using either statistical or automatically extracted features. In all four of our machine learning algorithms, we achieve higher validation accuracy using automatic features than using statistical features.
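As a rough illustration of the statistical features the abstract names (number of peaks, maximum peak amplitude), the following is a minimal sketch using SciPy's generic peak detection. It is not pyEDA's actual API; the function name, the toy signal, and the prominence threshold are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def eda_statistical_features(signal):
    """Compute simple statistical features from a 1-D EDA signal.

    Illustrative only: real EDA pipelines first separate the tonic
    and phasic components and filter noise before peak counting.
    """
    # Detect phasic peaks; the prominence threshold is an assumption.
    peaks, _ = find_peaks(signal, prominence=0.01)
    return {
        "num_peaks": len(peaks),
        "max_peak_amplitude": float(signal[peaks].max()) if len(peaks) else 0.0,
        "mean": float(np.mean(signal)),
        "std": float(np.std(signal)),
    }

# Toy signal: a tonic baseline plus two Gaussian-shaped phasic responses.
t = np.linspace(0, 10, 40)
sig = 0.5 + 0.2 * np.exp(-(t - 3) ** 2) + 0.3 * np.exp(-(t - 7) ** 2)
feats = eda_statistical_features(sig)
```

Such per-window feature dictionaries are what a downstream classifier (e.g. for stress detection on WESAD-style data) would typically consume.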
