PupilSense: Detection of Depressive Episodes Through Pupillary Response in the Wild

This is the official codebase for the pupillometry system described in the paper PupilSense: Detection of Depressive Episodes Through Pupillary Response in the Wild, accepted at the 2024 International Conference on Activity and Behavior Computing (ABC).

News 📰

Our data collection app paper, FacePsy: An Open-Source Affective Mobile Sensing System -- Analyzing Facial Behavior and Head Gesture for Depression Detection in Naturalistic Settings, has been accepted at ACM MobileHCI 2024. The GitHub repository is coming soon!

Our work on mood detection using pupillary responses, MoodPupilar: Predicting Mood Through Smartphone Detected Pupillary Responses in Naturalistic Settings, has been accepted at IEEE BSN 2024.

Introduction

PupilSense is a deep learning-based pupillometry system that uses eye images collected from smartphones for research in behavior modeling.

The accompanying data collection app will be released soon for the research community.

Pupil-to-Iris Ratio (PIR) Estimation Pipeline
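
The pipeline segments the pupil and iris in an eye image and derives the Pupil-to-Iris Ratio (PIR) from the two regions. As a rough illustration only (not necessarily the repository's exact computation), PIR can be approximated from the segmented areas by comparing equivalent circular diameters:

import numpy as np

def pupil_to_iris_ratio(pupil_mask: np.ndarray, iris_mask: np.ndarray) -> float:
    """Approximate PIR from boolean segmentation masks.

    Treats each region as a circle of equivalent area: since area = pi * (d / 2) ** 2,
    the diameter ratio reduces to sqrt(pupil_area / iris_area). Illustrative only;
    the repository's pipeline may estimate PIR differently.
    """
    pupil_area = float(pupil_mask.sum())
    iris_area = float(iris_mask.sum())
    if iris_area == 0:
        raise ValueError("Iris mask is empty; cannot compute PIR.")
    return float(np.sqrt(pupil_area / iris_area))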

Installation

Follow these steps to set up the project:

1. Clone the repository:
   git clone https://github.com/stevenshci/PupilSense.git

2. Navigate to the project directory:
   cd PupilSense

3. Install the required packages:
   pip install -r requirements.txt

Setup

After installing the required packages, run the setup.sh script to set up the project environment:

source setup.sh

Dataset

This project uses a custom dataset of eye images for training and evaluation. The dataset should be organized in the following structure:

dataset/
├── train/
│   ├── image1.png
│   ├── image2.png
│   └── ...
├── train_data.json
├── test/
│   ├── image1.png
│   ├── image2.png
│   └── ...
└── test_data.json

Note: The annotation files (train_data.json, test_data.json) should be in COCO format, with the pupil and iris regions labeled as separate categories.
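
For reference, a COCO-style annotation file for this layout contains images, annotations, and categories sections. The sketch below is purely illustrative (category ids, file names, and the polygon/bounding-box values are placeholder assumptions; real files are produced by your annotation tool's COCO export):

import json

# Minimal, illustrative COCO-style structure with pupil and iris as separate categories.
coco_skeleton = {
    "images": [
        {"id": 1, "file_name": "image1.png", "width": 640, "height": 480},
    ],
    "annotations": [
        {
            "id": 1,
            "image_id": 1,
            "category_id": 1,  # pupil
            "segmentation": [[100, 100, 120, 100, 120, 120, 100, 120]],  # polygon (x, y, x, y, ...)
            "bbox": [100, 100, 20, 20],  # [x, y, width, height]
            "area": 400,
            "iscrowd": 0,
        },
    ],
    "categories": [
        {"id": 1, "name": "pupil"},
        {"id": 2, "name": "iris"},
    ],
}

with open("dataset/train_data.json", "w") as f:
    json.dump(coco_skeleton, f, indent=2)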

Annotations

To annotate the dataset, use tools like MakeSense.ai, Roboflow, Labelbox, LabelImg, or VIA to label pupil and iris regions on the images. Export these annotations in the COCO format, which should include necessary details for images, annotations, and categories.

The COCO format is a standard for object detection/segmentation tasks and is widely supported by many libraries and tools in the computer vision community.
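
Once the annotation files are in place, they can be registered with Detectron2 like any other COCO dataset. A minimal sketch, assuming the directory layout above (the dataset names pupilsense_train and pupilsense_test are illustrative; check the project scripts for the names actually used):

from detectron2.data.datasets import register_coco_instances

# Register the train/test splits; Detectron2 reads the COCO JSON files and
# resolves image paths relative to the given image roots.
register_coco_instances("pupilsense_train", {}, "dataset/train_data.json", "dataset/train")
register_coco_instances("pupilsense_test", {}, "dataset/test_data.json", "dataset/test")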

Usage

To fine-tune the Detectron2 model on your dataset, run the following command:

python scripts/finetune.py
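
scripts/finetune.py wraps the training step; its internals may differ, but fine-tuning Detectron2 for two classes (pupil, iris) typically looks like the sketch below. The Mask R-CNN base config, solver settings, and output directory are assumptions, and the dataset name matches the illustrative registration sketch above:

import os

from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultTrainer

cfg = get_cfg()
# Start from a COCO-pretrained Mask R-CNN and fine-tune it on the eye-image dataset.
cfg.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.DATASETS.TRAIN = ("pupilsense_train",)
cfg.DATASETS.TEST = ()                   # evaluation handled separately
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 2      # pupil and iris
cfg.SOLVER.IMS_PER_BATCH = 2
cfg.SOLVER.BASE_LR = 0.00025
cfg.SOLVER.MAX_ITER = 1000               # adjust for your dataset size
cfg.OUTPUT_DIR = "models"

os.makedirs(cfg.OUTPUT_DIR, exist_ok=True)
trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()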

To run inference on a batch of images with the trained model:

python scripts/inference.py
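
scripts/inference.py handles batch inference; for orientation, a single-image sketch with Detectron2's DefaultPredictor is shown below. The checkpoint path models/model_final.pth and the score threshold are assumptions, not the script's actual defaults:

import cv2

from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultPredictor

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 2            # pupil and iris
cfg.MODEL.WEIGHTS = "models/model_final.pth"   # assumed path to the fine-tuned checkpoint
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5

predictor = DefaultPredictor(cfg)
image = cv2.imread("dataset/test/image1.png")  # BGR image, as DefaultPredictor expects
outputs = predictor(image)

instances = outputs["instances"].to("cpu")
print(instances.pred_classes)       # class index per detected region (pupil / iris)
print(instances.pred_masks.shape)   # one boolean mask per detected region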

Results

Our fine-tuned Detectron2 model achieves accurate pupil and iris segmentation on eye images captured in naturalistic environments.

(Figures: Segmented Eye Image 1 and Segmented Eye Image 2.)

The model robustly segments the pupil and iris regions in diverse real-world conditions, including varying lighting, eye positions, and backgrounds.

This capability enables practical applications in biometrics, human-computer interaction, and medical imaging, where precise segmentation in naturalistic settings is crucial.

Pretrained Models

Download our pre-trained PupilSense model from the Pretrained Models link and unzip it into the models/ directory.

Citation

If you find this repository useful, please consider giving it a star ⭐ and citing it with the BibTeX entry below:

@INPROCEEDINGS{10652166,
  author={Islam, Rahul and Bae, Sang Won},
  booktitle={2024 International Conference on Activity and Behavior Computing (ABC)}, 
  title={PupilSense: Detection of Depressive Episodes through Pupillary Response in the Wild}, 
  year={2024},
  volume={},
  number={},
  pages={01-13},
  keywords={Laboratories;Mental health;Depression;Real-time systems;Wearable devices;Monitoring;Smart phones;Pupillometry;Depression;Affective computing;Machine Learning},
  doi={10.1109/ABC61795.2024.10652166}}

License

This project is licensed under the MIT License.

Contact

If you have any questions or suggestions, please feel free to contact Rahul and Priyanshu.
