EaseUp (Fatigue Alarm Device)

by Tofa in Circuits > Raspberry Pi



poster_easeup.png
photo_2025-06-15_19-50-16.jpg

I made a device - a box with a picture frame attached. It has a camera, two displays, a buzzer, and an LED strip. The device uses a YOLO model to detect open or closed eyes, tracking blinking frequency. Based on this, it estimates a person's tiredness level - the less frequent the blinks, the more tired the person is. The buzzer and LED alert the user to take a break.

The project helps me (and hopefully others) remember to take breaks, as being in a flow state makes it easy to miss signs of tiredness and end up exhausted. It's easy to build (the coding part was the hardest, but I’ve already finished it!).

Supplies

  1. Raspberry Pi 5 – 110 EUR
  2. Camera – 8 EUR
  3. Custom-made laser-cut box – 7 EUR
  4. 2 LCDs – 6 EUR (optional)
  5. GPIO connection board – 5 EUR
  6. LED strip – 4 EUR
  7. Picture frame – 4 EUR
  8. 2 buttons – 2 EUR
  9. 2 buzzers – 0.3 EUR

Collecting and Labeling Eye Dataset

image_2025-06-18_125451997.png
image_2025-06-18_125529601.png
image_2025-06-18_125617188.png
image_2025-06-18_130002826.png

I collected a dataset of people’s faces using a Python script that asked the subject to keep their eyes open or closed at random (for better class balance), capturing a variety of angles to cover more edge cases. Eyes in the images were labeled with two classes: opened_eye and closed_eye. I used Roboflow for labeling, as it has a relatively easy-to-use UI.
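Here is a minimal sketch of what such a capture script might look like, assuming OpenCV; judging by the dataset filenames later in this guide, frames are saved as <timestamp>_<state>.jpg, but the prompts and counts here are illustrative:

```python
# capture_eyes.py - sketch of the dataset-capture script (OpenCV assumed).
# The prompt alternates between "opened" and "closed" at random so both
# classes end up roughly balanced.
import random
import time

import cv2

cap = cv2.VideoCapture(0)
for _ in range(100):
    state = random.choice(["opened", "closed"])
    print(f"Keep your eyes {state.upper()} ...")
    time.sleep(2)  # give the subject a moment to comply
    ok, frame = cap.read()
    if ok:
        cv2.imwrite(f"{int(time.time())}_{state}.jpg", frame)
cap.release()
```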

There are 128 images in total: https://app.roboflow.com/easeup/easeuplabeling/browse

Collecting and Labeling Fatigue Dataset

image_2025-06-18_131448795.png
image_2025-06-18_131350269.png

We need fatigue data to find a correlation with blink frequency. For that, I suggest using automated software running in the background.

I collected videos (1 minute long, recorded every 10 minutes) while working at my computer. After each video, my app triggered a pop-up where I had to click a button when it changed color. This technique is often used in medical research on fatigue. Instead of asking how tired someone feels, it's more reliable to measure their reaction time.

I saved the videos along with labels, so in the end, I had 108 minutes of footage of me and my friends, each labeled with our reaction times.
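A reaction-time pop-up along these lines is easy to sketch with tkinter; the widget names and delays below are illustrative, not the exact code I ran:

```python
# reaction_popup.py - sketch of the reaction-time pop-up (tkinter assumed).
# A button appears, changes color after a random delay, and the time until
# the click becomes the fatigue label for the preceding video.
import random
import time
import tkinter as tk

def run_trial() -> float:
    root = tk.Tk()
    root.title("EaseUp check")
    result = {}

    btn = tk.Button(root, text="Wait...", width=20, height=5)
    btn.pack(padx=20, pady=20)

    def change_color():
        btn.config(bg="green", text="Click now!")
        result["t0"] = time.perf_counter()

    def on_click():
        if "t0" in result:  # ignore clicks before the color change
            result["rt"] = time.perf_counter() - result["t0"]
            root.destroy()

    btn.config(command=on_click)
    root.after(random.randint(1000, 4000), change_color)  # random delay in ms
    root.mainloop()
    return result.get("rt", float("nan"))

if __name__ == "__main__":
    print(f"Reaction time: {run_trial():.3f} s")
```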

Training YOLO Model

1748338844_closed_jpg.rf.8caa37dd5478d944812900187594fa7b.jpg
1748347973_closed_jpg.rf.d74d567ff4e374dbbdb1ec5ffb7f5844.jpg
1748347378_opened_jpg.rf.15a798cfc00e9e34ff38d7bf08c4fa02.jpg
image_2025-06-18_132411144.png
confusion_matrix.png

First, I trained models directly on Roboflow, but once I realized my dataset was large enough for good precision, I wrote a short script for local training and ran it for 200 epochs.

I used brightness augmentation, since inconsistent lighting was the most common issue I noticed. To increase model speed, I resized the images to 320×320 pixels - this boosted performance to 17 FPS on the Raspberry Pi, compared to just 5 FPS at the standard 640×640 resolution.

All images were in grayscale to avoid color bias, which is irrelevant for eye detection. I used YOLOv11n for its speed, since the detection task was simple but needed to be fast.
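With the Ultralytics package, the local training run boils down to a few lines. This is a sketch with illustrative paths and hyperparameter values (Ultralytics names the nano weights yolo11n.pt); the grayscale conversion is assumed to happen in the Roboflow export rather than here:

```python
# train_yolo.py - sketch of the local training script (Ultralytics assumed).
from ultralytics import YOLO

model = YOLO("yolo11n.pt")  # nano model: small and fast enough for the Pi
model.train(
    data="easeup/data.yaml",  # dataset exported from Roboflow (path illustrative)
    epochs=200,
    imgsz=320,   # 320x320 input: ~17 FPS on the Pi vs ~5 FPS at 640x640
    hsv_v=0.6,   # brightness augmentation (illustrative strength)
)
```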

Finding the Blink/Fatigue Relation

image_2025-06-18_134917128.png

Remember the tiredness data we collected earlier? Now that the detection model is ready, it's possible to extract blinking frequency from each video and correlate it with reaction time.

I wrote a script that counts the number of opened and closed eyes per frame and assigns a label to each frame based on the majority class. It then detects falling edges (when frame x−1 shows open eyes and frame x shows closed eyes) and filters out any that are isolated (i.e., only one closed-eye frame in a row). Finally, it divides the number of valid blinks by the video length to get the blinking frequency.
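In code, the counting logic might look like this minimal sketch; frame_labels is a hypothetical per-frame list where True means the majority of detected eyes in that frame are closed:

```python
# blink_freq.py - sketch of the blink-frequency extraction described above.

def blink_frequency(frame_labels: list[bool], fps: float) -> float:
    """Count falling edges (open -> closed), skip isolated single-frame
    closures, and divide by the video length in seconds."""
    blinks = 0
    for x in range(1, len(frame_labels)):
        if frame_labels[x] and not frame_labels[x - 1]:  # falling edge at x
            # isolated = only one closed-eye frame in a row; require a second
            if x + 1 < len(frame_labels) and frame_labels[x + 1]:
                blinks += 1
    return blinks / (len(frame_labels) / fps)
```

For scale: at the 0.3 Hz cutoff used below, a 1-minute clip contains about 18 blinks.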

After collecting this data for each video, I plotted reaction time against blinking frequency and got a scatterplot. While there is a general correlation, a more noticeable pattern emerged: when reaction time is low, blinking frequency is inconsistent. Around 0.3 Hz, however, this "randomness" starts to disappear, so I used 0.3 Hz as the tiredness cutoff in the later steps.
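For reference, a plot like this takes only a few lines of matplotlib; the data points below are placeholders, not my measurements:

```python
# plot_relation.py - sketch of the reaction-time vs. blinking-frequency plot.
import matplotlib.pyplot as plt

# (reaction time s, blinking frequency Hz) - placeholder values
pairs = [(0.25, 0.41), (0.28, 0.19), (0.35, 0.27), (0.48, 0.12)]
rt, freq = zip(*pairs)

plt.scatter(rt, freq)
plt.axhline(0.3, color="red", linestyle="--", label="0.3 Hz tiredness cutoff")
plt.xlabel("Reaction time (s)")
plt.ylabel("Blinking frequency (Hz)")
plt.legend()
plt.show()
```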

Software

photo_2025-06-17_15-23-26.jpg
photo_2025-06-17_15-23-31.jpg

Now that it's possible to detect blinking frequency and we have a defined threshold, I started building the actual script.

In short, the program takes 20-second clips using the camera, detects blinking frequency, and applies moving average smoothing to reduce noise. It then chains these segments into a time series, and if it detects three consecutive measurements below the threshold, it activates the buzzer and LED to alert the user to take a break.
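Stripped of the hardware details, the alert logic follows this shape; measure_blink_frequency and trigger_alert are hypothetical placeholders standing in for the camera/YOLO step and the GPIO code:

```python
# main_loop.py - simplified sketch of the alert logic (helper names hypothetical).
from collections import deque

THRESHOLD_HZ = 0.3

def measure_blink_frequency(seconds: int = 20) -> float:
    """Placeholder for the 20-second camera capture + YOLO blink counting."""
    raise NotImplementedError

def trigger_alert() -> None:
    """Placeholder for driving the buzzer and LED strip over GPIO."""
    raise NotImplementedError

history = deque(maxlen=3)  # moving-average window
low_streak = 0

while True:
    history.append(measure_blink_frequency(20))
    smoothed = sum(history) / len(history)       # moving-average smoothing
    low_streak = low_streak + 1 if smoothed < THRESHOLD_HZ else 0
    if low_streak >= 3:                          # three consecutive low readings
        trigger_alert()
        low_streak = 0
```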

I also added a fun design touch - two LCDs that imitate eyes and follow the user’s gaze (completely optional, I just wanted to practice working with SPI interfaces).

For testing, I built the whole setup on a breadboard first.

Building the Electronics

photo_2025-06-17_15-23-22.jpg

After successful tests, I soldered all the components onto a Raspberry Pi extension board.

Building the Box

photo_2025-06-17_15-23-13.jpg
photo_2025-06-17_15-23-18.jpg
Box_Timofejs_multiplex_4mm.png

For the casing, I used a simple laser-cut triangular box, combining it with a picture frame for the design. I made three holes in the frame - two for the LCDs and one for the camera. The back wall isn’t glued - it can be opened.

To Run the Code...

To run the device, first connect the Raspberry Pi to the board, then start main.py from the ./EaseUp_RPiRunner/ folder. You may need to create a virtual environment and install the libraries from requirements.txt. Make sure to run the file from that specific folder.
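On a fresh Raspberry Pi OS install, the setup would look roughly like this (the venv name is arbitrary; only the folder, main.py, and requirements.txt come from the project):

```bash
cd EaseUp_RPiRunner/
python3 -m venv .venv           # optional, keeps the system Python clean
source .venv/bin/activate
pip install -r requirements.txt
python main.py                  # must be launched from this folder
```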

https://drive.google.com/drive/folders/1q9rucpnlHf8O9D2d3GRgJ8vpPKyaVCSP?usp=drive_link