Automated Traffic Monitoring With Computer Vision
by Hadi Hleihel in Circuits > Raspberry Pi

In today’s cities, emergency vehicles often lose precious time at intersections because traffic lights don’t recognize their urgency. This project addresses that by building a smart traffic light system with a Raspberry Pi 5 and computer vision. The system detects vehicles in real time, measures congestion by counting them, and, most importantly, automatically gives green-light priority to emergency vehicles such as ambulances and fire trucks. The goal: make intersections safer, faster, and smarter for everyone, especially in emergencies.
Supplies
- Raspberry Pi 5 - 8 GB Starter Pack (2023): powerful computing for real-time AI traffic monitoring. €160.00
- USB webcam (with microphone): captures traffic footage for computer vision. €17.55
- Freenove Project Kit for Raspberry Pi 5: sensors, wires, and parts for prototyping. €60.00
- Multi card reader, 6 in 1 (NNBILI OTG SD TF Type C micro SD card reader): quick data transfer between storage and the Pi. €3.00
- Active Cooler for Raspberry Pi 5: keeps the Pi cool for stable, long-term operation. €5.95
- 3D printing (bases for the traffic lights): €3.00
- Laser cutting (wooden housing): €12.00
- Roadmap and toy vehicles (AliExpress/Amazon): €20.00
- 3 traffic light LED modules (AliExpress/Amazon): €5.99
- Jumper wires, soldering iron, screwdriver, hot glue gun: already owned. Wiring uses only jumper wires, soldered to extension cables that run under the roadmap so they reach all three traffic lights.

Total real cost: €252.49
Dataset Creation and Annotation
To ensure the model could reliably detect all relevant objects, I created a custom dataset using my webcam and a set of toy vehicles representing each class: Ambulance, Bus, Car, Fire Truck, Motorcycle, Police Car, Traffic Light, and Truck. I placed these on a three-way intersection map and captured hundreds of photos per class, making sure to include various angles and lighting conditions. Each image was annotated in Roboflow, with traffic lights included as a separate class so the model could distinguish them from vehicles. This careful annotation is crucial for robust, real-world detection.
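The capture step can be scripted so every class gets a consistent batch of photos before uploading to Roboflow. Below is a minimal sketch of that kind of helper using OpenCV; the folder layout, lowercase class names, shot counts, and delay are illustrative assumptions, not the exact script used here.

```python
import pathlib
import time

import cv2

# Assumed lowercase class names; the project's classes are Ambulance, Bus, Car,
# Fire Truck, Motorcycle, Police Car, Traffic Light, and Truck.
CLASSES = ["ambulance", "bus", "car", "fire_truck",
           "motorcycle", "police_car", "traffic_light", "truck"]

def capture_class(label, n_shots=200, delay=0.5, cam_index=0):
    """Save webcam frames into one folder per class for later annotation."""
    out_dir = pathlib.Path("dataset_raw") / label
    out_dir.mkdir(parents=True, exist_ok=True)
    cap = cv2.VideoCapture(cam_index)
    for i in range(n_shots):
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imwrite(str(out_dir / f"{label}_{i:04d}.jpg"), frame)
        time.sleep(delay)  # leave time to nudge the toy vehicle or change the angle
    cap.release()

if __name__ == "__main__":
    for c in CLASSES:
        input(f"Place the {c} on the map and press Enter to start capturing...")
        capture_class(c)
```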
Hardware Assembly and Housing

All electronics are housed in a custom laser-cut wooden box, painted black and gold with a silver lid and engraved branding. The Raspberry Pi, LCD, and wiring are neatly organized inside. The three traffic light modules are mounted on 3D-printed bases, which fit into holes on the intersection map. All cables are routed under the map for a clean look, and the webcam is mounted for a top-down view. Only jumper wires were used, soldered with extension cables to reach all traffic lights under the map, ensuring a tidy and robust assembly.
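With the modules wired to the Pi's GPIO header, each light can be toggled from Python. The sketch below shows one way to drive the three red/yellow/green modules with gpiozero; the pin numbers are assumptions and must be adjusted to match your own wiring.

```python
from gpiozero import LED

# Assumed BCM pin numbers for the three traffic light modules; change to match
# how you actually wired the jumper wires and extension cables.
ROAD_PINS = {
    1: {"red": 17, "yellow": 27, "green": 22},
    2: {"red": 5,  "yellow": 6,  "green": 13},
    3: {"red": 19, "yellow": 26, "green": 21},
}

lights = {road: {color: LED(pin) for color, pin in pins.items()}
          for road, pins in ROAD_PINS.items()}

def set_green(active_road):
    """Turn one road green and hold the other two at red."""
    for road, leds in lights.items():
        leds["yellow"].off()
        if road == active_road:
            leds["red"].off()
            leds["green"].on()
        else:
            leds["green"].off()
            leds["red"].on()

if __name__ == "__main__":
    set_green(1)  # quick wiring test: Road 1 green, Roads 2 and 3 red
```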
Model Training
With the dataset ready, I trained a YOLOv8n (nano) model for 120 epochs; the nano variant is ideal for real-time inference on the Raspberry Pi 5. The model was trained to recognize all eight classes, including traffic lights, using robust augmentations for lighting, angle, and position. The trained model was exported to ONNX for deployment on the Pi. The result: high accuracy across all vehicle classes and reliable detection of both moving vehicles and static traffic lights.
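For reference, the training and export steps map to a few lines with the Ultralytics API. This is a sketch under the assumption that the Roboflow export lives at datasets/traffic/data.yaml; only the 120 epochs and the ONNX export come from the project itself, the remaining settings are defaults or assumptions.

```python
from ultralytics import YOLO

# Start from the pretrained YOLOv8 nano weights and fine-tune on the custom dataset.
model = YOLO("yolov8n.pt")
model.train(
    data="datasets/traffic/data.yaml",  # assumed path to the Roboflow export
    epochs=120,                          # as used in this project
    imgsz=640,                           # assumed input size
)

# Export the trained weights to ONNX for deployment on the Raspberry Pi 5.
model.export(format="onnx")
```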
System Integration and Live Demo
The Raspberry Pi runs a Python script that:
- Captures live video from the webcam.
- Runs real-time detection using the YOLOv8 model.
- Assigns detected vehicles (excluding traffic lights) to Road 1 (right), Road 2 (center), or Road 3 (left) based on position.
- Counts vehicles per road and detects emergency vehicles.
- Dynamically controls the traffic lights and LCD:
  - Normal Mode: cycles green/yellow/red based on the vehicle count per road (congestion logic).
  - Emergency Mode: instantly gives green to any road with an emergency vehicle and displays a clear alert on the LCD.
- Keeps the LCD live: it always shows the current vehicle count for each road (excluding traffic lights), which road is green, and a countdown timer for each phase.
- Provides a live video stream for monitoring and debugging.
Example scenario:
If a police car is detected on Road 2 and an ambulance on Road 1, both roads get green priority for as long as the emergency vehicles are present. The LCD displays "EMERG: R2 Police" or "EMERG: R1 Ambul" with live vehicle counts and timers. When no emergencies are present, the system cycles through roads based on congestion, with the LCD updating in real time.
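A condensed sketch of that per-frame logic is shown below. It assumes the exported weights are saved as best.onnx, that the three roads split the frame into left/center/right thirds, and that the class names match the assumed lowercase labels; the real script's boundaries, class names, and light/LCD updates may differ.

```python
import cv2
from ultralytics import YOLO

# Assumed class names for the emergency classes.
EMERGENCY = {"ambulance", "fire_truck", "police_car"}

model = YOLO("best.onnx")   # assumed filename for the exported ONNX weights
cap = cv2.VideoCapture(0)

def road_for(x_center, frame_width):
    """Map a detection's horizontal position to Road 3 (left), 2 (center), or 1 (right)."""
    third = frame_width / 3
    return 3 if x_center < third else 2 if x_center < 2 * third else 1

while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)[0]
    counts = {1: 0, 2: 0, 3: 0}
    emergencies = {}
    for box in results.boxes:
        label = results.names[int(box.cls)]
        if label == "traffic_light":        # traffic lights are detected but never counted
            continue
        road = road_for(float(box.xywh[0][0]), frame.shape[1])
        counts[road] += 1
        if label in EMERGENCY:
            emergencies[road] = label
    # Emergency mode: a road with an emergency vehicle goes green immediately;
    # otherwise the busiest road gets the green phase (simplified congestion logic).
    green = next(iter(emergencies)) if emergencies else max(counts, key=counts.get)
    print(counts, emergencies, "-> green road:", green)
```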
Results and Main Image

The final setup is visually impressive: a painted, engraved box housing the electronics, a neat intersection map with 3D-printed traffic light bases, and a live LCD display showing real-time traffic logic. The main image is a bright, clear shot of the whole setup in action, with the LCD and traffic lights visible and toy vehicles on the map to demonstrate detection. Additional photos show the parts, the dataset annotation process, the housing assembly, and the live detection screen.
Build It Yourself

- Gather all parts and tools (see the Supplies list above).
- Assemble the housing using laser-cut wood and 3D-printed bases.
- Wire up the traffic lights, LCD, and webcam to the Raspberry Pi.
- Download the YOLOv8n ONNX model and the Python code.
- Set up the intersection map and place vehicles/traffic lights.
- Run the Python script on the Pi and access the live stream via your browser (a minimal streaming sketch follows this list).
- Observe the LCD and traffic lights as you move vehicles around; vehicle counts and emergency alerts update live.
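For step 6, the live stream can be served with a small Flask app that pushes MJPEG frames to the browser. This is a minimal, stand-alone sketch; in the full project you would stream the annotated frames from the detection loop rather than the raw webcam feed, and the port is an assumption.

```python
import cv2
from flask import Flask, Response

app = Flask(__name__)
cap = cv2.VideoCapture(0)

def mjpeg():
    """Yield JPEG-encoded webcam frames as a multipart MJPEG stream."""
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, jpg = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n" +
               jpg.tobytes() + b"\r\n")

@app.route("/")
def stream():
    return Response(mjpeg(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)  # open http://<pi-ip>:5000 in a browser
```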
Conclusion
This smart traffic light project is a practical, real-world solution to a common urban problem. By combining computer vision, custom hardware, and clever logic, it demonstrates how accessible technology can make intersections safer and more efficient for everyone. Try building it yourself and see emergency vehicle priority in action at your own model intersection.