VR-Based Vitals Assessment Training With Haptic Feedback

In real-world medical emergencies, quickly and accurately assessing a patient’s vital signs is critical. However, most first aid and CPR training tools lack the tactile realism needed to develop proper palpation skills and diagnostic confidence. Traditional mannequins are often static and fail to simulate the subtle physiological sensations of a real patient.
This project introduces a VR-based vitals assessment training system that merges immersive visuals with realistic haptic feedback. By combining the Meta Quest 3 headset and controllers for real-time spatial tracking with a customized glove built around Drake TacHammer actuators, we offer a simulation in which trainees can physically “feel” a patient’s vitals through vibration cues.
The goal is to provide learners with a realistic and immersive training tool. Within the VR simulation, users assess two virtual patients: one with stable vital signs and another in critical condition. When the controller reaches the correct anatomical position (chest or neck), the glove vibrates to simulate a heartbeat or pulse. These cues vary in intensity depending on the patient’s condition, prompting the trainee to decide whether or not to initiate CPR.
By adding tactile feedback to virtual training, we aim to enhance learning outcomes and bridge the gap between theoretical knowledge and real-world readiness.
This project was developed by Hussein Diab and Jimmy Al Khawand as part of the Medical Devices course in the Master of Electromechanical Engineering program at KU Leuven – Group T Leuven Campus.
Supplies

To replicate the VR-based vitals assessment training system, the following components and materials are required:
- Meta Quest 3: A standalone VR headset used to track hand or controller position in 3D space.
- Unity Engine: Used to develop the virtual 3D training environment, hand interaction zones, and feedback logic.
- Arduino Micro: A microcontroller board that communicates with Unity via USB to control haptic motors.
- Drake TacHammer Actuators: High-fidelity haptic actuators that simulate heartbeat and pulse sensations with vibration patterns.
- Custom Haptic Glove: A wearable glove with actuators placed on the index fingertip (carotid pulse) and the palm (heartbeat zone).
- TCA9548A I²C Multiplexer: Lets the Arduino address multiple DRV2605 motor drivers over a single I²C bus; this is needed because every DRV2605 shares the same fixed I²C address (0x5A).
- Jumper Wires, Breadboard, and USB Cable: Basic electrical components used to wire the haptic circuit, connect the actuators, and establish serial communication between Unity and Arduino.
All hardware components were sourced from the KU Leuven Haptic Interfaces Lab inventory and assembled according to recommended electrical safety and VR integration practices.
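For reference, a plausible wiring of the haptic circuit is sketched below. The pin assignments follow the Arduino Micro's standard I²C pins (D2 = SDA, D3 = SCL) and the multiplexer channel order is illustrative, since the original wiring is not documented here:

    // Hypothetical wiring summary (not the project's documented layout)
    // Arduino Micro D2 (SDA) -> TCA9548A SDA
    // Arduino Micro D3 (SCL) -> TCA9548A SCL
    // 5V / GND               -> TCA9548A VIN / GND (and each DRV2605 VIN / GND)
    // TCA9548A SD0 / SC0     -> DRV2605 #1 -> palm actuator (heartbeat)
    // TCA9548A SD1 / SC1     -> DRV2605 #2 -> fingertip actuator (carotid pulse)
    // USB cable              -> PC running Unity (serial commands)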
Create the VR Environment in Unity

To set up the virtual training environment, we began by creating a new Unity scene configured for VR interaction with the Meta Quest 3. Development involved installing the Meta XR All-in-One SDK and the XR Interaction Toolkit, then enabling OpenXR features, including hand interaction profiles for both PC and Android. These packages provided real-time hand and controller tracking and seamless integration with Meta Quest hardware.
The simulation environment was built around two virtual patients, each represented by a "Banana Man" character model. Interactive zones were added using 3D colliders placed at anatomically accurate locations (chest and neck) to simulate the heart and pulse. These zones were tagged to detect contact from the user’s virtual hand, triggering specific serial commands ("Pulse", "Heart", etc.) via a central communication script (SerialTest). The commands were then sent to an Arduino board to activate the appropriate haptic feedback. To reinforce the training objectives, a UI system with feedback buttons and real-time scenario-specific responses was implemented, accompanied by character animations and breathing sounds for added realism.
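As an illustration, the zone logic can be a small trigger script attached to each collider. The sketch below is an assumption based on the description above: the SerialTest interface, its Send method, and the "PlayerHand" tag are illustrative names, not the project's exact code.

    // VitalZone.cs -- hypothetical sketch of one interactive vital-sign zone.
    // Assumes the GameObject has a collider with "Is Trigger" enabled, and that
    // the tracked hand/controller carries a tagged collider plus a Rigidbody
    // (Unity only fires trigger events when one object in the pair has one).
    using UnityEngine;

    public class VitalZone : MonoBehaviour
    {
        public string command = "Pulse"; // "Pulse", "Heart", "FaintPulse", or "FaintHeart"
        public SerialTest serial;        // reference to the central communication script

        private void OnTriggerEnter(Collider other)
        {
            if (other.CompareTag("PlayerHand"))
                serial.Send(command);    // forward the command to the Arduino
        }
    }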
To enable VR testing and development, the Meta Quest 3 headset was connected to the PC through the Meta Quest Link application, which provides a stable interface with the Unity Editor. Using the application's wireless Air Link mode, developers could interact with the VR scene in real time and verify the alignment between hand movements and in-game feedback. The headset’s built-in controller tracking continuously monitored the user’s hand proximity to the anatomical zones marked by 3D colliders; when an overlap occurred, the C# logic scripts sent the corresponding serial commands to the Arduino, activating haptic feedback. This setup allowed for accurate, wireless, and portable VR training deployment without external sensors.
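The central SerialTest script itself can be a thin wrapper around .NET's SerialPort. The following is a minimal sketch under assumptions: the port name, baud rate, and Send method are illustrative, and Unity's API Compatibility Level must be set so that System.IO.Ports is available.

    // SerialTest.cs -- minimal sketch of the central serial communication script.
    using System.IO.Ports;
    using UnityEngine;

    public class SerialTest : MonoBehaviour
    {
        public string portName = "COM3"; // adjust to the Arduino's serial port
        public int baudRate = 9600;      // must match Serial.begin() on the Arduino

        private SerialPort port;

        private void Awake()
        {
            port = new SerialPort(portName, baudRate);
            port.Open();
        }

        public void Send(string command)
        {
            if (port != null && port.IsOpen)
                port.WriteLine(command); // newline-terminated; the Arduino reads until '\n'
        }

        private void OnApplicationQuit()
        {
            if (port != null && port.IsOpen)
                port.Close();
        }
    }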
Integrate Haptics With Arduino

Unity communicates with the Arduino via USB serial connection, where each vital sign zone in the simulation is assigned a unique command (e.g., “Pulse”, “Heart”, “FaintPulse”, “FaintHeart”). When the user’s hand enters a target zone, Unity sends the appropriate command to the Arduino. The Arduino parses the input and uses a multiplexer-controlled DRV2605 driver to activate the corresponding TacHammer actuator. Two types of haptic feedback patterns are defined: a steady dual-pulse for normal vitals and a fainter pulse for critical conditions. This setup enables realistic, location-specific vibrations that match the simulated physiological state.
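A condensed firmware sketch of this logic is shown below. It assumes the Adafruit DRV2605 library, a TCA9548A at its default address (0x70), and illustrative mux channels and waveform IDs (effects 10 and 11 are the strong and soft "double click" entries in the DRV2605's built-in effect library), so treat it as a starting point rather than the project's exact code.

    // Hypothetical firmware sketch: route Unity's serial commands to the right actuator.
    #include <Wire.h>
    #include <Adafruit_DRV2605.h>

    Adafruit_DRV2605 drv;
    const uint8_t TCA_ADDR  = 0x70; // default TCA9548A I2C address
    const uint8_t CH_PALM   = 0;    // mux channel of the heartbeat driver (assumed)
    const uint8_t CH_FINGER = 1;    // mux channel of the pulse driver (assumed)

    // Enable exactly one downstream I2C channel on the multiplexer
    void tcaSelect(uint8_t channel) {
      Wire.beginTransmission(TCA_ADDR);
      Wire.write(1 << channel);
      Wire.endTransmission();
    }

    // Play one effect from the DRV2605's built-in waveform library
    void playEffect(uint8_t channel, uint8_t effect) {
      tcaSelect(channel);
      drv.setWaveform(0, effect); // slot 0: the effect to play
      drv.setWaveform(1, 0);      // slot 1: end of sequence
      drv.go();
    }

    void setup() {
      Serial.begin(9600);
      Wire.begin();
      const uint8_t channels[] = { CH_PALM, CH_FINGER };
      for (uint8_t i = 0; i < 2; i++) {
        tcaSelect(channels[i]);
        drv.begin();
        drv.selectLibrary(1);              // built-in effect library
        drv.setMode(DRV2605_MODE_INTTRIG); // effects fired in software with go()
      }
    }

    void loop() {
      if (Serial.available()) {
        String cmd = Serial.readStringUntil('\n');
        cmd.trim();
        if      (cmd == "Heart")      playEffect(CH_PALM,   10); // double click, strong
        else if (cmd == "FaintHeart") playEffect(CH_PALM,   11); // double click, faint
        else if (cmd == "Pulse")      playEffect(CH_FINGER, 10);
        else if (cmd == "FaintPulse") playEffect(CH_FINGER, 11);
      }
    }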
Build the Haptic Glove

The haptic glove was constructed using only available lab materials and a standard fabric glove. The wires of the TacHammer actuators were extended to allow flexible placement, and two motors were affixed directly to the glove: one on the index fingertip to simulate the carotid pulse and another on the palm to simulate the heartbeat. Both actuators were secured using adhesive, ensuring reliable skin contact for effective tactile feedback during interaction with the virtual patient.
For more information on any part of the project, refer to the attached PDF.