VisualVroom is an accessibility application designed to help users with hearing impairments detect and identify approaching vehicles and emergency sirens through audio analysis and visual/haptic feedback. The system consists of an Android mobile application and a Wear OS app that work together to provide real-time alerts about vehicle sounds through visual cues and smartwatch vibration patterns.
- Real-time Vehicle Sound Detection: Identifies sirens, horns, and bicycle sounds
- Direction Identification: Shows whether sounds are coming from the left or right
- Visual Alerts: Clear visual indicators showing vehicle type and direction
- Speech-to-Text: Additional accessibility feature for transcribing speech
- Continuous Audio Monitoring: Background service that processes audio in intervals
- Haptic Feedback: Provides distinct vibration patterns based on the type of vehicle detected
- Direction Indication: Communicates the direction of approaching vehicles through different vibration patterns
- Low-power Operation: Minimizes battery consumption while maintaining connectivity
- Standalone UI: Simple interface showing connection status
- Audio Processing: Stereo channel recording with amplitude-based direction detection
- Machine Learning Integration: Connects to a Python backend that runs a Vision Transformer model for classification
- Continuous Monitoring: Records and analyzes audio in 3-5 second intervals
- Low Resource Usage: Optimized for battery efficiency during continuous usage
- Wearable Messaging API: Uses Google's Wearable Message API for reliable communication
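The amplitude-based direction detection mentioned above can be sketched in a few lines. This is a simplified illustration, not the project's actual implementation (which lives in `AudioRecorder.java`): it assumes 16-bit interleaved stereo PCM and compares average channel loudness.

```java
// Sketch of amplitude-based direction detection on interleaved stereo PCM.
// Hypothetical helper class; the real logic lives in AudioRecorder.java.
public class DirectionDetector {

    /** Returns "LEFT" or "RIGHT" depending on which channel is louder. */
    public static String detectDirection(short[] interleavedStereo) {
        long leftSum = 0, rightSum = 0;
        // Interleaved stereo: even indices = left channel, odd = right channel.
        for (int i = 0; i + 1 < interleavedStereo.length; i += 2) {
            leftSum += Math.abs(interleavedStereo[i]);
            rightSum += Math.abs(interleavedStereo[i + 1]);
        }
        return leftSum >= rightSum ? "LEFT" : "RIGHT";
    }

    public static void main(String[] args) {
        // Left channel clearly louder than right.
        short[] louderLeft = {1000, 100, 900, 50, 1100, 80};
        System.out.println(detectDirection(louderLeft)); // prints LEFT
    }
}
```

In practice the comparison would run over each recorded interval and could be smoothed across windows to avoid flapping between directions.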
- Custom Vibration Patterns:
  - Sirens: Rapid, urgent patterns
  - Bicycle bells: Gentle, repeating patterns
  - Car horns: Strong, attention-grabbing patterns
- Built with Wear OS Design Principles: Follows material design for wearables
- Android 11+ (API level 30) for mobile app
- Wear OS 3.0+ for watch app
- Stereo microphone support on mobile device
- Bluetooth connectivity between phone and watch
- Clone the repository
- Open the project in Android Studio Iguana or later
- Update the server URL in `AudioRecorder.java` if using a custom backend
- Build and run the application
- The Wear OS app will be automatically installed on your paired watch when you install the mobile app
- Alternatively, you can manually install the Wear OS APK from the release page
- `/mobile/src/main/java/edu/skku/cs/visualvroomandroid/`
  - `MainActivity.java`: Main application entry point and tab controller
  - `AudioRecorderFragment.java`: UI for sound detection and visualization
  - `SpeechToTextFragment.java`: Speech transcription functionality
  - `AudioRecorder.java`: Core audio recording and processing
  - `AudioRecordingService.java`: Background service for continuous monitoring
  - `WearNotificationService.java`: Handles Wear OS communication
- `/wear/src/main/java/edu/skku/cs/visualvroomandroid/presentation/`
  - `MainActivity.java`: Primary entry point and message receiver
- `/wear/src/main/res/layout/`
  - `activity_main.xml`: Main UI layout
  - `activity_main-round.xml`: Round watch optimization
- `RECORD_AUDIO`: For sound detection
- `INTERNET`: For backend communication
- `ACCESS_FINE_LOCATION` and `ACCESS_COARSE_LOCATION`: For future location-aware features
- `FOREGROUND_SERVICE_MICROPHONE`: For the Android 14+ foreground service
- `POST_NOTIFICATIONS`: For Android 13+ notifications
- `VIBRATE`: For providing haptic feedback
- `WAKE_LOCK`: To ensure alerts are delivered even when the watch screen is off
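For reference, permissions like these are declared in each module's `AndroidManifest.xml`. A sketch of the mobile module's declarations (check the actual manifest in the repository for the authoritative list):

```xml
<!-- Sketch of the mobile module's permission declarations. -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<!-- Required on Android 14+ for a microphone-type foreground service -->
<uses-permission android:name="android.permission.FOREGROUND_SERVICE_MICROPHONE" />
<!-- Required on Android 13+ to post notifications -->
<uses-permission android:name="android.permission.POST_NOTIFICATIONS" />
```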
The mobile app communicates with a PyTorch-based backend running a Vision Transformer model. The backend processes audio spectrograms and returns:
- Vehicle type classification
- Direction prediction
- Confidence score
- Notification decision
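A backend response carrying these four fields might look like the following. The exact key names here are assumptions for illustration; the authoritative format is defined by the Python backend:

```json
{
  "vehicle_type": "siren",
  "direction": "left",
  "confidence": 0.93,
  "should_notify": true
}
```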
The system uses the Wearable Message API to send alerts from the phone to the watch when a vehicle is detected with high confidence.
- `/vibration`: Triggers the watch to vibrate
- `/vehicle_alert`: Contains detailed information about detected vehicles
Messages use a JSON format containing:
- Vehicle type
- Direction
- Vibration pattern specifications
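A phone-side payload with these fields could be assembled as below. This is a hedged sketch: the key names and the helper class are hypothetical, and the real serialization is handled in `WearNotificationService.java`. The resulting string would be sent as UTF-8 bytes on the `/vehicle_alert` path via the Wearable Message API.

```java
// Sketch of building the JSON payload for the /vehicle_alert message path.
// Key names are assumptions; see WearNotificationService.java for the real format.
public class AlertMessage {
    public static final String VEHICLE_ALERT_PATH = "/vehicle_alert";

    /** Builds the JSON string sent from phone to watch. */
    public static String buildPayload(String vehicleType, String direction, long[] pattern) {
        StringBuilder timings = new StringBuilder();
        for (int i = 0; i < pattern.length; i++) {
            if (i > 0) timings.append(',');
            timings.append(pattern[i]);
        }
        return String.format(
            "{\"vehicle_type\":\"%s\",\"direction\":\"%s\",\"vibration_pattern\":[%s]}",
            vehicleType, direction, timings);
    }

    public static void main(String[] args) {
        // On Android, this string's bytes would be passed to
        // Wearable.getMessageClient(context).sendMessage(nodeId, VEHICLE_ALERT_PATH, bytes).
        System.out.println(buildPayload("siren", "left", new long[]{0, 100, 100}));
    }
}
```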
Different patterns are used based on the vehicle type:
- Siren: Urgent pattern with short pulses (100ms on, 100ms off, 100ms on, 100ms off, 300ms on)
- Bike: Moderate pattern with medium pulses (200ms on, 200ms off, 200ms on)
- Horn: Alert pattern with longer pulses (400ms on, 200ms off, 400ms on)
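The timings above map naturally onto the waveform arrays used by Android's `VibrationEffect.createWaveform`, where the first element is an initial delay and subsequent entries alternate on/off durations. A sketch (the helper class is hypothetical; the watch app's actual mapping may differ):

```java
// Sketch mapping vehicle types to VibrationEffect.createWaveform-style timing arrays.
// First element = initial delay before vibrating (0 = start immediately);
// remaining entries alternate vibrate/pause, matching the durations listed above.
public class VibrationPatterns {

    public static long[] patternFor(String vehicleType) {
        switch (vehicleType) {
            case "siren": // 100ms on, 100ms off, 100ms on, 100ms off, 300ms on
                return new long[]{0, 100, 100, 100, 100, 300};
            case "bike":  // 200ms on, 200ms off, 200ms on
                return new long[]{0, 200, 200, 200};
            case "horn":  // 400ms on, 200ms off, 400ms on
                return new long[]{0, 400, 200, 400};
            default:      // fallback: single medium pulse (an assumption)
                return new long[]{0, 300};
        }
    }

    public static void main(String[] args) {
        // On the watch, the array would be played with
        // vibrator.vibrate(VibrationEffect.createWaveform(patternFor("siren"), -1));
        System.out.println(patternFor("siren").length); // prints 6
    }
}
```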
- Launch the application
- Navigate between the "Sound Detection" and "Speech to Text" tabs
- On the Sound Detection tab, press the microphone button to start monitoring for vehicle sounds
- When a vehicle is detected, the app will display the type and direction
- Use the "Vibrate Watch" button to test the connection with your Wear OS device
- The watch app runs automatically in the background
- The main screen shows the current status of the connection
- When a vehicle is detected by the phone, the watch will vibrate with the appropriate pattern
- Ensure Bluetooth is enabled on both devices
- Check that the mobile app is running
- Verify the watch is properly paired with the phone
- Disable battery optimization for the app on both devices
- Ensure microphone permissions are granted
- Check that the device has a stereo microphone
- Position the phone with a clear line of sight to potential sound sources
- Increase volume or adjust the phone position if sounds are too quiet