An interactive 3D art framework combining hand gesture recognition with a nostalgic Y2K aesthetic. This project enables users to manipulate 3D objects in a Blender environment using intuitive hand gestures captured through a webcam, inspired by futuristic interfaces from films like Minority Report.
This project gained attention beyond the dev community: our interactive hand-tracking demo reached 200,000+ views on Instagram, resonating with digital artists, designers, and tech enthusiasts alike.
- Real-time hand tracking and gesture recognition using MediaPipe
- Interactive 3D environment with Y2K-inspired visuals
- Two-hand gesture support for advanced interactions
- Intuitive gestures: point to select, pinch to move, and more
- RGB color plane separation effects
- Sound feedback for interactions
- Client-server architecture connecting Python (vision) and Blender (3D)
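The two processes talk over a local socket: the Python vision side sends gesture events, and the Blender script listens for them. As a rough sketch of that pattern, here is a minimal UDP round trip; the JSON message shape and the `send_gesture` helper are illustrative assumptions, not the project's actual protocol (which lives in `hand_tracking.py` and `blender_listener.py`).

```python
import json
import socket

def send_gesture(sock, address, gesture, payload):
    """Serialize a gesture event as JSON and send it over UDP."""
    message = json.dumps({"gesture": gesture, **payload}).encode("utf-8")
    sock.sendto(message, address)

# A throwaway listener stands in for blender_listener.py's receive loop.
# Binding to port 0 lets the OS pick a free port; the project defaults to 5006.
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", 0))
address = listener.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_gesture(sender, address, "pinch", {"x": 0.42, "y": 0.17})

data, _ = listener.recvfrom(1024)
event = json.loads(data)
print(event["gesture"])  # prints: pinch
```

UDP suits this kind of real-time control link well: a dropped frame of gesture data is harmless, and there is no connection state to manage between the two processes.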
- Python 3.7+
- Blender 2.93+ (3.x recommended)
- Webcam with clear view of your hands
1. Clone the repository:

   ```bash
   git clone https://github.com/NathanKneT/RTHT-3D
   cd RTHT-3D
   ```

2. Install the required Python dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Create the necessary directories:

   ```bash
   mkdir -p sounds images
   ```

4. Start the Blender environment:

   ```bash
   blender sandbox.blend
   ```

5. In Blender, open the Scripting tab and run `blender_listener.py`.

6. In a separate terminal, run the hand tracking module:

   ```bash
   python hand_tracking.py
   ```

7. Position your hands in view of the webcam and start interacting!
| Gesture | Hands | Action |
|---|---|---|
| Point (index finger) | One | Select object |
| Pinch (thumb + index) | One | Move selected object |
| Pinch | Two | Rotate and scale object |
| V Sign | Two | Duplicate selected object |
| Palm | Two | Create new object |
| Fist | Two | Delete selected object |
| Palm + Pinch | Two | Toggle RGB separation effect |
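Gestures like pinch are typically classified from MediaPipe's 21 normalized hand landmarks. The sketch below shows one common approach, thresholding the distance between the thumb and index fingertips; the landmark indices are MediaPipe's real thumb-tip/index-tip IDs, but the threshold value and the `is_pinch` helper are illustrative assumptions, not the project's actual detector.

```python
import math

# MediaPipe Hands landmark indices for the two fingertips involved in a pinch.
THUMB_TIP = 4
INDEX_TIP = 8

def is_pinch(landmarks, threshold=0.05):
    """Return True when the thumb tip and index tip are close together.

    `landmarks` is a list of 21 (x, y) points normalized to [0, 1],
    as MediaPipe Hands reports them.
    """
    tx, ty = landmarks[THUMB_TIP]
    ix, iy = landmarks[INDEX_TIP]
    return math.hypot(tx - ix, ty - iy) < threshold

# Toy landmark sets: 21 points each, only the two fingertips matter here.
open_hand = [(0.0, 0.0)] * 21
open_hand[THUMB_TIP] = (0.30, 0.50)
open_hand[INDEX_TIP] = (0.60, 0.40)

pinching = [(0.0, 0.0)] * 21
pinching[THUMB_TIP] = (0.42, 0.50)
pinching[INDEX_TIP] = (0.43, 0.51)

print(is_pinch(open_hand))  # False
print(is_pinch(pinching))   # True
```

Because the landmarks are normalized to the image frame, a fixed threshold like this works at any camera resolution, though it can drift with hand distance from the camera.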
```
project/
├── hand_tracking.py        # Hand tracking and gesture recognition module
├── Blender/
│   ├── sounds/             # Sound effect files (not provided)
│   ├── images/             # Custom images for texture mapping (not provided)
│   ├── blender_listener.py # Blender script for 3D environment and UDP listener
│   └── sandbox.blend       # Blender sandbox scene for testing
├── requirements.txt        # Python dependencies
├── docs/                   # Documentation resources
└── examples/               # Example configurations and outputs
```
Place your images in the `images/` directory to have them automatically loaded as textures for the 3D planes in Blender.
Modify the `create_y2k_material()` function in `blender_listener.py` to customize:
- Colors and glow intensity
- Transparency effects
- Material properties
By default, the system uses `localhost:5006` for communication. To change it:
- Update `blender_address` in `hand_tracking.py`
- Update `HOST` and `PORT` in `blender_listener.py`
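For example, to point both sides at a different port (and listen on all interfaces), the two settings might look like this; the variable names follow the README, but the exact surrounding code in each file may differ:

```python
# In hand_tracking.py (sender side) -- illustrative values:
blender_address = ("192.168.1.20", 6000)

# In blender_listener.py (receiver side) -- illustrative values:
HOST = "0.0.0.0"   # listen on all interfaces, not just localhost
PORT = 6000        # must match the port in blender_address above
```

Keep the two ports in sync; a mismatch simply means Blender never receives the gesture events, with no error raised on either side (UDP sends do not fail when nobody is listening).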
Contributions are welcome! See CONTRIBUTING.md for detailed guidelines.
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Make your changes
- Run the linter: `flake8 *.py`
- Commit your changes: `git commit -m 'Add amazing feature'`
- Push to the branch: `git push origin feature/amazing-feature`
- Open a Pull Request
This project follows PEP 8 guidelines. Please ensure your code is properly formatted using `black .` before committing.
- Socket Error: Ensure no other application is using port 5006
- No Hand Detection: Check lighting conditions and camera position
- Blender Not Responding: Verify the Blender script is running properly
- Missing Libraries: Run `pip install -r requirements.txt` again
For more detailed troubleshooting, see the Troubleshooting Guide.
- Additional gesture support
- VR/AR integration
- Animation recording and playback
- Custom shader effects library
This project is licensed under the MIT License - see the LICENSE file for details.
- MediaPipe team for the hand tracking technology
- Blender Foundation for the 3D creation platform
- All contributors who have helped shape this project
Project Link: https://github.com/NathanKneT/RTHT-3D
Join our Discord GLHF community for discussions and support!
