From neurons to flight paths
An accessible framework for controlling drones using computer vision and brain signals. Control drones with hand gestures, head movements, or real EEG motor imagery. No expensive hardware required - just a webcam and Python.
Started as a BCI research project in 2018 (MATLAB + $800 EEG headset + $300 drone). Rebuilt from the ground up in 2025 with modern ML so anyone can try it with just a webcam.
Original 2018 version: brain-computer-interface-for-drones
Control a drone by moving your fist - like dragging a cursor!
python demos/hand_gesture_demo.py
Move your fist around the screen, drone follows!
Gestures:
- Make a fist and move it → Drone follows your fist position!
- Move fist to upper screen → Drone moves forward
- Move fist to lower screen → Drone moves backward
- Move fist to left side → Drone strafes left
- Move fist to right side → Drone strafes right
- Keep fist in center → Drone hovers
- Open palm (5 fingers) → Takeoff/Land
Tips:
- Start with fist in center of screen
- Move fist slowly at first
- The further from center, the faster the drone moves
How it works: Uses Mediapipe Hands to track your wrist position in real-time. Most natural and intuitive control method - no need to think about specific gestures!
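In sketch form, the tracking step looks roughly like this (using the Mediapipe Hands solution API; this is not the project's actual hand_detector.py, and window handling is omitted):

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
cap = cv2.VideoCapture(0)

with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Mediapipe expects RGB; OpenCV captures BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            wrist = results.multi_hand_landmarks[0].landmark[mp_hands.HandLandmark.WRIST]
            # Normalized [0, 1] coordinates; these feed the control mapping
            print(f"wrist at ({wrist.x:.2f}, {wrist.y:.2f})")
        if cv2.waitKey(1) & 0xFF == 27:  # ESC quits
            break

cap.release()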
Control the drone with head movements - no hands needed!
python demos/head_gesture_demo.py
Look down to fly forward, tilt head to strafe - hands-free control
Controls:
- Look down → Drone moves forward
- Look up → Drone moves backward
- Turn left → Drone rotates left
- Turn right → Drone rotates right
- Tilt left → Drone strafes left
- Tilt right → Drone strafes right
- SPACE → Takeoff/Land
- ESC → Exit
How it works: Uses Mediapipe Face Mesh to track head pose (pitch/yaw/roll). Like BCI, but using neck muscles instead of reading brain signals. Quirky, effective, and great for accessibility.
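A rough sketch of the tracking step with the Mediapipe Face Mesh API (not the project's head_detector.py; the landmark indices and the crude yaw proxy below are illustrative, and a full pitch/yaw/roll estimate typically solves a PnP problem against a 3D face model):

import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh
cap = cv2.VideoCapture(0)

with mp_face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True) as face_mesh:
    ok, frame = cap.read()
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        nose, left_eye, right_eye = lm[1], lm[33], lm[263]  # commonly used indices
        # Crude yaw proxy: how far the nose tip sits from the midpoint of the eyes
        yaw_proxy = nose.x - (left_eye.x + right_eye.x) / 2

cap.release()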
Train a PyTorch model on real EEG data and control the drone with imagined hand movements.
# 1. Download dataset and train model (takes ~10 minutes)
python demos/train_model.py
# 2. Run the demo
python demos/motor_imagery_demo.py
Real brain signals controlling a drone - no physical movement required
Controls:
- SPACE → Takeoff/Land
- R → Classify random EEG epoch and execute command
- ESC → Exit
How it works: Uses the PhysioNet Motor Movement/Imagery Dataset to train an EEGNet model with residual connections that classifies motor imagery (imagined left/right hand movements) from EEG signals. Achieves 73% cross-subject accuracy with 17 channels.
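Conceptually, pressing R does something like the following (a hypothetical sketch; the checkpoint path, epoch file, and class-to-command mapping are illustrative and not the project's actual API):

import numpy as np
import torch

model = torch.load("models/eegnet_residual.pt", map_location="cpu", weights_only=False)  # assumed full-model checkpoint
model.eval()

epoch = np.load("data/sample_epoch.npy")          # (17 channels, 480 samples) = 3 s at 160 Hz
x = torch.from_numpy(epoch).float()[None, None]   # add batch and "image" dims

with torch.no_grad():
    pred = model(x).argmax(dim=1).item()

command = {0: "strafe_left", 1: "strafe_right"}[pred]  # imagined left vs. right hand
print(f"Motor imagery -> {command}")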
Three Control Modes:
- Hand Gesture: Fist-following control (most intuitive)
- Head Gesture: Hands-free control (accessibility)
- Motor Imagery: Real EEG classification (research-grade, 73% accuracy)
Modern ML Stack:
- PyTorch 2.0+ with CUDA support
- EEGNet with residual connections for BCI
- Real-time inference (60 FPS)
Simulated Drone:
- Physics-based movement
- Real-time visualization
- No hardware required for testing
Production Ready:
- Clean, modular architecture
- Type hints throughout
- Pre-commit hooks with Ruff
- Comprehensive configs
- Proper package installation (setup.py)
Requirements:
- Python 3.10 or higher
- Webcam (for hand/head gesture demos)
- ~2GB disk space (for EEG dataset)
- GPU recommended for EEG training (optional)
# Clone the repository
git clone https://github.com/yourusername/NeuralFlight.git
cd NeuralFlight
# Create virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install package (recommended!)
pip install -e .
# Or just install dependencies
pip install -r requirements.txt
# Install pre-commit hooks (for development)
pre-commit install
That's it! The package is now installed and you can run demos from anywhere.
See INSTALLATION.md for more details.
After pip install -e ., you can run from anywhere:
# Hand gesture control
neuralflight-hand
# Head gesture control
neuralflight-head
# Motor imagery EEG control
neuralflight-eeg
# Train EEG model
neuralflight-train
Or run the demo scripts directly:
# Hand gesture control
python demos/hand_gesture_demo.py
# Head gesture control
python demos/head_gesture_demo.py
# Train EEG model
python demos/train_model.py
# Motor imagery demo
python demos/motor_imagery_demo.py
NeuralFlight/
├── src/neuralflight/
│   ├── simulator/     # Pygame-based drone simulator
│   ├── gestures/      # Hand & head gesture detection (Mediapipe)
│   ├── eeg/           # EEG data loading and preprocessing
│   ├── models/        # PyTorch neural networks (EEGNet + Residual)
│   ├── controllers/   # High-level drone control abstraction
│   └── utils/         # Config loading, logging, etc.
├── demos/             # Runnable demos
├── config/            # YAML configuration files
├── notebooks/         # Jupyter notebooks for analysis
└── tests/             # Unit tests
Simulator (src/neuralflight/simulator/drone_sim.py)
- Physics-based drone movement
- Real-time pygame visualization
- Supports all standard drone commands
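As a toy illustration of what "physics-based movement" means here (not the actual drone_sim.py), think of velocity commands integrated every frame with simple damping:

from dataclasses import dataclass, field

@dataclass
class ToyDrone:
    pos: list = field(default_factory=lambda: [0.0, 0.0, 0.0])  # x, y, z
    vel: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
    damping: float = 0.9  # friction-like decay applied each step

    def command(self, ax: float, ay: float, az: float) -> None:
        # Accelerations come from whichever controller is active (hand / head / EEG)
        self.vel = [v + a for v, a in zip(self.vel, (ax, ay, az))]

    def step(self, dt: float = 1 / 60) -> None:
        self.vel = [v * self.damping for v in self.vel]
        self.pos = [p + v * dt for p, v in zip(self.pos, self.vel)]

drone = ToyDrone()
drone.command(0.0, 1.0, 0.0)  # e.g. a "forward" impulse
for _ in range(60):           # one simulated second at 60 FPS
    drone.step()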
Hand Gesture Detector (src/neuralflight/gestures/hand_detector.py)
- Mediapipe Hands for tracking
- Wrist position tracking (works with closed fist!)
- Fist detection via fingertip-to-wrist distance
- Open palm detection for takeoff/land
Head Gesture Detector (src/neuralflight/gestures/head_detector.py)
- Mediapipe Face Mesh for pose estimation
- Temporal smoothing for stable control
- Configurable gesture thresholds
EEG Pipeline (src/neuralflight/eeg/)
- PhysioNet dataset downloader
- Bandpass filtering (8-30 Hz for motor imagery)
- MNE-Python integration
- Subject-level train/val split (prevents cross-subject data leakage; see the sketch below)
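The split in one (illustrative) function: hold out whole subjects rather than shuffling individual epochs, so no person's EEG appears in both sets:

def split_by_subject(all_epochs, n_train_subjects=19):
    """all_epochs: list of (subject_id, epoch_array, label) tuples."""
    subject_ids = sorted({subject for subject, _, _ in all_epochs})
    train_ids = set(subject_ids[:n_train_subjects])
    train = [e for e in all_epochs if e[0] in train_ids]
    val = [e for e in all_epochs if e[0] not in train_ids]
    return train, val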
EEGNet with Residuals (src/neuralflight/models/eegnet_residual.py)
- Compact CNN for EEG classification
- Based on Lawhern et al. (2018) + residual connections
- ~10K parameters, trains in 10-15 minutes
- 73% cross-subject accuracy (17 channels)
This project is a complete modernization of my original 2018 BCI drone controller, which used:
- MATLAB for signal processing
- Node.js for drone control
- Emotiv EPOC+ headset ($800)
- AR Parrot drone ($300)
What's changed in 2025:
- Pure Python (no MATLAB)
- PyTorch for deep learning
- Modern EEG processing with MNE
- Accessible demos without expensive hardware
- Clean, maintainable architecture
- Open-source datasets
- Residual connections for better accuracy
- Proper package installation
Dataset: PhysioNet Motor Movement/Imagery
- 109 subjects, 64-channel EEG
- Motor imagery tasks: left hand, right hand, feet, fists
- 160 Hz sampling rate
Preprocessing:
- Bandpass filter: 8-30 Hz (alpha/beta bands)
- Channels: 17 (FC3-FC4, C5-C6, CP3-CP4 - full motor cortex coverage)
- Epoch length: 3 seconds
- Subject-level split (19 train, 5 validation)
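A condensed sketch of this pipeline using MNE-Python and its PhysioNet EEGBCI loader (run numbers, event names, and the exact channel list are illustrative, not necessarily the project's defaults):

import mne
from mne.datasets import eegbci

subject, runs = 1, [4, 8, 12]                      # imagined left vs. right hand runs
raws = [mne.io.read_raw_edf(f, preload=True) for f in eegbci.load_data(subject, runs)]
raw = mne.concatenate_raws(raws)
eegbci.standardize(raw)                            # normalize channel names (e.g. "Fc3." -> "FC3")

raw.filter(8.0, 30.0)                              # alpha/beta band for motor imagery

events, event_id = mne.events_from_annotations(raw)
epochs = mne.Epochs(raw, events, event_id={"T1": event_id["T1"], "T2": event_id["T2"]},
                    tmin=0.0, tmax=3.0, baseline=None, preload=True)
motor_channels = ["FC3", "FC1", "FCz", "FC2", "FC4",
                  "C5", "C3", "C1", "Cz", "C2", "C4", "C6",
                  "CP3", "CP1", "CPz", "CP2", "CP4"]
epochs.pick(motor_channels)                        # 17 channels over the motor cortex
X, y = epochs.get_data(), epochs.events[:, -1]     # (n_epochs, 17, n_samples), labels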
Model: EEGNet with Residual Connections
- Architecture: Temporal conv → Depthwise spatial conv → Separable conv (with skip connection) → FC layers
- Parameters: ~10,000
- Training: ~100 epochs, Adam optimizer
- Accuracy: 73% cross-subject validation (excellent for 17 channels!)
- Training time: ~15 minutes on GPU, ~45 minutes on CPU
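A simplified PyTorch sketch of this architecture (layer widths and kernel sizes are illustrative and the real eegnet_residual.py will differ in detail; the residual add requires the separable block to preserve shape, hence padding="same" and matching widths):

import torch
import torch.nn as nn

class EEGNetResidual(nn.Module):
    def __init__(self, n_channels=17, n_classes=2, F1=8, D=2):
        super().__init__()
        F2 = F1 * D  # keep widths equal so the skip connection adds cleanly
        self.temporal = nn.Sequential(          # temporal conv over the time axis
            nn.Conv2d(1, F1, (1, 64), padding="same", bias=False),
            nn.BatchNorm2d(F1),
        )
        self.spatial = nn.Sequential(           # depthwise spatial conv across electrodes
            nn.Conv2d(F1, F2, (n_channels, 1), groups=F1, bias=False),
            nn.BatchNorm2d(F2), nn.ELU(),
            nn.AvgPool2d((1, 4)), nn.Dropout(0.5),
        )
        self.separable = nn.Sequential(         # separable conv (depthwise + pointwise)
            nn.Conv2d(F2, F2, (1, 16), padding="same", groups=F2, bias=False),
            nn.Conv2d(F2, F2, 1, bias=False),
            nn.BatchNorm2d(F2), nn.ELU(),
        )
        self.head = nn.Sequential(nn.AvgPool2d((1, 8)), nn.Dropout(0.5),
                                  nn.Flatten(), nn.LazyLinear(n_classes))

    def forward(self, x):                       # x: (batch, 1, n_channels, n_samples)
        x = self.spatial(self.temporal(x))
        x = x + self.separable(x)               # residual (skip) connection
        return self.head(x)

model = EEGNetResidual()
logits = model(torch.randn(4, 1, 17, 480))      # 3 s of 17-channel EEG at 160 Hz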
Why 73% is good:
- Cross-subject (not person-specific) = harder task
- 17 channels (not full 64-channel cap) = less data
- Motor imagery (imagined movement) = weak signals
- Comparable to published research papers
Tracking: Mediapipe Hands
- 21 hand landmarks
- Tracks wrist position (not just fingertips!)
Gesture Recognition:
- Fist detection: Measures average fingertip-to-wrist distance
- Distance < 0.15 = fist detected
- Open palm detection: All fingers extended far from wrist
- Distance > 0.2 for all fingers = palm detected
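The same heuristic in sketch form, using Mediapipe's landmark indexing (0 = wrist, 4/8/12/16/20 = fingertips) and the thresholds above:

FINGERTIPS = [4, 8, 12, 16, 20]

def tip_to_wrist(landmarks, i):
    wrist = landmarks[0]
    return ((landmarks[i].x - wrist.x) ** 2 + (landmarks[i].y - wrist.y) ** 2) ** 0.5

def classify_hand(landmarks):
    mean_dist = sum(tip_to_wrist(landmarks, i) for i in FINGERTIPS) / len(FINGERTIPS)
    if mean_dist < 0.15:
        return "fist"        # fingertips curled in toward the wrist
    if all(tip_to_wrist(landmarks, i) > 0.2 for i in FINGERTIPS):
        return "open_palm"   # every finger extended well away from the wrist
    return "unknown"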
Control Mapping:
- Wrist position relative to screen center
- Deadzone threshold: 0.10 (configurable)
- Movement intensity scales with distance from center
Filtering:
- Temporal smoothing (5-frame window)
- Adaptive dimension matching for edge cases
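Putting the mapping and filtering above together, a sketch of the position-to-command logic (names and the exact intensity scaling are illustrative):

from collections import deque

DEADZONE = 0.10
history = deque(maxlen=5)                 # 5-frame temporal smoothing window

def wrist_to_command(x, y):
    """x, y are normalized wrist coordinates in [0, 1]; (0.5, 0.5) is screen center."""
    history.append((x, y))
    sx = sum(p[0] for p in history) / len(history)
    sy = sum(p[1] for p in history) / len(history)
    dx, dy = sx - 0.5, sy - 0.5
    if abs(dx) < DEADZONE and abs(dy) < DEADZONE:
        return "hover", 0.0
    if abs(dy) >= abs(dx):
        # upper half of the frame → forward, lower half → backward
        return ("forward" if dy < 0 else "backward"), min(abs(dy) / 0.5, 1.0)
    return ("strafe_left" if dx < 0 else "strafe_right"), min(abs(dx) / 0.5, 1.0)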
Tracking: Mediapipe Face Mesh
- 468 facial landmarks
- Key points: nose, eyes, chin
- Calculates pitch, yaw, roll angles
Gesture Mapping:
- Pitch (±10°) → Forward/Backward
- Yaw (±15°) → Rotate Left/Right
- Roll (±10°) → Strafe Left/Right
Filtering:
- Temporal smoothing (5-frame window)
- Configurable dead zones
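In code, the mapping above boils down to a few comparisons (a sketch; the sign conventions are assumptions and would follow whatever the pose estimator returns):

PITCH_T, YAW_T, ROLL_T = 10.0, 15.0, 10.0  # degrees

def head_to_commands(pitch, yaw, roll):
    commands = []
    if pitch < -PITCH_T:
        commands.append("forward")      # looking down (assumed negative pitch)
    elif pitch > PITCH_T:
        commands.append("backward")     # looking up
    if yaw < -YAW_T:
        commands.append("rotate_left")
    elif yaw > YAW_T:
        commands.append("rotate_right")
    if roll < -ROLL_T:
        commands.append("strafe_left")
    elif roll > ROLL_T:
        commands.append("strafe_right")
    return commands or ["hover"]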
This framework is useful for:
- Research: Rapid prototyping of BCI algorithms
- Education: Teaching ML, signal processing, and robotics
- Accessibility: Alternative control methods for users with motor impairments
- Autonomous Systems: Intent detection, attention monitoring
- Portfolio: Demonstrating ML/robotics/neuroscience skills
- Hackathons: Quick BCI demos and prototypes
All settings are in YAML files under config/:
- drone_config.yaml - Simulator physics, display settings
- hand_config.yaml - Hand gesture thresholds, camera settings
- gesture_config.yaml - Head gesture thresholds
- eeg_config.yaml - Signal processing, model hyperparameters
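A minimal sketch of reading one of these files (assuming PyYAML; the project's own loading presumably lives in src/neuralflight/utils/):

import yaml

with open("config/hand_config.yaml") as f:
    cfg = yaml.safe_load(f)

threshold = cfg["gestures"]["position"]["threshold"]  # keys match the example below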
Example: Adjust hand gesture sensitivity
# config/hand_config.yaml
gestures:
  position:
    threshold: 0.10  # Lower = more sensitive (was 0.15)
Example: Use more subjects for training
# config/eeg_config.yaml
dataset:
  train_subjects: [1, 2, 3, 4, 6, 7, 8, 9, 10, ..., 40]  # More subjects
  val_subjects: [41, 42, 43, 44, 45]
Contributions welcome! See CONTRIBUTING.md for guidelines.
This project uses:
- Ruff for linting and formatting
- Pre-commit hooks for code quality
- PyTorch for deep learning
- Type hints throughout
# Set up development environment
pip install -e ".[dev]"
pre-commit install
# Format code
ruff format src/
ruff check src/
Please read our Code of Conduct before contributing.
For security concerns, please see SECURITY.md for our vulnerability reporting process.
Safety Note: NeuralFlight is designed for research and education. Always test in controlled environments. Do not use for safety-critical applications without extensive validation.
- Lawhern et al. (2018). "EEGNet: A Compact Convolutional Network for EEG-based Brain-Computer Interfaces"
- Schalk et al. (2004). "BCI2000: A General-Purpose Brain-Computer Interface System"
- Goldberger et al. (2000). "PhysioBank, PhysioToolkit, and PhysioNet"
- He et al. (2016). "Deep Residual Learning for Image Recognition" (Residual connections)
Apache License 2.0 - see LICENSE file for details.
This means you can:
- Use commercially
- Modify and distribute
- Patent use
- Private use
With conditions:
- Include license and copyright notice
- State changes made
- Include NOTICE file if present
Saumya Saksena
Originally created in 2018, modernized in 2025.
If you find this project useful, consider giving it a star! It helps others discover the project.
- Support for real drone hardware (DJI Tello, CrazyFlie)
- Multi-class motor imagery (4+ classes for 3D control)
- Real-time EEG streaming from consumer headsets (Muse, OpenBCI)
- Web dashboard for remote control
- Reinforcement learning for autonomous navigation
- Multi-modal control (voice + BCI + gestures)
- Mobile app for gesture control
- VR/AR visualization
- Multi-drone swarm control
- PhysioNet for the EEG dataset
- Google Mediapipe team for computer vision tools
- PyTorch team for the deep learning framework
- Original EEGNet paper authors
- All contributors and users!
Built with 🧠 and ❤️ by someone who thinks drones and brains are both cool.
