VortexArm is a modular, extensible bionic arm with a focus on real-world deployment, research, and education. It uses a dataflow architecture (dora-rs) for flexible integration of kinematics, actuation, and perception modules. The dataset used for spice detection was annotated in Roboflow before fine-tuning YOLOv8m (`yolov8m.pt`).
```
VORTEXARM/
├── model-finetunning/
│   ├── Cooking_Jollof_Rice-1/
│   ├── runs/
│   │   └── detect/
│   ├── predict-sample/
│   │   ├── best.pt
│   │   ├── image.jpg
│   │   └── spice-detection.py
│   ├── spices_training_yoloy8.ipynb
│   └── yolov8m.pt
├── samples/
│   └── ...
├── src/
│   ├── inverse_kinematics/
│   │   ├── inverse_kinematics/
│   │   │   ├── __init__.py
│   │   │   ├── __main__.py
│   │   │   ├── inverse_kinematics.py
│   │   │   ├── main.py
│   │   │   ├── robot_config.py
│   │   │   └── simulator.py
│   │   ├── inverse_kinematics.egg-info/
│   │   └── tests/
│   ├── ssc32u_controller/
│   │   ├── ssc32u_controller/
│   │   │   ├── __init__.py
│   │   │   ├── __main__.py
│   │   │   ├── main.py
│   │   │   ├── robot_config.py
│   │   │   └── ssc32u.py
│   │   ├── ssc32u_controller.egg-info/
│   │   └── tests/
│   └── vision/
│       ├── vision/
│       │   ├── __init__.py
│       │   ├── __main__.py
│       │   ├── main.py
│       │   ├── object_detection.py
│       │   ├── plot.py
│       │   ├── utils.py
│       │   ├── webcam.py
│       │   └── yolov8mSpice.pt
│       ├── tests/
│       ├── dataflow-graph.html
│       ├── dataflow.yml
│       ├── requirements.txt
│       └── pyproject.toml
├── README.md
└── .gitignore
```
Features:
- Inverse Kinematics: compute joint angles for desired end-effector positions.
- SSC-32U Servo Controller Integration: a Python interface for the Lynxmotion SSC-32U, supporting up to 32 servos, synchronized/group moves, and feedback querying.
- Vision Module: camera and object-detection integration (fine-tuned YOLOv8m); detects spice locations and sends their coordinates.
- Simulation: test and validate algorithms in a simulated environment.
- Dataflow Orchestration: modular nodes defined in `dataflow.yml` for flexible, scalable system integration.
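The SSC-32U's group moves work because the board speaks a simple ASCII serial protocol: each servo gets a `#<channel>P<pulse-width>` segment, an optional `T<ms>` sets the total move time, and a carriage return terminates the command. A minimal sketch of building such a command (the helper name and example values are illustrative, not part of this repo's `ssc32u.py` API):

```python
def ssc32u_group_move(moves, time_ms=None):
    """Build an SSC-32U group-move command string.

    moves: list of (channel, pulse_width_us) pairs. All servos named in
    one command start and finish together (a synchronized/group move).
    """
    cmd = "".join(f"#{ch}P{pw}" for ch, pw in moves)
    if time_ms is not None:
        cmd += f"T{time_ms}"  # total move time in milliseconds
    return cmd + "\r"  # SSC-32U commands are terminated by a carriage return

# Move channels 0 and 1 together over 1.5 s:
# ssc32u_group_move([(0, 1500), (1, 1600)], time_ms=1500)
# -> "#0P1500#1P1600T1500\r"
```

The resulting string would then be written to the board's serial port (e.g. with pyserial).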
Prerequisites (inferred from the commands below):
- Python 3.11
- [uv](https://github.com/astral-sh/uv)
- the dora-rs CLI (`dora`)
Installation:
```sh
git clone https://github.com/attahiruj/VortexArm.git
cd VortexArm
uv venv -p 3.11 --seed
dora build dataflow.yml --uv
```

Running the Project:
```sh
uv pip install -r requirements.txt
dora up
dora start dataflow.yml --attach
```

Notes:
- All main modules (inverse kinematics, controller, vision) are in `src/`.
- Configure your robot in the `robot_config.py` files.
- Connect modules using `dataflow.yml`.
- For vision-based tasks, ensure `yolov8m.pt` is present; it can be replaced with other compatible YOLO models.
- Annotated dataset exported for YOLOv8 from Roboflow (see the model-finetunning folder).
- Fine-tuned model used for detection in the Vision module.
- A test script for detection is available in `model-finetunning` (`spice-detection.py`).
- Validated and tested the model's performance on our custom dataset for detecting food ingredients.
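The vision node ultimately sends a single coordinate per detected spice, which amounts to collapsing a detector bounding box to its center point. A hedged sketch of that arithmetic (the function name is an assumption, not this repo's API; with ultralytics, pixel boxes are available as `results[0].boxes.xyxy`):

```python
def box_center(xyxy):
    """Center (cx, cy) in pixels of an (x1, y1, x2, y2) bounding box."""
    x1, y1, x2, y2 = xyxy
    return ((x1 + x2) / 2, (y1 + y2) / 2)

# A box spanning (100, 40) to (300, 240) has its center at (200.0, 140.0).
```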
- Format code with ruff:
  ```sh
  uv pip install ruff
  uv run ruff check . --fix
  ```
- Lint code with ruff:
  ```sh
  uv run ruff check .
  ```
- Run tests with pytest:
  ```sh
  uv pip install pytest
  uv run pytest .
  ```
The `dataflow.yml` file defines the nodes, their connections, and runtime parameters for the system. Modify this file to add, remove, or reconfigure modules.
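For orientation, a dora-rs dataflow is a YAML list of nodes wired together by named inputs and outputs. A sketch of what a graph like this one might look like (the node ids, paths, and topic names here are illustrative guesses, not the actual contents of this repo's `dataflow.yml`):

```yaml
nodes:
  - id: vision
    path: src/vision                 # illustrative path
    outputs:
      - spice_coordinates            # published when a spice is detected

  - id: inverse_kinematics
    path: src/inverse_kinematics
    inputs:
      target: vision/spice_coordinates
    outputs:
      - joint_angles

  - id: ssc32u_controller
    path: src/ssc32u_controller
    inputs:
      joints: inverse_kinematics/joint_angles
```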
- Inverse Kinematics: use the `inverse_kinematics` module to compute joint angles for a target position.
- Servo Control: send joint commands to the SSC-32U controller for real-world actuation.
- Vision: run object detection using the YOLOv8 model and trigger arm movement based on detections.
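The `inverse_kinematics` module has its own API and robot configuration; as a standalone illustration of the underlying math, here is a textbook two-link planar IK solve (the link lengths and the elbow-down convention are assumptions for the example, not this arm's actual geometry):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Joint angles (radians) placing a 2-link planar arm's tip at (x, y).

    Uses the law of cosines; raises ValueError if (x, y) is out of reach.
    Returns the elbow-down solution (theta_shoulder, theta_elbow).
    """
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # cos(elbow angle)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)  # elbow
    theta1 = math.atan2(y, x) - math.atan2(
        l2 * math.sin(theta2), l1 + l2 * math.cos(theta2)
    )  # shoulder
    return theta1, theta2
```

The real module would also account for the arm's base rotation and wrist joints; this sketch only covers the planar shoulder/elbow pair.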

