VortexArm

VortexArm is a modular, extensible bionic arm focused on real-world deployment, research, and education. It uses a dataflow architecture (dora-rs) for flexible integration of kinematics, actuation, and perception modules. The dataset used for spice detection was annotated in Roboflow before fine-tuning YOLOv8m (yolov8m.pt).

Table of Contents

  • Project Structure
  • Features
  • Getting Started
  • Usage
  • Model Fine-tuning
  • Contribution Guide
  • YAML Specification
  • Examples

Project Structure

VORTEXARM/
├── model-finetunning/
│   ├── Cooking_Jollof_Rice-1/
│   ├── runs/
│   │   └── detect/
│   │       └── predict-sample/
│   │           ├── best.pt
│   │           ├── image.jpg
│   │           └── spice-detection.py
│   ├── spices_training_yoloy8.ipynb
│   └── yolov8m.pt
├── samples/
│   └── ...
├── src/
│   ├── inverse_kinematics/
│   │   ├── inverse_kinematics/
│   │   │   ├── __init__.py
│   │   │   ├── __main__.py
│   │   │   ├── inverse_kinematics.py
│   │   │   ├── main.py
│   │   │   ├── robot_config.py
│   │   │   └── simulator.py
│   │   ├── inverse_kinematics.egg-info/
│   │   └── tests/
│   ├── ssc32u_controller/
│   │   ├── ssc32u_controller/
│   │   │   ├── __init__.py
│   │   │   ├── __main__.py
│   │   │   ├── main.py
│   │   │   ├── robot_config.py
│   │   │   └── ssc32u.py
│   │   ├── ssc32u_controller.egg-info/
│   │   └── tests/
│   ├── vision/
│   │   ├── vision/
│   │   │   ├── __init__.py
│   │   │   ├── __main__.py
│   │   │   ├── main.py
│   │   │   ├── object_detection.py
│   │   │   ├── plot.py
│   │   │   ├── utils.py
│   │   │   ├── webcam.py
│   │   │   └── yolov8mSpice.pt
│   │   └── tests/
│   ├── dataflow-graph.html
│   ├── dataflow.yml
│   ├── requirements.txt
│   └── pyproject.toml
├── README.md
└── .gitignore

Features

  • Inverse Kinematics:
    Compute joint angles for desired end-effector positions.

  • SSC-32U Servo Controller Integration:
    Python interface for the Lynxmotion SSC-32U, supporting up to 32 servos, synchronized/group moves, and feedback querying.

  • Vision Module:
    Camera and object-detection integration (fine-tuned YOLOv8m); detects spice locations and sends their coordinates.

  • Simulation:
    Test and validate algorithms in a simulated environment.

  • Dataflow Orchestration:
    Modular nodes defined in dataflow.yml for flexible, scalable system integration.
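
To make the inverse-kinematics feature concrete, here is a minimal planar two-link solver. This is an illustrative sketch only, assuming a simple elbow arm; the project's actual solver in src/inverse_kinematics handles the real arm geometry.

```python
import math

def ik_2link(x, y, l1, l2):
    """Planar 2-link inverse kinematics (elbow-down solution).

    Illustrative only -- computes shoulder/elbow angles (radians)
    that place the end effector at (x, y) for link lengths l1, l2.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(
        l2 * math.sin(theta2), l1 + l2 * math.cos(theta2)
    )
    return theta1, theta2

def fk_2link(theta1, theta2, l1, l2):
    """Forward kinematics, used here to sanity-check the IK result."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Feeding the IK result back through forward kinematics is a quick way to validate a solution before commanding real servos.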


Getting Started

Prerequisites:

  • Python 3.11 or newer
  • uv (fast Python package management)
  • dora (dataflow orchestration)

Installation:

git clone https://github.com/attahiruj/VortexArm.git
cd VortexArm
uv venv -p 3.11 --seed
dora build dataflow.yml --uv

Running the Project:

uv pip install -r requirements.txt
dora up
dora start dataflow.yml --attach

Usage

  • All main modules (inverse kinematics, controller, vision) are in src/.
  • Configure your robot in robot_config.py files.
  • Connect modules using dataflow.yml.
  • For vision-based tasks, ensure yolov8m.pt is present; it can be replaced with any compatible YOLO model.

Model Fine-tuning

The YOLO model was fine-tuned on a custom dataset of food ingredients needed to cook Jollof rice.

  • The annotated dataset was exported from Roboflow in YOLOv8 format (see the model-finetunning folder).
  • The fine-tuned model is used for detection in the vision module.
  • A test script for detection is available in the predict-sample folder (spice-detection.py).
  • The model was validated and tested on our custom dataset for detecting food ingredients; validation and cooking sample predictions are included in the repository.

Contribution Guide

  • Format and auto-fix code with ruff:
    uv pip install ruff
    uv run ruff check . --fix
  • Lint code with ruff:
    uv run ruff check .
  • Run tests with pytest:
    uv pip install pytest
    uv run pytest .

YAML Specification

The dataflow.yml file defines the nodes, their connections, and runtime parameters for the system. Modify this file to add, remove, or reconfigure modules.
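
For orientation, a dora-style dataflow fragment might look like the sketch below. The node ids, paths, and topic names here are illustrative assumptions, not the repository's actual dataflow.yml; consult that file for the real configuration.

```yaml
# Illustrative fragment only -- ids, paths, and topics are assumptions.
nodes:
  - id: webcam
    path: vision/webcam.py
    inputs:
      tick: dora/timer/millis/100   # poll the camera every 100 ms
    outputs:
      - image

  - id: object_detection
    path: vision/object_detection.py
    inputs:
      image: webcam/image           # subscribe to the webcam frames
    outputs:
      - spice_position

  - id: inverse_kinematics
    path: inverse_kinematics/main.py
    inputs:
      target: object_detection/spice_position
    outputs:
      - joint_angles

  - id: ssc32u_controller
    path: ssc32u_controller/main.py
    inputs:
      joints: inverse_kinematics/joint_angles
```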


Examples

  • Inverse Kinematics:
    Use the inverse_kinematics module to compute joint angles for a target position.
  • Servo Control:
    Send joint commands to the SSC-32U controller for real-world actuation.
  • Vision:
    Run object detection using the YOLOv8 model and trigger arm movement based on detections.
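
On the servo-control example: the Lynxmotion SSC-32U speaks a simple ASCII serial protocol (`#<channel>P<pulse-width>`, optional `T<time>` for synchronized group moves, terminated by a carriage return). A minimal sketch of building such a command follows; the function name is ours, while the project's ssc32u_controller package wraps the same protocol.

```python
def ssc32u_group_move(positions, time_ms=None):
    """Build an SSC-32U group-move command string.

    positions: mapping of servo channel -> pulse width in microseconds
               (the SSC-32U accepts roughly 500-2500 us).
    time_ms:   optional duration so all servos arrive together.

    Illustrative helper only; write the returned string to the
    controller's serial port to execute the move.
    """
    cmd = "".join(f"#{ch}P{pw}" for ch, pw in sorted(positions.items()))
    if time_ms is not None:
        cmd += f"T{time_ms}"
    return cmd + "\r"  # SSC-32U commands end with a carriage return
```

Writing the resulting string to the board's serial port (for example with pyserial's Serial.write) moves every listed servo so that all arrive together after time_ms milliseconds.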
