Aura is a forgiving, human-friendly declarative language designed specifically for building AI/ML pipelines and fast, AI-integrated web applications. Drawing inspiration from the elegance and developer happiness of Ruby (think Rails' magic but for AI and web), Aura aims to reduce boilerplate, eliminate common errors, and make prototyping joyful.
Whether you're a data scientist iterating on neural networks, a web developer embedding AI features, or a hobbyist exploring ML, Aura provides a natural, expressive syntax that feels like writing pseudocode—while transpiling to efficient Ruby code powered by Torch-rb and Sinatra for speed and scalability.
Launched in late 2025, Aura is in its early stages (v0.1) but already supports basic ML workflows and web serving. Join us in shaping a language that prioritizes programmer happiness over strictness.
In a world dominated by verbose Python scripts for ML (PyTorch/TensorFlow) and fragmented web stacks (Next.js + APIs), Aura bridges the gap:
- Human-Readable Syntax: Ruby-like blocks and natural keywords make code read like English.
- Forgiving by Design: Smart defaults, auto-inference (e.g., tensor shapes, devices), and friendly error suggestions (e.g., "OOM? Halving batch size for you! 😊").
- AI/Web Integration: Seamless from data loading to model training to deploying AI endpoints, all in one file.
- Fast Performance: Transpiles to Ruby with Torch-rb for ML and Puma/Sinatra for concurrent web serving. No slow interpreters.
- Zero Boilerplate: No `__init__.py`, no manual imports; just declare and run.
- Interoperable: Outputs standard Ruby code; easy to extend or integrate with existing ecosystems.
Aura fills the niche for "conversational coding" in AI, where iteration speed and error resilience matter most. If you've ever debugged shape mismatches in PyTorch or wrestled with web deployment, Aura is for you.
- Declarative ML Pipelines: Define datasets, models, training, and evaluation in concise blocks.
- Built-in Forgiveness:
  - Auto-device selection (GPU if available).
  - Runtime recovery (e.g., reduce batch size on memory errors).
  - Plain-English errors with fix suggestions.
- Web Primitives: Routes with AI hooks (e.g., predict on user input) transpiled to fast Sinatra apps.
- Extensible: Add custom layers or integrations via Ruby extensions.
- Native LLMs: Built-in AST mapping for OpenAI and local Ollama hooks.
- MVP Scope: Supports simple neural nets (dense, dropout), mock/Hugging Face data, JSON/HTML responses.
- Future-Proof: Roadmap includes deployment strategies and more nuanced AI primitives.
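The runtime-recovery behavior above can be sketched in plain Ruby. This is a minimal illustration, not Aura's actual implementation; `OutOfMemoryError` and `train_with_recovery` are hypothetical stand-ins for what the transpiled Torch-rb training loop would raise and do:

```ruby
# Sketch of Aura's forgiving retry loop: on an out-of-memory error,
# halve the batch size and retry instead of crashing.
# OutOfMemoryError is a hypothetical stand-in for a Torch-rb OOM.
class OutOfMemoryError < StandardError; end

def train_with_recovery(batch_size:, min_batch: 1)
  yield batch_size
rescue OutOfMemoryError
  raise if batch_size <= min_batch
  batch_size /= 2
  puts "OOM? Halving batch size to #{batch_size} for you! 😊"
  retry
end

# Simulated training step that only fits in memory at batch size <= 8:
sizes_tried = []
train_with_recovery(batch_size: 32) do |bs|
  sizes_tried << bs
  raise OutOfMemoryError if bs > 8
end
# sizes_tried is now [32, 16, 8]
```

The `retry` keyword re-runs the method body with the halved local, so recovery costs one line instead of an explicit loop.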
Aura is distributed as a Ruby gem. Requires Ruby 3.3+.
- Install Ruby (if not already): ruby-lang.org.
- Clone the repo:

  ```sh
  git clone https://github.com/johnvteixido/aura-lang.git
  cd aura-lang
  ```

- Install dependencies:

  ```sh
  gem install bundler
  bundle install
  ```

  This pulls in Parslet (for parsing), Torch-rb (for ML), Sinatra/Puma (for web), and dev tools like Pry.

For global CLI access:

```sh
gem install aura-lang
```
- Create a `.aura` file (e.g., `hello.aura`):

  ```
  model greeter neural_network do
    input text
    output greeting "Hello from Aura! 🌟"
  end

  route "/hello" get do
    output prediction from greeter.predict(input)
    format :json
  end

  run web on port: 3000
  ```

- Run it:

  ```sh
  bin/aura run hello.aura
  ```

  This transpiles to Ruby, "trains" (mock), and starts a server.

- Test:

  ```sh
  curl http://localhost:3000/hello
  # → {"prediction": "Hello from Aura! 🌟"}
  ```
For errors, Aura suggests fixes automatically—e.g., "Missing 'end'? Added it for you! 😊"
Aura files (.aura) are declarative scripts. The CLI transpiles and executes them:
- `aura init <project_name>`: Scaffolds a complete Aura project structure.
- `aura run <file.aura>`: Parse, transpile, run (trains models, starts the web server).
- Future: `aura deploy <file.aura>` for cloud deploys (Vercel/Fly.io).

Key Commands:

- `aura --help`: Usage info.
- `aura repl`: Interactive Readline-based live evaluation shell.
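Under the hood, `aura run` is a parse → transpile → eval pipeline. Here is a toy, regex-only sketch of the transpile step; the real grammar is Parslet-based, this version handles only a `run web on port: N` line, and the generated `AuraApp` Sinatra call is purely illustrative:

```ruby
# Toy transpiler sketch: map one Aura declaration to Ruby source.
# The real transpiler parses whole .aura files with Parslet; this
# regex handles only `run web on port: N` and is illustrative.
def transpile(aura_source)
  ruby_lines = []
  aura_source.each_line do |line|
    if (m = line.match(/\Arun web on port:\s*(\d+)\s*\z/))
      ruby_lines << "require 'sinatra/base'"
      ruby_lines << "AuraApp.run! port: #{m[1]}"  # AuraApp assumed generated earlier
    end
  end
  ruby_lines.join("\n")
end

generated = transpile("run web on port: 3000\n")
```

Because the output is plain Ruby source, you can inspect or hand-edit it before running, which is what makes Aura interoperable with the wider Ruby ecosystem.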
Train a simple classifier and serve predictions:
```
dataset "mnist" from huggingface "mnist"

model classifier neural_network do
  input shape(28, 28, 1) flatten
  layer dense units: 128, activation: :relu
  layer dropout rate: 0.2
  output units: 10, activation: :softmax
end

train classifier on "mnist" do
  epochs 5
  batch_size 32
  optimizer :adam, learning_rate: 0.001
  loss :cross_entropy
  metrics :accuracy
end

evaluate classifier on "mnist/test"

route "/predict" post do
  output prediction from classifier.predict(image)
  format :json
end

run web on port: 3000
```
Run: `bin/aura run mnist_classifier.aura`
- Trains on mock data (real HF coming soon).
- POST to `/predict`:

  ```sh
  curl -X POST http://localhost:3000/predict -d '{"image": [[...28x28 array...]]}'
  ```
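The transpiled `/predict` handler would, roughly, run the model on the posted image and serialize the result as JSON. A plain-Ruby sketch with a stubbed classifier; the real handler would call a Torch-rb model, and `handle_predict` plus the stub are illustrative names, not Aura's actual API:

```ruby
require 'json'

# Stub classifier: returns a fake 10-class probability vector.
# Transpiled Aura code would invoke a Torch-rb model here instead.
classifier = Object.new
def classifier.predict(_image)
  probs = Array.new(10) { 0.05 }
  probs[7] = 0.55  # pretend the model is confident about class 7
  probs
end

# What the transpiled `/predict` handler body might look like:
def handle_predict(classifier, params)
  image = params["image"]
  probs = classifier.predict(image)
  { "prediction" => probs.index(probs.max) }.to_json
end

body = handle_predict(classifier, { "image" => [[0] * 28] * 28 })
# body is the JSON string {"prediction":7}
```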
Serve AI-driven HTML from a model (here, a `recommender` model assumed to be defined earlier in the file):

```
route "/recommend" get do
  output prediction from recommender.predict(user_id)
  format :html
end

run web on port: 8080
```
Outputs dynamic HTML with AI-driven content.
Hook into a local Ollama model directly over the network:

```
model assistant from ollama "llama3"

route "/api/chat" post do
  output prediction from assistant.predict(message)
  format :json
end

run web on port: 3000
```
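A `model ... from ollama` binding presumably wraps Ollama's local HTTP API (`/api/generate` on port 11434 is Ollama's standard endpoint). The sketch below is illustrative: the helper names are assumptions, and `ollama_generate` only works against a running Ollama server, so the example sticks to building the request:

```ruby
require 'json'
require 'net/http'

# Sketch of the HTTP call that `model assistant from ollama "llama3"`
# plausibly wraps. Helper names here are illustrative, not Aura's API.
OLLAMA_URL = "http://localhost:11434/api/generate"

# Build the JSON body Ollama's generate endpoint expects.
def build_ollama_request(model, prompt)
  { "model" => model, "prompt" => prompt, "stream" => false }
end

# Actually send the request (requires a running Ollama server).
def ollama_generate(model, prompt)
  res = Net::HTTP.post(URI(OLLAMA_URL),
                       build_ollama_request(model, prompt).to_json,
                       "Content-Type" => "application/json")
  JSON.parse(res.body)["response"]
end

# Building (not sending) a request is side-effect free:
req = build_ollama_request("llama3", "Say hi")
```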
- Blocks: Use `do ... end` for models, trains, and routes.
- Keywords: Natural, like `layer dense units: 128`.
- Defaults: Many inferred (e.g., `activation: :relu`).
- Formats: JSON/HTML for outputs.

Full grammar in `lib/aura.rb` (Parslet-based).
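The real grammar uses Parslet, but the parsing idea can be illustrated with a stdlib-only sketch that handles a single `layer` declaration. `parse_layer` is an illustrative name, not Aura's actual API:

```ruby
require 'strscan'

# Stdlib-only sketch of parsing one Aura layer declaration, e.g.
#   layer dense units: 128, activation: :relu
# The real grammar lives in lib/aura.rb and uses Parslet.
def parse_layer(line)
  s = StringScanner.new(line.strip)
  raise "expected 'layer'" unless s.scan(/layer\s+/)
  type = s.scan(/\w+/) or raise "expected layer type"
  opts = {}
  s.skip(/\s+/)
  until s.eos?
    key = s.scan(/\w+/) or break
    s.skip(/:\s*/)
    val = s.scan(/:\w+|[\d.]+/) or raise "expected value for #{key}"
    opts[key.to_sym] =
      if val.start_with?(":") then val.delete_prefix(":").to_sym
      elsif val.include?(".") then val.to_f
      else val.to_i
      end
    s.skip(/,\s*/)
  end
  { type: type.to_sym, options: opts }
end

ast = parse_layer("layer dense units: 128, activation: :relu")
# ast => { type: :dense, options: { units: 128, activation: :relu } }
```

A Parslet grammar expresses the same shape declaratively as composable rules, which is what makes the syntax easy to extend.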
Inspired by David Heinemeier Hansson (DHH) and Ruby on Rails:
- Beautiful over Clever: Code should read like a story.
- Forgiving over Strict: Help users recover from mistakes.
- Human Happiness First: Optimize for joy, not performance alone.
- Zero Boilerplate: No ceremony; just create.

We believe languages should empower, not frustrate. Aura is our attempt at "Ruby for AI in 2025."
- v0.4: Extended visual/conv2d mapping for image models targeting Hugging Face parameters.
- v1.0: Native compilation (via MRuby?), full web frameworks (Hanami/Rails interop), community extensions.
- Long-term: Visual editor, enterprise features (federated learning).
Track issues/PRs on GitHub.
We ❤️ contributions!
- Fork the repo.
- Create a branch: `git checkout -b feature/awesome-thing`.
- Commit changes: `git commit -m "Add awesome thing"`.
- Push: `git push origin feature/awesome-thing`.
- Open a PR.
See CONTRIBUTING.md (coming soon) for guidelines. Focus on parser/transpiler extensions or examples.
MIT License. See LICENSE for details.
- Ruby community for endless inspiration.
- Torch-rb for ML power.
- You, for checking out Aura! 🌟
Questions? Open an issue or find me on LinkedIn @johnvteixido. Let's build the future of AI coding together!