Mobile AI Orchestrator


What This Is

mobile-ai-orchestrator is a platform-agnostic Rust library for intelligent AI routing on constrained devices. It decides where and how to run AI inference: locally, remotely, or as a hybrid of the two.

Important
This is a library/framework, NOT an application. For a complete Android application using this library, see neurophone.

Core Purpose

┌─────────────────────────────────────────────────────────────────┐
│                   mobile-ai-orchestrator                        │
│                   (THIS LIBRARY)                                │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│   Query ──► Expert ──► Router ──► Context ──► Inference         │
│            System      Decision   Manager     Dispatch          │
│            (Safety)    (Where?)   (History)   (Execute)         │
│                                                                 │
│   Decides: "Should this run locally or in the cloud?"           │
│   Handles: Safety rules, privacy filtering, context tracking    │
│   Provides: Routing decisions, NOT the actual inference         │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
                              │
                              ▼
              ┌───────────────┴───────────────┐
              │                               │
      ┌───────▼───────┐             ┌─────────▼─────────┐
      │ Local SLM     │             │ Cloud API         │
      │ (llama.cpp)   │             │ (Claude, etc.)    │
      │ Your choice   │             │ Your choice       │
      └───────────────┘             └───────────────────┘

Key Differentiators

Feature         This Library                                        Typical AI SDKs
Platform        Platform-agnostic (Linux, Android, iOS, embedded)   Platform-specific
Decision Layer  Built-in routing intelligence                       Direct API calls only
Safety          Zero unsafe blocks, rule-based safety               Varies
Offline         Offline-first by design                             Network required
Privacy         Automatic sensitive data detection                  Manual handling

Use Cases

  • Mobile apps - Let the library decide local vs cloud based on query complexity

  • Edge devices - Intelligent fallback when connectivity is limited

  • Privacy-sensitive - Automatic blocking of sensitive data from cloud

  • Resource-constrained - Battery and memory-aware routing

Installation

Add to your Cargo.toml:

[dependencies]
mobile-ai-orchestrator = "0.1"

# Or, to enable the optional network features:
# mobile-ai-orchestrator = { version = "0.1", features = ["network"] }

Quick Start

use mobile_ai_orchestrator::{Orchestrator, OrchestratorConfig, Route};

fn handle(query: &str) -> Result<(), Box<dyn std::error::Error>> {
    // Create an orchestrator with the default configuration
    let config = OrchestratorConfig::default();
    let orchestrator = Orchestrator::new(config);

    // Process a query - the library decides the routing
    let decision = orchestrator.process(query)?;

    match decision.route {
        Route::Local => {
            // Run with your local SLM (llama.cpp, etc.), e.g.:
            // let response = your_local_llm.generate(&decision.context);
        }
        Route::Remote => {
            // Send to your cloud API (Claude, GPT, etc.), e.g.:
            // let response = your_cloud_api.query(&decision.context);
        }
        Route::Blocked => {
            // Query blocked by safety rules
            println!("Blocked: {}", decision.reason);
        }
    }
    Ok(())
}
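
Note that the orchestrator returns a routing decision rather than performing inference itself: you own the model clients, which is what keeps the library platform-agnostic and free of heavyweight runtime dependencies.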

Architecture

Components

Component        Purpose                                             File
Expert System    Rule-based safety layer (blocks dangerous queries)  src/expert.rs
Router           Heuristic + neural routing decisions                src/router.rs
Context Manager  Conversation history and project state              src/context.rs
Orchestrator     Main coordinator, pipeline execution                src/orchestrator.rs
Reservoir        Echo State Network for temporal compression         src/reservoir.rs
MLP              Multi-layer perceptron for learned routing          src/mlp.rs
SNN              Spiking neural network for wake detection           src/snn.rs
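
In rough terms, the pipeline runs the expert system first, then the router, then the context manager. The sketch below shows that composition; all type and method names here are illustrative, not the library's internal API:

// Illustrative pipeline sketch - names are hypothetical; the stage
// order follows the Core Purpose diagram above.
fn route_query(
    expert: &ExpertSystem,
    router: &RouterStage,
    context: &mut ContextManager,
    query: &str,
) -> Decision {
    // 1. Expert system: rule-based safety check runs first
    if let Some(reason) = expert.check(query) {
        return Decision::blocked(reason);
    }
    // 2. Router: heuristic (optionally neural) local-vs-remote decision
    let route = router.decide(query, context);
    // 3. Context manager: record the query in conversation history
    context.push(query);
    // 4. Hand the decision back; the caller performs the inference
    Decision::new(route, context.snapshot())
}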

Neural Components (Phase 2+)

The library includes three neural computing approaches:

  1. Reservoir Computing (ESN) - Compress conversation history into fixed-size state

  2. Multi-Layer Perceptron - Learn routing decisions from data

  3. Spiking Neural Network - Ultra-low-power wake-word detection

These are optional enhancements over the heuristic baseline.
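
To make the reservoir idea concrete: an Echo State Network folds each new input into a fixed-size state via x(t+1) = tanh(W_in·u(t) + W·x(t)). A minimal sketch of that standard update (not the library's actual implementation):

// Standard ESN state update - a sketch, not the library's internals.
// `w` is the recurrent weight matrix, `w_in` the input weight matrix.
fn esn_step(state: &[f32], input: &[f32], w: &[Vec<f32>], w_in: &[Vec<f32>]) -> Vec<f32> {
    (0..state.len())
        .map(|i| {
            let recurrent: f32 = w[i].iter().zip(state).map(|(a, b)| a * b).sum();
            let fed_in: f32 = w_in[i].iter().zip(input).map(|(a, b)| a * b).sum();
            (recurrent + fed_in).tanh()
        })
        .collect()
}

However long the conversation grows, the state stays the same size, which is what makes it useful as a compact routing feature.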

Features

Safety-First

  • Zero unsafe blocks in the entire codebase

  • Type-safe by design (Rust ownership model)

  • Memory-safe (compile-time guarantees)

  • Formal rule-based safety layer

Offline-First

  • Core functionality works without internet

  • Network features live behind a Cargo feature flag (see the gating sketch below)

  • Graceful degradation

  • Local decision-making always available
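
The gating follows the standard Cargo pattern; a minimal sketch (the module name is illustrative):

// Compiled only when the "network" feature is enabled; the core
// library builds and routes without it.
#[cfg(feature = "network")]
mod remote_dispatch {
    // network-dependent code lives here
}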

Privacy-Preserving

  • Automatic detection of sensitive data patterns (sketched below)

  • Configurable blocking rules

  • On-device processing by default

  • Explainable routing decisions
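
Pattern detection can be pictured as regex screening over the query. A minimal sketch, assuming the regex crate (the library's own matcher may differ):

use regex::Regex;

// Hypothetical helper: true if any configured pattern matches.
fn is_sensitive(query: &str, patterns: &[Regex]) -> bool {
    patterns.iter().any(|p| p.is_match(query))
}

// Example, using the same pattern as the Configuration section:
// let patterns = vec![Regex::new(r"password|secret|api.?key").unwrap()];
// assert!(is_sensitive("what is my api key?", &patterns));

A query that trips a pattern resolves to Route::Blocked instead of leaving the device.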

Configuration

let config = OrchestratorConfig {
    // Safety thresholds
    safety_threshold: 0.8,

    // Routing preferences
    prefer_local: true,
    max_local_complexity: 0.6,

    // Context settings
    max_history_items: 100,
    project_aware: true,

    // Privacy rules
    block_patterns: vec![
        r"password|secret|api.?key".to_string(),
    ],

    // Any remaining fields fall back to their defaults
    ..Default::default()
};

CLI Tool (Optional)

The library includes a CLI for testing:

# Build CLI
cargo build --release

# Interactive mode
./target/release/mobile-ai --interactive

# Single query
./target/release/mobile-ai "Explain ownership in Rust"

# With project context
./target/release/mobile-ai --project myproject "What's the architecture?"

Integration Examples

With neurophone (Android)

neurophone uses this library for AI routing:

// In neurophone-core
use mobile_ai_orchestrator::{Orchestrator, Route};

let orchestrator = Orchestrator::new(config);
let decision = orchestrator.process(&user_query)?;

match decision.route {
    Route::Local => {
        let response = llama_client.generate(decision.context);
        // ... return the on-device response
    }
    Route::Remote => {
        let response = claude_client.query(decision.context);
        // ... return the cloud response
    }
    Route::Blocked => {
        // surface the block reason to the user
    }
}

With llama.cpp

use mobile_ai_orchestrator::{Orchestrator, Route};

let orchestrator = Orchestrator::new(config);
let decision = orchestrator.process(query)?;

if decision.route == Route::Local {
    // LlamaModel stands in for your llama.cpp binding of choice
    let llama = LlamaModel::load("model.gguf")?;
    let response = llama.generate(&decision.prepared_prompt);
}

Benchmarks

Run performance benchmarks:

cargo bench

Operation                   Time     Memory
Route decision (heuristic)  ~50 μs   ~1 KB
Route decision (MLP)        ~200 μs  ~10 KB
Context update              ~10 μs   ~100 B
Reservoir step              ~500 μs  ~50 KB
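
A minimal Criterion harness for the heuristic routing benchmark might look like this (assuming criterion as a dev-dependency; the bench name is illustrative and may not match the repo's own benches):

use criterion::{criterion_group, criterion_main, Criterion};
use mobile_ai_orchestrator::{Orchestrator, OrchestratorConfig};

// Illustrative harness; the repository's benches may be organised differently.
fn bench_route_decision(c: &mut Criterion) {
    let orchestrator = Orchestrator::new(OrchestratorConfig::default());
    c.bench_function("route_decision_heuristic", |b| {
        b.iter(|| orchestrator.process("How do I iterate a HashMap?"))
    });
}

criterion_group!(benches, bench_route_decision);
criterion_main!(benches);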

RSR Compliance

This project achieves Bronze-level RSR (Rhodium Standard Repository) compliance:

  • Type safety (Rust type system)

  • Memory safety (ownership model, zero unsafe)

  • Offline-first (network is optional)

  • Comprehensive documentation

  • Test coverage (>90%)

  • Build system (justfile, flake.nix)

  • CI/CD automation

  • Security policy

Related Projects

Project     Relationship   Description
neurophone  Consumer       Android app that uses this library for on-device AI
echomesh    Complementary  Conversation context preservation across sessions
oblibeny    Inspiration    Safety-critical programming language concepts

Development

# Run tests
cargo test

# Run benchmarks
cargo bench

# Generate docs
cargo doc --open

# Run examples
cargo run --example basic_usage
cargo run --example reservoir_demo
cargo run --example mlp_router

Contributing

Contributions welcome! This project operates under TPCF Perimeter 3 (Community Sandbox).

See CONTRIBUTING.md for guidelines.

License

Dual-licensed under:

  • Palimpsest-MPL-1.0 License

  • Palimpsest License v0.8

Citation

@software{mobile_ai_orchestrator_2025,
  author = {Jewell, Jonathan D.A.},
  title = {Mobile AI Orchestrator: Platform-Agnostic AI Routing Library},
  year = {2025},
  url = {https://github.com/hyperpolymath/heterogenous-mobile-computing},
  note = {RSR Bronze-compliant}
}

Contact


Platform-Agnostic • Offline-First • Zero Unsafe • RSR Bronze Compliant
