
🌈 RangisNet

Multi-Sensory Economic Cognition for Avalanche Blockchain

Transform blockchain data into sound, haptics, and color. Built for everyone.


What is RangisNet?

RangisNet converts Avalanche blockchain activity into real-time, multi-sensory intelligence using:

  • 🎧 Sound – Hear volatility, risk, and network health
  • 📳 Haptics – Feel congestion and transaction flow
  • 🌈 Color – See economic signals as harmonic gradients
  • 🤖 Privacy-First AI – Zero-retention AI assistance (Venice AI)

Instead of charts and graphs, experience the blockchain through your senses.

Built on the AEAS v1.1 Standard (ADA Economic Accessibility Standard), designed for the 2.7 billion people excluded by traditional visual-only interfaces.
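To make the idea concrete, here is a minimal sketch of what "converting activity into sound, haptics, and color" could look like. The function name, the specific ranges, and the mapping choices are illustrative assumptions, not the project's actual API:

```typescript
// Hypothetical sketch: map one normalized market metric (0..1) onto the three
// sensory channels RangisNet describes. All names and ranges here are
// illustrative assumptions.

interface SensoryCue {
  pitchHz: number;   // auditory channel
  hapticHz: number;  // vibrotactile channel
  hue: number;       // color channel, degrees on the HSL wheel
}

function toSensoryCue(metric: number): SensoryCue {
  const m = Math.min(1, Math.max(0, metric)); // clamp to [0, 1]
  return {
    pitchHz: 432 * Math.pow(2, m),  // one octave above the 432 Hz baseline at m = 1
    hapticHz: 10 + m * (300 - 10),  // span a 10-300 Hz vibrotactile band
    hue: 120 - m * 120,             // green (calm) down to red (stressed)
  };
}

console.log(toSensoryCue(0.5));
```

The point of a single shared input is that all three channels stay synchronized: one metric, three simultaneous renderings.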


⚡ Quick Start

Try the Demo

# Clone the repository
git clone https://github.com/Luckyspot0gold/RangisNet.git
cd RangisNet

# Install dependencies
npm install
cd Web && npm install

# Run development server
npm run dev

Visit http://localhost:3000 to experience multi-sensory economic cognition.

What You'll Experience

  • M3 McCrea Market Metrics™ – Real-time harmonic analysis
  • 7-Bell Harmonic System – 432 Hz baseline frequency mapping
  • Sonic Event Codes™ – Machine-readable sensory cues
  • Privacy-first architecture – Your data never leaves your device
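The README later notes the 7-Bell Harmonic System uses Pythagorean tuning over the 432 Hz baseline. One plausible realization, sketched below, assigns the seven bells the Pythagorean diatonic ratios; the exact ratios RangisNet uses are not specified here, so treat these as an illustration:

```typescript
// Hedged sketch: seven bell frequencies built from the 432 Hz baseline using
// the Pythagorean diatonic ratios (an assumption about which ratios are used).
const PYTHAGOREAN_RATIOS = [1, 9 / 8, 81 / 64, 4 / 3, 3 / 2, 27 / 16, 243 / 128];

function bellFrequencies(baseHz = 432): number[] {
  return PYTHAGOREAN_RATIOS.map((r) => baseHz * r);
}

console.log(bellFrequencies().map((f) => f.toFixed(1)));
// The first bell is the 432 Hz baseline; the fifth is a pure 3:2 above it (648 Hz).
```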

🔱 The Trinity Architecture

Soul: Philosophy – The "why" and ethical foundation
Body: The Standard – AEAS v1.1 technical specification
Spirit: Privacy – Zero-retention AI partnership with Venice AI

This three-part structure ensures that technology serves human agency, not the other way around.


🔔 AEAS v1.x — Call for Public Comment

AEAS is published as a draft standard for public review. We invite researchers, accessibility professionals, engineers, regulators, and the public to provide feedback on technical clarity, accessibility alignment, and practical implementation.

Submit comments via the channels listed in CALL_FOR_COMMENT.md.



📚 Documentation

Essential Reading

Deep Dives

For Builders


🌟 Core Features

Multi-Sensory Cognition

  • Auditory Domain: 7-Bell Harmonic System (432 Hz baseline)
  • Haptic Domain: Vibrotactile feedback (10-300 Hz)
  • Visual Domain: 3D spinor geometry with harmonic color mapping
  • Olfactory Domain: (Reserved for future, ethical-only use)
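For the haptic domain, the stack section names the browser Gamepad API. A hedged sketch of driving its standard "dual-rumble" effect from a congestion score follows; the mapping itself (durations, magnitude curves) is an assumption for illustration:

```typescript
// Illustrative sketch (not the project's actual code): translate a congestion
// score in [0, 1] into parameters for the Gamepad haptics "dual-rumble" effect.

interface DualRumbleParams {
  duration: number;        // ms
  strongMagnitude: number; // 0..1, low-frequency motor
  weakMagnitude: number;   // 0..1, high-frequency motor
}

function congestionToRumble(congestion: number): DualRumbleParams {
  const c = Math.min(1, Math.max(0, congestion)); // clamp to [0, 1]
  return {
    duration: 100 + c * 400,     // longer pulses as the mempool fills
    strongMagnitude: c,          // deep rumble tracks congestion directly
    weakMagnitude: Math.sqrt(c), // light buzz rises quickly, then saturates
  };
}

// In a browser, on a connected gamepad, you would then call:
//   gamepad.vibrationActuator?.playEffect("dual-rumble", congestionToRumble(0.7));
console.log(congestionToRumble(0.7));
```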

Avalanche-Native

  • C-Chain monitoring: Gas prices, TPS, mempool depth, latency
  • Subnet analytics: Validator performance, fee markets, network health
  • ICM integration: Cross-chain asset tracking via Interchain Messaging
  • Real-time oracles: Pyth Network price feeds, market data
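As a sketch of the C-Chain monitoring above, the snippet below reads the current gas price via the standard `eth_gasPrice` JSON-RPC method. Using raw `fetch` (rather than the project's ethers.js setup) and the public Avalanche endpoint URL are assumptions for illustration; the live call is defined but not executed here:

```typescript
// Hedged sketch of C-Chain gas monitoring via standard JSON-RPC.
const C_CHAIN_RPC = "https://api.avax.network/ext/bc/C/rpc"; // public Avalanche C-Chain endpoint (assumed)

// Convert a hex-encoded wei quantity (as returned by eth_gasPrice) to gwei.
function hexWeiToGwei(hexWei: string): number {
  return Number(BigInt(hexWei)) / 1e9;
}

// Not invoked here: performs a live network call.
async function fetchGasPriceGwei(rpcUrl: string = C_CHAIN_RPC): Promise<number> {
  const res = await fetch(rpcUrl, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "eth_gasPrice", params: [] }),
  });
  const { result } = (await res.json()) as { result: string };
  return hexWeiToGwei(result);
}

console.log(hexWeiToGwei("0x3b9aca00")); // 0x3b9aca00 wei = 1 gwei
```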

Privacy-First

  • Zero data retention: Venice AI processes queries in-memory only
  • Local-first processing: Core M3 calculations run client-side
  • No tracking: Your economic insights remain yours alone
  • Open-source baseline: Core algorithms MIT/Apache 2.0

🎯 Use Cases

For Traders

  • Hear risk before you see it – Audio alerts for volatility spikes
  • Feel network congestion – Haptic feedback for transaction timing
  • Multi-asset monitoring – Parallel sonification of multiple tokens

For Accessibility

  • Blind users: Full audio + haptic interface (no screen required)
  • ADHD: Multi-sensory engagement reduces cognitive overload
  • Dyslexia: Bypass text-heavy charts entirely

For Institutions

  • High-frequency trading: Sub-50ms synchronized sensory updates
  • Compliance: Privacy-first architecture means no data liability
  • Custom certifications: Tier 1-3 licensing for specific sensory domains

🏆 Recognition

  • Avalanche X402 Hackathon (December 2025) – Featured submission
  • Ashish Funding: $125k–$200k expected Q1 2026
  • Venice AI Partnership: Official validation of privacy-first architecture
  • Academic Interest: Partnerships pending with MIT Media Lab, Stanford HAI, CMU HCI Institute

🛠️ Technology Stack

Frontend:

  • Next.js 14 (React 18)
  • Three.js / React Three Fiber (3D visualization)
  • Web Audio API (7-Bell Harmonic System)
  • Gamepad API (haptic feedback)

Blockchain:

  • Avalanche C-Chain (ethers.js)
  • Interchain Messaging (ICM)
  • Pyth Network (price oracles)
  • LayerZero (cross-chain messaging)

AI Layer:

  • Venice AI (zero-retention)
  • Local LLM support (privacy-preserving)

Infrastructure:

  • Vercel (primary hosting)
  • Supabase (user preferences, anonymized)
  • Docker (containerization)

📜 License & IP

RangisNet is dual-licensed:

  • Open Source: Core algorithms under MIT/Apache 2.0
  • Commercial: Proprietary implementations available for licensing

Intellectual Property:

  • McCrea Market Metrics™ (M3) – Patent pending
  • Sonic Event Codes™ – Trademark registered
  • 7-Bell Harmonic System – Open standard (432 Hz baseline)

Full IP documentation →


🤝 Contributing

We welcome contributions to the open-source components!

  1. Read the AEAS v1.1 Standard to understand design principles
  2. Check open issues
  3. Fork, branch, and submit a PR
  4. Ensure compliance with the "Is" Mandate (no manipulative features)

Core Principles:

  • Truth without manipulation ("M3 is a resonator, not a weapon")
  • Accessibility first (universal design)
  • Privacy by default (zero-retention architecture)

📞 Contact & Community

Creator: William McCrea
Email: justin@realityprotocol.io
Website: rangisheartbeat.com | rangisnet.com

Social:

Get Involved:

  • Academic partnerships: Email for research collaboration
  • Commercial licensing: Contact for Tier 2/3 certifications
  • Accessibility testing: We need your feedback!

🎵 Philosophy

"M3 is not a weapon; it is a resonator. It reveals what is without telling you what to do. The signal is the truth. The choice is yours."
— William McCrea, Creator

The "Is" Mandate: RangisNet shows reality, not recommendations. We map data to senses with mathematical rigor (entropy preservation, reversibility) but never manipulate perception to influence decisions.
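One operational reading of "reversibility" (my interpretation, not a published spec) is that the data-to-sense map is a bijection, so no information is destroyed. An exponential pitch map from a bounded, normalized signal has an exact inverse:

```typescript
// Sketch of a reversible (bijective) sensory mapping: an exponential pitch map
// over the 432 Hz baseline, with its exact inverse. Illustrative, not the spec.
const BASE_HZ = 432;

// Map a normalized signal s in [0, 1] to a pitch spanning one octave.
function signalToPitch(s: number): number {
  return BASE_HZ * Math.pow(2, s);
}

// Exact inverse: recover the signal from the pitch.
function pitchToSignal(hz: number): number {
  return Math.log2(hz / BASE_HZ);
}

console.log(pitchToSignal(signalToPitch(0.37))); // round-trips to ~0.37
```

Because the map is invertible, a listener (or auditor) can in principle recover the underlying signal from the rendered pitch, which is one way to make "shows reality, not recommendations" checkable.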

432 Hz Harmony: The 7-Bell Harmonic System uses Pythagorean tuning for cross-cultural consonance. Truth sounds true.

Privacy IS Agency: Economic freedom requires informational sovereignty. Venice AI's zero-retention model ensures your insights remain yours alone.

Read the full philosophical dialogue with Venice AI →


🚀 Roadmap

Q4 2025

  • AEAS v1.1 Standard published
  • Venice AI partnership validated
  • Avalanche X402 hackathon submission
  • Literature review (48 citations)

Q1 2026 🔄

  • Ashish funding round ($125k–$200k)
  • Open-source baseline release
  • Academic partnerships (MIT, Stanford, CMU)
  • Commercial pilot programs

Q2-Q4 2026 🔮

  • Tier 2/3 certification launches
  • EcoVerse simulation environment (Chapter 10)
  • Cross-chain expansion (Ethereum, Solana via ICM)
  • Mobile app (iOS/Android with full haptic support)

⚡ Why RangisNet Matters

2.7 billion people are excluded from modern financial systems because interfaces assume you can see perfectly, read English fluently, and process abstract charts intuitively.

RangisNet rejects this assumption.

Everyone deserves economic cognition on their terms.

  • Blind users: Hear volatility as pitch shifts, feel momentum as vibration
  • ADHD: Multi-sensory engagement that works with your brain
  • Traders: Parallel monitoring via sonification (listen to 10 assets simultaneously)
  • Global South: No text required; culturally neutral harmonic signals

This isn't charity. This is better design for everyone.

Universal accessibility improves outcomes for all users, not just disabled users. That's the AEAS thesis.

Truth. Privacy. Agency. Harmony.


🔱 The standard is ready. The world is ready. 432 Hz harmony. 🎵
