CodeforGood1/ANN-AdaptiveTrainer

ANN1: Adaptive Activation Neural Network with ONNX Export

A custom feedforward neural network built from scratch in NumPy that evolves its per-neuron activation function combinations with a genetic algorithm. The model supports Leaky ReLU and Bell-shaped activations, uses adaptive learning rate decay, and is fully exportable to ONNX for cross-platform deployment.
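For reference, the two activation families can be sketched in NumPy as below. The exact Bell-shaped form (here a Gaussian bump) and the leak slope are assumptions for illustration, not necessarily the repository's exact definitions:

```python
import numpy as np

def leaky_relu(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    """Identity for positive inputs, small linear leak for negative ones."""
    return np.where(x > 0, x, alpha * x)

def bell(x: np.ndarray) -> np.ndarray:
    """Bell-shaped (Gaussian bump) activation: peaks at 1 when x == 0."""
    return np.exp(-x ** 2)

def masked_activation(x: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Per-neuron selection: mask 0 -> Leaky ReLU, mask 1 -> Bell."""
    return np.where(mask == 1, bell(x), leaky_relu(x))
```

The per-neuron mask is what the genetic algorithm described below evolves.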


🚀 Features

  • ✅ Two hidden layers with per-neuron activation selection
  • 🧬 Genetic algorithm to evolve best activation masks
  • ⚙️ Trained using Adam optimizer with learning rate scheduler
  • 🧠 Custom neural network logic (no PyTorch/TensorFlow in training)
  • 📦 Final trained weights saved as .npz
  • 🔀 ONNX model export using PyTorch wrapper
  • 📤 Ready for edge deployment and inference via ONNX Runtime

🗂 Directory Structure

ANN1/
│
├── model_weights/
│   └── best_model.npz             # Final trained model weights
│
├── src/
│   ├── main.py                    # Entry point for training and mask evolution
│   ├── activations.py             # Bell and Leaky ReLU functions
│   ├── layers.py                  # Dense layer logic
│   ├── trainer.py                 # Adam optimizer training logic
│   └── mask_evolver.py            # Genetic algorithm for activation masks
│
├── onnx_export/
│   ├── model.py                   # PyTorch wrapper for ONNX export
│   ├── export_onnx.py             # Script to convert to ONNX
│   └── test_onnx.py               # Script to test ONNX using onnxruntime
│
└── README.md                      # This file

📦 Installation

# Recommended environment
python -m venv ann-env
ann-env\Scripts\activate     # Windows
# source ann-env/bin/activate   # Linux/macOS

# Install dependencies
pip install numpy matplotlib onnx onnxruntime torch

🧪 Run Training and Export

# Step 1: Train and evolve
cd src
python main.py

# Step 2: Convert to ONNX
cd ../onnx_export
python export_onnx.py

# Step 3: Test ONNX model
python test_onnx.py

🧬 Activation Mask Evolution

This project evolves the best per-neuron activation function mask using a genetic algorithm. For each generation:

  • Random combinations of activations (0 for Leaky ReLU, 1 for Bell) are tested.
  • Models are trained briefly and scored by average loss.
  • Top performers breed new combinations using crossover + mutation.
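The generation loop above can be sketched as follows. The population size, mutation rate, and fitness function here are placeholders (the actual scoring briefly trains the network and uses average loss):

```python
import numpy as np

rng = np.random.default_rng(0)

def evolve_masks(fitness, n_neurons, pop_size=8, n_generations=5,
                 mutation_rate=0.05):
    """Evolve binary activation masks (0 = Leaky ReLU, 1 = Bell) by GA."""
    # Random initial population of per-neuron activation choices.
    pop = rng.integers(0, 2, size=(pop_size, n_neurons))
    for _ in range(n_generations):
        scores = np.array([fitness(m) for m in pop])  # lower loss = better
        parents = pop[np.argsort(scores)[: pop_size // 2]]  # top performers
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_neurons)              # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_neurons) < mutation_rate  # random bit flips
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([parents, children])
    return pop[np.argmin([fitness(m) for m in pop])]

# Toy fitness: pretend Bell works best on the first half of the layer.
target = np.array([1, 1, 1, 0, 0, 0])
best = evolve_masks(lambda m: np.sum(m != target), n_neurons=6)
```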

📤 Model Export

The trained .npz weights are loaded into a PyTorch wrapper module and exported to ONNX using:

torch.onnx.export(...)

This ONNX model can be used in:

  • C++ or Rust backends
  • Edge devices (NVIDIA Jetson, Raspberry Pi)
  • Cross-platform mobile inference

🧠 Example ONNX Inference (CPU)

import onnxruntime as ort
import numpy as np

sess = ort.InferenceSession("best_model.onnx", providers=["CPUExecutionProvider"])
x = np.random.randn(1, 8).astype(np.float32)  # one sample, 8 features
outputs = sess.run(None, {"input": x})        # None -> return all outputs
print(outputs)

📜 License

MIT License. Feel free to fork and build upon it.


Project: Real-time intelligent ANN with self-tuning activations and embedded inference.
