
TensorOps

A Work-In-Progress Autograd Library


Setting up TensorOps

First, clone the repository to make it available locally:

git clone https://github.com/andreaslam/TensorOps.git

At the root of the repository, run:

pip install .

Building the Backend

To compile the Rust OpenCL backend, cargo and maturin are needed.

Then run:

cd tensorops
maturin develop --release

Support for non-OpenCL backends is still in progress. PRs are welcome.
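
Once the extension is built, a quick sanity check is to import the package from Python (assuming it is importable as tensorops, matching the directory name above):

python -c "import tensorops"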

Getting started with TensorOps

There are some examples available in the examples folder.

Most of the examples in the examples folder have a corresponding PyTorch implementation for comparison.
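
As a rough sketch of what a TensorOps example looks like, the snippet below builds a small expression and runs a backward pass. The class names, import paths, and method names used here (Tensor, requires_grad, backward, grad) are assumptions and may differ from the actual API, so treat the examples folder as the authoritative reference.

# Illustrative sketch only; names and signatures below are assumed, not taken from the library.
from tensorops.tensor import Tensor  # assumed import path

a = Tensor([2.0, 3.0], requires_grad=True)   # assumed constructor with gradient tracking enabled
b = Tensor([4.0, 5.0], requires_grad=True)
c = (a * b).tanh()        # arithmetic and activation ops from the feature list below
loss = c.sum()            # reduce to a scalar before differentiating
loss.backward()           # backward pass populates gradients
print(a.grad)             # gradient of the loss with respect to a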

TensorOps Features

Node (Being Deprecated)

  • Forward pass
  • Backward pass
  • Node weight and gradient tracking (enable/disable)
  • Arithmetic operations (BIDMAS, negation, exponentiation, modulo, several Python reverse operations)
  • Non-linear activation functions (sin, cos, tanh, ReLU, sigmoid, ramp)
  • Lazy evaluation

Tensor

  • Weight and gradient tracking (enable/disable)
  • Arithmetic operations (BIDMAS, negation, exponentiation, modulo, several Python reverse operations)
  • Non-linear activation functions (sin, cos, tanh, ReLU, sigmoid, ramp)
  • Lazy evaluation (see the sketch after this list)
  • OpenCL Backend
  • Partial graph execution
  • Kernel fusion
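
Because evaluation is lazy, building an expression does not compute values immediately; the recorded graph is only executed when a result is requested, which is also what makes partial graph execution and kernel fusion possible on the OpenCL backend. The sketch below illustrates the idea; the evaluation call shown is hypothetical and the real API may expose this differently.

# Conceptual sketch of lazy evaluation; names below are assumed, not verified.
from tensorops.tensor import Tensor  # assumed import path

x = Tensor([1.0, 2.0], requires_grad=False)  # assumed constructor
y = (x * x) + x       # records the operations in a graph; nothing is computed yet
result = y.eval()     # hypothetical call that forces execution of the recorded graph
print(result)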

Model (New Version In Progress)

  • Mix and match activation functions
  • Configurable layer sizes
  • Customisable loss functions
  • Customisable forward passes and general-purpose neural network abstractions (see the training-loop sketch after the optimiser list below)

Loss functions (New Version In Progress)

  • Mean Absolute Error
  • Mean Squared Error

Optimisers (New Version In Progress)

  • Adam
  • AdamW
  • Stochastic Gradient Descent (SGD)
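
Put together, a training loop combines a Model, a loss function, and an optimiser roughly as follows. This is a hedged sketch: the import paths, class names (Model, MSELoss, SGD), and method signatures are assumptions based on the feature lists above, not the library's confirmed API.

# Hypothetical training-loop sketch; all names and signatures below are assumed.
from tensorops.tensor import Tensor   # assumed import path
from tensorops.model import Model     # assumed
from tensorops.loss import MSELoss    # assumed
from tensorops.optim import SGD       # assumed

model = Model(layer_sizes=[2, 8, 1])          # configurable layer sizes (assumed signature)
loss_fn = MSELoss()
optimiser = SGD(model.parameters(), lr=1e-2)  # assumed parameter access and signature

for step in range(100):
    inputs = Tensor([[0.5, -1.0]])    # toy input
    targets = Tensor([[1.0]])         # toy target
    preds = model(inputs)             # forward pass
    loss = loss_fn(preds, targets)
    loss.backward()                   # backward pass computes gradients
    optimiser.step()                  # update weights
    optimiser.zero_grad()             # clear gradients for the next step (assumed)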

Utility features (New Version In Progress)

  • Function graphing and plotting
  • Colour-coded plotter for Directed Acyclic Graphs (DAGs)
