Aspectum

The first VR app to truly help the colorblind differentiate between colors and see our world.

Check out the demo here: https://youtu.be/22uRZOzXkL0

Features:

  • High-speed (30 fps) C++ gradient pass compiled into WASM (see the sketch after this list)
  • Python LSTM for hyperparameter tuning, plus gradient descent for smoothing
  • JavaScript API running in the background to describe objects in the real world
  • VR-friendly interface to use as an app if wanted
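
A minimal sketch of how the WASM gradient pass might be driven from JavaScript. The file name `gradient.wasm` and the exports `alloc`, `free_buf`, and `shift_pixels` are hypothetical stand-ins for this repo's actual build output:

```javascript
// Hypothetical names: gradient.wasm, alloc, free_buf, shift_pixels.
const { instance } = await WebAssembly.instantiateStreaming(
  fetch('gradient.wasm')
);

// Copy a frame's RGBA bytes into WASM linear memory, run the C++ shift,
// then copy the result back into the ImageData.
function shiftFrame(frame) {
  const bytes = frame.data;
  const ptr = instance.exports.alloc(bytes.length);
  new Uint8Array(instance.exports.memory.buffer, ptr, bytes.length).set(bytes);
  instance.exports.shift_pixels(ptr, frame.width, frame.height);
  bytes.set(new Uint8Array(instance.exports.memory.buffer, ptr, bytes.length));
  instance.exports.free_buf(ptr);
  return frame;
}
```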

How to use:

  • Open the app's release link placed in the GitHub description
  • If using VR mode, place your phone into the glasses
  • Tap the screen to call the description API (NOTE: you need to allow TTS in your settings for this to work)

How it works:

  • Converts the RGB the camera sees into HSV and shifts hues with a piecewise function (sketched below)
  • Applies features from the two Python models, an intensity hyperparameter and gradient-descent smoothing, to the output image
  • Two canvases are created on the frontend, each duplicating the camera view
  • This gives a VR effect; an event listener is also placed on the frontend
  • When tapped, it sends a request to the proxy API, which calls a Llama model (via OpenRouter) and returns JSON text that is then read aloud (see the second sketch below)
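
A minimal sketch of the piecewise hue-shift loop, assuming a `<video>` element fed by `getUserMedia`, a hidden working canvas, and two visible canvases for the stereo view. The per-pixel pass is shown in plain JavaScript for clarity (in the app it is the C++/WASM gradient), and the piecewise breakpoints are illustrative placeholders, not the app's tuned values:

```javascript
// Standard RGB -> HSV conversion (h in degrees, s and v in [0, 1]).
function rgbToHsv(r, g, b) {
  r /= 255; g /= 255; b /= 255;
  const max = Math.max(r, g, b), min = Math.min(r, g, b), d = max - min;
  let h = 0;
  if (d !== 0) {
    if (max === r) h = ((g - b) / d) % 6;
    else if (max === g) h = (b - r) / d + 2;
    else h = (r - g) / d + 4;
    h *= 60;
    if (h < 0) h += 360;
  }
  return [h, max === 0 ? 0 : d / max, max];
}

// Standard HSV -> RGB conversion.
function hsvToRgb(h, s, v) {
  const c = v * s, x = c * (1 - Math.abs(((h / 60) % 2) - 1)), m = v - c;
  const [r, g, b] =
    h < 60 ? [c, x, 0] : h < 120 ? [x, c, 0] : h < 180 ? [0, c, x] :
    h < 240 ? [0, x, c] : h < 300 ? [x, 0, c] : [c, 0, x];
  return [(r + m) * 255, (g + m) * 255, (b + m) * 255];
}

// Piecewise hue shift: push confusable hue bands apart (illustrative values).
function shiftHue(h) {
  if (h < 60) return h * 0.5;               // compress reds
  if (h < 180) return 60 + (h - 60) * 1.25; // expand greens toward cyan
  return h;                                 // leave blues untouched
}

// Shift one camera frame and duplicate it onto both canvases for the VR effect.
// Call once after getUserMedia wires the stream into `video`.
function renderFrame(video, leftCtx, rightCtx, work) {
  const { width, height } = work.canvas;
  work.drawImage(video, 0, 0, width, height);
  const frame = work.getImageData(0, 0, width, height);
  const px = frame.data;
  for (let i = 0; i < px.length; i += 4) {
    const [h, s, v] = rgbToHsv(px[i], px[i + 1], px[i + 2]);
    const [r, g, b] = hsvToRgb(shiftHue(h), s, v);
    px[i] = r; px[i + 1] = g; px[i + 2] = b;
  }
  leftCtx.putImageData(frame, 0, 0);
  rightCtx.putImageData(frame, 0, 0);
  requestAnimationFrame(() => renderFrame(video, leftCtx, rightCtx, work));
}
```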
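
And a minimal sketch of the tap-to-describe flow, assuming the current frame is posted to the local proxy as a data URL; the `/api/describe` path and the `{ description }` response shape are hypothetical stand-ins for the repo's actual proxy route, which forwards to a Llama model on OpenRouter:

```javascript
document.addEventListener('click', async () => {
  // Snapshot the current corrected frame as a JPEG data URL.
  const image = document.querySelector('canvas').toDataURL('image/jpeg');

  // Ask the proxy (which holds the OpenRouter key) to describe the scene.
  const res = await fetch('/api/describe', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ image }),
  });
  const { description } = await res.json();

  // Read the description aloud with the browser's built-in TTS.
  speechSynthesis.speak(new SpeechSynthesisUtterance(description));
});
```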

AI Use

  • AI was used to debug code as well as create the logo design

How to Contribute

  • Run git clone on the repo
  • Place one of your OpenRouter API keys where it prompts you for it
  • Then run npm install in both folders, followed by npm run dev, to start the necessary local servers
  • Start testing!

Thanks

Thanks to the various researchers since the 1990s that have been developing and refining the algorithms that I built this project on top of.