The first VR app to truly help the colorblind differentiate between colors and see our world.
Check out the demo here: https://youtu.be/22uRZOzXkL0

- High-speed (30 fps) C++ gradient pipeline compiled into WASM
- Python LSTM for hyperparameter tuning and gradient descent for smoothing
- JavaScript API running in the background for describing objects in the real world
- VR-friendly interface that can be used as an app if wanted
- Open the link to the release of the app in the GitHub description
- If using VR mode, place your phone into the glasses
- Tap the screen to call the description API (NOTE: you need to enable TTS in your settings for this to work)
- Converts the RGB values the camera sees to HSV and applies a hue-shifting algorithm based on a piecewise function
- Uses features from the two Python models, including an intensity hyperparameter and gradient descent smoothing, and applies them to the output image
- Two canvases are created on the frontend that duplicate the camera view, giving a VR effect
- An event listener is also placed on the frontend
- When tapped, it sends a request to the proxy API, which calls a Llama model (OpenRouter) and returns JSON text that is then read aloud
- AI was used to debug code as well as create the logo design
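The RGB-to-HSV piecewise hue shift above can be sketched in plain JavaScript. The breakpoints (60° and 180°) and the shift amounts here are illustrative assumptions, not the project's tuned values; the `intensity` parameter stands in for the model-derived hyperparameter:

```javascript
// Convert an RGB triple (0-255) to HSV (h in degrees, s and v in 0-1).
function rgbToHsv(r, g, b) {
  r /= 255; g /= 255; b /= 255;
  const max = Math.max(r, g, b), min = Math.min(r, g, b);
  const d = max - min;
  let h = 0;
  if (d !== 0) {
    if (max === r) h = 60 * (((g - b) / d) % 6);
    else if (max === g) h = 60 * ((b - r) / d + 2);
    else h = 60 * ((r - g) / d + 4);
  }
  if (h < 0) h += 360;
  return [h, max === 0 ? 0 : d / max, max];
}

// Convert HSV back to an RGB triple (0-255).
function hsvToRgb(h, s, v) {
  const c = v * s;
  const x = c * (1 - Math.abs(((h / 60) % 2) - 1));
  const m = v - c;
  let rgb;
  if (h < 60)       rgb = [c, x, 0];
  else if (h < 120) rgb = [x, c, 0];
  else if (h < 180) rgb = [0, c, x];
  else if (h < 240) rgb = [0, x, c];
  else if (h < 300) rgb = [x, 0, c];
  else              rgb = [c, 0, x];
  return rgb.map((u) => Math.round((u + m) * 255));
}

// Piecewise hue shift: push confusable red/green hues apart and leave
// blues untouched. `intensity` (0-1) scales the shift strength.
function shiftHue(h, intensity) {
  if (h < 60) return h + 60 * intensity;   // reds pushed toward yellow
  if (h < 180) return h - 40 * intensity;  // greens pushed toward cyan
  return h;                                // blues/magentas unchanged
}

// Full per-pixel pipeline: RGB -> HSV -> piecewise shift -> RGB.
function shiftPixel(r, g, b, intensity) {
  const [h, s, v] = rgbToHsv(r, g, b);
  const wrapped = ((shiftHue(h, intensity) % 360) + 360) % 360;
  return hsvToRgb(wrapped, s, v);
}
```

With `intensity` at 0 a pixel passes through unchanged; at 1, pure red is pushed all the way to yellow.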
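The two-canvas VR duplication and the click listener could look roughly like this; element handling and the exact layout are assumptions, with the viewport math kept as a pure helper:

```javascript
// Pure helper: split the screen into left/right eye viewports.
function eyeViewports(width, height) {
  const half = Math.floor(width / 2);
  return [
    { x: 0, y: 0, w: half, h: height },            // left eye
    { x: half, y: 0, w: width - half, h: height }, // right eye
  ];
}

// Browser-only sketch: copy each camera frame into both viewports so
// each eye sees the same processed image, and attach the tap listener
// that triggers the description API.
function startStereoLoop(video, canvas, onTap) {
  const ctx = canvas.getContext("2d");
  canvas.addEventListener("click", onTap); // the frontend event listener
  (function draw() {
    for (const vp of eyeViewports(canvas.width, canvas.height)) {
      ctx.drawImage(video, vp.x, vp.y, vp.w, vp.h);
    }
    requestAnimationFrame(draw);
  })();
}
```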
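The tap-to-describe flow can be sketched as below. The proxy route `/api/describe` and the request body are assumptions; the response parsing follows the OpenAI-style chat-completion schema that OpenRouter uses, and the readout uses the browser's Web Speech API:

```javascript
// OpenRouter follows the OpenAI chat-completion schema, so the reply
// text lives at choices[0].message.content.
function extractDescription(json) {
  return json?.choices?.[0]?.message?.content ?? "";
}

// Browser-only sketch of the tap handler: POST the current frame to the
// proxy (route is an assumption), parse the JSON, and read it aloud.
async function describeScene(imageDataUrl) {
  const res = await fetch("/api/describe", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ image: imageDataUrl }),
  });
  const text = extractDescription(await res.json());
  if (typeof speechSynthesis !== "undefined") {
    speechSynthesis.speak(new SpeechSynthesisUtterance(text)); // TTS readout
  }
  return text;
}
```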
- Run `git clone` on the repo
- Place one of your OpenRouter API keys where it prompts you for it
- Then `npm install` in both folders, then `npm run dev` to create the necessary local servers
- Start testing!
Thanks to the various researchers since the 1990s who have been developing and refining the algorithms this project is built on top of.
