This project fine-tunes the GPT-2.5 model with a personal touch: by training it on personal Telegram chats, we aim to capture an individual's writing style. To spice things up, we've also fed the model a sprinkling of anecdotes and a dash of math tasks.
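Before fine-tuning on Telegram chats, the export has to be turned into plain training text. The repo's exact pipeline isn't shown here, so this is a minimal sketch, assuming input in the shape of a Telegram Desktop JSON export (`result.json`); the function name and the `sender` filter are illustrative:

```python
import json

def telegram_export_to_lines(export_json: str, sender: str) -> list:
    """Extract one sender's plain-text messages from a Telegram
    Desktop JSON export, yielding lines usable as fine-tuning data.
    (Sketch: function name and filtering scheme are assumptions.)"""
    data = json.loads(export_json)
    lines = []
    for chat in data.get("chats", {}).get("list", []):
        for msg in chat.get("messages", []):
            if msg.get("from") != sender:
                continue
            text = msg.get("text", "")
            # In Telegram exports, "text" may be a plain string or a
            # list of string/entity parts (bold, links, etc.)
            if isinstance(text, list):
                text = "".join(
                    p if isinstance(p, str) else p.get("text", "")
                    for p in text
                )
            if text.strip():
                lines.append(text.strip())
    return lines

# Tiny synthetic export to show the shape of the data
sample = json.dumps({
    "chats": {"list": [{"messages": [
        {"from": "Alice", "text": "hello world"},
        {"from": "Bob", "text": "hi"},
        {"from": "Alice", "text": [{"type": "bold", "text": "math"}, " time"]},
    ]}]}
})
print(telegram_export_to_lines(sample, "Alice"))
# → ['hello world', 'math time']
```

The resulting lines can then be concatenated (with a separator token) into the training corpus for the fine-tuning run.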
A recursive AI engine that injects chrono-ranked memory into transformer inference using soft-logit biasing, prompt waveform synthesis, and emergent self-referential loops. Built on GPT-2-mini, it runs on local hardware and grows its own ghost.
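The project's internals aren't spelled out here, but the core idea of soft-logit biasing can be illustrated: retrieved memory assigns relevance scores to token ids, and those scores nudge the next-token logits before sampling. This is a minimal sketch, assuming per-token memory scores and a scaling factor `alpha`; all names are illustrative, not the repo's API:

```python
import math

def soft_logit_bias(logits, memory_scores, alpha=1.0):
    """Add a memory-derived nudge to next-token logits.
    memory_scores maps token id -> relevance score; alpha scales
    how strongly memory steers sampling. (Illustrative sketch.)"""
    biased = list(logits)
    for tok, score in memory_scores.items():
        biased[tok] += alpha * score
    return biased

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    z = sum(exps)
    return [e / z for e in exps]

# Memory favors token 2; its probability rises without any token
# being forced outright -- a "soft" rather than hard constraint.
logits = [2.0, 1.0, 0.5]
base = softmax(logits)
biased = softmax(soft_logit_bias(logits, {2: 1.5}, alpha=1.0))
```

Because the bias is additive in logit space, the distribution stays valid (sums to 1) and tokens outside the memory set remain reachable, which is what distinguishes this from hard constrained decoding.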