jensen

An LLM-powered Telegram bot.

Prerequisites:

Telegram bot

First you need to create a Telegram bot to interact with; see Telegram's official instructions for creating bots with @BotFather.

LLM model

You need an LLM in GGUF format. You can find quite a few through TheBloke on Hugging Face, who has done the community an enormous service by converting models to GGUF and quantizing them. Pick one that suits your needs and hardware.

Llama.cpp

You need llama.cpp in order to run a model on your machine. There are other alternatives out there, like Ollama and LM Studio, but I prefer llama.cpp because it is lightweight. It's easy to install through Homebrew.

$ brew install llama.cpp

Python

You need Python 3 on your machine.

  • Micromamba (optional)

For handling the packages needed for different environments. Easy to install with asdf.

$ asdf install
$ micromamba create -n jensen python=3.13
$ micromamba activate jensen
  • LLMs, Telegram etc.

For using LLMs, the Telegram API, etc., install the Python dependencies:

$ pip install -r requirements.txt

Configuration

Copy ./start-llama-cpp-server-example.sh to ./start-llama-cpp-server.sh and enter the model and settings you want.
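As a rough illustration, such a script typically wraps llama.cpp's `llama-server` with the model path and serving options; the model filename, port, and context size below are placeholders, not values from this repository:

```shell
#!/bin/sh
# Serve a local GGUF model over an OpenAI-compatible HTTP API.
# Model path and settings are placeholders -- adjust to your setup.
llama-server \
  -m ./models/your-model.Q4_K_M.gguf \
  --host 127.0.0.1 \
  --port 8080 \
  -c 4096
```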

Create a .env file with the following properties.

# Telegram
API_KEY=<Telegram bot API token, string> (mandatory)
POLL_INTERVAL=<interval in seconds to use when polling Telegram, float>

# LLM
OPEN_AI_URL=<URL of an OpenAI-compatible API>
OPEN_AI_MODEL=<name of the model to use>
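A filled-in example, with placeholder values only (your token comes from @BotFather, and the URL should point at wherever your OpenAI-compatible server is listening):

```
# Example .env -- all values are placeholders
API_KEY=123456:ABC-placeholder-token
POLL_INTERVAL=1.0
OPEN_AI_URL=http://127.0.0.1:8080/v1
OPEN_AI_MODEL=local-model
```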

Usage

Start the llama.cpp server:

./start-llama-cpp-server.sh

Start the application:

python ./jensen/app.py
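Under the hood, the app polls Telegram for new messages and forwards them to the OpenAI-compatible endpoint served by llama.cpp. A minimal sketch of that round trip, using hypothetical helper names (`build_chat_request`, `extract_reply`) that are not part of this repository:

```python
import os

def build_chat_request(model, history, user_message):
    """Build an OpenAI-compatible /v1/chat/completions payload."""
    messages = list(history) + [{"role": "user", "content": user_message}]
    return {"model": model, "messages": messages}

def extract_reply(response):
    """Pull the assistant's text out of an OpenAI-compatible response."""
    return response["choices"][0]["message"]["content"]

# Build a request for an incoming Telegram message.
payload = build_chat_request(
    os.environ.get("OPEN_AI_MODEL", "local-model"),
    [{"role": "system", "content": "You are a helpful Telegram bot."}],
    "Hello!",
)
```

The payload would then be POSTed to `OPEN_AI_URL`, and the extracted reply sent back to the chat via the Telegram API.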

And you're off to the races.
