Querious - Local Retrieval-Augmented Generation Chatbot

Querious is a fully local RAG (Retrieval-Augmented Generation) system powered by a lightweight large language model (LLM). Users upload PDFs and ask questions about their content; the system parses the documents, retrieves the most relevant chunks, and generates answers grounded in the source material.
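The retrieve-then-generate loop described above can be sketched in a few lines. The real system uses FAISS dense-vector search and a local LLaMA model; in this illustrative sketch a simple word-overlap score stands in for embedding similarity, just to show the flow: score chunks against the query, keep the top k, and build a prompt grounded in the retrieved text.

```python
# Minimal sketch of a RAG loop. Word overlap stands in for vector
# similarity here; the actual system embeds chunks and searches FAISS.
import re

def tokens(text: str) -> set[str]:
    return set(re.findall(r"\w+", text.lower()))

def score(query: str, chunk: str) -> float:
    """Stand-in for vector similarity: fraction of query words found in the chunk."""
    q, c = tokens(query), tokens(chunk)
    return len(q & c) / len(q) if q else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Return the k chunks most relevant to the query."""
    return sorted(chunks, key=lambda ch: score(query, ch), reverse=True)[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Assemble a prompt that grounds the LLM's answer in retrieved text."""
    joined = "\n---\n".join(context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

chunks = [
    "The warranty covers parts and labor for two years.",
    "Shipping takes five to seven business days.",
]
query = "How long is the warranty?"
prompt = build_prompt(query, retrieve(query, chunks, k=1))
```

Only the retrieved context reaches the model, which is what keeps answers tied to the uploaded documents rather than the model's general knowledge.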

Features

  • Upload and analyze multiple PDF documents
  • Query documents using natural language
  • Uses retrieval-augmented generation (RAG) to ground LLM responses
  • Citations include source filename and page/chunk references
  • Works fully offline — all inference and document handling are local

Tech Stack

  • LLM: LLaMA 3.2 (3B parameters) via llamafile
  • Retrieval: FAISS vector store over chunked document embeddings
  • Frontend: Streamlit
  • Backend: Python
  • Tools: LangChain
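The "chunked document embeddings" above start with a text splitter. Below is an illustrative fixed-size splitter with overlap; the size and overlap values are common defaults, not the actual Querious parameters. Overlap keeps sentences that straddle a chunk boundary retrievable from at least one chunk.

```python
def chunk_text(text: str, size: int = 400, overlap: int = 50) -> list[str]:
    """Split text into windows of `size` chars, each sharing
    `overlap` chars with the previous window."""
    if size <= overlap:
        raise ValueError("size must exceed overlap")
    chunks, step = [], size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + size])
        if start + size >= len(text):
            break
    return chunks
```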

Setup

  1. Clone this repository
  2. Add PDF files of your choice to the data/ folder
  3. Run the following in a terminal:

a. Install streamlit globally

pip3 install streamlit

b. Create a local virtual environment

pip3 install virtualenv
virtualenv venv

c. Activate the virtual environment. For macOS/Linux:

source venv/bin/activate

For Windows:

.\venv\Scripts\activate

d. Install remaining dependencies:

pip3 install -r requirements.txt

NOTE: to deactivate the virtual environment, run:

deactivate

To Run:

  1. In your project directory, populate the vector database:

streamlit run populate_database.py

  2. Then, in the same directory, launch the app:

streamlit run ollama-streamlit-app.py
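Because answers cite "source filename and page/chunk references", the populate step must store that metadata alongside every chunk it embeds. A conceptual sketch of that record shape (the names here are illustrative, not the actual Querious internals):

```python
# Illustrative record shape for citable chunks; one chunk per page
# here for simplicity. Names are hypothetical, not the real code.
from dataclasses import dataclass

@dataclass
class IndexedChunk:
    text: str      # chunk contents fed to the embedder
    source: str    # originating PDF filename
    page: int      # 1-based page number
    chunk_id: int  # position within the document

def index_pages(source: str, pages: list[str]) -> list[IndexedChunk]:
    """Index extracted page text as citable chunks."""
    return [IndexedChunk(text, source, page_no, i)
            for i, (page_no, text) in enumerate(enumerate(pages, start=1))]

records = index_pages("report.pdf", ["Intro text.", "Findings text."])
citation = f"{records[1].source}, page {records[1].page}"
```

At query time, whichever chunks are retrieved carry their source and page along, so the answer can point back to the exact PDF location.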

Sources/Inspiration
