This guide explains how to ingest documents and maintain the Vector Database for the Civilization Node.
Before first use, ensure the embedding model is active by running `./rag_setup.sh`. Then configure Open WebUI: Admin Panel > Documents > Embedding Model > `nomic-embed-text`.
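If you'd rather verify the embedding endpoint from a script than through the UI, here is a minimal sketch. The helper names are mine, and it assumes Ollama's default port 11434 and its `/api/embeddings` endpoint:

```python
import json
import urllib.request

def build_embed_request(text, model="nomic-embed-text",
                        host="http://localhost:11434"):
    """Build an urllib Request for Ollama's /api/embeddings endpoint."""
    payload = json.dumps({"model": model, "prompt": text}).encode()
    return urllib.request.Request(
        f"{host}/api/embeddings", data=payload,
        headers={"Content-Type": "application/json"})

def embed(text):
    """Return the embedding vector; raises if Ollama is not running."""
    with urllib.request.urlopen(build_embed_request(text)) as resp:
        return json.load(resp)["embedding"]
```

`build_embed_request` only constructs the request, so you can inspect it offline; calling `embed(...)` requires a running Ollama instance.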
- PDF: Best for manuals and papers.
- TXT/MD: Good for notes.
- Drag & Drop: Open the "Documents" tab in the Open WebUI sidebar.
- Collection Tags:
  - Always tag your uploads! Use `#manuals` for technical docs and `#research` for papers.
  - This allows targeted querying (e.g., "In #manuals, how do I reset the device?").
- Wait for ingestion to finish:
  - Watch the progress bar.
  - Only upload 5-10 large PDFs at a time to avoid jamming the queue.
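The tag convention above lends itself to a small query parser. A sketch follows; the function name and exact cleanup behavior are my assumptions, not Open WebUI internals:

```python
import re

def parse_collection_tags(query):
    """Split a user query into (tags, cleaned_query).

    Tags follow the #word convention used for collections,
    e.g. #manuals or #research.
    """
    tags = re.findall(r"#(\w+)", query)
    cleaned = re.sub(r"#\w+", "", query)          # drop the tag tokens
    cleaned = re.sub(r"\s{2,}", " ", cleaned).strip()
    return tags, cleaned
```

For example, `parse_collection_tags("In #manuals, how do I reset the device?")` yields the tag list `["manuals"]` plus the query text with the tag removed.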
To ensure your RTX 4070 is doing the work (not CPU):
- Open a terminal during ingestion.
- Run `watch -n 1 nvidia-smi`.
- You should see an `ollama` process using >1000 MB of VRAM, listed under the compute (`C`) process type.
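If you prefer a one-shot check over watching the live view, `nvidia-smi`'s compute-apps query mode emits CSV that is easy to parse. A sketch, with the helper name mine and the query fields taken from nvidia-smi's query interface:

```python
import subprocess

def ollama_vram_mib(smi_output=None):
    """Return total VRAM (MiB) used by ollama compute processes.

    smi_output lets you pass captured text for testing; by default
    the function shells out to nvidia-smi.
    """
    if smi_output is None:
        smi_output = subprocess.check_output(
            ["nvidia-smi",
             "--query-compute-apps=process_name,used_gpu_memory",
             "--format=csv,noheader,nounits"],
            text=True)
    total = 0
    for line in smi_output.strip().splitlines():
        if not line:
            continue
        name, mem = (f.strip() for f in line.rsplit(",", 1))
        if "ollama" in name:
            total += int(mem)
    return total
```

A result above 1000 MiB during ingestion indicates the GPU, not the CPU, is doing the embedding work.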
Issue: "Embedding generation failed"
- Check the model: re-run `./rag_setup.sh`.
- Check the logs: `docker logs civ_ollama` (look for OOM errors).
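To scan the container logs for OOM hints programmatically, a sketch; the marker strings are guesses at typical CUDA/out-of-memory error text, not a definitive list:

```python
def find_oom_lines(log_text):
    """Return log lines that suggest an out-of-memory failure.

    Substring matching is deliberately loose ("oom" will also match
    unrelated words), so treat hits as candidates to read, not proof.
    """
    markers = ("out of memory", "oom", "cuda error")
    return [line for line in log_text.splitlines()
            if any(m in line.lower() for m in markers)]
```

Feed it the captured output of `docker logs civ_ollama 2>&1` and inspect any lines it returns.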
Issue: Search is slow
- You might have too many small files. Try merging related PDFs.
- Ensure `nomic-embed-text` is being used (it's faster than larger models).
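If you suspect too many small files, one quick way to spot merge candidates is to group filenames by a shared prefix. A sketch; the `topic-part.pdf` naming convention and the helper name are assumptions for illustration:

```python
from collections import defaultdict
from pathlib import Path

def merge_candidates(filenames, min_group=3):
    """Group filenames by the prefix before the first '-' and report
    groups large enough to be worth merging into a single PDF."""
    groups = defaultdict(list)
    for name in filenames:
        key = Path(name).stem.split("-")[0]
        groups[key].append(name)
    return {k: v for k, v in groups.items() if len(v) >= min_group}
```

For example, `radio-1.pdf`, `radio-2.pdf`, and `radio-3.pdf` would be flagged as one `radio` group worth merging, while a lone `solar.pdf` would not.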