- Start an OpenAI-compatible API on a host accessible from the containers:

  ```
  OLLAMA_HOST=localhost:8000 ollama serve
  ```

- Pull the `mistral:latest` model on the host where you are running Ollama:

  ```
  ollama pull mistral:latest
  ```

- Start the containers using Docker Compose:

  ```
  docker-compose up
  ```

- Wait for everything to be up and then pull the `orca-mini` model (see the verification sketch after this list):

  ```
  podman exec -it camel-assistant_ollama_1 ollama pull orca-mini
  ```

  NOTE: this may take a while, as it needs to download about 2 GB of data from HuggingFace.

- Proceed to the Loading Data section for details about how to load data.
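Once the stack is up, a quick sanity check is to list the containers and confirm the model is present inside the Ollama container. A minimal sketch, assuming the Compose project is named `camel-assistant` as in the `podman exec` command above:

```
# List the containers started by Docker Compose
docker-compose ps

# Confirm the orca-mini model was pulled inside the Ollama container
podman exec -it camel-assistant_ollama_1 ollama list
```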
- A Kafka instance up and running and able to receive remote requests.
- Podman installed and running (locally or remotely)
- Ollama installed and running (locally or remotely)
NOTE: URLs and hostnames can be configured in the `application.properties` file or overridden via environment variables. For instance,
if Qdrant is running on another host, you can set its hostname using the `QDRANT_HOST` variable.
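For example, before launching the services you could export the hosts as environment variables. A minimal sketch: `qdrant.example.com` is a placeholder hostname, only `QDRANT_HOST` and `KAFKA_BROKERS` are taken from this guide, and any other overrides depend on your `application.properties`:

```
# Qdrant running on another host (placeholder hostname)
export QDRANT_HOST=qdrant.example.com

# Kafka brokers used by the backend and ingestion services
export KAFKA_BROKERS=kafka-host:9092
```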
- Build the project:

  ```
  mvn clean package
  ```

- Launch Qdrant:

  ```
  podman run -d --rm --name qdrant -p 6334:6334 -p 6333:6333 -v $(pwd)/qdrant_storage:/qdrant/storage:z qdrant/qdrant:v1.13.6-unprivileged
  ```

- Launch Ollama:

  ```
  OLLAMA_HOST=localhost:8000 ollama serve
  ```

  NOTE: make sure you have the `mistral:latest` model available. If not, download it using `OLLAMA_HOST=localhost:8000 ollama pull mistral:latest`.

- Launch the ingestion sink:

  ```
  KAFKA_BROKERS=kafka-host:9092 java -jar ./assistant-ingestion-sink/target/quarkus-app/quarkus-run.jar
  ```

- Launch the ingestion source:

  ```
  KAFKA_BROKERS=kafka-host:9092 java -jar ./assistant-ingestion-sources/plain-text-source/target/quarkus-app/quarkus-run.jar
  ```

- Launch the backend:

  ```
  KAFKA_BROKERS=kafka-host:9092 java -jar ./assistant-backend/target/quarkus-app/quarkus-run.jar
  ```

- Launch the UI:
  ```
  java -jar assistant-ui-vaadin/target/quarkus-app/quarkus-run.jar
  ```

To load PDF data (such as documentation, books, etc.) into the Qdrant DB, use the command:

```
cd assistant-cli && java -jar target/quarkus-app/quarkus-run.jar consume file /path/to/red_hat_build_of_apache_camel-4.0-tooling_guide-en-us.pdf
```

NOTE: you can download some PDFs from here.
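If you have a whole directory of PDFs to ingest, a small shell loop over the same `consume file` command works. This is just a sketch, and `/path/to/pdfs` is a placeholder:

```
# Ingest every PDF in a directory, one CLI invocation per file
cd assistant-cli
for pdf in /path/to/pdfs/*.pdf; do
    java -jar target/quarkus-app/quarkus-run.jar consume file "$pdf"
done
```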
You can load data from the Camel Dataset.
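The download commands below use `huggingface-cli`; if you do not have it yet, it is shipped with the `huggingface_hub` Python package:

```
# Install the Hugging Face CLI (part of the huggingface_hub package)
pip install -U "huggingface_hub[cli]"
```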
To download the dataset for data formats, components, EIPs and languages:
```
huggingface-cli download --repo-type dataset --local-dir camel-dataformats megacamelus/camel-dataformats
huggingface-cli download --repo-type dataset --local-dir camel-components megacamelus/camel-components
huggingface-cli download --repo-type dataset --local-dir camel-eips megacamelus/camel-eips
huggingface-cli download --repo-type dataset --local-dir camel-languages megacamelus/camel-languages
```

Use these commands to load the datasets into the DB:
```
java -jar target/quarkus-app/quarkus-run.jar consume dataset --path ~/code/datasets/camel-dataformats/ --source org.apache.camel
java -jar target/quarkus-app/quarkus-run.jar consume dataset --path ~/code/datasets/camel-components/ --source org.apache.camel
java -jar target/quarkus-app/quarkus-run.jar consume dataset --path ~/code/datasets/camel-eips/ --source org.apache.camel
java -jar target/quarkus-app/quarkus-run.jar consume dataset --path ~/code/datasets/camel-languages/ --source org.apache.camel
```

Wait a few seconds after running the load command, and then check if the data is available in the Qdrant DB:
```
curl -X POST http://localhost:6333/collections/camel/points/scroll -H "Content-Type: application/json" -d "{\"limit\": 50 }" | jq .
```
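To get just a quick count instead of the full payloads, you can narrow the `jq` filter. This assumes the usual Qdrant scroll response, where the points are returned under `.result.points`:

```
# Count the points returned by the scroll request (up to the requested limit)
curl -s -X POST http://localhost:6333/collections/camel/points/scroll \
    -H "Content-Type: application/json" \
    -d '{"limit": 50}' | jq '.result.points | length'
```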
To build the containers locally:

```
mvn clean package -Dquarkus.container-image.build=true
```
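After the build you can check that the images are available locally. By default Quarkus derives the image name from each module's artifact ID, so filtering on `assistant` should catch them; this is an assumption about the naming, so adjust the filter if your configuration overrides it:

```
# List locally built images (the name filter is an assumption)
podman images | grep assistant
```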