- Download Ollama from Ollama's official website.
- Install the Llama3.2:4b model by running the following command in your terminal: `ollama pull llama3.2:4b`
- Use the Live Server extension in VS Code to run the `index.html` file.
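Once Ollama is running, it exposes a REST API on `http://localhost:11434` by default, which a page like `index.html` can call from the browser. The sketch below is an assumption about how such a call might look (the function names are illustrative, not from this repo); it uses Ollama's standard `/api/generate` endpoint.

```javascript
// Build the request for Ollama's /api/generate endpoint.
// (Illustrative helper - not part of this repository's code.)
function buildGenerateRequest(model, prompt) {
  return {
    url: "http://localhost:11434/api/generate",
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false returns a single JSON object instead of a chunk stream
      body: JSON.stringify({ model, prompt, stream: false }),
    },
  };
}

// Send a prompt to the local Ollama server and return the reply text.
async function askModel(model, prompt) {
  const { url, options } = buildGenerateRequest(model, prompt);
  const res = await fetch(url, options);
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response; // the generated text
}

// Example (requires a running Ollama server with the model pulled):
// askModel("llama3.2:4b", "Say hello").then(console.log);
```

Note that the browser page and Ollama must be on the same machine, since the server listens on localhost.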