diff --git a/docs/developer-hub/building-on-0g/compute-network/inference-provider.md b/docs/developer-hub/building-on-0g/compute-network/inference-provider.md
index 331065ec..71ed27b0 100644
--- a/docs/developer-hub/building-on-0g/compute-network/inference-provider.md
+++ b/docs/developer-hub/building-on-0g/compute-network/inference-provider.md
@@ -112,7 +112,7 @@
 targetUrl: "http://localhost:8000" # Your model service
 model: "llama-3.3-70b-instruct" # Model identifier
 ```
 :::info Serving URL
-Serving URL must be publically accessible from the internet.
+Serving URL must be publicly accessible from the internet.
 :::
 ### Configure Docker Port
@@ -175,4 +175,4 @@ The automatic settlement engine handles payments. If issues occur:
 
 ## Next Steps
 - **Join Community** → [Discord](https://discord.gg/0glabs) for support
-- **Explore SDK** → [SDK Documentation](./sdk) for integration details
\ No newline at end of file
+- **Explore SDK** → [SDK Documentation](./sdk) for integration details