Using LocalAI from China — connecting to upstream OpenAI/Anthropic via relay #9048
holysheep123
started this conversation in
General
Replies: 0 comments
Use Case
LocalAI can act as a drop-in OpenAI replacement, but using it as a proxy that forwards requests to the real OpenAI/Anthropic APIs fails from mainland China, where those upstream endpoints are blocked.
Solution: Configure LocalAI to route through an API relay
In LocalAI config YAML (model config)
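The code block for this section did not survive extraction. Below is a hypothetical sketch of a LocalAI model definition that points at an OpenAI-compatible relay; the `base_url`/`api_key` field names follow common OpenAI-compatible-gateway conventions and are not confirmed LocalAI options, and the relay endpoint path (`/v1`) is an assumption.

```yaml
# Hypothetical model config — field names are illustrative, not verified LocalAI options.
name: gpt-4o-relay
parameters:
  model: gpt-4o                        # upstream model name forwarded to the relay
base_url: https://holysheep.ai/v1     # relay's OpenAI-compatible endpoint (path assumed)
api_key: ${RELAY_API_KEY}             # key issued by the relay
```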
Or using environment variables
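This section's code block is also missing. A minimal sketch, assuming the client behind LocalAI honors the de facto standard `OPENAI_API_BASE`/`OPENAI_API_KEY` variables; the `/v1` path and the placeholder key are assumptions, not values from the original post.

```shell
# Hypothetical: environment variables read by most OpenAI-compatible clients.
export OPENAI_API_BASE="https://holysheep.ai/v1"   # relay endpoint; /v1 path is an assumption
export OPENAI_API_KEY="sk-..."                     # key issued by the relay (placeholder)
```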
For Anthropic/Claude backends
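The Anthropic example likewise did not survive. A sketch under the assumption that the client reads `ANTHROPIC_API_KEY` (standard across Anthropic SDKs) and honors `ANTHROPIC_BASE_URL` for overriding the endpoint — support for the latter varies by client, so treat it as an assumption.

```shell
# Hypothetical: route an Anthropic client through the relay.
export ANTHROPIC_BASE_URL="https://holysheep.ai"   # relay endpoint (assumption)
export ANTHROPIC_API_KEY="sk-ant-..."              # key issued by the relay (placeholder)
```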
The relay at holysheep.ai provides OpenAI/Anthropic-compatible API access from China with pay-as-you-go pricing (¥1=$1). Free registration: holysheep.ai/register
Does LocalAI's proxy mode work well with custom API bases? Happy to share more details about the setup.