Ollama

If you want to use Ollama's local OpenAI-compatible API from a browser-based tool, you need to allow CORS (Cross-Origin Resource Sharing).

Check whether CORS is enabled

curl -X OPTIONS http://localhost:11434 -H "Origin: http://example.com" -H "Access-Control-Request-Method: GET" -I

Here we are checking whether the origin example.com is allowed.

If you get output like this (the date will differ)

HTTP/1.1 403 Forbidden
Date: Wed, 09 Oct 2024 10:12:15 GMT
Content-Length: 0

it means CORS is not enabled for that origin.

Enable CORS

On macOS

launchctl setenv OLLAMA_ORIGINS "*"

or restrict it to specific origins, for example

launchctl setenv OLLAMA_ORIGINS "example.com,voov.ai"

On Linux, Ollama runs as a systemd service, so set the variable in the service unit; on Windows, set OLLAMA_ORIGINS as a system environment variable.
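On Linux, a sketch of the systemd route (this assumes the default ollama.service unit name created by the install script):

```shell
# Open an override file for the Ollama service
sudo systemctl edit ollama.service

# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_ORIGINS=*"

# Reload unit files and restart the service
sudo systemctl daemon-reload
sudo systemctl restart ollama
```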

You need to restart Ollama after doing this for the change to take effect.

Testing

Your output should look like the below if everything is set up correctly.

curl -X OPTIONS http://localhost:11434 -H "Origin: http://example.com" -H "Access-Control-Request-Method: GET" -I
HTTP/1.1 204 No Content
Access-Control-Allow-Headers: Authorization,Content-Type,User-Agent,Accept,X-Requested-With,X-Stainless-Lang,X-Stainless-Package-Version,X-Stainless-Os,X-Stainless-Arch,X-Stainless-Runtime,X-Stainless-Runtime-Version,X-Stainless-Async
Access-Control-Allow-Methods: GET,POST,PUT,PATCH,DELETE,HEAD,OPTIONS
Access-Control-Allow-Origin: *
Access-Control-Max-Age: 43200
Date: Wed, 09 Oct 2024 10:13:26 GMT
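The check above can be wrapped in a small script that prints a simple verdict instead of raw headers. This is a sketch: cors_status and check_cors are hypothetical helper names, and the defaults assume Ollama listening on localhost:11434 as in the examples above.

```shell
#!/bin/sh
# cors_status: map an HTTP status code to a verdict.
# A 2xx response (e.g. 204 No Content) means the preflight succeeded;
# anything else (e.g. 403 Forbidden) means the origin is not allowed.
cors_status() {
  case "$1" in
    2*) echo "allowed" ;;
    *)  echo "blocked" ;;
  esac
}

# check_cors: send the same OPTIONS preflight as above and print the
# verdict. Origin and host default to the values used in this post.
check_cors() {
  origin="${1:-http://example.com}"
  host="${2:-http://localhost:11434}"
  code=$(curl -s -o /dev/null -w '%{http_code}' -X OPTIONS "$host" \
    -H "Origin: $origin" \
    -H "Access-Control-Request-Method: GET")
  cors_status "$code"
}
```

Run check_cors with no arguments to test the default origin, or pass your tool's origin, for example check_cors https://voov.ai.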
By: Gavi Narra