Ollama
Ollama is an open-source tool for running large language models locally. It lets you download, run, and manage models entirely on your own machine; performance and the models available to you depend on your hardware and configuration.
💡 Locally deployed large language models, completely free to use.
Application and Setup Process
- Open the Ollama website and download the installer for your operating system.
- Go to the Models page, select the model you need, and run the command in the terminal (e.g., `ollama pull translategemma:4b`) to download the translategemma:4b model.
- Start Ollama: run `ollama run translategemma:4b` in the terminal. The system will automatically use the default API key, so you can start using the translation service without entering it manually.
- If you encounter any issues or have suggestions, you can submit them on the Feedback page.
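After the steps above, you can confirm the model was downloaded by querying Ollama's local REST API. The sketch below assumes Ollama's default endpoint (`http://localhost:11434`) and its documented `/api/tags` route, which lists locally installed models:

```python
import json
import urllib.request
from urllib.error import URLError

# Ollama's default local endpoint; /api/tags lists installed models.
OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"

def model_installed(tags_json: dict, name: str) -> bool:
    """Check a /api/tags response for a model pulled under the given name."""
    return any(m.get("name") == name for m in tags_json.get("models", []))

def check_local_model(name: str = "translategemma:4b") -> bool:
    """Ask the local Ollama server whether the model is available."""
    try:
        with urllib.request.urlopen(OLLAMA_TAGS_URL, timeout=5) as resp:
            return model_installed(json.load(resp), name)
    except (URLError, OSError):
        return False  # Ollama is not running or not reachable
```

If `check_local_model()` returns False, either the model has not been pulled yet or the Ollama program is not running.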
Frequently Asked Questions
If you have successfully run the Ollama program and correctly entered the model name and API endpoint in Translation Models Settings, but still encounter a 403 error, it may be due to a local network cross-origin issue. Follow the steps below for your operating system to enable CORS support.
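You can reproduce the browser's cross-origin request from a script to confirm whether CORS is the cause. This sketch sends a request with an `Origin` header, as a browser would, against Ollama's default endpoint (an assumption; adjust the port if you changed it). A 403 response here points to `OLLAMA_ORIGINS` not being set:

```python
import urllib.request
from urllib.error import HTTPError, URLError

# Ollama's default local endpoint.
OLLAMA_URL = "http://localhost:11434/api/tags"

def diagnose(status: int) -> str:
    """Map an HTTP status from a cross-origin request to a likely cause."""
    if status == 403:
        return "blocked: set OLLAMA_ORIGINS to allow this origin"
    if 200 <= status < 300:
        return "ok: CORS is not the problem"
    return f"unexpected status {status}"

def probe(origin: str = "http://example.com") -> str:
    """Send a request with an Origin header, as a browser would."""
    req = urllib.request.Request(OLLAMA_URL, headers={"Origin": origin})
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return diagnose(resp.status)
    except HTTPError as e:
        return diagnose(e.code)
    except (URLError, OSError):
        return "Ollama is not running or not reachable"
```

If `probe()` reports "blocked", apply the fix for your operating system below.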
- Windows
Open Control Panel → System → Advanced system settings → Environment Variables, and under User variables, add two new environment variables:
- OLLAMA_HOST: set to `0.0.0.0`
- OLLAMA_ORIGINS: set to `*` (allow access from all origins)
- After saving, restart the Ollama program.
- macOS
In the terminal, run `launchctl setenv OLLAMA_ORIGINS "*"` and then start the Ollama program.
- Linux
In the terminal, run `OLLAMA_ORIGINS="*" ollama serve` to start the Ollama program with cross-origin access enabled.
⚠️ The model name used by the Ollama program must match the model name in Translation Models Settings. For example, if the program is running the model translategemma:4b, the model name in Translation Models Settings must also be translategemma:4b.
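The match must be exact, tag included, which is easy to miss. A minimal illustration (the function name is hypothetical, for demonstration only):

```python
def names_match(running: str, configured: str) -> bool:
    """The model name must match exactly, tag included:
    'translategemma:4b' and 'translategemma' refer to different models."""
    return running.strip() == configured.strip()
```

For example, `names_match("translategemma:4b", "translategemma")` is False, so that configuration would fail.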
💡 For more Ollama operations, please refer to the official Ollama documentation.
✅ Before applying the cross-origin fix, make sure the Ollama program is closed.

