| Developer | fueled, 10up |
|---|---|
| Updated | April 23, 2026 23:18 |
| PHP version | 7.4 or higher |
| WordPress version | 7.0 |
| License | GPL-2.0-or-later |
| License URI | License information |
Upload the plugin files to the /wp-content/plugins/ai-provider-for-ollama/ directory.

Visit ollama.com to download and install Ollama for your platform. Once installed, pull a model (for example, `ollama pull llama3.2`) and the provider will automatically discover it.
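The setup step above can be sketched as a short shell session. The model name is the one from the example; the `command -v` guard is an addition for machines where Ollama is not yet installed:

```shell
# Assumes Ollama has been installed from ollama.com; llama3.2 is the example model.
MODEL="llama3.2"
if command -v ollama >/dev/null 2>&1; then
  # Download the model; the provider then discovers it automatically.
  ollama pull "$MODEL"
else
  echo "Ollama not found; install it from https://ollama.com first."
fi
```

After the pull completes, the model should appear in the plugin's model list without further configuration.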
No. For local Ollama instances, no API key is needed. The plugin automatically handles authentication for local setups.
For remote Ollama instances that require authentication, enter the API key in the Settings > Connectors screen. If using Ollama Cloud, you also need to set your Ollama host URL in the Settings > Ollama screen to https://ollama.com.
By default, the provider connects to http://localhost:11434. You can change this in two ways:
- The `OLLAMA_HOST` environment variable (takes precedence).
- The Ollama host URL field on the Settings > Ollama screen.

The plugin uses the `wpai_has_ai_credentials` filter to ensure the AI plugin sees Ollama as a valid, connected provider (props @dkotter, @jeffpaul via #43).
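As a minimal sketch, the environment-variable route might look like this; the host URL below is a hypothetical remote instance, not a value from the plugin:

```shell
# Hypothetical remote Ollama instance; this takes precedence over the
# host URL saved on the Settings > Ollama screen.
export OLLAMA_HOST="http://192.168.1.50:11434"
echo "Provider will connect to: $OLLAMA_HOST"
```

Set the variable in the environment of the PHP process serving WordPress (for example, in your web server or container configuration) so the plugin can read it.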