
AI Provider for Ollama

Developers: Fueled, 10up
Last updated: March 25, 2026, 21:42
PHP version: 7.4 or higher
WordPress version: 7.0
License: GPL-2.0-or-later
License URL: License information

Tags

ai connector ollama llm local-ai

Downloads

1.0.2
1.0.3

Description:

This plugin provides Ollama integration for the WordPress AI Client. It lets WordPress sites use large language models running locally or on a remote Ollama instance for text generation and other AI capabilities. Ollama exposes an OpenAI-compatible API, and this provider uses that API to communicate with any model you have pulled into Ollama (Llama, Mistral, Gemma, Phi, and many more).
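Because Ollama exposes an OpenAI-compatible API, a chat request to it is an ordinary JSON POST. A minimal sketch in Python of what such a request body looks like (the model name `llama3.2` and the default local host are assumptions for illustration; the plugin itself does this in PHP):

```python
import json

# Ollama's OpenAI-compatible chat endpoint (default local host assumed).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for an OpenAI-style chat completion request."""
    payload = {
        "model": model,  # any model you have pulled into Ollama
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request a single JSON response, not a stream
    }
    return json.dumps(payload).encode("utf-8")

body = build_chat_request("llama3.2", "Suggest a title for a post about local AI.")
print(json.loads(body)["model"])  # llama3.2
```

Any HTTP client can then POST this body to the endpoint; the response follows the OpenAI chat-completion shape.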

Installation:

  1. Ensure the WordPress AI Client plugin is installed and activated.
  2. Upload the plugin files to /wp-content/plugins/ai-provider-for-ollama/.
  3. Activate the plugin through the 'Plugins' menu in WordPress.
  4. Go to Settings > Ollama Settings to configure the host URL and see available models.

Upgrade Notice:

1.0.0 Initial release.

FAQ:

How do I install Ollama?

Visit ollama.com to download and install Ollama for your platform. Once installed, pull a model with ollama pull llama3.2 and the provider will automatically discover it.
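Once a model is pulled, the provider can discover it by querying Ollama's `/api/tags` endpoint, which lists installed models as JSON. A sketch of that parsing step in Python (the sample response body is illustrative; its shape follows Ollama's documented API):

```python
import json

def discover_models(tags_json: str) -> list[str]:
    """Extract model names from an Ollama GET /api/tags response body."""
    data = json.loads(tags_json)
    return [m["name"] for m in data.get("models", [])]

# Example response body as a local Ollama instance might return it.
sample = '{"models": [{"name": "llama3.2:latest"}, {"name": "mistral:latest"}]}'
print(discover_models(sample))  # ['llama3.2:latest', 'mistral:latest']
```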

Do I need an API key?

No. For local Ollama instances, no API key is needed. The plugin automatically handles authentication for local setups. For remote Ollama instances that require authentication, enter the API key in the WordPress AI Client Settings > AI Credentials screen. If using Ollama Cloud, you also need to set your Ollama host URL in the Settings > Ollama Settings screen to https://ollama.com.

How do I change the Ollama host URL?

By default, the provider connects to http://localhost:11434. You can change this in two ways:

  1. Set the OLLAMA_HOST environment variable (takes precedence).
  2. Go to Settings > Ollama Settings in the WordPress admin and enter your host URL.

Changelog:

1.0.3 - 2026-03-25
1.0.2 - 2026-03-23
1.0.1 - 2026-03-20
1.0.0 - 2026-03-05