
Ultimate AI Connector for Compatible Endpoints

Developer: superdav42
Updated: April 4, 2026 04:26
PHP version: 7.4 or higher
WordPress version: 7.0
License: GPL-2.0-or-later
License URI: License information

Tags

ai ollama llm local-ai connectors

Download

1.1.0

Description:

This plugin extends the WordPress AI Client to support any AI service or server that uses the standard chat completions API format (the /v1/chat/completions and /v1/models endpoints).

Supported services include: Ollama, LM Studio, vLLM, LocalAI, text-generation-webui, and many cloud services such as OpenRouter.

Requirements by WordPress version: on WordPress 6.9 the AI Experiments plugin must also be installed and active, as it provides the AI Client SDK; on WordPress 7.0 and later the SDK ships in core and no additional plugin is needed.

Why it matters: other AI-powered plugins that use the WordPress AI Client (such as AI Experiments) can automatically discover and use any model you connect through this plugin. Configure your endpoint once and every AI feature on your site can use it.

How it works:
  1. Install and activate the plugin.
  2. Go to Settings > Connectors and configure the connector with your endpoint URL (e.g. http://localhost:11434/v1 for Ollama).
  3. Optionally provide an API key for services that require authentication.
  4. The plugin registers a provider with the WordPress AI Client and dynamically discovers all available models from your endpoint.
The plugin also handles practical concerns like extended HTTP timeouts for slow local inference and non-standard port support.
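The request format the steps above rely on can be sketched as follows. This is a hypothetical Python illustration of the standard chat completions call, not the plugin's actual PHP code; the base URL and model name are only examples.

```python
import json

# Ollama's default base URL, as used in the configuration steps above.
# Note the non-standard port — one of the practical concerns the plugin handles.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(prompt, model="llama3"):
    """Return the URL and JSON body for a standard /v1/chat/completions call."""
    url = BASE_URL + "/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

# Slow local inference can exceed default HTTP timeouts, so the actual
# request should pass an extended timeout, e.g.:
#   urllib.request.urlopen(req, timeout=300)
```

The timeout comment reflects the behavior described above: local models may take far longer to respond than a typical web request allows.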

Installation:

  1. Upload the plugin files to /wp-content/plugins/ultimate-ai-connector-compatible-endpoints/.
  2. Activate the plugin through the 'Plugins' menu in WordPress.
  3. WordPress 6.9 only: Make sure the AI Experiments plugin is installed and active. It provides the AI Client SDK that this plugin requires.
  4. Go to Settings > Connectors and configure the connector.
  5. Optionally enter an API key if your endpoint requires one.

FAQ:

What endpoints are compatible?

Any AI inference server that implements the standard /v1/chat/completions and /v1/models endpoints. This includes Ollama, LM Studio, vLLM, LocalAI, text-generation-webui, and many cloud services.
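In practice, "compatible" means the server answers on both standard paths under one base URL. A minimal sketch (hypothetical helper, not part of the plugin):

```python
# The two paths a compatible server must implement, per the answer above.
REQUIRED_PATHS = ("/chat/completions", "/models")

def endpoint_urls(base_url):
    """Return the URLs a compatible endpoint must serve for a given base URL."""
    base = base_url.rstrip("/")  # tolerate a trailing slash in the setting
    return [base + path for path in REQUIRED_PATHS]
```

For example, the Ollama base URL from the setup instructions yields http://localhost:11434/v1/chat/completions and http://localhost:11434/v1/models.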

Do I need an API key?

It depends on your endpoint. Local servers like Ollama and LM Studio typically do not require a key. Cloud services like OpenRouter require one. Leave the API Key field blank for servers that do not need authentication.
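The optional-key behavior described above amounts to sending a Bearer token only when one is configured. A hypothetical sketch (not the plugin's code):

```python
def build_headers(api_key=None):
    """Attach Authorization only when an API key was configured.

    Local servers (Ollama, LM Studio) typically need no auth header;
    cloud services like OpenRouter expect a Bearer token.
    """
    headers = {"Content-Type": "application/json"}
    if api_key:  # a blank API Key field means no authentication
        headers["Authorization"] = "Bearer " + api_key
    return headers
```

Leaving the field blank simply means no Authorization header is sent.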

What models will be available?

The plugin automatically queries your endpoint's /models resource and registers every model it finds. Whatever models your server offers will appear in the WordPress AI Client.
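In the standard format, /v1/models returns a JSON object whose "data" array lists one entry per model; discovery amounts to collecting each entry's "id". A hypothetical sketch of that step:

```python
def discover_models(models_response):
    """Extract model ids from a parsed /v1/models JSON response.

    The standard response shape is {"object": "list", "data": [{"id": ...}, ...]}.
    """
    return [entry["id"] for entry in models_response.get("data", [])]

# Example response as a local server might return it (illustrative only):
sample = {"object": "list", "data": [{"id": "llama3"}, {"id": "mistral"}]}
```

Every id found this way is then registered with the WordPress AI Client, so the list you see always mirrors what your server currently offers.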

Does this work with WordPress 6.9?

Yes, but you must also install the AI Experiments plugin, which bundles the AI Client SDK. On WordPress 7.0 and later, the SDK is built into core and no additional plugin is needed.

Does this work on WordPress 7.0 without the AI Experiments plugin?

Yes. WordPress 7.0 ships the AI Client SDK in core, so this connector plugin works on its own. You only need the AI Experiments plugin if you want the experimental AI features it provides (excerpt generation, summarization, etc.).

Changelog:

1.1.0 - Released on 2026-04-01
1.0.0