| Developer | superdav42 |
|---|---|
| Last updated | April 4, 2026, 04:26 |
| Requires PHP: | 7.4 or higher |
| Requires WordPress: | 7.0 |
| License: | GPL-2.0-or-later |
| License URI: | License information |
This plugin connects the WordPress AI Client to any AI inference server that implements the standard OpenAI-compatible /v1/chat/completions and /v1/models endpoints.

Supported services include Ollama, LM Studio, vLLM, LocalAI, text-generation-webui, and many cloud services.

In the plugin settings, enter your endpoint's base URL (e.g., http://localhost:11434/v1 for Ollama). To install manually, upload the plugin files to /wp-content/plugins/ultimate-ai-connector-compatible-endpoints/.
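For reference, a minimal /v1/chat/completions request body in the standard OpenAI-compatible shape is sketched below. The model name is illustrative and depends on what your server hosts; this is not code from the plugin itself.

```python
import json

# Minimal chat-completions request body in the OpenAI-compatible
# shape that servers like Ollama, LM Studio, and vLLM accept.
# The model name "llama3:8b" is an example, not a requirement.
request_body = {
    "model": "llama3:8b",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a tagline for my blog."},
    ],
}

# The body is sent as JSON via POST to <base-url>/chat/completions.
print(json.dumps(request_body, indent=2))
```

Any server that accepts this payload and lists its models at /v1/models should work with the connector.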
**Do I need an API key?**

It depends on your endpoint. Local servers such as Ollama and LM Studio typically do not require a key, while cloud services such as OpenRouter do. Leave the API Key field blank for servers that do not need authentication.
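The optional key can be handled the way most OpenAI-compatible clients do: send a Bearer `Authorization` header only when a key is configured. A minimal Python sketch (the helper name is hypothetical, not part of the plugin):

```python
def build_headers(api_key=None):
    """Build request headers for an OpenAI-compatible endpoint.

    Local servers (e.g., Ollama, LM Studio) usually need no key,
    so the Authorization header is added only when a key is set.
    """
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return headers

# No key configured: no Authorization header is sent.
print(build_headers())
# Key configured (e.g., for OpenRouter): Bearer token is attached.
print(build_headers("sk-example-key"))
```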
**Which models will be available?**

The plugin automatically queries your endpoint's /v1/models resource and registers every model it finds. Whatever models your server offers will appear in the WordPress AI Client.
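The /v1/models response follows the standard OpenAI list shape, so discovering models amounts to reading the `id` field of each entry in `data`. A Python sketch with an illustrative response body (the model names are examples of what an Ollama server might return):

```python
import json

def extract_model_ids(models_response):
    """Parse the JSON body of a GET /v1/models response and
    return the model IDs it advertises."""
    payload = json.loads(models_response)
    return [model["id"] for model in payload.get("data", [])]

# Example /v1/models body in the standard OpenAI-compatible shape.
sample = json.dumps({
    "object": "list",
    "data": [
        {"id": "llama3:8b", "object": "model"},
        {"id": "mistral:7b", "object": "model"},
    ],
})

print(extract_model_ids(sample))  # ['llama3:8b', 'mistral:7b']
```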
**Can I use this plugin on WordPress versions before 7.0?**

Yes, but you must also install the AI Experiments plugin, which bundles the AI Client SDK. On WordPress 7.0 and later, the SDK is built into core and no additional plugin is needed.
**Does this plugin work without the AI Experiments plugin?**

Yes. WordPress 7.0 ships the AI Client SDK in core, so this connector plugin works on its own. You only need the AI Experiments plugin if you want the experimental AI features it provides (excerpt generation, summarization, etc.).