# Ollama Model node common issues
Here are some common errors and issues with the Ollama Model node, and steps to resolve or troubleshoot them.
## Processing parameters
The Ollama Model node is a sub-node. Sub-nodes behave differently from other nodes when processing multiple items using expressions.

Most nodes, including root nodes, take any number of items as input, process these items, and output the results. You can use expressions to refer to input items, and the node resolves the expression for each item in turn. For example, given an input of five name values, the expression `{{ $json.name }}` resolves to each name in turn.

In sub-nodes, the expression always resolves to the first item. For example, given an input of five name values, the expression `{{ $json.name }}` always resolves to the first name.
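A schematic sketch of the difference, using hypothetical input data:

```
Input items:   [ { "name": "Ada" }, { "name": "Grace" }, { "name": "Alan" } ]

Regular node:  {{ $json.name }}  →  "Ada", "Grace", "Alan"   (resolved once per item)
Sub-node:      {{ $json.name }}  →  "Ada", "Ada", "Ada"      (always the first item)
```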
## Can't connect to a remote Ollama instance
The Ollama Model node supports Bearer token authentication for connecting to remote Ollama instances behind an authentication proxy, such as Open WebUI.

For remote authenticated connections, configure the remote URL and API key in your Ollama credentials. For more information, follow the Ollama credentials instructions.
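As a quick check outside of n8n, you can call the API directly with the same Bearer token (the host below is a placeholder, and the exact path depends on how your proxy forwards the Ollama API):

```
# Lists the models on the remote instance; an HTTP 401/403 response points
# to a wrong API key or proxy configuration
curl -H "Authorization: Bearer $OLLAMA_API_KEY" https://ollama.example.com/api/tags
```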
## Can't connect to a local Ollama instance when using Docker
The Ollama Model node connects to a locally hosted Ollama instance using the base URL defined by the Ollama credentials. When you run either n8n or Ollama in Docker, you need to configure the network so that n8n can connect to Ollama.

Ollama typically listens for connections on `localhost`, the local network address. In Docker, by default, each container has its own `localhost`, which is only accessible from within the container. If either n8n or Ollama runs in a container, they won't be able to connect over `localhost`.
The solution depends on how you host the two components.
### If only Ollama is in Docker
If only Ollama is running in Docker, configure Ollama to listen on all interfaces by binding to `0.0.0.0` inside the container (the official images are already configured this way).

When running the container, publish the ports with the `-p` flag. By default, Ollama runs on port 11434, so your Docker command should look like this:
```
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```
When configuring the Ollama credentials, the `localhost` address should work without a problem (set the base URL to `http://localhost:11434`).
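As a quick sanity check before configuring the credentials, you can confirm the published port is reachable from the host:

```
# Should return a JSON list of installed models if the port mapping works
curl http://localhost:11434/api/tags
```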
### If only n8n is in Docker
If only n8n is running in Docker, configure Ollama to listen on all interfaces by binding to `0.0.0.0` on the host.

If you are running n8n in Docker on Linux, use the `--add-host` flag to map `host.docker.internal` to `host-gateway` when you start the container. For example:
```
docker run -it --rm --add-host host.docker.internal:host-gateway docker.n8n.io/n8nio/n8n
```
If you are using Docker Desktop, this is automatically configured for you.
When configuring the Ollama credentials, use `host.docker.internal` as the host address instead of `localhost`. For example, to bind to the default port 11434, you could set the base URL to `http://host.docker.internal:11434`.
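To verify the mapping from inside the container, you can run a one-off request there (this assumes your n8n container is named `n8n` and that the Alpine-based image's BusyBox `wget` is available):

```
# Fetches the model list from the host's Ollama through the mapped host name
docker exec n8n wget -qO- http://host.docker.internal:11434/api/tags
```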
### If Ollama and n8n are running in separate Docker containers
If both n8n and Ollama are running in Docker in separate containers, you can use Docker networking to connect them.
Configure Ollama to listen on all interfaces by binding to `0.0.0.0` inside of the container (the official images are already configured this way).
When configuring the Ollama credentials, use the Ollama container's name as the host address instead of `localhost`. For example, if you call the Ollama container `my-ollama` and it listens on the default port 11434, you would set the base URL to `http://my-ollama:11434`.
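A minimal sketch of this setup using a user-defined bridge network (the network and container names are illustrative); containers on the same user-defined network can resolve each other by container name:

```
# Create a shared network and start both containers on it
docker network create ai-net
docker run -d --network ai-net --name my-ollama -v ollama:/root/.ollama ollama/ollama
docker run -it --rm --network ai-net --name n8n -p 5678:5678 docker.n8n.io/n8nio/n8n
```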
### If Ollama and n8n are running in the same Docker container
If Ollama and n8n are running in the same Docker container, the `localhost` address doesn't need any special configuration. You can configure Ollama to listen on `localhost` and configure the base URL in the Ollama credentials in n8n to use `localhost`: `http://localhost:11434`.
## Error: connect ECONNREFUSED ::1:11434
This error occurs when your computer has IPv6 enabled, but Ollama is listening on an IPv4 address.

To fix this, change the base URL in your Ollama credentials to connect to `127.0.0.1`, the IPv4-specific local address, instead of the `localhost` alias, which can resolve to either IPv4 or IPv6: `http://127.0.0.1:11434`.
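A quick diagnostic sketch, assuming curl is available: if the IPv4 loopback responds but the IPv6 one is refused, this mismatch is the cause:

```
curl http://127.0.0.1:11434/api/tags    # IPv4 loopback: should succeed
curl -g 'http://[::1]:11434/api/tags'   # IPv6 loopback: refused if Ollama only binds IPv4
```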
## Ollama and HTTP/HTTPS proxies
Ollama doesn't support custom HTTP agents in its configuration. This makes it difficult to use Ollama behind custom HTTP/HTTPS proxies. Depending on your proxy configuration, it might not work at all, despite setting the `HTTP_PROXY` or `HTTPS_PROXY` environment variables.
Refer to Ollama's FAQ for more information.
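For reference, this is what such an attempt typically looks like when n8n runs in Docker (the proxy URL is hypothetical, and as noted above it may not work at all):

```
# Pass proxy settings into the n8n container; whether the Ollama connection
# honors them depends on your proxy configuration
docker run -it --rm \
  -e HTTP_PROXY=http://proxy.example.com:3128 \
  -e HTTPS_PROXY=http://proxy.example.com:3128 \
  docker.n8n.io/n8nio/n8n
```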