Ollama Chat Model node common issues#

Here are some common errors and issues with the Ollama Chat Model node and steps to resolve or troubleshoot them.

Processing parameters#

The Ollama Chat Model node is a sub-node. Sub-nodes behave differently than other nodes when processing multiple items using expressions.

Most nodes, including root nodes, take any number of items as input, process these items, and output the results. You can use expressions to refer to input items, and the node resolves the expression for each item in turn. For example, given an input of five name values, the expression {{ $json.name }} resolves to each name in turn.

In sub-nodes, the expression always resolves to the first item. For example, given an input of five name values, the expression {{ $json.name }} always resolves to the first name.
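For illustration, given the hypothetical input items below, a root node resolves the expression once per item, while a sub-node resolves it to the first item every time:

```
Input item            Root node: {{ $json.name }}    Sub-node: {{ $json.name }}
{ "name": "Alice" }   Alice                          Alice
{ "name": "Bob" }     Bob                            Alice
{ "name": "Carol" }   Carol                          Alice
```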

Can't connect to a remote Ollama instance#

The Ollama Chat Model node is designed to connect only to a locally hosted Ollama instance. It doesn't include the authentication features you'd need to connect to a remotely hosted Ollama instance.

To use the Ollama Chat Model, follow the Ollama credentials instructions to set up Ollama locally and configure the instance URL in REA Automation.
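If you're not sure whether your local Ollama instance is reachable, you can query its API directly. This is a minimal check, assuming Ollama is running on its default port, 11434:

```bash
# List the models available on the local Ollama instance.
# A JSON response confirms the base URL is reachable.
curl http://localhost:11434/api/tags
```

If the request succeeds, use the same base URL (http://localhost:11434) when configuring your Ollama credentials.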

Can't connect to a local Ollama instance when using Docker#

The Ollama Chat Model node connects to a locally hosted Ollama instance using the base URL defined in the Ollama credentials. When you run either REA Automation or Ollama in Docker, you need to configure the network so that REA Automation can reach Ollama.

Ollama typically listens for connections on localhost, the computer's loopback address. In Docker, each container has its own localhost by default, accessible only from inside that container. If either REA Automation or Ollama is running in a container, the two won't be able to connect over localhost.

The solution depends on how you're hosting the two components.
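As an illustration, consider one common setup: REA Automation runs in a Docker container while Ollama runs directly on the host. The container can't reach the host through its own localhost, but it can reach the host gateway. This is a sketch, assuming a recent Docker version; the image name and published port are placeholders for your own REA Automation image and configuration:

```bash
# Run REA Automation with a hostname that resolves to the host machine.
# Docker Desktop (macOS/Windows) provides host.docker.internal automatically;
# on Linux, map it to the host gateway explicitly:
docker run -it --rm \
  --add-host=host.docker.internal:host-gateway \
  -p 5678:5678 \
  rea-automation
```

With this mapping in place, set the base URL in your Ollama credentials to http://host.docker.internal:11434 instead of http://localhost:11434.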