[APP][Pro] Ollama - Use Local LLMs in your Flows

I just discovered your app!
It looks great, and it could replace what I do today:
I have an Ollama instance with local models, and I use them in Homey Flows with a HomeyScript card…

I tried to set up the Ollama app, but I get an error when trying to set up the model:

In the app settings, I have this:

Have I done something wrong?

For reference, here is the HomeyScript I use in the HomeyScript card, which works well (I am not a developer… the code might be ugly! :slight_smile:)
(I changed the IP address! :wink:)

// Take the Flow argument, with a fallback text if none is given
const input = args[0] !== undefined ? args[0] : "Bonjour le monde";
// Wrap the input in quotes so the model treats it as a literal phrase
const prompt = ` '${input}'.`;

// Call the local Ollama instance (0.0.0.0 stands in for the redacted IP)
const response = await fetch("http://0.0.0.0:11434/api/generate", {
  method: 'POST',
  headers: {'content-type': 'application/json'},
  body: JSON.stringify({
    model: "Lumen",
    prompt: prompt,
    stream: false   // get the whole answer in a single response
  })
});

const json = await response.json();
const output = json.response;
// Strip the first and last characters (the surrounding quotes) from the answer
return output.substring(0, output.length - 1).substring(1);

My app uses the /api/tags endpoint to get the models. Is the app configured with the correct details, and is your Ollama instance up to date?

My app uses the same /api/generate endpoint for generating responses, but your issue seems to be related to the autocomplete, which uses /api/tags.
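
If you want to see exactly what that autocomplete reads, here is a minimal HomeyScript sketch (using the same placeholder IP as your script): /api/tags returns the installed models, and the app reads their name fields.

// Minimal sketch: list the installed models the same way the
// autocomplete does. 0.0.0.0 is a placeholder for your Ollama IP.
const response = await fetch("http://0.0.0.0:11434/api/tags");
const json = await response.json();

// /api/tags returns { models: [ { name: "...", ... }, ... ] }
return json.models.map(m => m.name).join(', ');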

Thanks for your help…

When I run curl http://localhost:11434/api/tags in a terminal (macOS), I do get the installed models…
Also, the “Expose Ollama to the network” setting is on…

If I connect from a phone using Reins, I can list the models and use them as long as I am on the same network. So the Ollama instance seems to be up and running.

Any ideas?

Maybe your firewall is blocking communication between Homey and your Ollama instance. Is Homey on another VLAN?
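
A quick way to check is to run the fetch from Homey itself in a HomeyScript card (a minimal sketch; 0.0.0.0 stands in for your Mac's LAN IP). If this fails, something between Homey and the Mac is blocking the connection.

// Reachability test from Homey's side; 0.0.0.0 is a placeholder
// for the Mac's LAN IP. A thrown error points at firewall/VLAN issues.
try {
  const response = await fetch("http://0.0.0.0:11434/api/tags");
  return `Reachable, HTTP status ${response.status}`;
} catch (err) {
  return `Not reachable: ${err.message}`;
}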

Oh, I forgot to mention: do NOT put “http://” in front of the IP; that’s not needed. The app will add it automatically.
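
To illustrate why it breaks (just a sketch of the idea, not the app's actual code):

// Hypothetical sketch, not the app's actual code: the app prepends
// the scheme to whatever is entered in the settings.
const entered = "http://0.0.0.0";                 // value typed in the settings
const url = `http://${entered}:11434/api/tags`;   // "http://http://0.0.0.0:11434/api/tags"
// The doubled scheme makes the URL invalid, so the model lookup fails.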

I removed the http:// and it worked!

Thank you! :star_struck:


I tried to play around with the app…

It is great, and I like it very much!

Would it be possible to change other parameters like temperature, num_predict, top_p, repeat_penalty…
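
(In my HomeyScript I can already pass these through the options object of /api/generate; a minimal sketch, where the values are only examples:)

// Same call as my script above, with sampling options added.
// The values are only examples; 0.0.0.0 is still a placeholder IP.
const response = await fetch("http://0.0.0.0:11434/api/generate", {
  method: 'POST',
  headers: {'content-type': 'application/json'},
  body: JSON.stringify({
    model: "Lumen",
    prompt: "Bonjour le monde",
    stream: false,
    options: {
      temperature: 0.7,      // randomness of the output
      num_predict: 128,      // maximum tokens to generate
      top_p: 0.9,            // nucleus sampling cutoff
      repeat_penalty: 1.1    // discourage repetition
    }
  })
});
return (await response.json()).response;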

Also, it would be great to be able to set up some kind of agent with a dedicated system prompt and parameters, and then use it in Flows without having to change the system prompt each time…

Great work anyway!

I left out the extra parameters because the Homey App Store doesn’t allow too many arguments.


It might be possible to add the parameters as options, like the system prompt (which you can set in the settings and also set using a different Flow card). I will add that feature to the app soon.

What do you mean by an agent with a dedicated system prompt? Like presets of all the parameters? I can build that too, if that’s what you mean.

Yes, exactly!

I mean presets that can be used for different kinds of tasks!
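
Something like this, conceptually (just a sketch; the names and values are made up):

// Made-up example of what a preset could hold:
const presets = {
  translator: {
    system: "You are a translator. Reply only with the translation.",
    options: { temperature: 0.2, num_predict: 256 }
  },
  storyteller: {
    system: "You are a creative storyteller.",
    options: { temperature: 1.0, top_p: 0.95 }
  }
};
// A Flow card would then pick a preset by name instead of
// re-entering the system prompt and parameters every time.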

Thanks again!