[HOW TO] use a local LLM to generate unique notifications

A bit of background: for quite some time I’ve been sending push and timeline notifications to let my household know that the dishwasher salt needs to be refilled or that the dryer is done. So I thought it was time to make it more fun with a locally running LLM.

First we need to install an LLM; I decided to run Ollama in Docker:

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

If you have an NVIDIA GPU, you first need to install the NVIDIA Container Toolkit, and then use the following command instead:

docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

But you can also run it natively on Mac/PC/Linux; see this page for more information.
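Before wiring anything into Homey, you can sanity-check that the container is reachable. This is just a quick test of mine (assuming the default port mapping from the commands above, with dockerIP as a placeholder for your Docker host): Ollama answers a plain GET on its root URL with “Ollama is running”.

// Quick reachability test, e.g. from a HomeyScript (log is HomeyScript's logger)
const res = await fetch("http://dockerIP:11434/");
log(await res.text()); // expect: "Ollama is running"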

Then you need to install a model. I decided to use Meta’s llama3.2; see this page for all available models.
Time to run the model:
docker exec -it ollama ollama run llama3.2
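To double-check that the model was actually pulled, Ollama also exposes a GET /api/tags endpoint that lists the locally installed models. A small sketch (same dockerIP placeholder as above):

// List installed models and confirm llama3.2 shows up
const res = await fetch("http://dockerIP:11434/api/tags");
const { models } = await res.json();
log(models.map(m => m.name)); // e.g. [ "llama3.2:latest" ]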

And now we can create an advanced Homey flow that starts with a text tag; I called mine LLM.

The HomeyScript that I use looks something like this:

// Default message when no tag is passed in ("Het eten is klaar" = "The food is ready")
const input = args[0] !== undefined ? args[0] : "Het eten is klaar";

// Dutch prompt: "Write a short one-liner letting people know that '${input}'.
// Use a hefty dose of humor."
const prompt = `Schrijf een korte one-liner, waarin je laat weten dat '${input}'. Gebruik een flinke dosis humor.`;

// Call Ollama's generate endpoint; stream: false returns one complete JSON object
const response = await fetch("http://dockerIP:11434/api/generate", {
  method: 'POST',
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({
    model: "llama3.2",
    prompt: prompt,
    stream: false
  })
});

const json = await response.json();
const output = json.response;

// Drop the first and last character (the quotes the model tends to wrap around its answer)
return output.substring(0, output.length - 1).substring(1);
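Those last substring calls unconditionally drop the first and last character, which eats a real character whenever the model doesn’t wrap its answer in quotes. A slightly more defensive variant (my own sketch, not part of the original script) only strips quotes when they are actually there:

// Strip leading/trailing quotes only if the model added them
const trimmed = output.trim();
return trimmed.replace(/^["'“”]+|["'“”]+$/g, "");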

And now you can trigger the LLM flow from all your other flows.

And you will receive a unique notification every time.



“Dat betekent dat de kleding nog steeds niet schoon is” (“That means the clothes still aren’t clean”)? :thinking:

See: llama3.2
Supported Languages: English, German, French, Italian, Portuguese, Hindi, Spanish, and Thai are officially supported. Llama 3.2 has been trained on a broader collection of languages than these 8 supported languages.

I read somewhere that if you want to use a Dutch model, the suggested one is Rijgersberg/GEITje-7B-chat-v2 · Hugging Face.
It’s on my todo list to test that :wink:

Thank you so much, Sre!!!

I was able to install Llama3 and other models, and call them from Homey… I can now transition from the OpenAI API to this! :smile:

Great stuff!

Has anyone tested the different models? Which one is the fastest?
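One way to get a rough comparison yourself: the non-streaming /api/generate response from Ollama includes timing fields (total_duration, eval_count, eval_duration), so a sketch like this prints total seconds and tokens per second for whichever model you fill in (dockerIP is again a placeholder):

// Rough speed test; swap the model name to compare
const res = await fetch("http://dockerIP:11434/api/generate", {
  method: 'POST',
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({ model: "llama3.2", prompt: "Say hi", stream: false })
});
const json = await res.json();
// total_duration and eval_duration are reported in nanoseconds
log(`${(json.total_duration / 1e9).toFixed(1)} s total, ${(json.eval_count / (json.eval_duration / 1e9)).toFixed(1)} tokens/s`);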

I have updated it a bit:

const API_URL = "http://DOCKER_IP:11434/api/generate"; // Your LLM API (default Ollama port)
const MODEL_NAME = "llama3.2";  // Using GPU-powered LLaMA 3.2

// Check if an input was provided
const input = args[0] !== undefined && args[0] !== null && args[0].trim() !== "" ? args[0].toLowerCase() : null;

// Determine the event type (motion or button press)
let eventType;
if (input && (input.includes("motion") || input.includes("movement") || input.includes("detected"))) {
    eventType = "motion";
} else if (input && (input.includes("button") || input.includes("pressed") || input.includes("bell"))) {
    eventType = "button";
} else {
    eventType = "unknown"; // Default case
}

// Define the LLM prompt based on the event type
let prompt;
if (eventType === "motion") {
    prompt = `Someone is moving near the door! Generate a **funny, short (30-50 words)** message with **sarcasm, humor, and emojis**. 

    ### Rules:
    - **Be engaging and fun**.
    - **Use sarcasm** and make it sound like a casual, human text message.
    - **Use at least 3-4 emojis** 🤡🚪😂🔥.
    - **Make it mysterious, like something spooky or unexpected is happening**.
    - **Each response should be unique**.
    
    ### Example responses:
    1. "🚶‍♂️ Someone is lurking near the door! Either it’s the pizza guy 🍕… or a ninja sneaking in 🤔😂."
    2. "Movement detected! 🏃‍♂️ It could be a visitor, or just a ghost who forgot their keys. 👻🔑"
    3. "Uh oh, motion detected! 🤡 Hope it's a friendly visitor and not a squirrel planning a break-in again. 🐿️😂"`;

} else if (eventType === "button") {
    prompt = `Someone **pressed the doorbell button!** Generate a **funny, short (30-50 words)** message with **sarcasm, humor, and emojis**. 

    ### Rules:
    - **Be engaging and fun**.
    - **Use sarcasm** and make it sound like a casual, human text message.
    - **Use at least 3-4 emojis** 🤡🚪😂🔥.
    - **Make it sound like a dramatic event, like something huge is happening**.
    - **Each response should be unique**.
    
    ### Example responses:
    1. "🚪 Ding Dong! Somebody just **slammed** the doorbell like it’s an emergency! Probably just Amazon again. 📦😂"
    2. "ALERT! 🚨 The sacred button has been pressed! Time to decide… answer it or pretend you're not home? 🤔🏡"
    3. "Ding Dong! 🏡 It’s either your best friend, or a salesperson trying to sell you a new vacuum cleaner. Choose wisely! 😆😂"`;

} else {
    // Default case when input doesn't match "motion" or "button"
    prompt = `Someone is at the door! Generate a **funny, short (30-50 words)** message with **sarcasm, humor, and emojis**. 

    ### Rules:
    - **Be engaging and fun**.
    - **Use sarcasm** and make it sound like a casual, human text message.
    - **Use at least 3-4 emojis** 🤡🚪😂🔥.
    - **Make it a general notification about a visitor**.
    - **Each response should be unique**.
    
    ### Example responses:
    1. "🚪 Someone’s at the door! Might be the pizza 🍕… or an alien invasion 👽! Choose wisely!"
    2. "Ding Dong! 🏡 Either it’s a visitor, or the neighbor’s cat is back demanding food again! 🐱😂"
    3. "Someone's knocking! 🤔 FBI, pizza delivery, or your annoying neighbor? Good luck! 🤡🚪"`;
}

// Function to fetch LLM response
async function fetchLLMResponse() {
    try {
        const response = await fetch(API_URL, {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: JSON.stringify({
                model: MODEL_NAME,
                prompt: prompt,
                stream: false
            })
        });

        if (!response.ok) {
            throw new Error(`HTTP Error: ${response.status}`);
        }

        const json = await response.json();
        const output = json.response || "No response received";

        return output.trim();
    } catch (error) {
        return `Error: ${error.message}`;
    }
}

// Execute and return the result
return await fetchLLMResponse();
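Since the prompts ask for a unique response every time, it may also help to raise the sampling temperature. Ollama’s /api/generate accepts an options object for this; here is a sketch of the same fetch call with that added (0.9 is just an example value, not something from the script above):

const response = await fetch(API_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
        model: MODEL_NAME,
        prompt: prompt,
        stream: false,
        // Higher temperature = more varied (and sillier) output between calls
        options: { temperature: 0.9 }
    })
});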
