A bit of background: for quite some time I’ve been sending push and timeline notifications to tell my household that the dishwasher salt needs refilling or that the dryer is done. So I thought it was time to make it more fun with a locally running LLM.
First we need to install an LLM. I decided to run Ollama in Docker:
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
If you have an Nvidia GPU, you first need to install the Nvidia Container Toolkit, and then use the same command with the --gpus=all flag added:
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
You can also run Ollama natively on Mac/PC/Linux; see this page for more information.
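Once the container is up, it’s worth confirming the API is reachable before going further. Ollama answers a plain GET on its root path with a short liveness message; the host name below is an assumption based on the default port mapping above, so substitute your Docker host’s IP:

```javascript
// Build the base URL for an Ollama instance (default port 11434).
function ollamaBase(host, port = 11434) {
  return `http://${host}:${port}`;
}

// Ollama replies to GET / with a plain-text liveness message.
async function checkOllama(host) {
  const res = await fetch(ollamaBase(host));
  console.log(await res.text()); // expect something like "Ollama is running"
}

// checkOllama("localhost"); // replace with your Docker host's IP
```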
Then you need to install a model. I decided to use Meta’s llama3.2; see this page for all available models.
Time to run the model: docker exec -it ollama ollama run llama3.2
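With the model running, you can sanity-check the /api/generate endpoint before touching Homey at all. This is the same endpoint the flow script below calls; the sketch assumes Ollama is reachable on localhost:11434 and a Node 18+ runtime (for the built-in fetch):

```javascript
// Minimal client for Ollama's /api/generate endpoint.
// The host/port are assumptions matching the docker run command above.
const OLLAMA_URL = "http://localhost:11434/api/generate";

// stream:false makes Ollama return a single JSON object
// (with the text in its "response" field) instead of a stream.
function buildGeneratePayload(model, prompt) {
  return JSON.stringify({ model, prompt, stream: false });
}

async function askModel(prompt) {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildGeneratePayload("llama3.2", prompt),
  });
  if (!res.ok) throw new Error(`HTTP Error: ${res.status}`);
  const json = await res.json();
  return json.response.trim();
}

// askModel("Say hi in five words.").then(console.log);
```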
And now we can create an advanced Homey flow that starts with a text tag; I called mine LLM.
A note on llama3.2’s supported languages: English, German, French, Italian, Portuguese, Hindi, Spanish, and Thai are officially supported, although the model has been trained on a broader collection of languages than these eight.
const API_URL = "http://DOCKER_IP/api/generate"; // Your Ollama endpoint; include the port, e.g. http://DOCKER_IP:11434/api/generate
const MODEL_NAME = "llama3.2"; // The model pulled above
// Check if an input was provided
const input = args[0] !== undefined && args[0] !== null && args[0].trim() !== "" ? args[0].toLowerCase() : null;
// Determine the event type (motion or button press)
let eventType;
if (input && (input.includes("motion") || input.includes("movement") || input.includes("detected"))) {
  eventType = "motion";
} else if (input && (input.includes("button") || input.includes("pressed") || input.includes("bell"))) {
  eventType = "button";
} else {
  eventType = "unknown"; // Default case
}
// Define the LLM prompt based on the event type
let prompt;
if (eventType === "motion") {
  prompt = `Someone is moving near the door! Generate a **funny, short (30-50 words)** message with **sarcasm, humor, and emojis**.
### Rules:
- **Be engaging and fun**.
- **Use sarcasm** and make it sound like a casual, human text message.
- **Use at least 3-4 emojis** 🤡🚪😂🔥.
- **Make it mysterious, like something spooky or unexpected is happening**.
- **Each response should be unique**.
### Example responses:
1. "🚶♂️ Someone is lurking near the door! Either it’s the pizza guy 🍕… or a ninja sneaking in 🤔😂."
2. "Movement detected! 🏃♂️ It could be a visitor, or just a ghost who forgot their keys. 👻🔑"
3. "Uh oh, motion detected! 🤡 Hope it's a friendly visitor and not a squirrel planning a break-in again. 🐿️😂"`;
} else if (eventType === "button") {
  prompt = `Someone **pressed the doorbell button!** Generate a **funny, short (30-50 words)** message with **sarcasm, humor, and emojis**.
### Rules:
- **Be engaging and fun**.
- **Use sarcasm** and make it sound like a casual, human text message.
- **Use at least 3-4 emojis** 🤡🚪😂🔥.
- **Make it sound like a dramatic event, like something huge is happening**.
- **Each response should be unique**.
### Example responses:
1. "🚪 Ding Dong! Somebody just **slammed** the doorbell like it’s an emergency! Probably just Amazon again. 📦😂"
2. "ALERT! 🚨 The sacred button has been pressed! Time to decide… answer it or pretend you're not home? 🤔🏡"
3. "Ding Dong! 🏡 It’s either your best friend, or a salesperson trying to sell you a new vacuum cleaner. Choose wisely! 😆😂"`;
} else {
  // Default case when input doesn't match "motion" or "button"
  prompt = `Someone is at the door! Generate a **funny, short (30-50 words)** message with **sarcasm, humor, and emojis**.
### Rules:
- **Be engaging and fun**.
- **Use sarcasm** and make it sound like a casual, human text message.
- **Use at least 3-4 emojis** 🤡🚪😂🔥.
- **Make it a general notification about a visitor**.
- **Each response should be unique**.
### Example responses:
1. "🚪 Someone’s at the door! Might be the pizza 🍕… or an alien invasion 👽! Choose wisely!"
2. "Ding Dong! 🏡 Either it’s a visitor, or the neighbor’s cat is back demanding food again! 🐱😂"
3. "Someone's knocking! 🤔 FBI, pizza delivery, or your annoying neighbor? Good luck! 🤡🚪"`;
}
// Function to fetch LLM response
async function fetchLLMResponse() {
  try {
    const response = await fetch(API_URL, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        model: MODEL_NAME,
        prompt: prompt,
        stream: false
      })
    });
    if (!response.ok) {
      throw new Error(`HTTP Error: ${response.status}`);
    }
    const json = await response.json();
    const output = json.response || "No response received";
    return output.trim();
  } catch (error) {
    return `Error: ${error.message}`;
  }
}
// Execute and return the result
return await fetchLLMResponse();
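If the generated messages start to feel samey, Ollama’s /api/generate also accepts an `options` object for sampling parameters; raising `temperature` makes the replies more varied. A sketch of the adjusted request body (the value 1.1 is illustrative, not tuned):

```javascript
// Same request body as in the script above, extended with sampling options.
function buildBodyWithOptions(model, prompt) {
  return JSON.stringify({
    model,
    prompt,
    stream: false,
    // temperature > 1 = more varied wording, < 1 = more predictable
    options: { temperature: 1.1 },
  });
}
```

Swap this in for the `JSON.stringify({...})` call inside `fetchLLMResponse` if you want to experiment.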