To pick just a couple of the more helpful ones:
Radon and Air Pollution Control
My wife and I live in a rented manufactured home/trailer that we discovered has high radon levels. It's common where I am in Colorado, and the trailer skirt acts to "trap" the radon, which then seeps up into the house.
To mitigate this (not being able to modify a rental), I have a smart air quality monitor (an Airthings View Plus) that HA watches. When any of the air quality readings (radon, CO2, VOC, PM2.5) rises above the thresholds I define in the automation, it turns on a smart window fan (Vornado Transom via Alexa Media Player on HACS) to bring in fresh air and put the house in a positive-pressure state. On top of this, I have conditions that prevent the automation from running if it's too cold or hot outside (below 32°F or above 85°F), or if the air quality outside is worse than inside (like during forest fire season). It also coordinates with my Generic Thermostat integration (since I use the window fan as a "cooling" device whenever the outside ambient temperature is below the interior temp when cooling is requested) so it doesn't turn the fan on or off while something else is calling for it.
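For anyone wanting to build something similar, here's a minimal sketch of the core automation. All entity IDs and thresholds are hypothetical placeholders, and the thermostat coordination is left out for brevity:

```yaml
# Hedged sketch of the fresh-air automation. Entity IDs below are
# made up for illustration; substitute your own sensors and fan.
automation:
  - alias: "Fresh air when indoor air quality is poor"
    trigger:
      # Fire when any monitored pollutant crosses its threshold
      - platform: numeric_state
        entity_id: sensor.airthings_radon      # hypothetical
        above: 4                               # pick your own limit
      - platform: numeric_state
        entity_id: sensor.airthings_co2        # hypothetical
        above: 1000
    condition:
      # Skip if it's too cold or too hot outside
      - condition: numeric_state
        entity_id: sensor.outdoor_temperature  # hypothetical
        above: 32
        below: 85
      # Skip if outside air is worse than inside (e.g. wildfire smoke)
      - condition: template
        value_template: >
          {{ states('sensor.outdoor_pm25') | float(0) <
             states('sensor.airthings_pm25') | float(0) }}
    action:
      - service: fan.turn_on
        target:
          entity_id: fan.transom_window_fan    # hypothetical
```

The real version would also check the Generic Thermostat's state before touching the fan, so the two automations don't fight over it.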
AI Assisted Food Categorization
Using HA Voice and the Ollama integration, I have a local AI server running on my desktop computer that HA uses for the integrated assistant. In addition to general commands like "Turn on X light", I wanted to use the reasoning skills of the AI assistant to help me keep track of my nutrition intake. This was a little more advanced than normal voice commands and fell outside of the built-in intents. So I needed an automation that would trigger on a voice command, pass the information to the AI assistant, capture and parse the output, and then perform an action based on keywords found in that output. Saying something like "Categorize a pepperoni pizza" will generate a response that says "Sure. Categories: Starch, Protein…" etc.
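The trigger-then-parse flow described above can be sketched roughly like this, using a sentence trigger and `conversation.process` with a `response_variable`. The agent ID, entity IDs, and prompt wording are all hypothetical stand-ins:

```yaml
# Hedged sketch: a sentence trigger hands the food name to the local
# Ollama agent, then the reply text is keyword-matched. IDs are made up.
automation:
  - alias: "AI food categorization"
    trigger:
      - platform: conversation
        command: "Categorize {food}"
    action:
      # Ask the AI agent to categorize the captured slot
      - service: conversation.process
        data:
          agent_id: conversation.ollama    # hypothetical agent ID
          text: >-
            Categorize this food into Starch, Protein, Vegetable,
            Fruit, or Dairy: {{ trigger.slots.food }}
        response_variable: ai_reply
      # "Dumb" keyword parsing of the spoken response
      - if: "{{ 'Protein' in ai_reply.response.speech.plain.speech }}"
        then:
          - service: input_number.increment
            target:
              entity_id: input_number.protein_servings  # hypothetical
```

In practice you'd repeat the keyword check (or use a `choose:` block) for each category.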
I already had built my wife a dashboard in HA (as a Christmas present lol) that allowed her to record by category what she had eaten that day or week. Each category has an input helper entity that gets incremented as necessary by recognizing key words in the response output. Here's a section of that dashboard for those who are curious:
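For reference, each category helper behind the dashboard is just an `input_number`. A minimal definition (names and limits are hypothetical) could look like:

```yaml
# One counter helper per food category; increment via
# the input_number.increment service from automations.
input_number:
  protein_servings:
    name: Protein servings
    min: 0
    max: 50
    step: 1
  starch_servings:
    name: Starch servings
    min: 0
    max: 50
    step: 1
```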
It works surprisingly well, but I want to do a lot more with this workflow, and it's been tricky. I'd like the AI to toggle the entities directly, instead of having to parse the output in a "dumb" way. If I expose the food category input_number entities to the assistant, it does correctly recognize what it needs to toggle, but it can't actually increment the input numbers because there isn't an "intent" for that. When I've tried to play with custom intents, I get stuck. It seems HA wants to grab an intent before the request gets passed to the AI, whereas I want to determine intents based on the OUTPUT from the AI. I have more reading to do there.
Spotify Album Art Background Color
Finally, on a more easygoing note, I have an automation where I use the Color Picker integration to analyze the image displayed on my AndroidTV when Spotify is playing. Based on that hex color code, I set the accent light behind my TV to that album color. It's fun!
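If anyone wants to replicate this, here's a rough sketch of how it could be wired up with HA's Color extractor integration (the `color_extractor.turn_on` service), assuming that's the integration in play. The media player, light, and base URL are hypothetical:

```yaml
# Hedged sketch: when the album art changes, feed its image URL to
# color_extractor.turn_on, which sets the light to the dominant color.
automation:
  - alias: "Bias light matches album art"
    trigger:
      - platform: state
        entity_id: media_player.android_tv   # hypothetical
        attribute: entity_picture
    action:
      - service: color_extractor.turn_on
        data:
          # entity_picture is a relative path; prepend your HA base URL
          color_extract_url: >-
            http://homeassistant.local:8123{{
              state_attr('media_player.android_tv', 'entity_picture') }}
          entity_id: light.tv_accent         # hypothetical
```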