Keeping GitHub Actions Up to Date with Dependabot

Over the past couple of years, I’ve built a number of tools that are delivered as Docker containers. Part of the workflow I’ve set up involves automatic container builds using GitHub Actions.

It works great – I commit to the main or dev branches and I get a new container version tagged as :latest or :dev, respectively. I create a new release version, and I get a new container version tagged as :version-number.

BUT, there’s a but. There’s always a but, right? I’m talking about automatically updating the individual actions in my actions scripts to keep pace with new releases. Doing that manually is work for just one container. For a bunch? Forget it.

Dependabot has entered the chat.

What does Dependabot do? Its purpose in life is to look through your repo and keep versions of various bits up to date. Simple, right? Ok, like I said before, I’ve got a number of containers I maintain. Between my container build and old version cleanup scripts, I use 7 actions. Multiply that times 14 container repos, and that’s a total of 98 action instances to keep up to date. Hands up, who wants to do that by hand? Nope.

The other thing I’m using Dependabot for is to keep certain bits in my Dockerfiles up to date as well. The main one I look out for is the python:3.x-slim images. All of this is configured using a YAML file that I drop in the repo as .github/dependabot.yml. Here’s an example dependabot.yml file:

version: 2

updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
    reviewers:
      - "jcostom"
    assignees:
      - "jcostom"

  - package-ecosystem: "docker"
    directory: "/"
    schedule:
      interval: "weekly"
    reviewers:
      - "jcostom"
    assignees:
      - "jcostom"

This example will review my actions scripts as well as my Dockerfiles weekly and propose updates in the form of pull requests.

Lots of great tutorials exist out there on Dependabot. Hopefully this piece has generated enough interest to get you started!

Synthetic Media, AI-Driven Scams, Deepfakes, and You.

Around the middle of last year, I had to roll up to my company’s offices in New England for a couple of days of meetings. As I often do on such drives, I tuned into a podcast or two between calls. During that drive, I caught a couple of episodes of the Politicology podcast. They ran a really great 2-part series about politically-motivated deepfakes (Episode 1 & Episode 2), which touched on other areas that are more relatable to our everyday lives.

They kicked off the first episode talking about the now-famous video that Jordan Peele and BuzzFeed created, showing President Obama saying things he never actually said. Here, check it out if you’ve never seen it:

This video was made a few years ago – back in 2018. If you really pay attention, you’ll notice that the voice isn’t quite right. It’s not perfectly synchronized with the facial movements, plus you can tell it’s not actually Obama’s voice. But, it’s good enough to convince a lot of folks, particularly those who aren’t well-informed, or just not paying attention.

Fast forward 5 years to 2023. How has the tech changed? Naturally, software has improved, plus there’s more compute power than ever. There are open-source tools out there to analyze existing media and leverage it to synthesize additional media. We’ve even seen attempts to use these tools in the past year. Look at the Russian invasion of Ukraine – I can recall at least 2 instances of faked videos used as propaganda tools. The fakes are better now, but one can still tell these videos aren’t real.

The much greater danger we all face today is less about faked videos of world leaders and more about faked audio leveraged by criminals. Consider the voiceprint authentication systems we’re now seeing many companies deploy to verify our identities when we’re calling in for some sort of service. With just a few seconds of voice sample, the technology now exists to “put words in your mouth” – making synthetic recordings of you saying pretty much anything.

Ok, am I being an alarmist? Have I joined the tin-foil hat squad? I’ll point you to this recent article from TechCrunch about Microsoft’s VALL-E research project. Input 3 seconds of real speech audio from a person, and now you’ve got the power to produce whatever you’d like to hear, but in that person’s voice, with their diction, even with normal-sounding background noise. Now, imagine your bank deploys a voice-based authentication system like this. Some crook takes a few seconds of your voice, uses a tool like VALL-E, and whammo! Access to your accounts is granted.

Ok, so that’s audio – how close are we to average folks having the ability to manipulate someone’s likeness in a reliable, reproducible way? That’s today. In the past 10 minutes, I started off by going to Phil Wang’s This Person Does Not Exist site. There, I generated 2 face images, 1 male, 1 female. Next, I dropped these images into my iCloud Photo Library, got on my phone and used the FaceApp app to change the gender of each of these people. Check out the results below. “Originals” on the left, gender transformations on the right. Remember – none of these images are real people. They’re all generated and manipulated through software that uses AI.

Nifty, huh? Now, how about I take Mr. Fake Man and make him sing to us using the Facedance app?

It’s Corn!

Ok, I’ll be first in line to point out how not-perfect that video is. But, consider that I made it in about 10 seconds, using a free app on an iPhone 13 Pro. That tech’s only going to continue getting better at pretending.

So what to do? It seems clear that the present danger surrounds audio. I saw an interview the other day with a man who claimed to have been nearly scammed by a crook pretending to be one of his friends who needed a quick financial bailout. Had he not taken a moment to call the friend’s wife to check up about that voicemail first, he could have easily been scammed. My best advice? Verify before you send any money, even if it’s your family or best friend. Maybe even establish some sort of pre-arranged password or other way to authenticate the person on the other end.

Washerbot, the Next Gen of Washer Monitoring

Ok, so I built plugmon a while ago. It worked great. I loved it – super reliable, with none of the fiddly nonsense I’ve had to work through with my vibration sensor-based dryer monitoring solution. Sadly, the Etekcity smart plugs I used before, which use the (really nice) VeSync API, no longer seem to be available for purchase – easily, at least.

So, what to do? If the code is to be useful long-term, we’ll need to change the platform to something that can actually be purchased. Without question there’s no shortage of smart plugs available. So, what features am I after when looking for a different platform?

Naturally, I’m after some sort of way to easily talk to the plugs to read data. It should also go without saying that we need a plug that offers the ability to monitor power use, as well as exposing some sort of API that lets us get at that data without using the vendor’s app directly.

There’s a ton of options out there, but eventually, I landed on the Kasa (TP-Link’s smart home brand) plugs. Why? Two things really pushed them over the top. First, of course, was remote API-style access to the plug’s power monitoring data. Second, with the Kasa plugs you don’t even need to go outside the home LAN to capture the data.
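For the curious, reading that power data locally looks something like this. This is just a rough sketch of the python-kasa usage, not the actual Washerbot code – the plug’s IP address is a made-up placeholder, and you’d obviously swap in your own:

#!/usr/bin/env python3

# Rough sketch: read instantaneous power draw from a Kasa energy-monitoring
# plug (like the KP115) over the local LAN with python-kasa.
# The IP address below is a placeholder.

import asyncio
from kasa import SmartPlug


async def main():
    plug = SmartPlug("192.168.1.50")   # KP115 on the local LAN
    await plug.update()                # talks straight to the plug, no cloud round-trip
    # emeter_realtime holds the instantaneous energy readings (power, voltage, etc.)
    print(plug.emeter_realtime)


if __name__ == "__main__":
    asyncio.run(main())

From there, it’s just a matter of comparing the power reading against a threshold to decide whether the washer is running or not.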

At the end of things, I kept the majority of code from jcostom/plugmon, grabbed a few bits of code from other projects I’ve worked on, and in about 30 minutes, the Washerbot was born.

It’s a shame that the VeSync plugs are now so difficult (impossible?) to come by. The API was reasonably easy to work with, and they weren’t terribly expensive either. I’m hopeful that TP-Link / Kasa Smart will be around for longer. I really like the “no outside connectivity needed” part of the python-kasa module as well.

Some may point out that there was a brief dust-up a couple of years ago with TP-Link, when they announced their intention to stop allowing local access to their devices, and you’d be right to do so. Fortunately, TP-Link was smart enough to take the not-so-subtle hints from the community, and walked that change back.

Without further ado, head over to GitHub and check out the new Washerbot code & container. Obviously, you’ll need one of the TP-Link plugs that provides energy use stats, like the KP115.

Automatic Deployment of Let’s Encrypt Certs

Many of you already use Let’s Encrypt certificates in various capacities to provide secure connectivity to applications and devices. Most of the time, these apps and devices automatically reach out, get certs issued and installed, and everything just works. That’s cases like traefik, or certbot with apache/nginx, etc.

Then there are those “other” use cases you’ve got. Like say, a custom certificate for a Plex server, or maybe even something more exotic like a certificate for an HP printer. How do you take care of those in an automated, “hands-off” sort of way? How do you make it work so that you’re not having to set reminders for yourself to get in there and swap out certs manually every 3 months? Because you know what’s going to happen, right? That reminder’s going to go off, you’re going to snooze it for a couple of days, then you’ll tick that checkbox, saying, “yeah, I’ll do it after I get back from lunch,” and then something happens and it never gets done. Next thing you know, the cert expires, and it becomes a pain in the rear at the worst possible moment.

That’s where deploy-hooks come into play. If you’ve got a script that can install the certificate, you can call that script right after the cert has been issued by specifying the --deploy-hook flag on the certbot renew command. Let’s look at an example of how we might add this to an existing certbot certificate that’s already setup for automatic renewal. Remember, automatic renewal and automatic installation are different things.

First, we’ll do a dry-run, then we’ll force the renewal. It’s really that easy. Check it:

sudo certbot renew --cert-name printer.mynetwork.net --deploy-hook /usr/local/sbin/pcert.sh --dry-run
sudo certbot renew --cert-name printer.mynetwork.net --deploy-hook /usr/local/sbin/pcert.sh --force-renewal

Once this process is completed, the automatic renewal configuration for printer.mynetwork.net will include the deploy-hook /usr/local/sbin/pcert.sh. But, what does that really mean? Upon successful renewal, that script will execute, at which point, you’re (presumably) using the script to install the newly refreshed certificate. In this case, the script is unique to that particular certificate. It’s possible to have deploy-hooks that are executed for EVERY cert as well, by dropping them in the /etc/letsencrypt/renewal-hooks/deploy directory.
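If your hook happens to be Python instead of shell, the shape is the same. Certbot hands the hook the location of the freshly renewed cert in the RENEWED_LINEAGE environment variable. Here’s a minimal sketch – the destination paths and the service restart are hypothetical placeholders, not my actual pcert.sh:

#!/usr/bin/env python3

# Minimal deploy-hook sketch. Certbot sets RENEWED_LINEAGE to the live
# directory of the cert that was just renewed (e.g. /etc/letsencrypt/live/name).
# The destination paths and restart command below are placeholders.

import os
import shutil
import subprocess

lineage = os.environ["RENEWED_LINEAGE"]

# Copy the renewed cert and key to wherever the app expects them
shutil.copy(os.path.join(lineage, "fullchain.pem"), "/opt/myapp/certs/server.crt")
shutil.copy(os.path.join(lineage, "privkey.pem"), "/opt/myapp/certs/server.key")

# Kick the app so it picks up the new cert
subprocess.run(["systemctl", "restart", "myapp"], check=True)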

For some examples, check out the ones I’m using. Especially interesting (to me at least) is the HP Printer script. That one took a bit of hackery to get working. I had to run the dev tools, and record the browser session a couple of times to get all the variable names straight, and so forth, but once I had it down, it was a snap. Now when the Let’s Encrypt cert updates, within a few seconds, I’ve got the latest cert installed and running on the printer!

What certs will you automate the installation of?

The Dryer Update…

[Any Amazon Links below are Non-Affiliate Links that just go to Amazon Smile]

So, if you think back a bit, you may recall that I was using a Pi 4 for my IoT project that monitored the dryer, shooting out Telegram group messages to the whole family when the dryer was done with the laundry.

Times being what they are, it’s pretty difficult to come by a new Raspberry Pi these days, as I’m sure many of you know. I needed the power of the Pi 4 for something else, at least on a temporary basis. Meanwhile, back at the ranch, a couple of months prior, I’d received a ping from the Micro Center about 45 minutes away informing me that they had a handful of Pi Zero 2 W’s on hand. Those little suckers are super hard to find, so I snapped up my max of 2, along with the GPU I’d been dying to lay hands on for the longest time. For those who care, I finally got an EVGA 3080. Pandemics and supply-chain constraint conditions suck, by the way, in case you were wondering about my position on that issue.

So, having my Pi Zero 2 W in the drawer ready to roll, I unscrewed the box from the wall that housed the Pi 4, fitted the sensor I had directly onto the Pi Zero 2 W, and scaled down from a 2-project-box solution to 1 box. Sadly, it sucked. But, it wasn’t the hardware’s fault. In reality it was totally a self-inflicted condition.

I modified (slightly) the pins on the old 801s sensor I had, fitted it onto that new Pi Zero 2W (since it didn’t have any GPIO pin headers soldered on), and sort of Rube-Goldberged it together using 3M VHB tape inside the project box. Total hack job. I thought about using a bunch of hot glue, but then I thought better of it. Why not solder? Honestly? I suck at soldering. One of these days I’ll get around to getting good at it. But that’s not today.

It was wildly unstable. The sensor kept on moving and losing contact with the sides of the GPIO holes – it was awful. I all but gave up. I had a brief flirtation with the Aqara Smart Hub and one of their Zigbee vibration sensors, and believe me, when I say brief, I mean like 12 hours. It just wasn’t fit for the job.

My grand plan with that was to mimic what I was doing over on the washer – write some Python code and run it in a container to query an API somewhere in the cloud every X seconds to see if the thing was vibrating or not, then based on that, work out the state of the dryer to determine if the dryer had started or stopped and then act accordingly. But alas, since step 2 in this plan was a klunker, steps 3 through infinity? Yeah, those never happened.

So, back to the drawing board. I found that I couldn’t easily lay hands on a new 801s again, and the project for the Pi4 was now finished, so I had that back. I did find a new vibe sensor – the SW-420. 3 pins instead of 4, but it’s still a digital output that works fine with the Pi, and my existing code worked as-is, so who cares, right? Yeah, I classed the thing up quite a bit more this time too. This time, instead of shoving the Pi inside a project box that’s mounted on the wall running from the SD card, I opted to run in one of those snazzy Argon One M.2 SSD cases booting Ubuntu 22.04 from an M.2 SSD in the basement of the case. I’ve got that sitting on a lovely little shelf mounted just above and behind the dryer, with my 3 GPIO leads running out of the top of the case, directly into the small project box that’s attached to the front of the dryer, inside which is the sensor, which is stuck to the inside of the box using 3M VHB tape. The box itself is stuck to the dryer using VHB tape as well.

In the end, all’s well that ends well. I’ve had to do a good bit more tuning on the SW-420 sensor. It’s been a bit more fiddly than the old 801s was. That one was definitely a plug and play affair. This has required a bit of adjustment on the little potentiometer that’s built into the sensor. Not too bad though. I’ve invested probably a total of 15 minutes of time standing next to the dryer, staring at telemetry, while the dryer is running, or not. But in the end, it’s all working, and the notifications are happening once again.
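If you find yourself doing that same potentiometer dance, a tiny read loop is all the “telemetry” you really need. Here’s a rough sketch using gpiozero – GPIO 17 is just an example pin, use whatever your sensor’s digital output is actually wired to:

#!/usr/bin/env python3

# Rough tuning aid for an SW-420 style vibration sensor: count how many times
# the digital output fires in each 10-second window while the dryer runs (or
# doesn't). GPIO 17 is just an example pin.

from time import sleep
from gpiozero import DigitalInputDevice

sensor = DigitalInputDevice(17)

while True:
    hits = 0
    for _ in range(100):        # sample for roughly 10 seconds
        hits += sensor.value    # 1 when the sensor registers vibration
        sleep(0.1)
    print(f"vibration hits in the last 10s: {hits}")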

One Crazy Summer

Hey automators!

Summer’s been absolutely nuts. Between work stuff, family stuff, running here and there, and of course, the odd project or two, I’ve been just plain stretched for time.

Stay tuned. I’ll be coming back around shortly. I’m working on some things. Preview?

Well, remember how Logitech decided that the Harmony Remote, one of the best things ever to happen to the world of universal remotes, was going to be taken out back and killed? Yeah, I was pretty mad about that too. So, I went looking for something else to solve some of those automation challenges. That’s coming.

What else? Tried to buy a Raspberry Pi lately? Heh. Yeah, me too. I decided to try a different fruit for a change. So far, so good. More on that later.

More still? There’s an update on that printer situation. The dryer too.

How about a Raspberry Pi-based network console server for my network equipment?

Hang in there family, it’s coming.

Embracing Simplicity. Again. This time, it’s DNS.

Public Enemy #1

I, like many, hate DNS. I tolerate it. It’s there because, well, I need it. There are only so many IP addresses one can keep rattling around inside one’s head, right? So, it’s DNS.

For years, I ran the old standard, BIND under Linux here at home. My old BIND config did a local forward to dnscrypt-proxy, which ran bound to a port on localhost, and in turn pushed traffic out to external DNS servers like Cloudflare’s 1.1.1.1 or Quad9’s 9.9.9.9. I didn’t think my ISP was entitled to snoop on what DNS lookups I was doing. They still aren’t, so I didn’t want to lose that regardless of what I ended up doing.

Out in the real world, my domain’s DNS was hosted by DNS Made Easy. They’ve got a great product. It’s reliable, and it’s not insanely expensive. It’s not nothing, but we’re not talking hundreds a year either. I think it’s about $50 a year for more domains and queries than I could possibly ever use. But, like many old schoolers, they’ve lagged behind the times. Yes, they’ve got things like a nice API, and do support DNSSEC, but DNSSEC is only available in their super expensive plans that start at $1700+ a year. That’s just not happening. So, I started looking around.

I landed on Cloudflare. They’ve got a free tier that fits the bill for me. Plenty of record space, a nice API – dare I say, an even nicer API – and DNSSEC is included in that free tier. How do you beat free? I was using a mish-mash of internal and external DNS with delegated subdomains for internal vs external sites as well. It was (again) complicated – and a pain in the rear.

So, I registered a new domain just for private use. I did that through Cloudflare as well. As a registrar, they were nice to work with too. They pass that through at cost. Nice and smooth setup. So, internal stuff now consists of names that are [host/app].site.domain.net. Traefik is set up using the Cloudflare dns-01 letsencrypt challenge to get certs issued to secure it all, and the connectivity, as discussed before in the other post, is all via Tailscale. The apps are all deployed using Docker with Portainer. The stacks (ok, they’re just docker-compose files) in Portainer are all maintained in private GitHub repos. I’ll do a post on that in more detail soon.

Ok, so what did I do with the DNS at home? Did I just ditch the resolver in the house entirely? I did not. In the end I opted to dump BIND after all these years and replace it with Unbound. I had to do a bit of reading on it, but the configuration is quite a bit less complex, since I wasn’t configuring zone files any more. I was just setting up a small handful of bits: which interfaces to listen on, what I wanted my cache parameters to look like, and what to do with DNS traffic for the outside world – which is pretty much everything. In my case, I wanted to forward it to something fast and secured. I was already crushing pretty hard on Cloudflare, so 1.1.1.1 and 1.0.0.1 were easy choices. I’m also using Quad9’s 9.9.9.9 as well. All of that forwards out using DNS-over-TLS (DoT). It worked for me on the first try.

Then I grabbed the Ubuntu certbot snap and told it to grab a cert for dns.home.$(newdomain).net, which is attached to this machine. After I got the cert issued, it was a piece of cake to turn up both DNS over HTTPS (DoH) and DNS over TLS (DoT).
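A quick way to prove the DoT side actually works is to throw a query at it from another machine on the LAN. Here’s a little sketch using the dnspython module – the resolver address and TLS hostname are placeholders, not my real ones:

#!/usr/bin/env python3

# Quick sanity check: send a DNS-over-TLS query to the local resolver.
# The resolver address and TLS hostname below are placeholders.

import dns.message
import dns.query

query = dns.message.make_query("www.example.com", "A")
response = dns.query.tls(query, "10.10.10.10", port=853,
                         server_hostname="dns.home.example.net")
print(response.answer)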

It was fairly easy to get DoH working on a Windows 11 PC. It was also super easy to craft an MDM-style config profile for DoT that works great on iOS and iPadOS devices. Microsoft has Apple beat cold in this department. Well, in the Apple world, if you configure a profile for DoT (the only way you can get it in there), you’re stuck with it until you get rid of it – by removing the profile and installing a new one.

On Windows? It was as easy as setting your DNS servers to manual, then cracking open a command prompt as Administrator and running (assuming your DNS server is 10.10.10.10)…

netsh dns add encryption server=10.10.10.10 dohtemplate=https://my.great.server/dns-query

Once you’ve done that, you’ll be able to choose it from the list where you punch in DNS settings in the network settings and turn on encryption for your DNS connection. It’s working great!

Data Visualization and You…

Sometimes there’s data. You’ve got a bunch of it, you need to work out how to represent it in a way that not only makes sense to you, but is also appealing in some fashion. I’m going to talk about a couple of different use cases in this post, each with their own unique data presentations. First, the sensors.

I’ve got a couple of SwitchBot Meter Plus sensors around the house. One is in my office, and the other is in the garage. There isn’t much to them, small little things, battery powered. Pretty much it’s a little monochromatic LCD screen with a temp/humidity sensor and a bluetooth radio. That won’t do, on its own, of course. So, I added SwitchBot’s Hub Mini to the party. It’s a little bridge device that plugs into the house’s AC mains, and has both BT and WiFi radios inside. While I haven’t cracked it open, the device shows up with a MAC address that suggests it’s little more than an ESP32 or ESP8266 microcontroller inside. With the hub in place, connecting the sensors to the SwitchBot cloud, a really important thing happens – the sensors become accessible via SwitchBot’s REST API. So, I’m using some custom-written Python code that runs under Docker to read the sensors. Turns out it was all surprisingly easy to put the pieces together. It was also a precursor to another project I went on to do, where I helped a friend use a similar sensor to control a smart plug to operate a space heater.

So, what does one do with a sensor like this? You read it, naturally. You keep reading it, over and over, at some sort of fixed interval. In my case, I’m reading it every 5 minutes, or 300 seconds, and storing the data in a database. This type of data isn’t particularly well-suited to living in a SQL database like MariaDB, Postgres, etc. This is a job for a time-series database. So, I called on InfluxDB here. It’s relatively small, lightweight, and very well understood. The Python modules for it are pretty mature and easy to work with, so it was easy to implement as well. Total win. So, read the sensor (convert C to F, since I’m a Fahrenheit kind of guy), store in the database, sleep(300), do it again. Lather, rinse, repeat. Just keep on doing that for roughly the next, forever. Or until you run out of space or crash. That’s the code right there, in a nutshell.
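In sketch form, the loop looks something like this. I’m hand-waving the specifics – the SwitchBot v1.0 status endpoint and the influxdb-client calls are shown from memory, and the tokens, device ID, and bucket names are placeholders – but it captures the read/convert/store/sleep rhythm:

#!/usr/bin/env python3

# Sketch of the read/convert/store/sleep loop. Tokens, device ID, and
# InfluxDB org/bucket values are placeholders; the SwitchBot v1.0 status
# endpoint and influxdb-client usage are shown from memory.

from time import sleep

import requests
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

SWITCHBOT_TOKEN = "your-switchbot-api-token"
DEVICE_ID = "your-meter-device-id"
STATUS_URL = f"https://api.switch-bot.com/v1.0/devices/{DEVICE_ID}/status"

influx = InfluxDBClient(url="http://localhost:8086", token="influx-token", org="home")
write_api = influx.write_api(write_options=SYNCHRONOUS)

while True:
    resp = requests.get(STATUS_URL, headers={"Authorization": SWITCHBOT_TOKEN}, timeout=10)
    body = resp.json()["body"]
    temp_f = body["temperature"] * 9 / 5 + 32   # the sensor reports Celsius
    point = Point("office").field("tempF", temp_f).field("humidity", body["humidity"])
    write_api.write(bucket="sensors", record=point)
    sleep(300)   # lather, rinse, repeat every 5 minutes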

Sensors Data Visualization

So, what are we visualizing? At the right, you can actually see what I’m graphing. The InfluxData team were nice enough to include some visualization tools right there in the box with InfluxDB, so I’m happy to take advantage of them. Many folks would prefer to use something a bit more flashy and customizable like Grafana, and that’s totally cool. I’ve done it too, even with this same dataset, and the data looks just as good. Heck, probably even looks better, but for me, it was just one more container to have to maintain with little extra value returned. The visualization tools baked into InfluxDB are good enough for what I’m after.

LibreNMS WAN Metrics

Next up? Keeping an eye on what’s up with my WAN router’s Internet-facing link. Here at the homestead, I’m running LibreNMS to keep an eye on things. Nothing nearly as custom here – it’s more off-the-shelf stuff. It all runs (again) in Docker containers, and as you’d likely expect, uses SNMP to do the bulk of its monitoring duties. At the right, you can see some sample graphs I’ve got stuck to the dashboard page that give a last-6-hours view of the WAN-facing interface of my Internet router, a Juniper SRX300. You see the traffic report as well as the session table size. Within LibreNMS, I’ve got all sorts of data represented, even graphs of how much toner is left in the printer and the temperature of the forwarding ASIC in the switch upstairs in the TV cabinet. All have their own representations, each unique to the characteristics of the data.

Bottom line? Any time you’re dealing with data visualization, there is no one-size-fits-all. Spend the time with the data to figure out what makes the most sense for you and then make it so!

A Journey To A Smarter Dryer

This one was much more difficult. A lot more difficult.

If you recall from my post about the washer, I was able to pull off some fairly useful stuff without a ton of effort. Read a smart plug’s API to see how much power the washer is using to figure out when it turned on, then wait for it to turn off again, then let the fam know that the washer finished, and go take action so that the laundry doesn’t sit around for days, get funky and need to get re-washed. This was of course pretty easy simply because we were able to rely on the fact that the 120V motor in the washer draws well under 15A, the top end of the smart plugs I’m using, the Etekcity ESW15.

Sadly, when we moved into our house, we had an electric dryer. We’ve got natural gas in the house – heck, in the same room as the furnace and water heater, even. But back when the last washer sprang a leak and we needed a new washer in a hurry, it was unfortunately going to be months to get the matching gas dryer back in stock, so we just punted and stuck with the electric model. Sadly, this means we can’t take the same approach we did with the washer, since nobody makes a smart plug that works on 240V AC 30A circuits.

Unwilling to settle for relying on setting timers with Alexa, having to remind the kids to set timers, or just plain forgetting to do it, I started Googling around, looking for ways to monitor the dryer. Monitoring energy use is the natural fit. When the dryer is in use, it’s consuming loads of energy, and when the clothes are dry, the energy use falls right off. This really shouldn’t be that hard to figure out, right? Right? Sadly, it was.

My next move was to play around with a split core current transformer clamp, and build a circuit with a burden resistor, reading the thing with a microcontroller. I read about the whole process in a handful of places online and it didn’t seem too ridiculous to build the circuit, so I sourced the parts. I got a little breadboard, some jumper wires, the resistors, capacitors, and the CT sensor clamp, and a sacrificial extension cord, which I’d use for my proof of concept test. You see, the CT clamp goes around a single conductor, not the whole cable assembly, so I needed to modify the cable slightly. Relax, the real cable was one of those “flat, side-by-side” types, so it would only mean peeling them apart, not really cutting anything. Sadly, I never made it to that phase. During my POC phase, I was able to get readings back from the sensor, but they never made sense. I was using an ESP32 microcontroller with MicroPython, so maybe that’s related. Or maybe I had a bum CT clamp. Or something else was wrong. We’ll never know, since I gave up after several evenings of bashing my head against the desk.

Failing at the “point” solution of energy monitoring, I moved on to looking at whole-house power monitoring. Hey, if we can’t kill this fly with a newspaper, let’s try a missile, right? Sense landed at the top of the pile. It had the API I was after, though they sort of keep that on the DL. Not in love with that, since those sometimes disappear. If I’m going to drop a couple of hundred bucks on something and build around its API, it better not just disappear on a whim someday. Plus, our panel is in my home office, recessed in the wall, and there’s not exactly a clean way to get the Sense WiFi antenna back out without it looking really weird. I could make it clean, but then there’d just be a random RP-SMA antenna sticking out of my wall. Interesting decor choice. Sure to be a selling point when we sell the house some day.

Which brings me to the vibration sensor. I was reading one day, still searching, and I came across Shmoopty, and my problems were (half) solved. Sure, I had a Pi Zero W already laying around and I could have just built exactly what he had done, but what’s the fun in that? Remember, I’m already invested. It’s overkill time. So, I ordered up a couple of those 801s vibration sensors and got to work. You know, it was surprisingly hard to get one that met my needs at the time. Why? Most of the 801s units out there are analog-only. Since I’m using a Raspberry Pi, I wanted a digital output, so I didn’t need to mess around with extra ADC (Analog to Digital Conversion) circuitry just to read a simple sensor. So, I had to order from AliExpress and wait the long wait for shipping from China.

After my sensors finally turned up, I worked out the arrangement of my project boxes and so forth in the laundry room. I landed on a wall-mounted box for the Pi with a 1m pair of wires connected to the sensor, with the sensor inside another small box, which is stuck to the top of the dryer using a little strip of 3M VHB tape. Shmoopty’s Python made it easy to figure out how to read the sensor, so I was happy to be able to draw my inspiration from that. His approach is to keep it small, run on a Pi Zero W, even make it renter-friendly, while mine is more of a “go big” approach – building a Docker container to run it inside of.

Well, at the end of it all, it shares a lot of common philosophy with the plugmon tool, in that it loops infinitely, looking for start and stop conditions. Instead of watching power consumption, it’s watching for the dryer to start vibrating a lot. When that continues for an appreciable amount of time, the dryer is declared to be “on”. Once it transitions back to “off”, it fires an event that causes a Telegram message to get sent to the family group chat, again, much like when the washer finishes!
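Boiled way down, the loop looks something like this. This is only a sketch – the thresholds, pin number, and Telegram credentials are all made-up placeholders, not the actual project code:

#!/usr/bin/env python3

# Simplified sketch of the on/off state machine: count vibration hits over a
# sampling window, flip state when they cross a threshold, and send a Telegram
# message when the dryer transitions back to "off". The pin, thresholds, and
# Telegram credentials are placeholders. (Synchronous python-telegram-bot
# v13-style call.)

from time import sleep
from gpiozero import DigitalInputDevice
import telegram

SENSOR_PIN = 17
START_THRESHOLD = 20     # hits per window that mean "running"
STOP_THRESHOLD = 3       # hits per window that mean "stopped"

sensor = DigitalInputDevice(SENSOR_PIN)
bot = telegram.Bot(token="123456789:bot-token-goes-here")
CHAT_ID = -123456789

running = False

while True:
    hits = 0
    for _ in range(300):          # sample for roughly 30 seconds
        hits += sensor.value
        sleep(0.1)

    if not running and hits >= START_THRESHOLD:
        running = True            # dryer declared "on"
    elif running and hits <= STOP_THRESHOLD:
        running = False           # dryer finished
        bot.send_message(chat_id=CHAT_ID, text="Dryer's done! Go get the laundry.")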

Well, if you’ve made it this far, you’re ready to go check it out. So, get going and get reading. Smarten up that laundry room, report back what you did, and how you did it!

The Robots are Speaking…

So many applications and processes involve notifications. As we begin to automate processes, it becomes increasingly important to know whether the tasks we’ve initiated succeeded or not. That’s where automated notifications come into play. In my case, these often take the form of chatbots that send messages to let me know that something happened, or that a process either succeeded or failed.

I’ve settled on Telegram as my platform of choice for this. What’s attractive about Telegram? It’s available for a wide variety of platforms – iOS/iPadOS, Android, macOS, Linux, Windows, there’s even a web client.

So, how does one create a bot? You have a conversation with the BotFather. BotFather is a Telegram bot that creates bots (how meta, right?) for you. The bot walks you through the whole process. When the process is complete, you’re given a token – this is a secret value that serves as the “keys” to the bot. Once you’ve finished this process, you’re almost ready to start using the bot in your automations. Next, you start a chat with the bot, just like you would with another human. Send the /start command to the bot to start it up. Once you’ve done that, you want to figure out the chat_id for that chat. This is pretty straightforward to do, fortunately.

To get the chat_id value for a private chat with your bot, you visit this URL:

https://api.telegram.org/bot[your-bot-token]/getUpdates

Want to use the bot to notify multiple people? Create a Group Chat and add the bot to it. Send a message to the group and reload that URL up above. Check the updates for the Group Chat messages, and look for the chat_id for the Group Chat in there. How can you tell the difference between a regular chat with the bot vs a Group Chat that the bot belongs to? Easy. All chat_id values are integers, but Group Chat IDs are negative, while individual chat IDs are positive. Easy, right?
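If you’d rather not eyeball raw JSON in the browser, a few lines of Python will pull the chat IDs out of getUpdates for you. A quick sketch using the requests module, with a placeholder token:

#!/usr/bin/env python3

# Pull chat_id values out of getUpdates. The token below is a placeholder.

import requests

TOKEN = "123456789:ABC-123456-foobarbaz9876543210"
resp = requests.get(f"https://api.telegram.org/bot{TOKEN}/getUpdates", timeout=10)

for update in resp.json()["result"]:
    msg = update.get("message")
    if msg:
        chat = msg["chat"]
        # Group Chats show up with negative IDs, individual chats with positive ones
        print(chat["id"], chat.get("title") or chat.get("username"))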

Ok, so you’ve got the Token, and a chat_id value. It’s time to put those bits to work. Here’s an example, in Python, using the python-telegram-bot module, which you can install using pip install python-telegram-bot.

#!/usr/bin/env python3

import telegram

myToken = '123456789:ABC-123456-foobarbaz9876543210'
chatID = -123456789

bot = telegram.Bot(token=myToken)
# Note: this synchronous call matches python-telegram-bot v13.x;
# v20 and newer make Bot methods async (await bot.send_message(...)).
bot.send_message(chat_id=chatID, text="Hello World!")

This is just one of many ways possible to send notifications. Find the one that works best for you and your workflows and go with it! If you want to take a shot at using Telegram, give their docs a read before you start.