Warmle

Just what everyone needed - another Wordle spinoff.


Father Ted Sellotape Dispenser

About eight years ago I had a notion that I wanted to build a real, working version of the sellotape dispenser from Father Ted. You know, the one that Dougal buys in Flight Into Terror that goes "You have used one inch of sticky tape. God bless you." I bought a sellotape dispenser for the project, and quickly ground to a halt because:

  1. I didn't know anything about programming, and
  2. I didn't know anything about electronics.

Thankfully, in the intervening years, both of those things have changed, and the idea struck me again a few months ago. After a few requests, I'm going to give a very whistle-stop summary of how I made it.

Before I go on, if you want to own the device itself, I'm giving it away! I'm running a fundraiser on GoFundMe and sending it to a random donor. Update: I've now done the draw and given it away, but feel free to make further donations!

On to the build itself: I wasn't quite sure where to start. So I left out a pencil and paper one night before bed, and hoped that by the morning God would have written down what I should do.

Sadly, he didn't. So instead, I broke the project requirements down into the following tasks:

  1. Measure the rotation of the sellotape wheel somehow
  2. Calculate the length of tape dispensed based on this rotation
  3. Trigger playback of the appropriate sound files
  4. Somehow fit all of this inside a tape dispenser

I decided I'd start with the easier parts of the project. I sacrificed an old, broken Bluetooth mouse and extracted the little component its scroll wheel was connected to—a rotary encoder—which essentially toggles two pins on and off as it turns. Experimenting a little with an Arduino, I figured out how to reliably read its data, and put together a rough function to convert that into the total angle the wheel rotates each time it's moved.
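
That function boils down to quadrature decoding. Here's a rough sketch of the logic (written in Swift for readability; the real firmware would be Arduino C/C++, and the names and resolution figure are illustrative, depending on the specific encoder):

```swift
// A sketch of quadrature decoding, not the real firmware. The encoder's
// two pins emit a 2-bit Gray code as the wheel turns; each valid
// transition nudges a counter one step clockwise or anticlockwise.
struct QuadratureDecoder {
    var lastState = 0   // previous 2-bit state: (pinA << 1) | pinB
    var steps = 0       // net steps since power-on

    // Lookup table indexed by (lastState << 2) | newState:
    // +1 and -1 mark valid transitions; 0 means no (or invalid) movement.
    static let transitions = [ 0, -1,  1,  0,
                               1,  0,  0, -1,
                              -1,  0,  0,  1,
                               0,  1, -1,  0]

    mutating func update(pinA: Int, pinB: Int) {
        let newState = (pinA << 1) | pinB
        steps += Self.transitions[(lastState << 2) | newState]
        lastState = newState
    }

    // Total rotation in degrees, given the encoder's resolution
    // (steps per full revolution; check the datasheet).
    func angle(stepsPerRevolution: Int) -> Double {
        Double(steps) / Double(stepsPerRevolution) * 360
    }
}
```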

The next step was the audio. First I recorded myself saying all of the possible words that'd be required (deciding that twelve feet and twelve inches was a reasonable length limit). The Arduino wasn't an ideal device for audio playback, so I moved on to a Raspberry Pi (gen 2, I think). I connected a speaker to its audio jack via a small speaker amplifier, and installed Raspbian to get started. After some playing around, I settled on using mp3wrap and omxplayer to actually output the sound. The former joins the required sound files into a temporary file containing the specific measurement, which is then played with omxplayer and deleted. There's probably a smarter way to do this part, but this was quick, easy and effective.
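
Here's a minimal sketch of that join-play-delete cycle, assuming one mp3 per recorded word (e.g. "you.mp3", "have.mp3", and so on). It's illustrative only, and in Swift for readability; a shell script or Python would work just as well, and the project's real code is linked at the end of this post.

```swift
import Foundation

// Run an external tool and wait for it to finish.
func run(_ tool: String, _ arguments: [String]) throws {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: tool)
    process.arguments = arguments
    try process.run()
    process.waitUntilExit()
}

func announce(_ words: [String]) throws {
    let clips = words.map { "\($0).mp3" }
    // mp3wrap joins the clips, appending "_MP3WRAP" to the output name.
    try run("/usr/bin/mp3wrap", ["message.mp3"] + clips)
    let wrapped = "message_MP3WRAP.mp3"
    // "-o local" routes playback to the Pi's 3.5mm audio jack.
    try run("/usr/bin/omxplayer", ["-o", "local", wrapped])
    try FileManager.default.removeItem(atPath: wrapped)
}

// e.g. try announce(["you", "have", "used", "one", "inch", "of",
//                    "sticky", "tape", "god", "bless", "you"])
```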

I now had two of the key building blocks completed. With a little more experimenting it didn't take long to hook them up, and I was able to trigger a message by turning the scroll wheel. You can hear my inaugural, uncalibrated test run in the clip below.

I had bought a couple of sellotape dispensers for the project, but they all shared a common theme: their bases were weighted with solid lumps of concrete.

This was inconvenient, and I slightly wrecked one of the dispensers trying to extract the concrete with a hammer. I thought I had it at one stage, but like an eejit I kept banging away. I found another one that had a bit more room at the bottom, though, and I managed to grind away some rough edges and leave myself with enough room to get the job done. While it might not have been apparent above, the current prototype wasn't very small; it was just far away. As a result, I had to try and miniaturise things a bit. I swapped out the speaker for a much smaller one, and switched the Raspberry Pi 2 for a Raspberry Pi Zero. I had to add a USB sound card into the mix to make up for the Zero's lack of audio output, but already things were looking fairly plausible.

I removed as many connectors and unnecessary parts as possible, then soldered everything together. Once I'd added a battery, charging board, and switch, things were really starting to come together. Next was the misery of trying to fit the rotary encoder into the spindle that holds the tape. But maybe I like misery, because figuring out a good way to do that was pretty satisfying.

In the end I basically chopped the spindle in two and mounted the scroll wheel inside it. This then fits onto the rotary encoder, and I mounted a little section of the spindle on top so it can slide into the holster. All that remained was to cram everything inside the dispenser and close it up. It may be shoddy, shoddy workmanship, but miraculously, it did work.

The last remaining step was to write 'Greetings from Kilnettle' on the side, which may honestly have been the most challenging element of the whole thing. But it'll do.

If you want to check out the code that makes it all work, it's here.

PageTurn Universal

I've just released a new version of my app PageTurn. It's been rewritten from the ground up to be faster, easier to use, and, most significantly, to work on devices without Face ID - meaning it'll finally work on non-Pro iPads and M1 Macs. It also has a simpler, better layout, as well as much-improved onboarding and help.

It's now free to download, with an in-app purchase to unlock all the functionality. Compared with the previous paid-up-front model, I think giving users a chance to try out the gesture recognition first, and confirm it works for them, is a more sensible option.

At a technical level, it's been rewritten using SwiftUI. It uses no third-party libraries, and collects no analytics (barring Apple's built-in ones), so it's about as robust as it can get in terms of privacy.

This is a separate app listing, rather than an update to the previous version. My logic here is that, since a lot of people use the existing app in rehearsals and performances, it's important to keep the previous version available. I didn't want to risk interrupting anyone's workflow, and a separate listing was the only way to keep both versions available. Existing users can try out the new one, and either move over or stick with the (still completely functional) old version.

You can download it here.

Learning SwiftUI


My new SwiftUI app, Acacia

Some context

I’ve just released Acacia, my first SwiftUI app. It’s a practice tracker aimed at musicians, for iOS, iPadOS and macOS. More info on it here, and you can grab it on the App Store now. This post covers some of my ups and downs while learning SwiftUI.

TL;DR: SwiftUI is an intoxicatingly pleasing and fast way to create apps. Polishing these apps is hard, and can quickly become energy-sapping when trying to finish projects. But, after pushing through a few failed attempts, it’s completely won me over and I can’t see myself returning to UIKit.

It should go without saying that everything below reflects my opinion and personal taste. I think any tools that empower you to create what you want to create are valid, and nobody should feel bad about their choices (or at least no individuals; corporations could probably try harder). This blog is written in PHP, which I learned in 2018 and love using. If your choice is between creating something using the most-derided tools, and not creating something at all, choose create.

Finally, beyond a few illustrative sketches, I’m not going to dig into any specific code in this, so it's not going to be much use for fixing bugs. There are far better sources for that information than here.

Where I’m coming from

I started learning to program in 2016, which I’ve written about before. The short version, though, is that I cut my teeth writing iOS apps using Swift and UIKit. UIKit really clicked for me, though I never enjoyed using Interface Builder, and for the first year or two I did any complex layouts programmatically. Eventually I relented and started using IB a bit more: at first primarily for scaffolding, but eventually more or less everywhere I (practically) could. In spite of the time it saved laying things out for multiple screens, and the obvious wins it brought, it was always slow and frustrating; I felt like I wasn't so much using it as wrestling with it.


The storyboards for a past app. Don't they just scream fun?

When it comes to the Mac, I’ve only dabbled briefly in AppKit in order to make utility apps for myself. As much as I love the general idiom of macOS, I find AppKit an old-fashioned and pretty unpleasant framework to use. No surprise, when I was already used to its newer, shinier sibling. I never really considered Catalyst an option for developing Mac apps; it seems to require as much finagling as SwiftUI, or more, to leap the (albeit increasingly narrow) idiomatic canyon between macOS and iOS. But more to the point, in the main I don’t think Catalyst apps are very good (at least not without a tonne of extra work), and I'd rather spend that energy on something more exciting.


Extempo, an AppKit app I made years ago during my PhD, for doing computer-assisted improvisation.

I was really excited to see the SwiftUI announcement in 2019. It felt like an entirely different, yet familiar, approach to UI, and I couldn’t wait to try it. At the time my day-to-day work involved maintaining apps written using UIKit and IB, and given my preference for goal-oriented learning, I ended up not finding a good opportunity to learn SwiftUI that first year. Arguably that was a good thing; Swift itself was already a couple of years old by the time I started learning it, and even then it was frequently changing in ways that made starting out as a programmer more cumbersome than it might otherwise have been.

Starting out with, and giving up on, SwiftUI

Eventually, around July 2020, I had an idea for an app that felt like a good fit for learning SwiftUI. There were now the shiny new improvements from WWDC 2020 that made me even more excited to use it. By that time I was also maintaining web projects for the most part, so I wouldn’t have to switch back and forth between SwiftUI and UIKit/IB. Spoiler alert: this app was not Acacia, but one I working-titled Headway. It was a tangentially similar idea with that (im)perfect combination of a hazy idea of its target user, and an infinite, unordered list of features it would need to appeal to whoever that potential user might end up being.

Using Apple’s learning materials to get started, creating UI was just as fun and intuitive as I’d hoped it would be. It didn’t take long for me to have all the ‘screens’ for the app prototyped. While the ‘reactive’ approach makes a lot of sense, it was a major shift in mindset from what I’d done before. The advantage was, though, that the data model was more or less figured out in tandem with the UI.

The problem at this point, in addition to the lack of direction within the app itself, was that I’d built a pretty elaborate UI that only mostly worked. Thus began the Sisyphean task of fixing all the jank. At every turn, new and unexpected UI bugs would pop up; navigation hierarchies would break, sheets and alerts would randomly trigger or dismiss, the UI wouldn’t update to match the data, and other times it would. I was never quite sure whether I was doing something wrong, or whether it was some SwiftUI bug (and there are plenty of those). In the end I lost momentum, set the app aside, and eventually abandoned it.

At a surface level, SwiftUI provides a very compelling illusion of being easy, but it’s not. The degree of conceptual simplicity is a huge draw, but despite having a pretty strong grasp of Swift, there’s just enough syntactic newness—dollar signs, @Thingies and underscores—that I ended up relying on rote structures I didn’t really understand. So when things got beyond a certain level of complexity, and broke, I wasn’t able to fix them. Or at least I wasn’t able to fix them in a repeatable, useful way; I could change things until they worked, but I’d not really have learned anything by doing so.

Trying again

In the final week of February 2021 I had a new idea (this one would turn out to be Acacia). It was much simpler, and I had a good sense of the scope of features v1 would need. So I got working, this time proceeding much more carefully. I’d learned a lot of SwiftUI from my previous excursion, and was starting with a much more logical idea of how the code should be structured. I wanted to use SwiftUI exclusively, and if possible to avoid the rabbit hole of bridging in tonnes of UIKit/AppKit views.

Within a week or so I had the app more or less working—animations, iCloud sync, the works—but there were still lots of rough edges. For one, the macOS version was a mess. As a general rule I had used only SwiftUI code that worked on both macOS and iOS, but there were still lots of things that simply didn’t work or were chaotically laid out. Far from a magical way to get a macOS app ‘for free’. On top of that, there were lots of glitches in the iOS app; navigation views would randomly break, toolbar buttons disappeared, there was no way to dismiss the on-screen keyboard, and a lot of these things seemed to be outright impossible to sort without resorting to UIKit/AppKit.

Embracing #if

Acacia is about 3,200 lines of code according to cloc, and within that I have 49 blocks of code behind the #if os(macOS) compiler directive, and 55 behind #if os(iOS). At first I found this really clunky, because I didn’t have a good sense of where these fit; it took me a while to grok whether they could go inside parentheses or closures, or even to figure out the cryptic error messages I’d get when I placed one wrongly. But after some persistence, it began to feel quite natural. Instead of trying to shoehorn as much of the iOS layout into the macOS app as I could, I started to modularise Views into the smallest units that would work on both platforms. Then I composed parent Views that used these directives to display the right layout on the right platform.
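
To make this concrete, here's a hypothetical example of the pattern (the view, its contents, and both layouts are invented for illustration):

```swift
import SwiftUI

// A hypothetical cross-platform view. Note that #if has to wrap
// complete statements or declarations, which is why it can't go inside
// a modifier's parentheses.
struct SessionRow: View {
    let title: String
    let duration: String

    var body: some View {
        #if os(macOS)
        // macOS: a compact horizontal row suits list/detail layouts.
        HStack {
            Text(title)
            Spacer()
            Text(duration).foregroundColor(.secondary)
        }
        #else
        // iOS: stack vertically for narrower screens.
        VStack(alignment: .leading) {
            Text(title)
            Text(duration).foregroundColor(.secondary)
        }
        #endif
    }
}
```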

All of this is probably pretty obvious, and Apple’s own tutorials advise that breaking code into subviews is a very cheap and effective way to write clear SwiftUI code. But with Headway I had built so much of it by rapidly learning, iterating, and tweaking things till they looked right that I didn’t even know where to begin when trying to decompose Views into sensible chunks. Switching the build target to macOS after the initial iOS structure was laid out was actually a big help here. Setting aside the dubious merits (both economic and functional) of actually having a Mac app, the places where I had to swap out code for each platform were a really good guide to achieving a happy medium between huge, monolithic Views and unnecessarily trivial ones.

Respecting one’s elders

I mentioned that I wanted to keep Acacia as close to ‘pure’ SwiftUI as I could. The short version is that this, unless I’ve missed something, isn’t feasible yet. The lack of some things, like equivalents to resignFirstResponder() and registerForRemoteNotifications(), meant I had to call out to the underlying NSApplication/UIApplication a few times. In the end, though, the only times I had to use UIViewRepresentable or UIViewControllerRepresentable were to use SFSafariViewController, and to (bafflingly) give text inputs a ‘Done’ button so they could dismiss the keyboard. I’ll be shocked if that latter requirement survives past WWDC 2021.
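
For reference, those bridges are only a few lines each. The below is a sketch of the usual approach rather than Acacia's exact code:

```swift
import SwiftUI
import SafariServices
import UIKit

// Roughly what the SFSafariViewController bridge looks like (iOS only).
struct SafariView: UIViewControllerRepresentable {
    let url: URL

    func makeUIViewController(context: Context) -> SFSafariViewController {
        SFSafariViewController(url: url)
    }

    func updateUIViewController(_ controller: SFSafariViewController,
                                context: Context) {
        // Nothing to update; the controller is configured at creation.
    }
}

// The keyboard workaround amounts to asking UIKit to resign the first
// responder, whatever it currently is:
extension UIApplication {
    func endEditing() {
        sendAction(#selector(UIResponder.resignFirstResponder),
                   to: nil, from: nil, for: nil)
    }
}
```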

Closing thoughts

I described SwiftUI as intoxicating at the beginning of this post, and I think that’s the key theme of my time using it. That intoxication has its counterpart in the wearisome business of pinning down and removing quirks, but on the whole the weariness abated quickly as I got more proficient. My key takeaways for anyone wanting to try it out are:

  1. Proceed with a goal. Without one it's very hard to get through the slump in motivation that follows the initial development phase.
  2. Be prepared to start over. Probably more than once.
  3. Make it a Mac app, too.

I could've built Acacia using UIKit, but this was more fun and I learned lots. Likewise I could have built it for macOS via Catalyst, but I almost definitely wouldn’t have. While I feel like I've only just got the hang of the basics of SwiftUI, now that I’ve broken the back of it, it feels great to be equipped for what’s next.

In the meantime, check out Acacia on the App Store, and read more about it here.



SwiftUI Core Data bug

I've been building an app using both SwiftUI and Core Data for the first time. It's been a journey. Among a lot of weird gotchas, bugs, and new paradigms, I've just pinned down a bug that's been causing me a tonne of grief.

I'll try and explain the setup as concisely as possible:

  1. A sheet is presented which takes some user input, and which has a Core Data object (let's call it ParentObject) passed to it from the parent view.
  2. The user taps save, and a new Core Data object (let's call this ChildObject) is instantiated. The values they've input are assigned to that object, and its relationship as a child of ParentObject is set.
  3. With seemingly no pattern, this save operation fails and crashes the app with the error "Illegal attempt to establish a relationship 'Parent' between objects in different contexts."

I went through everything I could find or think of to pin down the issue. I found plenty of red herrings—certain inputs or sequences of deleting and re-adding objects in quick succession—but none of these held up to repeated testing. Then I noticed one pattern, which persisted with any testing I did.

The crash happened any time the 'slide-to-dismiss' gesture was initiated and then cancelled. It didn't matter if it was only by a few pixels; if the sheet experienced any downward movement at all, the Core Data context passed down through the @Environment wrapper was lost or changed, and any subsequent save operation failed. Because the sheet in question contains a ScrollView, I'd accidentally triggered the swipe-to-dismiss gesture countless times, and it was basically a flip of a coin whether I'd done it on any given run through the steps to replicate the bug. I've sent a Feedback (FB9048688), so hopefully it gets sorted soon.

Update: I did figure out a workaround to this just now. Getting the managedObjectContext from the parent object itself, rather than from the @Environment variable, and using that to create the child object, seems to work fine.
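
In sketch form (reusing the hypothetical entity names from the steps above; ParentObject and ChildObject would be the model's generated NSManagedObject subclasses, and the attribute is invented for illustration):

```swift
import CoreData

// Rather than trusting the context injected via
// @Environment(\.managedObjectContext), which can end up stale after a
// cancelled dismiss gesture, derive the context from the parent object:
func addChild(named name: String, to parent: ParentObject) {
    guard let context = parent.managedObjectContext else { return }
    let child = ChildObject(context: context)
    child.name = name      // illustrative attribute
    child.parent = parent  // both objects share a context, so no crash
    try? context.save()
}
```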

One Shoesworth of Running

When starting out with a new hobby, I like coming to milestones and looking back on my progress. Often these are just arbitrary dates, like one year in, but in this case a worn-out pair of runners seems like as good a time as any. I went for my first run on 27 April 2020. It was near the start (or as I had thought at the time, the end) of Lockdown, and came at a point where I was probably the least fit I'd ever been in my life.

324 days later I've run a total of about 750km. I started off with two- or three-kilometre runs a couple of times a week, at first very early in the morning around a 200-metre loop of footpath near my house. I worked my way up to five-kilometre runs over the first few weeks, then felt confident enough to run a 5k route through town by the middle of the summer. As I find with a lot of new hobbies, after the initial feeling of success I got a bit less diligent; I was managing one or two runs a week by the end of July, and by September I was letting weeks go by without running at all. On 27 September, conveniently yet coincidentally 100 days before 1 January, I went for a run in the evening. Then I did the same thing on the 28th. And when I realised keeping that up would get me to a total of 500km by the new year, I decided I wanted to do just that.

The sense of progression I felt during this period was great. My 5k record had been around 32 minutes at the beginning, and was about 22 minutes by the end. I missed my first day during October, and decided to try running a 10k the next day to maintain my average. By November I'd run four 10ks in a single week, and over the remainder of the time my best time fell from 56 minutes to 48. In December I managed a new distance record, stopping at just over 16km or 10 miles.

I reduced my intensity a little in January—running in icy wind had lost its charm around the 501st kilometre—then took a two-week break in February to recover from some leg pain I didn't want to exacerbate. In March I returned to my aim of 5k per day and have kept going since.

DMDB

As per its introductory paragraph: when watching movies, particularly old ones, I often find myself wondering how many of the people involved are still alive. So of course, a bit of code and some profuse use of The Movie Database's API later, DMDB was born.

It pretty much does what it says on the tin: you can search for movies and it'll tell you what percentage of their cast is alive or dead. I've also compiled a list of some interesting outliers and coincidences (Se7en is my favourite one so far).
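
The gist is simple enough to sketch. This is an illustrative outline rather than DMDB's actual source, assuming TMDb's v3 API, where a movie's credits list the cast and a person's record includes a "deathday" field that's null for the living:

```swift
import Foundation

// Paging, caching and error handling are all omitted; the key is a
// placeholder.
let apiKey = "YOUR_API_KEY"

struct Credits: Decodable { let cast: [CastMember] }
struct CastMember: Decodable { let id: Int }
struct Person: Decodable { let deathday: String? }

// Fetch and decode one TMDb v3 endpoint.
func fetch<T: Decodable>(_ path: String) async throws -> T {
    let url = URL(string: "https://api.themoviedb.org/3/\(path)?api_key=\(apiKey)")!
    let (data, _) = try await URLSession.shared.data(from: url)
    return try JSONDecoder().decode(T.self, from: data)
}

// Percentage of a movie's credited cast with no recorded date of death.
func percentageAlive(movieID: Int) async throws -> Double {
    let credits: Credits = try await fetch("movie/\(movieID)/credits")
    var alive = 0
    for member in credits.cast {
        let person: Person = try await fetch("person/\(member.id)")
        if person.deathday == nil { alive += 1 }
    }
    return 100 * Double(alive) / Double(max(credits.cast.count, 1))
}
```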

Check it out here.

Vaccine Progress Tracker

Update 3: More tweaks

Since the age range for which vaccinations are approved in Ireland continues to drop, I've decided to switch the bot to use the whole population again (4,977,400 according to the CSO). Additionally, with the more infectious Delta variant, it seems like people aged under 16 are probably more of a factor in spread now, so it seems wise to move towards considering the full population. It would be nice to have total/12+/16+ etc. breakdowns, but unfortunately the available data doesn't have that level of granularity (but correct me if I'm wrong!).

Another tweak is that I'm changing the label for the first bar from "dose one of two" to "at least one dose" and including the Janssen doses. This seemed to be causing a bit of confusion, so hopefully the switch will make things clearer.

Update 2: Revised Over 16s Figure

Using this tweet I've derived an estimate of 3,909,809 for the total over-16 population, so I've updated the bot to use this going forward. It's nice to have something closer to an official figure now, and it's a relatively small change from what I'd been using before, so it won't make a huge difference to the percentage so far.

Update: Switching to Over 16s

To reflect some changes to the vaccine rollout in Ireland, I've now slightly modified the tracker to measure against the population aged 16 and over. Using the same method as in the original post below—just taking 1/2 of the 13-18 age group instead of 1/6—I arrived at a new figure of 3,863,147 to base the percentage on.


Original post

After seeing the UK Vaccine Progress Tracker Twitter account yesterday, and already being a fan of the Year Progress one, I had to make an equivalent for Ireland.

Much like its UK counterpart, it's written in Python using the Tweepy library. Mine scrapes data from Ireland's COVID-19 Data Hub and will tweet only when new data has been added. Annoyingly, this source data seems to lag behind the "headline" data that gets reported in the news, but I haven't found any better official source.

Gauging Population Over 18

Like the UK bot, I'm counting percentages of adult (18+) population, those being what the government roadmap focuses on. This was an interesting challenge because there isn't really a solid source for the number of people older than 18 living in Ireland. The most recent census was 2016, so its data is about as out of date as it's going to get. It states the population as 4,761,865 but uses a fairly inconvenient set of age groups—13-18 and 19-24—which creates a bit of grey area.

In the end I estimated that about 1/6 of the 13-18 age group would be 18, giving an approximate over-18 total of 3,572,000, or 75% of the total population. The most recent official estimate of total population comes from the Central Statistics Office and places the population at 4,977,400 as of April 2020. Wanting to incorporate this growth, I used the 75% proportion from the 2016 data to land on 3,733,679 as the number I base the 'progress' percentage on. If anyone knows of a better estimate, let me know!
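
For clarity, here's that arithmetic laid out (a quick sketch using the figures quoted above):

```swift
// Deriving the over-18 figure used for the 'progress' percentage.
let census2016Total = 4_761_865.0  // 2016 census population
let over18In2016    = 3_572_000.0  // incl. ~1/6 of the 13-18 age group
let over18Share     = over18In2016 / census2016Total  // ≈ 0.75
let cso2020Total    = 4_977_400.0  // CSO estimate, April 2020
let over18Estimate  = over18Share * cso2020Total      // ≈ 3,733,679
```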

Typos

Time wasted today because of typing request.onReadyStateChange instead of request.onreadystatechange: a few hours more than I'd like to admit.

Incompletionist

I don't tweet often (I think about 5,000 times in the past 12 years), but I was very much a Twitter completionist. I curated the list of people I follow fairly tightly and, barring a few muted people, read every single tweet. Tweetbot was really handy for this, keeping my position in the timeline and syncing it across my devices. After about a decade of this, I found myself checking Twitter reflexively: waiting for a file to download? Check Twitter; waiting for an app to compile? Check Twitter; waiting for someone in a coffee shop? Check Twitter.

Realising this was doing me no good at all, I quit cold turkey on Sunday. I deleted Tweetbot from my devices, and aside from going on the website a few times to check something specific (like the delightful absence of a certain 'real' guy's account), I've not been on it since. While I like and am interested in most of what the people I follow have to say, for the most part I also don't really care about it, and was giving it undue presence in my mind.

I've replaced the habit with a combination of reading on the Kindle app, reading in NetNewsWire, or just being alone with my thoughts. I'm genuinely surprised how little I miss it, and whether I go back or not, the completionist days are over.

Running

I started running in April, during (the first) lockdown. I used to love going for long walks, but wanted to up my fitness, and running seemed like a good option. One thing that bothered me about walking was that I'd often end up checking my phone, replying to messages, or even idly checking Twitter while I walked; walking in itself didn't necessarily require all of my attention. After a few months of running a couple of times a week, in the last week of September I decided to try running 5km every day for as long as I could.

I've been averaging pretty much exactly 5k per day, though with a few days off and a few 10ks to compensate. To my surprise, I actually love doing it. One big reason is that I have to run and only run; there's no phone-checking, nor even the temptation to do so. It's just me, some music or a podcast, and the constant task of brushing my sweaty hair out of my eyes.

The Secret of Saint Nick

Twas the night before Christmas, when Sally and Jimmy,
Were sitting in silence and watching the chimney.
The whiskey was out, the stockings were hung,
And they patiently waited for Santa to come.
That morning they'd argued for hours on end,
About how he gets all of the toys that he sends.
"He makes them!", "He buys them!", "He gets them for free?"
Neither was sure what the answer could be.

So they hatched a grand plan, to pin down the facts;
They'd stake Santa out and catch him in the act.
They heard him arrive, with a thump from above.
Their peeled eyes soon saw soot, a boot, then a glove.
But sulking behind him, bedraggled and slack,
Was a sack that looked empty, which took them aback.
Nonetheless, he reached in, and produced with some joy,
Sally's chemistry set and Jim's action boy.

Gifts laid out, the jolly old elf looked around,
His whiskey and mince pie were easily found.
With seconds to spare, the kids' plan was foolproof;
They ran up the stairs and climbed onto the roof.
A sleigh sat as expected, sunk into the snow,
With eight of the happiest reindeer in tow.
They sneaked in the back of the rickety sled,
And hid with a mixture of fervour and dread.

Santa soon took his seat and got into position,
Threw his sack in behind him, to get on with his mission.
Wasting no time, the kids grabbed the sack,
Stuck their heads in the top but saw only black.
"On Dasher, on Dancer!" they suddenly heard,
With each name that followed, the craft further stirred.
With a mighty old jolt, the sled lurched ahead,
But Sally and Jimmy fell backwards instead.

Tumbling into the sack that had seemed so decrepit,
They fell for much longer than they had expected.
With a thud and a thump, the kids came to rest,
And found themselves, still, on the roof they had left.
Looking around just increased their confusion,
They wondered if Santa had been an illusion.
For there weren't any tracks where his sleigh had set off,
Nor any footprints in the snow from their socks.

It was like they had simply appeared on the roof,
And imagined the sleigh, and each antler and hoof.
They climbed to the window through which they got out,
But found it was shut, so they planned a new route.
Two hours of climbing, a mighty old chore,
Got them to the ground, so they knocked on the door.
It was morning, at last, and their presents were waiting,
They heard steps on the stairs, but much hesitating.

The door finally opened, after much of a dally,
Staring back at them, shocked, were Jimmy and Sally.
"How could this be!?" Shouted Jimmy in anger,
(Not the Jimmy you know, his doppelgänger).
They called for their parents in perfect rapport,
"Mum, dad, come quick we need you!", said all four.
Their parents arrived, and saw the newly-acquainted,
Their mother screamed and their father fainted.

Soon they calmed down, asking them to explain,
So Sally and Jimmy relayed their campaign.
As they told of the plan to see Santa's sack,
Their parents' brows furrowed as they tried to keep track.
"But Santa takes kids' toys, he doesn't bring them."
Said dad with a note of remorse for his income.
"We buy children presents in pairs every year,"
"Because our Santa steals one set of the gear."

Sally realised, with horror, now what had occurred.
Their Santa, a thief, used his sack, she inferred,
To reach into this parallel universe,
And steal toys for the kids that lived back on their Earth.
When they fell in the sack, they left their own world,
And into this place they had been promptly hurled.
In spite of this breakthrough, they were no less adrift,
It would be a full year before Santa's next shift.

Three-hundred and sixty-four days had now passed,
Sally and Jimmy would be home soon at last.
On Christmas Eve this year, the gift box enclosed,
Our Sally and Jimmy - no toys, books or clothes.
While waiting with patience for Santa's thief hands,
Our Jimmy thought something that threatened their plans.
"Well we haven't been home, so there won't be a list,"
"So Santa won't reach here when looking for gifts."

"You're right" said our Sally, as she shouted for help,
They explained to the parents who let out a yelp.
But quick as a flash, mum hatched a new plan,
They picked up the gift box, to the neighbours' they ran.
They opened the window as quick as could be,
And put the kid-gift-box under the tree.
As soon as they'd done it, with a crackle and flash,
A gloved hand appeared and perused the gift cache.

It picked up the gift box with uncanny ease,
And like that, it was gone, and the parents were pleased.
Sally and Jimmy were suddenly back,
In the world they had left, thanks to that evil sack.
When they heard Santa leave, they jumped out of the box,
And ran to the window that this time was locked.
They opened it, climbed out, and ran to their dwelling,
Planning the story that they would be telling.

They banged on the door and the lights all awoke,
And they grinned ear to ear waiting for their old folk.
And they heard in the sky, as they did reunite,
"Happy Christmas to all, and to all a good-night."
