Hyperbolizer is my new app for iOS 10. It’s a silly, fun idea. It has a standalone app, but the main focus is its iMessage app, which lets you type a message and generate a banner image from it. You can then press the Hyperbolize! button and have the app create a more… hyperbolic version of your message. Think lots of !!, ?? and totes amazeballs phrases. It also saves your ‘Hyperbs’ as stickers for later use.

It’s now live along with iOS 10! Check it out on the App Store, or at http://hyperbolizer.com.

It was a really fun experiment in trying out some text processing in Swift. I hadn’t really done any before; Tapt required very little of that sort of work. I’m pretty happy with the results so far! I’m intending to do a few posts outlining some interesting things I’ve learned from it. Stay tuned.

Tapt – a new iOS game

Since January I’ve been learning Swift so I could build an app I had an idea for late last year. It’s called Tapt – it’s a rhythm-based music game. It’s very simple: you’re given the name of a song, and you have to tap out its rhythm. You hear a note from the song each time you tap, so the aim is basically to play the song. It’s a pretty good way of practising rhythms too! I’ll be updating it regularly with some more features, and more importantly more packs of songs to play.

You can get it on the App Store (Android version coming soon!) by searching for Tapt, or by clicking the link below. Give it a try, and share it if you like it!

Download Tapt on the App Store by clicking here!

Colour – a free iOS Game

I’m currently developing a music game for iOS (and my first app, ever). It’ll be out soon I hope, but I’m stuck waiting for what feels like months on some tax stuff from the US before I can start selling paid apps… fun. In the meantime, I wrote a quick, free game over the course of about 24 hours a few weeks back. Basically, all you have to do is watch the circle on the screen, and tap it when its colour matches the background.

You can download it here: Colour on the App Store

It’s been a useful insight into the process of submitting an app to iTunes, especially app review and the stuff that comes with that. I’d love to hear any feedback on what you think of it (my twitter is probably the best place to do that, but you can find my email on the about page here too). I’ve a big update coming for Colour in the next week or two that’ll flesh it out a bit more than the 24 hours allowed. All this while I continue to wait in tax purgatory…

The Pi Zero Simpsons Shuffler

The Pi Zero Simpsons Shuffler1 is simple – you press the yellow button, and it plays a random episode of the Simpsons.2 I’d wanted to make it since hearing Will Smith talk about the idea on the Tested podcast years ago, so here it is.

It’s based on a Raspberry Pi Zero, so it’s pretty cheap to make. Breaking down the cost: the Pi Zero costs £4, it uses a 32GB micro SD card which costs about £9, a Mini HDMI cable costs about £3, and it also uses a few pence’s worth of electronic components and other bits and pieces. Naturally, I chose a bright yellow button; it’s just a simple momentary switch which pulls pin 11 down to ground (there are plenty of great tutorials for this around the web, so I won’t bother repeating them).3 So, the total cost is about £17 (plus the cost of buying and ripping your own DVDs of the Simpsons, of course).

Here’s a quick shot of a ‘prototype’ using a Pi Model A, before soldering it all together on the Zero.

On the software side, it uses a simple Python script. It waits for a button press, then selects a random file in a given folder and plays it with omxplayer. If the button is pressed while an episode is already playing, it stops that episode and starts a new one. I’ve uploaded the scripts here. If you try it out, make sure that the only files in the video directory are video files, since the script doesn’t check that they’re videos before attempting to play them. I’m using the Raspbian Jessie Lite image, which already contains all the software you need.
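The scripts themselves are linked above; as a very rough sketch of the player-control part, the logic might look something like this (the folder path is an assumption, the GPIO button loop is omitted, and the real script may well stop omxplayer differently):

```python
import os
import random
import subprocess

# Assumed path -- the post doesn't name the actual video folder.
VIDEO_DIR = "/home/pi/videos"

def pick_episode(video_dir):
    """Pick a random file from the folder. As noted above, everything
    in the folder is assumed to be a playable video."""
    episodes = os.listdir(video_dir)
    return os.path.join(video_dir, random.choice(episodes))

def play_random(video_dir, current=None):
    """Stop any episode that's already playing, then start a new one
    with omxplayer, returning the new player process."""
    if current is not None and current.poll() is None:
        current.terminate()  # a simplification; omxplayer is often quit via stdin
        current.wait()
    return subprocess.Popen(["omxplayer", pick_episode(video_dir)])
```

A loop watching pin 11 for button presses would then just call `play_random()` each time, passing in the previous player process so it gets stopped first.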

Do let me know if you enjoyed the project, or especially if you give it a go yourself! Finally, here’s a short video of it in action:


Update: Big improvements in Pencil latency

A few days ago I posted a video comparing note taking apps on the iPad Pro 9.7-inch. Since then, there have been updates to OneNote and Notability which improve things tremendously. I’ve made another video comparing them again, using pretty much the same protocol as before.1 Notability has improved hugely, and OneNote is now only a tiny bit behind. The remaining gap seems small, but this is a huge improvement and makes each app much more pleasurable to use. Notability in particular feels just as responsive as Apple Notes, and has completely won me over as a notebook-replacement app.

Benchmarking and comparing video rendering times on the iPad Pro

I do a lot of video editing, mostly using Premiere Pro on my desktop or laptop, though I’ve been increasingly using my iPad/iPhone for small edits like sharing with friends or on social media, etc. As time goes on I’m beginning to wonder how feasible it is to do some more serious editing on iOS. Each of my iOS devices handles 4K video from my Panasonic GH4 or Phantom 3 quadcopter with ease. They play it perfectly, and hardly drop a frame even when scrubbing through videos. Rendering video projects is also impressively fast, and with that in mind I decided to run a few benchmarks.

I’m comparing quite a few devices, basically anything I own that will run Premiere or iMovie. The devices are:

- Desktop 'Hackintosh', Core i7 2600, 16GB RAM and (an ancient) AMD 5770 GPU
- MacBook Pro, Core i7 4850HQ, 16GB RAM, Nvidia GeForce GT750M
- Surface Pro 3, i3 4020Y, 4GB RAM, 64GB storage
- iPhone 6S Plus, 2GB RAM, 64GB storage
- iPad Pro 9.7-inch, 2GB RAM, 32GB storage

Each device is running the latest version of its OS.


I made a simple video project using two 4K videos, shot on the Panasonic GH4 at 24fps at ~80Mbit/s. I imported the videos straight from the SD card on each device (using the Apple SD Card to Lightning Adapter for the iPad/iPhone).1 I created as similar a project as I could on each device/program, using iMovie on the iOS devices and Macs, then Premiere on the Macs and the Surface. The bitrate for rendering on each device was set to ~23.5Mbit/s, exporting at 4K/UHD. I tried to use the fastest possible settings on each device: in iMovie on the Macs, I chose the ‘fastest’ option; in Premiere I chose ‘VBR, 1 pass’ and left ‘maximum render quality’ unticked. Additionally, for Premiere I rendered twice on each device – first using the GPU (OpenCL/CUDA) and again using the CPU.

I also made sure to close all other open programs on each device, and leave some time between repeat tests to account for any thermal changes.


The results are pretty interesting, and once again the iOS devices impress me. Here they are from fastest to slowest:

- 00:38.15 - MacBook, iMovie
- 01:24.47 - iPad, iMovie
- 01:29.46 - iPhone, iMovie
- 04:11.41 - Desktop, GPU, Premiere
- 04:40.05 - Desktop, CPU, Premiere
- 04:44.48 - MacBook, CPU, Premiere
- 04:56.95 - MacBook, GPU, Premiere
- 05:43.34 - Desktop, iMovie
- 13:00.59 - Surface, CPU, Premiere
- 13:03.54 - Surface, GPU, Premiere


The MacBook with iMovie clearly beats the rest. However, while the iOS devices are a bit slower, they beat the remaining combinations by a wide margin. My assumption here is that there’s a hardware H.264 encoder of some kind that’s being used by iMovie on the iOS devices and the MacBook. The Hackintosh is at a disadvantage in that respect, and it seems like Premiere doesn’t make use of it on the MacBook either. While this test focuses on render times, I feel the results are largely in line with my experience of each app in general usage. Scrolling through clips, trimming and doing other basic edits feels much smoother on the iPad than in Premiere.

Looking at the rendered files, I don’t see much appreciable difference in quality between them. It would be nice to have the option to output at a higher bitrate on the iPad, but the video output by each device still looked pretty good.

While the MacBook is comparable to my ageing desktop in speed using Premiere, its fans run at full speed the entire time. The desktop feels better-suited to longer workloads as a result – to avoid both thermal throttling, and excessive wear and tear from the (quite intense) temperatures the MacBook reaches. The iPad did heat up a little, though considering the differences in TDP between the chips, it’s never going to get as hot as the i7s.2 It would be interesting to test it over a longer time period to see if it does have to throttle eventually.

Coming back to the iOS devices though, I really would like to see Final Cut Pro or a fuller version of Premiere become available for them. It’s surprisingly un-cumbersome making edits with a touchscreen (helped even more by the Pencil), so it would be great to have the ability to do some of the more advanced things that one would expect in Premiere.3 I’m aware there are lots of other video editing apps for iOS, and I’ll maybe do a follow-up post comparing speed/features between them.

Here’s a quick comparison shot of the three versions. I’ve also uploaded a screenshot from the iPad render, the Desktop+Premiere+CPU render, and the MacBook+iMovie render. If anyone is interested in the actual video, here’s the one rendered by the iPad.



Comparing Pencil latency between apps on the iPad Pro 9.7

I made a quick video to compare some writing/drawing apps on the iPad Pro. I used the built-in Notes app, OneNote (which has been my note-taking app of choice to date) and Notability. I measured how long it took for the ink to catch up to where the Pencil tip was at a given point, while drawing lines at the same speed. I guess unsurprisingly, Apple’s Notes app had the shortest lag at 33ms.1 Notability was slowest for me, at 112ms, and OneNote was in the middle at 87ms. While I’m starting to like Notability enough to use it as my primary notes app, I can definitely perceive its increased latency compared to Notes. It’s responsive enough not to be a problem, but it just doesn’t feel quite as nice as Notes. Hopefully more apps will achieve lower latencies as they get updated.

Single-Core Studies

I recently started doing a lot more coding, and even more recently bought an iPad Pro 9.7-inch. Out of curiosity, I wrote a quick-and-dirty program to benchmark single-core performance on my different devices. It’s simple – it runs through the numbers 0 – 100,000, checking if they’re prime, and measures the time it takes to run. It’s quite possible I’ve made a mistake in the program – mathematical or logical – but it doesn’t really matter as long as it’s consistent, since I’m not actually looking for prime numbers. It isn’t a particularly efficient way of looking for primes either, but again, that doesn’t really matter here.
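The actual Swift and C++ sources are linked at the end of this post; as a language-agnostic sketch of what each version does (shown here in Python purely for illustration, with the limit dropped from 100,000 to 10,000 so it runs quickly), it’s just a timed trial-division loop:

```python
import time

def is_prime(n):
    """Deliberately naive trial division: every candidate divisor is
    checked, because consistency across devices matters more than speed."""
    if n < 2:
        return False
    for d in range(2, n):
        if n % d == 0:
            return False
    return True

def benchmark(limit):
    """Time how long it takes to test every number below `limit`."""
    start = time.perf_counter()
    count = sum(1 for n in range(limit) if is_prime(n))
    return count, time.perf_counter() - start

count, elapsed = benchmark(10_000)  # the post runs the full 100,000
print(f"{count} primes below 10,000 in {elapsed:.2f} s")
```

The count itself is irrelevant to the benchmark; the only thing that matters is that every device grinds through the same loop.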

The devices I tested were:

- an i7 2600-based Hackintosh (16GB RAM, running El Capitan)
- an i7 4850HQ-based 15-inch MacBook Pro (16GB RAM, running El Capitan)
- a Surface Pro 3 i3 (4GB RAM, running Windows 10)
- an iPhone 6S Plus (2GB RAM, running iOS 9.3)
- an iPad Pro 9.7-inch (2GB RAM, running iOS 9.3)
- an iPad 2 (iOS 9.3, because why not)

For the iOS devices I wrote the program in Swift (which is what I’ve primarily been coding in lately). For the Surface and Macs, I used C++. I initially tried running the Swift version on the Macs, but used the C++ version in the end.1 I tried to keep the two versions as close to each other as possible, though I’m not entirely sure what sort of difference in overhead there may be between the Swift and C++ code. I ran the program three times on each device, and took the averages.2

Here are the times (and a ratio of how they compare to the top result):

18.814 seconds - 1.00x - i7 2600 Desktop
20.350 seconds - 0.92x - i7 4850HQ MacBook Pro
21.499 seconds - 0.88x - iPad Pro 9.7-inch
25.815 seconds - 0.73x - iPhone 6S Plus
40.654 seconds - 0.46x - i3 Surface Pro 3
281.21 seconds - 0.07x - iPad 2

The results are pretty interesting.

The iPad performs extremely well against the desktop and laptop, considering the differences between the devices in terms of size, power consumption etc. Geekbench single-core results jibe fairly well with mine; I got a score of 3151 for the desktop vs 3041 for the iPad.3 The Surface comes in at about half the speed of the iPad Pro, which is also surprising. While the iPad Pro feels a lot faster than the Surface in general use like web browsing, my gut feeling had been that x86 would still beat ARM. It’s also interesting to see a roughly 13x improvement over the iPad 2.

Without putting too much stock in what is a pretty artificial benchmark, the iPad obviously has a fair bit of grunt. I’m hoping more ‘pro’ apps will appear over time, letting it take over more of the functionality of a laptop or desktop, since it evidently has the power to do so.

Here’s the source code for the Swift and C++ versions of PrimeChecker, if anyone wants to give them a shot. They may need a little tweaking depending on what you want to run them on.

Update: There have been quite a few suggestions for how the code could be optimised. Checking the range 2..<sqrt(n)+1, and avoiding writing the prime bool excessively do indeed vastly improve performance. However, as I said in the intro, it doesn’t really matter for my purposes; I’m not trying to find primes faster, I’m just looking to run comparable code on each device. For curiosity’s sake, after adding in those optimisations it takes ~7.6 seconds to find all the primes between 1 and 100,000,000 on the desktop, ~8.4 on the iPad and ~10.3 seconds on the iPhone (larger range just to get more palpable numbers). The optimised code gives a set of results in a pretty similar ratio to the unoptimised code.
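For the curious, the square-root bound mentioned above amounts to the following (again sketched in Python rather than the Swift/C++ used in the actual benchmark, and showing only the range optimisation, not the bool-writing one):

```python
import math

def is_prime_optimised(n):
    """Only test divisors up to sqrt(n): any factor larger than the
    square root pairs with a smaller factor that was already checked."""
    if n < 2:
        return False
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return False
    return True
```

This turns each check from O(n) into O(sqrt(n)) divisions, which is where the jump from a 100,000 range to a 100,000,000 range in comparable time comes from.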


Compact Motion Detection with Raspberry Pi (part 1 – hardware)

A while ago I built a fairly basic Raspberry Pi project, to use as a motion detector that would look through my DSLR viewfinder and trigger the shutter accordingly. With respect to hardware (I’ll talk about the software in a follow-up), the three main things I wanted to achieve were: good battery life, remote controllability, and a compact size.

Continue reading “Compact Motion Detection with Raspberry Pi (part 1 – hardware)”