Ben Dodson

Freelance iOS, macOS, Apple Watch, and Apple TV Developer

Reaction Cam v1.4

Over the past few weeks I’ve been working on a big update for the Reaction Cam app I built for a client a few years ago. The v1.4 update includes a premium upgrade which unlocks extra features such as pausing video whilst you are reacting, headphone sound balancing, resizing the picture-in-picture reaction, and a whole lot more.

The most interesting problem to solve was the ability to pause videos you are reacting to. Originally, when you reacted to a video, the front-facing camera would record your reaction whilst the video played on your screen; it was then a fairly easy task to mix the two videos together (the one you were watching and your reaction) as they both started at the same time and neither would be longer than the overall video length. With pausing, this changes for two reasons:

  1. You need to keep track of every pause so you can stop the video and resume it at specific timepoints matched to your reaction recording
  2. As cutting timed sections of a video and putting them into an AVMutableComposition leads to blank spaces wherever the video was paused, it was necessary to capture a freeze frame at the point of pausing that could be displayed over those gaps

This was certainly a difficult task, especially as the freeze frames needed to be pixel-perfect with the paused video, otherwise you’d get a weird jump. I was able to get it working whilst also building in a number of improvements and integrating in-app purchases to make this the biggest update yet.
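To give a flavour of the first problem, here is a minimal sketch (not the actual Reaction Cam code; the type and method names are purely illustrative) of how pause events might be tracked so the watched video and the reaction recording can be lined up again afterwards:

import Foundation

// A minimal sketch, not the shipping Reaction Cam implementation: record every
// pause so the source video can later be re-aligned with the reaction recording.
struct PauseEvent {
    let start: TimeInterval     // position in the reaction recording when the pause began
    let duration: TimeInterval  // how long the video stayed paused
}

final class PauseTracker {
    private(set) var events: [PauseEvent] = []
    private var pendingPauseStart: TimeInterval?

    func pause(at reactionTime: TimeInterval) {
        pendingPauseStart = reactionTime
    }

    func resume(at reactionTime: TimeInterval) {
        guard let start = pendingPauseStart else { return }
        events.append(PauseEvent(start: start, duration: reactionTime - start))
        pendingPauseStart = nil
    }

    /// Maps a time in the reaction recording back to the corresponding time in the
    /// source video by subtracting any paused time that occurred before that point.
    func videoTime(forReactionTime time: TimeInterval) -> TimeInterval {
        return events.filter { $0.start <= time }
                     .reduce(time) { $0 - min($1.duration, time - $1.start) }
    }
}

Each recorded pause interval then also marks where a freeze frame needs to be overlaid when the final composition is assembled.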

I’m really pleased with the update and it looks like the large user base is too, with nearly 500 reviews rating it at 4 stars.

If you haven’t checked it out, go and download the free Reaction Cam app from the App Store. You can remove the ads and unlock extra functionality such as the video reaction pausing by upgrading to the premium version for just £0.99/$0.99 - it’s a one-off charge, not a subscription.

Foodim

Back in 2014, I was approached by a team representing Nigella Lawson to work on an app centered around food photography. As a big fan of Nigella, I jumped at the chance and spent several months working on the Foodim app. Nearly five years have passed since then but the app is now finally live in the App Store!

It is probably best to let Nigella explain what the app is all about:

It has always been vexing to me that there is no dedicated food photography app, and so many of the filters and so on that are meant to be applied on general photography apps do food no favours. So, based on the principle that if something you want doesn’t exist, just go ahead and make it, I’ve been working for some time with my longtime cameraman to develop a food photography app with a built-in filter designed to optimise food and a back-of-shot blur dependent on the angle of the phone (as well as a draw-to-blur feature) to give depth of field.

When I first joined the team, there was a basic app that had been built but it wasn’t anywhere near polished enough for launch. The custom-made blur filter was working but the app would crash from memory constraints after you took a few photos. I started by rebuilding the photo memory subsystem and working on the fundamentals of the networking. For example, I worked with the API developer to create a patch system that pushed short bursts of data to the app when changes were made, ensuring that the local cached copy was always up to date and that there was no loading time when answering push notifications1. I also created a system for the background uploading of images; the image would appear in your feed instantly and then upload in the background before silently reloading in the feed to use the online copy.
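As a rough illustration of the push flow described in the footnote below, here is a hedged sketch using today’s UserNotifications APIs rather than the iOS 7-era ones I used at the time; FeedCache and the payload keys are hypothetical stand-ins, not the real Foodim code:

import UIKit
import UserNotifications

// Hypothetical cache, included only to make the sketch self-contained.
final class FeedCache {
    static let shared = FeedCache()
    func prefetch(postID: String, completion: @escaping (Bool) -> Void) {
        // Download the post JSON and its image here, then persist them locally.
        completion(true)
    }
}

class AppDelegate: UIResponder, UIApplicationDelegate {

    // A silent push ("content-available": 1) wakes the app in the background.
    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        guard let postID = userInfo["post_id"] as? String else {
            completionHandler(.noData)
            return
        }

        // Fetch and cache everything the post needs while still in the background…
        FeedCache.shared.prefetch(postID: postID) { success in
            guard success else { completionHandler(.failed); return }

            // …then show a visible local notification; by the time the user taps it,
            // the post is already on disk and opens instantly.
            let content = UNMutableNotificationContent()
            content.title = "New photo on Foodim"
            content.userInfo = ["post_id": postID]
            let request = UNNotificationRequest(identifier: postID, content: content, trigger: nil)
            UNUserNotificationCenter.current().add(request)
            completionHandler(.newData)
        }
    }
}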

Over time I helped work out UX issues, redesigned various aspects, and helped move some of the camera code over to a newer image processing system, including working on the draw-to-blur functionality and improving the gyroscopic tilt mechanic that adapts the depth of field. I also used my contacts with Apple Developer Relations to set up a meeting between Apple and Foodim to showcase the app and get their opinion on improvements that could be made.

My work on the app was complete in 2015 but I’ve had the odd bit of correspondence in the meantime as minor issues were resolved. Since then, I believe a new team has been working on some camera improvements and further changes to the app to accommodate newer devices and the changing landscape of iOS development since iOS 7 was released. I’ve no idea why it has taken quite so long to launch the app but I’m extremely happy to see it available now in the UK, Australia, and New Zealand.

The app is totally free and can be downloaded from the App Store. You can find out more details about the app over at foodim.com.

And, in case you were wondering, I never did get to meet Nigella in person. I was meant to meet her in London but a printing error at the train station meant I missed my train and had to join the meeting via Skype instead. From that day onward, I never travelled by train without having printed my ticket days in advance…

  1. In most apps of this nature, you’ll get a push notification when a new photo is uploaded; when you tap on the notification, the app is opened but you then need to wait for the post and image to load as they haven’t been prefetched. With this project, a silent push notification was sent that would wake up the app in the background; it would then fetch all of the relevant information and cache it locally before sending a local notification to the user. When that notification was tapped, the post was opened and was ready and waiting for them with no additional downloading required. This is far more common in apps today but was something of a rarity back in the days of iOS 7 when I originally built it! ↩︎

Announcing the Apple Music Artwork Finder

When I launched my iTunes Artwork Finder a few years ago, I had no idea how popular it would become. It is currently used thousands of times per day to help people find high resolution artwork for their albums, apps, books, TV shows, and movies. Since the launch of Apple Music, I’ve had regular emails from users that wanted to access the artwork used for playlists across the service; I’ve finally done something about it!

Today I’m happy to announce the Apple Music Artwork Finder which grabs ultra high resolution artwork for albums, playlists, and radio stations from Apple Music. It’s ridiculously easy to use and just requires you to paste in an Apple Music URL; the tool then makes a few requests to the Apple Music API to retrieve the artwork.
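If you wanted to do something similar yourself, the lookup is roughly as follows — a Swift sketch against the documented Apple Music API, where the developer token, storefront, and the 3000px size are assumptions rather than a description of how my finder is actually implemented:

import Foundation

// A rough sketch of the kind of lookup involved: fetch an album from the Apple
// Music API and fill in the {w}x{h} placeholders of its artwork URL template.
func fetchArtworkURL(storefront: String,
                     albumID: String,
                     developerToken: String,
                     completion: @escaping (URL?) -> Void) {
    var request = URLRequest(url: URL(string: "https://api.music.apple.com/v1/catalog/\(storefront)/albums/\(albumID)")!)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")

    URLSession.shared.dataTask(with: request) { data, _, _ in
        guard let data = data,
              let object = try? JSONSerialization.jsonObject(with: data),
              let json = object as? [String: Any],
              let album = (json["data"] as? [[String: Any]])?.first,
              let attributes = album["attributes"] as? [String: Any],
              let artwork = attributes["artwork"] as? [String: Any],
              let template = artwork["url"] as? String else {
            completion(nil)
            return
        }
        // The artwork URL is a template, e.g. "…/{w}x{h}bb.jpeg"; swap in a large size.
        let urlString = template
            .replacingOccurrences(of: "{w}", with: "3000")
            .replacingOccurrences(of: "{h}", with: "3000")
        completion(URL(string: urlString))
    }.resume()
}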

Whether you want the artwork for your New Music Mix playlist, the latest Panic! at the Disco album, or for the Beats 1 banner, the artwork finder should be able to get you the highest quality artwork. Oh, and it’s totally free as well!

Try out the Apple Music Artwork Finder »

Hawkker

If you’re an entrepreneur looking to get an app built by a large digital agency, how do you ensure you stay on top of the project if you don’t have any knowledge of how app development works? That was the predicament that Zeid Bsaibes had when he came to me in September 2017 with his project Hawkker, an app to find the best independent food from street markets. He had ruled out working with a sole freelancer as the project was too large, comprising two apps, a website, and a complex server infrastructure, but he didn’t feel comfortable outsourcing to a large agency without some form of oversight. To that end, he hired me in a consultancy role to act as a sounding board for functionality whilst also acting as a middle-man between himself and the agency he chose, Hedgehog Lab.

The result is two apps, Hawkker and Hawkker Vendor, both now available on the App Store.

To begin with, Zeid and I looked through his copious wireframes and documents to pinpoint any issues, especially with the QR code redemption system he was proposing for Hawkker Points, a rewards scheme that benefits both eaters and vendors. I had extensive experience with QR codes from my work on Chipp’d and was therefore able to alter the designs so that everything would work smoothly in an environment where network connectivity may not be perfect.

From October 2017 until September 2018, I acted as a liaison and mediator between Hawkker and Hedgehog Lab as the app was developed. I performed code reviews, inspected contracts, acted as a constant point of contact to discuss functionality and ideas, and helped mediate where necessary. I was able to assist the agency by explaining to Zeid in depth why something might take a certain amount of time to develop, and I was able to assist Zeid by pushing back at the agency when they provided unrealistic timelines and by using my technical knowledge to speak with their developers directly rather than going through a non-technical account manager. My role basically boiled down to being someone able to translate between entrepreneur and technical staff whilst also providing my own suggestions based on my extensive experience of app development.

Towards the end of the project, I acted as QA, performing extensive testing and providing code-level bug reports for Hedgehog Lab to work on.

Once the app was completed, I was asked to take over the development of the iOS app and was tasked with cleaning up some of the remaining bugs that had been left unresolved due to lack of time. I rebuilt the vendor detail pages within the eater app to include a fluid animation system and improved the photo gallery to ensure that eaters were getting the very best experience. Now that the app has launched, I am periodically called on to work with the rest of the Hawkker team to resolve issues and improve the apps.

Whilst this is not the sort of work I usually do, it has opened my eyes to the need for some clients to have a consultant alongside them when engaging with large agencies. Had I not been a part of this process, I have no doubt that the apps would have been far poorer and that Zeid would not have had the wide knowledge he now has of how the apps actually function behind the scenes.

It has been a real pleasure working with Zeid and the rest of Hawkker over the past few years. I’d encourage you to check out the free eater app on the App Store or recommend the vendor website to your favourite street food sellers. You can also learn more about the entire platform at hawkker.com.

If you are considering hiring a large agency to deliver your product, I would strongly advise hiring a consultant to sit in on meetings and keep track of the development process. I’d obviously like you to choose me (you can contact me to find out more) but having any technical consultant along with you is going to make the process far easier and help you navigate the sometimes awkward world of agency development.

DrinkCoach Updates

Over the past few weeks I’ve been working on some big updates to the DrinkCoach+ app that I developed for Orbis Media and the HAGA last year:

The big change is a new ‘month-at-a-glance’ screen with a scrollable calendar giving you a great overview of your alcohol intake over time. This is enhanced with a ‘Zero-Alcohol Days’ badge that increases each day to show your current streak. In addition, a new summary PDF can be generated for a range of dates (anything from the past week or past month to specific from and to dates); this PDF shows the total number of units, calories, and cost along with an average units-per-day count and the total number of zero-alcohol days. The PDF can be easily downloaded or shared with your healthcare professional. Finally, a number of UX changes were made to improve the layout of the app, support was added for the most recent Apple devices, and the code was updated to Swift 4.2.

In show business, it is often said that you should never work with children or animals. In software development, the equivalent is that you should never work with date formatting. I certainly found building this calendar system from the ground up a challenge and keeping it performant when the local Realm database is full of data was definitely not easy. That said, I’m incredibly pleased with how the update has turned out and it seems the users of the app are too; to date, the app has received over 1200 reviews on the App Store averaging a 4.8 rating whilst also being featured by publications such as The Observer, The Guardian, and The Huffington Post.
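For the curious, the streak logic itself is conceptually simple; a minimal sketch (not the DrinkCoach+ code, and ignoring the Realm layer entirely) might look like this:

import Foundation

// A minimal sketch: count the current run of zero-alcohol days by walking
// backwards from today until a drink day (or the first tracked day) is hit.
// `drinkDays` is assumed to contain start-of-day dates for days with any intake.
func zeroAlcoholStreak(drinkDays: Set<Date>,
                       earliestEntry: Date,
                       calendar: Calendar = .current,
                       today: Date = Date()) -> Int {
    var streak = 0
    var day = calendar.startOfDay(for: today)
    let firstDay = calendar.startOfDay(for: earliestEntry)

    while day >= firstDay, !drinkDays.contains(day) {
        streak += 1
        guard let previous = calendar.date(byAdding: .day, value: -1, to: day) else { break }
        day = previous
    }
    return streak
}

The real difficulty, as ever, is normalising every logged entry to a consistent calendar day and keeping the calendar screen fast when there are years of entries to query.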

It was really great to work with Orbis Media again and I look forward to future projects with them. You can download DrinkCoach+ on the App Store for free and learn more about it at drinkcoach.org.uk.

Building a Twitch Panel Extension

A couple of months ago I started streaming some of the many video games I play on Twitch. For those that aren’t aware, your Twitch profile can be customised with a number of text or image based panels along with a relatively new “extension” panel which is essentially an iframe. I was spending some time adding the type of wine I was drinking on each stream in a text-based panel and decided it would be more efficient to build a simple panel extension to display this information in a more customised format.

Thus the “Currently Drinking” extension was born, which allows users to add a name, type, location, price, ABV%, description, notes, and an image for the drink they are currently enjoying. I also added the ability to provide a URL for a website such as vivino.com, distiller.com, or untappd.com which is then screen-scraped to provide the information automatically.

This post isn’t going to be a complete tutorial on how to build an extension; I didn’t expect the process to be so complex and so didn’t write down how I got everything working as I went along. Suffice it to say, the process was a lot more difficult than I initially anticipated! That said, if you have any specific queries, do feel free to get in touch and I’ll try to help as best I can.

To start with, a panel extension is basically a website comprising HTML, CSS, and JavaScript. This is all loaded into an iframe with some restrictions, and you can choose to make use of the Extension Backend Service (EBS), which is essentially a NodeJS instance that sends notifications via PubSub. It is recommended that you download the developer rig that Twitch provides, which allows you to run and test your extension on localhost so that when you load your profile page on Twitch.tv you can see your extension loaded in. The setup process for the developer rig is fairly arduous, especially on macOS, and I found myself having to run a fair few undocumented npm install commands. I also found a number of differences between the locally hosted version of the extension and the version run on Twitch’s servers, most noticeably that the local version allows the external loading of assets whereas the hosted version does not1.

In terms of coding, as you are essentially just writing HTML, there isn’t much to be aware of when writing an extension. A panel is 300px high by default, but you can set a global height for the panel within the Twitch settings2 and make use of vertical scrolling if you need to show more content. To configure your extension, you supply another HTML file which is loaded whenever the configure button is pressed, but again this is just loaded within an iframe.

As I haven’t really used JavaScript since 2009, I decided to forego the EBS in favour of a PHP backend that I could control. Thankfully this is fairly easy as you can get a unique identifier for the channel that is being viewed via the window.Twitch.ext.onAuthorized(auth) callback. With this, I was able to use a simple AJAX POST request to send the data in my config page to my PHP backend along with the ID of the currently authorized channel. When a panel is loaded, an AJAX GET request with the same channel ID is used to load a JSON response of the data stored in my database. The PHP approach was also useful because I could plug in my screen-scraping library to pull out details of the drinks I was enjoying from Vivino, Distiller, and Untappd. Whilst my initial version provided a link back to these pages, I found that each URL needed to be whitelisted on the Twitch extension otherwise it wouldn’t work. As I would have liked to let people link to any website, I ultimately decided to drop the feature as it wouldn’t be feasible to maintain a whitelist that would please everyone.

Once the basic panel was built, I was able to test it on Twitch’s own servers by performing an asset upload. With this, you basically zip up the directory containing your HTML, JS, and CSS code and upload it to their servers, at which point you can use that code as your panel on your live Twitch page. Crucially, this is only seen by you and accounts you whitelist. As I’d set up the developer rig on my laptop and didn’t want to set it all up again on my Mac Pro, I ended up tweaking some of my extensions by editing the files locally and uploading them directly in this way to test them - it took slightly longer but that way I knew what I was looking at was how the extension would look to others.

With the extension looking good, it was then time to submit it to Twitch for approval. I’ve been through this sort of process with Apple hundreds of times so I’m no stranger to a review process, but this one was slightly more arduous. To start with, there are a lot of restrictions on what you can and cannot do: you aren’t allowed any JavaScript console logging, you aren’t allowed to load HTML into the DOM directly from AJAX (i.e. loading HTML from a remote location), and you can’t obfuscate your code nor use a double-click as an input. Your code is inspected by Twitch as part of the review and so everything has to be provided locally except for the Twitch Extension Helper, which must be loaded from a specific location. At first I found the idea of an actual code review slightly strange as even Apple doesn’t do this3, but it makes sense given the nature of JavaScript embedded in an iframe.

After a couple of days, the extension was approved and I was given the option to release it publicly, at which point it shows up in the extension directory with the screenshots you provide. As this first extension was relatively easy, I decided to produce a number of “wishlist” panel extensions, initially for Steam, Humble, and GOG. These worked in much the same way, using a PHP backend to send the URL of the user’s wishlist; my server would then screen-scrape these pages and store the games in my database where the panel extension could request them in order to load the list. As each extension was for a specific store, I used the URL whitelisting feature to whitelist each domain so you could click on a game to go to the relevant store page.

Whilst the extensions were relatively quick and easy to write, the approval process took several weeks as a bug meant they got stuck in limbo for violating one of the rules, namely that “extensions may not transact or encourage the transacting of monetary exchange in relation to any non-Twitch/Amazon commerce instruments”. In essence, an extension could not link to a Steam store page as it is a competitor to Twitch/Amazon. I find this slightly silly, especially as a user can happily write up a list of links in a text-based panel without issue, but those are the rules and the Twitch Dev team were incredibly helpful in resolving the issue, reaching out to me via Twitter DM. I was able to re-submit the three extensions provided they didn’t link to the external storefronts; this seemed like a reasonable compromise and, once re-submitted, they were approved within minutes.

The only other thing to mention is the process of updating an extension. I foolishly didn’t test my original “Currently Drinking” extension with the Twitch “dark theme”4 and received a complaint from a user. Updating an extension is thankfully very easy, requiring only that you bump the version number and upload a new zip file. This goes through the review process again but was approved in under an hour for me. As far as I can tell, there are no user-facing “What’s New” notes nor any way for a user to see that an extension has been updated; it just happens automatically. It would be nicer if users could see when an update has occurred and what has changed, but I guess that will be something for the future.

Overall the process of creating a Twitch extension was slightly longer than I would have liked but now that everything is set up and I’ve been through it a few times I think it’ll be very easy to add new ones in future. I’m tempted to try my hand at a video overlay extension but haven’t yet found a compelling enough reason to do so. For now though it has been a pleasant diversion from building iOS apps and so far the extensions have been installed by far more users than I expected.

If you’d like to give them a try, you can find some direct links on my Twitch Extensions page. You can also follow me on Twitch if you’d like to see some of my extensions in action!

  1. To be fair, the guidelines do state that you should “include all JavaScript and CSS files in the extension’s uploaded assets” but this is not enforced by the developer rig so I spent a lot of time wondering why jQuery worked locally but not on the Twitch site. ↩︎

  2. The default is 300px but you can choose anything from 300px to 500px. Unfortunately it isn’t possible for an extension to say at runtime how high it wants to be - it is something that is set globally in advance. ↩︎

  3. Aside from the automated checks when compiling in Xcode to ensure you aren’t using private frameworks. ↩︎

  4. It’s a bit of a pain to check whether you are in dark mode or not. You need to use the window.Twitch.ext.onContext(context) callback and then check context.theme. I do this and then append or remove a .dark class on my <body> to make it a bit simpler to work with. ↩︎

Scalable bulleted lists with UILabel or UITextView

I’ve recently been implementing auto-renewable subscriptions for a client and came across the need to create a bulleted list of notes1. There are numerous tutorials that show how you can do this but all of the ones I found had a flaw of some kind, be it using fixed values for bullet widths or not taking variable font sizes from Dynamic Type into consideration.

Here, then, is a quick primer on how you can add correctly aligned bullets to a list, be it in a UILabel or a UITextView, and have it scale correctly depending on the user’s text size preferences.

import UIKit

class ViewController: UIViewController {

    @IBOutlet weak var label: UILabel!
    
    override func viewDidLoad() {
        super.viewDidLoad()
        NotificationCenter.default.addObserver(self, selector: #selector(updateUI), name: .UIContentSizeCategoryDidChange, object: nil)
        updateUI()
    }

    @objc func updateUI() {

        let bullet = "•  "
        
        var strings = [String]()
        strings.append("Payment will be charged to your iTunes account at confirmation of purchase.")
        strings.append("Your subscription will automatically renew unless auto-renew is turned off at least 24-hours before the end of the current subscription period.")
        strings.append("Your account will be charged for renewal within 24-hours prior to the end of the current subscription period.")
        strings.append("Automatic renewals will cost the same price you were originally charged for the subscription.")
        strings.append("You can manage your subscriptions and turn off auto-renewal by going to your Account Settings on the App Store after purchase.")
        strings.append("Read our terms of service and privacy policy for more information.")
        strings = strings.map { return bullet + $0 }
        
        var attributes = [NSAttributedStringKey: Any]()
        attributes[.font] = UIFont.preferredFont(forTextStyle: .body)
        attributes[.foregroundColor] = UIColor.darkGray
        
        let paragraphStyle = NSMutableParagraphStyle()
        paragraphStyle.headIndent = (bullet as NSString).size(withAttributes: attributes).width
        attributes[.paragraphStyle] = paragraphStyle

        let string = strings.joined(separator: "\n\n")
        label.attributedText = NSAttributedString(string: string, attributes: attributes)
    }

}

Bulleted List for UILabel and UITextView

The first thing to determine is the bullet you want to use. I like to have a • (press option + 8) with two spaces afterwards. We store this in a variable and then build a String array with which we’ll populate each line of our list2. These are then mapped to prepend the bullet we chose to each string.

let bullet = "•  "        
var strings = [String]()
strings.append("First line of your list")
strings.append("Second line of your list")
strings.append("etc")
strings = strings.map { return bullet + $0 }

Next we create the base attributes of our label or text view, such as the font size and colour. As we want the text to scale depending on the user’s own text preferences, we use Dynamic Type via preferredFont(forTextStyle: .body), although you can obviously use any font. The bulk of the heavy lifting is done by an NSParagraphStyle attribute called headIndent which adds a fixed amount of padding to all but the first line of a paragraph. We can determine the size that this indent should be by casting our bullet as an NSString and then providing our previously created attributes to the size method. This gives us the width of the bullet and any spacing you added afterwards in the exact font and size you have chosen.

var attributes = [NSAttributedStringKey: Any]()
attributes[.font] = UIFont.preferredFont(forTextStyle: .body)
attributes[.foregroundColor] = UIColor.darkGray

let paragraphStyle = NSMutableParagraphStyle()
paragraphStyle.headIndent = (bullet as NSString).size(withAttributes: attributes).width
attributes[.paragraphStyle] = paragraphStyle

Finally we join our strings with line breaks (strings.joined(separator: "\n\n")) and create an attributed string with the attributes including the new paragraph style.

This all works but there are two more things you’ll need to do to support dynamic font scaling. First of all you’ll want to ensure that the ‘Automatically Adjusts Font’ checkbox is selected in Interface Builder for your label or text view3. Secondly, you’ll want to be notified when the content size changes (i.e. when the user goes to the Settings app and increases or decreases the text size) by subscribing to the UIContentSizeCategoryDidChange notification and regenerating your label. I prefer to do this in a method named updateUI but your personal preference may vary.

The nice thing about this setup is that it is entirely fluid, doesn’t require any 3rd party dependencies, and can be used with any mixture of bullet types, be they a single character, a word, or even emoji:

Bulleted List with custom bullets for UILabel and UITextView

I’ve uploaded a basic project to GitHub to demonstrate this code in action. Hopefully this article will serve as a reminder that you don’t need to import 3rd party libraries to achieve basic text formatting and that you should always be wary of text code that doesn’t take font scaling into account.

  1. Sourced from the excellent tutorial by David Barnard. ↩︎

  2. Don’t forget to use NSLocalizedString - I didn’t bother for the sake of brevity in this article. ↩︎

  3. Alternatively you can use the adjustsFontForContentSizeCategory boolean on UILabel and UITextView. ↩︎

Solcaster

I’m pleased to announce the release of a new client app I’ve been working on for the last few weeks: Solcaster, the reverse weather app.

I worked as the only iOS developer on Solcaster for Reidefine LLC, working remotely from the UK. The app is an interesting concept in that it helps you find the weather you are looking for rather than telling you what the weather is like right now. When you open the app, you choose the type of weather you are looking for, the area you are searching from, and the maximum amount of travelling you are willing to do. It then runs this through a sophisticated algorithm of my own design to show you where you can find that weather. This is useful for a wide range of people, be they searching for sun, snow, storms, wind, or even rain1!

Once you’ve found an area you are interested in, Solcaster provides even more information including the previous and future weather2, interesting hikes and climbs, and any events that may be in the local area. This is all powered by a number of APIs with an intelligent caching layer that makes requests as fast and responsive as possible. In addition, there are deeply customisable settings including temperature thresholds, wind speeds, snow accumulation, and of course the option to choose between Celsius and Fahrenheit.

The app is written entirely in Swift 4.1 and is optimised for all iPhone sizes including the relatively new iPhone X. A number of interesting Apple technologies are used, including iCloud sync which keeps your settings the same across all of your devices as well as syncing your favourited trips and recent searches. Whilst the app is free to download, it is limited to searching within a 1-hour radius and does contain ads from AdMob. For a single in-app purchase of $1.99, the ads are removed and the radius can be extended up to 8 hours. This unlock is also synced automatically through iCloud so that you never need to hit the “restore purchases” button that is ubiquitous with in-app purchases3.
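That post-purchase sync can be built in a number of ways; here is a hedged sketch of one approach using NSUbiquitousKeyValueStore, where the key and notification names are made up for illustration and Solcaster’s actual implementation may differ:

import Foundation

// One way to sync a premium unlock across devices via iCloud - a sketch only;
// the key and notification names here are invented for illustration.
final class PremiumUnlock {
    static let shared = PremiumUnlock()
    private let store = NSUbiquitousKeyValueStore.default
    private let key = "com.example.solcaster.premiumUnlocked"

    var isUnlocked: Bool { return store.bool(forKey: key) }

    init() {
        // Pick up changes pushed from the user's other devices.
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(storeDidChange),
                                               name: NSUbiquitousKeyValueStore.didChangeExternallyNotification,
                                               object: store)
        store.synchronize()
    }

    /// Call after a successful StoreKit purchase on any device.
    func recordPurchase() {
        store.set(true, forKey: key)
        store.synchronize()
    }

    @objc private func storeDidChange(_ note: Notification) {
        // e.g. hide the ads and lift the search radius cap if isUnlocked is now true.
        NotificationCenter.default.post(name: Notification.Name("PremiumUnlockDidChange"), object: nil)
    }
}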

I really enjoyed working with Reidefine LLC on this project and building something a little bit different to other weather apps on the market. You can download Solcaster on the App Store for free and learn more about it at solcaster.com.

  1. I’m currently in the middle of a heatwave in Leicester. Unfortunately my nearest rain is 300 miles away in France… tempting! ↩︎

  2. Why the previous weather? If you’re a snowboarder the amount of snow that fell yesterday is probably more important to you than the weather today. ↩︎

  3. It is mandatory for iOS apps to provide a “restore purchases” button when using in-app purchases so that functionality can be restored should you delete and reinstall the app, restore from a backup, or update to a new device. My thinking was that it was far better to sync this information through the secure iCloud sync such that this unlocking is seamless. You can try it by downloading and upgrading Solcaster on one device then opening it on another; within a second or two of opening the app you’ll notice the ads pop away without the need to tap a button and sign in to your Apple account. Wonderful! ↩︎

"iPhone Only" apps on the iPad

Since the launch of the iPad in 2010, any app that runs on the iPhone will also run on the iPad in a scaled format. This is normally1 in the 3:2 aspect ratio which the original iPhone had all the way up until the iPhone 5, when it changed to the 16:9 ratio2. Many developers forget this fact and happily build their iPhone apps unaware that the App Store Review team will test them on an iPad and reject them if something doesn’t work. This 3:2 aspect ratio has been the bane of my iPhone development life for the past couple of years: even though iOS 10 dropped support for the iPhone 4S, you still had to make your iPhone apps run at 3:2 so they could work on the iPad. This is particularly frustrating when designers provide designs at the 16:9 size and you have to find a way to make fixed-size assets work at the smaller height without resorting to scrollbars…

It looks like this is set to change with iOS 12, as beta 2 now runs iPhone apps in their 16:9 ratio on all iPads rather than the 3:2 ratio. Whilst there aren’t any App Store guideline changes to go along with this, it follows that if you build an iOS 12 app you no longer need to support the 3:2 screen size.

Here is an example of one of my upcoming iPhone only apps running on a 9.7” iPad with iOS 11 and one with iOS 12 beta 2:

As ever, things could still change and Apple may reverse this decision in later betas but it seems unlikely bearing in mind this has been standard practice on the 12.9” iPad Pro since launch.

This is still not a perfect solution. I am desperate for Apple to finally stop the ridiculous notion that iPhone apps need to run on the iPad. The App Store actively fights against this edge case by requiring you to choose “iPhone Only”3 when searching for iPhone apps, so why not allow developers to choose whether their app should be able to run on an iPad or not?

The thing that irks me most about this is that iPhone apps on the iPad are not a good experience. Phil Schiller even said as much at the launch of the iPad Mini:

We’ve learned […] that customers love the ones written for iPad, designed for that screen. What does the other platform have? They have phone applications stretched up; not tablet applications.

He even showed the Android phone version of Yelp scaled up on an Android tablet versus the native iPad app, saying:

You get a great experience on iPad mini, you get a scaled up phone experience on that other product. It’s a big difference.

He continued with a number of other apps, the message being clear: scaled-up phone apps on a tablet are worse than native tablet apps. I couldn’t agree more, Phil.

For now, the change from the 3:2 ratio to 16:9 is a big win for developers and will avoid a lot of design problems. I can only hope that later Xcode builds will finally allow an option for truly “iPhone Only” builds.

  1. The 12.9” iPad Pro was an outlier that always ran iPhone apps at the 16:9 ratio for some reason. It doesn’t really make sense that it got this behaviour first as all iPads have the same 4:3 screen ratio anyway… ↩︎

  2. Unfortunately 16:9 wasn’t the end of it and the iPhone X has a new aspect ratio of 39:18 (or 19.5:9 if you prefer). ↩︎

  3. Which is a ridiculous name. Why not “iPad Only” and “iPad and iPhone”? If they were truly “iPhone Only” they shouldn’t even show up in the same way that Apple TV apps don’t appear on an iPad! ↩︎

Custom Intents with SiriKit on iOS 12

Yesterday was the start of WWDC 2018 and one new feature in iOS 12 caught my attention amongst the many that were demonstrated:

Hey Siri, I lost my keys

The demo was part of the new Shortcuts system and showed that apps could reveal some functionality to Siri, in this case the Tile app being able to search for your keys. Once the keynote was over and the documentation went live, I had a dig through and was intrigued to find a new “custom intent” within SiriKit exposed as INObject. This is paired with a full demo app in the form of Soup Chef that shows how you can create these custom intents and use them as shortcuts for Siri. The most interesting thing about this is the following concept from the Soup Chef overview:

These types define the schema that Siri uses to identify requests the user makes; for example, “Order tomato soup with cheese.” The parameter combination for this example is: soup and options.

When I dug into the code, I found this new Intents.intentdefinition file with which you can create custom intents complete with parameter binding.

Custom Intent creation with SiriKit

This sure looks like the much anticipated ability to write your own Siri code!

What this isn’t

It turns out that isn’t the case. The new custom intents are for “Siri the all-seeing widget assistant” not for “Siri the thing you control with your voice”. These custom intents are designed to be created for very specific use cases and then exposed as shortcuts so that you can access them quickly from your lock screen, add them to a workflow, or activate them with a custom voice command that the user creates. Despite the schema being present and the documentation alluding to voice control, you cannot create your own custom commands such as “Order tomato soup with cheese”.

By way of an example, I have my own app that I use to update my gaming time on my ShyGuys gaming website and I was hoping to be able to use this system to say “Hey Siri, add 0.8 hours of gaming time to Skyrim on the Switch”. In parameter-based terms this would be “add [hours:decimal] of gaming time to [title:string] on the [console:enum]”. Unfortunately this is not yet possible, although the system shows promise for that future.

Before I go into how this system works and the intended use case, there is one extra thing in SiriKit that will please many developers: the Media Intent Domain, which effectively allows you to use Siri to control media apps such as Spotify, Audible, or Overcast once their developers add the necessary updates.

Custom Intents and Shortcuts

If you are unable to write custom Siri scripts, what then is the point of the new custom intent? It is designed to give you a quick shortcut to commonly used tasks.

In many ways, the Tile app is the perfect demo as it really only does one thing which is to find a specific object. The developers of Tile could create a custom intent of the sort “Find [tile:custom]” and when the app first launches on iOS 12 they can donate an INIntent for every Tile that you own; this basically registers the shortcut with the system so you are telling Siri1 that there is a “Find Keys” intent, “Find Remote” intent, “Find Dog” intent, etc. These intents are exposed to the user as Shortcuts both within the Settings app and in the new Shortcuts app2. Every time you use the Tile app to find something, the specific intent for that device can be re-donated to the system which helps Siri learn and enables it to prompt you when you may need to do this. For example, if you open your Tile app every morning at 8am and tap on your “Keys” Tile to find it, then that “Find Keys” intent is donated to the system helping Siri realise it should probably show you that intent just before 8am. How does it show an intent? By displaying it as a Shortcut on your lock screen, notification centre, Apple Watch, or within the Shortcuts app where it can then be paired with other Shortcuts from other apps (i.e. you could have an “I’m running late” workflow which sends an iMessage to your boss, activates your find keys intent, loads up your route to work in Maps, and opens the garage ready for you to jump into your car).

The piece that makes this slightly more confusing is that you can add a custom Siri voice command to a Shortcut. When Craig demonstrated saying “Hey Siri, I lost my keys”, that is really just a voice command on the “Find Keys” custom intent and is highly specific to that particular Tile; you’d have to record a new one if you wanted to find your TV Remote Tile. These Shortcut Phrases can be created either from within the Settings app or an app can present a view controller3 (complete with a suggested command text) that lets the user record their custom snippet.
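For reference, presenting that view controller only takes a few lines; here is a minimal sketch using a hypothetical “find my keys” NSUserActivity rather than Tile’s real intent definitions:

import UIKit
import Intents
import IntentsUI

// A minimal sketch of the "Add to Siri" flow; the activity type and phrases are
// hypothetical and not taken from any real app.
class AddToSiriViewController: UIViewController, INUIAddVoiceShortcutViewControllerDelegate {

    @objc func addToSiriTapped() {
        let activity = NSUserActivity(activityType: "com.example.findkeys")
        activity.title = "Find Keys"
        activity.isEligibleForPrediction = true
        activity.suggestedInvocationPhrase = "I lost my keys"

        // Presents Apple's record-a-phrase UI for this shortcut.
        let controller = INUIAddVoiceShortcutViewController(shortcut: INShortcut(userActivity: activity))
        controller.delegate = self
        present(controller, animated: true)
    }

    func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController,
                                        didFinishWith voiceShortcut: INVoiceShortcut?,
                                        error: Error?) {
        controller.dismiss(animated: true)
    }

    func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
        controller.dismiss(animated: true)
    }
}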

When a Shortcut is invoked (either by a Shortcut Phrase or by tapping on a Shortcut) it can either launch your app in the foreground or fire up your INExtension, which allows you to return a custom UI directly within Siri. Both have their uses although again they are fairly specific.

By way of an example, let’s say I order a Chinese takeaway every Friday night4 via the Just Eat app. When I place my order, the app could create two custom intents:

  1. A generic intent for the takeaway venue: "Order from [name:string]"
  2. A specific intent for my meal: "Order [menuitems:array] from [name:string]"

The first one could launch the Just Eat app and take me directly to the menu for the takeaway I order from so I can peruse and then place my order. The second one would instead be able to place my regular order without opening the app and even provide custom UI to perform an Apple Pay transaction within Siri.

This is super powerful when combined with other Shortcuts as I could then record a Shortcut Phrase “Hey Siri, it’s Friday Friday got to get down on Friday”5 which would turn on my living room lights, open up the Netflix app on my Apple TV6, lock the front door, and place my Chinese order.

The fact that these Shortcuts can be created silently by the app and then donated to Siri so that it can suggest them to you at certain points is also super interesting. Siri already knows to show the Just Eat app in my Siri App Suggestions on a Friday night, so having it automatically prompt me to place a repeat order in the future will cut out some time. Once lots of apps add support for this it will be cool, and perhaps a little scary, to see what regular habits we have that we didn’t even realise we had.

(Update: It turns out that apps have been creating Shortcuts since as far back as iOS 8. If you make use of NSUserActivity then these are donated automatically when calling becomeCurrent() or you can use the donate(completion:) method of INInteraction since iOS 10 to donate any of the standard SiriKit interactions such as starting a workout, initiating a voip call, or booking a ride. Any app that has done this, regardless of whether it has been updated for iOS 12, will show in the Shortcuts system.)
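In practice the NSUserActivity route is only a few lines; a quick sketch of the donation described above, with made-up identifiers and phrases:

import UIKit
import Intents

// A quick sketch of donating a shortcut via NSUserActivity; the activity type,
// title, and phrase are invented for illustration. (Standard SiriKit intents
// would instead be wrapped in an INInteraction and donated with donate(completion:).)
func donateReorderActivity(on viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.justeat.reorder")
    activity.title = "Order from Peking House"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true             // iOS 12: lets Siri suggest it as a Shortcut
    activity.suggestedInvocationPhrase = "It's Friday"  // pre-filled when recording a Shortcut Phrase
    viewController.userActivity = activity              // keep a strong reference to the activity
    activity.becomeCurrent()                            // donates it to the system
}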

To be clear, this system is not yet at the same stage as Alexa or Google Home. You can’t say “Order half crispy aromatic duck and some egg fried rice from Peking House” without first having placed that order and assigned a Shortcut Phrase to it. However, the jump to that system suddenly doesn’t seem so far away. Siri is already getting all of the data it needs thanks to the intents parameter builder and I can’t shake the feeling that these custom Shortcut Phrases are just going to be used to train Siri on lots of different words over the coming year. There will need to be some clever work to avoid collisions7 but on the whole I’m excited to see where this heads next.

All of the topics above are due to be covered at WWDC today and tomorrow at the following sessions:

  • Tuesday 5pm: “Introduction to Siri Shortcuts” [link]
  • Wednesday 10am: “Building for Voice with Siri Shortcuts” [link]
  • Wednesday 11am: “Siri Shortcuts on the Siri Watch Face” [link]

Also, don’t forget to check out the Soup Chef demo app.

  1. Siri the widget master, not the voice. ↩︎

  2. Which is not available in iOS 12 Beta 1. ↩︎

  3. INUIAddVoiceShortcutViewController ↩︎

  4. And maybe Tuesday night as well. Sometimes. ↩︎

  5. It’s a classic. ↩︎

  6. I can dream - this might be possible with the Shortcuts app but we won’t know until it appears in a later beta seed! ↩︎

  7. Even saying “Hey Siri, use Just Eat to order half crispy aromatic duck and some egg fried rice from Peking House” isn’t great as an app name is not unique (only the app name on the App Store is and even then it can’t distinguish between “Just Eat”, “JustEat”, and “Just Eat!”). It’s a solvable problem but it does add an extra layer of difficulty. ↩︎
