Ben Dodson

Freelance iOS, macOS, Apple Watch, and Apple TV Developer

Custom Intents with SiriKit on iOS 12

Yesterday was the start of WWDC 2018 and one new feature in iOS 12 caught my attention amongst the many that were demonstrated:

Hey Siri, I lost my keys

The demo was part of the new Shortcuts system and showed that apps could reveal some functionality to Siri, in this case the Tile app being able to search for your keys. Once the keynote was over and the documentation went live, I had a dig through and was intrigued to find a new “custom intent” within SiriKit exposed as INObject. This is paired with a full demo app in the form of Soup Chef that shows how you can create these custom intents and use them as shortcuts for Siri. The most interesting thing about this is the following concept from the Soup Chef overview:

These types define the schema that Siri uses to identify requests the user makes; for example, “Order tomato soup with cheese.” The parameter combination for this example is: soup and options.

When I dug into the code, I found this new Intents.intentdefinition file with which you can create custom intents complete with parameter binding.

Custom Intent creation with SiriKit

This sure looks like the much anticipated ability to write your own Siri code!

What this isn’t

It turns out that isn’t the case. The new custom intents are for “Siri the all-seeing widget assistant”, not for “Siri the thing you control with your voice”. These custom intents are designed for very specific use cases and are then exposed as shortcuts so that you can access them quickly from your lock screen, add them to a workflow, or activate them with a custom voice command that you create. Despite the schema being present and the documentation alluding to voice control, you cannot create your own custom commands such as “Order tomato soup with cheese”.

By way of an example, I have my own app that I use to update my gaming time on my ShyGuys gaming website and I was hoping to be able to use this system to say “Hey Siri, add 0.8 hours of gaming time to Skyrim on the Switch”. In parameter-based terms this would be “add [hours:decimal] of gaming time to [title:string] on the [console:enum]”. Unfortunately this is not yet possible, although the system shows promise for such a future.

Before I go into how this system works and the intended use case, there is one extra thing in SiriKit that will please many developers: the Media Intent Domain, which effectively allows you to use Siri to control media apps such as Spotify, Audible, or Overcast once their developers add the necessary updates.

Custom Intents and Shortcuts

If you are unable to write custom Siri scripts, what then is the point of the new custom intent? It is designed to give you a quick shortcut to commonly used tasks.

In many ways, the Tile app is the perfect demo as it really only does one thing, which is to find a specific object. The developers of Tile could create a custom intent of the sort “Find [tile:custom]” and when the app first launches on iOS 12 they can donate an INIntent for every Tile that you own; this basically registers the shortcut with the system so you are telling Siri1 that there is a “Find Keys” intent, “Find Remote” intent, “Find Dog” intent, etc. These intents are exposed to the user as Shortcuts both within the Settings app and in the new Shortcuts app2. Every time you use the Tile app to find something, the specific intent for that device can be re-donated to the system which helps Siri learn and enables it to prompt you when you may need to do this. For example, if you open your Tile app every morning at 8am and tap on your “Keys” Tile to find it, then that “Find Keys” intent is donated to the system helping Siri realise it should probably show you that intent just before 8am. How does it show an intent? By displaying it as a Shortcut on your lock screen, notification centre, Apple Watch, or within the Shortcuts app where it can then be paired with other Shortcuts from other apps (e.g. you could have an “I’m running late” workflow which sends an iMessage to your boss, activates your find keys intent, loads up your route to work in Maps, and opens the garage ready for you to jump into your car).
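
To make this concrete, a donation is only a few lines of code. The sketch below is not Tile’s actual code: FindItemIntent and its itemName parameter are hypothetical stand-ins for whatever Xcode generates from your Intents.intentdefinition file, but the donation itself goes through INInteraction:

import Intents

// FindItemIntent is a hypothetical class generated from an
// Intents.intentdefinition file
let intent = FindItemIntent()
intent.itemName = "Keys"
intent.suggestedInvocationPhrase = "I lost my keys"

// Donating the interaction registers the shortcut with the system
// so Siri can learn when to suggest it
INInteraction(intent: intent, response: nil).donate { error in
    if let error = error {
        NSLog("Donation failed: \(error.localizedDescription)")
    }
}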

The piece that makes this slightly more confusing is that you can add a custom Siri voice command to a Shortcut. When Craig demonstrated saying “Hey Siri, I lost my keys”, that was really just a voice command on the “Find Keys” custom intent and is highly specific to that particular Tile; you’d have to record a new one if you wanted to find your TV Remote Tile. These Shortcut Phrases can be created either from within the Settings app or from an app itself, which can present a view controller3 (complete with a suggested command text) that lets the user record their custom phrase.
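
Presenting that recording UI from your own app is similarly compact. A sketch, reusing the hypothetical FindItemIntent from above and assuming the presenting view controller conforms to INUIAddVoiceShortcutViewControllerDelegate:

import IntentsUI

// Wrap the intent in an INShortcut and present the Siri recording
// screen, complete with a suggested invocation phrase
if let shortcut = INShortcut(intent: intent) {
    let addVoiceShortcutViewController = INUIAddVoiceShortcutViewController(shortcut: shortcut)
    addVoiceShortcutViewController.delegate = self
    present(addVoiceShortcutViewController, animated: true, completion: nil)
}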

When a Shortcut is invoked (either by a Shortcut Phrase or by tapping on a Shortcut) it can either launch your app in the foreground or fire up your INExtension, which allows you to return a custom UI directly within Siri. Both have their uses although again they are fairly specific.
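
For the background route, an Intents extension returns a handler for the intent and calls a completion with a response that Siri can display. Here is a rough sketch using the OrderSoupIntent types that Xcode generates for the Soup Chef sample; the response codes come from the intent definition file:

import Intents

class IntentHandler: INExtension {

    // Return an object capable of handling the given intent; for a
    // single custom intent the extension itself can do the work
    override func handler(for intent: INIntent) -> Any {
        return self
    }

}

extension IntentHandler: OrderSoupIntentHandling {

    func handle(intent: OrderSoupIntent, completion: @escaping (OrderSoupIntentResponse) -> Void) {
        // place the order here, then tell Siri how it went
        completion(OrderSoupIntentResponse(code: .success, userActivity: nil))
    }

}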

By way of an example, let’s say I order a Chinese takeaway every Friday night4 via the Just Eat app. When I place my order, the app could create two custom intents:

  1. A generic intent for the takeaway venue: "Order from [name:string]"
  2. A specific intent for my meal: "Order [menuitems:array] from [name:string]"

The first one could launch the Just Eat app and take me directly to the menu for the takeaway I order from so I can peruse and then place my order. The second one would instead be able to place my regular order without opening the app and even provide custom UI to perform an Apple Pay transaction within Siri.

This is super powerful when combined with other Shortcuts as I could then record a Shortcut Phrase “Hey Siri, it’s Friday Friday got to get down on Friday”5 which would turn on my living room lights, open up the Netflix app on my Apple TV6, lock the front door, and place my Chinese order.

The fact that these Shortcuts can be created silently by the app and then donated to Siri so that it can suggest them to you at certain points is also super interesting. Siri already knows to show the Just Eat app in my Siri App Suggestions on a Friday night so having it automatically prompt me to place a repeat order in the future will cut out some time. Once lots of apps add support for this it will be cool and perhaps a little scary to see what regular habits we have that we didn’t even realise we had.

(Update: It turns out that apps have been able to create Shortcuts since as far back as iOS 8. If you make use of NSUserActivity then these are donated automatically when calling becomeCurrent(), or you can use the donate(completion:) method of INInteraction since iOS 10 to donate any of the standard SiriKit interactions such as starting a workout, initiating a VoIP call, or booking a ride. Any app that has done this, regardless of whether it has been updated for iOS 12, will show in the Shortcuts system.)
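
For reference, the NSUserActivity route looks something like this; the activity type string is a placeholder and isEligibleForPrediction is the new iOS 12 flag that opts an activity into Shortcut suggestions:

// Somewhere in a view controller; the activity type is a placeholder
// and should also be listed under NSUserActivityTypes in the Info.plist
let activity = NSUserActivity(activityType: "uk.co.bendodson.add-gaming-time")
activity.title = "Add Gaming Time"
activity.isEligibleForSearch = true
if #available(iOS 12.0, *) {
    activity.isEligibleForPrediction = true
}

// Assigning it to the view controller and making it current donates it
userActivity = activity
activity.becomeCurrent()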

To be clear, this system is not yet at the same stage as Alexa or Google Home. You can’t say “Order half crispy aromatic duck and some egg fried rice from Peking House” without first having placed that order and assigned a Shortcut Phrase to it. However, the jump to that system suddenly doesn’t seem so far. Siri is already getting all of the data it needs thanks to the intents parameter builder and I can’t shake the feeling that these custom Shortcut Phrases are just going to be used to train Siri on lots of different words over the coming year. There is going to need to be some clever work to avoid collisions7 but on the whole I’m excited to see where this heads next.

All of the topics above are due to be covered at WWDC today and tomorrow at the following sessions:

  • Tuesday 5pm: “Introduction to Siri Shortcuts” [link]
  • Wednesday 10am: “Building for Voice with Siri Shortcuts” [link]
  • Wednesday 11am: “Siri Shortcuts on the Siri Watch Face” [link]

Also, don’t forget to check out the Soup Chef demo app.

  1. Siri the widget master, not the voice. ↩︎

  2. Which is not available in iOS 12 Beta 1. ↩︎

  3. INUIAddVoiceShortcutViewController ↩︎

  4. And maybe Tuesday night as well. Sometimes. ↩︎

  5. It’s a classic. ↩︎

  6. I can dream - this might be possible with the Shortcuts app but we won’t know until it appears in a later beta seed! ↩︎

  7. Even saying “Hey Siri, use Just Eat to order half crispy aromatic duck and some egg fried rice from Peking House” isn’t great as an app’s name is not unique (only the name on the App Store is, and even then Siri can’t distinguish between “Just Eat”, “JustEat”, and “Just Eat!”). It’s a solvable problem but it does add an extra layer of difficulty. ↩︎

Developers who work from spectacular locations

I was recently asked some questions about how I work for an article about developers who work from spectacular locations. You can read the full piece over at InfoWorld but I’ve put their full questions and my answers below:

What work do you do?

I am a freelance app developer working on apps for iPhone, iPad, Apple Watch, and Apple TV.

When/how did you go remote?

I was a Development Manager at a London-based digital agency back in 2009 and when I moved to another agency it wasn’t a good fit (far too many meetings, not enough actual work). I made the decision to go freelance as a PHP developer, which meant working from home and essentially forced me into remote work. I was intrigued by the recently launched iPhone and so started to develop for that - thanks to a lucky appearance on The Gadget Show, I’ve been able to do that ever since.

How did you end up in the location you are in?

I’ve always worked from home as a freelancer so the location I’m in now is really just because I live here (which was due to meeting my wife). The greatest joy of remote working is that you can literally work from anywhere so whilst I could theoretically work in a hammock on a beach I’m much happier sat in bed with a cup of tea and my laptop!

What did it take to go remote? (Buy a laptop? Outfit an office? Argue with a boss/team? Something else?)

The only thing I needed was the confidence to quit my job and base all of my income on freelancing. That wasn’t an easy decision and it was fairly difficult for the first few months but it all worked out in the end.

Have you compared the economics of your remote situation to your previous one? How do they compare?

Initially I was earning a lot less money due to having very few clients, and I was paid irregularly compared to a monthly salary. This changed fairly quickly though and now there is no question that financially it was the right decision to make. Even more important is the economics of happiness; I am far happier in my life as a freelance remote worker, able to choose when and where I work, than I ever was working in an office. So much so that I refuse to work as a contractor in a client’s office, even if only for a few days, as I find I just can’t produce the same quality of work when stuck in an open plan office on a fixed time schedule.

Are there hassles you didn’t expect?

The biggest problems are distractions and motivation. It is very easy, especially in the first few months, to kick back and do very little work as you don’t feel the need to rush. Then, when you run out of money you panic and work ridiculously long hours to try and get some invoices sent out. Freelancers typically have periods they call “feast and famine” or “rollercoaster dips” but they are generally referring to having enough advance work; my problem was always that I introduced those periods myself by putting things off as sometimes I’d wake up and just want to play on the Xbox rather than writing an algorithm for a social feed. Thankfully I have now gotten myself into a comfortable routine (especially now that I don’t live on my own) and so things tend to be smooth sailing but at the start it was very difficult to stay motivated when surrounded by nice distractions!

DrinkCoach+

I’m pleased to announce the release of a new client app, DrinkCoach+, that I worked on over the summer:

I worked on DrinkCoach+ for Orbis Media as a freelance iOS developer. The app was created for the Haringey Advisory Group on Alcohol (HAGA) and is designed to help people keep track of their alcohol intake with measurable goals and to monitor any events caused by drinking such as mood swings, cravings, or accidents. This is actually version 3.0 of the app, the previous versions having been created by a different developer. The old code was neither available nor desirable, so I rebuilt the app from scratch using Swift 4.0 and the latest iOS SDKs to ensure it was future-proofed for whoever works on it next.

The app is optimised for all iPhone and iPad devices (including the iPhone X) and makes use of AutoLayout to scale perfectly across the growing number of screen resolutions and aspect ratios. In terms of functionality, there are a number of interesting technologies used in the app including:

  • Local notifications: the app can notify you at certain times and even locations to help keep you on track with your goals. This is all done locally on the device so that notifications can be triggered without an internet connection or a reliance on push notifications (there’s a sketch of how the location triggering works after this list).
  • Infographic generation: an infographic detailing how many calories and units you’ve consumed along with your total alcohol expenditure can be displayed in the app and shared as an image.
  • Full customisation: the drinks in the app can be customised with specific units, calories, and pricing information so that they accurately reflect your usual tipple. Drinks can also be dragged and dropped so that your regular drinks are easier to access.
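
To give an idea of how the location-based reminders can work, here is a minimal sketch of a locally scheduled, location-triggered notification using the UserNotifications framework; the coordinates, copy, and identifiers are placeholders rather than the app’s real values:

import UserNotifications
import CoreLocation

let content = UNMutableNotificationContent()
content.title = "DrinkCoach+"
content.body = "Remember your goal for tonight!"

// Fire when the user enters a circular region; this is evaluated on
// the device so no internet connection or push notification is needed
let coordinate = CLLocationCoordinate2D(latitude: 51.59, longitude: -0.11)
let region = CLCircularRegion(center: coordinate, radius: 100, identifier: "saved-location")
region.notifyOnEntry = true
region.notifyOnExit = false

let trigger = UNLocationNotificationTrigger(region: region, repeats: false)
let request = UNNotificationRequest(identifier: "goal-reminder", content: content, trigger: trigger)
UNUserNotificationCenter.current().add(request) { error in
    if let error = error {
        NSLog("Unable to schedule notification: \(error.localizedDescription)")
    }
}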

It was really great to work with Orbis Media again and I’m confident that people will love the improvements to this version of the app. You can download DrinkCoach+ on the App Store for free and learn more about it at drinkcoach.org.uk.

Forcing left-to-right text in iOS localizations

Since iOS 6 it has been the case that any localization that utilises a right-to-left language (such as Arabic) will automatically flip your views so that everything scans from right-to-left. Usually this is desirable but there are certain instances where you may want to disable this functionality (such as with a media player that should scrub from left-to-right). I was recently asked by a client to completely disable the right-to-left functionality for all languages as it was causing too many display issues within the app and customers were specifically saying they’d prefer everything to scan left-to-right.

After a bit of searching, the general consensus was that I’d need to manually alter1 all of my horizontal constraints in order to force them to be left-to-right rather than leading-to-trailing (which flips based on localization). In a project with thousands of these constraints this did not seem a suitable course of action and it would have required any future developers on the project to keep this in mind when creating new constraints.

Instead, I came across a property added to UIView in iOS 9 named semanticContentAttribute. This allows you to choose unspecified (the default, which flips based on localization); playback and spatial, which are special cases for media controls and directional controls; and forceLeftToRight and forceRightToLeft, which work as their names would suggest. Thanks to the UIAppearance protocol, disabling the flipping globally is a simple one-liner:

UIView.appearance().semanticContentAttribute = .forceLeftToRight

This not only flips all content back to left-to-right but also ensures that your UINavigationController will animate from left-to-right as well. Of course, you can also use the appearance(whenContainedInInstancesOf:) method to limit this global change to specific parts of your app, or to set certain controls to other directions.
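
For example, if you only wanted to force the direction for views inside a single screen, such as a hypothetical MediaPlayerViewController, you could scope the proxy instead:

// Scope the override to views contained within a specific class;
// MediaPlayerViewController is a placeholder for your own controller
UIView.appearance(whenContainedInInstancesOf: [MediaPlayerViewController.self]).semanticContentAttribute = .forceLeftToRight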

The only other thing I needed to change in my project to get this all working was some paragraph styles for attributed strings. I frequently use NSMutableParagraphStyle to set custom line heights and leave the other properties at their defaults. One of these is alignment, which always renders as left on my devices due to my English language setting, but the default is actually natural which means it renders depending on the language. Searching through my project, finding the few places I’d relied on the default, and setting it explicitly was fairly trivial:

let paragraph = NSMutableParagraphStyle()
paragraph.lineSpacing = 4.0
paragraph.alignment = .left

In total, I only needed to make 8 edits to my project; much easier than trying to edit every horizontal constraint in the storyboards!

  1. If I had gone down that route I likely would have written a build script that would go through every xib and storyboard file and do this for me but I have been burned by manually editing xib files in the past. That’s a young man’s game! ↩︎

MPMediaItem+CanAddToLibrary.swift

Since iOS 9.3 it has been possible to add Apple Music tracks to the media library like so:

let library = MPMediaLibrary.default()
library.addItem(withProductID: id) { (entity, error) in
	if let error = error {
		NSLog("Error: \(error.localizedDescription)")
	}
}

This is powerful as you can use a simple identifier to both play a song and add it to the library, but it is likely that your UI will want to show an add-to-library button similar to the Music app on iOS. To remedy this, I’ve created a simple Swift extension1 for MPMediaItem that tells you whether the currently playing track is already in your library or can still be added:

import UIKit
import MediaPlayer

extension MPMediaItem {

    var canAddToLibrary: Bool {
        // Search the user's library for this track's persistent ID;
        // an empty result means the track can still be added
        let id = MPMediaPropertyPredicate(value: persistentID, forProperty: MPMediaItemPropertyPersistentID)
        let query = MPMediaQuery(filterPredicates: [id])
        let count = query.items?.count ?? 0
        return count == 0
    }

}

Why an extension on MPMediaItem? The only way to know if a track is in your library is to search the user’s library with an MPMediaQuery. Unfortunately you can’t search on MPMediaItemPropertyPlaybackStoreID (as some tracks may not be on Apple Music) so instead you need to use the persistent ID property. If you play an Apple Music track using an identifier, you can then retrieve an MPMediaItem and use that to get the persistent ID for searching the media library. I use this in my own apps2 by listening for the MPMusicPlayerControllerNowPlayingItemDidChange notification and then checking if there is a nowPlayingItem on my MPMusicPlayerController instance; if there is, I check it to find the current status:

let player = MPMusicPlayerController.systemMusicPlayer

override func viewDidLoad() {
    super.viewDidLoad()
    
    player.beginGeneratingPlaybackNotifications()
    NotificationCenter.default.addObserver(self, selector: #selector(playbackStateDidChange), name: .MPMusicPlayerControllerNowPlayingItemDidChange, object: nil)
}


@objc func playbackStateDidChange() {
    
    guard let item = player.nowPlayingItem else {
        return
    }

    // at this point we do not know if the track can be added - any UI for adding should be hidden

    if item.canAddToLibrary {
        // show your "Add to library" button
    } else {
        // show some UI to explain "Already in library"
    }
}

I am fairly sure this is how the Music app works on iOS as you’ll notice when skipping tracks that the UI for the track status is completely hidden until the track is ready to play, at which point either an add button or a tick will appear. One thing that caught me out was listening for MPMusicPlayerControllerPlaybackStateDidChange but this seems to fire inconsistently on both iOS 10 and iOS 11 (unless you are on an iPad and run the Music app in split-screen mode in which case it always works) - checking for MPMusicPlayerControllerNowPlayingItemDidChange works consistently and will still yield an MPMediaItem with which to work.

IMPORTANT: The user will need to have granted permission to access their media library in order for this extension to work. It will crash your app if you do not have NSAppleMusicUsageDescription in your Info.plist (although that is the bare minimum - you should actively check for capabilities before using this as there is no point showing an “Add to library” button if the user doesn’t have that capability!)
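
Checking for that capability is a small amount of StoreKit; a sketch, assuming your real app also handles the denied and restricted states:

import StoreKit

// Ask for permission first, then check whether the account can
// actually add tracks to the cloud music library
SKCloudServiceController.requestAuthorization { status in
    guard status == .authorized else { return }
    SKCloudServiceController().requestCapabilities { capabilities, error in
        DispatchQueue.main.async {
            let canShowAddButton = capabilities.contains(.addToCloudMusicLibrary)
            // hide any "Add to library" UI if canShowAddButton is false
        }
    }
}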

Check out the MPMediaItem+CanAddToLibrary Swift extension on GitHub

  1. I wrote the extension using Swift 4 with Xcode 9 / iOS 11 SDK but it should work just fine in Swift 3 as it isn’t using any new language stuff. ↩︎

  2. Including an exciting iOS 11 only Apple Music app I’m working on. If you’re an Apple Music subscriber with the iOS 11 beta installed (developer or public), contact me for a test version before it launches in September… ↩︎

Flawless

One of the great things about Twitter is the way it can connect you to other developers. For many years, I’ve been chatting with Lisa Dziuba and she got in touch with me last year along with her colleague Ahmed Sulaiman to tell me about a new app they were working on, Flawless, which has now officially launched.

Flawless is an absolute godsend for developers working on pixel-perfect designs. It is a plugin for the iOS simulator that allows you to compare what is rendering on screen with a static image via various comparison modes and an opacity slider. In this way, you can make sure that what you have built matches the design precisely. As the app is a plugin for the simulator itself, you don’t need to add any extra code or frameworks to your project. It also works with static images1 so you don’t need to worry about how your designs are provided, be that by Sketch, Photoshop, or something else2.

I’ve been lucky enough to be both a beta tester and an interviewee, chatting with Lisa and Ahmed about my workflows to help them tweak the app. If you are an app developer who has to work to a fixed design, you should definitely check it out. Flawless is available for macOS at the bargain price of $153 and you can find out more (and get a free trial) on their website at flawlessapp.io.

  1. I believe the original idea was to be integrated with Sketch but after they spoke with me and many other developers who don’t work exclusively with Sketch they decided to pivot to being an iOS simulator plugin instead. ↩︎

  2. I have one client who provides me designs via InDesign and he is also the most likely to create a GitHub issue for a dividing line being 0.5px out of alignment - love ya Niki 👊 ↩︎

  3. This has saved me so many subsequent bug follow ups that $15 is almost criminally low. ↩︎

Great British Bee Count 2017

For the past couple of years, I have worked with Two Thirds Water on the Great British Bee Count iOS app for Friends of the Earth. Today, an updated version of the app has gone live to support this year’s count which runs from 19th May until 30th June 2017:

The main update is a completely new design which fits the more modern “flat” aesthetic popularised by iOS 7 whilst also putting the navigation within easy thumb reach even on plus-sized devices. There are also a number of functional improvements such as an improved bee picker and fact files on each type of bee that give you a lot more information.

As I also needed to migrate the app to Swift 3.1, I took the decision to completely rebuild the app from scratch1 so I could make use of some newer iOS features such as stack views and improved Auto Layout constraints. I also improved the way in which content is stored locally on the device in a Realm database, making the whole app feel even faster whilst increasing the reliability of sending count information in the background2.
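
I can’t share the app’s actual models, but a Realm object for this kind of locally stored content looks something like the sketch below; the BeeFactFile type and its fields are purely illustrative:

import RealmSwift

// Illustrative model; the app's real schema will differ
class BeeFactFile: Object {
    @objc dynamic var id = 0
    @objc dynamic var name = ""
    @objc dynamic var details = ""

    override static func primaryKey() -> String? {
        return "id"
    }
}

// Reads are synchronous and fast as everything is already on-device
let realm = try! Realm()
let facts = realm.objects(BeeFactFile.self).sorted(byKeyPath: "name")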

You can check out the Great British Bee Count on the App Store (it’s free) or learn more about the bee cause.

  1. This was a decision the client was not made aware of and whilst it cost me more time (as I worked to a fixed budget) the end result is an app I can be really proud of. I’d much rather spend a bit more time and money from my own pocket to make something perfect than try and hack something together quickly especially if it is a full redesign of an existing app. Due to changes in the Swift language and a move away from separate xib files to storyboards, I was able to reduce the overall file size by 20% and the amount of code by 45%. ↩︎

  2. For example, if you try and submit a count when you have no network connection, the app can automatically upload this information once connectivity is restored even if it is no longer in the foreground. ↩︎

Reaction Cam

I’m very pleased to announce the release of a new client app I’ve been working on over the past couple of months: Reaction Cam.

I was hired by Elliott Brock to build Reaction Cam, an easy-to-use app that allows you to quickly record and share your reaction to anything from video to content in a browser. Due to the limitations of iOS sandboxing, this was an immense technical challenge but the end result is incredibly slick with the power to record a reaction and the content on screen at the same time. Once recording is complete, you can edit your reaction and the content by trimming and rotating as well as swapping what is used as the “picture in picture” recording (or disabling it altogether)1. A single file is exported at the end of the process for easy sharing with friends on social media.

Supported media for reacting to includes video (both saved on device and online), photos (complete with a swipeable scrolling interface), and a full browser for viewing everything from tweets to blog posts. In addition to the app, I also built an API and an admin system so that Elliott could handpick recommended videos to react to as well as publishing some of the reactions that had been recorded with the app.

I was given a very loose spec and so I designed the app and worked out all of the UX myself; I even designed the app icon! Reaction Cam is built in Swift 3.1 and makes use of the latest features in iOS 10 to allow it to run quickly on all of the various iOS devices with full scaling support for every screen size.

I really enjoyed working with Elliott on Reaction Cam and hope that people will find it to be a best-in-class app for reaction recording. You can download Reaction Cam on the App Store and learn more about it on the Reaction Cam website.

  1. Unlike some other reaction apps, I actually record both the front-facing camera and the content on screen to separate files rather than simply rendering the front-facing camera feed on top of the content and capturing just what is on screen. This is massively important as it allows for editing such as swapping which video is shown in the smaller view (i.e. you might want your reaction to be more prominent), it allows you to move the smaller view around, and it means you can disable one or the other after recording. Of course, capturing both live video and the content on screen separately is a technical challenge but I’m very happy I was able to maintain a 30fps recording from both streams even on the oldest supported device (an iPhone 5). ↩︎

AlcoPath

I’m pleased to announce the release of a new client app I’ve been working on for the past few weeks: AlcoPath.

I worked on AlcoPath for Orbis Media as a freelance iOS developer. The app was designed in consultation with the Nottinghamshire Healthcare NHS Foundation Trust and features a WEKP Cognitive Assessment (incorporating 6CIT, Ataxia test, Ophthalmoplegia test, and other associated risk factors) to diagnose Wernicke’s encephalopathy, a Withdrawal Assessment for Alcohol using a revised CIWA-Ar scale, and industry recommended pathways all in line with NICE Guidance.

The app is available for free and can be used by clinical staff on both iPhone and iPad thanks to a scaling interface suitable for all device sizes. An A4-sized PDF can be generated with the personalised results of each assessment and this can be printed directly from the app. Push notifications were also integrated to inform users quickly of any updates.

In order to render the various assessment questions efficiently and accurately, I built a local PHP-based tool to input the questions and output a JSON file that the app then interprets to build each question and its answer style, be that a toggle, multiple selection, or text entry. This removed the need for the app to connect to an online database whilst still enabling me to make prompt updates should new questions need to be added or existing ones edited in the future.
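
To sketch the approach on the app side (the schema below is illustrative rather than AlcoPath’s real format), interpreting a bundled JSON file of questions only takes a few lines:

import Foundation

// Illustrative question format; the real file will differ
struct Question: Codable {
    enum AnswerType: String, Codable {
        case toggle
        case multipleSelection
        case textEntry
    }

    let id: Int
    let text: String
    let answerType: AnswerType
}

// The JSON ships inside the app bundle so no network connection is needed
if let url = Bundle.main.url(forResource: "questions", withExtension: "json"),
    let data = try? Data(contentsOf: url),
    let questions = try? JSONDecoder().decode([Question].self, from: data) {
    // build the UI for each question based on its answerType
}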

It was a great experience working with Orbis Media on this app and the feedback from clinicians has been great so far. You can download AlcoPath on the App Store and learn more about it at AlcoPath.co.uk.

Building tools for Kylo Ben

I’ve been running my Kylo Ben website about video games since October 2016 and this year I decided to start doing a weekly update about gaming news and what I’ve been playing. Whilst it is fun to do, it is very time consuming as I need to collate interesting links I’ve found, my articles, podcasts, game releases, games I’ve played, and games I’ve purchased, which means an average post takes between 1.5 and 2 hours to write. Being a developer means I can build my own digital tools to help me out and so last week I built a few little tools to cut that time dramatically.

One of the bigger pieces of the weekly roundup is a list of interesting news that has happened in the world of video games. Initially I would copy and paste the URLs of interesting links I found and save them into the Notes app1 on iOS or Mac. This worked fine but it was a little clunky and getting the data back out took time as I’d need to open each one to see what it was and then add Markdown syntax to each URL I wanted to use. To solve this, I wrote an iOS app and a macOS app that would provide extensions for URLs allowing me to quickly save them to my database.

[Screenshot: saving a URL to the Kylo Ben database via the iOS share sheet]

The iOS app is purely a blank view controller with a bundled share extension that looks a little like this2:

import UIKit
import Social
import MobileCoreServices

class ShareViewController: SLComposeServiceViewController {

    override func isContentValid() -> Bool {
        return true
    }

    override func didSelectPost() {
        if let item = extensionContext?.inputItems.first as? NSExtensionItem, let itemProvider = item.attachments?.first as? NSItemProvider {
            if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeURL as String) {
                itemProvider.loadItem(forTypeIdentifier: kUTTypeURL as String, options: nil, completionHandler: { (url, error) in
                    if let shareURL = url as? URL {
                        APIClient.post("save", parameters: ["url": shareURL.absoluteString, "title": self.textView.text], onCompletion: { (error, response) in
                            if let error = error {
                                let controller = UIAlertController(title: "ERROR", message: "That didn't work: \(error.localizedDescription)", preferredStyle: .alert)
                                controller.addAction(UIAlertAction(title: "OK", style: .default, handler: { (action) in
                                    self.extensionContext!.completeRequest(returningItems: [], completionHandler: nil)
                                }))
                                self.present(controller, animated: true, completion: nil)
                            } else {
                                self.extensionContext!.completeRequest(returningItems: [], completionHandler: nil)
                            }
                        })
                    }
                })
            }
        }
    }

    override func configurationItems() -> [Any]! {
        return []
    }

}

This is paired with an NSExtensionActivationSupportsWebURLWithMaxCount entry in the Info.plist file so that it will activate whenever I try and share a URL anywhere within iOS. If I’m in an app reading an article that I want to save, I simply tap the share icon and then choose the Kylo Ben app from the list as shown in the screenshot above. The URL and title will then be sent to my server for retrieval later on.
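
For reference, the relevant part of the share extension’s Info.plist looks roughly like this:

<key>NSExtension</key>
<dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.share-services</string>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>NSExtensionActivationRule</key>
        <dict>
            <key>NSExtensionActivationSupportsWebURLWithMaxCount</key>
            <integer>1</integer>
        </dict>
    </dict>
</dict>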

[Screenshot: the Kylo Ben extension icon in the Safari toolbar on macOS]

I’ve written a number of Safari extensions in JavaScript before but El Capitan added the option to write native extensions in Swift via a Safari App Extension bundled with your macOS app. To avoid having to make AJAX calls in JavaScript3, I chose to build a simple macOS app with a Safari App Extension that looks like this:

import SafariServices

class SafariExtensionHandler: SFSafariExtensionHandler {
    
    override func toolbarItemClicked(in window: SFSafariWindow) {
        window.getActiveTab { (tab) in
            tab?.getActivePage(completionHandler: { (page) in
                page?.getPropertiesWithCompletionHandler({ (properties) in
                    if let properties = properties, let url = properties.url {
                        let title = properties.title ?? "Unknown Title"
                        APIClient.post("save", parameters: ["url": url.absoluteString, "title": title], onCompletion: { (error, response) in
                            if let error = error {
                                NSLog("PROBLEM! \(error)")
                            } else {
                                page?.reload()
                            }
                        })
                    }
                })
            })
        }
    }

}

When I click on the extension’s toolbar icon, the method above is called and the URL and title are sent to my server; once completed, the page reloads to show me it has been successful. I spent a long time trying to get a simple alert to display on either success or failure but I couldn’t get it to work correctly. It is possible to interact with JavaScript and I was able to log to the console but any alert would silently fail. If anybody has any tips on that, I’d love to know how to improve it.

The basic template of my weekly update is the same every week and I used to use a number of custom MySQL queries to pull out the various information I needed and then write it up manually. Now that I have my links stored in my database, I decided to write a PHP script that runs a number of MySQL queries and generates as much of my update as possible, so all I need to do is fill in the blanks that aren’t automatically provided (e.g. upcoming game release dates) and write my own thoughts and opinions around the news articles. The script outputs a Markdown document like this:

Introduction...

###News

[Final Fantasy 15's PS4 Pro Update Out Now, Improves Frame Rate And More - GameSpot](http://www.gamespot.com/articles/final-fantasy-15s-ps4-pro-update-out-now-improves-/1100-6448025/)

[New PlayStation 4 Pro patch for Final Fantasy XV makes it look worse | Ars Technica](https://arstechnica.com/gaming/2017/02/new-playstation-4-pro-patch-for-final-fantasy-xv-makes-it-look-worse/#p3)

[This tiny Nintendo Switch feature is already making fans super happy - Polygon](http://www.polygon.com/2017/2/20/14668988/nintendo-switch-click-sound-effect-joy-con)

[Alto's Odyssey awaits, Summer 2017](http://blog.builtbysnowman.com/post/157488116747/altos-odyssey-summer-2017)

[never gonna give you up - What’s In the Box?2?! Take 2](http://tyrod.com/post/157494246009/whats-in-the-box2-take-2)

[Steam Community :: Group Announcements :: Orwell](http://steamcommunity.com/games/491950/announcements/detail/484538095747263770)

[Nintendo tag teams with John Cena for living room-inspired Switch demos - Polygon](http://www.polygon.com/2017/2/21/14682742/nintendo-switch-john-cena)

[Look What Mega Bloks Is Doing To Pokémon ](http://kotaku.com/look-what-mega-bloks-done-to-pokemon-1792555348)

[Pillars of Eternity 2 campaign clears $3 million - Polygon](http://www.polygon.com/2017/2/21/14689394/pillars-of-eternity-2-deadfire-funded-3-million-fig)

[Take a look at how itty-bitty the Nintendo Switch cartridge is - Polygon](http://www.polygon.com/2017/2/21/14691596/nintendo-switch-cartridge-size-comparison)

[Australia Is Coming To Civilization VI](http://kotaku.com/australia-is-coming-to-civilization-vi-1792599435)

[Rocket League Original Minis toys expanding with light-up cars - Polygon](http://www.polygon.com/2017/2/21/14692528/rocket-league-original-minis-light-up-cars)

[Hot and heavy Mass Effect pack comes to Cards Against Humanity - Polygon](http://www.polygon.com/2017/2/22/14698798/cards-against-humanity-mass-effect-pack)

And finally, 

###My Posts
- Making the earth move with Stagehand — "I really like the premise of a "reverse platformer" but there simply isn't enough content to keep me coming back when it is stood next to _Tiny Wings_, _Alto's Adventure_, and _Super Mario Run_" [[link](https://kyloben.co.uk/stagehand-review)]

###Podcasts
- Podcast #xx: Title [[link]()]
- Another Podcast #xx: Title [[link]()]

###Upcoming Game Releases
- _Game Title #1_ (date - platforms) [[link]()]
- _Game Title #2_ (date - platforms) [[link]()]
- _Game Title #3_ (date - platforms) [[link]()]
- _Game Title #4_ (date - platforms) [[link]()]
- _Game Title #5_ (date - platforms) [[link]()]

###Gaming Time
This week I spent 9.6 hours playing six different games:

- **Stagehand** (0.5hrs): Text...
- **Rocket League** (0.6hrs): Text...
- **Pokémon Moon** (0.7hrs): Text...
- **Forza Horizon 3** (1.1hrs): Text...
- **SteamWorld Heist** (2.8hrs): Text...
- **Night in the Woods** (3.9hrs): Text...

This week I added 2 new games to my library: _Crusader Kings II_, _Night in the Woods_.

Details on games I'm planning on playing this week...

Until next time, have a great week!

---

_Did you enjoy this weekly roundup? Make sure you don't miss one by subscribing to [Kylo Ben Weekly](https://kyloben.co.uk/weekly) - it's this post in email form every Monday!_

The news URLs are simply pulled from the database and wrapped up so that each link uses the title of the page as provided by the macOS and iOS extensions. I will nearly always change the link title (as it’ll be part of a sentence) but it allows me to quickly see what an article is about without needing to open it up and re-read it. The “my posts” section requires no editing at all as it pulls the title, link, and a pull quote directly from the articles I’ve published in the previous week. The podcasts and upcoming game release sections can’t be automatically populated (yet) so I just use placeholder text for these to reduce the amount of effort required. The final section on my gaming time uses a number of queries to get the exact amount of time I’ve spent playing in the past week, adds placeholders for each game so I can write about them, and then lists out any new games I’ve added to my library; all of this is thanks to some scripts I wrote a while back that scrape my Steam and Xbox One libraries to track changes and allow me to render a page showing my gaming time for the past few months.

Once I’ve finished writing, the Markdown file is uploaded to my server and the weekly update will then appear on the website. I then use Byword’s “copy as HTML” feature to generate an HTML version and use that with Mailchimp to write and send out the email version of the update.

With these tools, I can now write my weekly update pretty quickly and only have to focus on what I want to say rather than spending time on copying, pasting, and formatting. If you’re interested in video games, sign up to the weekly email as it is the best way to get a digest of what has been happening over the past week as well as seeing what new games are arriving.

  1. I could have used a service like Pocket to do this but then I’d have to either use two Pocket accounts or fill my personal account with links that I don’t want to read later. ↩︎

  2. This is not what I would call production code quality so don’t just wildly copy and paste this into an app or you’ll likely regret it. Works well enough for my own personal use though! ↩︎

  3. Which is a nightmare when you start hitting cross domain restrictions. ↩︎
