Adding teachable moments to your apps with TipKit

When TipKit was first mentioned during the WWDC 2023 State of the Union, I assumed it was going to be a way for apps to appear within the Tips app and maybe appear within Spotlight. Instead, it’s a built-in component for adding small tutorial views to your own app across all platforms complete with a rules system for condition-based display and syncing across multiple devices via iCloud! Even better, it’s something Apple are using themselves throughout iOS 17 such as in the Messages and Photos apps.

Having built a fair few popover onboarding systems in the past, I quickly decided this was my most anticipated feature from WWDC 2023. I was slightly disappointed, then, when Xcode beta after Xcode beta was missing the TipKit framework. Fortunately, Xcode 15 beta 5 (released last night) now includes the relevant framework and documentation, allowing me to integrate tips into my own apps.

Before I demonstrate how TipKit works and how you can incorporate it into your own apps, here is a really key piece of advice from Ellie Gattozzi in the “Make features discoverable with TipKit” talk from WWDC 2023:

Useful tips have direct action phrases as titles that say what the feature is and messages with easy to remember benefit info or instructions so users know why they’d want to use the feature and are later able to accomplish the task on their own.

With that said, let’s create our first tip!

Note: I’ve included code for both SwiftUI and UIKit below, but Apple also provides a way to display tips in AppKit. Be aware that the UIKit versions are not available on watchOS or tvOS, and that there are a few bugs in the TipKit framework in beta 5, particularly around actions, which I’ve documented below.

1. Creating a Tip

First, we need to initiate the Tips system when our app launches using Tips.configure()¹:

// SwiftUI
var body: some Scene {
    WindowGroup {
        ContentView()
            .task {
                try? await Tips.configure()
            }
    }
}

// UIKit
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    Task {
        try? await Tips.configure()
    }
    return true
}

Next, we create the struct that defines our tip:

struct SearchTip: Tip {
    var title: Text {
        Text("Add a new game")
    }
    
    var message: Text? {
        Text("Search for new games to play via IGDB.")
    }
    
    var asset: Image? {
        Image(systemName: "magnifyingglass")
    }
}

Finally, we display our tip:

// SwiftUI
ExampleView()
    .toolbar(content: {
        ToolbarItem(placement: .primaryAction) {
            Button {
                displayingSearch = true
            } label: {
                Image(systemName: "magnifyingglass")
            }
            .popoverTip(SearchTip())
        }
    })


// UIKit
class ExampleViewController: UIViewController {
    var searchButton: UIButton
    var searchTip = SearchTip()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        Task { @MainActor in
            for await shouldDisplay in searchTip.shouldDisplayUpdates {
                if shouldDisplay {
                    let controller = TipUIPopoverViewController(searchTip, sourceItem: searchButton)
                    present(controller, animated: true)
                } else if presentedViewController is TipUIPopoverViewController {
                    dismiss(animated: true)
                }
            }
        }
    }
}

This code is all that is required to display our provided tip the first time the view appears:

A popover tip using TipKit

There are two kinds of tip views:

  • Popover: appears as an overlay on the app’s UI which lets you direct users without changing the view
  • In-line: temporarily adjusts the app’s UI around it so nothing is covered (this is not available on tvOS)

If we wanted to display an in-line tip instead, our code would look like this:

// SwiftUI
VStack {
    TipView(LongPressGameTip())
}

// UIKit
class ExampleViewController: UIViewController {
    var longPressGameTip = LongPressGameTip()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        Task { @MainActor in
            for await shouldDisplay in longPressGameTip.shouldDisplayUpdates {
                if shouldDisplay {
                    let tipView = TipUIView(longPressGameTip)
                    view.addSubview(tipView)
                } else if let tipView = view.subviews.first(where: { $0 is TipUIView }) {
                    tipView.removeFromSuperview()
                }
            }
        }
    }
}
An in-line tip using TipKit

UIKit also has a TipUICollectionViewCell for displaying tips within a collection view which should be the route used for table-based interfaces as well. The SwiftUI code is definitely less verbose 🤣

2. Making your tips look tip-top 🎩

You can customise your tips with changes to text colour and fonts along with background colour, corner radius, and icons. The tip views are also fully compatible with dark mode.

Fonts and text colour

These are customised within the Tip structs themselves as you are returning instances of SwiftUI.Text even if you are ultimately rendering your tip in UIKit or AppKit.

struct LongPressGameTip: Tip {
    var title: Text {
        Text("Add to list")
            .foregroundStyle(.white)
            .font(.title)
            .fontDesign(.serif)
            .bold()
    }
    
    var message: Text? {
        Text("Long press on a game to add it to a list.")
            .foregroundStyle(.white)
            .fontDesign(.monospaced)
    }
    
    var asset: Image? {
        Image(systemName: "hand.point.up.left")
    }
}

As the title and message both use Text, you can use any modifiers that return a Text instance such as foregroundStyle, font, and convenience methods like bold(). The icon is returned as an Image so if we want to change anything like the icon colour we have to do this from the Tip view itself:

Icon colour, background colour, and dismiss button colour

// SwiftUI
TipView(LongPressGameTip())
    .tipBackground(.black)
    .tint(.yellow)
    .foregroundStyle(.white)

// UIKit
let tipView = TipUIView(LongPressGameTip())
tipView.backgroundColor = .black
tipView.tintColor = .yellow

A method is provided to change the colour of the tip background itself, but to change the icon colour we need to use a global tint. The dismiss button colour is affected by the foregroundStyle; note that this button appears to be 50% opaque, so if you are using a dark background you’ll struggle to see anything other than white. There does not appear to be a way to alter this button with UIKit.

Whilst there are no Human Interface Guidelines for tips yet, looking through the iOS 17 beta and the WWDC 2023 talk shows that Apple uses un-filled SF Symbols for all of their tips. For this reason, I’d suggest doing the same!

Corner Radius

// SwiftUI
TipView(LongPressGameTip())
    .tipCornerRadius(8)

The default corner radius for tips on iOS is 13. If you want to change this to match other curved elements within your app, you can do this with tipCornerRadius() in SwiftUI. UIKit does not have a way to change the corner radius of tip views.

A customised tip view with new colours and fonts. I know it's not pretty!

I was pleasantly surprised by how flexible the design was for this first version of TipKit. However, I’d urge caution in customising tips too far as having them match the default system tips is surely a boon in terms of user experience.

3. Lights, Cameras, Actions!

Tips allow you to add multiple buttons known as actions which can be used to take users to a relevant setting or a more in-depth tutorial. This feature is not available on tvOS.

To add an action, you first need to adjust your Tip struct with some identifying details:

struct LongPressGameTip: Tip {
    
    // [...] title, message, asset
    
    var actions: [Action] {
        [Action(id: "learn-more", title: "Learn More")]
    }
}

Note that the Action initialiser also has an option to use a Text block rather than a String which allows for all of the colour and font customisations mentioned earlier.
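
For illustration, a Text-based action title might look something like this (a sketch; I’m assuming the unlabelled Text-returning closure initialiser here, so treat the exact signature as subject to change between betas):

var actions: [Action] {
    [Action(id: "learn-more") {
        Text("Learn More")
            .foregroundStyle(.yellow)
            .bold()
    }]
}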

An action button within a Tip View

With this in place, we can alter our tip view to perform an action once the button has been pressed:

// SwiftUI
Button {
    displayingSearch = true
} label: {
    Image(systemName: "magnifyingglass")
}
.popoverTip(LongPressGameTip()) { action in
    guard action.id == "learn-more" else { return }
    displayingLearnMore = true
}

// UIKit
let tipView = TipUIView(LongPressGameTip()) { action in
    guard action.id == "learn-more" else { return }
    let controller = TutorialViewController()
    self.present(controller, animated: true)
}

Alternatively, we can add action handlers directly to the Tip struct:

var actions: [Action] {
    [Action(id: "learn-more", title: "Learn More", perform: {
        print("'Learn More' pressed")
    })]
}

Important: Whilst you can add actions in Xcode 15 beta 5, the handlers do not currently trigger when pressing the button regardless of whether you use the struct or view method to attach them.

One final thing to note on actions is that they can be disabled if you wish to grey them out for some reason (e.g. if a user isn’t signed in or subscribed to a premium feature):

var actions: [Action] {
    [Action(id: "pro-feature", title: "Add a new list", disabled: true)]
}

4. Laying down the rules

By default, tips appear as soon as the view they are attached to appears on screen. However, you may not want to show a tip in a certain view until some condition has been met (e.g. the user is logged in) or you may want a user to have to interact with a feature a certain number of times before the tip is displayed. Luckily, Apple has thought of this and added a concept known as “rules” to let you limit when tips will appear.

There are two types of rules:

  • Parameter-based: These are persistent and are matched to Swift value types such as booleans
  • Event-based: Defines an action that must be performed before a tip is eligible for display

Important: In Xcode 15 beta 5 there is a bug which will prevent the @Parameter macro from compiling for simulators or for macOS apps. The workaround is to add the following to the “Other Swift Flags” build setting:

-external-plugin-path $(SYSTEM_DEVELOPER_DIR)/Platforms/iPhoneOS.platform/Developer/usr/lib/swift/host/plugins#$(SYSTEM_DEVELOPER_DIR)/Platforms/iPhoneOS.platform/Developer/usr/bin/swift-plugin-server

Parameter-based Rules

struct LongPressGameTip: Tip {
    
    @Parameter
    static var isLoggedIn: Bool = false
    
    var rules: [Rule] {
        #Rule(Self.$isLoggedIn) { $0 == true }
    }
    
    // [...] title, message, asset, actions, etc.
    
}

The syntax is relatively straightforward thanks to the new macro support in Xcode 15. We first define a static variable for the condition, in this case a boolean detailing whether the user is logged in or not. Next, we provide a rule based on that condition being true.

If we ran our app now, the tip would no longer be displayed on launch. However, once we mark the static property as true the tip will show up the next time the relevant view is displayed:

LongPressGameTip.isLoggedIn = true

Event-based Rules

struct LongPressGameTip: Tip {
    
    static let appOpenedCount = Event(id: "appOpenedCount")
        
    var rules: [Rule] {
        #Rule(Self.appOpenedCount) { $0.donations.count >= 3 }
    }
    
    // [...] title, message, asset, actions, etc.
    
}

The event-based rules are slightly different in that instead of a parameter we use an Event object with an identifier of our choosing. The rule then checks the donations property of this event to determine if the app has been opened three or more times. In order for this to work, we need to be able to “donate” when this event has occurred. We do this by using the donate method on the event itself:

SomeView()
    .onAppear {
        LongPressGameTip.appOpenedCount.donate()
    }

The donation on an event contains a date property that is set to the time at which the event was donated. This means you can add rules to check if somebody has opened the app three times or more today:

struct LongPressGameTip: Tip {
    
    static let appOpenedCount: Event = Event(id: "appOpenedCount")
        
    var rules: [Rule] {
        #Rule(Self.appOpenedCount) {
            $0.donations.filter {
                Calendar.current.isDateInToday($0.date)
            }
            .count >= 3
        }
    }
    
    // [...] title, message, asset, actions, etc.
    
}

Important: Whilst this code should be possible according to the WWDC 2023 talk, it gives a “The filter function is not supported in this rule” error when run on Xcode 15 beta 5.

5. To display, or not to display?

Whilst rules can limit our tips to displaying at the optimal time, there is always the possibility that multiple tips might try to display at the same time. It may also be that we no longer want to display a tip if the user interacts with our feature before our tip was displayed. To get around this, Apple provides us with ways to manage frequency, display count, and to invalidate tips. They also provide a mechanism for syncing the display status of your tips across multiple devices.

Frequency

By default, tips appear as soon as they are allowed to. We can change this by setting a DisplayFrequency when initiating our Tips store on app launch:

try? await Tips.configure(options: {
    DisplayFrequency(.daily)
})

With this in place, only one tip will be able to appear each day.

There are several predefined values for DisplayFrequency such as .daily and .hourly but you can also provide a TimeInterval if you need something custom. Alternatively, you can restore the default behaviour by using .immediate.
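
For example, if you wanted at most one tip every three days, a sketch using the TimeInterval initialiser mentioned above would be:

try? await Tips.configure(options: {
    // a custom frequency: at most one tip every three days (in seconds)
    DisplayFrequency(60 * 60 * 24 * 3)
})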

If you have set a non-immediate display frequency but have a tip that you want to display immediately, you can do so by using the IgnoresDisplayFrequency() option on the Tip struct:

struct LongPressGameTip: Tip {
    
    var options: [TipOption] {
        [Tip.IgnoresDisplayFrequency(true)]
    }
    
    // [...] title, message, asset, actions, etc.
    
}

Display Count

If a tip is not manually dismissed by the user then it will be reshown the next time the relevant view appears, even across app launches. To avoid a tip being shown repeatedly to a user, you can set a MaxDisplayCount which will limit the number of appearances before the tip is no longer displayed:

struct LongPressGameTip: Tip {
    
    var options: [TipOption] {
        [Tip.MaxDisplayCount(3)]
    }
    
    // [...] title, message, asset, actions, etc.
    
}

Invalidation

Depending on our rules and display frequency, it may be that a user interacts with a feature before our tip has been displayed. In this case, we would want to invalidate our tip so that it is not displayed at a later date:

longPressGameTip.invalidate(reason: .userPerformedAction)

There are three possible reasons for a tip to be invalidated:

  • maxDisplayCountExceeded
  • userClosedTip
  • userPerformedAction

The first two are performed by the system depending on whether the display count or the user caused the tip to be dismissed. This means you will always want to use .userPerformedAction when invalidating your tips.
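
As a sketch, invalidating the earlier search tip when the user finds the feature on their own might look like this (assuming a stored searchTip instance as in the UIKit example):

Button {
    displayingSearch = true
    // the user found search without the tip, so don't show it later
    searchTip.invalidate(reason: .userPerformedAction)
} label: {
    Image(systemName: "magnifyingglass")
}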

iCloud Sync

During the “Make features discoverable with TipKit” talk, Charlie Parks mentions:

TipKit can also sync tip status via iCloud to ensure that tips seen on one device won’t be seen on the other. For instance, if someone using the app has it installed on both an iPad and an iPhone, and the features are identical on both of those devices, it’s probably best to not educate them on both devices about the feature.

This feature appears to be enabled by default with no option to disable it, meaning you’ll need to provide custom identifiers for each tip on the platforms you support if you want to make sure tips are re-displayed on every device for some reason (e.g. if the UI is significantly different between devices).
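
As tips are identified by a string (the type name by default), one way to do this would be to provide a platform-specific id; a sketch, with identifier strings I’ve made up:

struct LongPressGameTip: Tip {

    // a per-platform id so iCloud won't treat the tip as seen on every device
    var id: String {
        #if os(tvOS)
        return "long-press-game-tip-tvos"
        #else
        return "long-press-game-tip-ios"
        #endif
    }

    // [...] title, message, asset, actions, etc.

}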

6. Debugging

TipKit provides convenient APIs for testing, allowing you to show or hide tips as needed, inspect all the tips without satisfying their rules, or purge all info in the TipKit data store for a pristine app build state.

// Show all defined tips in the app
Tips.showAllTips()

// Show the specified tips
Tips.showTips([searchTip, longPressGameTip])

// Hide the specified tips
Tips.hideTips([searchTip, longPressGameTip])

// Hide all tips defined in the app
Tips.hideAllTips()

If we want to purge all TipKit related data, we need to use the DatastoreLocation modifier when initialising the Tips framework on app launch:

try? await Tips.configure(options: {
    DatastoreLocation(.applicationDefault, shouldReset: true)
})

Conclusion

A tip displayed in my "Chaise Longue to 5K" tvOS app

Tips are instrumental in helping users discover features in your app be it on iOS, iPadOS, macOS, watchOS, or tvOS. Remember to keep your tips short, instructional, and actionable, and make use of the rules system, display frequency, and invalidation to ensure tips are only shown when they need to be.

  1. Note that this differs from the TipsCenter.shared.configure() that was previewed in the WWDC 2023 talk “Make features discoverable with TipKit”. ↩︎

Attempting to connect a tvOS app to an iOS app with DeviceDiscoveryUI

As we get to the final month before WWDC 2023, I’m reminded of all the new APIs that were released at WWDC 2022 that I haven’t made use of yet. One of those new APIs was the DeviceDiscoveryUI framework which allows an Apple TV app to connect and communicate with an iPhone, iPad, or Apple Watch.

A good example of this would be how the Apple Watch communicates with the Apple Fitness app:

It’s not entirely a fair comparison: whilst you might expect them to work in the same way, the DeviceDiscoveryUI framework has a number of restrictions:

  • It only works on tvOS (so you can’t communicate between an Apple Watch and an iPad like Apple Fitness can)
  • It only works on Apple TV 4K (Apple Fitness can work with Apple TV HD)
  • The tvOS app can only connect to one device at a time (i.e. you couldn’t make a game with this that used two iPhones as controllers)
  • The tvOS app can only connect to other versions of your app that share the same bundle identifier (and are thus sold with Universal Purchase)
  • This will not work on either the tvOS or iOS simulators. You must use physical devices.

The UI for the connection setup is also different to Apple Fitness as we will see shortly.

My use case for this technology is a bit convoluted as I was really looking for an excuse to use it rather than the best fit. I have a personal app named Stoutness that I use on my Apple TV every morning to give me a briefing on my day whilst I do my chiropractic stretches. Using shortcuts and various apps on my iPhone, I send a ton of data to my server which the Apple TV app then fetches and uses. The app also communicates directly with some 3rd party APIs such as YouTube and Pocket.

One of the main reasons for the app is to get me to work through my backlogs of games, books, videos, and articles by having the app randomly pick from my various lists and presenting them to me; I then know “out of the 4 books I’m currently reading, I should read x today”. The problem is that later in the day I often forget what the app had decided I should use, a particular problem when it suggests 5 articles for me to read from a backlog of about 200 😬. Whilst I cache this information daily in the Apple TV app, it’s a bit of a pain to fire it up just to skip through a few screens and remember what I should be reading. Surely this information would be better on my phone?

The obvious way to do this would be for the server to make the calls to Pocket and YouTube and then store the daily cache in my database along with the random choices of games and books. An iOS app could then download that in the same way the tvOS app does. This is true, but it’s not as fun as learning a new framework and having my phone connect to the Apple TV to a) send all the data that my shortcuts used to do directly and b) have the cache be sent back in response ready to be used on iOS.

After a brief look at the docs, I naively assumed this would be done in an hour as it looked vaguely similar to the way in which an iPhone app can talk to an embedded Apple Watch app or a Safari extension via two way messaging. After 4 hours, I finally got something working but it does not feel as solid as I would like…

Apple provide a developer article titled “Connecting a tvOS app to other devices over the local network” that sounds like it should be exactly what we need. It details how we present the connection UI (in both SwiftUI and UIKit), how to listen for the connection on iOS / iPadOS / watchOS, and how to initiate the connection. However, there are two issues with this article.

First of all, most of the code in it doesn’t actually compile or is being used incorrectly. The SwiftUI code references a “device name” variable which isn’t present¹, fails to include the required “fallback” view block (for displaying on unsupported devices like the Apple TV HD), and presents the device picker behind a connect button, failing to notice that the picker itself has its own connect button which sits transparently above the one you just pressed.

For the UIKit code, it references an NWEndpointPickerViewController which doesn’t exist. The correct name is DDDevicePickerViewController.

Once the actual picker is presented, things start to look very promising. You get a fullscreen view that shows your app icon with a privacy string that you define within Info.plist on the left hand side whilst any applicable devices are listed on the right hand side:

An important thing to note here is that the devices do not necessarily have your app installed, they are merely devices potentially capable of running your app.

When we initiate a connection to an iPhone, a notification is displayed. The wording can’t be controlled and will be different depending on whether the corresponding app is installed or not:

Connection notification request for iOS from tvOS both with and without the app installed. If the app is installed, the notification uses the Apple TV name for the title (“Office” in this case).

You seem to have around 30 seconds to accept the connection otherwise the tvOS interface goes back a step and you need to send a new request. If you do not have the app installed, tapping the notification will take you to the App Store page.

We now come to the second problem in Apple’s documentation:

As soon as the user selects a device, the system passes you an NWEndpoint. Use this endpoint to connect to the selected device. Create an NWConnection, passing it both the endpoint and the parameters that you used to create the device picker view. You can then use this connection to send or receive messages to the connected device.

The emphasis above is mine. This is the extent of the documentation on how to actually use the connection to send and receive messages. It turns out that the connection uses classes from the In-Provider Networking that was introduced in iOS 9 specifically for network extensions. In fact, this is still the case according to the documentation:

These APIs have the following key characteristics:

  • They aren’t general-purpose APIs; they can only be used in the context of a NetworkExtension provider or hotspot helper.

There is zero documentation on how to use these APIs in the context of Apple TV to iOS / iPadOS / watchOS communication 🤦🏻‍♂.

In terms of sending messages, there is only one method aptly named send(content:contentContext:isComplete:completion:). This allows us to send any arbitrary Data such as a JSON-encoded string.
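
A minimal sketch of a send, assuming an established connection (the NWConnection created from the picker’s endpoint):

// serialise your message however you like, then send it as Data
let payload = "Hello from tvOS!".data(using: .utf8)!
connection.send(content: payload, isComplete: true, completion: .contentProcessed { error in
    if let error {
        NSLog("Send failed: \(error)")
    }
})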

The real problem is how to receive those messages. There is a method named receiveMessage(completion:) which, based on my work with watchOS and iOS extensions, sounds promising. Apple describes it as “schedules a single receive completion handler for a complete message, as opposed to a range of bytes”. Perfect!

Except it isn’t called, at least not when a message is sent. In a somewhat frustrating act, the messages only appear once the connection is terminated, either because the tvOS app stops or because I cancel the connection. I tried for multiple hours but could not get that handler to fire unless the entire connection was dropped (at which point any messages that were sent during that time would come through as one single piece of data). I can only assume the messages are being cached locally without being delivered, yet when the connection drops it suddenly decides to unload them 🤷🏻‍♂.

It turns out you need to use the more complex receive(minimumIncompleteLength:maximumLength:completion:) which requires you to say how big you want batches of data to be. You also need to resubscribe to this handler every time data appears on it. The problem here is that whilst there is an isComplete flag to tell you if the whole message has arrived, this is never true when sending from tvOS, even if you use the corresponding flag on the send method. In the end, I limited the app to 1MB of data at a time as everything I send is well below that. I’ve never run into a problem with only partial data being sent but it is a potential risk to be aware of.
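
My receive handling ended up looking something like this (a simplified sketch; handleIncoming is a hypothetical handler and the 1MB cap matches what I used in the app):

func receiveLoop(on connection: NWConnection) {
    // receive whatever has arrived, up to 1MB at a time
    connection.receive(minimumIncompleteLength: 1, maximumLength: 1024 * 1024) { data, _, _, error in
        if let data, !data.isEmpty {
            handleIncoming(data)
        }
        guard error == nil else { return }
        // resubscribe so the next batch of data is delivered too
        receiveLoop(on: connection)
    }
}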

If you were using this for critical data, I’d suggest only sending encoded text and providing your own delimiter to look for, e.g. batching incoming strings together until one ends in a “|||” at which point you know that was the end of a message from tvOS.
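
A sketch of that delimiter approach, reusing the hypothetical handleIncoming from above:

var buffer = ""

func handleIncoming(_ data: Data) {
    guard let chunk = String(data: data, encoding: .utf8) else { return }
    buffer += chunk
    // "|||" marks the end of each message; anything after it belongs to the next one
    while let range = buffer.range(of: "|||") {
        let message = String(buffer[..<range.lowerBound])
        buffer.removeSubrange(..<range.upperBound)
        NSLog("Complete message: \(message)")
    }
}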

On the positive side, the connection setup and data sending are near instantaneous and the user facing UI works well. However, as there were already low-level network solutions to send data between devices (including non-Apple devices) it’s incredibly odd to me that Apple went to the effort of creating a beautiful device pairing API and UI for both SwiftUI and UIKit but didn’t extend that to the basics of sending data. Local networking is hard. I have no interest in diving into the minutia of handling UDP packets; I just want to send some basic strings between devices!

In order to get this all working for my own app, I created a class named LocalDeviceManager that handles this all for you along with a SwiftUI demo project for both tvOS and iOS that demonstrates how it works. The call site on tvOS is very simple:

@StateObject private var deviceManager = LocalDeviceManager(applicationService: "remote", didReceiveMessage: { data in
    guard let string = String(data: data, encoding: .utf8) else { return }
    NSLog("Message: \(string)")
}, errorHandler: { error in
    NSLog("ERROR: \(error)")
})

@State private var showDevicePicker = false

var body: some View {
    VStack {
        if deviceManager.isConnected {
            Button("Send") {
                deviceManager.send("Hello from tvOS!")
            }
            
            Button("Disconnect") {
                deviceManager.disconnect()
            }
        } else {
            DevicePicker(.applicationService(name: "remote")) { endpoint in
                deviceManager.connect(to: endpoint)
            } label: {
                Text("Connect to a local device.")
            } fallback: {
                Text("Device browsing is not supported on this device")
            } parameters: {
                .applicationService
            }
        }
    }
    .padding()
}

Similarly, it’s trivial to set up an iOS app to communicate with the tvOS app:

@StateObject private var deviceManager = LocalDeviceManager(applicationService: "remote", didReceiveMessage: { data in
    guard let string = String(data: data, encoding: .utf8) else { return }
    NSLog("Message: \(string)")
}, errorHandler: { error in
    NSLog("ERROR: \(error)")
})

var body: some View {
    VStack {
        if deviceManager.isConnected {
            Text("Connected!")
            Button {
                deviceManager.send("Hello from iOS!")
            } label: {
                Text("Send")
            }
            Button {
                deviceManager.disconnect()
            } label: {
                Text("Disconnect")
            }
        } else {
            Text("Not Connected")
        }
    }
    .padding()
    .onAppear {
        try? deviceManager.createListener()
    }
}

There are more details on how this works on GitHub.

Judging by the complete lack of 3rd party apps using this feature or articles detailing how to use this API I’m going to go out on a limb and say it’s unlikely we’ll see any improvements to this system in tvOS 17. Regardless, I’ve filed a few bug reports in the hopes that the documentation can be tidied up a bit. Just be aware that this is not the robust solution I was hoping it would be!

  1. I have been unable to divine a way to get the name of the device you are connected to. ↩︎

Postmortem of the launch of a Top 10 Paid iOS App

It’s been 4 weeks since the v2.0 update for Music Library Tracker launched so I thought now was a good time for a retrospective to detail how I promoted the app and how well it performed.

By way of a bit of background, the app originally launched back in January 2016 at a $0.99 price point, making $1368 in its first month before dropping off significantly to roughly $20 a month. In January 2021, I was accepted into the App Store Small Business Program which meant the amount Apple took from sales fell from 30% to 15%; I had also increased the price and released a few more updates so the average profit for the half year prior to the v2.0 update in February was sitting at around $80 a month¹. This is by no means an income (especially as I have to pay corporation tax on it in the UK, and then if I want to actually take the money for myself rather than my business I’ll have to pay some more tax) but it was fine for an app that didn’t have any running costs nor require much maintenance.

And then v2.0 happened.

With a new feature set built around Spatial Audio, v2.0 was released on 13th February 2023 after a 9 month development period, 3 months of which was open development via my newsletter. It was reported on by a couple of tech sites (I’ll detail how shortly) and ended up being the #8 Paid app in the US!

So how much money does an app need to make to be in the Top 10 of all paid apps on the App Store?

Daily profit in USD over the past 28 days peaking at $1534 on February 15th

Not as much as you might think! You can download a full breakdown but the key figures are:

  • Profit of $82 on 13th Feb (launch day), $1449 on 14th Feb, $1534 on 15th Feb, and $414 on 16th Feb
  • Total profit of $5351 over 28 days
  • An average daily profit of $191
  • Only a single sale on March 6th 😭

I use Daily Sales Email to find out how much I’ve made each day but the figures typically arrive around lunchtime on the following day. That meant I could see the app in the Top 10 of all paid apps but had no idea what that would translate into². I’ll confess that whilst I was pleased with the numbers, I was a little disappointed that I’d made less than what I charge for 2 days as a freelance iOS developer.

That said, the app has settled down into making roughly $40 per day which works out at around $1200 per month, not bad for something that will hopefully only need minor maintenance.

With the financial breakdown out of the way, I thought it might be interesting to detail exactly how I promoted the app. I will be completely honest and say it is not my strong suit at all. I hate doing app promotion work; it is abhorrent to me. I’m not sure if it’s the Englishman in me or something else but I absolutely hate having to email people saying “please look at my app” followed by the waiting and hoping that somebody will feature it. However, that’s what I had to do as an app of this nature likely isn’t going to generate enough revenue to make hiring a marketing person cost effective.

Reviews

The key thing for an app like this is for it to be written about by a tech site. I’ve had a couple of articles in the past from sites like 9to5mac and MacRumors so my first port of call was to send them an email. As previously mentioned, I hate doing this stuff but I felt on slightly firmer ground with these sites as they’d written about the app before, so that seemed like a good “in”:

Hello,

Back in 2016 you were kind enough to review an app of mine, Music Library Tracker (https://9to5mac.com/2016/03/15/music-tracker-large-libraries/).

I’m getting in touch as I’ve just released a large v2.0 update to the app which includes some features around Spatial Audio. In short, the app can quickly scan your library and show you exactly which songs have been upgraded to Spatial Audio and generate a playlist containing just those tracks; it will then run in the background periodically and notify you as and when tracks are upgraded and keep that playlist up to date.

This is all possible due to a database of Dolby Atmos and Dolby Audio tracks I’ve created over the past 9 months to run my Spatial Audio Finder website (https://bendodson.com/projects/spatial-audio-finder/) and the @NewSpatialAudio Twitter account (https://twitter.com/NewSpatialAudio) which tweets whenever a new track is upgraded. This database is sourced from a minor update to the Apple Music API at WWDC 22 - you can see how this all works in a blog post I wrote last year (https://bendodson.com/weblog/2022/06/27/spatial-audio-finder/) but suffice to say I do not believe there is anyone outside of Apple with a dataset such as this.

Apple Music does not yet have a clear strategy for displaying Spatial Audio tracks. Whilst they have some playlists and collections that get updated weekly, the only way to tell which tracks in your own library are upgraded is to play them and see. This is obviously not ideal and not a great way to showcase what is a genuine leap in musical quality and the hundreds of thousands of tracks that have been upgraded. I created this feature as I was determined to find a way to see which tracks had been updated. From the response I’ve received via @NewSpatialAudio it seems I’m not alone!

The app is still a single cost download (25% off for the next week) with no in-app purchases, subscriptions, or adverts so anybody who downloaded the app in the past 7 years will get this new feature for free. I’ve provided a few promo codes below in case you or anyone at the MacRumors team are interested in taking a look:

CODE1
CODE2
CODE3

You can see some more information about the app at https://dodoapps.io/music-library-tracker/ and there is a full media kit with screenshots, etc, at https://dodoapps.io/music-library-tracker/media-kit/

The update is available now on the App Store at https://apple.co/3XtdAga

If you have any questions at all about the app, my Spatial Audio database, or anything else relating to Spatial Audio then just let me know.

All the best,

Ben

I sent this email on the 13th February to the reviews@9to5mac.com address (as my previous contact had since moved elsewhere) and a very similar version with a different link directly to the Senior Editor at MacRumors who wrote a previous article. I got a very strange bounceback email from 9to5mac and I didn’t get a reply at all from MacRumors. As the bounceback was so odd, I waited a day and then sent a follow up email to tips@9to5mac.com; it was a good thing I did as Chance Miller got in touch within 30 minutes and shortly afterwards there was an article published. This is undoubtedly what led to the spike in sales on the 14th and afterwards.

In addition to those two outlets, I sent similar emails to:

  • TechRadar (via news@techradar.com): their Entertainment Editor replied after a couple of hours and we had a few emails back and forth discussing various aspects of Spatial Audio such as how Apple bans AI upscaling. They published an article the next day but it didn’t show up in their RSS feed and is not visible on their website unless you follow the direct link (which was tweeted but had practically zero engagement)
  • AppleInsider (direct to one of their writers who had written a recent article about Spatial Audio): I didn’t get a reply and there hasn’t been an article
  • MacStories (direct email to John and Federico): I didn’t get a reply but it was listed in the “App Debuts” section of the Club MacStories newsletter for paid subscribers

The following week I sent an email to iMore as I’d noticed an interesting article relating to Spatial Audio. I couldn’t find an email address for the author, Tammy Rogers, so instead sent an email direct to the Features Editor, Daryl Baxter, who was listed as a contributor:

Hi Daryl,

I came across a recent article you contributed to, “Apple Music is showcasing non-Spatial Audio albums in it’s Spatial Audio page”, and had two things that may be of interest to you and Tammy (I couldn’t find an email address for her so my apologies for not including her as well).

First of all, the reason that those albums are being listed within Apple Music’s Spatial Audio playlists is because they have some tracks on them that are available in Spatial Audio. The referenced No Pressure by Logic has two tracks that have been upgraded (GP4 and Perfect) whilst McCartney (2011) remaster has the first 13 tracks available in Spatial Audio. I know this because I created something called the Spatial Audio Finder which lets you find which tracks have been updated for a particular artist (I’ve got a blog post at https://bendodson.com/weblog/2022/06/27/spatial-audio-finder/ which explains how that all works). I also publish when tracks are upgraded to the @NewSpatialAudio Twitter feed.

You also mentioned in the article that it’s quite hard to find Spatial Audio tracks within Apple Music. This is a huge bugbear of mine and so I recently updated an app of mine, Music Library Tracker, with some new features around Spatial Audio. The app was originally designed to help notify you when Apple changes your music (i.e. if a song is deleted due to licensing changes, etc) but it can now scan your library and show you which tracks you have that are available in Spatial Audio along with creating a playlist in Apple Music containing only those tracks. It can then keep monitoring your library and send you notifications as and when new tracks are updated.

The rest of the email is similar to the initial one above

I received a reply a few days later and then after 2 weeks an article appeared.

In addition to the sites I reached out to, a few sites published articles organically.

I’d like to give a big thank you to all of the people who did get back to me or wrote about the app - I’m very grateful! However, the experience of doing this is easily the worst part of being an independent app developer. I absolutely hate having to hawk the app around and then have the long period of waiting and hoping for an article to appear. I always try and craft my emails to be very specific to something the site has covered before or to provide some kind of story so it’s a bit easier to form a narrative other than “please talk about my app”. It’s incredibly disappointing when you don’t even get an email back. As I hated doing it, I’d typically send an email and then think “that’ll do” and by the time I realised a site wasn’t going to pick it up then the launch window had passed and it felt even more awkward to email in (especially as it had already been covered by 9to5mac so other sites could have potentially already seen that article and not wanted to cover something which is now old news).

A few things I should have done differently:

  1. I should have contacted people before the launch of the app rather than afterward. I don’t like contacting anyone before Apple have approved an app as that can lead to all sorts of problems. I’d already publicly committed to a date and didn’t give myself much room between approval and release so just sent the messages out post-launch. In an ideal world, I should have had a week or even two with the app approved within which I could have sent out promo codes or TestFlight invites so the app could be reviewed and embargoed. That would lead to a much bigger “splash” and also avoids the issue of sites potentially not wanting to promote an app that has already been promoted elsewhere.

  2. I should have written to more sites rather than just the ones I typically read. I did do some research to find sites that had talked about Spatial Audio (as I wanted some kind of an “in” when writing to someone who’d never heard of me before) but I probably should have just gone with a scattergun approach to anybody that is even vaguely app adjacent.

  3. I had no idea if the promo codes I was sending out were being used so couldn’t really tell if my emails were getting through. Once you’ve generated a promo code within App Store Connect, the only way to see if it has been redeemed or not is to try and redeem it (which is obviously not a good idea). I could easily just provide a link to my site which, when accessed, gives out a promo code and can then tell me that has happened but it just doesn’t sit right with me and I’d be afraid it would be something that would put people off.

  4. I should have followed up with the sites that didn’t reply to me. I did that with 9to5mac which definitely paid off but I felt more comfortable doing that as it seemed clear there was a technical error; sending a “sorry but did you get my email?” shouldn’t really be anxiety inducing but I couldn’t bring myself to do it.

If you are a writer for a tech site with any insight or a developer that has had any success stories with this then I’d absolutely love to hear from you!

When you’re looking at ways to promote an app, getting featured by Apple on the App Store is obviously a high priority goal. There have been several articles recently about using the dedicated form on the Apple Developer website with the key takeaway being to submit the form for every app update.

I have never used this form before, mostly because my apps tend to either be very niche or are something like this app which I’m always somewhat surprised makes it through App Review in one piece 😆. However, I did use it and something unexpected happened… I got an email from the Apple Services Performance Partnership Team³:

We are currently recruiting new partners to promote the latest of Apple’s products to join the programme: Apple MusicKit.

MusicKit lets users play Apple Music and their local music library from your app or website. So, when your users provide permission to access their Apple Music account, they can use your app or website to create playlists, add songs to their library, and play any of the millions of songs in the Apple Music catalog! If your app detects that the user is not yet an Apple Music member, you can offer a trial membership from within your app. The Apple Music Affiliate Program allows you to earn a commission on eligible referred Music memberships (new sign-ups only)! You can find more detailed information here as well as in the document attached.

We have noticed that you already use the Apple Music API and we believe adding in MusicKit would be an easy process for you and a great benefit! We offer generous compensation models and would like to talk you through this opportunity in more detail.

Please let us know your avails, so we can go ahead and schedule a call with you. 😊

I did take the call⁴ and it is effectively outreach to try and get developers to promote Apple Music within their apps in exchange for a commission on any new subscriptions. You can already apply for this directly but I guess Apple saw that I was using MusicKit on the form I filled out and so set this up. Unfortunately it’s not really a good fit for this app (you’re likely not using it if you don’t have Apple Music) but it may be useful for another app I have in the pipeline in which I’d already added the “Subscribe to Apple Music” interstitial that this hooks into.

Going back to the form, the app has not been featured anywhere on the App Store but I had very little expectation of that happening.

App Store Ads

I took a look at promoting the app using Apple Search Ads and found that it was recommending a suggested “Cost-per-Install” of £5.61. This is not ideal bearing in mind the app cost £2.49 at the time 🤣

After I posted that on Twitter the developer of the excellent Marvis Pro music app, Aditya Rajveer, reached out and said “It almost never reached the suggested amount per install for my app, not even close”. That pushed me to give it a try and they were right! I’ve had it running for a few weeks now and have had 22 installs on an average Cost-Per-Install of £0.89. That’s not exactly setting my sales alight but it’s better than nothing. On a more positive note, I’m not actually being charged for these installs as I have a promotional balance apparently. I seem to remember I claimed a free $100 of advertising years and years ago so evidently that is still in use 🤷🏻‍♂

App Store In-App Events

I created an In-App Event on the App Store to coincide with the release of the update which ran for 1 week:

The irony is that you can't listen to Spatial Audio on those headphones but it was the only decent royalty-free image I could find...

This had 4700 impressions leading to 9 downloads and 24 app opens. Again, not terribly exciting but extra sales are extra sales.

Other Promotions

I obviously promoted the app on my own Mastodon and Twitter accounts but I also tweeted about it on the @NewSpatialAudio account which I believe led to the article on Tom’s Guide. There’s also my newsletter and my website which mentioned the app. Finally, it was mentioned in both the Indie Dev Monday and SwiftlyRush newsletters.

So what actually worked?

App Store Connect provides a metrics panel which roughly details where your downloads have come from. Rather astonishingly, it turns out that 43.4% of all my downloads in the past month came from “App Store Browse”. This is followed by “Web Referrer” at 28.3%, “App Store Search” at 13.4%, and “App Referrer” at 12.8%.

If I dig into that a little more I can see that most of the app referrer traffic was either Facebook, Google, or Google Chrome (so likely clicking on links from one of the published articles). With web referrer, the vast majority is 9to5mac.com followed by my own Dodo Apps website. Everything else is single digits.

My assumption is that the 9to5mac article created enough downloads to catapult the app up the Paid App charts and it was there that it was discovered by those just browsing the App Store who then made up the majority of my sales. This seems incredibly backwards to me as I’d assume the technical readership at whom this app is aimed would make up the majority of downloaders, but I suspect that with the billions of iOS devices in the world even a fractional percentage of users browsing the App Store is going to be magnitudes larger than the number of followers that the tech sites have.

In terms of next steps, I’m at a slight loss as to what to do as I don’t have any big splashy features that would merit the coverage that is clearly key to increasing the number of downloads. Having looked at what other developers are doing, it looks like I should try finding an influencer on TikTok but I know absolutely nothing about that world. I could also look at direct advertising on some of the tech sites or podcasts that would be relevant but doing so is likely going to be thousands of pounds worth of investment and feels like a bit of a gamble given this is a low-cost paid app rather than a subscription based service that can recoup large advertising costs over months of later usage.

If you’ve got any thoughts or insights then I’d love to hear from you. I’d also love it if you downloaded the app 😉

  1. You can download my historic monthly breakdown if you’re interested. With the change from a 70/30 split to an 85/15 split for the last 2 years, the actual amount I’ve given to Apple over the past 7 years has been around 26% leaving an average monthly profit of $59.83. ↩︎

  2. I don’t use any analytics in my apps so I couldn’t see any realtime usage information. ↩︎

  3. It definitely came as a result of submitting that form as the email was sent to my personal address which I’d used on the form, not my Apple Developer account email address. ↩︎

  4. I nearly didn’t as they inexplicably used Microsoft Teams 🤣 ↩︎

Side Project: Back Seat Shuffle

This is part of a series of blog posts in which I showcase some of the side projects I work on for my own use. As with all of my side projects, I’m not focused on perfect code or UI; it just needs to run!

If I’m going on a long drive with my two young children, I’ll load up an iPad with some videos and stick it in a pouch on the back of a seat to keep them entertained. Initially this started as a few films and a couple of their TV series on a USB-C stick but I’ve gradually started putting a few shows directly onto the iPad so they can be played via VLC. Why? Well, when using an external drive you’re limited to using the Files app which uses Quick Look for video playback; this is fine for a film but for TV you have to go and start a new episode after the previous one finishes (and that involves my wife precariously leaning into the back without a seatbelt which isn’t ideal). I moved to using VLC for TV shows as they then play sequentially, avoiding that problem, but it can’t play from an external drive so I have to put things directly onto the limited storage of the device.

For a couple of weeks I’ve been toying with the idea of whether I could build a better app, one that would let me:

  • Plug in an external drive
  • Show each series with a nice image
  • Play episodes randomly without needing to copy the video to the device

After a 3 hour drive to visit my mother, the priority for this has now increased exponentially 😂

To begin with, I needed to know if it is even possible to view external files within an app on iOS. It is, and has been since UIDocumentPickerViewController gained support for picking directories in iOS 13, however the documentation left me a little confused:

Both the open and export operations grant access to documents outside your app’s sandbox. This access gives users an unprecedented amount of flexibility when working with their documents. However, it also adds a layer of complexity to your file handling. External documents have the following additional requirements:

  • The open and move operations provide security-scoped URLs for all external documents. Call the startAccessingSecurityScopedResource() method to access or bookmark these documents, and the stopAccessingSecurityScopedResource() method to release them. If you’re using a UIDocument subclass to manage your document, it automatically manages the security-scoped URL for you.
  • Always use file coordinators (see NSFileCoordinator) to read and write to external documents.
  • Always use a file presenter (see NSFilePresenter) when displaying the contents of an external document.
  • Don’t save URLs that the open and move operations provide. You can, however, save a bookmark to these URLs after calling startAccessingSecurityScopedResource() to ensure you have access. Call the bookmarkData(options:includingResourceValuesForKeys:relativeTo:) method and pass in the withSecurityScope option, creating a bookmark that contains a security-scoped URL.

External files can only be accessed via a security-scoped URL and all of the tutorials I’d seen online relating to this were demonstrating how you could access a file and then copy it locally before removing that scope. I was therefore unsure how it would work in terms of streaming video (as it would go out of scope and lose security clearance) nor if I’d be able to retain access after displaying a directory and then wanting to start playback.

It turns out that it is all possible using a system known as “bookmarks”. In practice, a user will be shown their external drive in an OS controlled modal view and can select a folder, the URL of which is returned to my app. I then call the “start accessing security scoped resource” and convert that URL to a bookmark which is stored locally on my device and then close the security scoped resource. That bookmark can be used at any point to gain access to the drive (so long as it hasn’t been disconnected in which case the bookmark tells the app it is “stale” and therefore no longer working) and you can then interact with the URL the bookmark provides in the same way as you would with a local file.

func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
    guard let url = urls.first else { return }

    // make sure we stop accessing the resource once we exit scope (which will be as soon as the video starts playing)
    defer { url.stopAccessingSecurityScopedResource() }

    // we don't care about the return value for this as we'll try to create a bookmark anyway
    _ = url.startAccessingSecurityScopedResource()

    // store the bookmark data locally or silently fail
    bookmark = try? url.bookmarkData()

    // try to play the video; if there is an error, display an alert
    do {
        try playVideos()
    } catch {
        let controller = UIAlertController(title: "Error", message: error.localizedDescription, preferredStyle: .alert)
        controller.addAction(UIAlertAction(title: "OK", style: .default))
        present(controller, animated: true)
    }
}

private func playVideos() throws {
    guard let bookmark else { return }

    // get the local url from our bookmark; if the bookmark is stale (i.e. access has expired), then return
    var stale = false
    let directoryUrl = try URL(resolvingBookmarkData: bookmark, bookmarkDataIsStale: &stale)
    let path = directoryUrl.path
    guard !stale else {
        throw BSSError.staleBookmark
    }

    // get the contents of the folder; only return mp4 and mkv files; if no files, throw an error
    let contents = try FileManager.default.contentsOfDirectory(atPath: path)
    let urls = contents.filter({ $0.hasSuffix("mp4") || $0.hasSuffix("mkv") }).map({ URL(filePath: path + "/" + $0) })
    guard urls.count > 0 else {
        throw BSSError.noFiles
    }

    // present the video player with the videos in a random order
    presentPlayer(urls.shuffled())
}

private func presentPlayer(_ urls: [URL]) {
    // set the audio session so video audio is heard even if device is muted
    try? AVAudioSession.sharedInstance().setCategory(.playback)

    // create a queue of player items from the provided urls
    let items = urls.map { AVPlayerItem(url: $0) }
    player = AVQueuePlayer(items: items)

    // present the player
    let playerController = AVPlayerViewController()
    playerController.player = player
    present(playerController, animated: true) {
        self.player?.play()
    }
}

This would also work in other contexts such as local files or even cloud-based services that work with the Files app such as iCloud or Dropbox.

I had originally planned on reading the contents of the USB stick and using a single .jpg file in each directory to render a nice thumbnail. In the end I abandoned that as it would have meant building the whole interface when in fact it works perfectly well just using UIDocumentPickerViewController to pick the show I’m interested in:

Selecting a directory of videos in Back Seat Shuffle

In the end the only extra code I added was to strip out any files that were not in the .mp4 or .mkv format and to have it automatically return to the file selection screen once the full queue of randomised videos had finished.
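
That auto-return can be done with a notification observer; a sketch, assuming the items array from presentPlayer above and a hypothetical endObserver property to keep the observation alive:

// dismiss the player and return to file selection when the final queued item finishes
endObserver = NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: items.last, queue: .main) { [weak self] _ in
    self?.dismiss(animated: true)
}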

Whilst I could potentially put it on the App Store, this is one of those weird edge cases that likely wouldn’t get through App Review as they’ll look at it and say “this is just the Files app” and completely miss the point. As this would be a free app, it’s not worth the hassle of doing screenshots, App Store description, etc, only to have it be rejected by App Review.

The full app code is available on GitHub.

Return to Dark Tower Assistant

Return to Dark Tower is a really cool app-driven board game that comes with a physical tower that you connect via Bluetooth to an iPad. The tower lights up, makes sounds, and spins internally to shoot little skulls you place into it over the outlying map.

You put little skulls in the top of the tower to end your turn...

As much as I love it, there are a ton of cards in the game so you can easily forget what abilities you have available to you or miss crucial triggers at key phases in the game. You can also take most actions in any order so it’s easy to forget what you’ve done in the current turn, especially if you’re playing solo. To that end, I built myself a very niche app to keep track of all of the cards I had and all of the moves I’d made. It’s called Return to Dark Tower Assistant and is optimised for use in an iPad Slide Over panel so you can use it on top of the app you use to run the game:

Running the assistant in a Slide Over panel so you can still access the board game app beneath!

There’s a very slim chance that any of you reading this have a copy of this board game, and even if you do it’s likely only a small minority that would find utility in this app. That said, I think it demonstrates what I’ve said for over a decade about making apps for yourself: build things for yourself on the off-chance that somebody else finds them as useful as you do. I spent far too much time matching the colours to the player boards, getting the fonts just right, and doing things like perfectly embedding the right icons for warrior and spirit tokens, but I did that because I want it to look good when I’m playing. It will likely only get single-digit downloads, but it’s an app I’m proud of.

The assistant is available now on the App Store. You can also read more about it on my Dodo Apps website.

Using a Stream Deck for iOS development

The Elgato Stream Deck is a fun device with 15 LED buttons that can be programmed to do whatever you want through an app that runs on PC and Mac. It was designed for streamers to be able to quickly switch scenes or present overlays, but it has quickly become popular in other areas thanks to its flexible design. I picked one up in 2018 when I was dabbling with streaming but then mostly used it as a control box for my Cessna 152 in Flight Simulator 2020 thanks to FlightDeck. I eventually replaced this with a bigger flight sim setup1, so the Stream Deck was sitting idle until I had the idea to integrate it into my app development workflow.

The Stream Deck running alongside my Mac Studio.

I typically work on multiple projects per day as I have a number of active client projects at any one time along with my own independent apps. This means I often waste time getting set up for each project so my initial idea was to have a single button press to get my workspace configured.

A single button per platform to start a project.

For example, I may want to start a specific Toggl timer, open a Jira board, and open the Xcode project. To do this, I created a single AppleScript file that is opened by the Stream Deck that will look something like this:

tell application "Timery"
	activate
	tell application "System Events" to keystroke "1" using command down
end tell

do shell script "open https://example.com/jira-board/"

tell application "System Events" to tell process "Safari"
	set frontmost to true
	if menu item "Move to DELL P2415Q" of menu 1 of menu bar item "Window" of menu bar 1 exists then
		click menu item "Move to DELL P2415Q" of menu 1 of menu bar item "Window" of menu bar 1
	end if
	set value of attribute "AXFullScreen" of window 1 to true
end tell

do shell script "open ~/Files/Clients/UKTV/iOS/App\\ Files/UKTV.xcodeproj"

In the first block I activate the Timery app and tell it to perform the keyboard shortcut ⌘+n, where n is the project’s position as it appears in Timery’s list. This will start a timer for the project so I can track my time. I typically only use this for clients I’m working with on a large project or with whom I have a regular maintenance contract; for smaller ad hoc work I’ll instead throw an alert to remind me to start a timer manually.

The second block will open a URL in Safari to any website I might find relevant. This is typically a Jira or Trello board but can sometimes be some API documentation, a GitHub issues page, or even a local URL that opens up a list in Things.

The third block is very specific to my hardware setup. I have an ultrawide monitor that I use as my primary display and then a 4K Dell monitor in portrait orientation to the side that I typically use for browsing and iOS simulators. This code tells Safari to move to that portrait monitor and then switch to full screen mode.

The final line opens up the Xcode project. I usually work in fullscreen mode on my primary monitor so it’ll typically move to a new space automatically without me needing to program that in.

With this simple script, I can press a single button to get everything configured. It’s probably only saving me 20 seconds of time but psychologically it lets me jump immediately into a project.

Opening a project directory.

Another minor hassle I encounter on a daily basis is opening up the directory where all of a project’s files are stored. I’ll typically do this if I need to look at some artwork I’ve saved or some documentation, so I have a very simple script to open up the current project directory:

do shell script "open ~/Files/Clients/UKTV/"

This is again a psychological improvement as I hate wasting time digging down through Finder to get to the location I need.

Building and exporting iOS / tvOS apps.

So far I’ve only made minor improvements to my productivity, but this last button saves me a huge amount of time: automated building. Whilst many developers will handle this task with some form of Continuous Integration or with the new Xcode Cloud feature, this typically doesn’t work well for me due to the number of projects I’m involved with at any one time. Instead, I use Fastlane to perform a wide array of tasks at once such as increasing build numbers, pushing to GitHub, building, and uploading to TestFlight.

Here is a typical Fastfile2 for one of my client projects:

# Config
xcode_version = "14.1.0"
targets = ["UKTV", "NotificationServiceExtension"]
git_remote = "origin/main"

# Import shared Fastfile
import "~/Files/Scripts/SharedFastfile.rb"

lane :distribute do

  ensure_git_status_clean()

  xcode_select("/Applications/Xcode-" + xcode_version + ".app")

  shared_increase_version(
    targets: targets.join(","),
    push_to: git_remote
  )

  version = File.read("shared-tmp.txt")
  UI.important(version)

  build_app(
    output_directory: "builds",
    output_name: version
  )

  upload_to_testflight(
    ipa: "builds/" + version + ".ipa",
    skip_submission: true,
    skip_waiting_for_build_processing: true,
  )

  upload_symbols_to_crashlytics(dsym_path: "builds/" + version + ".app.dSYM.zip", binary_path: 'scripts/upload-symbols')

end

To start with I specify the Xcode version I want to use, the targets of the project, and the name of the remote git repository. I then import a Ruby file which I’ll come to shortly.

The only lane is distribute and the first check is to ensure the Git repository is clean. If there are any uncommitted changes, the script will exit out and present an error. I then select the correct version of Xcode3.

The next section includes a shared_increase_version() function which comes from the imported Ruby file:

##
# INCREASE_VERSION
# 
# Prompts the user for a version number. If a new one is provided, update all targets and reset
# the build number to 1. Otherwise, just bump the build number.

private_lane :shared_increase_version do |options|

  # Fetch all targets as comma-separated string and convert to array
  if !options[:targets]
    UI.user_error!("You must provide at least one target in 'targets'")
  end
  targets = options[:targets].split(",")

  # Fetch current version using default Fastlane action with the first target
  version = get_version_number(target: targets.at(0))

  # Prompt for new marketing version
  new_version = UI.input("New marketing version? <press enter to keep it at v#{version}>")

  if new_version.strip == ""
    # No change to version so just increase build number
    increment_build_number() 
  else
    # Loop through each target and increment version number with "versioning" plugin
    # The native 'increment_version_number' action does not work with recent versions of Xcode
    targets.each do |target|
      increment_version_number_in_plist(version_number: new_version, target: target)
    end
    version = new_version

    # Reset the build number to 1 using the default Fastlane action, or increment it when 'alwaysIncrementBuildNumber' is set (shows a warning about ${SRCROOT} but it does work)
    if options[:alwaysIncrementBuildNumber]
      increment_build_number()
    else 
      increment_build_number(build_number: 1)  
    end
    
  end

  # Fetch build number
  build_number = get_build_number()

  # Write the new version number to the shared-tmp.txt file so calling lane can pick it up
  # This is a limitation of Fastlane not being able to return values in a shared lane
  version_string = "v" + version + "-b" + build_number
  File.write("shared-tmp.txt", version_string)

  # Message to the user to show the new version and build number
  UI.success("App Version Updated: v" + version + " (build " + build_number + ")")

  # If there is no git remote to push to, then exit the lane
  if !options[:push_to] || options[:push_to].strip == ""
    UI.success("Skipping git")
    next
  end

  # Commit the version change
  commit_version_bump(message: "v" + version + " (build " + build_number + ")", force: true)

  # Add a git tag in the format "builds/v1.1-b3"
  add_git_tag(includes_lane: false, prefix: "v" + version + "-b", build_number: build_number)

  # Push to specified remote
  git_remote = options[:push_to].split("/", 2)
  remote = git_remote[0]
  branch = git_remote[1]
  push_to_git_remote(remote: remote, remote_branch: branch)
end

I won’t go through this line by line but the basic idea is that it will prompt me to ask whether this is a new version of the app or a new build; if the former, the version is updated to the one specified and the build number set to 1 across all targets; if the latter, then I just bump the build number. Once that is done, a version string is created that looks something like v1.2.3-b2 which I will use later in the workflow; this string is saved to a temporary file so the original Fastfile can reference it.

With the version and build number updated, the script then commits the changes to Git, tags them, and pushes them to the remote branch if one was specified.

The code resumes in the Fastfile with an Xcode build command (which stores the build and its dSYMs in a local directory), an upload to TestFlight, and the uploading of the dSYMs to the Crashlytics service.

With this system in place, I can press one button to have the entire build process execute in the background. This is hugely important to me as I can start work on another project whilst this process plays out; on the Mac Studio I don’t even notice anything is happening as the build process doesn’t come close to maxing out the CPU.

The nice thing about this Fastlane system is that I can make it bespoke for projects that need something a little different. Here, for example, is the file for a Catalyst project I work on:

# Config
xcode_version = "14.1.0"
targets = ["ATPDigital7"]
git_remote = "origin/master"

# Import shared Fastfile
import "~/Files/Scripts/SharedFastfile.rb"

##
# LANES
##

lane :distribute do

  xcode_select("/Applications/Xcode-" + xcode_version + ".app")

  ensure_git_status_clean()

  shared_increase_version(
    targets: targets.join(","),
    push_to: git_remote,
    alwaysIncrementBuildNumber: true
  )

  system("git push helastel HEAD:develop")

  version = File.read("shared-tmp.txt")
  UI.important(version)

  ios_export = "ios-" + version
  mac_export = "mac-" + version

  # Build the macOS app
  build_app(
    catalyst_platform: "macos",
    output_directory: "builds",
    output_name: mac_export
  )

  # Rename the macOS app export to mac-v1.0-b1.app
  FileUtils.mv("../builds/ATPdigital 8.app", "../builds/" + mac_export + ".app")

  # Zip the macOS app and then upload it to S3
  zip(
    path: "builds/" + mac_export + ".app",
    output_path: "builds/" + mac_export + ".zip"
  )
  s3_upload(
    access_key_id: "IMNOTTHATSILLY",
    secret_access_key: "Uhuhuhyoudidntsaythemagicword",
    bucket: "bucketname",
    content_path: "builds/" + mac_export + ".zip",
    name: "clients/bgs/builds/" + mac_export + ".zip"
  )

  # Build the iOS app
  build_app(
    catalyst_platform: "ios",
    output_directory: "builds",
    output_name: ios_export
  )

  upload_symbols_to_crashlytics(dsym_path: "builds/" + mac_export + ".app.dSYM.zip", binary_path: 'scripts/upload-symbols')
  upload_symbols_to_crashlytics(dsym_path: "builds/" + ios_export + ".app.dSYM.zip", binary_path: 'scripts/upload-symbols')

  # Upload macOS app to TestFlight
  upload_to_testflight(
    pkg: "builds/" + mac_export + ".pkg",
    skip_submission: true,
    skip_waiting_for_build_processing: true,
    app_platform: "osx"
  )

  # Upload iOS app to TestFlight
  upload_to_testflight(
    ipa: "builds/" + ios_export + ".ipa",
    skip_submission: true,
    skip_waiting_for_build_processing: true,
  )

end

This one is a lot more involved but the basic steps are very similar:

  1. Set up the configuration that is needed
  2. Select Xcode, check the repo is clean, and bump the version number
  3. Do another Git push to a different repo
  4. Build the macOS version of the app, zip it, and upload it to an Amazon S3 bucket
  5. Build the iOS version of the app
  6. Upload the dSYMs to Crashlytics for both versions
  7. Upload each version to TestFlight

The process typically takes about 20 minutes to run but would take longer if I were doing it manually as there are multiple points that would require user interaction. That I can press one button and have this run seamlessly in the background is of huge benefit to me, especially if I’m doing multiple builds in a single day.

I also have a client that has 6 apps that all come from the same codebase. Again, I can press one button and have all 6 of those apps compiled and uploaded; it also automatically submits each app to the App Store once the builds have finished processing!

I’m planning on extending this further in the future as I create an HTML page for some clients which gives a changelog for each build based on the commit messages. At the moment I do this manually, but I could easily automate that with AppleScript and hook it into this process.

My Stream Deck homescreen

The final thing to mention is the Stream Deck homescreen itself. I have a folder for each project denoted by its app icon which then goes into the start / build commands detailed above. There is also a STOP button that will stop any Toggl timer that is currently running and a sleep button that will turn off the Stream Deck display.

I have been incredibly impressed with the Stream Deck as an input device and think it can be a hugely valuable tool for any app developer who works on multiple projects. All of the above can be achieved by just running an AppleScript file (as that is what the Stream Deck is doing), but I find the tactile nature of the device deeply rewarding and I know there is a ton more I’m going to do with it over time.

  1. The Bravo Throttle Quadrant has physical buttons on it for dealing with autopilot settings and flaps, which were my main use cases with FlightDeck. ↩︎

  2. All of the magic works in the Fastfile but the Stream Deck still needs to be able to start the command so I use an AppleScript to open Terminal and kick off fastlane distribute in the correct directory. ↩︎

  3. I have 3-4 versions of Xcode installed at any one time as different clients will be on different release schedules so I may still need to build something with an older version. I’ll likely change this portion of the script to always use the latest version unless I manually specify one as it’s a pain to edit my script files whenever a new Xcode update comes out. ↩︎

The Dodo Developer

For the past few months I’ve been meaning to start a newsletter; partially as a way of motivating me to finish some of my many projects, partially as a way of building up a mailing list so I can market my own apps better. The priority for this has only increased as the long-term future for Twitter (my main source of referrals to my various apps and websites) has moved to shakier ground1.

Today I finally took that step by creating The Dodo Developer2 on Substack.

I’ve created an initial introductory post going through some of the motivations for this move and listing some of the items on my “To do” list, but the aim is to produce a new issue every 2 weeks.

I like to use my main website for either announcing new releases or providing code-level tutorials; the newsletter will be slightly different in that it will showcase things way before they are necessarily ready for prime time. This has two main benefits: working akin to a 2-week sprint cycle will hopefully motivate me to proceed with the numerous app ideas that are half-finished on my hard drive, and it should generate some much-needed feedback during the development process as I’m planning to invite subscribers to early access betas of new apps and major updates to existing apps. This all begins next week when I’ll be going into detail on the Music Library Tracker upgrade that introduces Spatial Audio matching, as well as showing off a new dice-based game mechanic I’ve created for a “Choose your own Adventure” style game.

As we head into 2023 I’m determined to try and spend more of my time on my independent projects. By subscribing to the newsletter, you’ll be directly boosting my motivation towards that aim and hopefully getting an insight into how a developer forms an idea into a digital reality.

I’ll be landing in your inbox for the first issue next week!

  1. I don’t necessarily share the pessimism of others with regards to Twitter’s future but there is a non-zero chance that it shuts down without warning one night in the near future so best to be prepared. I won’t be moving to any other social media network. ↩︎

  2. I can’t remember if I’ve mentioned my affinity for dodos on this blog before but the reason for the name comes from two things: my surname contains the same letters as “dodo” and when I was at school I loved the dodo character in Disney’s Alice in Wonderland (“you there, stop kicking that mackerel”). When I developed my first piece of software, a database system for the management of choral robes (honestly), I used “Dodo Apps” as the name and it’s kind of stuck! ↩︎

'Chaise Longue to 5K' and porting a tvOS app built with UIKit to iOS, iPadOS, and macOS

A little while ago, my chiropractor recommended that I take up the “Couch to 5K” program in order to improve my fitness in a way that wouldn’t see me literally running before I could walk. It was a huge success and I was able to lose a significant amount of weight and improve my overall health. Unfortunately, the 2020 lockdowns and the birth of a new child meant most of that effort was undone and I found myself once again needing to embark on a gradual increase in exercise.

One of the key features of Couch to 5K is that you do intermittent bursts of running and walking; for example, in the first week you’ll do three runs consisting of alternating 60 seconds of running (x8) and 90 seconds of walking (x7) sandwiched between a 5 minute warm up and cool down. To keep track of this, I used the free NHS Couch to 5K app which tells you what to do at each stage via an audio voiceover which also offers encouragement throughout your run. This worked well for me as I listened to music whilst doing my runs, but nowadays I prefer to run whilst watching TV shows or YouTube videos on an Apple TV in front of my treadmill. For this use case, audio interruption wasn’t necessarily what I wanted, especially as I was already vaguely familiar with the different run timings. Instead, I wanted an app on my Apple TV that could show me my run progress in a Picture in Picture window.

Introducing “Chaise Longue to 5K” (after all, couches are so common):

Frasier, the discerning runner's choice.

The idea is straightforward enough; you open the app to a grid showing all of the available runs1 and then navigate to a fullscreen running page with a timer and coloured blocks that show what you should be doing. This can then be shrunk down into a Picture in Picture window so you can see the critical information whilst you watch something else.
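For illustration, the data behind those coloured blocks could be modelled something like this; a purely hypothetical sketch, as none of these names are from the actual app:

import Foundation

// each run is a sequence of timed segments; week 1 is built below
struct RunSegment {
    enum Activity { case warmUp, run, walk, coolDown }
    let activity: Activity
    let duration: TimeInterval
}

// week 1: 5 minute warm up, 8 x 60s runs alternating with 7 x 90s walks, 5 minute cool down
var week1: [RunSegment] = [RunSegment(activity: .warmUp, duration: 300)]
for index in 0..<8 {
    week1.append(RunSegment(activity: .run, duration: 60))
    if index < 7 { week1.append(RunSegment(activity: .walk, duration: 90)) }
}
week1.append(RunSegment(activity: .coolDown, duration: 300))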

Originally I’d planned to use the new AVPictureInPictureController.init(contentSource:) API that was introduced with tvOS 15.0 as that would allow me to fairly easily render the UIView of the run screen into the PiP window; unfortunately, there is a bug with tvOS which prevents that from working and is still present in the tvOS 16 betas.

My next plan was to have the app render a video of the run on the fly. Essentially I would display the running UI, snapshot it with UIView.drawHierarchy(in rect: CGRect, afterScreenUpdates afterUpdates: Bool), and then pipe the UIImage into an AVAssetWriter at 1 second intervals to generate a video. Unfortunately that proved too intensive for the Apple TV hardware (especially the non-4K model) with the render taking a couple of minutes for each video. However, as I’d already built the render pipeline, I instead updated the system to generate all of the videos sequentially and store them; I then ran that in the tvOS Simulator to generate the 12 videos2 and then bundled them in the app. Much easier 🤣
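To give a flavour of that render pipeline, here is a heavily simplified sketch; the function names and output settings are illustrative rather than the app’s actual code:

import AVFoundation
import UIKit

// snapshot a view once per second of run time and append each frame to an AVAssetWriter
func renderRunVideo(of runView: UIView, totalSeconds: Int, to outputURL: URL) throws {
    let size = runView.bounds.size
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: size.width,
        AVVideoHeightKey: size.height
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input, sourcePixelBufferAttributes: nil)
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for second in 0..<totalSeconds {
        // ...update runView's timer label and coloured blocks for this second...
        let image = UIGraphicsImageRenderer(size: size).image { _ in
            _ = runView.drawHierarchy(in: runView.bounds, afterScreenUpdates: true)
        }
        while !input.isReadyForMoreMediaData { usleep(1000) } // crude back-pressure, fine for a sketch
        if let cgImage = image.cgImage, let buffer = pixelBuffer(from: cgImage, size: size) {
            adaptor.append(buffer, withPresentationTime: CMTime(value: CMTimeValue(second), timescale: 1))
        }
    }

    input.markAsFinished()
    writer.finishWriting {}
}

// draw a CGImage into a CVPixelBuffer that the adaptor can consume
private func pixelBuffer(from image: CGImage, size: CGSize) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    let attributes = [kCVPixelBufferCGImageCompatibilityKey: true,
                      kCVPixelBufferCGBitmapContextCompatibilityKey: true] as CFDictionary
    CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height), kCVPixelFormatType_32ARGB, attributes, &buffer)
    guard let buffer else { return nil }
    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                            width: Int(size.width), height: Int(size.height),
                            bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    context?.draw(image, in: CGRect(origin: .zero, size: size))
    return buffer
}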

The final part was to add a button to the top of the run selector page that would show you your next run. To do this, I store the week and number of the last run that was completed3 within NSUbiquitousKeyValueStore; this is a similar API to UserDefaults with the advantage that it is synced through the user’s iCloud account meaning it’ll survive reinstallations or switching to a new Apple TV without restoring from backup.
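As a minimal sketch of how that could look (the key name and the week/run encoding here are illustrative):

import Foundation

let store = NSUbiquitousKeyValueStore.default

// record the week and run number once the user reaches the cool down
func markRunCompleted(week: Int, run: Int) {
    store.set("\(week)-\(run)", forKey: "lastCompletedRun")
    store.synchronize() // ask the system to push the change to iCloud soon
}

// read the last completed run back, e.g. to populate the "next run" button
func lastCompletedRun() -> (week: Int, run: Int)? {
    guard let value = store.string(forKey: "lastCompletedRun") else { return nil }
    let parts = value.split(separator: "-").compactMap { Int($0) }
    guard parts.count == 2 else { return nil }
    return (parts[0], parts[1])
}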

However, that led to an interesting idea. Could I port this to other platforms? And if I could, would I be able to do it in a single day?

Yes.

Despite using UIKit rather than SwiftUI, I was able to port everything over to iPhone, iPad, and Mac within 5 hours or so. I started by rejigging the project files so shared code would be separate from the xib files I use for the interface. I then added a new target for iOS and went through the laborious process of recreating the xib files; unfortunately, tvOS and iOS xibs are incompatible, to the point that you can’t even copy and paste standard elements like UILabel between them.

The design was such that it was quite easy to make it work for iPhone. The run page itself just needed some font size adjustments whilst the grid view showing all of the runs had some stack views tweaked so they were shown vertically rather than horizontally.

The next step was to optimise the design for iPad. Again, this mostly worked out of the box as I use AutoLayout for everything. I just needed to monitor trait changes and update the code to render slightly differently depending on whether we were in compact or regular width mode. This had the nice side effect of enabling the three column layout on an iPhone 13 Pro Max in landscape and also working across the various split screen views that are available on iPad.
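That monitoring boils down to something like this; a minimal sketch assuming a UIViewController subclass and a hypothetical updateLayout(forCompactWidth:) method:

override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
    super.traitCollectionDidChange(previousTraitCollection)
    // only rebuild the layout when the horizontal size class actually changes
    if traitCollection.horizontalSizeClass != previousTraitCollection?.horizontalSizeClass {
        updateLayout(forCompactWidth: traitCollection.horizontalSizeClass == .compact)
    }
}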

Finally, I checked the box for Catalyst support for macOS and was surprised to find that everything pretty much worked out of the box. I only needed to add the following code to get the app looking just fine on the Mac:

#if targetEnvironment(macCatalyst)
    if let titlebar = window?.windowScene?.titlebar {
        titlebar.titleVisibility = .hidden
        titlebar.toolbar = nil
    }
    window?.windowScene?.sizeRestrictions?.minimumSize = CGSize(width: 1024, height: 768)
#endif

That code effectively hides the toolbar so the traffic light window management buttons blend into the app view, and then restricts the minimum view size to that of a regular iPad so you can’t break the layout4.

With that done, I then went through the app and added a few quality of life improvements such as a native menu bar on the Mac, keyboard shortcuts for Mac and iPad, and the ability for PiP to automatically engage when you leave the app during a run on iPhone and iPad.
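That last item is a single opt-in property on AVPlayerViewController (available from iOS 14.2), assuming playback goes through a controller like the one sketched earlier:

// start PiP automatically when the app is backgrounded mid-playback
playerController.canStartPictureInPictureAutomaticallyFromInline = true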

SwiftUI would undoubtedly have made the UI faster to port, but I still think the platform is too immature for full app development. As Richard Turton put it:

SwiftUI allows you to move so incredibly fast that by the time you realise what you want isn’t yet possible, you’re already off the edge of the cliff, like Wile E Coyote

That certainly matches my experience 🤣. Whilst it can be a phenomenally quick tool for building UI, it can’t quite match the smooth experience that users expect when it comes to the small but crucial details.

In Conclusion

As I’ve said many times before, one of the great joys of being a software developer is that you can build apps bespoke for your own needs and interests. I’ve massively enjoyed having Chaise Longue to 5K on my Apple TV whilst doing my runs, but I also really enjoyed the challenge of porting the app across to all the other Apple platforms that support Picture in Picture5. As ever, there are a number of small details that I’d like to highlight:

  • Whilst adding extra platforms didn’t take that long from a development point of view, it massively increased the amount of time spent preparing for App Store submission as I had to create a lot more screenshots, text, etc. Trying to work out the best way to show PiP on macOS was an interesting challenge! It was also difficult to work out a way to show it on tvOS without using copyrighted video content.
  • A nice side effect of using the same bundle identifier for all three versions of the app (Apple TV, iOS/iPadOS, Mac) is that if you buy it on one platform you get it on all of them! As I’m selling it for £1.79 currently, that makes it a pretty sweet deal…
  • I spent quite a lot of time on a little animation effect that happens when you first complete a 5K run. Confetti in front of and behind a real time blur that animates seamlessly. Beautiful.
  • I’m really pleased with how the Apple TV app icon came out as I’m always a big fan of the parallax effects you can create. I found a very cheap designer on Fiverr who created a first pass and then I tweaked it to match my needs; I can’t draw to save my life but I can recompose and recolour objects pretty well!
  • I was amazed that all of the Picture in Picture stuff just worked across all the platforms; I didn’t have to change any code whatsoever and whilst the videos were rendered from the tvOS app UI they look good on all platforms.
  • Thanks to using NSUbiquitousKeyValueStore coupled with .didChangeExternallyNotification, completing a run on one device will see the UI automatically update on all other devices within seconds (see the sketch below). No 3rd party frameworks or private data collection required!
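A minimal sketch of that observation, where refreshRunList() is a hypothetical method that re-reads the store and updates the grid:

NotificationCenter.default.addObserver(
    forName: NSUbiquitousKeyValueStore.didChangeExternallyNotification,
    object: NSUbiquitousKeyValueStore.default,
    queue: .main
) { _ in
    // another device changed the store (e.g. completed a run), so update the UI
    refreshRunList()
}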

I’d love it if you would give Chaise Longue to 5K a try. It’s available now on tvOS 14 and above, iOS / iPadOS 14 and above, and macOS Big Sur (11.0) and above. One low price unlocks it across all platforms.

  1. Here’s a rough drawing I did in Notes compared with the final product. ↩︎

  2. Even though there are 27 runs, that only equates to 12 videos as most weeks the three runs are identical so they can use the same video. Generating those on my M1 Ultra Mac Studio took less than 3 minutes and means I can easily update them should I want to update the UI in future. Each video is rendered at 720p and weighs in at around 3MB, leaving the overall app size at under 40MB. ↩︎

  3. Which is defined as getting to the cool down section. ↩︎

  4. Views in Catalyst always seem to be of the “Regular / Regular” size regardless of what tiny windows you create so it isn’t possible to have the view seamlessly change between iPad and iPhone style sizes when resizing hence the need for a sensible minimum size. ↩︎

  5. I did not bother porting the app to Apple Watch as there are loads of Couch to 5K apps that will serve you better on that platform; this app is predominantly about the Picture in Picture experience. ↩︎

Introducing the Spatial Audio Finder

I love Spatial Audio. The sound quality and the balance of the individual audio elements is truly extraordinary and a huge advantage for Apple Music over other streaming services. But the process of finding supported tracks is… well, a bit rubbish.

At the moment, there are 3 ways to discover Spatial Audio tracks:

  1. Using Apple’s own curated category which features a number of playlists and a rotating list of songs that have been added to the service. This is only a very small subset of all of the tracks though and requires a lot of scrolling and trying to work out what is new and what isn’t.

  2. Going to an album page on Apple Music and seeing if it shows the Dolby Atmos logo. This only works if the entire album is available in Spatial Audio; it can’t tell you if individual tracks are available (something which my Apple Music Artwork Finder can do!)

  3. Playing songs in your library; if they have Spatial Audio, it’ll kick in and show a Dolby Atmos logo. This obviously only works on a song-by-song basis and is therefore very slow.

Now there is a better way. Introducing the Spatial Audio Finder, the quickest way to see which tracks by your favourite artists are available in Spatial Audio:

So how does this work? At WWDC 2022, Apple made a small change to the Apple Music API such that it is now possible to request audioVariants as an extended attribute on songs and albums. Within this are such things as Lossless Audio, Dolby Audio, and Dolby Atmos (which denotes Spatial Audio). Whilst I would like to be able to just see Dolby Atmos as a flag on songs within my library, this new API method can at least get me the information via a fairly laborious process. Essentially, I scrape these data points and build up my own database which I can then query very quickly.
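As a rough sketch of the kind of request involved (the storefront, song identifier, and developer token below are placeholders, and the real system does this server-side rather than in an app):

import Foundation

// fetch a song from the Apple Music API with the extended audioVariants attribute;
// a result containing "dolby-atmos" indicates Spatial Audio support
func fetchAudioVariants(songID: String, developerToken: String) async throws -> [String] {
    let url = URL(string: "https://api.music.apple.com/v1/catalog/gb/songs/\(songID)?extend=audioVariants")!
    var request = URLRequest(url: url)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    let (data, _) = try await URLSession.shared.data(for: request)

    struct Response: Decodable {
        struct Song: Decodable {
            struct Attributes: Decodable { let audioVariants: [String]? }
            let attributes: Attributes
        }
        let data: [Song]
    }
    return try JSONDecoder().decode(Response.self, from: data).data.first?.attributes.audioVariants ?? []
}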

To begin with, I built a system which would accept an Apple Music identifier for a song; it would then fetch the album that song belongs to in order to get a full list of all the tracks along with whether they support Spatial Audio or not1. I then store this information in my database along with a “last checked” flag. My script runs continuously and will check to see if there are any songs present that have not been checked in the past 2 weeks and do not have Spatial Audio. If a song or album gains Spatial Audio status, then I tweet it via a new account: @NewSpatialAudio2.

With this in place, it becomes trivial to build up the database as I just need to throw in a load of song identifiers and the script will churn away fetching all of the information, expanding it out to more songs as it gathers each entire album. I began by importing my entire music library, which was relatively simple thanks to my Music Library Tracker app allowing me to collect all of my song identifiers in a matter of milliseconds. I have 7508 songs in my library but as many of them are single tracks from albums, my script expanded this out to over 16000 tracks (of which around 1000 had support for Spatial Audio).

This is obviously skewed to my musical preferences so the next step was to add the various Spatial Audio playlists that Apple curates. I’ve stored the identifiers of all of their playlists from “Made for Spatial Audio” and “Hits in Spatial Audio” to “Jazz in Spatial Audio” and “Bollywood in Spatial Audio”. These playlists helpfully have a “last updated” flag on them so I check them frequently but only fetch all of their track identifiers if they have changed. This added another 20000 tracks of which most were compatible with Spatial Audio3.

At this point I was able to see which songs in my library had been updated to Spatial Audio, spot new releases, and find out when my tracks got upgraded thanks to the @NewSpatialAudio account. As every change thus far had been tweeted, it was possible to search Twitter for specific artists to see what songs or albums were compatible by doing something like from:newspatialaudio "avril lavigne"4. Unfortunately, it turned out this only worked when I was logged in as @NewSpatialAudio and results were mixed when searching from other accounts. I don’t know if this is due to spam protection or some form of caching, but it meant there was a need for a new tool: the Spatial Audio Finder.

Creating the Spatial Audio Finder was relatively easy as I had all the building blocks in place. You enter an artist name and hit search, then I look up all the tracks in my database and list the songs that have been updated. In the end it took a bit longer as I realised I’d want to have the album artwork and track numbers on the page, and I wasn’t currently collecting that information; this would need to be added to my data, necessitating a full re-fetch of the nearly 40000 tracks. I also decided that it was likely people might search for an artist that was not in my database. To remedy that, if a search is made and there are zero results, I go and fetch the top 25 songs for that artist on Apple Music and add their identifiers to the database, which will typically expand out to their most popular albums which are the likeliest candidates for upgraded audio5. In this way, the more that people use this tool, the more Spatial Audio tracks will be discovered.

I hope that the Spatial Audio Finder will be useful to many people, but this is just a stopgap solution. My ultimate goal is to be able to scan your music library, show you the tracks that have been updated to Spatial Audio, and then go a step further and generate a Spatial Audio playlist for you that gets updated automatically as new songs get upgraded on the service. The first step of this will be happening very soon as I release a new version of my Music Library Tracker app that will allow you to opt in to uploading your library to the Spatial Audio database; the next step after that will be showing you which tracks in your library have been updated! This will in turn expand the musical variety being placed into the database and showcase more Spatial Audio tracks. Eventually, I should have the most complete record of Spatial Audio tracks outside of Apple and also the fastest and most useful ways of accessing that data.

If you run into any issues, please do contact me so I can improve the service as much as I can.

  1. For example, if I fetch the song “This Love” by Maroon 5 (which has the identifier 1440851719), then that will give me the full album (identifier 1440851650) along with all the tracks so I don’t need to check each track individually. ↩︎

  2. The account will differentiate between individual tracks on an album and full albums that support Spatial Audio. I also distinguish between old tracks being upgraded to Spatial Audio versus new releases by checking if the release date was in the past 2 weeks or not. ↩︎

  3. They aren’t all compatible as there might be a single song in a playlist which is the only Spatial Audio song on an album; I fetch the entire album so I can monitor if other tracks get added over time. ↩︎

  4. Yes, “Sk8er Boi” is available in Spatial Audio. ↩︎

  5. I can’t believe there isn’t a single Michael Jackson song rendered in Spatial Audio yet (although you can get I Want You Back and ABC by the Jackson 5). ↩︎

Unlisted App Distribution on the App Store

Back in 2015 I was commissioned to rebuild an iOS app for Models 1, the leading modelling agency in Europe. The app performed double duty, acting as a personal portfolio for models within the agency and a collection of all of the portfolios for the agents. As it was only for staff and models, the app was distributed via an Enterprise certificate which allowed me to generate a single .ipa and produce a website where users could download the app directly to their devices. Over the years, this process became slightly more arduous as the user would need to manually approve the enterprise certificate within their device settings, but it worked.

A little while later I also built a small app that allowed talent scouts to choose multiple photos and fill out some details in a form which was then compiled into an email. This, again, was distributed via an Enterprise account.

Fast forward to 2022 and my client had allowed their developer account to lapse. Upon trying to renew, they were told by Apple that they no longer met the current program requirements and that they should seek to distribute via Apple Business Manager, Ad Hoc Distribution, TestFlight, or the App Store. Apple Business Manager would not have worked well as it is essentially a full MDM system whereby the client would need to manage all devices (which would not be suitable for the models). Similarly, Ad Hoc Distribution would be a pain as we’d need UDIDs for every device we wanted to distribute to, and TestFlight would require sending out new builds every 90 days; the App Store, being public, would not be an option given the niche aspect of the app.

Luckily I’d remembered that Apple had announced a new “unlisted” status back in January which would allow you to upload your app to the App Store but make it available only via a direct link, kind of like the public link system within TestFlight1. I browsed around App Store Connect but could find no mention of it within the “Pricing and Availability” tab which only allowed me to go down the business route. It turns out you have to apply for unlisted status via a form:

You’ll need to submit a request to receive a link to your unlisted app. If your app hasn’t been submitted for review or was already approved for public download on the App Store, simply complete the request form. If your app was already approved for private download on Apple Business Manager or Apple School Manager, you’ll need to create a new app record in App Store Connect, upload your binary, and set the distribution method to Public before completing the request form.

To start with, I uploaded a build and filled out all of the metadata on App Store Connect including screenshots, description, review notes, etc. I then attempted to fill out the form but was denied access, despite being an admin, as it is only available to the account holder. For reference, the questions they ask are:

  • App Name
  • Apple ID of App
  • Describe in detail the business problem your app solves
  • Why do you prefer to distribute this app unlisted on the App Store instead of privately to specific organizations on Apple Business Manager or Apple School Manager?
  • What is the estimated size of your app’s user base?
  • Who is the audience for your app?
  • In what regions will your app be available?

My client filled this out and within 24 hours they had a reply:

After careful review, we regret that we’re unable to approve your request for unlisted app distribution at this time.

In order to evaluate your app, please complete all the required metadata in App Store Connect, including app name, description, keywords, and screenshots and submit for review.

This was a surprise as everything was set up on App Store Connect. After another email back and forth, Apple replied with:

I will reiterate the note below, and say that all unlisted apps must be submitted for App Store review.

So it turns out you do need to still submit the app via App Store Connect but also fill in a request form. Obviously we did not want the app to go live publicly so I set everything to “Manual release” and then added an extra line to the review notes that read:

IMPORTANT: We will request “unlisted” status for this app, it is not for general release but we were told by unlistedapp_request@apple.com that we needed to submit this to App Review before requesting this status.

I submitted the app and my client filled out the request form again. The next day, I received notification that both apps had been rejected by App Review for Guideline 2.1 - Information Needed:

We need more information about your business model and your users to help you find the best distribution option for your app

Please review the following questions and provide as much detailed information as you can for each question.

  1. Is your app restricted to users who are part of a single company? This may include users of the company’s partners, employees, and contractors.
  2. Is your app designed for use by a limited or specific group of companies?
    • If yes, which companies use this app?
    • If not, can any company become a client and utilize this app?
  3. What features in the app, if any, are intended for use by the general public?
  4. Identify the specific countries or regions where you plan to distribute your app.
  5. How do users obtain an account?
  6. Is there any paid content in the app? For example, do users pay for opening an account or using certain features in the app?
  7. Who pays for the paid content and how do users access it?

You’ll note that a lot of this is similar to the form that was already filled out 🤔

I was out of the office for the day so had not gotten around to replying to these questions when, out of the blue, I got two notifications to say the apps had been approved for the App Store! It looks like App Review had rejected the apps as they were not suitable for App Store release (hence the questions above) but had completely ignored my note about the unlisted status. Then, later that day, the team that deals with unlisted apps looked at them and approved them.

When I went back to App Store Connect, it now showed a new “Unlisted App URL” within the “App Distribution Methods” on the Pricing and Availability page2.

A few thoughts:

  1. This process is not great and I’m not sure why they’ve decided to split it across two separate systems. It would surely make more sense to have “Unlisted” be an option within App Store Connect and then App Review can have the correct team contact you to ask whatever questions they need, especially as the distribution is done through App Store Connect.
  2. The documentation clearly says you only need to fill out the form if the app hasn’t been submitted for review. I’m not sure why we needed to submit the app only for it to be rejected by a different team?
  3. The “Unlisted App URL” you get is actually the same as what the public URL would be. This strikes me as odd, especially as you can convert an already public app into an unlisted one (though you can’t go back to being public). Those URLs are going to be cached by search engines, etc., so it seems like a bit of a flaw. This wasn’t an issue for us but it’s worth mentioning.
  4. Viewing the app on the App Store is exactly the same as if you’d shared any public URL; the store page is identical with screenshots, description, ratings, App Privacy, etc. There is no indication whatsoever that this is a private page.

Overall I like that this system exists and that there is a way to get apps to a very specific small audience without paying 3x the cost for an enterprise account or going through the Apple Business Manager. I was also pleasantly surprised that the very basic app for talent scouts was approved as I did not think it would be; it looks like the App Store Guidelines are much less strictly enforced than they are for a public release3. However, there are clearly some teething problems that need to be ironed out.

In any case, if you’re looking to get an unlisted app on the App Store, just be aware that a rejection may not necessarily be all it seems!

  1. Similar, but App Store apps don’t typically expire so a single build would be enough until iOS changes necessitate another build. ↩︎

  2. As the app was still set to “Manual release” I had to press the release button. I was a bit nervous of this as the language is still “Are you sure you want to release this to the App Store” but it did not make it publicly available; once you have the unlisted URL there is no way to make the app public. ↩︎

  3. Or we got lucky 😂 ↩︎
