Retiring TubeUpdates and WikiLocation

Today I’m sad to announce that both TubeUpdates.com and WikiLocation.org have been retired.

TubeUpdates

Started in 2009 as part of the Guardian Hack Day 2, TubeUpdates.com was an API that let you find out the status of each of the London Underground lines. It worked by screen scraping the TfL website every minute of the day, and over the past 5 years it stored data on every status update.

Back in March of this year, the API stopped fetching new data due to changes to the TfL website. As TfL were now providing their own richer API, I announced it would be retired in May.

The API will now return a 404 error for all endpoints. I’d recommend checking out the TfL API as a replacement for live status information. If you are interested in historical information, you can download a MySQL dump of all of the status updates I retrieved during the 5-year period.
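If you just need something to point your code at in the meantime, here’s a minimal sketch of fetching live tube statuses from TfL’s API. The endpoint and field names are my assumptions from their public documentation rather than anything TubeUpdates exposed, so check their docs before relying on it.

    # Sketch only: the endpoint and field names are assumptions based on TfL's
    # public API documentation, not anything TubeUpdates itself exposed.
    import json
    import urllib.request

    TFL_STATUS_URL = "https://api.tfl.gov.uk/Line/Mode/tube/Status"

    response = urllib.request.urlopen(TFL_STATUS_URL)
    lines = json.loads(response.read().decode("utf-8"))

    for line in lines:
        # A line can carry several concurrent statuses, e.g. "Good Service"
        statuses = ", ".join(s["statusSeverityDescription"] for s in line["lineStatuses"])
        print("{0}: {1}".format(line["name"], statuses))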

WikiLocation

I built WikiLocation in 2010 while working on an app that needed to fetch nearby Wikipedia articles. As it turned out, I never got around to building that app, but the API took off and has been used around the world. At the time, the only alternative was an unofficial API for geocoded articles that was woefully out of date. I was recently informed (quite rudely) by a Wikipedia staff member that they now have their own API.

I’ve been meaning to retire WikiLocation for a while, but issues with my hosting company this week have accelerated things: automatic updates they applied have killed my infrastructure. WikiLocation was the last site on the old hardware, and the time it would take to rewrite it and move it to new hosting is time I don’t have, especially considering that donations from those who have used the service haven’t covered a single week of the hosting I’ve been paying for over the last 4 years.

The API will now return a 404 error for all endpoints. You may want to take a look at the official API from Wikipedia (I haven’t looked into it myself), or you can download the latest MySQL dumps (in 38 languages) of the data I retrieved. Alternatively, you can take a look at the original Python script that crawls the Wikipedia linking tables to build your own scraper.
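As a rough starting point, this is what a nearby-articles query against the MediaWiki geosearch endpoint looks like. It’s a sketch only; I haven’t checked how closely it matches what WikiLocation offered, so treat the parameters as assumptions.

    # Sketch only: the geosearch parameters are assumptions based on the
    # MediaWiki GeoData documentation; I haven't checked feature parity
    # with WikiLocation.
    import json
    import urllib.parse
    import urllib.request

    def nearby_articles(lat, lon, radius_m=1000, limit=10, lang="en"):
        params = urllib.parse.urlencode({
            "action": "query",
            "list": "geosearch",
            "gscoord": "{0}|{1}".format(lat, lon),
            "gsradius": radius_m,  # metres, capped by the API
            "gslimit": limit,
            "format": "json",
        })
        url = "https://{0}.wikipedia.org/w/api.php?{1}".format(lang, params)
        response = urllib.request.urlopen(url)
        data = json.loads(response.read().decode("utf-8"))
        return data["query"]["geosearch"]

    # e.g. articles within 1km of Trafalgar Square
    for article in nearby_articles(51.508, -0.128):
        print(article["title"], article["dist"])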

It is always a shame to have to terminate a service, and I apologise for any inconvenience this causes given that there was no deprecation period. However, it just isn’t feasible for me to rewrite the entire API for new hosting when there are other alternatives available.
