Unfortunately my day job doesn’t involve much of the new technology I see talked about, but I do like to keep up where I can. Since the start of the year I have been scratching the surface of MQTT, MongoDB and REST in my free time. I have an existing project which I’m updating, and thought it might be worth jotting down a few thoughts as I go, probably over several posts. In this first post, I’ll cover a bit of background and outline ‘the plan’, with future posts getting more technical as appropriate.

The Project

I’ve been tinkering with Android development for a few years now, as and when my spare time has allowed, dipping in and out depending on other commitments. This culminated in the release of an app (Acca Buddy) to Google Play last year. Basically it’s a live football scores app, targeted at users who enjoy a specific type of sports bet. An ‘accumulator’ (or Acca) is a bet where the user selects several fixtures from a list and predicts their outcomes; the bookmaker then provides odds based on that set of predictions. To win, the user has to get all the predictions correct.

For these specific types of bet, the user is interested in the scores for their own selected groups of fixtures, not the scores of an entire league or competition. A user might also have several ‘accas’ running at one time, some matches may appear in more than one acca, and the same match might even carry different predictions in different accas. Finally, the user needs to know how close each acca is to being correct, and how long is left in the games for that to happen. This is so that they can decide whether to take some action, such as ‘cashing out’ (an option some bookies offer to get out of the bet early for a reduced return) or maybe making a small bet ‘the other way’ to offset a probable loss.

The upshot is that users want to know the state of their specific sets of fixtures, against their individual predictions, and that’s what Acca Buddy provides. Live score updates are collated on the server, and the app pulls those scores and displays them appropriately.

Current Architecture

Fixtures and scores are made available to the clients via two XML files, one for the day’s fixtures, and one containing match status and scores. Using two files allows the live scores file to be optimised slightly, reducing the amount of data the client needs to download while matches are in flight. The files are generated by Perl scripts which run periodically, with the fixtures being updated every few hours and the live scores every minute. Once created, the files are uploaded to Amazon S3 storage and made available via HTTP from there.
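
The server side really is that simple. The real scripts are Perl, but as an illustration the upload step amounts to something like this Python sketch using boto3 (the bucket and file names here are made up, not the real ones):

    # A minimal sketch of the periodic upload step, standing in for the real Perl scripts.
    # Bucket and file names are placeholders.
    import boto3

    s3 = boto3.client("s3")

    def upload_feeds():
        # The fixtures file is regenerated every few hours, the live scores file every minute.
        for filename in ("fixtures.xml", "livescores.xml"):
            s3.upload_file(
                filename,                  # local file produced by the generator
                "example-scores-bucket",   # placeholder bucket name
                filename,                  # object key that clients fetch over HTTP
                ExtraArgs={"ContentType": "application/xml"},
            )

    if __name__ == "__main__":
        upload_feeds()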

Clients can then download the files as required and present the data to the user in a way that makes sense. In the case of the Android app (which I wrote; there is a Windows Phone version written by someone else), the app grabs the fixtures for the current day early in the morning and then, since it knows the kick-off times of all the games, starts polling for the scores file five minutes before the next fixture starts, and stops while no matches are in play.
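
Roughly, that polling decision looks something like this (a sketch in Python rather than the app’s Java, with made-up field names rather than the real feed schema):

    from datetime import datetime, timedelta

    POLL_LEAD_TIME = timedelta(minutes=5)

    def should_poll_scores(fixtures, now=None):
        """Poll the live scores file only when a match is imminent or in play."""
        now = now or datetime.utcnow()
        for fixture in fixtures:
            kick_off = fixture["kick_off"]            # datetime parsed from the fixtures file
            in_play = fixture.get("in_play", False)   # state taken from the last scores download
            # Start polling five minutes before kick-off, keep going while any match is live.
            if in_play or (kick_off - POLL_LEAD_TIME <= now <= kick_off):
                return True
        return False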

Clearly this is not an optimal solution, but it was the best fit at the time for a number of reasons. For the current number of users, making XML files available via AWS is the cheapest and simplest option. I did consider a more elegant solution, such as a database-backed API, but this raised several practical questions. First was the question of time, mainly a lack of it on my part. I had to develop the back end and the Android app single-handed, with limited time available. Getting the XML file solution working was a much simpler and quicker option and let me crack on with the Android side of things, which is where I wanted to focus my time. Additionally, a ‘proper’ solution would have meant considering hosting options, sizing for an unknown number of users, and then managing that infrastructure, which was not something I really wanted to do at that point. So, despite the obvious downsides, the XML option ‘did the job’.

That said, I’m now in a position to spend a bit of time modifying the back end, at the same time as getting to grips with some new technology. This will hopefully allow me to improve things considerably.

Why Change?

The fact it’s not optimal aside, the XML solution has worked quite well thus far. In terms of scaling the number of clients, using AWS S3 cloud storage really takes all the pressure off. The number of clients connecting is tiny compared to other larger scale apps, and AWS is more than capable of serving the files for far more users than we currently have. Sure, more clients means more cost, and the cost scales predictably with usage.

The real driver for change is the desire to increase the number of competitions and fixtures covered by the app. Currently, as described above, each client will download the live scores file while any match is in progress. Increasing the number of fixtures from around the world results in both a much larger file to download and a longer period during the day when matches are being played. This means a huge increase in the amount of data transferred per client, and my initial tests showed that this is simply not viable for mobile devices with monthly data caps.

There is also the question of battery life to consider – all that data being transferred needs to be processed by the mobile device, and the increased CPU cost will increase battery drain. So each client is sucking more data, and using more cycles to process that data, most of which the user might not actually be interested in. This is “Not Good”(TM) and thus an alternative architecture is required if I want to increase the range of fixtures.

The Plan

In a nutshell, the plan is to move the back end to a database-backed REST API combined with MQTT. In this case, a NoSQL database like MongoDB is ideal: there is no need for a strict schema, and it will let me add extra data (such as goal scorers or cards) to new documents easily. The data in the DB will be made available via a simple REST API, with live score updates being fed through MQTT, using JSON in both cases. The Perl scripts which previously generated the XML will feed the database and MQTT as appropriate.
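
As a rough sketch of the feed side (the real feeders will be Perl; the database, topic layout and field names below are assumptions for illustration, using pymongo and paho-mqtt in Python):

    import json

    import paho.mqtt.publish as publish
    from pymongo import MongoClient

    db = MongoClient("mongodb://localhost:27017")["scores"]   # placeholder connection/database

    def record_score_update(fixture_id, home_goals, away_goals, minute):
        update = {
            "fixture_id": fixture_id,
            "home_goals": home_goals,
            "away_goals": away_goals,
            "minute": minute,
        }
        # Keep the latest state in Mongo so the REST API can serve it on request...
        db.fixtures.update_one({"fixture_id": fixture_id}, {"$set": update}, upsert=True)
        # ...and push the same update, as JSON, to anyone subscribed to this fixture's topic.
        publish.single(
            topic=f"scores/{fixture_id}",   # one topic per fixture (an assumed layout)
            payload=json.dumps(update),
            hostname="localhost",           # placeholder broker address
        )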

Clients will use the REST API to discover which fixtures are available and then, knowing which fixtures they are interested in, subscribe to the appropriate MQTT topics to receive live updates for just those fixtures. This ticks a lot of boxes. It significantly reduces the amount of data transferred to each client, and effectively implements ‘push’ notifications of score updates to those clients, which improves things for users. At the same time, since live score updates are handled over the more efficient MQTT connection rather than REST/HTTP, the server-side load should be more manageable than with a pure REST solution, which also wouldn’t provide the push updates. This helps mitigate the original concerns I had about scaling for an unknown number of clients, and I now have a better idea of user numbers, which makes planning easier.
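
On the client side the flow would look something like this sketch (Python rather than the Android app’s Java; the API URL, broker address, topic names and JSON fields are all assumptions at this stage):

    import json

    import paho.mqtt.client as mqtt
    import requests

    API_BASE = "https://api.example.com"   # placeholder REST endpoint
    BROKER = "broker.example.com"          # placeholder MQTT broker

    def on_message(client, userdata, message):
        update = json.loads(message.payload)
        print(f"{message.topic}: {update['home_goals']}-{update['away_goals']}")

    # 1. Discover the available fixtures over REST.
    fixtures = requests.get(f"{API_BASE}/fixtures", timeout=10).json()

    # 2. The user picks the fixtures in their accas (just the first three here).
    selected = [f["fixture_id"] for f in fixtures[:3]]

    # 3. Subscribe only to those fixtures' topics and wait for live updates.
    client = mqtt.Client()   # paho-mqtt; newer releases also take a callback API version argument
    client.on_message = on_message
    client.connect(BROKER, 1883)
    for fixture_id in selected:
        client.subscribe(f"scores/{fixture_id}")
    client.loop_forever()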

In summary, the idea is to implement a hybrid system where REST is used for discovery and MQTT for updates. This is what struck a chord when I saw a tweet by @andypiper, referencing (I think) @michaeljkoster talking about IoT, which suggested thinking about a similar approach for IoT systems. While my app isn’t IoT, the approach certainly seems similar.

What’s Next?

I’ve already started the groundwork for the new back end, and will write a couple of posts about that as I find time. It all actually appears to work so far: the database is up and running and being updated, live score updates are being posted to MQTT, and a REST API to access the database is in place. It does need hardening though, and I’m currently looking at securing both the API and the MQTT broker before updating the Android app to use the new system.
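
To give a flavour of the MQTT side of that hardening, the aim is for clients to connect over TLS with credentials rather than anonymously; in paho-mqtt terms, something like this sketch (broker address, port and credentials are placeholders, not the real setup):

    import paho.mqtt.client as mqtt

    client = mqtt.Client()
    client.tls_set()                                     # verify the broker using the default CA bundle
    client.username_pw_set("app-user", "app-password")   # placeholder credentials, set up on the broker
    client.connect("broker.example.com", 8883)           # 8883 is the conventional MQTT-over-TLS port
    client.loop_start()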