Extending REST APIs with API Aggregator

by Josep M. Pujol

First things first

This article showcases what is perhaps the most important problem of REST APIs: they are too chatty. If you are not aware of how critical this is, you are welcome to try it out on the online demo below. As you would expect, the article also introduces a potential solution :-)

It’s not the only solution to this problem; in fact, Netflix has a full-fledged system to optimize their API against REST chattiness. Do you remember what stored procedures are? Then get ready for “API stored procedures” using API Aggregator.

REST API Chattiness

The resource-driven approach of REST is conceptually (and practically) very appealing for developers because it maps very well onto the underlying data model of their application or system. If you want to fetch the information on user x, do a GET /users/x.json. If you want to modify the user, do a PUT /users/x.json with the new data in the request body.

API consumers, who are also developers, are equally familiar with this resource-driven approach. Both sides of the table agree that REST is a good design pattern, and consequently, it is widely accepted.

The simplicity and ease of understanding of REST do not come for free. The fine granularity of REST APIs is a double-edged sword: it makes understanding and integration easier, but there are performance issues that in some cases cannot be ignored. REST APIs tend to generate a lot of requests to accomplish any non-trivial use-case. REST is chatty, too chatty to be left unchecked.

Why is chattiness so bad for mobile APIs? For multiple reasons:

  • latency: mobile network latency (whether HSPA, 3G, or 4G) is terrible, and such large latencies can really hurt the user experience of a mobile app consuming an API. The same problem appears on home DSL connections, although it is not as bad as on mobile networks. See the last-mile problem.
  • power consumption: using the radio is expensive, and unplanned use of connectivity in a mobile application can make it a battery hog (see the Qualcomm paper). Mobile apps need to be aware of when requests are made if you want to keep power consumption at bay.

Most mobile apps (whether HTML5, iPhone, Android, or FirefoxOS) end up consuming APIs. As a matter of fact, the app backend can very well be an API. If you are not careful about how your app interacts with the API, you can end up offering a very bad user experience to the end-users of the application. The end-user will be angry at the app, and the app developer will be angry at the API. Nobody wins.

How bad is REST Chattiness?

To illustrate the effects of REST chattiness we have developed a little demo.

We created a little mobile app in HTML5. The app has only one very simple use-case: it must return the most emotionally charged word of a sentence, provided the sentence as a whole is positive.

To develop the app we checked around to see if there was an API that we could leverage. Luckily, there are many APIs that do mood/sentiment analysis. We decided to use the SentimentAPI (a simple API that 3scale uses for examples). After creating an account and getting the keys, we were ready to roll.

The SentimentAPI is fully REST. It has GET operations that return the emotional value associated with two resources: sentence and word. You can play with the Active Docs of the API here.

Thanks to the API, the app could not be simpler: do a GET /sentence, and if the result is positive, do a GET /word for each word in the sentence to find the maximum. After some coding, the mobile app is ready.

The mobile app looks like this (screenshots below). There is also an online demo that you can test in your browser or on your cellphone (the source code is also available).

[Screenshots: main page of the mobile app; methods to make requests to the API; example results of a test, a good one ;)]


The application is super simple. You enter the sentence you want, and it gives you the most emotionally charged word if the sentence is positive.

Note that the app also has a combo-box (right figure above) that lets you choose which method is used to make the API calls, i.e. the type of request-response cycle. There are two options:

  • AJAX Async API Requests: This is the typical approach. It uses asynchronous requests. The first request fetches the sentence, and then, if it is positive, N asynchronous requests are launched to fetch the sentiment of each word. In total, there are 1+N requests. Note that the use-case we use as an example is very well suited for concurrency. More realistic and complex use-cases will have more dependencies between the REST calls, which will make latency even more critical. This simple example, however, is more than enough to showcase the problems of REST chattiness.

  • API Aggregation by Lua and Nginx: The second method is to aggregate all the requests into a single new API method that is no longer resource-based but use-case based. We are somewhat cheating here! We have extended the API to fit our particular use-case. But we show later on how trivial it is to provide those use-case based methods on top of a REST API (a sketch follows right after this list). For now, just play along please :-)
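
To make the aggregated method more concrete, here is a minimal sketch of what such an endpoint could look like as an Nginx content_by_lua handler using OpenResty subrequests. It is not the demo's actual code: the internal locations /sentence and /word, the q parameter, and the {"sentiment": ...} response shape are assumptions made purely for illustration.

```lua
-- Sketch of a use-case based endpoint, e.g. GET /most_positive_word?sentence=...
-- Assumes two internal Nginx locations, /sentence and /word, that proxy to the
-- SentimentAPI, and a JSON response shaped like {"sentiment": <number>}.
-- These location names and the response format are illustrative assumptions.
local cjson = require "cjson"

local sentence = ngx.req.get_uri_args()["sentence"] or ""

-- One subrequest for the sentiment of the whole sentence.
local res = ngx.location.capture("/sentence", { args = { q = sentence } })
if res.status ~= 200 then
  return ngx.exit(res.status)
end

if cjson.decode(res.body).sentiment <= 0 then
  ngx.say(cjson.encode({ positive = false }))
  return
end

-- N subrequests, issued concurrently, one per word.
local reqs, words = {}, {}
for w in sentence:gmatch("%a+") do
  words[#words + 1] = w
  reqs[#reqs + 1] = { "/word", { args = { q = w } } }
end
local resps = { ngx.location.capture_multi(reqs) }

-- Pick the word with the highest sentiment.
local best_word, best_score = nil, -math.huge
for i, r in ipairs(resps) do
  if r.status == 200 then
    local score = cjson.decode(r.body).sentiment
    if score > best_score then
      best_word, best_score = words[i], score
    end
  end
end

ngx.say(cjson.encode({ positive = true, word = best_word, sentiment = best_score }))
```

With something like this in place, the mobile app issues a single request over the 3G network. The 1+N calls still happen, but between Nginx and the REST API, where latency is negligible.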

Empirical results

We instrumented a little experiment to get some performance numbers.

The conditions of the experiment were: a batch of 1000 requests (after a warm-up batch of 200 requests). We kept concurrency at 1 since it is not important for the experiment. By request we mean the page load from clicking submit to getting the results (this translates into multiple requests to the REST API). The mobile access network was Vodafone 3G (Spain). The mobile app ran on Android 4.1.2 (Samsung Galaxy S2). SentimentAPI runs on us-east-1 on Amazon EC2.

The results of the experiment are the following:

Empirical Results

Disclaimer: Please remember to take the numerical results with a grain of salt. This experiment, like 99.9% of system experiments on the Internet, is not comprehensive enough to be scientific. Further testing on different network conditions, devices, etc. would be required. However, the experiment, as simple as it is, already hints at where the problem is.

The mobile app performs very badly when using the typical AJAX async calls (red bars) because of the latency of the Vodafone 3G network. The end-user sees an average page load time of 815 ms, a really bad user experience. The fine granularity of SentimentAPI's REST API made development very easy, but now we pay the price of not having a more use-case oriented API call that does just what we need.

Now let's imagine that we could aggregate all the REST API calls needed for the use-case into a single API request. The average page load time would go down to 278 ms (blue bars). By doing API aggregation with Nginx+Lua we reduced the page load time by a factor of roughly 3.

That’s to be expected

If you are not a developer, you might be scratching your head wondering why mobile apps do not suck as badly as the empirical results seem to suggest. On the other hand, if you are a developer, this should not surprise you that much.

Developers are aware of this problem because they have experienced it many times. The typical solution is to eat it up and create not only the mobile app but also a backend service for the mobile app. The mobile app does not call the 3rd party API directly; it calls the backend service, which proxies (and aggregates) the requests between the mobile app and the 3rd party APIs.

When you have a backend service, latency becomes less important, since ping times between servers in data-centers with good connectivity (e.g. AWS, Rackspace, Heroku, etc.) are at least one order of magnitude lower than latency on a mobile network. Furthermore, if you are lucky and your backend service is hosted in the same data-center as the REST API you are consuming, the latency will drop by another order of magnitude. Co-location is always a big plus, and as you will see, it might not be so difficult to achieve.

To sum up, why do mobile apps not suck so badly in terms of page loads and latency? Because developers are doing a lot of extra work: the mobile app plus a full backend service to support it. That's “twice” the work to implement, and “twice” the work to maintain too. And to make things worse, the two require quite different skill sets. This last problem is fading thanks to cloud hosting solutions like Heroku that aim to minimize the ops and sysadmin side of running a backend service.

You can have your cake and eat it too

What a mobile app developer would like to do is ask the API developers: could you please add the following X methods to your API? That would really make my life so much easier.

This is unlikely to end very well. The API owners will say, rightfully, that your use case is too specific; or that it cannot be maintained; or that you would be cluttering their nice, carefully designed REST API. In short, they are not likely to do it, and neither should they.

So, if you want it done you will have to do it yourself.

Is there a solution in the middle? Yes,

In the DB world there is something called stored procedures. Roughly speaking, they are aggregations of multiple SQL statements into a single function that is executed in the database server as a block. The advantage is that they encapsulate complex use-cases and workflows that are specific to an application. The application takes advantage of being close to the DB engine, and the DB admins keep the data model clean and simple.

Can something similar exist for APIs? Also yes,

In fact, a powerhouse like Netflix is using this approach to optimize their API. Netflix offers a framework that their different dev teams (one per platform) can use to write their own JVM-based code that consumes the Netflix REST API. This code runs co-located with their REST API. As a result, your PS3 at home is not doing hundreds of requests to the Netflix API; it is doing a single request to /ps3/homepage. This end-point runs the code pushed by the PS3 dev team that does all the magic against the REST API.

The REST API team is happy since they can keep their API under control. The dev teams that work on different platforms are happy since they do not have to maintain a myriad of backend services to make their apps run smoothly. And finally, the end-user is also happy since Netflix on their PS3, TV, mobile phone, etc. is responsive.

Netflix engineering does quite an amazing job; I advise you to follow their blog if you don't already.

Do you have to be an Internet giant to do something like this? No.

As a matter of fact, you can use open source tools that do 99% of the work. The remaining 1% that glues everything together can be found in the API Aggregator project. If you want to know more about how to enhance your API with Nginx and Lua, you can check Rai's previous blog post on the matter.

Introducing API Aggregator

API Aggregator is a system that combines Lua and Nginx to offer a sandboxed environment that safely runs user-generated scripts. These scripts can perform complex API aggregation, so that a REST API can easily be extended with dynamically generated API methods that are use-case based rather than resource-based. Its main features are:

  • Dynamic creation of end-points when new user-generated scripts are added
  • Scripts must be written in Lua (an extremely fast scripting language)
  • The system works stand-alone; it can be used by any API regardless of the programming language or framework used
  • Flexible definition of the restrictions of the sandbox environment
  • Ability to define timeouts for rogue user-generated scripts that hang or run too long (see the sketch below for the general sandboxing idea)
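
To give a flavor of the sandboxing idea, below is a generic Lua 5.1/LuaJIT sketch of a whitelisted environment for an untrusted script. It is not the API Aggregator implementation: make_env, run_user_script and http_get are hypothetical names invented for this example, and the timeout handling mentioned above is not shown.

```lua
-- Generic sketch of a Lua sandbox with a whitelisted environment.
-- Illustrative only; the real API Aggregator may differ.

-- Whitelist: only the functions the user script is allowed to touch.
local function make_env(http_get)
  return {
    pairs = pairs, ipairs = ipairs, tostring = tostring, tonumber = tonumber,
    string = string, table = table, math = math,
    http_get = http_get,   -- the only way to reach the upstream REST API
    -- deliberately no os, io, require, load, etc.
  }
end

-- Run an untrusted script (given as source code) inside the sandbox.
local function run_user_script(source, http_get)
  local chunk, err = loadstring(source, "user_script")
  if not chunk then return nil, "syntax error: " .. err end
  setfenv(chunk, make_env(http_get))          -- restrict the script's globals
  local ok, result = pcall(chunk)             -- contain runtime errors
  if not ok then return nil, "runtime error: " .. tostring(result) end
  return result
end

-- Example: a hypothetical user-generated script that aggregates two calls.
local script = [[
  local sentence = http_get("/sentence?q=hello+world")
  local word     = http_get("/word?q=hello")
  return { sentence = sentence, word = word }
]]

-- Inside Nginx, http_get would be backed by ngx.location.capture;
-- here we stub it so the sketch is self-contained.
local result, err = run_user_script(script, function(path)
  return "stubbed response for " .. path
end)
print(err or result.sentence)
```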

The source code is available on GitHub. Feel free to use it and to contribute!
