Creating an event-based architecture on top of an existing API ecosystem

Olga Annenko | Integration best practices


Is it possible to create an event-based architecture on top of the existing API ecosystem?

We are all creatures of convenience. Just think about which package delivery company you would give a favorable review: the one that, although it provides package tracking, merely tells you on what day and within what time frame to expect your delivery (and even that often turns out wrong due to, say, traffic delays), or the one that actively notifies you that your package is now just ‘this many’ stops away from you?

The demand for real-time applications and services is undoubtedly increasing, and if your business app or service is customer-oriented, be aware that customers will probably expect a more immediate, real-time experience. With that, the topic of event-driven architecture is gaining strategic importance, and hence interest, from companies of all sizes.

Today, it is not my intention to describe step by step how to build an event-based architecture, because frankly there are too many use cases to consider and too many supporting technologies to mention to pack it all into one article. There are also too many variables to account for: your existing IT infrastructure, your preferences when it comes to technology stacks, and your available resources are just some of the factors that can take this conversation in very different directions.

Today, my intention is to review how you can facilitate the creation of an event-based architecture on top of an already existing API landscape. But let’s start with the ‘why’ first – why the terms ‘event-based’ and ‘event-driven’ seem to be on (almost) everyone’s lips nowadays.

The beauty of real-time applications and the importance of event-based architecture

Personally, I don’t need to be bombarded with notifications all the time. In fact, I have switched this function off for most of my phone apps. However, there are some situations where I would appreciate a real-time, or at least near-real-time, update. Package delivery is one of them – I prefer to intercept the delivery person before they ring the doorbell, which throws one of my dogs into a fit of hysterical barking. And I would love to have an app that pings me when my package is one stop away (as opposed to me checking the status every ten minutes or so).

Or imagine a check-in app that sends you a real-time push notification when your gate has changed – highly useful if the change happens just half an hour before your flight departs and you’re already running late because of a traffic jam.

It’s not that this kind of information isn’t available per se – oftentimes it is – but rather that it should reach the customer in real time and be served proactively. This is what constitutes the great customer experience everyone is talking about nowadays. It might not seem like a game-changer, but this small differentiator can become the reason you choose one carrier or airline over the others.

In many cases nowadays, data exchange between applications is enabled through APIs, oftentimes REST. One application has an update – e.g. a package has been shipped or a gate for a specific flight has changed – and another application receives this update. The issue is that RESTful applications and services are essentially polling-based: they ask for data at certain time intervals, and sometimes only when explicitly requested to do so (think of how often we refresh this or that mobile application to get the most recent data).
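To make the contrast with push-based delivery concrete, here is a minimal sketch of the pull model, assuming a purely hypothetical shipment-tracking endpoint; the client only learns about an update whenever it happens to ask:

```python
import time
import requests

# Hypothetical tracking endpoint used for illustration only.
TRACKING_URL = "https://api.example-carrier.com/shipments/12345"

def poll_shipment_status(interval_seconds: int = 600) -> None:
    """Repeatedly ask the REST API for the latest status (pull model)."""
    last_status = None
    while True:
        response = requests.get(TRACKING_URL, timeout=10)
        response.raise_for_status()
        status = response.json().get("status")
        if status != last_status:
            print(f"Status changed: {status}")
            last_status = status
        # Between two polls, any update sits unnoticed on the server.
        time.sleep(interval_seconds)
```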

To deliver the kind of great customer experience we mentioned earlier and serve information proactively and in real time, we need a different mechanism, one where business software applications, services and systems are triggered immediately in response to a particular event. This is essentially what event-based architecture is about.

Components you need to build an event-based architecture

There is no single way to build an event-driven architecture, and there are multiple variations, protocols, and methods that can be implemented for this purpose. Essentially, though, it consists of three decoupled components – an event producer, an event consumer, and an event broker. Decoupling is what makes it possible to process events asynchronously: it ensures that the event producer and the event consumer continue operating independently from each other (as opposed to synchronous communication, which “blocks” operations until a request from one system receives a response from another).

Event-driven architecture schema
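As a rough illustration of that difference, here is a minimal sketch – using a plain in-memory queue as a stand-in for the broker and made-up flight data – of a blocking, synchronous call versus an asynchronous hand-off:

```python
import queue

# Synchronous call: the producer invokes the consumer directly and is
# blocked until the consumer returns a response.
def handle_gate_change(event: dict) -> str:
    return f"notified passengers about gate {event['gate']}"

result = handle_gate_change({"flight": "LH123", "gate": "B12"})  # producer waits here

# Asynchronous hand-off: the producer only drops the event onto a queue
# (standing in for the broker) and immediately carries on; a consumer
# can pick it up later, at its own pace.
event_queue: "queue.Queue[dict]" = queue.Queue()
event_queue.put({"flight": "LH123", "gate": "B12"})  # fire-and-forget
```

In the second case, the producer neither knows nor cares when, or by which consumer, the event is eventually processed.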

Event producer

An event producer is essentially the sender of an event. Taking our examples from above, a shipment tracking system or a gate management system would act as an event producer.

Event consumer

An event consumer is, logically, the recipient of an event. It can be an app on your phone or another business application in a company’s IT ecosystem that checks or listens for the occurrence of a specific event in order to respond to it accordingly – for example, with a push notification on your phone or by triggering yet another process. Usually, this is enabled by event consumers subscribing to these specific events.

Event broker

Now, the event broker is what makes up the gist of an event-based architecture and enables the decoupling of event producer and event consumer. The event broker accepts events from the former, stores them for as long as needed, and then delivers them to the latter. Depending on the number of event consumers, event brokers are also responsible for routing events to the right consumer. The messaging pattern most often used in this context is publish/subscribe (pub/sub).

This asynchronous pattern helps ensure that no messages get lost. Even if one or more event consumers (i.e. subscribers) are unavailable for some reason at a given moment, events will be queued in the exact order in which they occurred. Once the event consumer reconnects, it simply consumes those queued messages and resumes its previous activity.
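To make the pub/sub pattern and the queuing behavior tangible, here is a minimal in-memory sketch. It is not a production broker – real setups would rely on something like RabbitMQ, Kafka, or an MQTT broker – and the class names, topic, and payload are made up for illustration:

```python
from collections import defaultdict, deque
from typing import Callable, Deque, Dict, List

class InMemoryBroker:
    """Toy stand-in for an event broker: routes published events to subscribers."""
    def __init__(self) -> None:
        self._subscribers: Dict[str, List["Subscriber"]] = defaultdict(list)

    def subscribe(self, topic: str, subscriber: "Subscriber") -> None:
        self._subscribers[topic].append(subscriber)

    def publish(self, topic: str, event: dict) -> None:
        # Route the event to every subscriber of the topic, preserving order.
        for subscriber in self._subscribers[topic]:
            subscriber.enqueue(event)

class Subscriber:
    """Event consumer with its own queue, so nothing is lost while it is offline."""
    def __init__(self, name: str, handler: Callable[[dict], None]) -> None:
        self.name = name
        self.online = True
        self._handler = handler
        self._queue: Deque[dict] = deque()

    def enqueue(self, event: dict) -> None:
        self._queue.append(event)
        if self.online:
            self.drain()

    def drain(self) -> None:
        # Deliver queued events in the exact order in which they arrived.
        while self._queue:
            self._handler(self._queue.popleft())

# Usage: the tracking system publishes; the phone app consumes,
# even after having been temporarily offline.
broker = InMemoryBroker()
app = Subscriber("phone-app", lambda e: print("push notification:", e))
broker.subscribe("shipment.updates", app)

app.online = False                                        # consumer temporarily unavailable
broker.publish("shipment.updates", {"status": "one stop away"})
app.online = True
app.drain()                                               # queued event delivered on reconnect
```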

My applications are REST. Can I still build an event-based architecture?

Most of the time when you search for information on how to create an event-driven architecture, you will find the terms ‘streaming APIs’ and ‘event-driven APIs’. These are, as you might already know or at least have guessed, a new breed of APIs that enable event-driven communication.

The reality, however, is that most software application vendors do not provide these types of APIs, either because there are other business-critical priorities or because they simply don’t have the required resources. This is where augmenting existing APIs to make them fit for an event-based architecture becomes interesting. When talking about augmenting REST APIs we are already working with, there are several strategies we can pursue depending on the role these APIs play in the event-driven schema: are they event producers or event consumers?

When REST APIs are event producers, the workaround is usually quite straightforward: we just need an event broker that supports various APIs (including REST) and protocols – in other words, one that helps translate REST to message protocols such as AMQP or MQTT.
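For illustration only, here is what the producing side could look like if the broker exposes a REST ingress endpoint and republishes the incoming events onto AMQP/MQTT topics. The URL, topic name, and payload shape below are assumptions, not any particular product’s API:

```python
import requests

# Hypothetical REST ingress of an event broker; events posted here would be
# re-published to subscribers over AMQP or MQTT.
BROKER_INGRESS = "https://broker.example.com/topics/shipment.updates/events"

def publish_shipment_update(shipment_id: str, status: str) -> None:
    """Let a REST-only producer hand its update to the broker via plain HTTP."""
    event = {"shipment_id": shipment_id, "status": status}
    response = requests.post(BROKER_INGRESS, json=event, timeout=10)
    response.raise_for_status()

publish_shipment_update("12345", "out for delivery")
```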

When REST APIs are event consumers, the workaround is likely to be more complex, because we need mechanisms to set up topic subscriptions against existing REST APIs or the event broker and teach them to behave in a more event-based fashion. This can usually be achieved with an event-driven layer consisting of webhooks, pub/sub services, and server-sent events.
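One possible shape for such a layer is a small webhook receiver that accepts pushed events and translates them into plain REST calls against the consuming application. The sketch below assumes Flask and entirely hypothetical endpoints:

```python
from flask import Flask, request
import requests

app = Flask(__name__)

# Hypothetical update endpoint of the REST-only consuming application.
CONSUMER_API = "https://crm.example.com/api/flights/gate-change"

@app.route("/webhooks/gate-change", methods=["POST"])
def on_gate_change():
    """Receive a pushed event and forward it as an ordinary REST call."""
    event = request.get_json(force=True)
    response = requests.post(CONSUMER_API, json=event, timeout=10)
    response.raise_for_status()
    return {"forwarded": True}, 200

if __name__ == "__main__":
    app.run(port=8080)
```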

To sum it up – yes. If the business software applications and systems implemented in your company provide only REST APIs, you can still build an event-based architecture around them. While we wouldn’t encourage overcomplicating things, sometimes (or rather, more often than not) we have to work with what we have, and augmenting existing APIs is a solid workaround if a native solution to the problem is not on the table right now. In this context, I’d like to leave here a link to a fascinating article called “Marrying RESTful HTTP with Asynchronous and Event-Driven Services”.

A side note: even though streaming and event-driven APIs are getting so much attention, REST APIs are not going away anytime soon. The use cases that require the added complexity of real-time asynchronous communication through streaming / event-driven APIs and event brokers are not nearly as numerous as the use cases for REST APIs. From this perspective, we are going to see more hybrid architectures that incorporate both REST and streaming / event-driven APIs.

Event-driven approach in application integration

Event-based architecture is useful, by the way, not only for providing a great customer experience through customer-facing applications and services. It can also be extremely useful for building integrations between multiple business software applications – if not for the sake of real-time data synchronization, then at least for the sake of considerable resource savings.

Krombacher, one of elastic.io’s clients, wanted to build a robust 360-degree customer view and streamline its marketing efforts across all channels, aiming for personalized messaging and ultimately outstanding customer care and satisfaction. In the first phase of this project, they focused on enabling data exchange between online points of contact, their Salesforce CRM, and the customer data & experience platform from Exponea.

To ensure a seamless data flow and exchange between the various systems of record, Krombacher used elastic.io’s Webhook and AMQP connectors to trigger integration flows, as long as the applications and systems in question support pushing data via webhooks or onto an event bus. And with the help of the PubSub connector, they kept each integration flow as small as possible, thus implementing a modular flow architecture. Since the requirement was to synchronize data not merely between two but rather between three to five systems, the PubSub connector allowed them to break these flows into small chunks and dynamically propagate updates between them.

If you are curious to learn more about it, you can download the entire case study here. It is built around the specific use case of a 360° customer view, but you might find some useful technical tricks there for building an event-driven architecture for your own projects.

Alternatively, feel free to take the elastic.io integration platform for a test drive by setting up a demo.


About the Author

Olga Annenko


Olga Annenko is a tech enthusiast and marketing professional. She loves to write about data and application integration, API economy, cloud technology, and how all that can be combined to drive companies' digital transformation.

