Event streaming is modernising how applications consume data

I’ve spoken to many organisations about modernisation and what their options are to achieve it. Typically, we discuss Kubernetes, automation, DevSecOps and observability. Now, something else comes up regularly: the need to look at data.

Many organisations are looking at how they can improve their data integration layer, so that data is available faster and applications can make better use of it.

Until now, organisations have used many data integration methods, including the Enterprise Service Bus (ESB), message queues (MQ) and more. However, the volume of data, connectors and interdependencies means these integration layers become complex, difficult to manage and, most importantly, slow to deliver data.

What you want to do is create a single view of the truth, where legacy and modern applications are enabled to consume and provide data in near real time, moving to an event-driven framework.

Ideally, you would move to a framework of smart endpoints with a pipe of data as your single view of the truth, with constant and consistent data flowing through it.

If you have looked into ESBs or data integration before, you will have seen a diagram with a tangle of point-to-point lines, followed by a tidy picture of a single ESB layer that ties everything together. In principle this works, but when an application needs to publish and subscribe to data at the same time, or when data is required in near real time, the integration layer becomes intertwined and almost impossible to manage. Another problem appears when hundreds of applications or services need the same message: in an MQ or ESB, a message typically disappears once it is consumed. Your first thought might be to duplicate the messages, but that causes a host of other issues. Event streaming solves this out of the box: events are written to a durable log, and every consumer reads from it independently, without any of those complexities.
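The queue-versus-log difference above can be sketched in a few lines of plain Python. This is a conceptual illustration, not a real Kafka client; the `consume` helper and the consumer names are hypothetical, chosen only to show that each consumer tracks its own position in a shared, durable log.

```python
from collections import deque

# A message queue: once a message is consumed, it is gone for everyone else.
queue = deque(["order-created"])
first = queue.popleft()        # consumer A receives the message
# queue is now empty, so consumer B never sees "order-created"

# An append-only event log: every consumer tracks its own offset,
# so hundreds of consumers can read the same events independently.
log = ["order-created", "order-paid", "order-shipped"]
offsets = {"billing": 0, "shipping": 0, "analytics": 0}

def consume(consumer):
    """Return the next event for this consumer and advance its offset."""
    pos = offsets[consumer]
    if pos >= len(log):
        return None            # caught up; nothing new yet
    offsets[consumer] = pos + 1
    return log[pos]

# Each consumer independently reads the full stream, with no duplication.
assert consume("billing") == "order-created"
assert consume("shipping") == "order-created"   # same event, same log
```

Because the log is never mutated by a read, adding a hundredth consumer costs nothing more than one extra offset.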

Every organisation needs to integrate data, and if you look at the wide spectrum of technologies and frameworks, it becomes a nightmare.

The options range from SOAP, REST and JSON to a vast number of programming languages and application architectures, from monoliths and SOA to microservices. A truly all-encompassing data integration layer with event-driven processing and streaming is no longer optional; it is a requirement. You need a single view of the truth without a commitment (lock-in) to a unique technology or data format.

Event-driven processing and streaming are key

What is Kafka, and how is it different?

Kafka is an open-source distributed event streaming platform used for stream processing, real-time data pipelines and data integration at scale. It handles high-velocity, high-volume data, scaling to trillions of messages per day, and writes events to durable, secure storage. That means you can not only act on events in near real time but also act retrospectively on past events. It is not a data lake replacement, but it works well for replaying events over short retention windows.
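The retrospective side of this is worth making concrete. Below is a plain-Python sketch of the retention idea, not the Kafka API itself: events are kept for a configured window instead of being deleted on read, so a consumer can rewind to an earlier offset and reprocess history. The `replay` helper and the price events are illustrative names only.

```python
# A retained event log: records stay in the log after being read,
# within a configured retention window.
log = [
    {"offset": 0, "event": "price-updated", "value": 100},
    {"offset": 1, "event": "price-updated", "value": 105},
    {"offset": 2, "event": "price-updated", "value": 98},
]

def replay(from_offset):
    """Re-read every retained event, starting from an earlier offset."""
    return [rec["value"] for rec in log if rec["offset"] >= from_offset]

# A service deployed today can still bootstrap itself from past events,
# for example rebuilding the latest price from the full history.
latest_price = replay(0)[-1]
assert latest_price == 98
```

This is what distinguishes a streaming log from a queue: a new subscriber is not limited to data produced after it came online.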

When you modernise applications, you need to keep in mind how the data will be used, and how you will integrate and connect application data, as this is crucial to how the applications are architected.

What are the benefits of event streaming?

  • Immediate response: event-driven architectures let services across every industry respond to the countless actionable events occurring every day, as they happen.
  • Large and elastic scale: handle high volume and throughput in multicloud and hybrid cloud models.
  • Event-driven integration: connect microservices asynchronously and move data to where it is needed.
  • Interaction at scale: connect legacy and modern applications over the same stream.

But how can it be used in your business?

  • Event-driven processing of big data sets, such as IoT or edge-computing data
  • Mission-critical, real-time applications
  • Integration between different legacy applications and modern applications
  • Microservices architecture

Event streaming is changing many businesses for the better, and LSD has worked with, and is currently helping, many businesses to adopt this new paradigm, with overwhelming benefits.

Reach out if you feel you might want to pursue this, and we can discuss it. I’m happy to share my thoughts and give insight into a course of action that could work for you.

About LSD
LSD was founded in 2001 and wants to inspire the world by embracing OPEN philosophy and technology. LSD is your cloud-native acceleration partner that provides managed platforms, leveraging a foundation of containerisation, Kubernetes and open-source technologies. We deliver modern platforms for modern applications.

  • The author, Deon Stroebel, is head of solutions at LSD
  • This promoted content was paid for by the party concerned
