The more time we spend with Kafka, the more we like it. Kafka helps us think more clearly, collaborate more effectively, and it just works well.

Event streaming is the digital equivalent of the human body's central nervous system. It is the technological foundation for the 'always-on' world where businesses are increasingly software-defined and automated, and where the user of software is more software.
                     -- Apache introduction to Kafka

Yesterday’s data architecture can’t meet today’s need for speed, flexibility, and innovation.
                    -- McKinsey

Kafka is a streaming platform that can be used to integrate disparate systems. For example, if you have an ERP system to manage accounting workflows, a CRM system to manage customer conversations, and Ondema to manage production scheduling on the shop floor, you can use Kafka to tie these systems together with an event-driven architecture (links below if you’re new to Kafka and would like to learn more).

Think more clearly

The limiting factor in every technology project is usually people. A common challenge is that different project stakeholders simply don’t understand the scope of a business problem the same way. Kafka integrations can help because thinking in terms of messages and events is often more intuitive than thinking about systems, APIs, or database schemas. For example:

With an API-driven system, it may be necessary to create one or more APIs to support “Add a new customer to the customer database”. Creating that API requires teams to agree on what a new customer is, and it requires everyone to have some intuition about what the customer database is and what it means to add a record to it. The plain fact is that key stakeholders (e.g., your best salespeople) probably don’t know what the customer database is or how it works, and they don’t care. Their job is to win customers, not to figure out how IT systems model customers, and they are reasonably focused on their jobs.

Meanwhile, an event stream focuses on the event itself: “Here is a new customer.” Everyone still has to agree on which attributes to capture for that customer, but teams can now talk about the process of adding a new customer without thinking about how different systems model customers. Conversations become more focused, questions get resolved more quickly, and working software ships faster and at lower cost.
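As a sketch of the idea, a “Here is a new customer” event can be a small, self-describing JSON document. The field names below are hypothetical, not from any particular system; the point is that publishers and consumers agree on this payload, nothing more:

```python
import json

# Hypothetical "new customer" event. The field names are illustrative,
# not drawn from a real ERP or CRM schema.
new_customer_event = {
    "event_type": "customer.created",
    "customer_name": "Acme Fabrication",
    "contact_email": "purchasing@acme.example",
    "source_system": "crm",
}

# Serialize to bytes, as you would before publishing to a Kafka topic.
# A consumer deserializes the same bytes without knowing who produced them.
payload = json.dumps(new_customer_event).encode("utf-8")
decoded = json.loads(payload)
```

Notice that nothing here mentions a customer database: the event describes what happened, and each downstream system decides for itself how to store it.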

Collaborate more effectively

When a Kafka event stream becomes the API between different systems, it introduces an abstraction layer that can be an order of magnitude easier to reason about than the REST APIs or database schemas those systems would otherwise require.

Systems that publish events to a stream are often only concerned with the event schema and are unconcerned with downstream systems and processes. Similarly, systems that consume the events don’t need to worry about publishers or other subscribers. Iterating on the schema still requires collaboration between publishers and subscribers, but assessing the impact of a schema change is often far easier than getting teams to understand the complex dependency graphs created by shared databases or REST APIs.

Kafka just works

Like many web-scale technologies, Kafka is highly reliable and can scale to massive loads. A growing number of cloud providers let you deploy Kafka without having to manage it yourself. In many cases, you should be able to both lower costs and improve reliability by using event streams.

Kafka’s ecosystem is growing explosively. New integrations, new vendors, and a constant stream of new announcements suggest that we are still in the early days of event streaming and the evolution of event-driven architectures.

The right kind of buzz

The list of companies using Kafka at massive scale continues to grow. Seeing many of our smartest and most effective peers adopt the platform is a big plus for Kafka.

Perhaps more important: when we talk to practitioners, they’re simply happy to have a better tool, to be more productive, and to spend more time solving interesting problems and less time on problems that don’t add value.

Want to learn more about how we use Kafka? Check out our new eBook:

Download Ondema Guide to ERP Integrations

Learn more about Kafka:
