Users use software; that’s why they’re called users. We all use software every day on our desktops and mobile devices in the form of applications and data services, many of which are delivered or connected over the web. Even if we switch off and refuse to touch a keyboard or keypad all day, we’re probably still using software to adjust our home heating controls, listen to the chimes from our video doorbells or watch television programs.
This extremely simplistic truth has led us to develop a foundational hidden assumption: that it is we, the users, who primarily drive the use of the software in our lives.
The nature of software has changed
But software has grown up. The entire model that describes the relationship between applications and the databases that serve them has changed.
Connections between applications and databases used to be essentially passive: the database waited for the user to click a button on a screen in an application, which then triggered the database to deliver upon a request. We now see that dynamic move from passive to active: applications proactively react to things happening elsewhere in the business, inside other applications, other data services and other machines.
Our second extreme truth, which follows from this new reality, is that applications are no longer passive islands that stand alone and interact with a database only for storage. This is the view of Jay Kreps, founder and CEO of Confluent and co-creator of Apache Kafka.
Confluent is a company that provides an event streaming platform that aims to make it easier for organizations to take advantage of this change in how we think about data and software. This is software for much more dynamic applications that can work in multiple directions at the same time… and at scale. It is the type of software that may see connections to other pieces of software more regularly than it sees instructions or commands from human users.
A working example of modern software automation
Kreps has explained the relevance of the new world order in software by using a real-world example of a work process that completely predates the existence of computers but that still exists today. A loan approval process at a bank used to be heavily manual. Even over the last 50 years, during which computers have existed, this type of work involved bank agents, financial analysts, mortgage officers and credit managers all coming together to make business decisions.
Today, this same level of work (or more accurately, the end results) can be heavily automated. The user submits his or her information online (thereby bypassing the need for a bank agent) as the credit software, risk assessment software and market analytics software all work in (mostly) harmonious union in the background. The results of these system calculations and judgements are then communicated back to the user through a Customer Relationship Management (CRM) layer, which ultimately results in an alert on an app somewhere.
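The automated flow described above can be sketched in a few lines of Python. This is a deliberately simplified illustration, not Confluent's or any bank's actual system; the service names, rules and thresholds are all hypothetical:

```python
# Hypothetical, simplified loan-approval pipeline: each stage is an
# automated service that reacts to the output of the previous one.
# No bank agent is in the loop; only the final decision reaches the user.

def credit_check(application):
    # Toy rule: credit passes if income could plausibly cover the loan.
    return application["income"] * 5 >= application["amount"]

def risk_assessment(application):
    # Toy rule: flag loans over a fixed threshold as too risky.
    return application["amount"] <= 500_000

def notify_user(application, approved):
    # Stand-in for the CRM layer that pushes an alert to the user's app.
    status = "approved" if approved else "declined"
    return f"Loan {application['id']}: {status}"

def process(application):
    # The automated stages run in sequence, with no human in the loop.
    approved = credit_check(application) and risk_assessment(application)
    return notify_user(application, approved)

print(process({"id": "L-1", "income": 80_000, "amount": 300_000}))
# → Loan L-1: approved
```

In a real deployment each stage would be a separate service reacting to events on a shared stream rather than a local function call, but the shape of the flow is the same.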
As Confluent’s Kreps has noted here, “This transition has many significant implications, but my focus [is] on what it means for the role and design of the software itself. The purpose of an application, in this emerging world, is much less likely to be serving a UI to aid a human in carrying out the activity of the business and far more likely to be triggering actions or reacting to other pieces of software to carry out the business directly.”
Kreps suggests that it is time to rethink today's data platforms. Traditionally, he explains, databases would passively wait for a command to deliver the information stored within. That worked well enough, because users would instruct a human-facing application through a user interface to make the query happen. But, he asks, is this model still the right fit for bringing together the full set of software that would comprise a real-time loan approval process built on continuous queries on top of ever-changing data? The answer inside Confluent is a resounding no, and the solution lies in event streams.
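The phrase "continuous queries on top of ever-changing data" can be made concrete with a small sketch. Instead of computing an answer once when asked, a continuous query maintains its answer incrementally as each new event arrives. This toy Python example is my own illustration of the idea, not Confluent's implementation:

```python
# Toy "continuous query": rather than querying a static table on demand,
# keep the answer up to date as each new event arrives.

def running_average():
    total, count = 0.0, 0

    def update(value):
        # Each incoming event updates the maintained result in place.
        nonlocal total, count
        total += value
        count += 1
        return total / count

    return update

avg = running_average()
for price in [100, 200, 300]:   # a stream of price-change events
    current = avg(price)

print(current)  # → 200.0
```

A stream processing engine does essentially this at scale: the "query result" is always current because it is recomputed incrementally on every event, rather than on request.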
How event streams allow software to talk to software
“I believe the answer starts with the concept of events and event streams. What is an event? Anything that happens e.g. a user login attempt, a purchase, a change in price, etc. What is an event stream? A continually updating series of events, representing what happened in the past and what is happening now. Event streams present a very different paradigm for thinking about data from traditional databases. A system built on events no longer passively stores a dataset and waits to receive commands from a UI-driven application. Instead, it is designed to support the flow of data throughout a business and the real-time reaction and processing that happens in response to each event that occurs in the business,” said Confluent’s Kreps.
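To make the contrast with a passive database concrete, an event stream can be modeled as an append-only log that any number of consumers read at their own pace. The following is a toy sketch of that idea in plain Python, not Kafka's actual API or implementation:

```python
class EventStream:
    """Toy append-only event log; each consumer tracks its own offset."""

    def __init__(self):
        self.events = []
        self.offsets = {}

    def append(self, event):
        # Events are only ever added, never updated or deleted.
        self.events.append(event)

    def poll(self, consumer):
        # Return the events this consumer has not yet seen, then advance
        # its offset. Reading does not remove events from the log.
        start = self.offsets.get(consumer, 0)
        new = self.events[start:]
        self.offsets[consumer] = len(self.events)
        return new

stream = EventStream()
stream.append({"type": "login", "user": "alice"})
stream.append({"type": "purchase", "user": "alice", "price": 9.99})

# Two independent readers each see the full history of events.
fraud_view = stream.poll("fraud-detector")
analytics_view = stream.poll("analytics")
```

The key property the sketch captures is that the stream is shared: a fraud detector and an analytics job can both consume every event without interfering with one another, which is what lets many applications react to the same happenings in the business.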
All of this provides the potted history of how and why Kreps and his colleagues developed the open source Apache Kafka event streaming project while working at LinkedIn. Because LinkedIn’s information streams were running 24×7 in a world where many of the connections being made were application to application, Kreps envisioned an event streaming platform that could work in a world where the global business day never ends. The result is the US$4.5 billion business that we now know as Confluent.
Kreps recently reiterated many of these ideas in his Kafka Summit 2020 virtual keynote.
A virtuous circle of applications & event streams
Looking ahead then, more of our software is going to be talking to more of our software, oftentimes before it gets around to talking to us, the users. That’s fine, but only if we make sure that the internal software-to-software systems are working without bias and inside the correct data pipelines.
Confluent’s Kreps wants event streams to be thought of as the central nervous system around which modern applications are built — and, because event streams are multi-reader (they can have any number of application or data service ‘subscribers’ that connect to the stream), a virtuous circle of adoption can then occur.
As event streams share information with applications, smart applications come onto the event streaming platform and get smarter, faster and more streamlined (pun intended)… and this in turn creates new data streams, and so on. As Kreps concludes, Confluent’s mission is to build and be that event streaming platform and to help companies begin to re-architect themselves around it.