It Is Much Harder To Be Event-Driven In A Batch-Driven Reality

We spend a significant amount of time working with large enterprise organizations that live in a batch-driven reality: they are used to moving large files around internal and external networks, relying on older extract, transform, and load (ETL) technologies to get data where it needs to be. Depending on FTP and HTTP to move large files between different groups, and with partners, is proving difficult to evolve, adapt, and change in the smaller, more transactional world that has emerged over the last decade on the web and via mobile devices.

Enterprise groups we work with want to realize the benefits our event-driven technology delivers, but are finding it very difficult to implement while still being so dependent on batch technology. To realize the benefits of event-driven approaches, these organizations will have to decompose their batch-driven operations into something more transactional, using web APIs. Instead of compressing and zipping up large volumes of data, they will have to embrace API design and begin the hard work of decoupling their data, breaking it down into smaller, more meaningful chunks, then using the web to move those bite-size chunks around bit by bit, with HTTP brokering each individual transaction.
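As a minimal sketch of what this decomposition can look like, the snippet below breaks a bulk CSV export into individual JSON payloads, one HTTP POST per record, rather than one large file transfer. The endpoint URL and field names are hypothetical, purely for illustration:

```python
import csv
import io
import json

# Hypothetical endpoint -- substitute your own API design.
RECORDS_URL = "https://api.example.com/orders"

def decompose(bulk_csv: str):
    """Break a bulk CSV export into individual HTTP transactions,
    yielding one (method, url, json_body) tuple per record."""
    reader = csv.DictReader(io.StringIO(bulk_csv))
    for row in reader:
        yield ("POST", RECORDS_URL, json.dumps(row))

# A two-record stand-in for what would normally be a large file.
bulk = "id,total\n1001,9.99\n1002,24.50\n"
requests_out = list(decompose(bulk))
for method, url, body in requests_out:
    print(method, url, body)
```

Each tuple would then be handed to an HTTP client, so every record becomes its own accountable transaction instead of an opaque line inside a zip file.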

While it may seem ridiculous to move millions of individual records around as single transactions rather than as one bulk transfer, it actually becomes more flexible, agile, and faster to do it in smaller chunks, leveraging HTTP to ensure every record is accounted for, and scaling compute and bandwidth resources to deliver data at scale. One side effect of breaking things down is that you can record and respond to record-level events as data is being transacted, rather than waiting until a bulk transfer has finished and everything has been accounted for. This allows large enterprise organizations to go from hundreds of thousands of slow-moving transfers to millions or billions of individual transactions migrating where they are needed, when they are needed, while responding to events as they occur.
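The record-level events described above can be sketched as a simple hook that fires after each transaction, so consumers react as records land instead of after a bulk job completes. The function and event names here are illustrative assumptions, not a real API:

```python
from typing import Callable, Dict, List

def transfer_records(records, send: Callable, on_event: Callable) -> int:
    """Send each record as its own transaction and emit a
    record-level event after every successful send."""
    delivered = 0
    for record in records:
        send(record)  # e.g. one HTTP POST per record
        delivered += 1
        # Consumers can react to this event immediately, mid-transfer.
        on_event({"type": "record.delivered", "record": record})
    return delivered

# Usage: capture events as they occur instead of after a bulk job.
sent: List[Dict] = []
events: List[Dict] = []
count = transfer_records(
    [{"id": 1}, {"id": 2}, {"id": 3}],
    send=sent.append,
    on_event=events.append,
)
print(count, len(events))  # 3 3
```

In practice `on_event` might publish to a message broker or call a webhook; the point is that the event fires per record, not per file.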

Many groups we are working with are investing in decomposing their monolithic systems in favor of a more microservices-based approach. One challenge in this new world is that they will also have to decompose their data, thinking about it in the smallest possible components they can. This isn't easy for data stewards who have relied on batch transfers to do business for decades, but once understood it brings a much more reliable, real-time, flexible approach to making sure data is wherever it is needed. We wish everyone was ready for event-driven approaches to doing business, but we understand that not every enterprise group is, which is why we conduct our API lifecycle workshops, helping teams understand the discovery, design, deployment, operations, and governance of API infrastructure so they can be more successful in moving from a batch-driven world to a more transactional, event-driven reality.

*Original source: streamdata.io blog*
