Legacy applications developed years ago may appear close to unsalvageable. In reality, these applications are still running in production and carrying out important day-to-day missions for their companies. After all, companies have spent considerable time and money developing them, and despite their imperfections, these applications keep their companies in operation. So does it make sense to rewrite an entire legacy application? Keep in mind that re-implementing certain business functionality can be a daunting task for busy developers. What if, instead, we redesign the system, identify pieces of complex business functionality in the legacy system that can be "recycled", and retrofit them into a new system that leverages the power of a reactive data flow pipeline?
This presentation will be a lively discussion with hands-on coding illustrating how to construct a reactive, event-driven data flow pipeline composed of different library implementations of the Reactive Streams specification, such as Akka Streams, Eclipse Vert.x and RxJava. The sample use case mimics a real-life scenario: data is collected from multiple distributed sources, fed to a legacy processor "wrapped" by a reactive microservice for transformation, and then flows to a "sink" to be prepared for further processing. We will highlight the strength of Reactive Streams in controlling backpressure during processing.
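To make the source-to-sink flow concrete, here is a minimal sketch using the JDK's built-in `java.util.concurrent.Flow` API, which mirrors the Reactive Streams interfaces that Akka Streams, Vert.x and RxJava implement. The `legacyTransform` method is a hypothetical stand-in for the wrapped legacy processor, and the `Subscription.request(1)` calls show the demand-driven backpressure the talk highlights; the actual demo code may differ.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class PipelineSketch {
    // Hypothetical stand-in for the legacy processor wrapped by a microservice
    static String legacyTransform(String raw) {
        return raw.toUpperCase();
    }

    public static void main(String[] args) throws InterruptedException {
        List<String> sink = new ArrayList<>();
        CountDownLatch done = new CountDownLatch(1);

        // The publisher plays the role of the distributed data sources
        try (SubmissionPublisher<String> source = new SubmissionPublisher<>()) {
            source.subscribe(new Flow.Subscriber<String>() {
                private Flow.Subscription subscription;

                @Override public void onSubscribe(Flow.Subscription s) {
                    subscription = s;
                    s.request(1);            // backpressure: demand one item at a time
                }
                @Override public void onNext(String item) {
                    sink.add(legacyTransform(item));
                    subscription.request(1); // signal readiness for the next item
                }
                @Override public void onError(Throwable t) { done.countDown(); }
                @Override public void onComplete() { done.countDown(); }
            });

            for (String reading : List.of("alpha", "beta", "gamma")) {
                source.submit(reading);      // submit() honors the subscriber's demand
            }
        } // closing the publisher delivers onComplete downstream

        done.await();
        System.out.println(sink);
    }
}
```

Because the subscriber only ever requests one item at a time, the publisher's buffer absorbs any burst from the sources and `submit()` blocks once it fills, which is the essence of backpressure propagating upstream through the pipeline.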