17:00 - 18:00
They are everywhere: old applications. Worse: old applications that work just fine. Old applications that really deliver value, are stable, and perform great, but still, we hate working on them. Given the chance, we’d tear them down and replace them with something fresh, but deep down we know that this is not a good idea. On the other hand, expanding those vintage applications forever is also problematic. Change data capture offers an interesting way out of this dilemma: we’ll take a look at Debezium, a piece of infrastructure software that can connect to our database, give us a stream of all its mutations, and send that stream into Kafka. This opens up many possibilities: with this stream, we can materialize the data in many different forms in all kinds of data stores. We can radically increase our read scalability by materializing our denormalized data in MongoDB. We can feed it into a search engine like Elasticsearch to provide full-text search. We can even feed it into a machine learning library like TensorFlow. And we barely need to touch that crusty old application to make any of this happen.
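In practice, a setup like the one described usually boils down to registering a Debezium connector with Kafka Connect, without changing the legacy application at all. A minimal sketch for a PostgreSQL source, assuming a Kafka Connect cluster is already running (the connector name, hostnames, credentials, topic prefix, and table list below are placeholders, not details from the talk):

```json
{
  "name": "legacy-inventory-connector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "legacy-db",
    "database.port": "5432",
    "database.user": "debezium",
    "database.password": "secret",
    "database.dbname": "inventory",
    "topic.prefix": "legacy",
    "table.include.list": "public.orders,public.customers"
  }
}
```

POSTed to the Kafka Connect REST API, this would start streaming row-level changes from the listed tables into Kafka topics such as `legacy.public.orders`, where downstream consumers can materialize them into MongoDB, Elasticsearch, or anywhere else.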