Deep Learning with the Apache Kafka® Ecosystem



Intelligent real-time applications are a game changer in any industry. Deep learning is one of the hottest buzzwords in this area. New technologies like GPUs, combined with elastic cloud infrastructure, enable the sophisticated use of artificial neural networks to add business value in real-world scenarios. Tech giants use them for tasks such as image recognition and speech translation. This session discusses how any company can leverage deep learning in real-time applications.

The session demonstrates how to deploy deep learning models built with TensorFlow, Deeplearning4j, or H2O into real-time applications to make predictions on new events. The Apache Kafka open-source ecosystem can be used to train, apply, and monitor deep learning models in a highly scalable and performant way. The examples focus on Apache Kafka and its Streams API.
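The core pattern here is embedding a trained model directly inside the stream processor, so every incoming event is scored in real time. The talk's actual examples use the Kafka Streams API in Java with TensorFlow/Deeplearning4j/H2O models; the sketch below is a minimal, library-free illustration of the same idea, where a stub scoring function and an in-memory event list stand in for a trained network and a Kafka topic (all names and the threshold are illustrative assumptions, not from the session):

```python
from collections import namedtuple

# Hypothetical event schema; in the talk this would be a record
# consumed from a Kafka topic via the Streams API.
Event = namedtuple("Event", ["sensor_id", "temperature", "vibration"])

def predict_failure(event):
    """Stub for model.predict(): a trained TensorFlow/DL4J/H2O network
    would score the event here. The weights below are made up."""
    return 0.7 * (event.temperature / 100.0) + 0.3 * (event.vibration / 10.0)

def process_stream(events, threshold=0.8):
    """Embedded-model pattern: score each event as it arrives and emit an
    alert record for high-risk events (analogous to mapValues + filter
    in a Kafka Streams topology)."""
    for event in events:
        score = predict_failure(event)
        if score >= threshold:
            yield (event.sensor_id, score)

# Two events flow through; only the high-risk one produces an alert.
incoming = [Event("s1", 40.0, 2.0), Event("s2", 95.0, 8.0)]
alerts = list(process_stream(incoming))
```

Because the model is applied inside the processing step rather than via a remote scoring service, prediction latency stays bounded by the stream processor itself, which is the main argument the session makes for this deployment style.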

Links and further reading:
– Github:
– Confluent Blog Post:
– Slides:
– Video (more detailed about Kafka + Machine Learning):


Confluent, founded by the creators of Apache Kafka®, enables organizations to harness the business value of live data. The Confluent Platform manages the barrage of stream data and makes it available throughout an organization. It provides various industries, from retail, logistics, and manufacturing to financial services and online social networking, with a scalable, unified, real-time data pipeline that enables applications ranging from large-volume data integration to big data analysis with Hadoop to real-time stream processing. To learn more, please visit


cyril owuor says:

How can I acquire the dataset you have used?
