Build robust streaming data pipelines with MongoDB and Kafka

Watch and Learn


Kafka is an event streaming platform designed for boundless streams of data: it writes events sequentially to append-only commit logs, enabling real-time data movement between your services. The MongoDB database is built to handle massive volumes of heterogeneous data. Together, MongoDB and Kafka form the heart of many modern data architectures.

The database plays a critical role in event-driven architectures. While events flow through Kafka in an append-only stream, MongoDB lets consumers persist and query those events, making data from source systems available in real time.

Watch how the MongoDB Connector for Apache Kafka simplifies building a robust streaming event pipeline.

What you will learn

How MongoDB and Kafka play vital roles in building event-driven architectures in the modern data ecosystem

Customer use cases and scenarios

Demo: installing and configuring the MongoDB Connector for Kafka to build robust, reactive data pipelines
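
To give a flavor of the configuration step the demo covers, here is a minimal sketch of a sink-connector definition for the MongoDB Connector for Kafka. The connection URI, database, collection, and topic names are illustrative placeholders, not values from the demo:

```json
{
  "name": "mongo-sink",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "topics": "orders",
    "connection.uri": "mongodb://localhost:27017",
    "database": "shop",
    "collection": "orders"
  }
}
```

Submitting a definition like this to the Kafka Connect REST API registers a connector that writes every event from the `orders` topic into the `shop.orders` collection; a source connector works the other way, publishing MongoDB change events into Kafka topics.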
