

Atlas Stream Processing

Transform how you build event-driven applications by continuously processing streams of data with a familiar developer experience.
Request Early Access
Atlas Stream Processing Explained in 3 minutes
Learn how Atlas Stream Processing combines the document model, flexible schemas, and a rich aggregation language to provide a new level of power and convenience when building applications that require processing complex event data at scale.

Stream processing like never before

When working with streaming data, schema management is critical to data correctness and developer productivity. MongoDB’s document model and familiar aggregation framework give developers powerful capabilities and productivity gains you won't find elsewhere in stream processing.

Unifying data in motion and data at rest

For the first time, developers can use one platform – across API, query language, and data model – to continuously process streaming data alongside the critical application data stored in their database.

Fully managed in Atlas

Atlas Stream Processing builds on our robust and integrated developer data platform. With just a few API calls and lines of code, a developer can stand up a stream processor, database, and API serving layer — all fully managed on any of the major cloud providers.


Continuous processing

Build aggregation pipelines to continuously query, analyze, and react to streaming data without the delays inherent to batch processing.
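As a rough illustration of the idea (a plain-Python sketch, not the Atlas API; the event shape and `device_id`/`temperature` fields are invented for this example), a continuous processor can emit results the moment each event arrives instead of waiting for a batch to close:

```python
from collections import defaultdict

def process_stream(events):
    """Filter and aggregate each event as it arrives, yielding a
    running per-device total -- no batch delay."""
    totals = defaultdict(float)
    for event in events:
        if "temperature" not in event:  # filter stage: drop incomplete events
            continue
        totals[event["device_id"]] += event["temperature"]
        yield event["device_id"], totals[event["device_id"]]

events = [
    {"device_id": "a", "temperature": 20.0},
    {"device_id": "a", "temperature": 22.5},
    {"device_id": "b"},                       # dropped by the filter
    {"device_id": "b", "temperature": 18.0},
]
running = list(process_stream(events))
```

Each yielded pair reflects state updated at event-arrival time, which is the essential contrast with batch processing.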


Continuous validation

Perform Continuous Schema Validation to check that events are properly formed before processing, detect message corruption, and flag late-arriving data that has missed a processing window.
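A minimal sketch of the concept in plain Python (this is not the Atlas validation API; the required fields and the watermark logic are assumptions made for illustration):

```python
REQUIRED_FIELDS = {"device_id", "timestamp", "value"}

def classify_event(event, watermark):
    """Classify an incoming event: 'malformed' if required fields are
    missing, 'late' if its timestamp precedes the watermark (its
    processing window has already closed), otherwise 'ok'."""
    if not REQUIRED_FIELDS.issubset(event):
        return "malformed"
    if event["timestamp"] < watermark:
        return "late"
    return "ok"

# watermark = 100: windows covering timestamps before 100 have closed
on_time = classify_event({"device_id": "a", "timestamp": 120, "value": 1}, 100)
too_late = classify_event({"device_id": "a", "timestamp": 50, "value": 1}, 100)
broken = classify_event({"timestamp": 120}, 100)
```

Routing "malformed" and "late" events to a dead-letter destination rather than discarding them silently is a common design choice in stream pipelines.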


Continuous merge

Continuously materialize views into Atlas database collections or streaming systems like Apache Kafka, maintaining fresh analytical views of data that support decision-making and action.
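In spirit, a continuous merge upserts each freshly aggregated document into a target view keyed on `_id` — update on match, insert otherwise. A plain-Python sketch (the dict stands in for an Atlas collection; document shapes are invented for illustration):

```python
def merge_into_view(view, aggregated_docs):
    """Upsert aggregated documents into a materialized view keyed on
    _id: merge fields into matching documents, insert new ones."""
    for doc in aggregated_docs:
        existing = view.get(doc["_id"], {})
        view[doc["_id"]] = {**existing, **doc}
    return view

view = {}
merge_into_view(view, [{"_id": "sensor-1", "avg_temp": 21.0}])
merge_into_view(view, [{"_id": "sensor-1", "avg_temp": 22.5},
                       {"_id": "sensor-2", "avg_temp": 19.0}])
```

Because each merge overwrites stale values in place, readers of the view always see the latest aggregate without re-running any batch job.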

Atlas Stream Processing

How does it unify the experience of working with data in motion and data at rest?
Atlas Stream Processing diagram
Event-driven applications
Paving the path to a responsive and reactive real-time business
Download the Whitepaper


Want to learn more about stream processing?
What is streaming data?
Streaming data is generated continuously from a wide range of sources. IoT sensors, microservices, and mobile devices are all common sources of high-volume streams of data. The continuous nature of streaming data, as well as its immutability, distinguishes it from static data at rest in a database.
What is stream processing?
Stream processing continuously ingests and transforms event data from an event messaging platform (such as Apache Kafka) to perform various functions. This can mean applying simple filters to remove unneeded data, performing aggregations to count or sum data as needed, creating stateful windows, and more. Stream processing can be a differentiating characteristic in event-driven applications, enabling more reactive, responsive customer experiences.
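For instance, a tumbling window partitions an endless stream into fixed, non-overlapping time buckets and aggregates within each. A minimal plain-Python sketch (the window size and event shape are assumptions for illustration, not Atlas syntax):

```python
def tumbling_window_counts(events, window_seconds):
    """Count events per fixed, non-overlapping time window.
    Each event is a (timestamp_seconds, payload) pair; the window an
    event falls into starts at ts - (ts % window_seconds)."""
    counts = {}
    for ts, _payload in events:
        window_start = ts - (ts % window_seconds)
        counts[window_start] = counts.get(window_start, 0) + 1
    return counts

events = [(1, "click"), (4, "click"), (12, "view"), (19, "click")]
counts = tumbling_window_counts(events, window_seconds=10)
```

Windows like this are what make aggregation over an endless stream tractable: the state for a window can be emitted and discarded once the window closes.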
How is event streaming different from stream processing?

Streaming data lives inside of event streaming platforms (like Apache Kafka), and these systems are essentially an immutable distributed log. Event data is published and consumed from event streaming platforms using APIs.

Developers need to use a stream processor to perform more advanced processing, such as stateful aggregations, window operations, mutations, and creating materialized views. These are similar to the operations one does when running queries on a database, except that stream processing continuously queries an endless stream of data. This area of streaming is more nascent; however, technologies such as Apache Flink and Spark Streaming are quickly gaining traction.

Atlas Stream Processing focuses on this stream processing layer. MongoDB is giving developers a better way to process streams for use in their applications, leveraging the aggregation framework.

Why did MongoDB build Atlas Stream Processing?
Stream processing is an increasingly critical component to building responsive, event-driven applications. By adding stream processing functionality as a native capability in Atlas, we can help more developers build innovative applications leveraging our multi-cloud developer data platform, MongoDB Atlas.
How do I get access to the private preview?
You can request access to the private preview of Atlas Stream Processing from this page, and our team will be in touch once it becomes available.
How is stream processing different from batch processing?

Stream processing is the processing of data continuously. In the context of building event-driven applications, stream processing enables reactive and compelling experiences like real-time notifications, personalization, route planning, or predictive maintenance.

Batch processing does not work on continuously produced data. Instead, batch processing works by gathering data over a specified period of time and then processing that static data as needed. An example of batch processing is a retail business collecting sales at the close of business each day for reporting purposes and/or updating inventory levels.
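The retail example above can be sketched in a few lines of plain Python to make the contrast concrete: batch work runs once over accumulated, static data after the collection period ends, rather than per event as each sale occurs (field names are invented for illustration):

```python
def close_of_business_report(sales):
    """Batch style: process the day's accumulated sales in one pass,
    after the collection period ends -- not as each sale happens."""
    return {
        "orders": len(sales),
        "revenue": sum(sale["amount"] for sale in sales),
    }

# sales accumulate all day; the report runs only at close of business
day_of_sales = [{"amount": 20}, {"amount": 5}, {"amount": 12}]
report = close_of_business_report(day_of_sales)
```

A stream processor would instead update the order count and revenue the moment each sale event arrived, which is what enables the real-time experiences described above.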

Request early access today

Once the private preview of Atlas Stream Processing is available, our team will be in touch.
Request Access