How Confluent Helped Notion Scale AI and Productivity for 100M+ Users
Analytics India Magazine (Ankush Das)
Notion has become the productivity companion for more than 100 million people worldwide. But reaching that scale required more than sleek design and collaborative features. It needed a real-time backbone.
In an interaction with AIM, Rohit Vyas, director of solutions engineering and customer success for South Asia at Confluent, explained how the company powered Notion’s scale, productivity gains, and AI ambitions.
The official Notion-Confluent case study echoes this transformation, noting how Confluent’s data streaming platform enabled Notion to triple its productivity while overcoming scaling limits. But where the case study summarises the shift in numbers, Vyas unpacked the details, describing a story of event-driven architecture, lean engineering teams, and real-time AI.
The SaaS Match: Notion and Confluent
Vyas said Notion was “born in the digital ecosystem” and already ran as a SaaS-first company. Its architecture had to handle millions of daily events, from task creation to collaborative edits.
“All of these events that get triggered, they are all managed and funnelled and distributed and collected over Confluent’s data streaming platform,” Vyas said.
Confluent offered Notion a fully managed SaaS service running on AWS. This gave Notion the “affordable, flexible, highly scalable and capable” backbone it needed, Vyas said.
The official blog post added that Confluent’s managed Kafka solution integrated with AWS, allowing Notion to stream events into Amazon S3, Snowflake, and PostgreSQL without managing infrastructure. This freed Notion’s lean engineering teams from running that infrastructure themselves.
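The shape of that fan-out can be sketched in miniature. The event types and sink names below are hypothetical; in a real deployment the routing is handled by Confluent’s managed connectors (S3 sink, Snowflake sink, JDBC sink), not application code:

```python
import json
from datetime import datetime, timezone

# Hypothetical routing table: which downstream stores receive which events.
# In production, managed Kafka connectors do this work, not app code.
SINK_ROUTES = {
    "page.edited": ["s3_archive", "snowflake_analytics"],
    "task.created": ["postgres_app", "snowflake_analytics"],
}

def make_event(event_type: str, workspace_id: str, payload: dict) -> dict:
    """Wrap a product event in a minimal envelope before it is produced."""
    return {
        "type": event_type,
        "workspace_id": workspace_id,
        "ts": datetime.now(timezone.utc).isoformat(),
        "payload": payload,
    }

def route(event: dict) -> list:
    """Decide which sinks should receive this event type."""
    return SINK_ROUTES.get(event["type"], [])

event = make_event("task.created", "ws_42", {"title": "Draft spec"})
print(json.dumps(event["payload"]), route(event))
```

The point of the envelope is that every consumer, from analytics to AI pipelines, sees the same well-typed event without querying the application database.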
Powering Notion AI with Real-Time Streams
Notion’s AI features, centred on “the find” and “the do,” require more than static data lakes.
“There is no AI without data, and there is no trustworthy AI without trustworthy data,” Vyas said. Confluent became the real-time supply chain for Notion’s AI workloads.
That meant streaming events as they happened, enabling retrieval-augmented generation for precise search, and feeding large language models with fresh, contextual data.
Vyas explained that Confluent Flink, the platform’s stream processing engine, played a central role. It transforms, aggregates, and routes real-time streams for Notion’s AI features.
He compared it to “tapping into a flowing pipe to modify, combine, or divert the water.” This allows real-time functions, such as helping a user find relevant information as they type.
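In Flink itself this would be written as a streaming SQL query or a DataStream job. As a rough analogy only, the generator pipeline below “taps the pipe” the way Vyas describes: one branch filters the stream, another keeps a continuously updated aggregate (all names are hypothetical):

```python
from collections import Counter
from typing import Iterable, Iterator

def edits_only(events: Iterable[dict]) -> Iterator[dict]:
    """Divert one branch of the stream: keep only edit events."""
    return (e for e in events if e["type"] == "page.edited")

def running_edit_counts(events: Iterable[dict]) -> Iterator[tuple]:
    """Continuously aggregate: emit an updated per-workspace edit count."""
    counts = Counter()
    for e in events:
        counts[e["workspace_id"]] += 1
        yield e["workspace_id"], counts[e["workspace_id"]]

stream = [
    {"type": "page.edited", "workspace_id": "ws_1"},
    {"type": "task.created", "workspace_id": "ws_1"},
    {"type": "page.edited", "workspace_id": "ws_1"},
]
print(list(running_edit_counts(edits_only(stream))))
# → [('ws_1', 1), ('ws_1', 2)]
```

A Flink job does the same shape of work, but distributed, fault-tolerant, and over unbounded streams rather than an in-memory list.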
The blog post noted that Confluent’s backbone ensures changes in Notion are instantly reflected in its vector database, keeping search and generation tools up to date.
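A minimal sketch of that loop, assuming a toy in-memory index and hand-made embeddings (a real pipeline would call an embedding model and a managed vector database): each change event upserts the document’s vector, so retrieval always reflects the latest edit.

```python
import math

# Toy in-memory "vector index", kept fresh by change events.
index = {}

def upsert(doc_id: str, embedding: list) -> None:
    """Apply a change event: overwrite the document's vector in place."""
    index[doc_id] = embedding

def cosine(a: list, b: list) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def nearest(query: list) -> str:
    """Retrieve the closest document for RAG-style grounding."""
    return max(index, key=lambda d: cosine(query, index[d]))

upsert("doc_a", [1.0, 0.0])
upsert("doc_b", [1.0, 0.1])
upsert("doc_a", [0.0, 0.9])   # user edits doc_a; the event re-embeds it
print(nearest([0.0, 1.0]))    # → doc_a
```

Without the streaming update, the index would only know `doc_a`’s old vector, and search would ground the model in stale content.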
Cost Optimisation and Reliability
Scaling is not only about performance but also about cost. Vyas explained that the total cost of ownership for data platforms comes from infrastructure, human resources, downtime risks, and reputational costs.
“If you look at these four drivers of total cost of ownership, on every [factor], Confluent comes out as extremely capable and robust,” he said.
Reliability was also key. Confluent provides 99.99% uptime annually, which is less than one hour of downtime per year.
Vyas stressed that such guarantees let Notion’s engineers stay focused on product experiences rather than firefighting outages. The Notion case study also highlighted this leap, noting how Confluent helped save time and triple productivity.
Privacy and Security at Scale
Handling sensitive user data across millions of workspaces makes privacy and security critical.
Vyas said Confluent works within cloud safeguards such as encryption at rest and in motion, firewalling, and identity management, while adding its own controls.
“We also expose a very strong capability called client-side field-level encryption, which basically encrypts the data payload completely to an extent that even Confluent can’t see it,” he explained.
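The shape of field-level encryption can be illustrated with a toy: one field is encrypted before the record leaves the client, so the broker stores only ciphertext for it. The XOR keystream below is a deliberately unrealistic stand-in; real deployments use authenticated ciphers such as AES-GCM with keys held in a KMS.

```python
import base64
import hashlib

def _keystream(key: bytes):
    """Toy keystream (NOT real cryptography) derived from the key."""
    counter = 0
    while True:
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        yield from block
        counter += 1

def encrypt_field(record: dict, field: str, key: bytes) -> dict:
    """Return a copy of the record with one field encrypted client-side."""
    data = record[field].encode()
    ct = bytes(b ^ k for b, k in zip(data, _keystream(key)))
    return {**record, field: base64.b64encode(ct).decode()}

def decrypt_field(record: dict, field: str, key: bytes) -> dict:
    ct = base64.b64decode(record[field])
    pt = bytes(b ^ k for b, k in zip(ct, _keystream(key)))
    return {**record, field: pt.decode()}

key = b"workspace-data-key"               # would come from a KMS in practice
rec = {"doc_id": "d1", "body": "quarterly plan"}
wire = encrypt_field(rec, "body", key)    # this is all the broker ever sees
assert wire["body"] != rec["body"]
assert decrypt_field(wire, "body", key) == rec
```

Because encryption and decryption both happen on the client, the platform in the middle can move and store the payload without ever being able to read the protected field, which is the property Vyas describes.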
With role-based access, strong authentication, and compliance with regional data residency rules, Notion ensures customer data is protected throughout its journey.
This layered security posture, combining AWS infrastructure, Confluent’s features, and Notion’s governance, allows it to deliver AI-driven features while meeting global privacy expectations.
The Bigger Picture
For Notion, the move to Confluent was more than infrastructure modernisation. It created a cultural shift across teams.
“If you [let] Confluent become the central data platform, then the transactional teams, operational teams, analytical teams, cyber security teams, event and logging teams and GenAI teams all play a part of that,” Vyas said.
In effect, Confluent helped Notion not only support 100 million users but also build a centralised, secure, AI-ready data foundation. This enabled faster innovation, easier data access across teams, and a sharper focus on the future of connected work.
As Vyas summed up, once freed from managing infrastructure complexity, Notion could focus its resources on improving the platform.