Real-Time Data Streaming: The Future of AI is Real-Time Data

Armend Avdijaj on the need for real-time data streaming, in conversation with Thani Shamsi

---

The need for real-time data streaming

Armend Avdijaj, co-founder and CEO of GlassFlow, has always been deeply immersed in the world of data. With a master's in data science and early experience at Mr. Spex in Berlin, he developed a keen understanding of data management and its evolving challenges. "I was always somehow attached to the data field," he says. "My first job was at Mr. Spex, where I was among the first in the data team. I was responsible for building the marketing data warehouse."

His journey into real-time data streaming began with his first startup in the influencer marketing data field. Initially relying on traditional batch processing, he soon faced increasing demands from customers for real-time data access. "We started with typical batch processing—so overnight, we ingested data into our systems, processed it, and our clients had access to updated reports the next day," he explains. "But then, customers started asking for real-time access. They wanted to build things like cost-per-click attribution for social media ad platforms. That’s when I started digging deeper into data streaming."

As he explored solutions, he quickly realized the challenges. "Even with a team of five engineers, there was no guarantee we’d succeed. The market solutions weren’t built for Python users or small, fast-moving teams," he recalls. "That’s when I thought—there must be a better way. I started talking to data engineers and heads of data, and that eventually led to starting GlassFlow."

How GlassFlow was born and how it’s being built

Having learned valuable lessons from his first startup, Armend now prioritizes product development above all else. "With my first company, I focused a lot on distribution—how to explain to people why we were different," he says. "But I learned that if you don’t spend enough time on building the right product with the right quality, everything else becomes much harder." He has since shifted his mindset. "Now, we are a product-first company. Everything revolves around building a great product. If you have an amazing product that truly solves a problem, everything else—distribution, communication, even sales—becomes much easier."

The essence of GlassFlow lies in real-time data streaming, a concept that contrasts with traditional batch processing. "With batch processing, data is collected and processed at set intervals—every few minutes, hours, or even once a day," Armend explains. "With streaming, data is processed and transferred immediately. There’s no time gap between data creation and usage."
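
To make the contrast concrete, here is a minimal Python sketch of the two models. The event source and the processing step are illustrative placeholders, not code from GlassFlow or any specific tool:

```python
import time

def process(events):
    """Stand-in transformation, e.g. filtering or enrichment."""
    return [{**event, "processed": True} for event in events]

# Batch: events accumulate and are handled at a set interval,
# so there is always a gap between data creation and data usage.
def run_batch(source, interval_seconds=3600):
    buffer, deadline = [], time.time() + interval_seconds
    for event in source:
        buffer.append(event)
        if time.time() >= deadline:
            yield from process(buffer)
            buffer, deadline = [], time.time() + interval_seconds
    yield from process(buffer)  # flush whatever is left at the end

# Streaming: every event is processed the moment it arrives; no gap.
def run_streaming(source):
    for event in source:
        yield from process([event])
```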

More and more use cases demand data in real time—anything else is too late

Armend highlights critical use cases where real-time data streaming makes a significant impact. "One example is personalization in e-commerce. If someone searches for red sneakers on a web shop, you want to show them more red sneakers right away. If you do it an hour later, they might have moved on, and you’ve lost that opportunity," he says. "Another example is logistics. Companies need real-time tracking of trucks to ensure timely deliveries. If a truck is delayed, they need to notify clients or adjust operations immediately—an hour later is too late. In these cases, you can see the business impact of real-time data access."

Beyond internal applications, real-time data sharing between companies is becoming increasingly crucial. "At Mr. Spex, we linked weather forecasts with marketing efforts. More sun hours meant increased demand for sunglasses, so we adjusted ad spending accordingly," he shares. "Car-sharing companies in Berlin do something similar. When it starts raining, more people look for cars, so they need to update their pricing and availability in real time."

There are use cases that are more technical, such as change data capture (CDC), Armend explains. “If you’re syncing between databases and one database has an event change, you want to update the other databases as well, and you want to do it as quickly as possible. Otherwise you risk that a user with accounts in several applications doesn’t see in the web app the update they just made in the mobile app, for example.”
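
As a rough illustration of the CDC pattern he describes, a sync loop might look like the sketch below. The change-event shape and the dict-based replicas are hypothetical, chosen only to show changes propagating the moment they occur:

```python
# Minimal CDC-style sync loop: change events from a source database are
# applied to replicas the moment they occur. The event shape and the
# dict-based "replicas" are hypothetical stand-ins for real databases.
def apply_change(replica, change):
    table = replica.setdefault(change["table"], {})
    if change["op"] in ("insert", "update"):
        table[change["key"]] = change["row"]
    elif change["op"] == "delete":
        table.pop(change["key"], None)

def sync(change_stream, replicas):
    for change in change_stream:           # e.g. read from the source DB's log
        for replica in replicas:
            apply_change(replica, change)  # propagate immediately, not nightly

# A profile update made in the mobile app becomes visible everywhere at once.
web_db, analytics_db = {}, {}
sync([{"op": "update", "table": "users", "key": 42,
       "row": {"email": "new@example.com"}}], [web_db, analytics_db])
print(web_db)  # {'users': {42: {'email': 'new@example.com'}}}
```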

While not every use case necessitates real-time data streaming, for many businesses, the ability to process and act on data instantaneously offers a significant competitive advantage. "For some cases, batch processing is fine," Armend acknowledges. "But in others, real-time data is a game-changer. It can mean the difference between missing or capturing an opportunity." With GlassFlow, he is making real-time data streaming more accessible than ever.

Real-time data streaming brings obstacles businesses are beginning to overcome

Real-time data streaming is becoming increasingly essential, yet many organizations struggle to implement it effectively. According to Armend Avdijaj, the challenge is twofold—both a mindset shift and a technological hurdle.

"If you have an organization that is not used to real-time use cases, there’s a mindset shift needed," Avdijaj explains. Traditional data processing allows for errors to be addressed by reloading data. In contrast, real-time streaming demands a different approach, where immediate processing and error handling are critical.

On the technology side, data streaming involves two primary phases: ingestion and transformation. "There’s some technology called a message broker that collects all the events and makes them available for other systems. That’s the first part," he says. "Then comes transformation—filtering, manipulating, and enriching data before passing it to another system." The dominant technologies in this space—Apache Kafka and Apache Flink—were developed over a decade ago and remain the industry standard. However, Avdijaj notes that they come with significant barriers to entry.
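
To make the two phases concrete, here is a minimal sketch using the kafka-python client. The topic names, broker address, and transformation logic are illustrative assumptions:

```python
import json
from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

# Phase 1: the message broker (here Kafka) collects events and makes
# them available to other systems via topics.
consumer = KafkaConsumer(
    "raw-events",                                 # illustrative topic name
    bootstrap_servers="localhost:9092",           # assumes a local broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Phase 2: transformation -- filter, manipulate, enrich -- then hand off.
for message in consumer:
    event = message.value
    if event.get("type") != "click":              # filter
        continue
    event["source"] = "webshop"                   # manipulate/enrich
    producer.send("clean-events", event)          # pass to the next system
```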

"These systems were built 12, 13 years ago, in a different world when it comes to infrastructure management and programming languages," he explains. Kafka, pioneered at LinkedIn, and Flink, developed at a university in Berlin, both require deep technical expertise. "If you go to a data team today and tell them they need to write in Java to build real-time pipelines, you’ll quickly see resistance. Many data engineers prefer Python, and suddenly they need backend engineering support. That’s a big problem."

Beyond programming constraints, setting up these systems is another challenge. "You really need to understand how to configure connectors, storage, processors, and clusters. Most companies don’t have a Kafka or Flink expert, and that’s holding them back."

For many organizations, this complexity makes real-time streaming seem out of reach. Large enterprises can afford dedicated experts, but smaller teams struggle. "80% of Fortune 500 companies are using Kafka and Flink because they have the knowledge, manpower, and budget. But when you have a smaller team and want to move fast, you get lost very quickly."

This gap in accessibility is what solutions like GlassFlow aim to address. "It’s completely native Python—low code. The infrastructure setup is managed with a few clicks. You write your Python function, run it, select connectors, and within ten minutes, your first pipeline is up and running. That makes it very attractive."
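
The interview doesn't show GlassFlow's actual SDK, so what follows is a purely hypothetical sketch of the workflow he describes (write a Python function, pick connectors, run). None of the class or parameter names below are GlassFlow's real API:

```python
# Purely hypothetical sketch of the developer experience described above.
# Pipeline and Connector are illustrative stubs, NOT GlassFlow's real SDK.
class Connector:
    def __init__(self, name):
        self.name = name  # in a managed service, configured with a few clicks

class Pipeline:
    def __init__(self, source, transform, sink):
        self.source, self.transform, self.sink = source, transform, sink

    def run(self, events):
        # A managed platform would hide cluster setup behind this call.
        return [self.transform(event) for event in events]

def handler(event: dict) -> dict:
    """The user's plain-Python transformation, run once per event."""
    event["discounted_price"] = round(event["price"] * 0.9, 2)
    return event

pipeline = Pipeline(
    source=Connector("webshop-orders"),
    transform=handler,                     # Python instead of Java
    sink=Connector("personalization-service"),
)
print(pipeline.run([{"price": 100.0}]))    # [{'price': 100.0, 'discounted_price': 90.0}]
```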

Industries are adopting real-time data streaming and AI is the frontrunner

Real-time data streaming isn't just for consumer applications—it plays a crucial role in industries like finance, IoT, and automation. "In banking, real-time transactions are critical for anomaly detection and security checks. These need to happen instantly, as the person is standing in front of the ATM," Avdijaj notes. Similarly, industries leveraging IoT sensors and robotics rely on real-time data processing for automation and efficiency.

The shift towards real-time processing is accelerating. "One year ago, my bank started offering real-time transactions. I was very impressed. Yes, they charge extra, but the experience of real-time money transfers was exciting," he shares. As organizations across industries recognize the need for immediacy, solutions that simplify real-time streaming will play a pivotal role in shaping the future of data sharing.

Unsurprisingly, the industry that has been fastest to invest in and adopt real-time data streaming technology is artificial intelligence (AI). AI is driving automation, decision-making, and user experiences, and real-time data streaming has emerged as a crucial enabler of responsive and adaptive AI systems. The ability to process and share data instantaneously is transforming how companies build AI solutions, shifting from static models trained in batches to dynamic, continuously updating models that improve over time.

"In a lot of use cases, the users of AI expect responses that are first quick, and second, take into account what you did before," says Armend Avdijaj, an expert in real-time data infrastructure. Companies offering AI-powered workflow automation, for example, must cater to users who need immediate access to evolving datasets. "If they just changed something, they want to see the impact immediately," he explains. This demand for immediacy introduces challenges at scale, particularly as AI startups transition from self-built, fast API-based solutions to enterprise-grade deployments.

As Avdijaj notes, many AI startups initially build lightweight, self-contained systems that handle one request at a time efficiently. "The moment they start to go out and want to sell it to enterprises, the load is so big that it doesn't work anymore," he observes. Scaling AI systems means guarding against data loss, processing delays, and pipeline failures, all of which require robust real-time data streaming infrastructure. "This is where real-time data streaming comes in. This is where the infrastructure grows with usage, which is pretty awesome," he adds.

Beyond responsiveness, the way AI models learn and adapt is also changing. Traditionally, AI models are trained in batches, with companies storing data in warehouses and periodically releasing new model versions. "I think the majority of companies are running this model optimization in batches," Avdijaj confirms. However, a shift towards real-time optimization, often called online optimization, is gaining traction. "It means, while the model is in the wild, it starts to learn, or it keeps learning," he explains.
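
One common way to implement this kind of online optimization in Python is a model that supports incremental updates, for example via scikit-learn's partial_fit interface. The event stream below is synthetic, for illustration only:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier  # supports incremental learning

# Batch training would fit once on a warehouse export; online optimization
# keeps updating the model as labeled events stream in.
model = SGDClassifier(loss="log_loss")
classes = np.array([0, 1])  # all classes must be declared on the first call

def on_event(features, label):
    """Called once per event: the model keeps learning 'in the wild'."""
    model.partial_fit(np.array([features]), np.array([label]), classes=classes)

# Synthetic event stream, for illustration only.
rng = np.random.default_rng(0)
for _ in range(200):
    x = rng.normal(size=3)
    on_event(x, int(x.sum() > 0))

print(model.predict([[1.0, 1.0, 1.0]]))  # clearly positive features -> class 1
```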

One compelling use case for online optimization is AI-driven fitness coaching. "Let's say you have an AI fitness coach, and you just did an activity. The AI can measure the heartbeat changes in real-time without waiting for the heartbeat batch to be updated," Avdijaj suggests. "Depending on the activity, the model needs to suggest the next one in real time." While still in its early stages, this approach holds immense potential, particularly for applications where personalized, real-time recommendations enhance user experiences.

Data transformation goes hand-in-hand with real-time data streaming

Effective real-time data streaming isn’t just about collecting and moving data—it also involves transformation. "Theoretically, you can just ingest data and make it available somewhere else. But as you can imagine, different data sources come in different formats that need handling," Avdijaj explains. Transforming data ensures consistency across systems, whether by changing formats, filtering unnecessary information, or enriching data with external sources. "Say you want to understand which city a particular geo-location refers to," he says. "You connect to Google Maps and figure out it’s actually in Berlin, and then which street and number, etc. That’s enrichment of data."
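
A transformation step of the kind he describes (normalize, filter, enrich) might be sketched as follows; the reverse-geocoding lookup is stubbed out where a real pipeline would query a service such as the Google Maps API:

```python
# Illustrative transformation step: normalize, filter, then enrich.
# reverse_geocode is a stub standing in for a real service call
# (e.g. the Google Maps reverse-geocoding API).
def reverse_geocode(lat, lon):
    return {"city": "Berlin", "street": "Torstrasse", "number": "1"}  # stubbed

def transform(event):
    # Normalize: different sources use different field names.
    lat = event.get("lat", event.get("latitude"))
    lon = event.get("lon", event.get("longitude"))
    if lat is None or lon is None:
        return None                                # filter out unusable events
    return {"lat": lat, "lon": lon,
            "address": reverse_geocode(lat, lon)}  # enrich with external data

print(transform({"latitude": 52.52, "longitude": 13.40}))
```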

As real-time data sharing becomes more prevalent, its future will be shaped by simplification and broader accessibility. "Simplification is important because that increases adoption," Avdijaj predicts. AI-driven coding tools are making it easier for developers to integrate data streaming capabilities without deep technical expertise. "These AI tools will very often have data streaming under the hood and be using it," he notes. “And that's how AI developers will also build an edge over other companies. Their answers will be derived from data streamed in real time, and not the newspaper from yesterday.”
