The Agentic Future of Data Analytics
How will data analytics change with the rise of AI? This is my view on the most prominent use cases and how AI will shape the domain going forward.
First, let me apologise for the silence on my end. It's been an engaging couple of months: presenting at conferences, stepping up to lead an entire engineering organisation, and becoming a second-time father. I'll be reducing the number of blog posts, but will keep publishing opinionated pieces on the intersection of software engineering, data & AI. As a consequence I'll move away from informative posts and focus more on sharing my views and opinions. Enjoy this post!
The domain of data analytics has changed significantly over the past decades: from the inception of data warehouses in the 90s to the adoption of the first business intelligence (BI) tooling from the likes of Oracle, SAP and IBM in the 00s. The 10s were characterised by the rise of cloud data warehouses (Snowflake, BigQuery, Databricks) and modern BI tools (Tableau, PowerBI, Looker). Fast forward to the present and we've entered the era of embedded analytics tools (Metabase, Embeddable, Luzmo) and are starting to observe the first agentic data analytics tools (Upsolve.ai, WrenAI).
But what does the future really bring, and which use cases should we double down on? In this post I'll dive into the following:
Natural language in (embedded) data analytics
Generative BI: a game changer?
The advent of data agents
Generating entire customised apps with tailored insights and actions
1. Natural language in (embedded) data analytics
For years, the promise of self-service analytics has been dangled in front of us. The idea is to empower everyone in an organisation to answer their own questions with data. In reality, as I've seen countless times, this often means giving business users various dashboards and hoping for the best.
I've spent a significant amount of time working with tools like Looker. While powerful, building a truly comprehensive and accurate semantic layer, i.e. the LookML models that define all your metrics and business logic, is an immense task. Data analytics teams invest months, sometimes years, trying to get it right. And even then, you're building foundations for dashboards that are essentially a set of pre-canned answers to questions you think your users will have. The moment they have a slightly different question, they're back asking for a different dataset or dashboard.
This is where I believe natural language can be a complete game-changer. Imagine replacing a rigid dashboard with a simple textual search bar. Instead of clicking through filters and charts, a product manager can just ask, "Which of my new product feature launches were performing poorly from a click rate perspective over the past 30 days?" This bypasses the entire problem of an analyst having to anticipate every single query.
Of course, this isn't magic. As I mentioned in a previous post on self-serve analytics, you still need solid data fundamentals. For a natural language query to work, the system needs to understand what your "product", "feature" and "click rate" definitions are. The need for an accurate, well-maintained semantic layer doesn't disappear; if anything, it becomes even more critical. But it changes the point of interaction. The complexity is hidden behind an intuitive, human interface.
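To make this concrete, here's a minimal sketch of how a semantic layer could ground natural language querying. The layer's format, the metric definitions and the `llm_complete()` helper are all illustrative assumptions on my part, not any specific product's API:

```python
# Minimal sketch: ground a natural-language question in semantic-layer
# definitions before asking an LLM to generate SQL. Everything below is
# illustrative; no real product schema is implied.

SEMANTIC_LAYER = {
    "click_rate": {
        "sql": "SUM(clicks) / NULLIF(SUM(impressions), 0)",
        "description": "clicks divided by impressions",
    },
    "feature_launch": {
        "sql": "product_events.feature_name",
        "description": "a product feature released to users",
    },
}

def build_prompt(question: str) -> str:
    """Embed the metric definitions so the model cannot invent its own."""
    definitions = "\n".join(
        f"- {name}: {spec['sql']} ({spec['description']})"
        for name, spec in SEMANTIC_LAYER.items()
    )
    return (
        "Translate the question into SQL using ONLY these definitions:\n"
        f"{definitions}\n\nQuestion: {question}\nSQL:"
    )

question = "Which feature launches performed poorly on click rate in the last 30 days?"
print(build_prompt(question))
# sql = llm_complete(build_prompt(question))  # hypothetical LLM call; plug in your provider
```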
Embedded analytics
Lately, I've been particularly impressed by the developments in embedded analytics. Working with tools like Embeddable, it's clear that bringing insights directly into the applications people use every day is the future. It removes the friction of context-switching. Now, combine this with generative AI. A user inside a SaaS application doesn't just see an embedded chart; they can have a conversation with their data right where they work. I believe this is what true self-service analytics could look like.
The challenge, however, remains in teaching the system the specific terminology of all its various users. The way a marketing team talks about "leads" can be vastly different from the sales team's definition, and the AI needs to navigate that nuance. Is it impossible to overcome? No, but humans will still play a role in teaching the AI system the right terminology.
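As a rough illustration of what that teaching could look like, here's a sketch where per-team vocabulary is resolved to canonical semantic-layer fields, with unmapped terms escalated to a human. All names here are made up:

```python
# Illustrative sketch: team-specific synonyms resolved to canonical
# semantic-layer fields, with a human in the loop for unknown terms.

SYNONYMS = {
    "marketing": {"lead": "marketing_qualified_lead"},
    "sales": {"lead": "sales_accepted_opportunity"},
}

def resolve_term(team: str, term: str) -> str:
    """Map a team-specific term to its canonical semantic-layer field."""
    try:
        return SYNONYMS[team][term]
    except KeyError:
        # Unmapped vocabulary: this is where a human teaches the system once,
        # after which the mapping is persisted for future queries.
        raise LookupError(f"'{term}' is not mapped yet for team '{team}'")

print(resolve_term("marketing", "lead"))  # marketing_qualified_lead
print(resolve_term("sales", "lead"))      # sales_accepted_opportunity
```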
2. Generative BI: a game changer?
The shift to natural language querying is just the first step. The next, more interesting leap I'm observing is in Generative BI (GenBI). This is where we move from asking questions of existing data and visualisations to having AI generate entirely new assets (queries, datasets, visualisations and dashboards) for us. In the past, I've spent countless hours preparing for board meetings, company updates and tech presentations, compiling data from different sources, building slide decks, and trying to craft a compelling narrative. It's necessary work, but often tedious and repetitive.
I envision a future where I can give a generative BI system a high-level prompt: "Create a presentation on my product's performance last quarter. Include a breakdown of merchants, analyse the feature adoption rates for our latest release, and provide a forecast for next quarter based on current trends." The GenBI system would then do the heavy lifting: querying the databases, creating the visualisations, and calculating the forecasts. It could even help in arranging the insights onto slides and writing the narrative for a presentation.
Tools like Upsolve.ai and WrenAI are among the first to take steps in the direction of GenBI. Using only natural language, entire dashboards can be created to answer data analytics questions. If the result doesn't fully answer your question, you can ask the tool to revise the dashboard or do a drill-down yourself.
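One way to picture the intermediate artifact such a tool might produce: a declarative dashboard spec that can be reviewed, validated or patched before anything is rendered. The schema below is invented purely for illustration:

```python
import json

# What a prompt like "show product performance per feature" might compile to.
# The spec format is hypothetical, not any vendor's actual schema.
dashboard_spec = {
    "title": "Product performance (last quarter)",
    "charts": [
        {
            "type": "bar",
            "measure": "click_rate",
            "dimension": "feature_name",
            "filters": [{"field": "launched_at", "op": ">=", "value": "2025-01-01"}],
        },
        {"type": "line", "measure": "feature_adoption_rate", "dimension": "week"},
    ],
}

# A revision request ("drill down by merchant segment") would patch or
# regenerate this spec instead of rebuilding the dashboard by hand.
print(json.dumps(dashboard_spec, indent=2))
```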
The implications? In my view this doesn't replace the need for data analytics teams. Instead, it frees them from report building so they can focus on more fundamental work. You will still need a well-defined semantic layer with good dimension and measure definitions. Once that's covered, a data analyst leveraging GenBI can spend their time validating the AI's output, digging deeper into the anomalies it flags, and crafting the high-level strategy that the data points towards. It changes the analyst's role from data aggregator to data strategist.
3. The advent of data agents
The final, and perhaps most exciting, part of this vision is taking analytics completely outside the confines of a specific tool. This is the world of agentic AI, where intelligent assistants operate at the browser or even the desktop level, capable of interacting with multiple applications to fulfil a request.
Agentic data crawlers
I've been experimenting with some fascinating new tools in this space. Take something like Firecrawl, an AI-native crawler that can scrape and structure information from websites. I can imagine pointing it at a handful of competitor websites and saying, "Analyse the product features and pricing tiers for all these companies and present a comparative analysis." A task that would have previously involved days of manual scraping and analysis can now be kicked off with a single command.
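For a rough idea of how little code such a kick-off could take, here's a sketch using Firecrawl's Python SDK. The exact method names and response shapes vary between SDK versions, so treat them as assumptions and check the current docs; the analysis step and the `llm_complete()` helper are hypothetical:

```python
# Rough sketch: scrape competitor pricing pages with Firecrawl, then hand
# the structured content to an LLM for the comparative analysis step.
# SDK details are from memory and may differ per version; verify against docs.

from firecrawl import FirecrawlApp

app = FirecrawlApp(api_key="fc-...")  # your Firecrawl API key

competitor_urls = [
    "https://competitor-a.example.com/pricing",
    "https://competitor-b.example.com/pricing",
]

# scrape_url returns the page as structured content (e.g. markdown);
# the exact response shape depends on the SDK version you're on.
pages = [app.scrape_url(url) for url in competitor_urls]

# prompt = "Compare product features and pricing tiers:\n" + "\n---\n".join(
#     str(page) for page in pages
# )
# analysis = llm_complete(prompt)  # hypothetical LLM helper
```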
Agentic browsers
Then you have agentic browsers, like StrawberryAI, which can actually interact with web pages on your behalf. This is where it gets truly powerful. I can foresee a workflow where I ask my browser agent: "Log into my Google Analytics account, pull the latest user engagement report for my personal website, cross-reference the top traffic sources with my various channels, and analyse where I should put more focus."
The agent would autonomously navigate between these different tabs and platforms, authenticating, clicking buttons, downloading reports, and compiling the information. It could become my personal, on-demand data analyst. The distinction between a BI tool, an analytics tool like Google Analytics, and a spreadsheet starts to blur. These workflows can define the agentic future we're moving towards.
Agentic desktop apps
Agentic desktop applications would, in my mind, be an extension of agentic browsers. Instead of only interacting with your browser, they would also be able to interact with any desktop application they might need. This gives them significantly more tooling options and more autonomy to drive their own discovery. It does make them quite a bit more difficult to train though, as the possible scope of application usage and tooling options is enormous.
Are we there yet? Not really. In my experience these agentic tools still require detailed instructions and occasional user interventions. I think there are still a lot of gains to be made and innovations to come in this area, as the most prominent user interactions will (for now) still happen through web UIs.
4. Generating entire customised apps with tailored insights and actions
Combining all of the above, I believe a pattern starts to emerge: natural language interactions and bespoke experiences. We're moving beyond generating reports or dashboards; I believe we're entering the realm of generating bespoke (analytics) applications on the fly.
Think about the limitations of a dashboard, even a generative one. It's a fantastic tool for displaying information, but it's fundamentally a read-only experience. Throughout my career, the most impactful analysis was never just about finding an answer to a pre-defined question; it was about what you did next. It involved identifying patterns, using them as guidance to iterate, measuring the change through experimentation, and then taking appropriate next steps. This workflow has always required me to go through multiple tools and manual steps.
This is where custom and tailored AI-generated applications come in. I envision a future where I can move beyond asking for a chart and instead ask for a tool that neatly integrates with my data. For example, a product manager could prompt an agent with: "Build me an interactive churn analysis app. Connect to my production database and CRM. Show me the main reasons for churn in the last 60 days, broken down by merchant segment. List the at-risk merchants who fit this profile. I want a button next to each merchant's name that adds them to a 'Retention' campaign in our CRM and assigns a task to their account manager."
In this scenario, the AI isn't just presenting data. It's building a functional, single-purpose application with a UI, business logic, and the ability to perform actions by calling APIs, writing SQL queries and more. It has the potential to combine the skills of a data analyst, a back-end engineer, and a front-end developer to deliver a solution tailored to a specific problem. The human role in this process becomes more like a product manager for their own tools. The critical skill is no longer SQL or dashboard design, but the ability to precisely articulate a business problem and the logic of the desired solution.
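To make this tangible, here's a heavily simplified sketch of what such a generated single-purpose app could boil down to: a read path listing at-risk merchants, and an action path that pushes one of them into a retention campaign. Flask is just one possible target, and the CRM endpoint, the query and all field names are hypothetical:

```python
# Simplified sketch of a generated churn-analysis micro-app: one route to
# read at-risk merchants, one route to act on them. All endpoints, data
# and field names are invented for illustration.

from flask import Flask, jsonify
import requests

app = Flask(__name__)

AT_RISK = [  # in a real app this would come from a generated SQL query
    {"merchant_id": 42, "name": "Acme Corp", "churn_risk": 0.87},
]

@app.route("/at-risk")
def list_at_risk():
    return jsonify(AT_RISK)

@app.route("/retain/<int:merchant_id>", methods=["POST"])
def add_to_retention(merchant_id: int):
    # The "button next to each merchant" calls this route, which in turn
    # hits a hypothetical CRM API and assigns a task to the account manager.
    resp = requests.post(
        "https://crm.example.com/api/campaigns/retention/members",
        json={"merchant_id": merchant_id, "assign_task": True},
        timeout=10,
    )
    return jsonify({"merchant_id": merchant_id, "crm_status": resp.status_code})

if __name__ == "__main__":
    app.run(debug=True)
```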
Is it possible to do this currently without programming knowledge? To a certain extent. But thorough architectural design, well-thought-out instructions and manual interventions are, as of today, still necessary. Eventually, I believe we will move from being users of generic software to architects of our own personalised applications. Tools that can deliver this experience in a trustworthy, secure and bounded way will offer users significantly differentiated experiences. Tools like Lovable and Replit are already shaping these types of experiences, albeit for full-fledged web applications. Imagine a tool that automatically retrieves your data, picks up your business logic, and from there generates tailored but bounded applications. This, to me, is the true potential of the agentic future in data analytics.
Thank you for reading another edition of The Data Canal. To support me in writing my content, please do not forget to subscribe.