
Introducing Analyst Agent: Deploy custom AI agents for self-service analytics in minutes
TL;DR: Fabi.ai Workflows are the easiest way to automate data insight pipelines and push insights right to where your team works, regardless of where your data lives. You can connect to Snowflake, Google Sheets, Airtable, Slack, or any other popular enterprise tool. Fabi's AI Analyst Agent can assist in building these workflows to expedite the process.
It's 8 am, and your CEO is already asking about this quarter's at-risk deals while sipping their morning coffee. Instead of frantically pulling up dashboards and exporting data to spreadsheets, imagine if those insights were already waiting in their inbox—complete with accurate AI-generated summaries and actionable recommendations.
This isn't a pipe dream. This is the future of personalized insight pipelines.
Here's the uncomfortable truth about modern business intelligence: Most "BI" platforms are actually business dashboard graveyards.
We've all been there. Your team spends weeks building the perfect dashboard, complete with every metric stakeholders requested. Fast forward three months, and it's collecting digital dust while everyone exports data to spreadsheets anyway.
Why? Because dashboards require users to come to them: remember to log in, navigate the interface, and hunt for the answer they need.
Meanwhile, your stakeholders are living in Slack, email, and Google Sheets—the tools they open first thing every morning, where real business decisions happen.
Legacy BI was built in an era that no longer exists. This created three critical issues for anyone doing data work:
Legacy BI was built when data teams had to carefully craft isolated dashboards and metrics to "tell a story." These were static, canned dashboards that didn't adjust to the data or their audience. They required users to remember to log in, navigate complex interfaces, and then, inevitably, export everything to a spreadsheet anyway.
Now with AI, expectations have completely changed. We expect to use AI seamlessly in our data analysis workflows. We expect updates that aren't just canned reports, but meaningful insights extracted from our data.
Data teams are evolving faster than ever before, but legacy BI solutions aren't keeping pace. The way data practitioners will make an impact in the future isn't by producing countless dashboards or queries—it's by setting up repeatable, intelligent pipelines that meet users where they actually are.
When we first launched our Smartbooks for data analysis, one of the very first requests we received was to turn these into internally and externally shareable dashboards. We quickly delivered, and they were rapidly adopted. But then we noticed something fascinating: our users were leveraging the AI to write custom Python scripts that pushed data to various destinations—primarily Slack, email, and Google Sheets.
As we always do at Fabi.ai, we got on calls with some of our amazing users to better understand their workflows. This opened up a whole new world: builders, product leaders, and data teams didn't just want dashboards—they wanted to automate mini “insights pipelines” that delivered intelligence directly to their workspaces.
We started pulling this thread and came to a powerful conclusion: modern data practitioners need a way to build repeatable workflows using data from any source and push insights directly to the tools their teams already know and love.
Existing tools, particularly legacy BI solutions, weren't designed with modern data workflows in mind, and certainly not with AI capabilities. They're built for a world where email was the primary communication tool and "pipeline_final_final.xls" was cutting-edge collaboration. Today's organizations need something fundamentally different.
So, we decided it was time to introduce a data insights pipeline platform unlike anything that exists today. With Fabi.ai Workflows, users can:
If you’re a visual learner, you can watch me build a complete workflow from scratch in 10 minutes:
Ready to see how simple this actually is? Let's build a workflow that would make any sales ops team weep with joy.
In this example, we'll take a messy list of sales opportunities with cryptic notes like "customer seems interested 🤞", use AI to extract real insights, analyze everything with SQL and Python, then automatically send an executive summary via email while pushing clean data to Google Sheets. All without writing a single line of code.
And this template is also available in our gallery if you want to start leveraging it immediately.
If you don't already have a Fabi account, you can create one for free in under a minute.
We'll pull our deals from Google Sheets using this public dataset (AI-generated synthetic data for demo purposes), but you could just as easily connect Snowflake, Airtable, or any other data source.
From a blank Smartbook:
Your data is now stored as a Python DataFrame, ready for analysis.
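Under the hood, this is just a pandas DataFrame. Outside of Fabi's connector cell, a minimal offline sketch of the same shape of data looks like the following (the column names here are assumptions based on this walkthrough, not the actual demo sheet's schema):

```python
import io
import pandas as pd

# Stand-in for the CSV export of the demo Google Sheet; in Fabi, the
# Google Sheets cell does this pull for you. Columns are illustrative.
csv_data = """Opportunity,Segment,Amount,Close_Date,Notes
Acme Corp,Enterprise,120000,2025-03-15,customer seems interested
Globex,SMB,15000,2025-02-01,waiting on budget approval
Initech,Enterprise,90000,2025-03-28,champion left the company
"""

# With a live sheet, you could point read_csv at its CSV export URL instead.
opps = pd.read_csv(io.StringIO(csv_data))
print(opps.shape)  # (3, 5)
```

From here, every downstream cell (SQL, Python, AI) can reference this DataFrame by name.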
Here's where it gets interesting. Instead of manually reviewing hundreds of sales notes (while crying softly into our coffee), we'll let AI do the heavy lifting.
First, we'll process the Notes field to flag opportunities that present risk based on sales rep notes. Under your Google Sheets cell, add an AI Enrichment cell. This cell takes a DataFrame as input and creates new fields based on prompts you provide.
Configure your AI Enrichment cell with the following:
Tip: You may need to refine your prompt—the more specific you are, the better for ensuring repeatability.
Hit Run and watch AI categorize every deal based on those cryptic sales notes. This creates a new DataFrame with your original data plus the new Risk_Level column.
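Conceptually, the enrichment step maps each row's Notes value to a new column. This sketch fakes the LLM call with a keyword heuristic purely for illustration; in the real AI Enrichment cell, an LLM applies your prompt to each row:

```python
import pandas as pd

# Illustrative stand-in for the AI Enrichment cell. The keyword rules and
# column names below are assumptions for the demo, not Fabi internals.
def classify_risk(note: str) -> str:
    note = note.lower()
    if any(kw in note for kw in ("champion left", "stalled", "no budget")):
        return "High"
    if any(kw in note for kw in ("waiting", "seems interested")):
        return "Medium"
    return "Low"

opps = pd.DataFrame({
    "Opportunity": ["Acme Corp", "Globex", "Initech"],
    "Notes": ["customer seems interested", "waiting on budget approval",
              "champion left the company"],
})

# New DataFrame: original columns plus the enriched Risk_Level column.
processed_opps = opps.assign(Risk_Level=opps["Notes"].map(classify_risk))
print(processed_opps["Risk_Level"].tolist())  # ['Medium', 'Medium', 'High']
```

Note that the original DataFrame is left untouched; the enrichment produces a new one.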
Now that we have our data prepped, let’s turn it into insights that executives actually care about.
At this point, we've imported our opportunities and categorized each one by risk level, independent of sales rep input. Since we're focusing on enterprise deals closing this quarter, I'll ask our AI Analyst: "Using @processed_opps, write a query to filter on enterprise deals closing this quarter."
The AI generates the SQL automatically. Add it as a code block to your workflow.
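The generated query is ordinary SQL over the enriched DataFrame. As a rough sketch of what the AI Analyst produces (table and column names are assumptions mirroring this walkthrough, with quarter bounds hard-coded for the demo):

```python
import sqlite3

# Recreate the enriched data in an in-memory SQL table for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE processed_opps
                (opportunity TEXT, segment TEXT, close_date TEXT, risk_level TEXT)""")
conn.executemany(
    "INSERT INTO processed_opps VALUES (?, ?, ?, ?)",
    [("Acme Corp", "Enterprise", "2025-03-15", "Medium"),
     ("Globex", "SMB", "2025-02-01", "Medium"),
     ("Initech", "Enterprise", "2025-03-28", "High")],
)

# Enterprise deals closing this quarter (Q1 2025 bounds hard-coded for the demo).
rows = conn.execute("""
    SELECT opportunity, risk_level
    FROM processed_opps
    WHERE segment = 'Enterprise'
      AND close_date BETWEEN '2025-01-01' AND '2025-03-31'
""").fetchall()
print(rows)  # [('Acme Corp', 'Medium'), ('Initech', 'High')]
```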
For the visualization, I’ll ask: "Using this data, plot the total number of deals by risk category."
Again, the AI writes the Python code. Accept the suggestion and pin the chart to your workflow, or keep asking the AI for updates until you're satisfied and then pin the result.
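Behind the chart, the generated code is simply grouping and counting deals per risk category. A minimal sketch of that aggregation (the actual cell renders it with a plotting library):

```python
from collections import Counter

# Risk levels produced by the enrichment step (illustrative values).
risk_levels = ["Medium", "Medium", "High", "High", "Low", "High"]

# Total number of deals per risk category -- the data behind the bar chart.
deal_risk_distribution = Counter(risk_levels)
print(dict(deal_risk_distribution))  # {'Medium': 2, 'High': 3, 'Low': 1}
```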
Here's the magic moment—instead of your stakeholders hunting for insights, insights hunt for them (helpfully).
Let’s set up the final step: Distribution. We'll send go-to-market leadership a weekly report with a link to a spreadsheet for deeper inspection.
To start, we want to push to Google Sheets by doing the following:
Next, we want to generate our AI summary for the email.
This generates an AI summary of your DataFrame that dynamically updates each time your workflow runs. Next, create an Email cell and specify variables using {{}}. Here's the email structure I used:
Weekly summary of at-risk enterprise deals:
{{ai_executive_summary}}
Deal distribution:
{{deal_risk_distribution_fig}}
{{dataframe1}}
Data: [Link to your Google Sheet]
Reminder: Replace [Link to your Google Sheet] with your actual sheet URL so stakeholders can inspect the data directly.
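The `{{}}` placeholders work like standard template substitution. Here is a simplified sketch of the idea, assuming string values only (the real Email cell also renders DataFrames and charts inline):

```python
import re

# Minimal sketch of {{variable}} substitution like the Email cell's templating.
# Unknown variables are left in place rather than erased.
def render(template: str, context: dict) -> str:
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(context.get(m.group(1), m.group(0))),
                  template)

email_body = "Weekly summary of at-risk enterprise deals:\n{{ai_executive_summary}}"
print(render(email_body, {"ai_executive_summary": "2 deals need attention."}))
```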
Finally, you can schedule this workflow to run weekly, and your stakeholders will get fresh insights each Monday morning without you lifting a finger.
To make this all “just work”, there is some serious engineering magic happening behind the scenes:
Python-native architecture: Everything operates in Python, so all components work together and the AI understands the entire workflow. This allows seamless integration between SQL, Python, and low/no-code tools while enabling AI assistance across all components. The magic is in our architecture, not in proprietary languages or frameworks that AI struggles with (as is the case with legacy BI, thanks to tech debt and proprietary code).
Enterprise-grade security: Connecting to external systems requires handling sensitive keys and tokens. We manage these securely, ensuring they're never exposed to AI or unauthorized users. For connectors like Airtable, you can interact directly with our secret manager. We love AI, but we take privacy extremely seriously.
Scalable AI processing: AI enrichment cells can process up to 10,000 rows (we can work with organizations on higher limits). We chunk DataFrames, process them in parallel batches with LLM providers, then restructure results–all while keeping your original data untouched.
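The chunk-then-parallelize pattern described above can be sketched as follows. This is an illustrative outline, not Fabi's implementation; the `enrich_batch` function is a toy stand-in for a batched LLM call:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for sending one batch of rows to an LLM provider.
def enrich_batch(batch):
    return [f"enriched:{row}" for row in batch]

rows = [f"row{i}" for i in range(10)]

# Split the input into fixed-size chunks.
chunk_size = 3
batches = [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]

with ThreadPoolExecutor(max_workers=4) as pool:
    # executor.map preserves batch order, so results line up with the input
    # even though batches are processed in parallel.
    results = [item for batch in pool.map(enrich_batch, batches) for item in batch]

print(len(results), results[0])  # 10 enriched:row0
```

Because the enrichment writes its output to a new structure, the original rows remain untouched, matching the guarantee described above.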
We're not just building workflows, we're building the future of how organizations consume data intelligence.
This is just the beginning. We predict the rise of personal data pipelines and perhaps even a new "Insights Operations" role dedicated to building and maintaining these automated intelligence systems. Fabi workflows are designed for rapid addition of new and improved components.
Here are some blocks we're considering:
AI agent cells: Think of this as your personal data analyst embedded in workflows. Give it any inputs and context, and it generates DataFrames, charts, or summaries on-demand while you sleep.
New workflow cells: We're adding Google Slides, Gamma, Microsoft Teams, and more based on customer requests. Want a specific integration? Just ask—we move fast.
Smart triggers: Today workflows run on schedules, but soon they'll respond to external events. Imagine Slacking your personal data analyst a question and getting back analysis within minutes.
We're extremely excited about the future of data analysis in the AI era. We’re in the early stages of AI and firmly believe it will unlock workflows and productivity levels that previously seemed far-fetched.
If you haven’t already created your Fabi.ai account, you can get started for free and build your first workflow in just a few minutes. You can also check our full template library to get a jumpstart on building your insight pipelines.
Welcome to the future of data insights.