Introducing Fabi.ai Workflows: Deliver insights where they matter

TL;DR: Fabi.ai Workflows are the easiest way to automate data insight pipelines and push insights right to where your team works, regardless of where your data lives. You can connect to Snowflake, Google Sheets, Airtable, Slack, or any other popular enterprise tool, and Fabi's AI Analyst Agent can help you build these workflows faster.

It's 8 am, and your CEO is already asking about this quarter's at-risk deals while sipping their morning coffee. Instead of frantically pulling up dashboards and exporting data to spreadsheets, imagine if those insights were already waiting in their inbox—complete with accurate AI-generated summaries and actionable recommendations.

This isn't a pipe dream. This is the future of personalized insight pipelines.

The problem: Legacy BI has critically failed data and product teams 

Here's the uncomfortable truth about modern business intelligence: Most "BI" platforms are actually business dashboard graveyards.

We've all been there. Your team spends weeks building the perfect dashboard, complete with every metric stakeholders requested. Fast forward three months, and it's collecting digital dust while everyone exports data to spreadsheets anyway.

Why? Because dashboards require users to:

  • Remember they exist
  • Navigate to them proactively
  • Understand complex interfaces
  • Export everything to actually use the data

Meanwhile, your stakeholders are living in Slack, email, and Google Sheets—the tools they open first thing every morning, where real business decisions happen.

Legacy BI was built in an era that no longer exists. This created three critical issues for anyone doing data work:  

  1. Restricted data access: Legacy BI confines itself to data warehouses. While we love data warehouses, ignoring that most businesses run on spreadsheets, Airtable, and other sources is ignoring reality. Real analysis requires data from multiple sources—your warehouse, that critical spreadsheet from finance, the Airtable base from marketing.
  2. Misaligned incentives: Legacy BI vendors want every user logging into their platform so they can sell more seats, even when someone only needs insights once a month. The whole point of finding insights is sharing them to impact the business—what good does that do if you're hesitant to give access to everyone who needs it?
  3. Dashboard graveyards: Because of the points above, legacy BI creates fancy dashboard graveyards rather than business "intelligence" solutions. Few business users proactively search for dashboards with their morning coffee. When they do, they immediately hit the export button and load data into spreadsheets (goodbye data lineage).

Legacy BI was built when data teams had to carefully craft isolated dashboards and metrics to "tell a story." These were static, canned dashboards that didn't adjust to the data or their audience. They required users to remember to log in, navigate complex interfaces, and then, inevitably, export everything to a spreadsheet anyway. 

Now with AI, expectations have completely changed. We expect to use AI seamlessly in our data analysis workflows. We expect updates that aren't just canned reports, but meaningful insights extracted from our data.

Data teams are evolving faster than ever before, but legacy BI solutions aren't keeping pace. The way data practitioners will make an impact in the future isn't by producing countless dashboards or queries—it's by setting up repeatable, intelligent pipelines that meet users where they actually are.

The solution: Fabi.ai Workflows and AI that actually helps analysis

When we first launched our Smartbooks for data analysis, one of the very first requests we received was to turn these into internally and externally shareable dashboards. We quickly delivered, and they were rapidly adopted. But then we noticed something fascinating: our users were leveraging the AI to write custom Python scripts that pushed data to various destinations—primarily Slack, email, and Google Sheets.

As we always do at Fabi.ai, we got on calls with some of our amazing users to better understand their workflows. This opened up a whole new world: builders, product leaders, and data teams didn't just want dashboards. They wanted to automate mini “insights pipelines” that delivered intelligence directly to their workspaces.

We started pulling this thread and came to a powerful conclusion: modern data practitioners need a way to build repeatable workflows using data from any source and push insights directly to the tools their teams already know and love.

Existing tools, particularly legacy BI solutions, weren't designed with modern data workflows in mind, and certainly not with AI capabilities. They're built for a world where email was the primary communication tool and "pipeline_final_final.xls" was cutting-edge collaboration. Today's organizations need something fundamentally different.

So, we decided it was time to introduce a data insights pipeline platform unlike anything that exists today. With Fabi.ai Workflows, users can:

  • Connect to all data sources: Snowflake, Databricks, MotherDuck, Google Sheets, Airtable, and more. (Don't see the connector you need? Just let us know!)
  • Process data with integrated tools: SQL for querying and pivoting, Python for advanced data analysis and visualization, AI for natural language processing and generating summaries, all working seamlessly together.
  • Push insights to your favorite destinations: Send AI-generated, customized data and messages via email, Slack, or Google Sheets (Again, missing a destination? Reach out, we can easily add it).

If you’re a visual learner, you can watch me build a complete workflow from scratch in 10 minutes: 

Building your first workflow

Ready to see how simple this actually is? Let's build a workflow that would make any sales ops team weep with joy.

In this example, we'll take a messy list of sales opportunities with cryptic notes like "customer seems interested 🤞", use AI to extract real insights, analyze everything with SQL and Python, then automatically send an executive summary via email while pushing clean data to Google Sheets. All without writing a single line of code.

This template is also available in our template gallery if you want to start using it right away.

Step 1: Create a Fabi account and connect your data (2 minutes) 

If you don't already have a Fabi account, you can create one for free in under a minute.

We'll pull our deals from Google Sheets using this public dataset (AI-generated synthetic data for demo purposes), but you could just as easily connect Snowflake, Airtable, or any other data source.

Google Sheets import cell in Fabi.ai

From a blank Smartbook:

  1. Add a Google Sheets Pull cell
  2. Authenticate your Google Workspace account
  3. Select your spreadsheet and sheet
  4. Hit Run

Your data is now stored as a Python DataFrame, ready for analysis.
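
If you're curious what that pull amounts to under the hood, here's a minimal standalone sketch in plain Python. The sheet ID and tab name are placeholders, and this approach only works for publicly shared sheets; inside Fabi, the pull cell handles authentication and private sheets for you.

```python
import pandas as pd

# Placeholder ID and tab name for a publicly shared Google Sheet.
SHEET_ID = "your-sheet-id-here"
SHEET_NAME = "Opportunities"

# Google Sheets can export a public tab as CSV, which pandas reads directly.
url = (
    f"https://docs.google.com/spreadsheets/d/{SHEET_ID}/gviz/tq"
    f"?tqx=out:csv&sheet={SHEET_NAME}"
)
opportunities = pd.read_csv(url)
print(opportunities.head())
```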

Step 2: Let AI process your Notes field (3 minutes)

Here's where it gets interesting. Instead of manually reviewing hundreds of sales notes (while crying softly into our coffee), we'll let AI do the heavy lifting.

First, we'll process the Notes field to flag opportunities that present risk based on sales rep notes. Under your Google Sheets cell, add an AI Enrichment cell. This cell takes a DataFrame as input and creates new fields based on prompts you provide.

Configure your AI Enrichment cell with the following: 

  • Input DataFrame: Select your opportunities DataFrame (the one from your Google Sheets pull)
  • Column to process: Choose the "Notes" field from the dropdown
  • Output DataFrame name: Give it a name like "processed_opps"
  • New column name: Enter "Risk_Level"
  • AI Prompt: "Mark each deal as 'Low', 'Medium' or 'High' risk based on the Notes. Use only those categories."

Tip: You may need to refine your prompt—the more specific you are, the better for ensuring repeatability.

Hit Run and watch AI categorize every deal based on those cryptic sales notes. This creates a new DataFrame with your original data plus the new Risk_Level column.
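
Conceptually, the enrichment cell is doing something like the sketch below for every row: send the note plus your prompt to an LLM and store the answer in a new column. This sketch uses the OpenAI Python client as a stand-in and assumes the `opportunities` DataFrame from Step 1; Fabi handles the model calls, batching, and API keys for you.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

PROMPT = (
    "Mark this deal as 'Low', 'Medium' or 'High' risk based on the note. "
    "Reply with only one of those three words.\n\nNote: {note}"
)

def classify_risk(note: str) -> str:
    """Ask the model to label a single sales note."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[{"role": "user", "content": PROMPT.format(note=note)}],
    )
    return response.choices[0].message.content.strip()

# Mirror the cell config: new DataFrame name plus a new Risk_Level column.
processed_opps = opportunities.copy()
processed_opps["Risk_Level"] = processed_opps["Notes"].apply(classify_risk)
```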

AI cells in Fabi make it easy to interweave SQL, Python and AI for data analysis

Step 3: Analyze your data with SQL and Python (3 minutes) 

Now that we have our data prepped, let’s turn it into insights that executives actually care about.

At this point, we've imported our opportunities and categorized each one by risk level, without asking reps to tag anything manually. Since we're focusing on enterprise deals closing this quarter, I'll ask our AI Analyst: "Using @processed_opps, write a query to filter on enterprise deals closing this quarter."

Fabi's AI Analyst Agent can help you "vibe-code" workflows

The AI generates the SQL automatically. Add it as a code block to your workflow.
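
The exact query depends on your column names, but for a dataset with `Segment` and `Close_Date` columns (assumptions for this demo), the generated SQL might look roughly like the sketch below. Here it runs locally with DuckDB, which can query a pandas DataFrame by its variable name:

```python
import duckdb

# Segment and Close_Date are assumed column names; Close_Date is assumed to be
# ISO-formatted. Adjust both to match your own sheet.
enterprise_deals = duckdb.sql(
    """
    SELECT *
    FROM processed_opps
    WHERE Segment = 'Enterprise'
      AND CAST(Close_Date AS DATE) >= DATE_TRUNC('quarter', CURRENT_DATE)
      AND CAST(Close_Date AS DATE) <  DATE_TRUNC('quarter', CURRENT_DATE) + INTERVAL 3 MONTH
    """
).df()
```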

For the visualization, I’ll ask: "Using this data, plot the total number of deals by risk category."

AI-generated Python script for a custom chart

Again, the AI writes the Python code. Accept the suggestion and pin the chart to your workflow, or keep asking the AI for updates until you're satisfied and then pin the result.
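
The generated chart code will vary, but counting deals per risk bucket and plotting them might look something like this sketch (using Plotly Express; `enterprise_deals` comes from the previous step):

```python
import plotly.express as px

# Count deals in each risk bucket produced by the enrichment step.
risk_counts = (
    enterprise_deals["Risk_Level"]
    .value_counts()
    .rename_axis("Risk_Level")
    .reset_index(name="Deal_Count")
)

deal_risk_distribution_fig = px.bar(
    risk_counts,
    x="Risk_Level",
    y="Deal_Count",
    title="Enterprise deals closing this quarter, by risk level",
    category_orders={"Risk_Level": ["Low", "Medium", "High"]},
)
deal_risk_distribution_fig.show()
```

Giving the figure a descriptive name (here `deal_risk_distribution_fig`) pays off later when you reference it in the email template.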

Step 4: Automate distribution to email and Google Sheets (2 minutes) 

Here's the magic moment—instead of your stakeholders hunting for insights, insights hunt for them (helpfully).

Let’s set up the final step: Distribution. We'll send go-to-market leadership a weekly report with a link to a spreadsheet for deeper inspection.

To start, we want to push to Google Sheets by doing the following (a rough standalone equivalent is sketched after these steps):

  • Add a Google Sheets Push cell
  • Select your target spreadsheet (you're already authenticated)
  • Hit Run to transfer your processed data

Push Python DataFrames right back to Google Sheets
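
For reference, doing the same push outside of Fabi with the gspread library looks roughly like this; the service account file and spreadsheet title are placeholders, and the push cell replaces all of it with your existing Google Workspace authentication.

```python
import gspread

# Placeholder credentials file and spreadsheet title.
gc = gspread.service_account(filename="service_account.json")
worksheet = gc.open("Enterprise deals at risk").sheet1

# Overwrite the tab with a header row plus the processed rows.
worksheet.clear()
worksheet.update(
    [enterprise_deals.columns.tolist()]
    + enterprise_deals.astype(str).values.tolist()
)
```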

Next, we want to generate our AI summary for the email (a conceptual sketch of this step follows the instructions below).

  • Under the SQL cell filtering enterprise deals, add an AI Summarization cell 
  • Input your filtered enterprise deals with risk ratings (dataframe1)
  • Prompt: "Generate an executive summary highlighting key deals at risk"
  • Switch from Preview to Run mode and click Run to create dynamic summaries

Generate AI summaries to include in your reports
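
Under the hood, this step amounts to handing the filtered DataFrame and your prompt to an LLM, roughly like the sketch below (reusing the hypothetical OpenAI client from the Step 2 sketch; Fabi manages the call and re-runs it on every scheduled execution):

```python
# Build a compact text representation of the filtered deals for the model.
deals_text = enterprise_deals.to_csv(index=False)

summary_response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model choice
    messages=[
        {
            "role": "user",
            "content": (
                "Generate an executive summary highlighting key deals at risk.\n\n"
                + deals_text
            ),
        }
    ],
)
ai_executive_summary = summary_response.choices[0].message.content
```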

This generates an AI summary of your DataFrame that dynamically updates each time your workflow runs. Next, create an Email cell and specify variables using {{}}. Here's the email structure I used:

Weekly summary of at-risk enterprise deals:
{{ai_executive_summary}}

Deal distribution:
{{deal_risk_distribution_fig}}

{{dataframe1}}

Data: [Link to your Google Sheet]

Reminder: Replace [Link to your Google Sheet] with your actual sheet URL so stakeholders can inspect the data directly.
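
The template substitution itself works like any templating engine. Here's a minimal sketch with Jinja2 of how the pieces from the earlier steps could be rendered into an email body; the sheet URL is a placeholder, and Fabi also takes care of embedding the chart image and actually sending the email.

```python
from jinja2 import Template

EMAIL_TEMPLATE = Template(
    """Weekly summary of at-risk enterprise deals:
{{ ai_executive_summary }}

Deal distribution: (chart attached separately)

{{ deals_table }}

Data: {{ sheet_url }}
"""
)

body = EMAIL_TEMPLATE.render(
    ai_executive_summary=ai_executive_summary,  # from the summarization sketch
    deals_table=enterprise_deals.to_string(index=False),
    sheet_url="https://docs.google.com/spreadsheets/d/your-sheet-id",  # placeholder
)
print(body)
```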

Finally, you can schedule this workflow to run weekly, and your stakeholders will get fresh insights each Monday morning without you lifting a finger. 

Under the hood: The engineering magic behind Workflows

To make this all “just work”, there is some serious engineering magic happening behind the scenes: 

Python-native architecture: Everything operates in Python so that all components work together and the AI understands every step. This allows seamless integration between SQL, Python, and low/no-code tools while enabling AI assistance across all components. The magic is in our architecture, not in proprietary languages or frameworks that AI struggles with (as is the case with legacy BI, thanks to too much tech debt and proprietary code).

Enterprise-grade security: Connecting to external systems requires handling sensitive keys and tokens. We manage these securely, ensuring they're never exposed to AI or unauthorized users. For connectors like Airtable, you can interact directly with our secret manager. We love AI, but we take privacy extremely seriously.

Scalable AI processing: AI enrichment cells can process up to 10,000 rows (we can work with organizations on higher limits). We chunk DataFrames, process them in parallel batches with LLM providers, then restructure results–all while keeping your original data untouched. 
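
In simplified form, that batching logic looks something like the sketch below: split the DataFrame into chunks, enrich the chunks on a thread pool, and reassemble them in the original order. This is an illustrative sketch, not Fabi's actual implementation, which also handles retries, rate limits, and provider-specific quirks.

```python
from concurrent.futures import ThreadPoolExecutor

import pandas as pd

CHUNK_SIZE = 200  # illustrative batch size, not Fabi's actual setting

def enrich_chunk(chunk: pd.DataFrame) -> pd.DataFrame:
    """Label one batch of rows, leaving the input chunk untouched."""
    chunk = chunk.copy()
    chunk["Risk_Level"] = chunk["Notes"].apply(classify_risk)  # classify_risk from the Step 2 sketch
    return chunk

def enrich_in_parallel(df: pd.DataFrame, max_workers: int = 8) -> pd.DataFrame:
    chunks = [df.iloc[i : i + CHUNK_SIZE] for i in range(0, len(df), CHUNK_SIZE)]
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        enriched = list(pool.map(enrich_chunk, chunks))
    # Reassemble in the original row order; the source DataFrame is never mutated.
    return pd.concat(enriched).loc[df.index]
```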

The future of data insights pipelines

We're not just building workflows; we're building the future of how organizations consume data intelligence.

This is just the beginning. We predict the rise of personal data pipelines and perhaps even a new "Insights Operations" role dedicated to building and maintaining these automated intelligence systems. Fabi workflows are designed for rapid addition of new and improved components. 

Here are some blocks we're considering:

AI agent cells: Think of this as your personal data analyst embedded in workflows. Give it any inputs and context, and it generates DataFrames, charts, or summaries on-demand while you sleep.

New workflow cells: We're adding Google Slides, Gamma, Microsoft Teams, and more based on customer requests. Want a specific integration? Just ask—we move fast.

Smart triggers: Today workflows run on schedules, but soon they'll respond to external events. Imagine Slacking your personal data analyst a question and getting back analysis within minutes.

We're extremely excited about the future of data analysis in the AI era. We’re in the early stages of AI and firmly believe it will unlock workflows and productivity levels that previously seemed far-fetched.

If you haven’t already created your Fabi.ai account, you can get started for free and build your first workflow in just a few minutes. You can also check our full template library to get a jumpstart on building your insight pipelines. 

Welcome to the future of data insights. 
