
🏦 Agentic Banking App with SQL in Fabric

Agentic Banking App is an interactive web application designed to simulate a modern banking dashboard. Its primary purpose is to serve as an educational tool, demonstrating:

  • How SQL-based databases are leveraged across different types of workloads: OLTP, OLAP, and AI.
  • How agile AI-driven analysis and insight discovery can be enabled via prescriptive data models in Fabric.

Through a hands-on interface, users can see the practical difference between writing a new transaction to the database, running complex analytical queries on historical data, and using natural language to ask an agent to query the database for them.
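For example, the contrast between the first two workloads comes down to the shape of the SQL involved. A minimal sketch (the table and column names here are illustrative, not the app's actual schema):

# OLTP: a small, atomic write -- one transfer, one row.
oltp_write = """
INSERT INTO transactions (account_id, amount, category, created_at)
VALUES (?, ?, ?, SYSDATETIME());
"""

# OLAP: an aggregate scan over historical data.
olap_query = """
SELECT category, SUM(amount) AS total_spent
FROM transactions
WHERE created_at >= DATEADD(month, -1, SYSDATETIME())
GROUP BY category
ORDER BY total_spent DESC;
"""

# The AI workload starts from natural language instead; an agent produces
# a query like olap_query above on the user's behalf.
question = "How much did I spend on groceries last month?"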


✨ Features

Banking Application

  • The Banking Dashboard: A central hub to view account balance and navigate the application.
  • Transactions (OLTP): View a real-time list of all past transactions. This demonstrates a typical high-volume, read-heavy OLTP workload.
  • Money Transfer (OLTP): Perform transfers between accounts. This showcases a classic atomic, write-heavy OLTP operation that must be fast and reliable.
  • Financial Analytics (OLAP): Explore an analytics dashboard with charts and summaries of spending habits. This represents an OLAP workload, running complex, aggregate queries over a large dataset.
  • AI Agent (AI Workload):
    • Ask questions about your finances in plain English (e.g., "How much did I spend on groceries last month?"). An AI agent translates your question into a SQL query, executes it, and returns the answer (a simplified sketch follows this list).
    • Get customer support answers via RAG (retrieval-augmented generation) over documents.
    • Open a new account or transfer money in plain English through an action-oriented data agent.
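A minimal sketch of the translate-execute-answer loop such an agent performs, using the LangChain and pyodbc packages from the tech stack below. This is illustrative, not the repo's actual implementation, and it assumes the environment variables described in the setup section:

import os
import pyodbc
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_KEY"],
    azure_deployment=os.environ["AZURE_OPENAI_DEPLOYMENT"],
    api_version="2024-06-01",
)

def ask(question: str, schema_hint: str) -> str:
    # 1. Translate the natural-language question into a single T-SQL query.
    sql = llm.invoke(
        "Translate this question into one T-SQL query. Return only SQL.\n"
        f"Schema: {schema_hint}\nQuestion: {question}"
    ).content
    # 2. Execute the query against the Fabric SQL database.
    with pyodbc.connect(os.environ["FABRIC_SQL_CONNECTION_URL_BANK_DATA"]) as conn:
        rows = conn.cursor().execute(sql).fetchall()
    # 3. Have the model phrase the raw rows as a plain-English answer.
    return llm.invoke(f"Question: {question}\nRows: {rows}\nAnswer briefly.").content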

Agent Analytics Service

  • A separate service that captures, in real time, the operational data generated by the agentic solution and stores it in a Fabric SQL database via a prescribed data model (see the sketch below).
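Conceptually, capturing one agent interaction could look like the following sketch (the agent_events table and its columns are assumptions for illustration, not the repo's prescribed data model):

import os
from datetime import datetime, timezone
import pyodbc

def log_agent_event(session_id: str, question: str, generated_sql: str, latency_ms: int) -> None:
    # Write one operational-data row per agent interaction.
    with pyodbc.connect(os.environ["FABRIC_SQL_CONNECTION_URL_AGENTIC"]) as conn:
        conn.cursor().execute(
            "INSERT INTO agent_events (session_id, question, generated_sql, latency_ms, logged_at) "
            "VALUES (?, ?, ?, ?, ?)",
            session_id, question, generated_sql, latency_ms, datetime.now(timezone.utc),
        )
        conn.commit()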

Agentic app insights and analytics in Fabric

  • As the app is used, the agentic operational data is captured, modeled, and surfaced as valuable analytics and insights via Fabric features such as reports, the data agent, and notebooks.

🛠️ Tech Stack

Layer      Technology
Frontend   React, Vite, TypeScript, Tailwind CSS
Backend    Python, Flask, LangChain, LangGraph
Database   Fabric SQL Database
AI         Azure OpenAI API

🔧 Prerequisites

  • Node.js (v18 or later)
  • Python 3.11.9 (this exact version is required)
  • A Fabric workspace
  • An Azure OpenAI API Key
  • ODBC Driver for SQL Server 18
  • VS Code is recommended; the demo has been tested only in VS Code
  • This demo currently runs only on Windows, as it relies on ActiveDirectoryInteractive authentication

Set up required resources

1. Fork and clone the Repository

  • Fork the repo (click the Fork button; this creates a copy of the repo under your GitHub account).
  • Clone the forked repo: navigate to your forked repository on GitHub, click the "Code" button, and copy the URL provided for cloning. Then run the following in a terminal:
git clone <copied url of forked repo>
cd <sql-agentic-app-with-fabric or main repo folder name if you have changed the name when forking>

2. Set up your Fabric account

  • If you do not already have access to a Fabric capacity, you can enable a free Microsoft Fabric trial capacity, which gives you 60 days of access to all the features needed for this demo: https://learn.microsoft.com/en-us/fabric/fundamentals/fabric-trial

  • On the Home tab (titled "Welcome to Fabric"), click "New workspace" and create your workspace for this demo.

3. Automatic set up of all required Fabric resources and artifacts

To set up your Fabric workspace with all required artifacts for this demo, link your Fabric workspace to your repo:

  • Go to your workspace and click "Workspace settings" at the top right of the page
  • Go to the Git integration tab
  • Click the GitHub tile, then click "Add account"
  • Choose a name and paste a fine-grained personal access token for the repo you just forked (unsure how to generate one? There are many tutorials online, such as: https://thetechdarts.com/generate-personal-access-token-in-github/)
  • Paste the repo URL (your fork)
  • After connecting to the repo, the same tab shows options for the branch and folder name. The branch should be "main" and the folder name should be "Fabric_artifacts"
  • Click "Connect and Sync"
  • This starts pulling all Fabric artifacts from the repo into your workspace, which may take a few minutes

4. Configure Environment Variables

Before running the application, you need to configure your environment variables in a .env file. This file stores all the secret keys and connection strings your application needs to connect to Azure and Microsoft Fabric resources.

Rename the sample file: In the backend directory, find the file named .env.sample and rename it to .env.

Edit the variables: Open the new .env file and fill in the values for the following variables:

-> Microsoft Fabric SQL Databases

FABRIC_SQL_CONNECTION_URL_BANK_DATA: The connection string for the database containing the sample customer banking data. You can find it in your Fabric workspace by navigating to this database's SQL endpoint, clicking "Settings" -> "Connection strings", switching to the "ODBC" tab, and copying the SQL connection string.

FABRIC_SQL_CONNECTION_URL_AGENTIC: The connection string for the Fabric SQL database that stores the application's operational data (e.g., chat history). It is found the same way: navigate to the database's SQL endpoint, click "Settings" -> "Connection strings", switch to the "ODBC" tab, and copy the SQL connection string.
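Both connection strings target ODBC Driver 18 and use the ActiveDirectoryInteractive authentication noted in the prerequisites. A hedged sketch of what opening such a connection looks like from Python (the server and database values are placeholders; use the string copied from the dialog above):

import pyodbc

# Rough shape of a Fabric ODBC connection string (placeholders, not real values):
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<server-from-connection-strings-dialog>,1433;"
    "Database=<your-database>;"
    "Authentication=ActiveDirectoryInteractive;"  # triggers the sign-in pop-up mentioned later
    "Encrypt=yes;"
)

conn = pyodbc.connect(conn_str)
print(conn.cursor().execute("SELECT 1").fetchone())  # smoke test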

-> Azure OpenAI Services

AZURE_OPENAI_KEY: Your API key for the Azure OpenAI service. You can find this in the Azure Portal by navigating to your Azure OpenAI resource and selecting Keys and Endpoint.

AZURE_OPENAI_ENDPOINT: The endpoint URL for your Azure OpenAI service. This is found on the same Keys and Endpoint page in the Azure Portal.

AZURE_OPENAI_DEPLOYMENT: The name of your chat model deployment (e.g., "gpt-5-mini"). This is the custom name you gave the model when you deployed it in Azure OpenAI Studio.

AZURE_OPENAI_EMBEDDING_DEPLOYMENT: The name of your embedding model deployment (e.g., "text-embedding-ada-002").
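Before launching, you can sanity-check that every variable is set. A small sketch using python-dotenv (an assumption; verify the package is listed in requirements.txt):

import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the current directory

required = [
    "FABRIC_SQL_CONNECTION_URL_BANK_DATA",
    "FABRIC_SQL_CONNECTION_URL_AGENTIC",
    "AZURE_OPENAI_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_DEPLOYMENT",
    "AZURE_OPENAI_EMBEDDING_DEPLOYMENT",
]
missing = [name for name in required if not os.getenv(name)]
print("Missing variables:", missing or "none")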

5. Populate your database with sample data

  • In your Fabric workspace, go to the banking_db database.
  • Click on "New Query" from the top menu
  • In your local repo folder, go to Data_Ingest, copy the contents of the banking.sql file, paste them into the query tab you opened in Fabric, and click Run
  • Sample data should now be populated in the banking_db tables.

Run the app locally!

Now that all resources are set up, follow the steps below to run and test the app:

1. Install Backend Requirements (Flask API)

In the root project directory, run the following commands:

python -m venv venv
.\venv\Scripts\activate # (on Windows)
pip install -r requirements.txt

2. Configure the Frontend (React + Vite)

From the root project directory:

npm install

3. Run Jupyter Notebook to create embeddings from the PDF Document

You need to ingest embeddings from the PDF into the SQL database (a conceptual sketch follows the steps below).

  1. Copy the .env file into the Data_Ingest folder.
  2. Open the Jupyter Python notebook at the path: Data_Ingest/Documentation ingestion_pdf_Bank_App.ipynb
  3. Ensure the kernel is pointing to the "venv" virtual environment you created previously
  4. Run all the cells in the notebook (you will be prompted for your Fabric username and password, so watch out for that pop-up!)
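Conceptually, the embedding step the notebook performs looks like the following sketch (illustrative, not the notebook's actual code; it assumes the openai Python package and the environment variables configured earlier):

import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_KEY"],
    api_version="2024-06-01",
)

def embed(chunk: str) -> list[float]:
    # One embedding vector per chunk of PDF text.
    resp = client.embeddings.create(
        model=os.environ["AZURE_OPENAI_EMBEDDING_DEPLOYMENT"],
        input=chunk,
    )
    return resp.data[0].embedding

# Each (chunk, vector) pair is then written to the Fabric SQL database so the
# customer-support agent can retrieve relevant passages via RAG.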

4. Run the Application

Open two terminal windows.

Terminal 1: Start Backend

Run the following:

python launcher.py

This will launch two services (a conceptual sketch of the launcher follows below):

  1. Banking service on: http://127.0.0.1:5001
  2. Agent analytics service on: http://127.0.0.1:5002

You will be prompted for your Fabric credentials during this, so watch out for window pop-ups, including in the taskbar!
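A conceptual sketch of what a launcher like this does (the script names below are assumptions; see launcher.py for the actual implementation):

import subprocess
import sys

# Start the two Flask services as child processes on their respective ports.
banking = subprocess.Popen([sys.executable, "banking_service.py"])      # port 5001
analytics = subprocess.Popen([sys.executable, "analytics_service.py"])  # port 5002

# Keep the launcher alive until both services exit.
banking.wait()
analytics.wait()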

Terminal 2: Start Frontend

Ensure you are in the root project folder and run the following:

npm run dev

The frontend will run on: http://localhost:5173


Explore Agentic Analytics

As you use the app:

  • The operational data is stored in the agentic_app_db Fabric SQL database
  • The Agentic_lake lakehouse gets updated
  • The data model captured via agentic_semantic_model is refreshed
  • The Agent_Insights report is updated based on the most recent data

Contributing

Contributions are welcome! If you have suggestions for improvements or find any bugs, feel free to open an issue or submit a pull request.
