Overview

Workflow Analytics provides visibility into how AI workflows are performing in production: what users are asking, how workflows respond, where errors occur, and how behavior changes over time. It acts as an observability layer for AI workflows, helping teams monitor usage and surface issues early. Every time a workflow runs, AI Squared captures data including:
  • When the workflow was initiated
  • The status (success or failure)
  • Performance characteristics such as run time
This data is aggregated and analyzed in the Workflow Analytics dashboard, which shows how often workflows are used, how often they succeed or fail, user feedback on responses, and more.

Accessing Workflow Analytics

Workflow Analytics is accessible from the AI Workflows section of the AI Squared platform. Each workflow has its own analytics view, giving teams visibility from a high-level overview down to individual execution logs. Analytics can be filtered by time range (for example, the last 30 days) to analyze recent behavior or longer-term trends.

Analytics Overview Metrics

At the top of the Workflow Analytics view, AI Squared displays a set of metrics that give a quick snapshot of workflow health and usage.

Workflow Runs

Workflow Runs represents the total number of times the workflow has been executed during the selected time period.

High run counts indicate strong user engagement, while low counts may point to discoverability or relevance issues.

Workflow Errors

Workflow Errors shows the number of executions that failed. Failures can occur for various reasons, such as SQL query errors, MCP server failures, or invalid input. This metric helps organizations measure the reliability of their AI systems.

Success Percentage

Success Percentage is the proportion of successful workflow runs relative to total runs. This metric provides a quick health check:
  • High success percentage indicates stable and reliable workflows
  • Lower success percentage highlights the need for optimization
Teams use this as a key deciding factor before rolling a workflow out to a broader audience.
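
As a hedged sketch (not part of the AI Squared API, and the status labels are assumptions), the Success Percentage calculation described above can be expressed as:

```python
# Illustrative sketch: Success Percentage over a list of run statuses.
# The "success"/"failure" labels are assumptions for the example.
def success_percentage(statuses):
    """Return the share of successful runs as a percentage (0-100)."""
    if not statuses:
        return 0.0
    successes = sum(1 for s in statuses if s == "success")
    return 100.0 * successes / len(statuses)

runs = ["success", "success", "failure", "success"]
print(success_percentage(runs))  # 75.0
```

A high value here corresponds to the "stable and reliable" case above, while a low one flags workflows that need optimization before a wider rollout.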

Average Run Time

Average Run Time measures how long workflows take to complete, from start to finish. This includes:
  • Input processing
  • Data retrieval or tool calls
  • Response creation and formatting
Monitoring run time is important for tracking user experience. Sudden increases can point to slow data queries or other service issues.
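
A minimal sketch of the Average Run Time metric, assuming each run is recorded as a (start, end) timestamp pair (the record shape is illustrative, not the platform's actual schema):

```python
from datetime import datetime

# Illustrative sketch: Average Run Time as the mean of (end - start)
# across completed runs, in seconds.
def average_run_time_seconds(runs):
    durations = [(end - start).total_seconds() for start, end in runs]
    return sum(durations) / len(durations) if durations else 0.0

runs = [
    (datetime(2024, 1, 1, 12, 0, 0), datetime(2024, 1, 1, 12, 0, 4)),
    (datetime(2024, 1, 1, 13, 0, 0), datetime(2024, 1, 1, 13, 0, 6)),
]
print(average_run_time_seconds(runs))  # 5.0
```

Tracking this value over a time window is what makes sudden slowdowns (for example, from slow data queries) visible.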

Feedback Percentage

Feedback Percentage indicates the percentage of workflow runs for which users provided explicit feedback. This metric helps teams understand:
  • How often users are engaging with feedback mechanisms
  • Whether there is enough signal to evaluate response quality
Low feedback percentages may suggest that feedback prompts are unclear or too intrusive.

Positive Feedback

Positive Feedback represents the proportion of feedback that was positive. This metric is a qualitative signal layered on top of system performance metrics. Even when workflows are technically successful, low positive feedback can indicate:
  • Irrelevant or incomplete answers
  • Poor response formatting or presentation
  • Mismatch between user intent and workflow behavior
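
The two feedback metrics above can be sketched together. This is a hypothetical example: the run records and the "positive"/"negative"/None feedback values are assumptions, not the platform's actual data model.

```python
# Hedged sketch: Feedback Percentage (share of runs with any feedback)
# and Positive Feedback (share of that feedback which was positive).
# A "feedback" value of None means the user gave no feedback.
def feedback_metrics(runs):
    total = len(runs)
    given = [r["feedback"] for r in runs if r["feedback"] is not None]
    feedback_pct = 100.0 * len(given) / total if total else 0.0
    positive_pct = 100.0 * given.count("positive") / len(given) if given else 0.0
    return feedback_pct, positive_pct

runs = [
    {"feedback": "positive"},
    {"feedback": "negative"},
    {"feedback": None},
    {"feedback": "positive"},
]
print(feedback_metrics(runs))
```

Computing the positive share only over runs that received feedback keeps the sentiment signal separate from the engagement signal.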

Workflow Execution Logs

Workflow Analytics provides a detailed view of individual workflow executions. Each log entry includes the following fields:
  • Flow ID: A unique identifier for each workflow execution.
  • Flow Initiated: The timestamp indicating when the workflow was triggered.
  • Input: The Input column captures the user query or trigger that initiated the workflow.
  • Output: The Output field stores the response generated by the workflow. This may include summaries, table data, SQL queries and others.
  • Status: The Status column indicates whether the workflow execution completed successfully or failed.
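
The fields above can be summarized as a simple record type. This is an illustrative sketch for reasoning about log entries, not a real SDK class:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical mirror of the documented execution-log columns.
@dataclass
class WorkflowRunLog:
    flow_id: str              # unique identifier for the execution
    flow_initiated: datetime  # when the workflow was triggered
    input: str                # user query or trigger that started the run
    output: str               # generated response (summary, table data, SQL, ...)
    status: str               # whether the run succeeded or failed

entry = WorkflowRunLog(
    flow_id="run-001",
    flow_initiated=datetime(2024, 1, 1, 9, 30),
    input="Show last month's revenue",
    output="Revenue summary table ...",
    status="success",
)
print(entry.status)  # success
```

Having input, output, and status together in one record is what makes it possible to diagnose a failed run without reproducing it.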

Using Workflow Analytics Effectively

Workflow Analytics is designed to be used without deep setup or configuration. Key usage patterns include:
  • Track overall workflow health by reviewing performance and success data over a 30‑day period.
  • Diagnose failures quickly by inspecting individual runs, reviewing inputs, outputs, and execution status together.
  • Evaluate response quality by combining system success metrics with user feedback signals.
  • Share and review data externally using export functionality, enabling collaboration across product, engineering, and data teams.
Workflow Analytics is designed to be lightweight, making it easy to adopt as part of regular workflow review and iteration.

Feedback and Continuous Improvement

Workflow Analytics surfaces both quantitative performance metrics and qualitative user feedback.
  • Feedback Percentage reflects how often users provide feedback on workflow responses.
  • Positive Feedback highlights overall sentiment toward workflow outputs.
In addition to viewing feedback directly in the Analytics UI, AI Squared is actively developing capabilities to export feedback data for specific requirements.

Exporting Analytics Data

Workflow Analytics allows you to export analytics and feedback data so it can be analyzed to improve future performance. Exported data can be used to:
  • Review workflow performance with stakeholders
  • Perform deeper offline analysis
  • Feed user feedback into model improvement or fine‑tuning pipelines
This makes Workflow Analytics a feedback mechanism for continuous improvement.
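
As a hypothetical sketch of offline analysis on an export, assuming a CSV with `flow_id`, `status`, and `run_time_seconds` columns (match these names to your actual export file):

```python
import csv
import io

# Illustrative exported data; in practice this would be the downloaded file.
exported = io.StringIO(
    "flow_id,status,run_time_seconds\n"
    "run-001,success,3.2\n"
    "run-002,failure,0.8\n"
)

rows = list(csv.DictReader(exported))
success_rate = 100.0 * sum(r["status"] == "success" for r in rows) / len(rows)
print(f"{len(rows)} runs, {success_rate:.0f}% success")  # 2 runs, 50% success
```

The same parsed rows could feed stakeholder reports or be joined with feedback data for model-improvement pipelines, as described above.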