You built the workflow. It runs. The basic logs say "success." But what does that really mean? Is the new automated process actually more efficient? Is it saving the company money? Where are the hidden bottlenecks that are costing you time and resources?
For many developers and engineering teams, once a business process is automated, it becomes a "black box." Data is scattered across logs, and translating system-level events into clear business value—like ROI and overall health—is a manual, time-consuming nightmare.
What if you could answer those crucial questions and prove the value of your work with a simple API call?
Enter Analytics.do. We provide AI-powered analytics as a service, designed to help you measure what matters. This guide will show you how to go from zero to full workflow observability in under 15 minutes.
Automating a process like order fulfillment, customer onboarding, or data processing is a huge win. But the victory is incomplete without visibility: is the new process actually faster, is it cheaper, and where is it quietly failing?
Traditional monitoring tools are great for server health, but they don't understand your business workflows. Analytics.do bridges that gap.
Analytics.do is built on a simple premise: gaining deep process intelligence shouldn't require a massive engineering effort. By instrumenting your code with a few lines, you can send execution data to our platform. Our AI then gets to work, aggregating, analyzing, and transforming raw events into a clear picture of your workflow's performance and value.
Validate. Optimize. Scale. With Analytics.do, you can finally validate that your automation delivers real value, pinpoint where it falls short, and scale it with confidence.
Let's get practical. We'll use a common example: an order-processing-workflow. Our goal is to track its performance, cost, and overall efficiency.
First, head over to Analytics.do and sign up for an account. Once you're in, create a new project and grab your unique API key. Keep it handy.
Before writing any code, identify what you want to measure. For our order-processing-workflow, we care about the completion rate, the average duration of each run, the error rate, and the cost per execution.
These map directly to the kind of insights Analytics.do provides.
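One lightweight way to keep those goals front and center is to write them down as data before you instrument anything. The snippet below is only a planning checklist, not part of the Analytics.do API: the metric names mirror what we'll send per execution, and the targets are the ones used in the example report later in this guide.

# Planning checklist only -- these names and targets reappear in the
# event payload and in the report further down.
ORDER_WORKFLOW_METRICS = {
    "completion_rate": {"target": ">= 98%"},
    "average_duration_seconds": {"target": "< 120"},
    "error_rate": {"target": "< 0.5%"},
    "cost_per_execution_usd": {"target": "< 0.05"},
}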
Now, let's add the tracking code. Analytics.do is API-first, so you can use any language that can make an HTTP request. Here's a conceptual example in Python, using a fictional SDK, showing how you'd wrap your existing logic.
# Fictional Python SDK example for clarity
import time
import uuid

import analytics_do

# 1. Configure the client with your key
analytics_do.api_key = "YOUR_API_KEY_HERE"

def process_order(order_data):
    """
    Your existing workflow logic for processing an order.
    """
    workflow_id = "order-processing-workflow"
    execution_id = str(uuid.uuid4())  # A unique ID for this specific run
    start_time = time.time()
    status = "success"
    error_message = None
    cost = 0.03  # A calculated or estimated cost for this run

    try:
        # == YOUR EXISTING LOGIC STARTS HERE ==
        print(f"Processing order {order_data['id']}...")
        # Step 1: Validate payment
        # Step 2: Check inventory
        # Step 3: Schedule shipping
        print("Order processed successfully.")
        # == YOUR EXISTING LOGIC ENDS HERE ==
    except Exception as e:
        status = "failed"
        error_message = str(e)
        print(f"Failed to process order: {e}")
    finally:
        # 2. Send the event to Analytics.do
        duration_seconds = time.time() - start_time
        metadata = {"order_id": order_data["id"]}
        if error_message:
            metadata["error_details"] = error_message
        analytics_do.events.create(
            workflow_id=workflow_id,
            execution_id=execution_id,
            status=status,
            metrics={
                "duration_seconds": round(duration_seconds, 2),
                "cost_per_execution_usd": cost,
            },
            metadata=metadata,
        )

# Example usage:
# new_order = {"id": "ORD-12345"}
# process_order(new_order)
That's it! You simply wrap your business logic and send a single event at the end. You're not sending logs; you're sending structured data about the workflow's outcome.
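Because Analytics.do is API-first, the SDK above is a convenience, not a requirement. As a rough sketch, the same event could be posted directly over HTTP from any runtime; the endpoint path and auth header below are assumptions for illustration, so check the API reference for the exact shape.

# Hypothetical raw HTTP version of the same event. The endpoint URL and
# Authorization header are assumptions, not the documented API.
import uuid

import requests

API_KEY = "YOUR_API_KEY_HERE"

event = {
    "workflow_id": "order-processing-workflow",
    "execution_id": str(uuid.uuid4()),  # unique per run
    "status": "success",
    "metrics": {
        "duration_seconds": 1.42,
        "cost_per_execution_usd": 0.03,
    },
    "metadata": {"order_id": "ORD-12345"},
}

response = requests.post(
    "https://api.analytics.do/v1/events",  # assumed endpoint
    json=event,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=5,
)
response.raise_for_status()

Any language or platform that can make an HTTPS request can report events the same way.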
Once your code is deployed, Analytics.do starts collecting these events. Our AI engine crunches the numbers to give you high-level reports and deep insights. Instead of combing through thousands of individual log lines, you can query our API (or check your dashboard) and get a clean summary like this:
{
  "workflowId": "order-processing-workflow",
  "reportId": "rep_2a7b3c9d4e1f",
  "timeframe": "2024-10-26T00:00:00Z/2024-10-27T00:00:00Z",
  "status": "completed",
  "summary": {
    "totalExecutions": 15230,
    "overallHealth": "Good",
    "roiEst": 4.25
  },
  "metrics": [
    {
      "name": "completion_rate",
      "value": "99.2%",
      "target": ">= 98%",
      "status": "on_target"
    },
    {
      "name": "average_duration_seconds",
      "value": 112,
      "target": "< 120",
      "status": "on_target"
    },
    {
      "name": "error_rate",
      "value": "0.8%",
      "target": "< 0.5%",
      "status": "needs_attention"
    },
    {
      "name": "cost_per_execution_usd",
      "value": "0.03",
      "target": "< $0.05",
      "status": "on_target"
    }
  ]
}
Instantly, you can see that while most metrics are on target, the error_rate needs attention. You've just turned a complex operational question into a single, actionable data point. The platform even provides an estimated ROI of 4.25x, giving you a powerful metric to share with business stakeholders.
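If you would rather pull that summary programmatically than open the dashboard, the same API-first approach applies. The sketch below assumes a reports endpoint and query parameters shaped like the JSON above; treat the exact route, parameters, and auth header as placeholders rather than the documented API.

# Hypothetical report query. The endpoint, parameters, and auth header are
# assumptions based on the response shape shown above.
import requests

API_KEY = "YOUR_API_KEY_HERE"

response = requests.get(
    "https://api.analytics.do/v1/reports",  # assumed endpoint
    params={
        "workflow_id": "order-processing-workflow",
        "timeframe": "2024-10-26T00:00:00Z/2024-10-27T00:00:00Z",
    },
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
report = response.json()

# Surface anything that isn't hitting its target.
for metric in report["metrics"]:
    if metric["status"] != "on_target":
        print(f"{metric['name']}: {metric['value']} (target {metric['target']})")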
This 15-minute integration is just the beginning. From here, you can apply the same pattern to your other workflows, enrich your events with more metrics and metadata, and keep reporting on the results over time.
Stop guessing about the performance of your automated processes. Start measuring what matters.
Ready to unlock your workflow insights? Sign up at Analytics.do and instrument your first workflow today.