Results Panel

Understanding execution results, response time charts, step details, and anomaly detection.

Overview

After a flow runs, the detail area switches to the Results Panel. This panel shows a comprehensive breakdown of the execution: status header, summary metrics, a response time chart, the step results list, and anomaly/error details if applicable.

If you navigate to a step detail view, click the Results button in the Action Bar to return to the results panel. Use the Clear Results button to dismiss all run data and return to the design view.

Execution Header

The header bar at the top of the results panel shows the overall run status:

  • Running (pulsing blue dot): Shows progress (e.g., "3/5 steps") and a progress bar. If running with iterations, shows "iteration 2/4".
  • Passed (green dot): All steps completed successfully.
  • Failed (red dot): One or more steps failed. Shows "X of Y failed".
  • Anomaly Detected (yellow warning triangle): Appears alongside Passed/Failed when response time anomalies are detected (multiple iterations only).

Summary Cards

Four metric cards provide an at-a-glance overview of the run:

Example: Duration 101ms (4 steps total), Success Rate 100% (4/4), Avg Response 21ms, Data Transfer 2.3 KB (804 B sent).

  • Duration: Total wall-clock time for the run. Shows step count below. Updates live during execution.
  • Success Rate: Percentage of steps that passed, shown as a ring chart. Displays the passed/total count below (e.g., "4/4"). With iterations, counts all steps across all iterations (e.g., "16/16").
  • Avg Response: Average response time across all executed steps.
  • Data Transfer: Total bytes received. Shows bytes sent below.
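The Data Transfer card renders byte counts in conventional units (e.g., 804 B, 2.3 KB). A hypothetical helper sketching that formatting; the exact rounding rules Tigrister uses are not documented here:

```python
def format_bytes(n: float) -> str:
    # Hypothetical helper: render a byte count the way the summary cards do.
    # Exact rounding in Tigrister is undocumented; this is an assumption.
    units = ["B", "KB", "MB", "GB"]
    i = 0
    while n >= 1024 and i < len(units) - 1:
        n /= 1024
        i += 1
    # Whole bytes stay integral; larger units get one decimal place.
    return f"{n:.0f} {units[i]}" if i == 0 else f"{n:.1f} {units[i]}"
```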

Note: The metrics also automatically identify the bottleneck — the slowest step in your flow. If a step's response time is significantly higher than the average, it's flagged with the excess time shown.
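A minimal sketch of how such a bottleneck flag can be computed. The step data and the 1.5x margin are illustrative assumptions, not Tigrister's actual schema or threshold:

```python
# Hypothetical step results; field names are illustrative.
steps = [
    {"name": "Register", "time_ms": 38},
    {"name": "Login",    "time_ms": 18},
    {"name": "Validate", "time_ms": 13},
    {"name": "Get User", "time_ms": 22},
]

avg = sum(s["time_ms"] for s in steps) / len(steps)   # 22.75ms
slowest = max(steps, key=lambda s: s["time_ms"])      # the Register step
excess = slowest["time_ms"] - avg                     # time above the average

# Flag only when the slowest step clearly exceeds the average;
# the 1.5x margin is an assumed cutoff.
is_bottleneck = slowest["time_ms"] > avg * 1.5
```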

Response Time Chart

Below the summary cards, a line chart visualizes the response time for each step. The X-axis shows step names, the Y-axis shows time in milliseconds.

Single Iteration

(Example chart: Register, Login, Validate, and Get User on the X-axis, plotted against a 10–40ms Y-axis.)
  • A single blue line connects the response time of each step, with a subtle area fill below
  • Dots mark each step's data point on the line
  • Hover over a step to see a tooltip with the step name and exact response time. A vertical guide line appears at the hovered position.
  • If a step failed, its dot shows a red ring around it

Response Time Chart with Iterations

When running with multiple iterations, the chart changes to show one line per iteration, each in a different color. This makes it easy to compare response times across runs.

Multiple Iterations (4x)

(Example chart: four iteration lines across Register, Login, Validate, and Get User, plotted against a 0–15ms Y-axis.)

Iteration Colors:

#1 Blue
#2 Emerald
#3 Amber
#4 Violet
#5 Rose
#6 Cyan
#7 Orange
#8 Lime

Chart Modes:

  • All (aggregate): All iteration lines are shown with equal weight. No specific iteration is highlighted. Great for spotting overall patterns and outlier iterations.
  • Selected iteration: When you click an iteration, its line becomes bold with area fill and dots. Other iterations' lines become thin and faded, providing context without distraction.

Iteration Selector

When running with multiple iterations, an iteration selector bar appears below the chart. It lets you switch between aggregate and individual iteration views.

Iterations:
  • All: Shows aggregate view — all iteration lines with equal weight on the chart, summary cards show totals across all iterations
  • Individual iteration buttons: Each shows its color indicator, number, and total duration. Click to select that iteration — the chart highlights its line, and the step results list updates to show that iteration's data.
  • Failed iterations: If an iteration has any failed steps, a small red dot appears on its button
  • Click a selected iteration again to deselect it and return to the aggregate view

Step Results List

Below the chart, every executed step is listed with detailed result information. Click any step to open its full response in the step detail panel.

Example:

  • POST Register (http://localhost:8080/register): 201, 38ms, 374 B, 1/1 assertions
  • POST Login (http://localhost:8080/login): 200, 18ms, 482 B
  • GET Validate (http://localhost:8080/validate): 200, 13ms, 374 B, 1/1 assertions
  • GET Get User (http://localhost:8080/user/1): 200, 22ms, 407 B
  • Status Icon: Indicates success, failure, running (animated), or pending/skipped.
  • Method: HTTP method badge (GET, POST, PUT, PATCH, DELETE) with color coding.
  • Name & URL: Step name (bold) with the full request URL below in smaller text.
  • Status Code: HTTP response status code (e.g., 200, 201, 404). Color-coded: green for 2xx, blue for 3xx, amber for 4xx, red for 5xx.
  • Response Time: Time in milliseconds for the request round-trip.
  • Response Size: Total response body size in bytes (e.g., 374 B, 1.2 KB).
  • Assertions: If the step has assertions, shows the passed/total count (e.g., 1/1). Green when all pass, red when any fail. Empty if no assertions are configured.
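The status-code color buckets can be expressed as a simple mapping. A sketch following the documented scheme; the function name and the fallback for non-HTTP failures are assumptions:

```python
def status_color(code: int) -> str:
    # Map an HTTP status code to the color bucket used in the results list:
    # green for 2xx, blue for 3xx, amber for 4xx, red for 5xx.
    if 200 <= code < 300:
        return "green"
    if 300 <= code < 400:
        return "blue"
    if 400 <= code < 500:
        return "amber"
    return "red"  # 5xx; assumed fallback for transport-level failures
```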

Tip: Click any step in the results list to open its full response details — body, headers, cookies, timing breakdown, and assertion results. Use the Results button in the Action Bar to navigate back.

Anomaly Detection

When running with multiple iterations, Tigrister automatically analyzes response times across iterations to detect anomalies. If detected, an "Anomaly Detected" badge appears in the execution header, and detail cards appear at the bottom of the results panel.

Anomaly Types:

  • Outlier: A specific iteration's response time for a step deviates significantly from the mean. Shown as: "Iteration 1: 38ms — 250% above average"
  • Inconsistent: A step's response times vary too much across iterations (high coefficient of variation). Shown as: "Response times are unstable (CV: 85%) — 5ms, 38ms, 42ms, 31ms"
  • Both: A step can have both outlier and inconsistency anomalies simultaneously
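A minimal sketch of these two statistics. The 150% outlier margin and the 50% CV cutoff are assumed thresholds; Tigrister's actual values are not documented here:

```python
from statistics import mean, pstdev

# Assumed thresholds -- Tigrister's actual cutoffs are not documented.
OUTLIER_PCT = 150     # flag iterations more than 150% above the step's mean
CV_CUTOFF = 0.5       # flag steps whose coefficient of variation exceeds 50%

def detect_anomalies(times_ms):
    """Analyze one step's response times across iterations.

    Returns (outliers, cv): outliers is a list of
    (iteration_number, time_ms, pct_above_mean) tuples, and cv is the
    coefficient of variation when the step is inconsistent, else None.
    """
    if len(times_ms) < 3:          # too few iterations for meaningful stats
        return [], None
    avg = mean(times_ms)
    outliers = [
        (i + 1, t, round((t - avg) / avg * 100))
        for i, t in enumerate(times_ms)
        if (t - avg) / avg * 100 > OUTLIER_PCT
    ]
    cv = pstdev(times_ms) / avg    # relative spread across iterations
    return outliers, (cv if cv > CV_CUTOFF else None)
```

For response times like 38, 5, 7, 5, iteration 1 would be flagged as an outlier and the step as inconsistent; under three iterations, the function reports nothing, matching the minimum noted below.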

Example Anomaly Card

Register (avg 7ms)

Iteration 1: 38ms — 443% above average

Response times are unstable (CV: 85%) — 38ms, 5ms, 7ms, 5ms

Note: Anomaly detection requires at least 3 iterations to produce meaningful statistical comparisons. Runs with fewer iterations do not produce anomaly data. This is particularly useful for identifying cold-start effects (first iteration is slower) or intermittent server issues.

Error Details

When any step fails during execution, an Error Details section appears below the step results list. It shows each failed step with its error message and any failed assertion messages, making it easy to identify and debug failures without clicking into individual steps.

Managing Results

  • Clear Results: Click the "Clear Results" button in the Action Bar to dismiss all run data and return to the design view
  • Results button: After clicking a step to view its response, click "Results" in the Action Bar to return to the results panel
  • Re-run: Click Run again to execute the flow with the same settings. Previous results are replaced by the new run.