Thresholds

Thresholds are pass/fail rules that Tigrister evaluates against a test run's aggregate metrics. They are how you turn a load test into an automated gate: if every threshold passes, the run is marked Passed; if any enabled threshold fails, the run is marked Failed.

Where They Live

Thresholds are stored on the spec, not on individual test types. Every run — Load, Stress, Spike, Soak, or Custom — evaluates the same set of thresholds. Add or edit them in the Thresholds card of the Dashboard panel, which sits just below the configuration form together with the Fail on assertion errors toggle.

Shared across test types: this is a deliberate choice. A single spec with one threshold list makes it easy to compare runs — if the Load run passes the P95 gate but the Stress run fails it, you know where the system broke.

Available Metrics

A threshold is built from three pieces: metric, operator, and value. These are the seven metrics you can gate on:

Metric              Unit   What it measures
Avg Response Time   ms     Arithmetic mean of every request's response time
P50 Response Time   ms     Median — 50% of requests completed at or below this time
P90 Response Time   ms     90th percentile — 90% of requests at or below this time
P95 Response Time   ms     95th percentile — the value most SLOs are defined against
P99 Response Time   ms     99th percentile — captures the long tail that hurts real users
Error Rate          %      Percentage of failed requests out of the total
Throughput          req/s  Successful requests per second across the whole run
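To make the seven metrics concrete, here is a minimal sketch of how they could be computed from raw per-request samples. The function names, the nearest-rank percentile method, and the input shape are assumptions for illustration, not Tigrister's actual implementation:

```python
# Illustrative sketch only: names and percentile method are assumptions,
# not Tigrister's real code.
from statistics import mean

def percentile(sorted_ms, p):
    """Nearest-rank percentile over a pre-sorted list of times (ms)."""
    idx = max(0, round(p / 100 * len(sorted_ms)) - 1)
    return sorted_ms[idx]

def aggregate(response_times_ms, failed, duration_s):
    """response_times_ms covers all requests; failed is the failure count."""
    s = sorted(response_times_ms)
    total = len(s)
    return {
        "avg_ms": mean(s),
        "p50_ms": percentile(s, 50),
        "p90_ms": percentile(s, 90),
        "p95_ms": percentile(s, 95),
        "p99_ms": percentile(s, 99),
        "error_rate_pct": 100.0 * failed / total,   # a percentage, not a ratio
        "throughput_rps": (total - failed) / duration_s,  # successes per second
    }
```

Note how a single slow outlier barely moves the average but dominates P99, which is why the long-tail percentiles exist as separate gates.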

Operators

Four comparison operators are available. Pick the one that expresses "this is the limit I do not want to cross":

  • <  — metric must be strictly less than the value (e.g. P95 < 2000 ms)
  • <= — metric must be less than or equal to the value
  • >  — metric must be strictly greater than the value (useful for throughput floors)
  • >= — metric must be greater than or equal to the value
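A threshold check is just a comparison between a metric value and a limit. As a rough sketch (the mapping and function name are illustrative, not from Tigrister's source):

```python
# Illustrative sketch: maps the four operator symbols to comparisons.
import operator

OPS = {"<": operator.lt, "<=": operator.le, ">": operator.gt, ">=": operator.ge}

def check(metric_value, op, limit):
    """Return True when the threshold passes, e.g. check(1800, "<", 2000)."""
    return OPS[op](metric_value, limit)
```

Note that the strict operators exclude the boundary: a P95 of exactly 2000 ms fails a `P95 < 2000` gate.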

Editing Thresholds

Press Add Threshold to append a new row. New thresholds default to P95 Response Time < 2000 ms — a reasonable SLO starting point that you can change with the dropdowns. Each row has:

  • Enable/disable toggle (the checkmark icon) — disabled thresholds are dimmed and excluded from pass/fail evaluation, but stay in the list
  • Metric dropdown — pick one of the seven metrics above
  • Operator dropdown — one of the four operators
  • Value input — the numeric threshold; the unit label on the right updates automatically based on the metric
  • Trash icon — removes the row

Error rate is a percentage, not a ratio: type 5 in the value field for a 5% threshold, not 0.05. Tigrister converts between the two internally so you never have to.
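The percent-vs-ratio point can be shown in two lines. This is a hypothetical sketch of the comparison, with made-up function names:

```python
# Hypothetical sketch: error rate is expressed in percent on both sides
# of the comparison, so the user's "5" means 5%, never 0.05.
def error_rate_pct(failed, total):
    return 100.0 * failed / total

def error_threshold_passes(failed, total, limit_pct):
    return error_rate_pct(failed, total) < limit_pct
```

With 5 failures out of 100 requests, a limit of 5 fails (5 is not strictly less than 5) while a limit of 6 passes.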

Fail on Assertion Errors

Above the Thresholds list there's a separate toggle called Fail on assertion errors. Each step in a spec can have per-request assertions (status code equals, JSON path equals, body contains, etc.) that run on every VU iteration. When this toggle is on and any assertion fails during the run, the whole test is marked Failed — regardless of the threshold results.

This is stored per test type along with the rest of the LoadTestConfig, so you can enforce it strictly on a Load run and relax it on a Stress run where error rate is expected to spike.

How a Run's Status Is Decided

At the end of a run, Tigrister computes the aggregate metrics, evaluates every enabled threshold, and derives the run's TestStatus:

  • Passed — every enabled threshold held, and (if the toggle is on) no assertions failed
  • Failed — at least one enabled threshold was violated, or an assertion failed while the toggle was on
  • Stopped — the user pressed Stop before the run finished; thresholds are not evaluated
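The three rules above can be sketched as a small decision function. The enum values mirror the statuses named in this section; the function name and parameter shapes are assumptions for illustration:

```python
# Illustrative sketch of the status decision; names are assumptions.
from enum import Enum

class TestStatus(Enum):
    PASSED = "Passed"
    FAILED = "Failed"
    STOPPED = "Stopped"

def derive_status(stopped, enabled_results, assertion_failed, fail_on_assertions):
    """enabled_results: one pass/fail boolean per ENABLED threshold."""
    if stopped:
        # User pressed Stop: thresholds are not evaluated at all.
        return TestStatus.STOPPED
    if fail_on_assertions and assertion_failed:
        # Assertion failures override threshold results when the toggle is on.
        return TestStatus.FAILED
    if not all(enabled_results):
        return TestStatus.FAILED
    return TestStatus.PASSED
```

Disabled thresholds simply never appear in `enabled_results`, which matches their exclusion from evaluation described earlier.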

The Results panel shows each threshold's individual pass/fail state next to the aggregate status — see the Results section for the detailed view.