Thresholds

What are thresholds?

Thresholds are pass/fail criteria used to specify the performance expectations of the system under test.

Example expectations (Thresholds):

  • System doesn't produce more than 1% errors.
  • Response time for 95% of requests should be below 200ms.
  • Response time for 99% of requests should be below 400ms.
  • Specific endpoint must always respond within 300ms.
  • Any conditions on any Custom metric.

Thresholds analyze the performance metrics and determine the final test result (pass/fail). Thresholds are essential for load-testing automation.

Here is a sample script that specifies two thresholds: one evaluating the rate of HTTP errors (the http_req_failed metric) and one using the 95th percentile of all response durations (the http_req_duration metric).

threshold.js
import http from 'k6/http';

export let options = {
  thresholds: {
    http_req_failed: ['rate<0.01'], // http errors should be less than 1%
    http_req_duration: ['p(95)<200'], // 95% of requests should be below 200ms
  },
};

export default function () {
  http.get('https://test-api.k6.io/public/crocodiles/1/');
}

In other words, you specify the pass criteria when defining your threshold, and if that expression evaluates to false at the end of the test, the whole test will be considered a fail.

When executing that script, k6 will output something similar to this:

threshold-output
✓ http_req_duration..............: avg=151.06ms min=151.06ms med=151.06ms max=151.06ms p(90)=151.06ms p(95)=151.06ms
    { expected_response:true }...: avg=151.06ms min=151.06ms med=151.06ms max=151.06ms p(90)=151.06ms p(95)=151.06ms
✓ http_req_failed................: 0.00% ✓ 0 ✗ 1

  • In the above case, the criteria for both thresholds were met. The whole load test is considered a pass, which means that k6 will exit with exit code zero.

  • If any of the thresholds had failed, the little green checkmark next to the threshold name (http_req_failed, http_req_duration) would have been a red cross instead, and k6 would have generated a non-zero exit code.

Copy-paste Threshold examples

The quickest way to start with thresholds is to use the standard, built-in k6 metrics.

Here are a few copy-paste examples that you can start using right away.

threshold-request-duration.js
import http from 'k6/http';
import { sleep } from 'k6';

export let options = {
  thresholds: {
    // 90% of requests must finish within 400ms.
    http_req_duration: ['p(90) < 400'],
  },
};

export default function () {
  http.get('https://test-api.k6.io/public/crocodiles/1/');
  sleep(1);
}
threshold-error-rate.js
import http from 'k6/http';
import { sleep } from 'k6';

export let options = {
  thresholds: {
    // During the whole test execution, the error rate must be lower than 1%.
    // The `http_req_failed` metric is available since v0.31.0.
    http_req_failed: ['rate<0.01'],
  },
};

export default function () {
  http.get('https://test-api.k6.io/public/crocodiles/1/');
  sleep(1);
}

Multiple thresholds on a single metric

threshold-request-duration.js
import http from 'k6/http';
import { sleep } from 'k6';

export let options = {
  thresholds: {
    // 90% of requests must finish within 400ms, 95% within 800ms, and 99.9% within 2s.
    http_req_duration: ['p(90) < 400', 'p(95) < 800', 'p(99.9) < 2000'],
  },
};

export default function () {
  http.get('https://test-api.k6.io/public/crocodiles/1/');
  sleep(1);
}

Threshold on group duration

threshold-group-duration.js
import http from 'k6/http';
import { group, sleep } from 'k6';

export let options = {
  thresholds: {
    'group_duration{group:::individualRequests}': ['avg < 200'],
    'group_duration{group:::batchRequests}': ['avg < 200'],
  },
  vus: 1,
  duration: '10s',
};

export default function () {
  group('individualRequests', function () {
    http.get('https://test-api.k6.io/public/crocodiles/1/');
    http.get('https://test-api.k6.io/public/crocodiles/2/');
    http.get('https://test-api.k6.io/public/crocodiles/3/');
  });

  group('batchRequests', function () {
    http.batch([
      ['GET', `https://test-api.k6.io/public/crocodiles/1/`],
      ['GET', `https://test-api.k6.io/public/crocodiles/2/`],
      ['GET', `https://test-api.k6.io/public/crocodiles/3/`],
    ]);
  });

  sleep(1);
}

You can find more specific threshold examples on the Counter, Gauge, Trend and Rate pages.

Threshold Syntax

Thresholds can be specified in a short or full format.

threshold-options.js
export let options = {
  thresholds: {
    metric_name1: ['threshold_expression', ...], // short format
    metric_name1: [{ threshold: 'threshold_expression', abortOnFail: boolean, delayAbortEval: string }], // full format
  },
};

The above declaration inside a k6 script means that a threshold will be configured for the metric metric_name1. To determine whether the threshold has failed or passed, the string 'threshold_expression' will be evaluated. The 'threshold_expression' must follow this format:

aggregation_method operator value

Examples:

  • avg < 200 // average duration can't be larger than 200ms
  • count >= 500 // count must be larger than or equal to 500
  • p(90) < 300 // 90% of samples must be below 300

A threshold expression evaluates to true or false.

Each of the four metric types included in k6 provides its own set of aggregation methods usable in threshold expressions.

  • Counter: count and rate
  • Gauge: value
  • Rate: rate
  • Trend: avg, min, max, med and p(N), where N is a number between 0.0 and 100.0 indicating the percentile to look at, e.g. p(99.99) means the 99.99th percentile. The unit for these values is milliseconds.

Here is a (slightly contrived) sample script that uses all different types of metrics, and sets different types of thresholds for them:

thresholds-all.js
import http from 'k6/http';
import { Trend, Rate, Counter, Gauge } from 'k6/metrics';
import { sleep } from 'k6';

export let TrendRTT = new Trend('RTT');
export let RateContentOK = new Rate('Content OK');
export let GaugeContentSize = new Gauge('ContentSize');
export let CounterErrors = new Counter('Errors');
export let options = {
  thresholds: {
    RTT: ['p(99)<300', 'p(70)<250', 'avg<200', 'med<150', 'min<100'],
    'Content OK': ['rate>0.95'],
    ContentSize: ['value<4000'],
    Errors: ['count<100'],
  },
};

export default function () {
  let res = http.get('https://test-api.k6.io/public/crocodiles/1/');
  let contentOK = res.json('name') === 'Bert';

  TrendRTT.add(res.timings.duration);
  RateContentOK.add(contentOK);
  GaugeContentSize.add(res.body.length);
  CounterErrors.add(!contentOK);

  sleep(1);
}

We have these thresholds:

  • A Trend metric that is fed with response time samples, and which has the following threshold criteria:
    • 99th percentile response time must be below 300 ms
    • 70th percentile response time must be below 250 ms
    • Average response time must be below 200 ms
    • Median response time must be below 150 ms
    • Minimum response time must be below 100 ms
  • A Rate metric that keeps track of how often the content returned was OK. This metric has one success criterion: content must have been OK more than 95% of the time.
  • A Gauge metric that contains the latest size of the returned content. The success criterion for this metric is that the returned content must be smaller than 4000 bytes.
  • A Counter metric that keeps track of the total number of times content returned was not OK. The success criterion here implies that content can't have been bad more than 99 times.
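
As a side note to the aggregation-method list above, a Counter threshold can also use the rate aggregation (how fast the counter grows per second) instead of count. Below is a minimal sketch of that, reusing the same Errors counter idea as in the script above; the limit of one error per second is an arbitrary placeholder.

import http from 'k6/http';
import { Counter } from 'k6/metrics';

export let CounterErrors = new Counter('Errors');

export let options = {
  thresholds: {
    // Sketch: on average, the Errors counter must grow by less than
    // one per second over the whole test run.
    Errors: ['rate<1'],
  },
};

export default function () {
  let res = http.get('https://test-api.k6.io/public/crocodiles/1/');
  CounterErrors.add(res.status !== 200); // count non-200 responses as errors
}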

Thresholds on tags

It's often useful to specify thresholds only on a single URL or a specific tag. In k6, tagged requests create sub-metrics that can be used in thresholds as shown below.

export let options = {
  thresholds: {
    'metric_name{tag_name:tag_value}': ['threshold_expression'],
  },
};

And here's a full example.

thresholds-on-submetrics.js
import http from 'k6/http';
import { sleep } from 'k6';
import { Rate } from 'k6/metrics';

export let options = {
  thresholds: {
    'http_req_duration{type:API}': ['p(95)<500'], // threshold on API requests only
    'http_req_duration{type:staticContent}': ['p(95)<200'], // threshold on static content only
  },
};

export default function () {
  let res1 = http.get('https://test-api.k6.io/public/crocodiles/1/', {
    tags: { type: 'API' },
  });
  let res2 = http.get('https://test-api.k6.io/public/crocodiles/2/', {
    tags: { type: 'API' },
  });

  let responses = http.batch([
    ['GET', 'https://test-api.k6.io/static/favicon.ico', null, { tags: { type: 'staticContent' } }],
    ['GET', 'https://test-api.k6.io/static/css/site.css', null, { tags: { type: 'staticContent' } }],
  ]);

  sleep(1);
}
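
It's also possible to scope a threshold to a single URL rather than to a custom tag. One way to do that, sketched below, is to key the sub-metric on the name tag that k6 attaches to every HTTP request by default (this assumes the default system tags are enabled; the 300ms limit is a placeholder):

import http from 'k6/http';
import { sleep } from 'k6';

export let options = {
  thresholds: {
    // Sketch: sub-metric keyed on the built-in `name` system tag, so only
    // this endpoint must stay below 300ms at the 95th percentile.
    'http_req_duration{name:https://test-api.k6.io/public/crocodiles/1/}': ['p(95)<300'],
  },
};

export default function () {
  http.get('https://test-api.k6.io/public/crocodiles/1/'); // covered by the threshold above
  http.get('https://test-api.k6.io/public/crocodiles/2/'); // not covered
  sleep(1);
}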

Aborting a test when a threshold is crossed

If you want to abort a test as soon as a threshold is crossed, before the test has completed, there's an extended threshold specification format that looks like this:

threshold-abort.js
export let options = {
  thresholds: {
    metric_name: [{ threshold: string, abortOnFail: boolean, delayAbortEval: string }, ...],
  },
};

As you can see in the example above, the threshold specification has been extended to alternatively accept a JS object with parameters that control the abort behavior. The fields are as follows:

  • threshold (string): the threshold expression specifying the condition to evaluate.
  • abortOnFail (boolean): whether to abort the test if the threshold is evaluated to false before the test has completed.
  • delayAbortEval (string): if you want to delay the evaluation of the threshold for some time, to allow some metric samples to be collected, you can specify the amount of time to delay using relative time strings like 10s, 1m and so on.

Here is an example:

abort-on-fail-threshold.js
import http from 'k6/http';

export let options = {
  vus: 30,
  duration: '2m',
  thresholds: {
    http_req_duration: [{ threshold: 'p(99) < 10', abortOnFail: true }],
  },
};

export default function () {
  http.get('https://test-api.k6.io/public/crocodiles/1/');
}
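
If you also want to give the test time to collect some samples before the abort logic can trigger (for example, to avoid aborting during ramp-up), you can combine abortOnFail with delayAbortEval. Here is a minimal sketch based on the script above; the 10s delay is an arbitrary placeholder value:

import http from 'k6/http';

export let options = {
  vus: 30,
  duration: '2m',
  thresholds: {
    http_req_duration: [
      // Sketch: same threshold as above, but its evaluation (and therefore
      // the abort) is delayed for the first 10 seconds of the test.
      { threshold: 'p(99) < 10', abortOnFail: true, delayAbortEval: '10s' },
    ],
  },
};

export default function () {
  http.get('https://test-api.k6.io/public/crocodiles/1/');
}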

⚠️ Evaluation delay in the cloud

When k6 runs in the cloud, thresholds are evaluated every 60 seconds, therefore the "abortOnFail" feature may be delayed by up to 60 seconds.

Failing a load test using checks

Checks are nice for codifying assertions, but unlike thresholds, checks will not affect the exit status of k6.

If you only use checks to verify that things work as expected, you will not be able to fail the whole test run based on the results of those checks.

It can often be useful to combine checks and thresholds, to get the best of both:

check_and_fail.js
import http from 'k6/http';
import { check, sleep } from 'k6';

export let options = {
  vus: 50,
  duration: '10s',
  thresholds: {
    // the rate of successful checks should be higher than 90%
    checks: ['rate>0.9'],
  },
};

export default function () {
  const res = http.get('http://httpbin.org');

  check(res, {
    'status is 500': (r) => r.status == 500,
  });

  sleep(1);
}

In this example, the threshold is configured on the checks metric, establishing that the rate of successful checks must be higher than 90%.

Additionally, you can use tags on checks if you want to define a threshold based on a particular check or group of checks. For example:

import http from 'k6/http';
import { check, sleep } from 'k6';

export let options = {
  vus: 50,
  duration: '10s',
  thresholds: {
    'checks{myTag:hola}': ['rate>0.9'],
  },
};

export default function () {
  let res;

  res = http.get('http://httpbin.org');
  check(res, {
    'status is 500': (r) => r.status == 500,
  });

  res = http.get('http://httpbin.org');
  check(
    res,
    {
      'status is 200': (r) => r.status == 200,
    },
    { myTag: 'hola' },
  );

  sleep(1);
}

Thresholds in k6 Cloud Results

In k6 Cloud Results, Thresholds are available in their own tab for analysis.

You can also see how the underlying metric compares to a specific threshold throughout the test. The threshold can be added to the analysis tab for further comparison against other metrics.

(Image: k6 Cloud Thresholds Tab)

Learn more about analyzing results in the k6 Cloud Results docs.