End-of-test summary

When a test finishes, k6 prints an end-of-test summary to stdout, providing top-level details about the test run.

If the default report doesn't suit you, you can use the handleSummary() function to completely customize output (including format).

The default summary

The end-of-test summary reports details and aggregated statistics for the primary aspects of the test:

  • Summary statistics about each built-in and custom metric (e.g., mean, median, p95).
  • A list of the test's groups and scenarios.
  • The pass/fail results of the test's thresholds and checks.

You can use options to configure or silence the report.

End-of-test example

Here's an example of a report that k6 generated after a test run.

  • It has a scenario, Ramp_Up
  • The requests are split into two groups:
    • GET home, which has a check that responses are 200 (all passed)
    • Create resource, which has a check that responses are 201 (all failed)
  • The test has one threshold, requiring that 95% of requests have a duration under 200ms (failed)
Summary with scenario, groups, checks, and thresholds

```
Ramp_Up ✓ [======================================] 00/20 VUs  30s

     █ GET home

       ✓ status equals 200

     █ Create resource

       ✗ status equals 201
        ↳  0% — ✓ 0 / ✗ 45

     checks.........................: 50.00% ✓ 45       ✗ 45
     data_received..................: 1.3 MB 31 kB/s
     data_sent......................: 81 kB  2.0 kB/s
     group_duration.................: avg=6.45s    min=4.01s    med=6.78s    max=10.15s   p(90)=9.29s    p(95)=9.32s
     http_req_blocked...............: avg=57.62ms  min=7µs      med=12.25µs  max=1.35s    p(90)=209.41ms p(95)=763.61ms
     http_req_connecting............: avg=20.51ms  min=0s       med=0s       max=1.1s     p(90)=100.76ms p(95)=173.41ms
   ✗ http_req_duration..............: avg=144.56ms min=104.11ms med=110.47ms max=1.14s    p(90)=203.54ms p(95)=215.95ms
       { expected_response:true }...: avg=144.56ms min=104.11ms med=110.47ms max=1.14s    p(90)=203.54ms p(95)=215.95ms
     http_req_failed................: 0.00%  ✓ 0        ✗ 180
     http_req_receiving.............: avg=663.96µs min=128.46µs med=759.82µs max=1.66ms   p(90)=1.3ms    p(95)=1.46ms
     http_req_sending...............: avg=88.01µs  min=43.07µs  med=78.03µs  max=318.81µs p(90)=133.15µs p(95)=158.3µs
     http_req_tls_handshaking.......: avg=29.25ms  min=0s       med=0s       max=458.71ms p(90)=108.31ms p(95)=222.46ms
     http_req_waiting...............: avg=143.8ms  min=103.5ms  med=109.5ms  max=1.14s    p(90)=203.19ms p(95)=215.56ms
     http_reqs......................: 180    4.36938/s
     iteration_duration.............: avg=12.91s   min=12.53s   med=12.77s   max=14.35s   p(90)=13.36s   p(95)=13.37s
     iterations.....................: 45     1.092345/s
     vus............................: 1      min=1      max=19
     vus_max........................: 20     min=20     max=20

ERRO[0044] some thresholds have failed
```

Customize with handleSummary()

Use the handleSummary() function to completely customize the end-of-test summary report.

k6 calls handleSummary() at the end of the test run, even after teardown(). If handleSummary() is exported, k6 does not print the default summary.

Besides customizing the CLI summary, you can also transform the summary data into machine- or human-readable formats. This lets you create JS-helper functions that generate JSON, CSV, XML (JUnit/xUnit/etc.), HTML, etc. files from the summary data.
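As a sketch of such a helper, here is a plain-JavaScript function (independent of k6; the `metricsToCsv` name and the sample data are hypothetical) that flattens the summary's `metrics` map into CSV rows:

```javascript
// Hypothetical helper: flatten the summary's "metrics" map into CSV rows.
// The shape of `data` follows the structure that handleSummary() receives.
function metricsToCsv(data) {
  const lines = ['metric,key,value'];
  for (const [name, metric] of Object.entries(data.metrics)) {
    // Each metric object carries its concrete numbers under "values".
    for (const [key, value] of Object.entries(metric.values)) {
      lines.push(`${name},${key},${value}`);
    }
  }
  return lines.join('\n');
}

// Sample data mimicking a fragment of the real summary object.
const sample = {
  metrics: {
    http_reqs: { type: 'counter', contains: 'default', values: { count: 40, rate: 19.77 } },
    vus: { type: 'gauge', contains: 'default', values: { value: 1, min: 1, max: 5 } },
  },
};

console.log(metricsToCsv(sample));
```

In a real script, you would call such a helper from handleSummary() and map its return value to a file path key, e.g. `'summary.csv': metricsToCsv(data)`.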

Note: For now, the handleSummary() feature is available only for local k6 run tests. However we plan to support the feature for k6 Cloud tests eventually. You can track progress in this issue.

Data format returned by handleSummary()

k6 expects handleSummary() to return a {key1: value1, key2: value2, ...} map that represents the summary-report content. While the values can have a type of either string or ArrayBuffer, the keys must be strings.

The keys determine where k6 displays or saves the content:

  • stdout for standard output
  • stderr for standard error
  • any relative or absolute path to a file on the system (this operation overwrites existing files)
example keys for handleSummary output

```javascript
return {
  'stdout': textSummary(data, { indent: ' ', enableColors: true }), // Show the text summary to stdout...
  'other/path/to/summary.json': JSON.stringify(data), // and a JSON with all the details...
};
```

To get an idea of how data would look in your specific test run, just add `return { 'raw-data.json': JSON.stringify(data) };` in your handleSummary() function and inspect the resulting raw-data.json file. Here's a very abridged example of how it might look:

data passed to handleSummary()

```javascript
{
  "root_group": {
    "path": "",
    "groups": [
      // Sub-groups of the root group...
    ],
    "checks": [
      {
        "passes": 10,
        "fails": 0,
        "name": "check name",
        "path": "::check name"
      },
      // More checks...
    ],
    "name": ""
  },
  "options": {
    // Some of the global options of the k6 test run,
    // currently only summaryTimeUnit and summaryTrendStats
  },
  "metrics": {
    // A map with metric and sub-metric names as the keys and objects with
    // details for the metric as the values. These objects contain the following keys:
    //  - type: describes the metric type, e.g. counter, rate, gauge, trend
    //  - contains: the type of data, e.g. time, default, data
    //  - values: the specific metric values, depending on the metric type
    //  - thresholds: any thresholds defined for the metric or sub-metric
    //
    "http_reqs": {
      "type": "counter",
      "contains": "default",
      "values": {
        "count": 40,
        "rate": 19.768856959496336
      }
    },
    "vus": {
      "type": "gauge",
      "contains": "default",
      "values": {
        "value": 1,
        "min": 1,
        "max": 5
      }
    },
    "http_req_duration": {
      "type": "trend",
      "contains": "time",
      "values": {
        // actual keys depend on summaryTrendStats
        "avg": 268.31137452500013,
        "max": 846.198634,
        "p(99.99)": 846.1969478817999,
        // ...
      },
      "thresholds": {
        "p(95)<500": {
          "ok": false
        }
      }
    },
    "http_req_duration{staticAsset:yes}": { // sub-metric from threshold
      "contains": "time",
      "values": {
        // actual keys depend on summaryTrendStats
        "min": 135.092841,
        "avg": 283.67766343333335,
        "max": 846.198634,
        "p(99.99)": 846.1973802197999,
        // ...
      },
      "thresholds": {
        "p(99)<250": {
          "ok": false
        }
      },
      "type": "trend"
    },
    // ...
  }
}
```
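Since groups can nest arbitrarily deep under root_group, aggregating their checks takes a recursive walk. A hedged sketch in plain JavaScript (the `countChecks` name and sample tree are hypothetical, but the object shape follows the structure above):

```javascript
// Hypothetical helper: recursively tally check passes and fails
// across a group tree shaped like the summary's root_group.
function countChecks(group) {
  let passes = 0, fails = 0;
  for (const check of group.checks || []) {
    passes += check.passes;
    fails += check.fails;
  }
  for (const sub of group.groups || []) {
    const res = countChecks(sub); // recurse into sub-groups
    passes += res.passes;
    fails += res.fails;
  }
  return { passes, fails };
}

// Sample tree mimicking root_group with one sub-group.
const rootGroup = {
  path: '', name: '',
  checks: [{ passes: 10, fails: 0, name: 'check name', path: '::check name' }],
  groups: [{
    path: '::sub', name: 'sub', groups: [],
    checks: [{ passes: 5, fails: 2, name: 'sub check', path: '::sub::sub check' }],
  }],
};

console.log(countChecks(rootGroup)); // → { passes: 15, fails: 2 }
```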

Send reports to a remote server

You can also send the generated reports to a remote server by making an HTTP request (or using any of the other protocols k6 already supports)! Here's a simple example:

handleSummary() demo

```javascript
import http from 'k6/http';
import k6example from '';
export default k6example; // use some predefined example to generate some data
export const options = { vus: 5, iterations: 10 };

// These are still very much WIP and untested, but you can use them as is or write your own!
import { jUnit, textSummary } from '';

export function handleSummary(data) {
  console.log('Preparing the end-of-test summary...');

  // Send the results to some remote server or trigger a hook
  const resp ='', JSON.stringify(data));
  if (resp.status != 200) {
    console.error('Could not send summary, got status ' + resp.status);
  }

  return {
    'stdout': textSummary(data, { indent: ' ', enableColors: true }), // Show the text summary to stdout...
    '../path/to/junit.xml': jUnit(data), // but also transform it and save it as a JUnit XML...
    'other/path/to/summary.json': JSON.stringify(data), // and a JSON with all the details...
    // And any other JS transformation of the data you can think of,
    // you can write your own JS helpers to transform the summary data however you like!
  };
}
```

The preceding snippet uses some JS helper functions to transform the summary into various formats. These helper functions might change, so keep an eye on the jslib for the latest versions.

Of course, we always welcome PRs to the jslib, too!

Custom output examples

These examples are community contributions. We thank everyone who has shared!

Summary options

k6 provides some options to filter or silence summary output:

  • The --summary-trend-stats option defines which Trend metric statistics to calculate and show.
  • The --summary-time-unit option forces k6 to use a fixed time unit for all time values in the summary.
  • The --no-summary option completely disables report generation, including --summary-export and handleSummary().
  • The --summary-export option exports a summary report with a predefined JSON format to a file. Now discouraged; use the handleSummary callback instead.
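The trend-stats and time-unit settings can also be set in the script itself via the options object, using the summaryTrendStats and summaryTimeUnit keys (the same keys the summary data's options object reports). A minimal sketch; in a real k6 script this object would be exported as `export const options = { ... }`:

```javascript
// Script-level equivalents of --summary-trend-stats and --summary-time-unit.
const options = {
  summaryTrendStats: ['avg', 'min', 'med', 'max', 'p(95)', 'p(99.9)'],
  summaryTimeUnit: 'ms', // report all time values in milliseconds
};

console.log(options.summaryTrendStats.join(','));
```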
Summary export to a JSON file (discouraged)

k6 also has the --summary-export=path/to/file.json option, which exports some summary report data to a JSON file.

The format of --summary-export is similar to the data parameter of the handleSummary() function. Unfortunately, the --summary-export format is limited and has a few confusing peculiarities. For example, groups and checks are unordered, and threshold values are unintuitive: true indicates the threshold failed, and false that it succeeded.
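Because of that inverted polarity, any tooling that parses --summary-export output has to flip the boolean before reporting it. A tiny illustrative helper (the `thresholdPassed` name is hypothetical):

```javascript
// In --summary-export output, a threshold value of `true` means it FAILED.
// This hypothetical helper converts it to the intuitive "did it pass?" sense.
function thresholdPassed(exportedValue) {
  return exportedValue === false;
}

console.log(thresholdPassed(true));  // threshold failed → false
console.log(thresholdPassed(false)); // threshold passed → true
```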

We couldn't change the --summary-export data format, because doing so would have broken backward compatibility in a feature that people depended on in CI pipelines. Instead, the recommended way to export the summary to a JSON file is the handleSummary() callback. The --summary-export option will likely be deprecated in the future.