Results output

k6 run has two different ways of showing the results of a load test. By default, we show an aggregated summary report at the end of the test. This report is customizable, but by default features a general overview of all groups, checks and thresholds in the load test, as well as aggregated values for all built-in and custom metrics used in the test run.

If the aggregated metric measurements are not enough and something more fine-grained is needed, k6 also supports streaming the raw metric values to one or more external outputs (e.g. InfluxDB, Kafka, StatsD, etc.) while the test is running. The raw results can also be sent to our managed k6 cloud service (e.g. when you want to test an environment behind a firewall) and they can be exported as a CSV or JSON file for later processing. All supported built-in outputs are listed below.

Standard output

k6 results - console/stdout output

When k6 displays the results to stdout, it will show the k6 logo and the following test information:

  • Test details: general test information and load options.
  • Progress bar: test status and how much time has passed.
  • Test summary: the test results (after test completion). Since k6 v0.30.0 the summary can be fully customized and redirected to a file, and machine-readable versions of it can be saved as JSON, XML (e.g. JUnit, XUnit, etc.), or even as nicely-formatted HTML reports meant for humans. For more details, see the handleSummary() docs.

Test details

execution: local
script: path/to/script.js
output: -
scenarios: (100.00%) 1 scenario, 50 max VUs, 5m30s max duration (incl. graceful stop):
* default: Up to 50 looping VUs for 5m0s over 3 stages (gracefulRampDown: 30s, gracefulStop: 30s)
  • execution: local shows the k6 execution mode (local or cloud).
  • script: path/to/script.js shows the name of the script file that is being executed.
  • output: - is the output used for the granular test results. By default no output is used, and only the aggregated end-of-test summary is shown.
  • scenarios: ... is a summary of the scenarios that will be executed this test run and some overview information:
  • * default: ... describes the only scenario for this test run. In this case it's a scenario with a ramping VUs executor, specified via the stages shortcut option instead of the long-form scenarios option.
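The scenario line above could come from a stages shortcut like the following sketch. The exact durations and targets are assumptions, chosen only so that they add up to the 5m0s over 3 stages shown; in a real k6 script the object would be declared as `export const options = {...}`:

```javascript
// Hypothetical `stages` shortcut configuration, shown as a plain object so it
// can be inspected outside of k6.
const options = {
  stages: [
    { duration: '1m', target: 50 }, // ramp up to 50 VUs
    { duration: '3m', target: 50 }, // stay at 50 VUs
    { duration: '1m', target: 0 },  // ramp down to 0 VUs
  ],
};

// Sanity check: three stages totalling 5 minutes, peaking at 50 VUs.
const totalMinutes = options.stages.reduce((sum, s) => sum + parseInt(s.duration, 10), 0);
const maxVUs = Math.max(...options.stages.map((s) => s.target));
console.log(`Up to ${maxVUs} looping VUs for ${totalMinutes}m0s over ${options.stages.length} stages`);
// prints "Up to 50 looping VUs for 5m0s over 3 stages"
```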

End-of-test summary report

The test summary provides a general overview of your test results. By default, the summary prints to stdout the status of all groups, checks, and thresholds, as well as aggregated values for all built-in and custom metrics.

As of k6 v0.30.0, it's possible to completely customize the summary shown to stdout, redirect it to a file or stderr, or build and export your own completely custom report (e.g. HTML, JSON, JUnit/XUnit XML, etc.) via the new handleSummary() callback.
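As a sketch of how that callback works, shown here as a plain function invoked with mock summary data (a real k6 script would export it and k6 would supply the aggregated data; the mock field names below are assumptions modelled on the JSON summary structure):

```javascript
// Sketch of a handleSummary() callback (k6 >= v0.30.0). In a real k6 script
// this would be `export function handleSummary(data)`. The keys of the
// returned object are destinations: a file path, 'stdout', or 'stderr'.
function handleSummary(data) {
  return {
    'summary.json': JSON.stringify(data, null, 2), // machine-readable export
    stdout: `checks passed: ${data.metrics.checks.values.passes}\n`, // console note
  };
}

// Mock of the aggregated data k6 would pass in (assumed structure).
const mockData = { metrics: { checks: { values: { passes: 42, fails: 0 } } } };
const result = handleSummary(mockData);
console.log(result.stdout); // prints "checks passed: 42"
```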

data_received..............: 148 MB 2.5 MB/s
data_sent..................: 1.0 MB 17 kB/s
http_req_blocked...........: avg=1.92ms min=1µs med=5µs max=288.73ms p(90)=11µs p(95)=17µs
http_req_connecting........: avg=1.01ms min=0s med=0s max=166.44ms p(90)=0s p(95)=0s
http_req_duration..........: avg=143.14ms min=112.87ms med=136.03ms max=1.18s p(90)=164.2ms p(95)=177.75ms
http_req_receiving.........: avg=5.53ms min=49µs med=2.11ms max=1.01s p(90)=9.25ms p(95)=11.8ms
http_req_sending...........: avg=30.01µs min=7µs med=24µs max=1.89ms p(90)=48µs p(95)=63µs
http_req_tls_handshaking...: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
http_req_waiting...........: avg=137.57ms min=111.44ms med=132.59ms max=589.4ms p(90)=159.95ms p(95)=169.41ms
http_reqs..................: 13491 224.848869/s
iteration_duration.........: avg=445.48ms min=413.05ms med=436.36ms max=1.48s p(90)=464.94ms p(95)=479.66ms
iterations.................: 13410 223.498876/s
vus........................: 100 min=100 max=100
vus_max....................: 100 min=100 max=100

To learn more about the metrics k6 collects and reports, read the Metrics guide.

Output of trend metrics

Trend metrics collect trend statistics (min/max/avg/percentiles) for a series of values. On stdout they are printed like this:

http_req_duration..........: avg=143.14ms min=112.87ms med=136.03ms max=1.18s p(90)=164.2ms p(95)=177.75ms

You could use the summaryTrendStats option to change the stats reported for Trend metrics. You can also make k6 display time values with a fixed time unit (seconds, milliseconds or microseconds) via the summaryTimeUnit option. And, as mentioned above, you can completely customize the whole report via the handleSummary() callback.

$ k6 run --summary-trend-stats="min,avg,med,p(99),p(99.9),max,count" --summary-time-unit=ms script.js
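The same settings can also be declared in the script's options object instead of on the command line. A sketch, using the option names described above (in a real k6 script this object would be `export const options = {...}`):

```javascript
// Script-options equivalent of the CLI flags above.
const options = {
  // Stats to report for every Trend metric.
  summaryTrendStats: ['min', 'avg', 'med', 'p(99)', 'p(99.9)', 'max', 'count'],
  // Display all time values in milliseconds.
  summaryTimeUnit: 'ms',
};
```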

Summary export

Additionally, the k6 run command can export the end-of-test summary report to a JSON file that includes all of the data above. This is useful to get the aggregated test results in a machine-readable format, for integration with dashboards, external alerts, etc.

This was first possible in k6 v0.26.0 with the --summary-export flag, though its use is now discouraged (see why here).

Instead, starting with k6 v0.30.0, the use of the handleSummary() callback is recommended, since it allows completely customizing the end-of-test summary and exporting the summary report data in any desired format (e.g. JSON, CSV, XML (JUnit/xUnit/etc.), HTML, TXT, etc.).
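For instance, a small post-processing script could read aggregated values out of a JSON summary produced either way. The field names below are assumptions modelled on the metrics listed above, and the summary object is inlined rather than read from disk to keep the sketch self-contained:

```javascript
// Hypothetical post-processing of an exported summary. The structure mirrors
// the end-of-test metrics shown earlier in this page.
const summary = {
  metrics: {
    http_req_duration: { values: { avg: 143.14, 'p(95)': 177.75 } },
    http_reqs: { values: { count: 13491, rate: 224.848869 } },
  },
};

// Flag the run if the 95th-percentile request duration exceeds a budget.
const p95 = summary.metrics.http_req_duration.values['p(95)'];
const verdict = p95 < 200 ? 'OK' : 'TOO SLOW';
console.log(`p(95)=${p95}ms -> ${verdict}`); // prints "p(95)=177.75ms -> OK"
```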

External outputs

If the aggregated end-of-test summary is insufficient, k6 can send more granular result data to different external outputs, to integrate and visualize k6 metrics on other platforms.

The available built-in outputs currently are:

Amazon CloudWatch    k6 run --out statsd
Apache Kafka         k6 run --out kafka
Cloud                k6 run --out cloud
CSV                  k6 run --out csv
Datadog              k6 run --out datadog
InfluxDB             k6 run --out influxdb
JSON                 k6 run --out json
New Relic            k6 run --out statsd
StatsD               k6 run --out statsd

Multiple outputs

You can simultaneously send metrics to several outputs by using the CLI --out flag multiple times, for example:

$ k6 run \
    --out json=test.json \
    --out influxdb=http://localhost:8086/k6 \
    script.js