
With this integration, you can export test-result metrics from the k6 Cloud to Datadog, where you can visualize and correlate k6 metrics with other monitored metrics.

⭐️  Cloud APM integrations are available on Pro and Enterprise plans, as well as the annual Team plan and Trial.

Necessary Datadog settings

To set up the integration on the k6 Cloud, you need the following Datadog settings:

  • API key
  • Application key

To get your keys, follow the Datadog documentation: "API and Application Keys".

Supported Regions

The supported regions for the Datadog integration are us/us1 (default), eu/eu1, us3, us5, us1-fed.

API and Application keys for one Datadog region won't work in a different region.

Export k6 metrics to Datadog

You must enable the Datadog integration for each test whose metrics you want to export.

After you configure the Datadog settings in a test, you can run it on the cloud as usual. As the test runs, the k6 Cloud continuously sends the test-result metrics to Datadog.

Currently, there are two options to set up the Cloud APM settings in the test:

Configure Datadog export with the test builder

First, configure the Datadog integration for an organization.

  1. From the Main navigation, go to Manage > Cloud APM and select Datadog.

    Cloud APM - Datadog Form UI

  2. In this form, set the API and application keys that you copied previously from Datadog.

    For more information on the other input fields, see configuration parameters.

  3. Save the Datadog configuration for the current organization.

Note that configuring the Datadog settings for an organization does not enable the integration. You must manually enable each test using the test builder.

  1. Create a new test with the test builder, or select an existing test previously created using the test builder.

  2. Select the Cloud APM option in the test builder sidebar to enable the integration for the test.

    Cloud APM - Datadog Test Builder UI

Configuration in the k6 script

If you script your k6 tests, you can also configure the Cloud APM settings using the apm option in the k6 script.

The parameters to export the k6 metrics to Datadog are as follows:

export const options = {
  ext: {
    loadimpact: {
      apm: [
        {
          provider: 'datadog',
          apiKey: '<Datadog Provided API key>',
          appKey: '<Datadog Provided App key>',
          // optional parameters
          region: 'us',
          metrics: [
            'vus',
            'http_req_duration',
            'my_rate_metric',
            'my_gauge_metric',
            // create a metric by counting HTTP responses with status 500
            {
              sourceMetric: 'http_reqs{status="500"}',
              targetMetric: 'k6.http_server_errors.count',
            },
          ],
          includeDefaultMetrics: true,
          includeTestRunId: true,
        },
      ],
    },
  },
};

Configuration parameters

  • provider (required): For this integration, the value must be datadog.

  • apiKey (required): Datadog API key.

  • appKey (required): Datadog application key.

  • region: One of the Datadog regions/sites. See the list of supported regions. Default is us.

  • apiURL: Alternative to region. URL of the Datadog site API, included to support new or custom Datadog regions. The default is derived from region, e.g. 'https://api.datadoghq.com' for us.

  • includeDefaultMetrics: If true, add the default APM metrics to the export: data_sent, data_received, http_req_duration, http_reqs, iterations, and vus. Default is true.

  • metrics: List of metrics to export. The "Metric configuration" section details how to specify metrics.

  • includeTestRunId: Whether all the exported metrics include a test_run_id tag whose value is the k6 Cloud test run ID. Default is false. Be aware that enabling this setting might increase the cost of your APM provider.

  • resampleRate: Sampling period for metrics, in seconds. Default is 3; supported values are integers between 1 and 60.
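As a sketch of the optional parameters above, the fragment below swaps region for an explicit apiURL and lowers the export frequency with resampleRate. The keys are placeholders, and the URL shown assumes the standard Datadog site URL pattern for the us5 region:

```javascript
export const options = {
  ext: {
    loadimpact: {
      apm: [
        {
          provider: 'datadog',
          apiKey: '<Datadog Provided API key>',
          appKey: '<Datadog Provided App key>',
          // apiURL can be used instead of region, e.g. for a custom
          // or newly added Datadog site (assumed URL pattern)
          apiURL: 'https://api.us5.datadoghq.com',
          // send metric samples every 10 seconds instead of the default 3
          resampleRate: 10,
        },
      ],
    },
  },
};
```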

Metric configuration

Each entry in the metrics parameter can be either a metric name (a string) or an object with the following keys:

  • sourceMetric (required): Name of the built-in or custom k6 metric to export, optionally with tag filters. Tag filtering follows the Prometheus selector syntax, e.g. http_reqs{name="http://example.com",status!="500"}.

  • targetMetric: Name of the resulting metric in Datadog. Default is the name of the source metric with a k6. prefix, e.g. k6.http_reqs.

  • keepTags: List of tags to preserve when exporting time series.

keepTags can have a high cost

Most cloud platforms charge clients based on the number of time series stored.

When exporting a metric, every combination of kept-tag values becomes a distinct time series in Datadog. While this granularity can help test analysis, it can incur high costs when thousands of time series are produced.

For example, if you add keepTags: ["name"] on http_* metrics and your load test calls many dynamic URLs, the number of produced time series can grow very quickly. Refer to URL Grouping for how to reduce the number of distinct values of the name tag.

k6 recommends exporting only tags that are necessary and don't have many distinct values.

Read more: "Counting Custom Metrics" in the Datadog documentation.

Metric configuration detailed example

export const options = {
  ext: {
    loadimpact: {
      apm: [
        {
          // ...
          includeDefaultMetrics: false,
          includeTestRunId: true,
          metrics: [
            // keep vus metrics for the whole test run
            'vus',
            // total byte count for data sent/received by k6
            'data_sent',
            'data_received',
            // export the checks metric, keeping the 'check' (name of the check) tag
            {
              sourceMetric: 'checks',
              keepTags: ['check'],
            },
            // export HTTP durations from the 'default' scenario,
            // keeping only successful response codes (2xx, 3xx), using the regex selector syntax
            {
              sourceMetric: 'http_req_duration{scenario="default",status=~"[23][0-9]{2}"}',
              targetMetric: 'k6_http_request_duration', // name of the metric as it appears in Datadog
              keepTags: ['name', 'method', 'status'],
            },
            // count HTTP responses with status 500
            {
              sourceMetric: 'http_reqs{status="500"}',
              targetMetric: 'k6_http_server_errors_count',
              keepTags: ['scenario', 'group', 'name', 'method'],
            },
          ],
        },
      ],
    },
  },
};
