In this tutorial, we will look into how to integrate performance testing with k6 into your GitLab setup. By integrating performance tests into your CI pipelines, you can catch performance issues earlier and ship more stable and performant applications to production.
- k6 is an open-source load testing tool for testing the performance of APIs, microservices, and websites.
- GitLab is a complete DevOps platform that includes a CI/CD toolchain, among many other features.
The examples in this tutorial can be found here.
Write Your Performance Test Script
For this tutorial, we will create a simple k6 test for our demo API. Feel free to change the endpoint to any API you want to test.
The following test will run 50 VUs (virtual users) continuously for one minute. Throughout this duration, each VU will generate one request, sleep for 3 seconds, and then start over.
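A minimal script matching this scenario might look like the following sketch. The target URL `https://test.k6.io/` is a placeholder demo endpoint, and the filename `loadtest.js` is an assumption used throughout this tutorial; replace both with your own values.

```javascript
// loadtest.js
import http from 'k6/http';
import { sleep } from 'k6';

export const options = {
  vus: 50,        // 50 virtual users
  duration: '1m', // run continuously for one minute
};

export default function () {
  http.get('https://test.k6.io/'); // each VU makes one request per iteration
  sleep(3);                        // then sleeps for 3 seconds before starting over
}
```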
You can run the test locally using the following command. Just make sure to install k6 first.
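Assuming the script is saved as `loadtest.js`, the command is:

```shell
k6 run loadtest.js
```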
This produces the following output:
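The exact summary varies by k6 version; an abbreviated, illustrative end-of-test summary (all numbers here are made up) looks roughly like this:

```
  execution: local
     script: loadtest.js
  scenarios: (100.00%) 1 scenario, 50 max VUs, 1m30s max duration

  http_req_duration..........: avg=110ms min=80ms med=100ms max=350ms p(90)=180ms p(95)=210ms
  http_reqs..................: 950    15.8/s
  vus........................: 50     min=50 max=50
```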
The next step is to add your service-level objectives (SLOs) for the performance of your application. SLOs are a vital aspect of ensuring the reliability of your systems and applications. If you do not currently have any defined SLAs or SLOs, now is a good time to start considering your requirements.
You can then configure your SLOs as pass/fail criteria in your test script using thresholds. k6 evaluates these thresholds during the test execution and informs you about its results.
If a threshold in your test fails, k6 will finish with a non-zero exit code, which communicates to the CI tool that the step failed.
Now, we add one threshold to our previous script to validate that the 95th percentile response time is below 500ms. After this change, the script will look like this:
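A sketch of the updated script, assuming the same placeholder endpoint and filename as before:

```javascript
// loadtest.js
import http from 'k6/http';
import { sleep } from 'k6';

export const options = {
  vus: 50,
  duration: '1m',
  thresholds: {
    // Fail the test if the 95th percentile response time is 500ms or more.
    http_req_duration: ['p(95)<500'],
  },
};

export default function () {
  http.get('https://test.k6.io/');
  sleep(3);
}
```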
Thresholds are a powerful feature providing a flexible API to define various types of pass/fail criteria in the same test run. For example:
- The 99th percentile response time must be below 700 ms.
- The 95th percentile response time must be below 400 ms.
- No more than 1% failed requests.
- The content of a response must be correct more than 95% of the time.
- Your condition for pass/fail criteria (SLOs)
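As a sketch, the example criteria above could be expressed in a single `options` block like this (metric names follow current k6 conventions):

```javascript
export const options = {
  thresholds: {
    // 99th percentile below 700ms and 95th percentile below 400ms
    http_req_duration: ['p(99)<700', 'p(95)<400'],
    // No more than 1% failed requests
    http_req_failed: ['rate<0.01'],
    // Response-content checks must pass more than 95% of the time
    checks: ['rate>0.95'],
  },
};
```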
Configure GitLab CI
In the root of your project folder, create a file named .gitlab-ci.yml. This configuration file tells GitLab CI to run a pipeline whenever a push to the remote repository is detected. If you want to know more about it, check out the tutorial: Getting started with GitLab CI/CD.
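As a sketch, a minimal .gitlab-ci.yml that runs the test with the official k6 Docker image could look like this (the image name and script filename are assumptions):

```yaml
stages:
  - loadtest

loadtest-local:
  stage: loadtest
  image:
    name: grafana/k6:latest
    entrypoint: ['']
  script:
    - k6 run loadtest.js
```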
Eventually, your repository should look like the following screenshot. You can also fork the k6-gitlab-example repository.
Click on the CI / CD -> Jobs section of the sidebar menu.
On this page, you will see the jobs that have been triggered. Click on a specific job to watch the test run and see the results when it finishes. Below is a succeeded job:
Running k6 Cloud Tests
There are two common ways to run k6 tests as part of a CI process:
- k6 run to run a test locally on the CI server.
- k6 cloud to run a test on the k6 Cloud from one or multiple geographic locations.
You might want to trigger cloud tests in these common cases:
- If you want to run a test from one or multiple geographic locations (load zones).
- If you want to run a high-load test that needs more compute resources than the CI server provides.
If either of these fits your needs, then running k6 cloud tests is the way to go.
Before we start with the GitLab configuration, it is good to familiarize yourself with how cloud execution works, and we recommend triggering a cloud test from your machine first.
Check out the guide to running cloud tests from the CLI to learn how to distribute the test load across multiple geographic locations and more information about the cloud execution.
Now, we will show how to trigger cloud tests from GitLab CI. If you do not have a k6 Cloud account already, register for a trial account here. Then go to the API token page under Account Settings in k6 Cloud and copy your API token. You also need your project ID, which will be set as an environment variable; the project ID is visible under the project name on the k6 Cloud project page.
Navigate to Settings -> CI / CD -> Variables in your GitLab repository and expand the Variables section. When GitLab runs a job, the K6_CLOUD_TOKEN environment variable will automatically authenticate you to the k6 Cloud service. You also need to set K6_CLOUD_PROJECT_ID to your k6 Cloud project ID. Refer to the create a custom variable in the UI guide for more information.
Now, you have to update the previous .gitlab-ci.yml file. It will look something like:
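A sketch of the updated configuration, assuming the same image and filename as before; the k6 cloud command picks up the K6_CLOUD_TOKEN variable configured earlier to authenticate:

```yaml
stages:
  - loadtest

loadtest-cloud:
  stage: loadtest
  image:
    name: grafana/k6:latest
    entrypoint: ['']
  script:
    - k6 cloud loadtest.js
```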
The change, compared to the previous configuration, is that we added a loadtest-cloud stage that uses the k6 cloud command to trigger cloud tests.
With that done, we can push the changes we've made to .gitlab-ci.yml to our GitLab repository, which triggers GitLab CI to run our new jobs. When everything goes well, we should see a screen like this on the GitLab jobs page:
Note that GitLab CI prints the output of the k6 command, and when running cloud tests, k6 prints the URL of the test result in the k6 Cloud. You can navigate to this URL to see the result of your cloud test.
It's common to run some performance tests during the night, when users are not accessing the system under test; for example, to isolate larger tests from other types of tests or to generate a performance report periodically.
To configure a scheduled nightly build that runs at a given time, follow these steps:
- Head over to CI / CD -> Schedules section of your repository.
- Click New schedule and configure it.
- Fill in a cron-like value for the time at which you wish to trigger the job execution.
- Click on Save pipeline schedule.
For example, to trigger a build at midnight every day, the cron value 0 0 * * * will do the job. The screenshot below shows this configuration; the time zone for this example is UTC.
This is your saved job schedule. You can test it by clicking on the ▶️ button.
To learn more about GitLab pipeline schedules, we recommend reading the pipeline schedules documentation.
We hope you enjoyed reading this article. We'd be happy to hear your feedback.