📖 What you will learn
- How to run a k6 test in the cloud
- The benefits of performance testing in the cloud
As discussed in the earlier blog on Performance Testing in the Cloud, performance testing tools must fit into today's DevOps workflows. This means they should support both local and cloud-based testing: developers run local tests early in the dev cycle, and cloud-based tests later in the cycle when larger tests are needed.
Let's take a deeper dive into the cloud-based testing scenario. k6 version 0.21.1 was recently released, and with it came support for test execution in the cloud. Load Impact has been busy contributing plugin code that interfaces with k6, allowing k6 users to start load tests on the Load Impact cloud infrastructure from their usual k6 command line. Cloud execution frees you from having to maintain your own load generation infrastructure.
This article describes how all this works and how you can use it.
Why run tests in the cloud?
There are several reasons you may want to use the Load Impact cloud execution functionality, but the main one is probably that you prefer not to have to maintain your own load testing infrastructure, which can be both expensive and a time sink.
Another reason to use the Load Impact cloud execution functionality is to run large-scale load tests: currently, k6 has no built-in support for distributed execution, which is what you need to scale up to thousands of Virtual Users (VUs) in a single test. You can work around this by starting multiple k6 instances and streaming their results to a single InfluxDB instance, but that means more setup work for you. Distributed execution is on the k6 roadmap, but even when it lands you will still have to provision your own load generation infrastructure - and provisioning a large test that uses potentially hundreds of load generator servers is not as simple as it may seem.
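The multi-instance workaround might look something like this (a sketch, not an official recipe; the database name `myk6db` and the VU counts are made-up examples, and all instances stream to one local InfluxDB):

```shell
# Hypothetical workaround: split the load across several k6 instances,
# all streaming results to the same InfluxDB database for aggregation.
k6 run --out influxdb=http://localhost:8086/myk6db --vus 500 script.js &
k6 run --out influxdb=http://localhost:8086/myk6db --vus 500 script.js &
k6 run --out influxdb=http://localhost:8086/myk6db --vus 500 script.js &
wait  # block until all three instances have finished
```

You still have to provision the machines these instances run on, keep them in sync, and set up the InfluxDB instance yourself - which is exactly the work cloud execution takes off your hands.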
It is also convenient to not have to store all your test results yourself. While you can run k6 locally and just stream results to loadimpact.com for storage and later analysis, results will automatically be stored at loadimpact.com if you execute a test in the cloud.
How do you run tests in the cloud then?
Basically, just follow these simple steps:
- Get a Load Impact user account if you don't have one (you don't have to pay anything to sign up).
- Get k6 version 0.21.1 or later. Check out the installation instructions for detailed help.
- Run `k6 login cloud`. You'll be prompted for your Load Impact username and password; once you've entered them, k6 logs in and displays the Load Impact API key it will use to run tests in the Load Impact cloud. Single Sign-On users need an alternative authentication mechanism based on a Load Impact API authentication token, and Docker users have a few other gotchas to be aware of. Check out these instructions for more help.
- Now you can start a test in the cloud! Just type `k6 cloud --vus 10 --duration 60s script.js` if you have installed a binary release of k6.
- Follow and control the running test in the Load Impact Insights UI.
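For reference, the `script.js` in the example command can be any ordinary k6 script; a minimal one might look like this (a sketch - the URL is a placeholder, not a real test target):

```javascript
// Minimal k6 script sketch (the URL is a made-up placeholder).
import http from "k6/http";
import { sleep } from "k6";

export default function () {
  http.get("http://test.example.com/"); // fetch the page under test
  sleep(1);                             // think time between iterations
}
```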
I also added a little extra configuration to the `myscript.js` script: a threshold specifying that the pages had to load within 100 ms, and a ramp-up schedule that would increase the load over the course of 2 minutes to first 10 VUs, then 25 VUs, hold steady at 25 VUs for a minute, and then ramp down. Here is the extra config data:
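Reconstructed from that description, the extra configuration might look roughly like this (a sketch: the `p(95)` threshold form and the 30-second ramp-down duration are assumptions, not the post's exact values):

```javascript
// Sketch of the extra k6 options added to myscript.js.
export let options = {
  thresholds: {
    // Pages must load within 100 ms (expressed here as the 95th
    // percentile of request duration - an assumed formulation).
    http_req_duration: ["p(95)<100"],
  },
  stages: [
    { duration: "1m", target: 10 },  // ramp up to 10 VUs over the first minute
    { duration: "1m", target: 25 },  // then up to 25 VUs by the 2-minute mark
    { duration: "1m", target: 25 },  // hold steady at 25 VUs for a minute
    { duration: "30s", target: 0 },  // ramp down (assumed duration)
  ],
};
```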
(If you want to know more about configuring k6, read about k6 options)
This configuration allows me to execute k6 tests in the cloud just by entering `k6 cloud myscript.js` on the command line of my laptop - all the configuration needed is in the `myscript.js` file itself.
The beauty of running your tests in the cloud is, of course, that you can scale up to much larger traffic levels than you can generate on any single machine you may have at work (including your laptop). You could change the `target` values in the k6 options above so that this test ramps up to 5,000 VUs instead of just 25. Load Impact automatically provisions enough cloud servers to simulate 5,000 concurrent VUs - you don't have to worry about it. During the test, Load Impact controls all the servers and running k6 instances, aggregating results data in real time.
Another useful feature of cloud-based tests is that all your results end up in one place, which makes comparisons between tests simpler. Load Impact Insights has a trend view where you can see the latest load tests you have run using a certain configuration, which can help you detect longer-term performance regressions. This is what the trend view looks like for my test. (To simulate a performance regression, I changed the script to load some quite slow URLs, which caused the URL load time bar to suddenly grow in size on my fifth test run.)
What You See Is What You Get
There is one cool feature worth mentioning that relates to how k6 executes tests in the cloud: if you issue the command `k6 cloud script.js`, the cloud test will always use the exact same configuration as a locally executed test would have (i.e. if you had run it with `k6 run script.js` from the same directory).
This happens because the cloud has no information at all about your test configuration until you issue the `k6 cloud script.js` command.
When you do, your local k6 instance will read the `script.js` file and parse/execute it. The `default()` function will not be run at this stage, but any `import` statements or `open()` calls in your script code will be executed, allowing k6 to figure out any and all dependencies your script may have. k6 then packages all the necessary files into an archive format and sends that archive to the cloud. The cloud provisions load generator servers and sends each of them a copy of the archive. Each individual k6 instance unpacks the archive, giving every instance a full copy of the exact same configuration that k6 saw on the machine where you issued the `k6 cloud` command.
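If you're curious what gets sent, you can build the same kind of bundle locally with the `k6 archive` subcommand (assuming your k6 version includes it):

```shell
# Build locally the same bundle that `k6 cloud` would upload.
k6 archive script.js   # writes archive.tar in the current directory
tar -tf archive.tar    # inspect the bundled scripts, data files and metadata
k6 run archive.tar     # run the archived test locally from the bundle
```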
The cool thing about this is that it means you'll never be in doubt as to what test configuration you're starting when you run a test in the cloud. Running a test locally with `k6 run script.js` and running the same test in the cloud with `k6 cloud script.js` uses the exact same configuration, provided both commands are issued from the same directory. If you had the latest-and-greatest version of the test code in the current working directory on your laptop when you fired off the cloud test, then that is what all the distributed, cloud-based k6 load generators will be using. If you had a locally modified copy of the test repo, with uncommitted changes, then that is what the cloud test will use.
We think the cloud execution functionality is the final piece of the puzzle: the one that means we now have a solution which can genuinely be called developer-friendly. The standalone k6 app, combined with our test execution in the cloud and our test results storage & analysis functionality, gives users:
- **Convenient large-scale test execution in the cloud.** Using the cloud execution functionality, you don't need to set up or maintain your own load generator infrastructure when running large load tests.
- **Powerful cloud-based test results storage and analysis.** You can run a test locally and stream the results to Load Impact Insights, or you can execute the test using Load Impact's cloud infrastructure. In both cases your results are stored at loadimpact.com, which means you not only eliminate the hassle of maintaining a data storage system but can also use the Load Impact Insights analysis tool to dig deeper into your results data.
Don't like clouds?
It is important to note that k6 works very well for the many people who do not use the Load Impact cloud functionality. The cloud options are there to make your life easier, free up your time and let you focus on writing test code, but k6 is fully functional as a stand-alone tool, for those who want to use it that way!