Comprehensive Guide to Performance API Testing with K6

Explore K6 examples, configuration, reporting, and Docker integration for real-world test automation frameworks.

#K6 #API #API-Testing #Automation #JavaScript #Docker #CI/CD #Performance-Testing

Performance Testing Demystified: Tips, Tools, and Best Practices

Performance testing is an essential aspect of ensuring the reliability and scalability of web applications, especially those with API endpoints. K6 is a powerful open-source load testing tool that allows you to simulate thousands of concurrent users and analyze the performance of your APIs under various load conditions. Built for the cloud-native era, it is a modern, developer-friendly tool that lets you write test scripts in JavaScript and execute them using the command-line interface (CLI) or as part of your continuous integration pipeline. With K6, you can easily identify performance bottlenecks and optimize your applications, thanks to its detailed metrics and visualizations. In this article, we'll explore how to perform performance API testing using K6, with examples that you can integrate into real-world test automation frameworks.

Performance API testing involves assessing the responsiveness, reliability, and scalability of API endpoints under different load conditions. By simulating realistic user behavior and load patterns, you can identify performance issues such as slow response times, high error rates, and resource exhaustion. Performance API testing helps you ensure that your APIs can handle the expected workload and deliver a satisfactory user experience.

Before we dive into performance API testing with K6, let's first set up our environment and install K6. You can follow the instructions from the official website. Once installed, you can verify the installation by running the command below:

bash
k6 version

Now that we have K6 installed, let's create a simple test script to perform a basic API request. We'll use the http.get() function to send a GET request to a sample API endpoint and print the response status to the console.

test.js
import http from "k6/http";
 
export default function () {
  const response = http.get("https://httpbin.org/get");
  console.log(`Response status: ${response.status}`);
}

You can save this script in a file with a .js extension, such as test.js, and execute it using the K6 CLI:

bash
k6 run test.js

The output should look something like this:

output
 
          /\      |‾‾| /‾‾/   /‾‾/
     /\  /  \     |  |/  /   /  /
    /  \/    \    |     (   /   ‾‾\
   /          \   |  |\  \ |  (‾)  |
  / __________ \  |__| \__\ \_____/ .io
 
  execution: local
     script: test.js
     output: -
 
  scenarios: (100.00%) 1 scenario, 1 max VUs, 10m30s max duration (incl. graceful stop):
           * default: 1 iterations for each of 1 VUs (maxDuration: 10m0s, gracefulStop: 30s)
 
INFO[0000] Response status: 200                          source=console
 
     data_received..................: 6.0 kB 11 kB/s
     data_sent......................: 615 B  1.1 kB/s
     http_req_blocked...............: avg=363ms    min=363ms    med=363ms    max=363ms    p(90)=363ms    p(95)=363ms
     http_req_connecting............: avg=102.07ms min=102.07ms med=102.07ms max=102.07ms p(90)=102.07ms p(95)=102.07ms
     http_req_duration..............: avg=175.25ms min=175.25ms med=175.25ms max=175.25ms p(90)=175.25ms p(95)=175.25ms
       { expected_response:true }...: avg=175.25ms min=175.25ms med=175.25ms max=175.25ms p(90)=175.25ms p(95)=175.25ms
     http_req_failed................: 0.00%  ✓ 0        ✗ 1
     http_req_receiving.............: avg=133µs    min=133µs    med=133µs    max=133µs    p(90)=133µs    p(95)=133µs
     http_req_sending...............: avg=1.29ms   min=1.29ms   med=1.29ms   max=1.29ms   p(90)=1.29ms   p(95)=1.29ms
     http_req_tls_handshaking.......: avg=219.35ms min=219.35ms med=219.35ms max=219.35ms p(90)=219.35ms p(95)=219.35ms
     http_req_waiting...............: avg=173.82ms min=173.82ms med=173.82ms max=173.82ms p(90)=173.82ms p(95)=173.82ms
     http_reqs......................: 1      1.855914/s
     iteration_duration.............: avg=538.25ms min=538.25ms med=538.25ms max=538.25ms p(90)=538.25ms p(95)=538.25ms
     iterations.....................: 1      1.855914/s
 
 
running (00m00.5s), 0/1 VUs, 1 complete and 0 interrupted iterations
default ✓ [======================================] 1 VUs  00m00.5s/10m0s  1/1 iters, 1 per VU
 

Here's a straightforward example of a load test scenario targeting a single endpoint:

test.js
import { check } from "k6";
import http from "k6/http";
 
export let options = {
  stages: [
    // Ramp-up to 10 virtual users over 1 minute
    { duration: "1m", target: 10 },
    // Stay at 10 virtual users for 3 minutes
    { duration: "3m", target: 10 },
    // Ramp-down to 0 virtual users over 1 minute
    { duration: "1m", target: 0 },
  ],
  thresholds: {
    // 95% of requests should complete within 500ms
    http_req_duration: ["p(95)<500"],
    // Error rate should be less than 1%
    http_req_failed: ["rate<0.01"],
  },
};
 
export default function () {
  let response = http.get("https://httpbin.org/get");
  check(response, {
    "status is 200": (r) => r.status === 200,
  });
}

Explanation:

  • We define a load testing scenario with three stages: ramp-up, steady load, and ramp-down.
  • During the ramp-up stage, we gradually increase the number of virtual users from 0 to 10 over 1 minute.
  • In the steady load stage, we maintain 10 virtual users for 3 minutes to simulate sustained traffic.
  • Finally, in the ramp-down stage, we decrease the number of virtual users back to 0 over 1 minute.
  • We set thresholds to ensure that 95% of requests complete within 500ms and the error rate is less than 1%.
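To make the p(95) threshold concrete, here is a small standalone sketch (plain JavaScript, runnable outside K6) of how a 95th-percentile value is derived from a set of response-time samples. K6 computes percentiles internally; the linear-interpolation method and the sample durations below are illustrative assumptions, not K6's exact implementation.

```javascript
// Simplified illustration of what a `p(95)<500` threshold evaluates.
// Linear interpolation between closest ranks is an assumption here,
// not necessarily K6's exact internal algorithm.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const rank = (p / 100) * (sorted.length - 1);
  const lower = Math.floor(rank);
  const upper = Math.ceil(rank);
  const weight = rank - lower;
  return sorted[lower] * (1 - weight) + sorted[upper] * weight;
}

// Hypothetical http_req_duration samples in milliseconds.
const durations = [120, 135, 160, 180, 210, 250, 320, 410, 480, 900];
const p95 = percentile(durations, 95);

// prints: p(95)=711ms, threshold p(95)<500 fails
console.log(
  `p(95)=${p95.toFixed(0)}ms, threshold p(95)<500 ${p95 < 500 ? "passes" : "fails"}`
);
```

Note how a single slow outlier (900ms) is enough to push p(95) above the 500ms limit, which is exactly why percentile thresholds catch tail latency that averages hide.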

Parametrizing tests with environment variables, such as the base URL, is often necessary:

test.js
import { check } from "k6";
import http from "k6/http";
 
const BASE_URL = __ENV.BASE_URL;
 
export let options = {
  stages: [
    // Ramp-up to 10 virtual users over 1 minute
    { duration: "1m", target: 10 },
    // Stay at 10 virtual users for 3 minutes
    { duration: "3m", target: 10 },
    // Ramp-down to 0 virtual users over 1 minute
    { duration: "1m", target: 0 },
  ],
  thresholds: {
    // 95% of requests should complete within 500ms
    http_req_duration: ["p(95)<500"],
    // Error rate should be less than 1%
    http_req_failed: ["rate<0.01"],
  },
};
 
export default function () {
  let response = http.get(`${BASE_URL}/get`);
  check(response, {
    "status is 200": (r) => r.status === 200,
  });
}

Export the environment variable, then run the test:

bash
export BASE_URL="https://httpbin.org"
k6 run test.js
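If BASE_URL is not exported, __ENV.BASE_URL is undefined and the request URL becomes "undefined/get". A defensive variant is to fall back to a default. The typeof guard below is only there so the snippet also runs outside K6 for illustration; inside a K6 script you would simply write `__ENV.BASE_URL || "https://httpbin.org"`.

```javascript
// __ENV is injected by k6 at runtime; the typeof guard makes this
// snippet runnable outside k6 as well, purely for illustration.
const env = typeof __ENV !== "undefined" ? __ENV : {};

// Fall back to a default base URL so http.get() never receives
// "undefined/get" when the variable is not exported.
const BASE_URL = env.BASE_URL || "https://httpbin.org";
console.log(BASE_URL);
```

K6 also accepts variables directly on the command line with its -e flag, e.g. `k6 run -e BASE_URL=https://httpbin.org test.js`, which avoids depending on the shell environment.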

By default, K6 displays results in the terminal. However, if you prefer a standardized test report in JUnit format, you can use the k6-summary library. I've contributed to this library multiple times to enhance the structure of the test report. (List of contributions)

test.js
import {
  jUnit,
  textSummary,
} from "https://jslib.k6.io/k6-summary/0.1.0/index.js";
import { check } from "k6";
import http from "k6/http";
 
const BASE_URL = __ENV.BASE_URL;
 
export let options = {
  stages: [
    // Ramp-up to 10 virtual users over 1 minute
    { duration: "1m", target: 10 },
    // Stay at 10 virtual users for 3 minutes
    { duration: "3m", target: 10 },
    // Ramp-down to 0 virtual users over 1 minute
    { duration: "1m", target: 0 },
  ],
  thresholds: {
    // 95% of requests should complete within 500ms
    http_req_duration: ["p(95)<500"],
    // Error rate should be less than 1%
    http_req_failed: ["rate<0.01"],
  },
};
 
export default function () {
  let response = http.get(`${BASE_URL}/get`);
  check(response, {
    "status is 200": (r) => r.status === 200,
  });
}
 
export function handleSummary(data) {
  console.log("Preparing the end-of-test summary...");
 
  return {
    "./test-results/results.xml": jUnit(data, options),
    stdout: textSummary(data, { indent: " ", enableColors: true }),
  };
}

Before running the script above, we need to ensure that the test-results folder exists; otherwise, the script will fail to save the test report. Create the folder if it doesn't exist:

bash
mkdir test-results

Once the run finishes, the summary still prints to the terminal as before, because textSummary writes it to stdout (standard output).

This is how the report file should look:

test-results/results.xml
<?xml version="1.0"?>
<testsuites tests="2" failures="1">
  <testsuite name="k6 thresholds" tests="2" failures="1">
    <testcase name="http_req_duration - p(95)&lt;500" classname="Unnamed folder">
      <failure message="trend threshold failed: avg value: 168.73975892857143, min value: 103.246, med value: 106.9505, max value: 1638.793, p(90) value: 321.09240000000005, p(95) value: 556.5869499999998" />
    </testcase>
    <testcase name="http_req_failed - rate&lt;0.01" classname="Unnamed folder" />
  </testsuite>
</testsuites>

And as you might have noticed, the default name of the testsuite is k6 thresholds, and the default classname is Unnamed folder.

That is why, in version 0.1.0, I contributed the name and the classname as optional parameters. If you want a better-formatted test report, provide them in the options object:

test.js
import {
  jUnit,
  textSummary,
} from "https://jslib.k6.io/k6-summary/0.1.0/index.js";
import { check } from "k6";
import http from "k6/http";
 
const BASE_URL = __ENV.BASE_URL;
 
export let options = {
  classname: "test.js",
  name: "Load Test Example",
  stages: [
    // Ramp-up to 10 virtual users over 1 minute
    { duration: "1m", target: 10 },
    // Stay at 10 virtual users for 3 minutes
    { duration: "3m", target: 10 },
    // Ramp-down to 0 virtual users over 1 minute
    { duration: "1m", target: 0 },
  ],
  thresholds: {
    // 95% of requests should complete within 500ms
    http_req_duration: ["p(95)<500"],
    // Error rate should be less than 1%
    http_req_failed: ["rate<0.01"],
  },
};
 
export default function () {
  let response = http.get(`${BASE_URL}/get`);
  check(response, {
    "status is 200": (r) => r.status === 200,
  });
}
 
export function handleSummary(data) {
  console.log("Preparing the end-of-test summary...");
 
  return {
    "./test-results/results.xml": jUnit(data, options),
    stdout: textSummary(data, { indent: " ", enableColors: true }),
  };
}

Now, the test report should appear like this:

test-results/results.xml
<?xml version="1.0"?>
<testsuites tests="2" failures="0">
  <testsuite name="Load Test Example" tests="2" failures="0">
    <testcase name="http_req_duration - p(95)&lt;500" classname="test.js" />
    <testcase name="http_req_failed - rate&lt;0.01" classname="test.js" />
  </testsuite>
</testsuites>

This format should work well with any CI server that consumes JUnit-format test reports.

Usually, your API will have authorization in place, so to hit its endpoints you need to provide the required access token in the request headers.

This can be managed in K6's setup() lifecycle function:

test.js
import {
  jUnit,
  textSummary,
} from "https://jslib.k6.io/k6-summary/0.1.0/index.js";
import { check } from "k6";
import http from "k6/http";
import { authenticate } from "./auth/authenticate.js";
 
const BASE_URL = __ENV.BASE_URL;
 
export function setup() {
  const authResponse = authenticate();
  return authResponse;
}
 
export let options = {
  classname: "test.js",
  name: "Load Test Example",
  stages: [
    // Ramp-up to 10 virtual users over 1 minute
    { duration: "1m", target: 10 },
    // Stay at 10 virtual users for 3 minutes
    { duration: "3m", target: 10 },
    // Ramp-down to 0 virtual users over 1 minute
    { duration: "1m", target: 0 },
  ],
  thresholds: {
    // 95% of requests should complete within 500ms
    http_req_duration: ["p(95)<500"],
    // Error rate should be less than 1%
    http_req_failed: ["rate<0.01"],
  },
};
 
export default function (data) {
  const params = {
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${data.token}`,
    },
  };
 
  let response = http.get(`${BASE_URL}/get`, params);
  check(response, {
    "status is 200": (r) => r.status === 200,
  });
}
 
export function handleSummary(data) {
  console.log("Preparing the end-of-test summary...");
 
  return {
    "./test-results/results.xml": jUnit(data, options),
    stdout: textSummary(data, { indent: " ", enableColors: true }),
  };
}

Here, authenticate() contains your logic for fetching the access token. For demonstration purposes, it returns a hardcoded object, since the endpoint we are targeting doesn't actually require authorization.

auth/authenticate.js
export function authenticate() {
  // Logic to authenticate and return the auth response
  return { token: "my-token" };
}

Note that the default function now accepts a data parameter, which receives whatever setup() returns.

Additionally, params contains a headers object, in which we dynamically populate the access token on each test run.
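If several requests need the same headers, the construction can be extracted into a small helper so every call builds its params the same way. buildParams() is a hypothetical name of my own, not part of K6; the function itself is plain JavaScript.

```javascript
// Hypothetical helper (not a K6 API) that builds the request params
// object from the token returned by setup(), so header construction
// lives in one place.
function buildParams(token) {
  return {
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
    },
  };
}

// prints: Bearer my-token
console.log(buildParams("my-token").headers.Authorization);
```

Inside the default function you would then call `http.get(`${BASE_URL}/get`, buildParams(data.token))`.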

If you want to avoid installing K6 on every CI runner, you can use the official open-source Docker image. We can create a docker-compose.yml file with all the required configuration.

Here is the Official docker-compose configuration.

docker-compose.yml
version: "3.4"
 
networks:
  k6:
 
services:
  influxdb:
    image: influxdb:1.8
    networks:
      - k6
    ports:
      - "8086:8086"
    environment:
      - INFLUXDB_DB=k6
 
  k6:
    image: grafana/k6:latest
    user: root
    networks:
      - k6
    ports:
      - "6565:6565"
    environment:
      - K6_OUT=influxdb=http://influxdb:8086/k6
      - BASE_URL=${BASE_URL}
    working_dir: /app
    volumes:
      - ./:/app
    command: >
      run test.js

Most of this is the default configuration; the environment, working_dir, volumes, and command entries are customized to run our script. Now, you can run it with:

bash
docker-compose -f docker-compose.yml up --build --abort-on-container-exit --exit-code-from k6
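For a quick one-off run without Compose (and without the InfluxDB service), the K6 image can also read the script from stdin; environment variables then have to be passed with Docker's own -e flag. This is a convenience sketch based on the image's documented stdin usage, not a replacement for the Compose setup above.

```shell
docker run --rm -i -e BASE_URL="https://httpbin.org" grafana/k6 run - <test.js
```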

In case you don't have Docker installed, you can follow the instructions here.

Performance API testing is a critical part of ensuring the reliability and scalability of web applications. With K6, you can easily create and execute performance tests for your API endpoints, allowing you to identify performance issues and optimize your applications for better performance. By following the examples and best practices outlined in this article, you can integrate performance API testing into your test automation frameworks and deliver high-quality, performant software to your users.

