As indicated in the official loadimpact/k6 documentation, we can execute a single k6 script as follows:
k6 run ../tests/http_get.js
How would I go about executing multiple script files in a single run? Specifically, all scripts that reside in a given local directory. Something like:
k6 run ../tests/
Is this supported out of the box by k6?
Depending on your setup, there are a couple of different ways you can solve this. A pretty straightforward one is to fork the k6 run commands inside a shell script:
#!/bin/sh
# Run the first two tests in the background and the last one in the foreground.
k6 run test1_spec.js &
k6 run test2_spec.js &
k6 run test3_spec.js
# Wait for the backgrounded k6 processes to finish as well.
wait
You could easily write some more elaborate shell scripting to read everything from the /tests/ directory and run each script that way (see the sketch below). I chose to do it like this because I had custom input parameters to pass to each specific test.
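For example, a minimal sketch of that approach, assuming all of your scripts live in a ../tests/ directory as in the question, might look like this:

#!/bin/sh
# Sketch: run every .js file in ../tests/ as its own k6 process, in parallel.
for spec in ../tests/*.js; do
  k6 run "$spec" &
done
# Wait for all backgrounded k6 runs to finish before exiting.
wait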
Another way would be to write a Docker Compose file that does pretty much the same thing. This starts a Docker container for each test and runs it in there. The k6 Docker image is nothing more than a small Linux image with the k6 binary added to it.
version: '3'
services:
  k6_test:
    image: loadimpact/k6
    container_name: test_k6
    volumes:
      - ./:/tests
    command: run /tests/test_spec.js
    ports:
      - "6565:6565"
  k6_test2:
    image: loadimpact/k6
    container_name: test2_k6
    volumes:
      - ./:/tests
    command: run /tests/test2_spec.js
    ports:
      - "6566:6565"
Both of these methods should allow you to run multiple tests at the same time in a CI environment as well as on your local machine.