Running via docker image

Hi all,

I want to run k6 via Docker so that the devs/QA can execute tests without any local dependencies.

I’m no Docker expert :frowning: so I’m struggling to get it working.

Following the Running k6 guide, I was able to get a simple script working as expected.

However, I have more complex scripts that access other modules, data files, and config files located in sub-directories:

  • plugins (e.g. modules such as uuid.js)
  • config (e.g. JSON files with VUs, duration, etc.)
  • data (CSV files for test execution)

I suspect I’ll need to use Docker’s -v option, but I’m not sure how best to do so.

Any tips?

Hi @BobRuub!

Let me try to help you!

Yes, docker run -v is the right direction.

Let’s say you have a directory containing these files:

├── config.json
├── data
│   └── users.json
└── http_get.js

Here config.json is k6’s config file, users.json is a data file that the test will use, and http_get.js is the test script.
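For reference, a minimal config.json could look something like this — the vus and duration values are just example settings, adjust them to your own test:

```json
{
  "vus": 10,
  "duration": "30s"
}
```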

If I run the command:

docker run --rm -v $(pwd):/scripts loadimpact/k6:latest run /scripts/http_get.js

It mounts the current directory (what pwd returns) as /scripts inside the container, and then executes the following command inside the container:

k6 run /scripts/http_get.js

My http_get.js can use any resource located in the same directory. Just use a relative path, like:

const f = open('data/users.json');
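To make that concrete, here is a sketch of what http_get.js might look like. The target URL and the "name" field are placeholders — adjust them to your own endpoint and JSON shape. Note that open() resolves the path relative to the script file, so the same relative path works both locally and inside the container:

```javascript
import http from 'k6/http';

// Load the data file once per VU at init time.
// Assumes users.json is an array of objects with a "name" field.
const users = JSON.parse(open('data/users.json'));

export default function () {
  // Pick a different user on each iteration (__ITER is a k6 built-in).
  const user = users[__ITER % users.length];
  http.get(`https://test.k6.io/?user=${user.name}`); // placeholder URL
}
```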

If I want k6 to use config.json, the command looks like this:

docker run --rm -v $(pwd):/scripts loadimpact/k6:latest run --config /scripts/config.json /scripts/http_get.js

Let me know if that helps!


Awesome, thanks so much.

It doesn’t work that well on my Windows machine, as directory sharing with Docker is a bit patchy, but it works brilliantly on my Mac, which is what all the developers use anyway :slight_smile:
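For what it’s worth, part of the Windows trouble is often just the $(pwd) expansion, which only works in Unix-style shells. On Windows the equivalent mount is usually written like this (assuming the same /scripts layout as above):

```shell
# PowerShell
docker run --rm -v ${PWD}:/scripts loadimpact/k6:latest run /scripts/http_get.js

# cmd.exe
docker run --rm -v %cd%:/scripts loadimpact/k6:latest run /scripts/http_get.js
```

You may also need to allow drive sharing in the Docker Desktop settings for the volume mount to work at all.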

I’m happy that it helped :relaxed: