How to get a k6 script that works on-prem with k6 OSS to run in k6 Cloud (external files, certs, API keys, other secrets)

I have a complex script that works in k6 OSS, but I am unable to get it to work in k6 Cloud. I bundled it using the archive feature and get the following error:
k6 cloud archive.tar
Command "cloud" is deprecated, the k6 team is in the process of modifying and deprecating the "k6 cloud" command behavior.
In the future, the "cloud" command will only display a help text, instead of running tests in the Grafana Cloud k6.

To run tests in the cloud, users are now invited to migrate to the "k6 cloud run" command instead.

          /\      |‾‾| /‾‾/   /‾‾/
     /\  /  \     |  |/  /   /  /
    /  \/    \    |     (   /   ‾‾\
   /          \   |  |\  \ |  (‾)  |
  / __________ \  |__| \__\ \_____/ .io

 execution: cloud
    script: archive.tar
    output: https://carmax.grafana.net/a/k6-app/runs/3323110

 scenarios: (100.00%) 1 scenario, 20 max VUs, 7m30s max duration (incl. graceful stop):
          * default: Up to 20 looping VUs for 7m0s over 12 stages (gracefulRampDown: 30s, gracefulStop: 30s)

ERRO[0037] error while tailing cloud logs error="websocket: bad handshake"
test status: Aborted (script error)

Run [--------------------------------------] Aborted (script error)
ERRO[0065] The test has failed

Additionally, I also need to be able to invoke this script from an ADO pipeline. I have done some research, but I am finding it difficult to find a recipe for doing this.

Re-ran the test with "k6 cloud run archive.tar" and got the same result.

Hi @hdecouto !

Welcome to the community forums! :wave:

The symptoms you describe usually mean that some resource your test uses isn't available in the cloud (e.g., it wasn't bundled into the test archive).

Do you use any absolute file paths in your test script, or do you maybe open a file using a dynamic variable?
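
For example (just a sketch, not your actual code; the file and variable names are made up), an open() call with a static relative path is resolved when the archive is built, so the file ends up inside it; a path assembled from a runtime value may point to a file that was never bundled once the test runs in the cloud:

    import http from 'k6/http';

    // Static relative path: resolved when the archive is built, so the file gets bundled
    const users = JSON.parse(open('./data/users.json'));

    // Path assembled from a runtime value: if the value differs in the cloud,
    // the resulting path may not exist inside the archive
    const envName = __ENV.TARGET_ENV;
    const perEnv = JSON.parse(open('./data/' + envName + '.json'));

    export default function () {
      http.get('https://test.k6.io/');
    }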

Cheers!

Hi Oleg,

Yes, my local GitHub repo has folders off the main folder that I am accessing in the script using …\scripts\script.js, for example. Does everything need to be in a single folder?
Also, I have a PEM cert & key, as well as an API key that is passed in using an env variable. I am also trying to write a custom log file containing the request results, etc., and I am using groups to separate each set of request metrics that I need to analyze.
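
Roughly, the relevant parts of my script look like this (simplified; the paths, domain, and names here are placeholders):

    import http from 'k6/http';
    import { group } from 'k6';

    // client cert + key live in a sibling folder (placeholder paths)
    const cert = open('../certs/cert1.pem');
    const key = open('../certs/cert1.key');

    export const options = {
      tlsAuth: [
        { domains: ['api.example.com'], cert: cert, key: key },  // placeholder domain
      ],
    };

    export default function () {
      group('request-set-1', function () {
        // the API key is passed on the command line with -e APIKEY=...
        http.get('https://api.example.com/endpoint', {
          headers: { 'x-api-key': __ENV.APIKEY },
        });
      });
    }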

thanks in advance for your help.

Hi @hdecouto

No, that doesn't mean everything has to be in the same folder. My point was that, to run the test in Grafana Cloud k6, everything needs to be packed into an archive and uploaded to the cloud. If your script opens a file by a dynamic name (a variable) or by an absolute file path (here I still need to check the details), the same path may not be resolved correctly in the cloud.

So, as a workaround, I recommend you switch to relative paths if possible, e.g., do not use C:/my/path, etc.
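
As a rough sketch of what I mean (placeholder paths):

    // Relative to the script file -- resolved and bundled when the archive is built
    const cert = open('../certs/cert1.pem');

    // Absolute, machine-specific path -- may not resolve the same way in the cloud
    // const cert = open('C:/my/path/certs/cert1.pem');

Since the archive is a plain tar file, you can also list its contents locally to double-check that the certs and data files you expect actually got packed inside it.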

But I also need to investigate further how k6 handles absolute paths.

Cheers!


Thanks for that answer. So if my folder is c:\my\path, the scripts are in c:\my\path\scripts, and the certs are in c:\my\path\certs, how would I access them via a relative path? "…\certs\cert1.pem"? Should I assume that the archive knows about my …\certs folder and packages it up? What are the criteria?

Also, if I create an output file in my on-prem script, will the cloud allow me to open and write to the same output? Finally, I pass in API keys via an environment variable on the command line using -e APIKEY=… Will this work in the cloud?

Thanks again, and sorry for all the questions. I've been digging through the documentation and could not find a clear answer as to what exactly is available in k6 Cloud vs. on-prem/OSS. Also, is that archive stored in a secure location, since it contains everything in my folder?