Hi All,
Apologies if this is a silly question, but I’m currently on the lookout for a new load testing tool. We currently use NeoLoad and find it very user-unfriendly, and its support is poor to non-existent. I’ve been pointed towards Grafana k6 and currently have it running in a Docker container for my investigatory work. I love the coding aspect of it as well as the lovely results graphs.
We have our own console app which generates pre-signed payloads to use in our HTTP requests (ideally I’d just want to do it within k6 itself, but I do not understand crypto at all).
Our k6 script reads from that generated JSON file and we are able to send HTTP requests to our API fine.
My issue is that if our generated file has more than 15k records (JSON objects), the test tries to initialize and just fails part way through.
Please see the current script:
import http from 'k6/http';
import { check, sleep } from 'k6';
import { Counter } from 'k6/metrics';
import { SharedArray } from 'k6/data';
import { scenario } from 'k6/execution';
// Load configuration from config.json
const config = JSON.parse(open('/config/config.json'));
// Extract MPIDs with include set to true and their corresponding names
const includedMPIDs = config.mpidDetails
.filter((item) => item.include === true)
.map((item) => item.name);
// Load data for all included MPIDs into a shared array
const allData = [];
includedMPIDs.forEach((mpid) => {
  const mpidName = mpid.toLowerCase();
  const data = JSON.parse(open(`/data/${mpidName}.json`));
  data.forEach((row) => {
    allData.push({ ...row, mpid: mpidName }); // Include MPID info with each row
  });
});
// Use a shared array to distribute the data
const sharedData = new SharedArray('Combined Data', () => allData);
// Counter for errors
export let errorCount = new Counter('errors');
// Dynamically set VUs and iterations based on the number of records
// const totalRecords = includedMPIDs.reduce((sum, mpid) => sum + allData[mpid].length, 0);
// This is the test configuration
export let options = {
  scenarios: {
    'process-all-data': {
      executor: 'shared-iterations',
      vus: 100, // Number of VUs
      iterations: sharedData.length, // Total number of iterations
      maxDuration: '1h', // Maximum test duration
    },
  },
  cloud: {
    distribution: {
      distributionLabel1: { loadZone: 'amazon:gb:london', percent: 100 },
    },
  },
  tlsAuth: [
    {
      cert: `-----BEGIN CERTIFICATE----- sometext -----END CERTIFICATE-----`, // Inlined PEM client certificate (obfuscated)
      key: `-----BEGIN PRIVATE KEY----- some text -----END PRIVATE KEY-----`, // Inlined PEM private key (obfuscated)
    },
  ],
};
// Log start time in setup()
export function setup() {
  const startTime = new Date().toISOString();
  console.log(`Test started at: ${startTime}`);
  return { startTime }; // Pass start time to teardown
}
// Function to make requests with data
function makeRequest(dataRow) {
  const { mpid, if_no, payload, dip_sig, dip_time, dip_hash, dip_sig_cert } = dataRow;
  let endpoint = `${config.endpoint}${config.subPath}${mpid.toLowerCase()}/${dataRow.if_no.toLowerCase()}`;
  if (mpid.toLowerCase() === 'xxx') {
    endpoint = `${config.endpoint}${config.subPath}${dataRow.if_no.toLowerCase()}`;
  }
  const headers = {
    'Content-Type': 'application/json',
    'X-DIP-Signature': dip_sig,
    'X-DIP-Signature-Date': dip_time,
    'X-DIP-Content-Hash': dip_hash,
    'X-DIP-Signature-Certificate': dip_sig_cert,
  };
  const res = http.post(endpoint, payload, { headers });
let responseBody;
try {
    responseBody = res.body.trim() !== '' ? res.json() : 'Empty response body';
} catch (e) {
responseBody = res.body;
}
// console.log(JSON.stringify({
// mpid: mpid,
// if_no: dataRow.if_no,
// status: res.status,
// response: responseBody,
// timestamp: new Date().toISOString(),
// }));
  const successStatusCode = check(res, {
    'status is 201': (r) => r.status === 201,
  });
  const successBodyParse = check(res, {
    'response body can be parsed to JSON': (r) => {
      try {
        if (r.body.trim() !== '') {
          JSON.parse(r.body);
          return true;
        }
      } catch (e) {
        console.error(`Failed to parse response body: ${e.message}`);
      }
      return false;
    },
  });
if (!successStatusCode || !successBodyParse) {
errorCount.add(1);
}
}
// Main test function
export default function () {
  // Get the current iteration's data based on scenario.iterationInTest
  const iterationIndex = scenario.iterationInTest;
  const dataRow = sharedData[iterationIndex];
  // Process the data
  makeRequest(dataRow);
  sleep(1); // Simulate realistic pacing
}
// Log end time in teardown()
export function teardown(setupData) {
  const endTime = new Date().toISOString();
  console.log(`Test started at: ${setupData.startTime}`);
  console.log(`Test ended at: ${endTime}`);
}
My script is looking at the following config.json (some values obfuscated):
{
  "endpoint": "https://xxx/",
  "subPath": "xxx/",
  "mpidDetails": [
    {
      "name": "xxx",
      "include": true,
      "includedIfTypes": ["IF-003"],
      "apiKey": ""
    }
  ]
}
I’m aware there are experimental packages I could potentially use, but I do not understand them well enough to incorporate them.
I guess I’m asking a few things…
- Can k6 handle reading external files of potentially >1GB in size, or multiple files that are >1GB when combined? (I’m assuming SharedArray is not the correct route.)
- What alternative best fits this scenario, and how/where might I implement it?
- If I had skeleton payloads, would it be better to have k6 update them and then use SubtleCrypto for the message signing in real time, i.e. generate + sign the payloads, rather than reading from a file?
Thanks