Hi team,
Thanks in advance for reading.
I’m attempting to run a test that uses a constant-arrival-rate executor to reach 100 RPS, but I’m consistently falling short of that target. I believe my testing environment has sufficient resources for this test: I can reach ~220 RPS with the constant-vus executor, and ~140 RPS if I instead set the target to 200 RPS with the constant-arrival-rate executor. My testing environment has the following resources:
vCPU: 32
Memory: 64 GiB
Network Bandwidth: 15 Gbps
This is my test script:
import http from 'k6/http';
import { check } from 'k6';
import { SharedArray } from 'k6/data';

export const options = {
  discardResponseBodies: true,
  scenarios: {
    average_load_test: {
      executor: 'constant-arrival-rate',
      duration: '30s',
      rate: 100,
      timeUnit: '1s',
      preAllocatedVUs: 600,
      maxVUs: 1200,
    },
  },
};

// Load the list of target URLs once, in the init context, and share it across VUs.
const data = new SharedArray('urls', function () {
  return JSON.parse(open(__ENV.URLS_FILENAME));
});

export default function () {
  // Each iteration requests one randomly chosen URL from the list.
  const url = data[Math.floor(Math.random() * data.length)];
  const response = http.get(url);
  check(response, {
    'response code was 200': (res) => res.status == 200,
  });
}
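For context, the URLs file is just a JSON array of absolute URLs, and I pass its name to the script via the URLS_FILENAME environment variable. The file name and URLs below are placeholders rather than my real targets, and I’ve omitted the dashboard output option:

// urls.json (placeholder contents)
// ["https://example.com/a", "https://example.com/b", "https://example.com/c"]

k6 run -e URLS_FILENAME=urls.json /local/load_test/tests/k6test.js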
My test output is as follows. As you can see, I only reached 79.957096 requests per second. There were no warnings about k6 running out of available VUs.
execution: local
script: /local/load_test/tests/k6test.js
output: dashboard
scenarios: (100.00%) 1 scenario, 1200 max VUs, 1m0s max duration (incl. graceful stop):
* average_load_test: 100.00 iterations/s for 30s (maxVUs: 600-1200, gracefulStop: 30s)
✓ response code was 200
█ teardown
checks.........................: 100.00% ✓ 3001 ✗ 0
data_received..................: 1.5 GB 38 MB/s
data_sent......................: 11 MB 261 kB/s
http_req_blocked...............: avg=32.24ms min=223ns med=5.5ms max=347.45ms p(90)=97.05ms p(95)=276.72ms
http_req_connecting............: avg=11.97ms min=0s med=878.85µs max=198.8ms p(90)=32.43ms p(95)=136.6ms
http_req_duration..............: avg=611.73ms min=487.28µs med=189.3ms max=13.15s p(90)=938.17ms p(95)=1.61s
{ expected_response:true }...: avg=611.73ms min=487.28µs med=189.3ms max=13.15s p(90)=938.17ms p(95)=1.61s
http_req_failed................: 0.00% ✓ 0 ✗ 3229
http_req_receiving.............: avg=111.64ms min=14.56µs med=7.67ms max=5.22s p(90)=373.34ms p(95)=608.14ms
http_req_sending...............: avg=46.12µs min=13.66µs med=47.86µs max=120.09µs p(90)=57.62µs p(95)=61.6µs
http_req_tls_handshaking.......: avg=19.76ms min=0s med=4.51ms max=289.05ms p(90)=94.07ms p(95)=139.91ms
http_req_waiting...............: avg=500.04ms min=424.27µs med=140.82ms max=13.13s p(90)=710.21ms p(95)=1.2s
http_reqs......................: 3229 79.957096/s
iteration_duration.............: avg=693.62ms min=1.52ms med=225.59ms max=13.16s p(90)=1.1s p(95)=1.77s
iterations.....................: 3001 74.311317/s
vus............................: 0 min=0 max=83
vus_max........................: 600 min=600 max=600
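As a sanity check on the “no VU warnings” point: as I understand the arrival-rate model, the executor needs roughly rate × average iteration duration VUs, which here is about 100 × 0.69 s ≈ 69 VUs, and the summary shows vus peaking at 83, far below the 600 pre-allocated. To make any VU starvation impossible to miss, I’m considering failing the run on the built-in dropped_iterations metric (my assumption being that this is the metric that counts arrivals the executor couldn’t start), along these lines:

export const options = {
  discardResponseBodies: true,
  thresholds: {
    // Fail the run if the executor ever drops an iteration
    // because no free VU is available.
    dropped_iterations: ['count<1'],
  },
  scenarios: {
    average_load_test: {
      executor: 'constant-arrival-rate',
      duration: '30s',
      rate: 100,
      timeUnit: '1s',
      preAllocatedVUs: 600,
      maxVUs: 1200,
    },
  },
};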
I then updated my options to target 200 RPS, and the test reached ~140 RPS. That still falls short of the 200 RPS target, but it hopefully shows that my environment is capable of at least 100 RPS.
export const options = {
  discardResponseBodies: true,
  scenarios: {
    average_load_test: {
      executor: 'constant-arrival-rate',
      duration: '30s',
      rate: 200,
      timeUnit: '1s',
      preAllocatedVUs: 600,
      maxVUs: 1200,
    },
  },
};
execution: local
script: /local/load_test/tests/k6test.js
output: dashboard
scenarios: (100.00%) 1 scenario, 1200 max VUs, 1m0s max duration (incl. graceful stop):
* average_load_test: 200.00 iterations/s for 30s (maxVUs: 600-1200, gracefulStop: 30s)
✓ response code was 200
█ teardown
checks.........................: 100.00% ✓ 6001 ✗ 0
data_received..................: 3.1 GB 68 MB/s
data_sent......................: 21 MB 458 kB/s
http_req_blocked...............: avg=31.55ms min=200ns med=5.53ms max=703.16ms p(90)=96.46ms p(95)=276.08ms
http_req_connecting............: avg=11.54ms min=0s med=873.24µs max=215.31ms p(90)=31.19ms p(95)=94.71ms
http_req_duration..............: avg=648.77ms min=503.62µs med=203.51ms max=24.85s p(90)=980.87ms p(95)=1.58s
{ expected_response:true }...: avg=648.77ms min=503.62µs med=203.51ms max=24.85s p(90)=980.87ms p(95)=1.58s
http_req_failed................: 0.00% ✓ 0 ✗ 6438
http_req_receiving.............: avg=104.7ms min=13.98µs med=7.56ms max=10.14s p(90)=340.7ms p(95)=608.45ms
http_req_sending...............: avg=44.12µs min=10.57µs med=45.54µs max=917.9µs p(90)=56.24µs p(95)=60.47µs
http_req_tls_handshaking.......: avg=19.45ms min=0s med=4.54ms max=701.72ms p(90)=55.11ms p(95)=139.57ms
http_req_waiting...............: avg=544.03ms min=434.18µs med=149.12ms max=24.84s p(90)=757.64ms p(95)=1.26s
http_reqs......................: 6438 141.161241/s
iteration_duration.............: avg=730.82ms min=1.28ms med=243.92ms max=24.85s p(90)=1.15s p(95)=1.72s
iterations.....................: 6001 131.579466/s
vus............................: 0 min=0 max=152
vus_max........................: 600 min=600 max=600
running (0m45.6s), 0000/0600 VUs, 6001 complete and 0 interrupted iterations
average_load_test ✓ [======================================] 0000/0600 VUs 30s 200.00 iters/s
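To see where throughput actually plateaus, my next experiment is a ramping-arrival-rate scenario along these lines (the scenario name, rates, and stage lengths are just a first guess, not something I’ve run yet):

export const options = {
  discardResponseBodies: true,
  scenarios: {
    find_plateau: {
      executor: 'ramping-arrival-rate',
      startRate: 50,
      timeUnit: '1s',
      preAllocatedVUs: 600,
      maxVUs: 1200,
      stages: [
        // Step the arrival rate up past the problem area to see
        // where the achieved RPS stops tracking the target.
        { target: 100, duration: '30s' },
        { target: 200, duration: '30s' },
        { target: 300, duration: '30s' },
      ],
    },
  },
};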
Or, if I use the constant-vus executor, I’m able to reach ~220 RPS. There are some non-200 responses in this test, but it was the same script, and it hopefully (again) shows that my environment is capable. (A small sketch of how I’d log those failing statuses follows the output below.)
export const options = {
  discardResponseBodies: true,
  scenarios: {
    average_load_test: {
      executor: 'constant-vus',
      vus: 200,
      duration: '15s',
    },
  },
};
execution: local
script: /local/load_test/tests/k6test.js
output: -
scenarios: (100.00%) 1 scenario, 200 max VUs, 45s max duration (incl. graceful stop):
* average_load_test: 200 looping VUs for 15s (gracefulStop: 30s)
✗ response code was 200
↳ 90% — ✓ 4600 / ✗ 478
checks.........................: 90.58% ✓ 4600 ✗ 478
data_received..................: 2.4 GB 101 MB/s
data_sent......................: 16 MB 679 kB/s
http_req_blocked...............: avg=40.82ms min=215ns med=5.29ms max=1.06s p(90)=99.13ms p(95)=280.91ms
http_req_connecting............: avg=12.83ms min=0s med=868.42µs max=255.41ms p(90)=34.96ms p(95)=136.66ms
http_req_duration..............: avg=576.89ms min=482.56µs med=172.95ms max=14.46s p(90)=1.27s p(95)=1.86s
{ expected_response:true }...: avg=614.5ms min=482.56µs med=205.92ms max=14.46s p(90)=1.29s p(95)=1.89s
http_req_failed................: 8.75% ✓ 478 ✗ 4984
http_req_receiving.............: avg=102.99ms min=13.17µs med=5.75ms max=7.11s p(90)=341.21ms p(95)=606.96ms
http_req_sending...............: avg=178.36µs min=9.83µs med=45.45µs max=737.38ms p(90)=57.42µs p(95)=61.99µs
http_req_tls_handshaking.......: avg=25.71ms min=0s med=4.17ms max=1.06s p(90)=94.87ms p(95)=140.61ms
http_req_waiting...............: avg=473.72ms min=421.78µs med=142.12ms max=14.26s p(90)=923.98ms p(95)=1.57s
http_reqs......................: 5462 226.459655/s
iteration_duration.............: avg=664.61ms min=997.36µs med=217.89ms max=14.47s p(90)=1.36s p(95)=2.07s
iterations.....................: 5078 210.538654/s
vus............................: 1 min=1 max=200
vus_max........................: 200 min=200 max=200
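For completeness, this is roughly how I’d extend the default function to log the failing statuses mentioned above; it isn’t in the current script, just a sketch of the next debugging step:

export default function () {
  const url = data[Math.floor(Math.random() * data.length)];
  const response = http.get(url);
  // Log anything that isn't a 200 so the failing URL/status pairs
  // show up in the console output.
  if (response.status !== 200) {
    console.log(`non-200 response: ${response.status} from ${url}`);
  }
  check(response, {
    'response code was 200': (res) => res.status == 200,
  });
}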
Is there something obvious that I’m missing or misunderstanding about this executor?
I really appreciate any and all help you can provide.