High memory usage

Hi there!

I have a simple test - 2 GET and 1 POST operations.
VUs are configured at 100 over a 30-minute run time.

Running the test in Docker or from the terminal leads to high memory usage.
In Docker I stop it at ~22 GB, and in the terminal it reaches ~8 GB (and still growing).

Here is my test file:

import http from 'k6/http';
import { sleep, check } from 'k6';
import { FormData } from 'https://jslib.k6.io/formdata/0.0.2/index.js';

const EXCEL_FILE_BIN = open('./XXX.xlsm', 'b')

export const options = {
  stages: [
      { duration: '5m', target: 100 }, // traffic ramp-up from 1 to 100 users over 5 minutes.
      { duration: '19m', target: 100 }, // stay at 100 users for 19 minutes
      { duration: '5m', target: 0 }, // ramp-down to 0 users
    ],
};

export function setup() {

  const params = {
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
  }

  const body = {
    "username": "XXXXXX",
    "password": "XXXXXX",
    "client_id": "XXXXXX",
    "client_secret": "XXXXXX",
    "grant_type": "XXXXXX",
  }

  const response = http.post("XXXXXX", body, params)

  return response.json()['XXXXXX']
}


export default function (data) {

  const req_params = {
    headers: { 'Authorization': `Bearer ${data}` },
  }
  
  const req_post_engagement_document_form_data =  new FormData();
  req_post_engagement_document_form_data.append('XXXXXX', 'XXXXXX');
  req_post_engagement_document_form_data.append('XXXXXX', 'XXXXXX');
  req_post_engagement_document_form_data.append('XXXXXX', 'XXXXXX');
  req_post_engagement_document_form_data.append('XXXXXX', 'XXXXXX');
  req_post_engagement_document_form_data.append('XXXXXX', 'XXXXXX');
  req_post_engagement_document_form_data.append('XXXXXX', http.file(EXCEL_FILE_BIN, 'XXXXXX'));

  const req_post_engagement_document_params = {
    headers: {
      'Authorization': `Bearer ${data}`,
      'Content-Type': 'multipart/form-data; boundary=' + req_post_engagement_document_form_data.boundary
    },
  }

  const req_get_users = {
    method: 'GET',
    url: 'XXXXXX1',
    params: req_params
  };

  const req_get_work_types = {
    method: 'GET',
    url: 'XXXXXX2',
    params: req_params
  };

  const req_post_engagement_document = {
    method: 'POST',
    url: 'XXXXXX3',
    params: req_post_engagement_document_params,
    body: req_post_engagement_document_form_data.body(),
  }

  const responses = http.batch([req_get_users, req_get_work_types, req_post_engagement_document]);

  check(responses[0], {
    'response code was 200': (response) => response.status == 200,
  });

  check(responses[1], {
    'response code was 200': (response) => response.status == 200,
  });

  check(responses[2], {
    'response code was 201': (response) => response.status == 201,
  });
}

What can I do to reduce memory usage?

Hey @nktedo001,
welcome to the community forum :tada:

It seems you’re hitting the following issue: Improving handling of large files in k6 · Issue #2974 · grafana/k6 · GitHub. Can you report how big your xlsm file is, please?

Hi, thanks for the answer!

For now it is ~2.2MB.
In further tests I have goal to try to POST/PATCH files like 1GB.

I think it would be helpful if we could prepare some data in setup(), not only in the init context or the test function. Since setup() is called only once, the test could then use some kind of pointer to the binary data instead of loading it for every VU/iteration.

I also moved the form-data preparation to the init section, but it doesn’t help.
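For context on why setup() can’t help with this today: k6 passes setup()’s return value to every VU by serializing it to JSON, and raw bytes don’t survive that round trip. A minimal plain-Node sketch of the problem (no k6 APIs involved):

```javascript
// setup() return values travel to the VUs as JSON, so binary data is mangled.
const bytes = new Uint8Array([0x50, 0x4b, 0x03, 0x04]); // e.g. a ZIP/xlsm magic number

// This is effectively what happens between setup() and each VU:
const wire = JSON.stringify(bytes);
const received = JSON.parse(wire);

console.log(wire);                           // '{"0":80,"1":75,"2":3,"3":4}' — an object, not bytes
console.log(Array.isArray(received));        // false
console.log(received instanceof Uint8Array); // false
```

The typed array comes out the other side as a plain object with numeric string keys, so even if k6 allowed it, setup() could not hand a usable binary buffer to the VUs.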

Please take a look at SharedArray!

SharedArrays do not support xlsm as binary data
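For the record, SharedArray entries only need to be JSON-serializable, so a commonly suggested workaround is to base64-encode the binary file first: a base64 string survives the JSON round trip losslessly. This is only a sketch of the idea, not a fix - every VU still pays for its own decoded copy, so peak memory is not fundamentally reduced. In a real k6 script you would use b64encode/b64decode from k6/encoding; the version below uses Node’s Buffer so it can run outside k6:

```javascript
// Base64 makes binary JSON-safe, so it can live in a SharedArray-style store.
// In k6 you'd use encoding.b64encode / encoding.b64decode from 'k6/encoding';
// here Buffer stands in so the sketch runs under plain Node.
const original = Buffer.from([0x50, 0x4b, 0x03, 0x04, 0xff, 0x00]);

// init context: encode once, store the string (JSON-serializable)
const encoded = original.toString('base64');
const shared = JSON.parse(JSON.stringify([encoded])); // simulates the SharedArray round trip

// VU code: decode back to bytes before building the request body
const decoded = Buffer.from(shared[0], 'base64');

console.log(decoded.equals(original)); // true — the round trip is lossless
```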


Hey @nktedo001,

as I shared before, you’ve mostly hit the limit of file handling in k6.
We are actively working on it, and in the next release (8 weeks from now) we may have a new experimental API for files. It will not solve the entire issue, since we still need to integrate it better with other components like the k6/http module, but it will be a starting point and an important improvement.

The suggestion for now is to reduce the memory impact by lowering

  • VUs: each of them creates its own copy of the file in memory
  • File size: it multiplies the cost of each of those copies
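As a back-of-the-envelope illustration of those two levers (the copies-per-VU count here is an assumption for illustration, not a measured number; the linked issue discusses how many copies k6 can end up holding):

```javascript
// Rough memory estimate: every VU holds its own copy of the file, and
// building the multipart body plus the HTTP request multiplies that.
const fileSizeMB = 2.2;  // the ~2.2 MB xlsm from this thread
const vus = 100;         // peak VUs from the stages config
const copiesPerVU = 3;   // assumed: file + form-data body + request buffer

const estimateMB = fileSizeMB * vus * copiesPerVU;
console.log(`~${estimateMB.toFixed(0)} MB just for request bodies`); // ~660 MB
```

Even this underestimates the observed 8–22 GB, because allocations from previous iterations that the garbage collector has not yet reclaimed pile on top - which is exactly why fewer VUs and a smaller file both help.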

I know that isn’t a very satisfying answer, but we are aware of the problem, and that is why we are working on alternative solutions.
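One more documented knob that can help on the response side: the checks in the script above only look at response.status, so the bodies of the two GETs never need to be kept. Setting discardResponseBodies in the options tells k6 to drop response bodies instead of holding them in memory (a request that does need its body can override this per-request with responseType: 'text'). A sketch, reusing the stages from the script above:

```javascript
export const options = {
  discardResponseBodies: true, // don't keep response bodies in memory
  stages: [
    { duration: '5m', target: 100 },
    { duration: '19m', target: 100 },
    { duration: '5m', target: 0 },
  ],
};
```

This won’t reduce the cost of the upload itself, but it trims per-iteration allocations on a long 100-VU run.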


Hi,

I have similar needs - POSTing large files to the system under test.

I don’t need a binary file, so I am generating a large array as a SharedArray and posting that as the test data. I think this should share the data among the VUs without duplicating all that memory - is that correct?
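Broadly, yes: a SharedArray is constructed once, and all VUs read from the same backing copy (each element access hands back a fresh copy of that element, but the underlying data is not duplicated per VU). Conceptually it behaves like a construct-once, module-level store; here is a plain-Node sketch of that idea (the real API is new SharedArray(name, fn) from k6/data):

```javascript
// Conceptual model of SharedArray: build the expensive data exactly once
// at module scope and hand the same backing store to every consumer.
// (In k6: new SharedArray('name', () => [...]) from 'k6/data'.)
let backing = null;

function sharedData() {
  if (backing === null) {
    // runs only once, like the SharedArray constructor callback
    backing = Array.from({ length: 1000 }, (_, i) => `row-${i}`);
  }
  return backing;
}

const a = sharedData(); // first "VU" triggers the build
const b = sharedData(); // second "VU" gets the same store, no rebuild

console.log(a === b);   // true — one copy backs all consumers
console.log(a.length);  // 1000
```

Note this sketch only models the sharing; it does not reproduce k6’s copy-on-access behavior, and it says nothing about cloud-side limits, which are a separate matter.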

I’m currently trying 50 MB (running k6 cloud bigtest2.js). One of the requests makes it and the file is uploaded, but then I get “Aborted (by limit)”.

I’m trying to get distributed execution working - should that get around the limit?

Want to understand if I’m on the correct path here.

Thanks very much.

Hey @andyt1,
can you please open a dedicated topic, as your issue doesn’t seem to be connected to the original one?

If you can also add a simple example of the file you are loading into the SharedArray, that would help us troubleshoot it.

If you are a k6 Cloud customer, consider contacting support directly, as we may be able to give you higher priority.

I hope it helps.
