Long data type precision loss

Hi, I’m working on a test using the xk6-kafka extension, as the service I need to stress uses Kafka events as input.

However, one of the events I need to produce has to contain an exact long value such as “7333547711914737990”. When I convert it to a number, it loses precision and the result is “7333547711914738000”.

I need to convert it to a long because in the Avro schema I’m using that field is declared as long, which means I can’t send the string.
How can I achieve that? I can’t find BigInt as an option either.


Hi @vvargas, Welcome to the community forum!

Numbers in JavaScript are floats (IEEE 754 doubles) by specification, and as such big ones lose precision.

There is also BigInt, but:

  1. k6 does not currently support BigInt
  2. even if it did, I expect xk6-kafka might not

My expectation is that if you pass it in as a string it might work, but this depends on the xk6-kafka/Avro parts, of which I am not certain.
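For illustration, the precision cliff is easy to reproduce in plain JavaScript (runnable in Node; k6 scripts hit the same Number behavior):

```javascript
// Number is an IEEE 754 double: integers are exact only up to 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991

// The value from the question is well past that limit, so it gets
// rounded to the nearest representable double.
const asNumber = Number("7333547711914737990");
console.log(asNumber); // 7333547711914738000 (last digits lost)

// BigInt keeps every digit, but k6 does not currently support it.
const asBigInt = BigInt("7333547711914737990");
console.log(asBigInt.toString()); // "7333547711914737990"
```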

This might be a better question for @mostafa, who is the maintainer of xk6-kafka.

Hey @mstoykov, thanks for the reply!

Indeed, if I use it as a string I keep the entire number, but since the Avro schema specifies that field as long, I’m forced to convert it to that data type, and that’s when I lose the precision of the last digits.

It is a known issue with some workarounds, for example this one: Error: potential precision loss · Issue #53 · kafkajs/confluent-schema-registry · GitHub. But I’m afraid the xk6-kafka extension doesn’t support that kind of configuration as of today.

Thanks for the insights!

I would still recommend opening an issue with GitHub - mostafa/xk6-kafka: k6 extension to load test Apache Kafka with support for various serialization formats, SASL, TLS, compression, Schema Registry client and beyond, so that it may be looked into.

Again, even if k6 adds support for BigInt, xk6-kafka will likely still need extra work to support it.