Hi team,
I am getting the error below while posting a message to a Kafka topic:
ERRO[0004] GoError: Failed to encode data, OriginalError: cannot decode textual record "com.m.Plan": cannot
decode textual union: cannot decode textual map: cannot determine codec: "book" for key: "b"
at github.com/mostafa/xk6-kafka.(*Kafka).schemaRegistryClientClass.func4 (native)
Here is my script snippet:
import {LoadJKS,
SCHEMA_TYPE_JSON,
TLS_1_2,
SASL_SCRAM_SHA512,
Writer,
SchemaRegistry,
SCHEMA_TYPE_AVRO,
KEY,
VALUE,
RECORD_NAME_STRATEGY} from "k6/x/kafka"; // import kafka extension
import { check } from "k6";
const surl = "https://pp.net";
const topic = "topic1";
const jks = LoadJKS({
path: "./truststore.jks",
password: "Yz",
clientCertAlias: "ca-1"
});
const jks1 = LoadJKS({
path: "./keystore-sr.jks",
password: "YzB",
clientKeyAlias: "retina-sr"
});
const jks2 = LoadJKS({
path: "./ca.jks",
password: "shabda",
clientKeyAlias: "retina-sr",
serverCaAlias: "ca-1"
});
const tlsConfig = {
enableTls: true,
insecureSkipTlsVerify: false,
minVersion: TLS_1_2,
clientCertPem: jks["clientCertsPem"], // The first certificate in the chain
clientKeyPem: jks1["clientKeyPem"],
serverCaPem: jks2["serverCaPem"],
};
const saslConfig = {
username: "demo.book.test",
password: "password1",
algorithm: SASL_SCRAM_SHA512,
};
const writer = new Writer({
brokers: ["pp-broker:443"],
topic: topic,
sasl: saslConfig,
tls: tlsConfig
});
const schemaRegistry = new SchemaRegistry();
const valueSchemaObject = JSON.stringify({
type: "record",
namespace: "com.m",
name: "ServiceP",
doc: "The",
fields: [
{
name: "serviceN",
doc: "Unique number",
type: "string"
},
{
name: "eventType",
doc: "Internal",
type: [
"null",
"string"
],
default: null
},
{
name: "createdTime",
doc: "When CREATED",
type: [
"null",
{
type: "long",
logicalType: "timestamp-millis"
}
],
default: null
},
{
name: "lastModifiedTime",
doc: "When ISSUE",
type: [
"null",
{
type: "long",
logicalType: "timestamp-millis"
}
],
default: null
},
{
name: "hasOrigin",
doc: "origin",
type: [
"null",
"boolean"
],
default: null
},
{
name: "sTypeModes",
doc: "stype",
type: [
"null",
{
type: "array",
items: {
type: "record",
namespace: "com.m.Plan",
name: "SType",
doc: "Type",
fields: [
{
name: "originService",
doc: "Origin",
type: [
"null",
{
type: "enum",
name: "OriginServiceType",
namespace: "com.m.Plan.Modes",
symbols: [
"FAMILY",
"CAR"
]
}
],
default: null
},
{
name: "destService",
doc: "service",
type: [
"null",
{
type: "enum",
name: "ServiceType",
namespace: "com.m.Plan.Type",
symbols: [
"FAMILY",
"CAR"
]
}
],
default: null
}
]
}
}
],
default: null
},
{
name: "b",
doc: "book",
type: [
"null",
{
type: "record",
namespace: "com.m.Plan",
name: "book",
doc: "demo",
fields: [
{
name: "bNumber",
doc: "The 23456",
type: [
"null",
"string"
],
default: null
}
]
}
],
default: null
}
]
});
export default function () {
let messages = [
{
value: schemaRegistry.serialize({
data: {
// data
},
schema: { schema: valueSchemaObject },
schemaType: SCHEMA_TYPE_AVRO,
}),
headers: {
Authorization: "Bearer demo",
},
// offset: 3,
// partition: 0,
// time: new Date(), // Will be converted to timestamp automatically
},
];
writer.produce({ messages: messages });
}
The full schema body is about 6k lines long; I have pasted only the snippet where the issue occurs (a changed part of the schema body). I have removed the actual data and left only a comment in its place.
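In case it helps with diagnosis: my understanding is that xk6-kafka serializes Avro through goavro, whose textual (JSON) codec requires non-null union branches to be wrapped in an object keyed by the branch's type name, using the full name (e.g. "com.m.Plan.book") for named types rather than the short name. A minimal sketch of how the data for the "b" field might need to be shaped, using the field names from the schema above with made-up placeholder values:

```javascript
// Sketch only: all values are placeholders, not my real payload.
// Non-null union branches are wrapped as { "<type name>": value };
// for named types the key is the full name, not just "book".
const data = {
  serviceN: "SVC-00001",              // plain string field, no wrapper
  eventType: { string: "CREATED" },   // union ["null", "string"]
  hasOrigin: { boolean: true },       // union ["null", "boolean"]
  b: {
    "com.m.Plan.book": {              // union ["null", record book]
      bNumber: { string: "23456" },   // nested union inside the record
    },
  },
};
```

If goavro receives the short key "book" instead of the full name here, that would match the "cannot determine codec: \"book\" for key: \"b\"" message, but I am not certain this is the cause.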