We have set up a Kafka Lambda Trigger in AWS to consume from a Kafka read API. The connection works and my C# Lambda function receives JSON objects. However, each JSON object contains a value property holding the serialised data, and I'm struggling to deserialise it using our AVRO schema. I'm using the NuGet packages Confluent.Kafka and Confluent.SchemaRegistry.Serdes. This is my attempt:
using Confluent.Kafka;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

// The Lambda event delivers the record value base64-encoded.
var value = myRecord.GetProperty("value").GetBytesFromBase64();

var schemaRegistryConfig = new SchemaRegistryConfig
{
    Url = "mySchemaRegistry"
};

using var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig);
var deserializer = new AvroDeserializer<MyAvroGeneratedClass>(schemaRegistry);

try
{
    // This is the call that throws the exception quoted below.
    var result = deserializer.DeserializeAsync(value, false, SerializationContext.Empty).Result;
    return result;
}
catch (Exception ex)
{
    return null;
}
Currently, DeserializeAsync throws the following Exception:
AvroDeserializer only accepts type parameters of int, bool, double, string, float, long, byte[], instances of ISpecificRecord and subclasses of SpecificFixed.
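From what I understand, this means the type parameter must either be one of the listed primitives or a class that implements ISpecificRecord, typically one generated from the .avsc schema with the Apache Avro avrogen tool. For reference, here is a minimal sketch of the shape such a generated class has; the record name and the single id field are made up for illustration and are not our actual schema:

using Avro;
using Avro.Specific;

// Sketch of the shape avrogen emits; the "id" field is illustrative only.
public class MyAvroGeneratedClass : ISpecificRecord
{
    public static Schema _SCHEMA = Schema.Parse(
        "{\"type\":\"record\",\"name\":\"MyAvroGeneratedClass\",\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");

    public string id { get; set; }

    // ISpecificRecord members: expose the schema and positional field access.
    public Schema Schema => _SCHEMA;

    public object Get(int fieldPos)
    {
        switch (fieldPos)
        {
            case 0: return id;
            default: throw new AvroRuntimeException("Bad index " + fieldPos);
        }
    }

    public void Put(int fieldPos, object fieldValue)
    {
        switch (fieldPos)
        {
            case 0: id = (string)fieldValue; break;
            default: throw new AvroRuntimeException("Bad index " + fieldPos);
        }
    }
}

If the class passed as the type parameter doesn't implement this interface, the deserializer rejects it before it even looks at the payload.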
Does anyone have experience with a similar setup who can guide me on how to make this work?
I've tried type parameters other than byte[], but those don't work with the AvroDeserializer class either.
The Kafka Lambda Trigger might do some magic that changes the byte array that I'm retrieving.
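One way to verify that is to look at the first bytes of the decoded payload. As far as I know, the Confluent wire format prefixes the Avro body with a magic byte 0x00 followed by a 4-byte big-endian schema ID, so a quick check like this sketch (rawValue is just the decoded value property from above) should show whether that framing survived the trigger:

// Sketch: check for the Confluent wire-format header (magic byte 0x00
// followed by a 4-byte big-endian schema ID) in the decoded payload.
var rawValue = myRecord.GetProperty("value").GetBytesFromBase64();

if (rawValue.Length > 5 && rawValue[0] == 0x00)
{
    int schemaId = (rawValue[1] << 24) | (rawValue[2] << 16) | (rawValue[3] << 8) | rawValue[4];
    Console.WriteLine($"Confluent framing intact, schema id: {schemaId}");
}
else
{
    Console.WriteLine("No magic byte - the trigger may have re-encoded the value.");
}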
Hi, I am looking for an equivalent solution in either Java or pure vanilla Node JavaScript. I haven't seen anything like it freely available on the internet yet. Has anyone gotten a reply from AWS developer support?