
Production of AVRO specific record seems to be creating a Generic Record - v1.1.19/Lambda/Java 17 - Protobuf reading a typed record doesn't find the class to map to #339

Open
bjanischevsky opened this issue Mar 18, 2024 · 0 comments


I am testing all the features of this library and found that if I produce an Avro specific record and then try to read it back as a specific record, I get this error:

Caused by: java.lang.NullPointerException: Cannot invoke "java.lang.Class.newInstance()" because "readerClass" is null
at com.amazonaws.services.schemaregistry.deserializers.avro.DatumReaderInstance.from(DatumReaderInstance.java:42)
at com.amazonaws.services.schemaregistry.deserializers.avro.AvroDeserializer$DatumReaderCache.load(AvroDeserializer.java:114)
at com.amazonaws.services.schemaregistry.deserializers.avro.AvroDeserializer$DatumReaderCache.load(AvroDeserializer.java:111)
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278)
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045)

but when I read it as a generic record, everything works. I have had the same ticket open with AWS developer support for more than 72 hours with no response. Is this the right channel for this issue?
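For context, the stack trace above is consistent with the deserializer resolving the reader class from the schema's full name and getting `null` back when that class is not on the classpath. This is an editor's sketch of that failure mode, not the library's actual code; `ReaderClassLookup` is a hypothetical helper and `com.example.simpleschema` a placeholder name:

```java
// Sketch of the failure mode behind "readerClass is null": a class lookup
// that swallows ClassNotFoundException and returns null, which a later
// instantiation call then dereferences.
public class ReaderClassLookup {

    // Returns the loaded class, or null when it cannot be found on the
    // classpath -- a null here is what would trigger the NPE above.
    static Class<?> resolve(String schemaFullName) {
        try {
            return Class.forName(schemaFullName);
        } catch (ClassNotFoundException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(resolve("java.lang.String"));         // found on the classpath
        System.out.println(resolve("com.example.simpleschema")); // null: not on the classpath
    }
}
```

If the generated `simpleschema` class is not visible to the Lambda's classloader at deserialization time, a lookup of this shape would return `null` and produce exactly this NPE.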

Here's the code that produces it:

config.put(AWSSchemaRegistryConstants.AVRO_RECORD_TYPE, AvroRecordType.SPECIFIC_RECORD.getName());
try (var producerSpecific = new KafkaProducer<String, simpleschema>(config)) {
    simpleschema simpleschema = gson.fromJson(jsonMessage, simpleschema.class);
    outputText = gson.toJson(simpleschema);
    AppSingleton.logger.log("Specific record type: " + simpleschema.getClass().getName() + " " + outputText);
    // send the record to Kafka
    var record = new ProducerRecord<>(typedMessage.topic, key, simpleschema);
    var producerResult = producerSpecific.send(record);
    producerResult.get();
    producerSpecific.flush();
    AppSingleton.logger.log("Finished producing a specific message to Kafka with settings: " + gson.toJson(config));
}
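Since the producer side above looks correct, the error likely comes from the consumer side, which also needs to be told to decode into specific records. A minimal consumer-configuration sketch follows; the deserializer class and constants are from the aws-glue-schema-registry library, while the bootstrap server and region values are placeholders:

```java
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;

import com.amazonaws.services.schemaregistry.deserializers.GlueSchemaRegistryKafkaDeserializer;
import com.amazonaws.services.schemaregistry.utils.AWSSchemaRegistryConstants;
import com.amazonaws.services.schemaregistry.utils.AvroRecordType;

public class SpecificRecordConsumerConfig {

    public static Properties build() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                  StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                  GlueSchemaRegistryKafkaDeserializer.class.getName());
        props.put(AWSSchemaRegistryConstants.AWS_REGION, "us-east-1"); // placeholder
        // Without this property the deserializer falls back to GenericRecord;
        // specific-record reads then fail to resolve a reader class.
        props.put(AWSSchemaRegistryConstants.AVRO_RECORD_TYPE,
                  AvroRecordType.SPECIFIC_RECORD.getName());
        return props;
    }
}
```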

I have a similar issue with Protobuf: when I read the data, the library tries to instantiate an object of the given type but fails to find the class, with this error:

Caused by: java.lang.ClassNotFoundException: /PersonSchemaDynamic$Person
at java.base/java.lang.Class.forName0(Native Method)
at java.base/java.lang.Class.forName(Unknown Source)
at com.amazonaws.services.schemaregistry.deserializers.protobuf.ProtobufWireFormatDecoder.deserializeToPojo(ProtobufWireFormatDecoder.java:72)
... 28 more

I made sure the class file is embedded in different places of the JAR file used to deploy the Lambda, but I always get the same error.
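One detail worth noting: the missing class name in the `ClassNotFoundException` starts with a "/", which suggests the deserializer built the name from an empty package (for example, a .proto file without a `java_package` option). As a minimal illustration (`LeadingSlashLookup` is a hypothetical helper, not library code), `Class.forName` fails on any name with a leading slash even when the class itself is perfectly loadable:

```java
// Demonstrates that a malformed class name with a leading "/" always fails
// to load, regardless of whether the class is on the classpath.
public class LeadingSlashLookup {

    // Returns true if Class.forName can load the given name.
    static boolean canLoad(String name) {
        try {
            Class.forName(name);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(canLoad("java.lang.String"));  // loadable
        System.out.println(canLoad("/java.lang.String")); // fails: leading slash
    }
}
```

If that reading is right, repackaging the class elsewhere in the JAR would not help; the constructed name itself is malformed.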