Check topic existence on Producer `_produceWrapper()` #24
I suspect this has to do with a topic that was not found. Double-check that the topic is properly created and that its schema is registered in the Schema Registry. Making a note to check for the existence of the topic at https://github.com/waldophotos/kafka-avro/blob/master/lib/kafka-producer.js#L91
Tried the above code, just changed the topic name to 'node_avro'. My Kafka setup allows auto-topic creation, and I could see that the topic indeed got created. A message got pushed to it that looks like the one below. However, I didn't see a schema get created in my Schema Registry. Appreciate any leads. Raw Message:
Spent some time debugging this. Looks like the `var pos = type.encode(val, buf, 5);` line is the one that's throwing that error. Hoping this helps.
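For context, avsc's string writer throws exactly this error when a field declared as `"string"` in the schema is `undefined` at encode time. Here is a simplified pure-JS sketch of that validation step (a hypothetical illustration, not the library's actual code):

```javascript
// Simplified sketch of the check avsc performs when encoding a record:
// every field declared as "string" must actually be a string at encode
// time, otherwise an 'invalid "string"' error is thrown, matching the
// stack trace in this issue.
function encodeRecordSketch(fields, val) {
  for (const field of fields) {
    if (field.type === 'string' && typeof val[field.name] !== 'string') {
      throw new Error(`invalid "${field.type}": ${val[field.name]}`);
    }
  }
  return JSON.stringify(val); // stand-in for the real binary encoding
}

const fields = [{ name: 'id', type: 'string' }];

// A well-formed record encodes fine:
encodeRecordSketch(fields, { id: 'abc' });

// A value missing the required `id` field reproduces the error message:
try {
  encodeRecordSketch(fields, {});
} catch (err) {
  console.log(err.message); // invalid "string": undefined
}
```

So the error most likely means the object handed to `producer.produce()` does not have the shape the registered schema expects.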
Thank you for the research @jobetdelima. It will take some time for me to have a look at this, as I no longer use Kafka for my job. If you need this fixed fast, I'd appreciate a PR.
We got it working in this way. Tested with Node v8.5.0 and v9.7.1. The key was to make sure the Schema Registry had schemas for the topic keys and values, named using the `<topicName>-key` / `<topicName>-value` subject convention (see the curl commands below).

producer.js:

```js
const KafkaAvro = require('kafka-avro');
const sleep = require('sleep');

// If running on the host machine instead of inside Docker, make sure to add
// the hostnames 'kafka', 'schema-registry', 'zookeeper', etc. to 127.0.0.1.
const schemaRegistryUrl = 'http://schema-registry:8081';
const broker = 'kafka:9092';
const topicName = 'avrotest';

// Ensure the schemas are in the registry:
/*
Note that the key is a record so the Kafka REST Proxy can also work
(it doesn't support the plain string type).

curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\": \"record\", \"name\": \"avrotest_key\", \"fields\": [{\"type\": \"string\", \"name\": \"id\"}]}"}' http://schema-registry:8081/subjects/avrotest-key/versions
curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\": \"record\", \"name\": \"avrotest_value\", \"fields\": [{\"type\": \"string\", \"name\": \"id\"}]}"}' http://schema-registry:8081/subjects/avrotest-value/versions
*/

let kafkaAvro = new KafkaAvro({
  kafkaBroker: broker,
  schemaRegistry: schemaRegistryUrl,
  topics: [topicName],
  fetchAllVersions: true
});

kafkaAvro.init()
  .then(() => {
    console.log('Ready to use');
    return kafkaAvro.getProducer({debug: 'all'});
  })
  .then(producer => {
    producer.on('producer error', err => {
      console.log(err);
      process.exit(1);
    });
    producer.on('disconnected', arg => {
      console.log('producer disconnected. ' + JSON.stringify(arg));
    });
    let topic = producer.Topic(topicName, {'request.required.acks': 1});
    while (true) {
      sleep.msleep(100);
      let key = {id: Math.random().toString()};
      let value = {id: Math.random().toString()};
      let partition = -1;
      console.log(key, value);
      producer.produce(topic, partition, value, key);
    }
  })
  .catch(err => {
    console.error('A problem occurred');
    console.error(err);
    process.exit(1);
  });
```

consumer.js:

```js
const KafkaAvro = require('kafka-avro');

// If running on the host machine instead of inside Docker, make sure to add
// the hostnames 'kafka', 'schema-registry', 'zookeeper', etc. to 127.0.0.1.
const schemaRegistryUrl = 'http://schema-registry:8081';
const broker = 'kafka:9092';
const topicName = 'avrotest';

// Ensure the schemas are in the registry. See producer.js.

let kafkaAvro = new KafkaAvro({
  kafkaBroker: broker,
  schemaRegistry: schemaRegistryUrl,
  topics: [topicName],
  fetchAllVersions: true
});

kafkaAvro.init()
  .then(() => {
    console.log('Ready to use');
    return kafkaAvro.getConsumer({
      'group.id': 'avrotest',
      'socket.keepalive.enable': true,
      'enable.auto.commit': true,
    });
  })
  .then(consumer => {
    let stream = consumer.getReadStream(topicName, {
      waitInterval: 0
    });
    stream.on('error', err => {
      console.log('stream error ' + err);
      process.exit(1);
    });
    consumer.on('consumer error', err => {
      console.log(err);
      process.exit(1);
    });
    stream.on('data', function(data) {
      console.log(data);
    });
  })
  .catch(err => {
    console.error('A problem occurred');
    console.error(err);
    process.exit(1);
  });
```

package.json:

```json
{
  "name": "node-kafka-avro",
  "version": "1.0.0",
  "description": "",
  "main": "",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "kafka-avro": "^0.8.1",
    "sleep": "^5.1.1"
  }
}
```

When running
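The `avrotest-key` / `avrotest-value` subject names used in the curl commands above follow the Schema Registry's default topic-name strategy. A tiny helper (hypothetical, not part of kafka-avro) that derives the expected subject names from a topic name:

```javascript
// Derive the Schema Registry subject names that kafka-avro expects for a
// topic, following the "<topic>-key" / "<topic>-value" naming convention.
function subjectsForTopic(topicName) {
  return {
    keySubject: `${topicName}-key`,
    valueSubject: `${topicName}-value`,
  };
}

console.log(subjectsForTopic('avrotest'));
// { keySubject: 'avrotest-key', valueSubject: 'avrotest-value' }
```

If the subjects registered in your registry don't match these names for your topic, kafka-avro won't find a schema and the produce path fails.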
Hi,
I am trying out a fairly simple example. I have the following topic and Avro key and value schemas created:
When I attempt to produce a subsequent message with kafka-avro I get the following error:
Ready to use
A problem occurred when sending our message

```
Error: invalid "string": undefined
    at throwInvalidError (/mnt/d/dev/lambda-kafka/src/test2/node_modules/avsc/lib/types.js:2688:9)
    at StringType._write (/mnt/d/dev/lambda-kafka/src/test2/node_modules/avsc/lib/types.js:743:5)
    at RecordType.writeUser [as _write] (eval at RecordType._createWriter (/mnt/d/dev/lambda-kafka/src/test2/node_modules/avsc/lib/types.js:2005:10), :4:6)
    at RecordType.Type.encode (/mnt/d/dev/lambda-kafka/src/test2/node_modules/avsc/lib/types.js:294:8)
    at Object.magicByte.toMessageBuffer (/mnt/d/dev/lambda-kafka/src/test2/node_modules/kafka-avro/lib/magic-byte.js:29:18)
    at Ctor.Producer.serialize (/mnt/d/dev/lambda-kafka/src/test2/node_modules/kafka-avro/lib/kafka-producer.js:110:28)
    at Ctor.Producer._produceWrapper (/mnt/d/dev/lambda-kafka/src/test2/node_modules/kafka-avro/lib/kafka-producer.js:94:23)
    at kafkaAvro.getProducer.then.producer (/mnt/d/dev/lambda-kafka/src/test2/handler.js:28:30)
    at tryCatcher (/mnt/d/dev/lambda-kafka/src/test2/node_modules/bluebird/js/release/util.js:16:23)
    at Promise._settlePromiseFromHandler (/mnt/d/dev/lambda-kafka/src/test2/node_modules/bluebird/js/release/promise.js:512:31)
    at Promise._settlePromise (/mnt/d/dev/lambda-kafka/src/test2/node_modules/bluebird/js/release/promise.js:569:18)
    at Promise._settlePromise0 (/mnt/d/dev/lambda-kafka/src/test2/node_modules/bluebird/js/release/promise.js:614:10)
    at Promise._settlePromises (/mnt/d/dev/lambda-kafka/src/test2/node_modules/bluebird/js/release/promise.js:693:18)
    at Async._drainQueue (/mnt/d/dev/lambda-kafka/src/test2/node_modules/bluebird/js/release/async.js:133:16)
    at Async._drainQueues (/mnt/d/dev/lambda-kafka/src/test2/node_modules/bluebird/js/release/async.js:143:10)
    at Immediate.Async.drainQueues (/mnt/d/dev/lambda-kafka/src/test2/node_modules/bluebird/js/release/async.js:17:14)
```
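Given schemas like the ones discussed in this thread (records whose only field is a required string `id`), the key and value passed to `producer.produce()` must each be objects of that shape; a bare string or an object missing `id` leaves the string field `undefined` and triggers exactly this error. A quick shape check (illustrative only, assuming that single-field record schema):

```javascript
// Illustrative check: does a payload match a record schema whose only
// field is a required string named "id"?
function matchesIdRecord(payload) {
  return payload !== null &&
    typeof payload === 'object' &&
    typeof payload.id === 'string';
}

console.log(matchesIdRecord({ id: '42' }));   // true  -> encodes fine
console.log(matchesIdRecord('42'));           // false -> bare string, not a record
console.log(matchesIdRecord({ name: '42' })); // false -> missing "id" field
```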
Here is the code:
Thanks