I am currently working with the tKafka components, and everything works in the success scenario, but I am facing a lot of issues with error handling. For example, in the advanced settings I set acks=all and retries=10, but neither of these seems to be taken into account. I tested a scenario where I brought down a broker with acks=all, but the tKafka component didn't fail the transaction; instead it continued processing the next message and finished with success, and I lost messages during this transaction.
Is this a bug, or is there another way to configure these additional properties?
Some pointers on this topic would be really helpful.
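For reference, here is a minimal sketch of the producer-side properties in question, assuming the Talend component passes its "Additional properties" table straight through to the standard Kafka producer client (whether it actually does so is the open question in this thread; the broker hosts below are placeholders):

```java
import java.util.Properties;

public class ProducerConfigSketch {
    public static void main(String[] args) {
        // Hypothetical sketch: the client-level properties that would need
        // to reach the Kafka producer for acks/retries to take effect.
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092,broker2:9092"); // placeholder hosts
        props.put("acks", "all");   // wait for all in-sync replicas before acknowledging
        props.put("retries", "10"); // retry transient send failures
        // Keep ordering stable across retries (otherwise retried batches can reorder).
        props.put("max.in.flight.requests.per.connection", "1");
        System.out.println(props.getProperty("acks") + " " + props.getProperty("retries"));
    }
}
```

If the component silently ignores these entries, one workaround is to verify the effective configuration in the job's log output, since the Kafka client logs its resolved ProducerConfig at startup.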
For further information about the consumer properties you can define, please see the consumer configuration section of Kafka's documentation at http://kafka.apache.org/documentation.html#consumerconfigs.
I am getting the following error:
"Exception in component tKafkaInput_1
kafka.common.MessageSizeTooLargeException: Found a message larger than the maximum fetch size of this consumer on topic"
I have increased my fetch size as attached. Please assist.
We see that this topic is marked as resolved.
Is the solution "clear your additional Kafka properties as in your attached config and try using the "fetch.message.max.bytes" parameter" OK with you?
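For readers hitting the same MessageSizeTooLargeException, here is a minimal sketch of the consumer property involved, assuming the 0.8-era high-level consumer (where the setting is fetch.message.max.bytes; in the 0.9+ client the rough equivalent is max.partition.fetch.bytes). The ZooKeeper host and group id are placeholders:

```java
import java.util.Properties;

public class ConsumerFetchSizeSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("zookeeper.connect", "zkhost:2181"); // placeholder, 0.8-style consumer
        props.put("group.id", "example-group");        // placeholder group id
        // Must be at least as large as the largest message on the topic
        // (and consistent with the broker's message.max.bytes limit).
        props.put("fetch.message.max.bytes", String.valueOf(10 * 1024 * 1024)); // 10 MiB
        System.out.println(props.getProperty("fetch.message.max.bytes"));
    }
}
```

The key point is that the exception is raised when a single message exceeds this per-consumer limit, so the value must be raised above the size of the largest message rather than just increased incrementally.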
What Kafka API version are you using? The Kafka API changed between 0.8 and 0.9.