rmoff's random ramblings

jdbc sink

Mar 12, 2021

Kafka Connect JDBC Sink deep-dive: Working with Primary Keys

The Kafka Connect JDBC Sink can be used to stream data from a Kafka topic to a database such as Oracle, Postgres, MySQL, DB2, etc.

It supports many permutations of configuration for how primary keys are handled. The documentation details these options; this article aims to illustrate and expand on them.
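As a sketch of the kind of configuration involved (the connection URL, topic, and column name here are assumptions, not taken from the post): `pk.mode` selects where the key comes from (`none`, `kafka`, `record_key`, or `record_value`), and `pk.fields` names the field(s) to use.

```json
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "connection.url": "jdbc:mysql://mysql:3306/demo",
  "topics": "orders",
  "pk.mode": "record_value",
  "pk.fields": "order_id",
  "insert.mode": "upsert"
}
```

Note that `insert.mode=upsert` requires a primary key to be defined, which is one reason these settings matter.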

Mar 11, 2021

Kafka Connect - SQLSyntaxErrorException: BLOB/TEXT column … used in key specification without a key length

I got the error SQLSyntaxErrorException: BLOB/TEXT column 'MESSAGE_KEY' used in key specification without a key length with Kafka Connect JDBC Sink connector (v10.0.2) and MySQL (8.0.23)
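The error arises because MySQL cannot index a BLOB/TEXT column without an explicit key length. One general workaround (not necessarily the fix the post itself describes) is to pre-create the target table with a bounded VARCHAR key and set `auto.create=false` on the connector, so it doesn't auto-create the table with a TEXT key. Table and column names below are assumptions for illustration:

```sql
-- Pre-create the target table so the primary key column has a bounded
-- length, instead of letting the connector auto-create it as TEXT
CREATE TABLE test_topic (
  MESSAGE_KEY   VARCHAR(255) NOT NULL,
  MESSAGE_VALUE TEXT,
  PRIMARY KEY (MESSAGE_KEY)
);
```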

Feb 25, 2020

Kafka Connect JDBC Sink - setting the key field name

I wanted to get some data from a Kafka topic:

ksql> PRINT PERSON_STATS FROM BEGINNING;
Key format: KAFKA (STRING)
Value format: AVRO
rowtime: 2/25/20 1:12:51 PM UTC, key: robin, value: {"PERSON": "robin", "LOCATION_CHANGES":1, "UNIQUE_LOCATIONS": 1}

into Postgres, so did the easy thing and used Kafka Connect with the JDBC Sink connector.
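This excerpt doesn't show the post's solution, but with the JDBC Sink a primitive message key (like the `robin` string above) can be written to a named column using `pk.mode=record_key` with `pk.fields` giving the column name. A minimal sketch, assuming the Postgres URL and database name:

```json
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "topics": "PERSON_STATS",
  "connection.url": "jdbc:postgresql://postgres:5432/demo",
  "pk.mode": "record_key",
  "pk.fields": "PERSON"
}
```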

Oct 15, 2019

Skipping bad records with the Kafka Connect JDBC sink connector

The Kafka Connect framework provides generic error handling and dead-letter queue capabilities, but these only cover problems with [de]serialisation and Single Message Transforms. When it comes to errors that a connector may encounter doing the actual pull or put of data from the source/target system, it’s down to the connector itself to implement logic around that. For example, the Elasticsearch sink connector provides a configuration option (behavior.on.malformed.documents) that can be set so that a single bad record won’t halt the pipeline. Others, such as the JDBC Sink connector, don’t provide this yet. That means that if you hit this problem, you need to unblock it yourself. One way is to manually move the consumer’s offset past the bad message.

TL;DR: You can use kafka-consumer-groups --reset-offsets --to-offset <x> to manually move the connector past a bad message.
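A sketch of that reset (the group name, topic, partition, and target offset here are assumptions): a sink connector consumes as the group `connect-<connector-name>`, and the connector must be stopped before the group's offsets can be reset.

```shell
# Stop the connector first, then skip the bad record (say, at offset 41)
# by resetting the sink's consumer group to the next offset
kafka-consumer-groups \
  --bootstrap-server localhost:9092 \
  --group connect-jdbc-sink \
  --topic my_topic:0 \
  --reset-offsets \
  --to-offset 42 \
  --execute
```

Restart the connector afterwards and it will resume from the new offset, leaving the bad message behind.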

Oct 11, 2018

Flatten CDC records in KSQL

The problem - nested messages in Kafka

Data comes into Kafka in many shapes and sizes. Sometimes it’s from CDC tools, and may be nested like this:


Robin Moffatt

Robin Moffatt works on the DevRel team at Confluent. He likes writing about himself in the third person, eating good breakfasts, and drinking good beer.


© 2025