2019-09-19 08:30:18,347 INFO Postgres|audit|records-stream-producer Different column count 4 present in the server message as schema in memory contains 3; refreshing table schema [io.debezium.connector.postgresql.RecordsStreamProducer]
2019-09-19 08:30:18,467 ERROR Postgres|audit|records-stream-producer Failed to properly convert key value for 'schema.a_table.a_column' of type varchar for row [50, 424439ef-bb53-455f-a9cd-fbb228f477e8, null, null]: [io.debezium.relational.TableSchemaBuilder]
org.apache.kafka.connect.errors.DataException: Invalid value: null used for required field: "a_column", schema type: STRING
	at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:220)
	at org.apache.kafka.connect.data.Struct.put(Struct.java:216)
	at io.debezium.relational.TableSchemaBuilder.lambda$createKeyGenerator$1(TableSchemaBuilder.java:174)
	at io.debezium.relational.TableSchema.keyFromColumnData(TableSchema.java:124)
	at io.debezium.connector.postgresql.RecordsStreamProducer.generateUpdateRecord(RecordsStreamProducer.java:325)
	at io.debezium.connector.postgresql.RecordsStreamProducer.process(RecordsStreamProducer.java:264)
	at io.debezium.connector.postgresql.RecordsStreamProducer.lambda$streamChanges$1(RecordsStreamProducer.java:133)
	at io.debezium.connector.postgresql.connection.wal2json.NonStreamingWal2JsonMessageDecoder.processMessage(NonStreamingWal2JsonMessageDecoder.java:62)
	at io.debezium.connector.postgresql.connection.PostgresReplicationConnection$1.deserializeMessages(PostgresReplicationConnection.java:269)
	at io.debezium.connector.postgresql.connection.PostgresReplicationConnection$1.read(PostgresReplicationConnection.java:254)
	at io.debezium.connector.postgresql.RecordsStreamProducer.streamChanges(RecordsStreamProducer.java:133)
	at io.debezium.connector.postgresql.RecordsStreamProducer.lambda$start$0(RecordsStreamProducer.java:119)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
2019-09-19 08:30:18,556 WARN || [Producer clientId=producer-5] Error while fetching metadata with correlation id 41 : {audit.topic1=LEADER_NOT_AVAILABLE} [org.apache.kafka.clients.NetworkClient]
2019-09-19 08:30:18,711 WARN || [Producer clientId=producer-5] Error while fetching metadata with correlation id 43 : {audit.topic2=LEADER_NOT_AVAILABLE} [org.apache.kafka.clients.NetworkClient]
2019-09-19 08:30:18,830 INFO || WorkerSourceTask{id=debezium-connector-0} Committing offsets [org.apache.kafka.connect.runtime.WorkerSourceTask]
2019-09-19 08:30:18,831 INFO || WorkerSourceTask{id=debezium-connector-0} flushing 2 outstanding messages for offset commit [org.apache.kafka.connect.runtime.WorkerSourceTask]
2019-09-19 08:30:18,944 INFO || WorkerSourceTask{id=debezium-connector-0} Finished commitOffsets successfully in 109 ms [org.apache.kafka.connect.runtime.WorkerSourceTask]
2019-09-19 08:30:18,952 ERROR || WorkerSourceTask{id=debezium-connector-0} Task threw an uncaught and unrecoverable exception [org.apache.kafka.connect.runtime.WorkerTask]
org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178)
	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
	at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(WorkerSourceTask.java:267)
	at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:294)
	at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:229)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.DataException: Conversion error: null value for field that is required and has no default value
	at org.apache.kafka.connect.json.JsonConverter.convertToJson(JsonConverter.java:589)
	at org.apache.kafka.connect.json.JsonConverter.convertToJson(JsonConverter.java:683)
	at org.apache.kafka.connect.json.JsonConverter.convertToJsonWithEnvelope(JsonConverter.java:570)
	at org.apache.kafka.connect.json.JsonConverter.fromConnectData(JsonConverter.java:324)
	at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$1(WorkerSourceTask.java:267)
	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128)
	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162)
	... 11 more
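The core failure in the first stack trace is the Kafka Connect schema check inside Struct.put: the key schema still marks the column as required (non-optional), so a null value for 'a_column' is rejected when Debezium builds the record key. A minimal sketch of that behaviour, using only the public Connect data API; the field name mirrors the log, while the class name RequiredFieldNullDemo is made up for illustration:

import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

public class RequiredFieldNullDemo {
    public static void main(String[] args) {
        // Key schema with one required (non-optional) STRING field and no default,
        // analogous to the in-memory key schema for schema.a_table in the log.
        Schema keySchema = SchemaBuilder.struct()
                .name("a_table.Key")
                .field("a_column", Schema.STRING_SCHEMA) // required, no default value
                .build();

        // Putting null into the required field goes through ConnectSchema.validateValue,
        // which throws the same DataException as in the log:
        // Invalid value: null used for required field: "a_column", schema type: STRING
        new Struct(keySchema).put("a_column", null);
    }
}

The second stack trace is the knock-on effect: JsonConverter.fromConnectData likewise refuses to serialize a null value for a required field without a default, and since the worker's errors.tolerance (which defaults to none) is exceeded, WorkerSourceTask stops with "Tolerance exceeded in error handler".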