Debezium / DBZ-7536

Schema backward compatibility broken on adding column of type text array with default value


      In order to make your issue reports as actionable as possible, please provide the following information, depending on the issue type.

      Bug report

      For bug reports, provide this information, please:

      What Debezium connector do you use and what version?

      Any version

      What behaviour do you expect?

Since we are adding a column with a default value, the change should be backward compatible.

      What behaviour do you see?

The connector fails with the following exception:

      [2024-02-12 20:07:04,869] ERROR [debezium-postgres-source|task-0] WorkerSourceTask{id=debezium-postgres-source-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:237)
      org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
      	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:244)
      	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:166)
      	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.convertTransformedRecord(AbstractWorkerSourceTask.java:513)
      	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.sendRecords(AbstractWorkerSourceTask.java:408)
      	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.execute(AbstractWorkerSourceTask.java:373)
      	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:229)
      	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:284)
      	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.run(AbstractWorkerSourceTask.java:80)
      	at org.apache.kafka.connect.runtime.isolation.Plugins.lambda$withClassLoader$1(Plugins.java:237)
      	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
      	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
      	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
      	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
      	at java.base/java.lang.Thread.run(Thread.java:829)
      Caused by: org.apache.kafka.common.config.ConfigException: Failed to access Avro data from topic asgard.public.customers-raw : Schema being registered is incompatible with an earlier schema for subject "asgard.public.customers-raw-value", details: [{errorType:'MISSING_UNION_BRANCH', description:'The new schema is missing a type inside a union field at path '/fields/0/type/1' in the old schema', additionalInfo:'reader union lacking writer type: RECORD'}, {errorType:'MISSING_UNION_BRANCH', description:'The new schema is missing a type inside a union field at path '/fields/1/type/1' in the old schema', additionalInfo:'reader union lacking writer type: RECORD'}, {oldSchemaVersion: 1}, {oldSchema: '{"type":"record","name":"Envelope","namespace":"asgard.public.customers","fields":[{"name":"before","type":["null",{"type":"record","name":"Value","fields":[{"name":"id","type":{"type":"int","connect.default":0},"default":0},{"name":"first_name","type":["null","string"],"default":null},{"name":"last_name","type":["null","string"],"default":null},{"name":"email","type":["null","string"],"default":null},{"name":"gender","type":["null","string"],"default":null},{"name":"club_status","type":["null","string"],"default":null},{"name":"comments","type":["null","string"],"default":null},{"name":"create_ts","type":[{"type":"long","connect.default":0,"connect.name":"io.debezium.time.MicroTimestamp","connect.version":1},"null"],"default":0},{"name":"update_ts","type":[{"type":"long","connect.default":0,"connect.name":"io.debezium.time.MicroTimestamp","connect.version":1},"null"],"default":0},{"name":"arraytext","type":{"type":"array","items":["null","string"]}}],"connect.name":"asgard.public.customers.Value"}],"default":null},{"name":"after","type":["null","Value"],"default":null},{"name":"source","type":{"type":"record","name":"Source","namespace":"io.debezium.connector.postgresql","fields":[{"name":"version","type":"string"},{"name":"connector","type":"string"},{"name":"name","type":"string"},{"name":"ts_ms","type":"long"},{"name":"snapshot","type":[{"type":"string","connect.default":"false","connect.name":"io.debezium.data.Enum","connect.parameters":{"allowed":"true,last,false,incremental"},"connect.version":1},"null"],"default":"false"},{"name":"db","type":"string"},{"name":"sequence","type":["null","string"],"default":null},{"name":"schema","type":"string"},{"name":"table","type":"string"},{"name":"txId","type":["null","long"],"default":null},{"name":"lsn","type":["null","long"],"default":null},{"name":"xmin","type":["null","long"],"default":null}],"connect.name":"io.debezium.connector.postgresql.Source"}},{"name":"op","type":"string"},{"name":"ts_ms","type":["null","long"],"default":null},{"name":"transaction","type":["null",{"type":"record","name":"block","namespace":"event","fields":[{"name":"id","type":"string"},{"name":"total_order","type":"long"},{"name":"data_collection_order","type":"long"}],"connect.name":"event.block","connect.version":1}],"default":null}],"connect.name":"asgard.public.customers.Envelope","connect.version":1}'}, {validateFields: 'false', compatibility: 'BACKWARD'}]; error code: 409
      	at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:112)
      	at org.apache.kafka.connect.runtime.AbstractWorkerSourceTask.lambda$convertTransformedRecord$9(AbstractWorkerSourceTask.java:513)
      	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:190)
      	at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:224)
      	... 13 more
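      The MISSING_UNION_BRANCH errors above come from Schema Registry's BACKWARD compatibility rule: every branch of a union in the previously registered (writer) schema must still be present in the newly registered (reader) schema. A toy sketch of that rule follows; it is not Schema Registry's actual checker, and the union contents are simplified stand-ins for the real Envelope schemas:

```python
# Toy illustration of the MISSING_UNION_BRANCH check: a writer-union branch
# (here the nested "Value" record) that has no counterpart in the reader
# union makes the new schema backward-incompatible.

old_union = ["null", {"type": "record", "name": "Value"}]  # previously registered field type
new_union = ["null", "string"]                             # hypothetical new field type

def missing_branches(old, new):
    """Return writer-union branches absent from the reader union."""
    def key(branch):
        # Named types (records) compare by name; primitives by type string.
        return branch["name"] if isinstance(branch, dict) else branch
    reader_keys = {key(b) for b in new}
    return [key(b) for b in old if key(b) not in reader_keys]

print(missing_branches(old_union, new_union))  # ['Value']
```

      In the log above, the branch the registry cannot find is the nested Value RECORD inside the before/after unions (paths /fields/0/type/1 and /fields/1/type/1).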
      
      

Do you see the same behaviour using the latest released Debezium version?

      (Ideally, also verify with latest Alpha/Beta/CR version)

      Yes

      How to reproduce the issue using our tutorial deployment?

      1. Start a Postgres connector with the Avro converter.
      2. Add a new column of type text[] with default '{}'.
      3. Insert a row without providing a value for the new column and observe that the connector breaks.
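      For steps 2 and 3, the SQL might look like the following (the table and column names here are assumptions, modeled on the tutorial's customers table):

```sql
-- Step 2: add a text[] column with an empty-array default.
ALTER TABLE customers ADD COLUMN tags text[] DEFAULT '{}';

-- Step 3: insert a row without supplying a value for the new column,
-- so the default applies; the connector then fails on the emitted event.
INSERT INTO customers (id, first_name) VALUES (1001, 'Anne');
```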

      Feature request or enhancement

      For feature requests or enhancements, provide this information, please:

      Which use case/requirement will be addressed by the proposed feature?

      Fixing this bug.

            Assignee: Unassigned
            Reporter: Animesh Kumar (akanimesh7)