Debezium / DBZ-694

Error on initial load for records with negative timestamp


Details

    Steps to Reproduce:

      1. Have a PostgreSQL instance with the wal2json plugin (I used PostgreSQL 10.3)
      2. Stand up a PostgreSQL connector; my config:
      {
      "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
      "plugin.name": "wal2json",
      "tasks.max": "1",
      "database.hostname": "**",
      "database.port": "5432",
      "database.user": "**",
      "database.password": "**",
      "database.dbname": "postgres",
      "database.server.name": "elasticsearch",
      "table.whitelist": "public.dbo.testmssqltable",
      "decimal.handling.mode": "double"
      }
      3. Create a source table with a column of timestamp data type.
      4. Insert a record whose timestamp value lies before the Unix epoch and has a fractional-second component, e.g.: 1936-10-25 22:10:12.608 (see the JDBC sketch after these steps).
      5. An error like the following is thrown:
      [2018-05-10 08:45:45,860] ERROR Failed to properly convert data value for 'public.dbo.testmssqltable.DateTime_Test' of type timestamp for row [..., 1918-08-20 02:30:08.123, ...]: (io.debezium.relational.TableSchemaBuilder)
      kafka-connect_1 | java.time.DateTimeException: Invalid value for NanoOfSecond (valid values 0 - 999999999): -877000000
      kafka-connect_1 | at java.time.temporal.ValueRange.checkValidValue(ValueRange.java:311)
      kafka-connect_1 | at java.time.temporal.ChronoField.checkValidValue(ChronoField.java:703)
      kafka-connect_1 | at java.time.LocalTime.of(LocalTime.java:342)
      kafka-connect_1 | at java.time.LocalDateTime.of(LocalDateTime.java:362)
      kafka-connect_1 | at io.debezium.time.Conversions.toLocalDateTime(Conversions.java:140)
      kafka-connect_1 | at io.debezium.time.NanoTimestamp.toEpochNanos(NanoTimestamp.java:68)
      kafka-connect_1 | at io.debezium.jdbc.JdbcValueConverters.convertTimestampToEpochNanos(JdbcValueConverters.java:496)
      kafka-connect_1 | at io.debezium.connector.postgresql.PostgresValueConverter.convertTimestampToEpochNanos(PostgresValueConverter.java:402)
      kafka-connect_1 | at io.debezium.jdbc.JdbcValueConverters.lambda$converter$24(JdbcValueConverters.java:320)
      kafka-connect_1 | at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$2(TableSchemaBuilder.java:197)
      kafka-connect_1 | at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:111)
      kafka-connect_1 | at io.debezium.connector.postgresql.RecordsSnapshotProducer.generateReadRecord(RecordsSnapshotProducer.java:311)
      kafka-connect_1 | at io.debezium.connector.postgresql.RecordsSnapshotProducer.readTable(RecordsSnapshotProducer.java:270)
      kafka-connect_1 | at io.debezium.connector.postgresql.RecordsSnapshotProducer.lambda$takeSnapshot$6(RecordsSnapshotProducer.java:195)
      kafka-connect_1 | at io.debezium.jdbc.JdbcConnection.queryWithBlockingConsumer(JdbcConnection.java:407)
      kafka-connect_1 | at io.debezium.connector.postgresql.RecordsSnapshotProducer.takeSnapshot(RecordsSnapshotProducer.java:193)
      kafka-connect_1 | at io.debezium.connector.postgresql.RecordsSnapshotProducer.lambda$start$0(RecordsSnapshotProducer.java:80)
      kafka-connect_1 | at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1626)
      kafka-connect_1 | at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      kafka-connect_1 | at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      kafka-connect_1 | at java.lang.Thread.run(Thread.java:745)

      You can also perform steps 3 and 4 before starting the connector; I observe the same behaviour.
      All records that follow appear to be streamed correctly.

      If neither the initial records nor the records inserted in the first batch contain such a value (a pre-epoch timestamp), the error does not occur and everything is streamed correctly.
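
      For reference, a minimal JDBC sketch of steps 3 and 4. The connection URL, credentials, and class name are placeholders (not from the original report); the table and column names follow the config above, and the timestamp literal is one of the pre-epoch values mentioned:

      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.PreparedStatement;
      import java.sql.Statement;
      import java.sql.Timestamp;

      public class Dbz694Repro {
          public static void main(String[] args) throws Exception {
              // Placeholder connection details; point these at your own instance.
              try (Connection conn = DriverManager.getConnection(
                      "jdbc:postgresql://localhost:5432/postgres", "postgres", "secret")) {
                  try (Statement st = conn.createStatement()) {
                      // Step 3: a source table with a timestamp column.
                      st.execute("CREATE TABLE IF NOT EXISTS testmssqltable ("
                              + "id serial PRIMARY KEY, \"DateTime_Test\" timestamp)");
                  }
                  // Step 4: a timestamp before the Unix epoch, with fractional seconds.
                  try (PreparedStatement ps = conn.prepareStatement(
                          "INSERT INTO testmssqltable (\"DateTime_Test\") VALUES (?)")) {
                      ps.setTimestamp(1, Timestamp.valueOf("1936-10-25 22:10:12.608"));
                      ps.executeUpdate();
                  }
              }
          }
      }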


    Description

      Hi, I need help with the Debezium Postgres source connector (or confirmation that what I observe is a bug).
      I see this error:
      ERROR Failed to properly convert data value for 'public.dbo.testmssqltable.DateTime_Test' of type timestamp for row [...]: (io.debezium.relational.TableSchemaBuilder)
      java.time.DateTimeException: Invalid value for NanoOfSecond (valid values 0 - 999999999): -57000000
      at java.time.temporal.ValueRange.checkValidValue(ValueRange.java:311)
      at java.time.temporal.ChronoField.checkValidValue(ChronoField.java:703)
      at java.time.LocalTime.of(LocalTime.java:342)
      at java.time.LocalDateTime.of(LocalDateTime.java:362)
      at io.debezium.time.Conversions.toLocalDateTime(Conversions.java:140)
      at io.debezium.time.NanoTimestamp.toEpochNanos(NanoTimestamp.java:68)
      at io.debezium.jdbc.JdbcValueConverters.convertTimestampToEpochNanos(JdbcValueConverters.java:479)
      but only during the initial load or for the first batch of streamed records.
      All records that follow are streamed correctly (including those with negative-epoch timestamps).
      Moreover, if the initial records contain no negative timestamp, this error does not appear at all.

      The result is a streamed record with no value in this timestamp column, which prevents us from using this connector.
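
      The stack trace points at the conversion of the timestamp to epoch nanoseconds. Below is a minimal, self-contained sketch of the failure mode, not Debezium's actual code: splitting a negative epoch-microsecond value with Java's truncating / and % produces a negative remainder, which java.time rejects as a NanoOfSecond. For the 1918-08-20 02:30:08.123 row above, the .123 s fraction becomes 123000 - 1000000 = -877000 µs, i.e. exactly the -877000000 ns in the log. A floor-based split (Math.floorDiv / Math.floorMod) keeps the component in range:

      import java.time.Instant;
      import java.time.LocalDateTime;
      import java.time.ZoneOffset;

      public class NegativeEpochDemo {
          public static void main(String[] args) {
              // 1936-10-25 22:10:12.608 UTC lies before the Unix epoch,
              // so its epoch-microsecond value is negative.
              long epochMicros = Instant.parse("1936-10-25T22:10:12.608Z")
                      .toEpochMilli() * 1_000L;

              // Truncating split: the remainder takes the sign of the dividend,
              // so the nano-of-second component comes out negative.
              long badSeconds = epochMicros / 1_000_000L;
              int badNanos = (int) (epochMicros % 1_000_000L) * 1_000;
              try {
                  LocalDateTime.ofEpochSecond(badSeconds, badNanos, ZoneOffset.UTC);
              } catch (java.time.DateTimeException e) {
                  // "Invalid value for NanoOfSecond (valid values 0 - 999999999): -392000000"
                  System.out.println(e.getMessage());
              }

              // Floor-based split keeps nano-of-second in [0, 999999999].
              long seconds = Math.floorDiv(epochMicros, 1_000_000L);
              int nanos = (int) Math.floorMod(epochMicros, 1_000_000L) * 1_000;
              System.out.println(LocalDateTime.ofEpochSecond(seconds, nanos, ZoneOffset.UTC));
              // Prints 1936-10-25T22:10:12.608
          }
      }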


          People

            Assignee: Jiri Pechanec (jpechane)
            Reporter: Przemek Pawlowski (przemekak)
