  Debezium / DBZ-1161

PostgreSQL snapshot with a table that has > 8192 records hangs


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Done
    • Affects Version/s: 0.9.2.Final
    • Fix Version/s: 0.9.3.Final
    • Component/s: postgresql-connector
    • Labels:
      None
    • Environment:

      Running the kafka-connect Docker image from Confluent, on Kubuntu Linux 18.10.

    • Steps to Reproduce:

      Every table I have with more than 8192 rows hangs during the snapshot. It appears to hang on the rs.next() line in this function in RecordsSnapshotProducer.java (see the sketch after the code block):

          private void readTable(TableId tableId, ResultSet rs,
                                 BlockingConsumer<ChangeEvent> consumer,
                                 AtomicInteger rowsCounter) throws SQLException, InterruptedException {
              Table table = schema().tableFor(tableId);
              assert table != null;
              final int numColumns = table.columns().size();
              final Object[] row = new Object[numColumns];
              final ResultSetMetaData metaData = rs.getMetaData();
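              // Reporter's observation: with tables larger than 8192 rows,
              // the snapshot blocks indefinitely inside rs.next() below.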
              while (rs.next()) {
                  rowsCounter.incrementAndGet();
                  sendCurrentRecord(consumer);
                  for (int i = 0, j = 1; i != numColumns; ++i, ++j) {
                      row[i] = valueForColumn(rs, j, metaData);
                  }
                  generateReadRecord(tableId, row);
              }
          }
      
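      For context, 8192 happens to be the default value of Debezium's max.queue.size setting, which bounds the internal queue between the snapshot producer and the connector's polling loop. One plausible reading of the symptom (an assumption on my part, not a confirmed diagnosis) is that the producer fills that queue and then blocks because nothing drains it. The standalone sketch below, using a hypothetical class name, shows how that failure mode presents: put() into a full ArrayBlockingQueue of capacity 8192 blocks forever when no consumer runs, which from the outside looks like a hang on row 8193.

          import java.util.concurrent.ArrayBlockingQueue;
          import java.util.concurrent.BlockingQueue;

          // Minimal sketch (hypothetical name QueueHangSketch): a producer
          // feeding a bounded queue of capacity 8192 blocks as soon as the
          // queue is full, because no consumer ever drains it.
          public class QueueHangSketch {
              public static void main(String[] args) throws InterruptedException {
                  BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(8192);
                  for (int row = 1; ; row++) {
                      queue.put(row); // blocks on row 8193: nobody consumes
                      if (row % 1000 == 0) {
                          System.out.println("enqueued " + row + " rows");
                      }
                  }
              }
          }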

      Description

      As soon as the Postgres connector encounters a table that has more than 8192 rows, the snapshot hangs without any errors. Given how suspiciously exact that number is (8192 = 2^13), is there some kind of configuration setting I am missing?
      Note that this did not happen with the same configuration in 0.8.
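
      On the configuration question: Debezium connectors do expose max.queue.size (default 8192) and max.batch.size (default 2048). If the hang really is a full, undrained internal queue, raising the queue size would only move the point at which the snapshot stalls, so the sketch below is a diagnostic experiment rather than a fix; the property names are real Debezium settings, while the class name and values are illustrative.

          import java.util.Properties;

          // Illustrative only: real Debezium property names, hypothetical class.
          // Raising max.queue.size delays a queue-full hang; it does not fix it.
          public class ConnectorConfigSketch {
              public static Properties snapshotConfig() {
                  Properties props = new Properties();
                  props.setProperty("connector.class",
                          "io.debezium.connector.postgresql.PostgresConnector");
                  props.setProperty("max.queue.size", "16384"); // default 8192
                  props.setProperty("max.batch.size", "2048");  // default value
                  return props;
              }
          }

      If the snapshot then stalls near row 16384 instead of 8192, that points at the queue rather than the JDBC fetch itself.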


              People

              • Assignee: Jiri Pechanec (jpechanec)
              • Reporter: Sandy Place (splace)
              • Votes: 1
              • Watchers: 3

                Dates

                • Created:
                • Updated:
                • Resolved: