Debezium / DBZ-1229

SQL Server Connector - Error registering Avro schema


    • Type: Bug
    • Resolution: Unresolved
    • Priority: Major
    • Fix Version/s: None
    • Affects Version/s: 0.9.3.Final
    • Component/s: sqlserver-connector
    • Labels: None

      SQL:

      /* casino */                                                            
      USE casino
      GO
      SELECT db_id()[DB_ID],@@SERVERNAME[SERVERNAME],DB_Name()[DB_Name]
      GO
      IF (object_id('tb_bob') is not null)
        DROP TABLE tb_bob
      
      CREATE TABLE tb_bob (
        ID INT identity Primary Key
        ,lastname NVARCHAR(100)
        ,Firstname NVARCHAR(100)
        )
      
      EXECUTE sys.sp_cdc_enable_table  
          @source_schema = N'dbo'  
        , @source_name = N'tb_bob'  
        , @role_name = NULL;  
      GO  
      INSERT INTO tb_bob (
        Lastname
        ,Firstname
        )
      VALUES (
        'Smith'
        ,'John'
        )
        ,(
        'Flintstone'
        ,'Fred'
        )
      GO
      select * from cdc.dbo_tb_bob_CT
      GO
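
      As a sanity check, a minimal T-SQL sketch to confirm CDC is actually enabled for the database and table used above (assuming the same casino database and dbo.tb_bob table):

      USE casino
      GO
      -- CDC must be enabled at the database level
      SELECT name, is_cdc_enabled FROM sys.databases WHERE name = 'casino'
      GO
      -- The table itself must be tracked by CDC
      SELECT name, is_tracked_by_cdc FROM sys.tables WHERE name = 'tb_bob'
      GO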
      

      DEBEZIUM/Kafka:
      1) unpack confluent package to /opt/kafka
      2) unpack debezium package to /opt/kafka/confluent-5.2.1/share/java/
      3) confluent start connect
      4) curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" localhost:8083/connectors/ -d '{ "name": "CasinoDB", "config":
      {
      "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector"
      , "database.hostname": "10.1.176.107"
      , "database.port": "1402"
      , "database.user": "sa"
      , "database.password": "!@#$%A1"
      , "database.dbname": "casino"
      , "database.server.name": "gamingdb1
      inst2"
      , "table.whitelist": "dbo.tb_bob"
      , "database.history.kafka.bootstrap.servers": "localhost:9092"
      , "database.history.kafka.topic": "dbhistory.casino"
      }
      }'
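
      After posting the config, the connector and task state can be checked through the Connect REST API (a minimal sketch, assuming the default Connect port 8083 and the connector name CasinoDB from the config above):

      # list registered connectors
      curl -s localhost:8083/connectors
      # show connector and task status for this connector
      curl -s localhost:8083/connectors/CasinoDB/status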


      error in connect.stdout -->

      [2019-04-12 11:15:13,796] ERROR WorkerSourceTask{id=CasinoDB-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)
      

      After running the steps to reproduce, the topic (dbhistory.casino) is created and populated with information from tb_bob (good news), but I get the following output from the connect log:
      roryb@FPRORBOLL1:/tmp/confluent.0YrgUFdz/connect$ cat connect.stdout |grep error

      [2019-04-12 11:15:05,229] INFO Requested thread factory for connector SqlServerConnector, id = gamingdb1\inst2 named = error-handler (io.debezium.util.Threads:249)
      org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
              at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178)
              at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
      Caused by: org.apache.kafka.connect.errors.DataException: Failed to serialize Avro data from topic gamingdb1_inst2.dbo.tb_bob :
              at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128)
              at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162)
      Caused by: org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: {"type":"record","name":"Key","namespace":"gamingdb1_inst2.dbo.tb_bob","fields":[{"name":"ID","type":"int"}],"connect.name":"gamingdb1_inst2.dbo.tb_bob.Key"}
       at [Source: (sun.net.www.protocol.http.HttpURLConnection$HttpInputStream); line: 1, column: 1]; error code: 50005
              at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128)
              at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162)
              at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
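
      The failure happens when the Avro converter tries to register the key schema with the Schema Registry. In case it helps, a sketch for checking the registry directly (assuming the default Schema Registry address localhost:8081; the subject name follows the default topic-name strategy for the topic in the error above):

      # confirm the Schema Registry is reachable and list existing subjects
      curl -s localhost:8081/subjects
      # inspect the latest key schema for the affected topic, if any was registered
      curl -s localhost:8081/subjects/gamingdb1_inst2.dbo.tb_bob-key/versions/latest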
      

      I have attached the full file. Please let me know if there is anything else I should include.

      Thanks
      Rory

            Assignee: Unassigned
            Reporter: Rory Bolle (phalynx)
            Votes: 0
            Watchers: 2
