[2018-03-14 22:13:56,911] INFO Kafka Connect distributed worker initializing ... (org.apache.kafka.connect.cli.ConnectDistributed:60) [2018-03-14 22:13:56,925] INFO WorkerInfo values: jvm.args = -Xmx256M, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dkafka.logs.dir=/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../logs, -Dlog4j.configuration=file:./bin/../etc/kafka/connect-log4j.properties jvm.spec = Oracle Corporation, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_73, 25.73-b02 jvm.classpath = /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/aopalliance-repackaged-2.5.0-b32.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/argparse4j-0.7.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/avro-1.8.2.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/commons-beanutils-1.8.3.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/commons-codec-1.9.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/commons-collections-3.2.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/commons-compress-1.8.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/commons-digester-1.8.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/commons-lang3-3.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/commons-lang3-3.5.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/commons-logging-1.2.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/commons-validator-1.4.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/connect-api-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/connect-file-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/connect-json-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/connect-runtime-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/connect-transforms-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/guava-20.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/hk2-api-2.5.0-b32.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/hk2-locator-2.5.0-b32.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/hk2-utils-2.5.0-b32.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/httpclient-4.5.2.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/httpcore-4.4.4.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/httpmime-4.5.2.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jackson-annotations-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jackson-core-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jackson-core-asl-1.9.13.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jackson-databind-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jackson-jaxrs-base-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jackson-jaxrs-json-provider-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jackson-mapper-asl-1.9.13.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jackson-module-jaxb-annotations-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/javassist-3.20.0-GA.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/javassis
t-3.21.0-GA.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/javax.annotation-api-1.2.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/javax.inject-1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/javax.inject-2.5.0-b32.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/javax.servlet-api-3.1.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/javax.ws.rs-api-2.0.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jersey-client-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jersey-common-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jersey-container-servlet-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jersey-container-servlet-core-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jersey-guava-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jersey-media-jaxb-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jersey-server-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jetty-continuation-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jetty-http-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jetty-io-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jetty-security-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jetty-server-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jetty-servlet-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jetty-servlets-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jetty-util-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/jopt-simple-5.0.4.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka-clients-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka-log4j-appender-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka-streams-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka-streams-examples-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka-tools-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka_2.11-1.0.0-cp1-javadoc.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka_2.11-1.0.0-cp1-scaladoc.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka_2.11-1.0.0-cp1-sources.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka_2.11-1.0.0-cp1-test-sources.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka_2.11-1.0.0-cp1-test.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/kafka_2.11-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/log4j-1.2.17.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/lz4-java-1.4.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/maven-artifact-3.5.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/metrics-core-2.2.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/osgi-resource-locator-1.0.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/paranamer-2.7.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/plexus-utils-3.0.24.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/reflections-0.9.1
1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/rocksdbjni-5.7.3.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/scala-library-2.11.11.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/slf4j-api-1.7.25.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/slf4j-log4j12-1.7.25.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/snappy-java-1.1.4.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/support-metrics-client-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/support-metrics-common-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/validation-api-1.1.0.Final.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/xz-1.5.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/zkclient-0.10.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/zookeeper-3.4.10.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common/build-tools-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common/common-config-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common/common-metrics-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common/common-utils-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common/jline-0.9.94.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common/log4j-1.2.17.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common/netty-3.10.5.Final.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common/slf4j-api-1.7.25.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common/zkclient-0.10.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common/zookeeper-3.4.10.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/avro-1.8.2.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/commons-compress-1.8.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/jackson-annotations-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/jackson-core-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/jackson-core-asl-1.9.13.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/jackson-databind-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/jackson-mapper-asl-1.9.13.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/kafka-avro-serializer-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/kafka-connect-avro-converter-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/kafka-json-serializer-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/kafka-schema-registry-client-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/paranamer-2.7.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/snappy-java-1.1.1.3.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/xz-1.5.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/aopalliance-repackaged-2.5.0-b32.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/argparse4j-0.7.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/avro-1.8.2.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/commons-beanutils-1.8.3.jar:/Users/rajat.gangwar/apl/conflue
nt-4.0.0/bin/../share/java/kafka/commons-codec-1.9.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/commons-collections-3.2.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/commons-compress-1.8.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/commons-digester-1.8.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/commons-lang3-3.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/commons-lang3-3.5.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/commons-logging-1.2.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/commons-validator-1.4.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/connect-api-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/connect-file-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/connect-json-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/connect-runtime-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/connect-transforms-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/guava-20.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/hk2-api-2.5.0-b32.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/hk2-locator-2.5.0-b32.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/hk2-utils-2.5.0-b32.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/httpclient-4.5.2.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/httpcore-4.4.4.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/httpmime-4.5.2.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jackson-annotations-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jackson-core-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jackson-core-asl-1.9.13.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jackson-databind-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jackson-jaxrs-base-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jackson-jaxrs-json-provider-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jackson-mapper-asl-1.9.13.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jackson-module-jaxb-annotations-2.9.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/javassist-3.20.0-GA.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/javassist-3.21.0-GA.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/javax.annotation-api-1.2.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/javax.inject-1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/javax.inject-2.5.0-b32.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/javax.servlet-api-3.1.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/javax.ws.rs-api-2.0.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jersey-client-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jersey-common-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jersey-container-servlet-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jersey-container-servlet-core-2.25.1.jar:/Users/raj
at.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jersey-guava-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jersey-media-jaxb-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jersey-server-2.25.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jetty-continuation-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jetty-http-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jetty-io-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jetty-security-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jetty-server-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jetty-servlet-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jetty-servlets-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jetty-util-9.2.22.v20170606.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/jopt-simple-5.0.4.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka-clients-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka-log4j-appender-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka-streams-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka-streams-examples-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka-tools-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1-javadoc.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1-scaladoc.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1-sources.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1-test-sources.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1-test.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/log4j-1.2.17.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/lz4-java-1.4.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/maven-artifact-3.5.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/metrics-core-2.2.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/osgi-resource-locator-1.0.1.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/paranamer-2.7.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/plexus-utils-3.0.24.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/reflections-0.9.11.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/rocksdbjni-5.7.3.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/scala-library-2.11.11.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/slf4j-api-1.7.25.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/slf4j-log4j12-1.7.25.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/snappy-java-1.1.4.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/support-metrics-client-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4
.0.0/bin/../share/java/kafka/support-metrics-common-4.0.0.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/validation-api-1.1.0.Final.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/xz-1.5.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/zkclient-0.10.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/kafka/zookeeper-3.4.10.jar:/Users/rajat.gangwar/apl/confluent-4.0.0/bin/../share/java/confluent-support-metrics/*:/usr/share/java/confluent-support-metrics/*
os.spec = Mac OS X, x86_64, 10.10.5
os.vcpus = 4 (org.apache.kafka.connect.runtime.WorkerInfo:71)
[2018-03-14 22:13:56,926] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectDistributed:69)
[2018-03-14 22:13:56,942] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/camus (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:13:59,324] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/camus/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:13:59,349] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:13:59,589] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/confluent-common/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:13:59,591] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/debezium-connector-mysql (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:13:59,815] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/debezium-connector-mysql/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:13:59,815] INFO Added plugin 'io.debezium.connector.mysql.MySqlConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:13:59,815] INFO Added plugin 'io.debezium.transforms.UnwrapFromEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:13:59,815] INFO Added plugin 'io.debezium.transforms.ByLogicalTableRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:13:59,827] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:14:02,051] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:14:02,052] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,052] INFO Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,054] INFO Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,054] INFO Added plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,054] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,054] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,054] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,055] INFO Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,055] INFO Added plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,055] INFO Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,055] INFO Added plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,055] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,055] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,056] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,056] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,056] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,056] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,056] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,056] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,056] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,057] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,057] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,057] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,057] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,057] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,057] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,057] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,057] INFO Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,057] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,058] INFO Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,058] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,058] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,059] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-connect-elasticsearch (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:14:02,364] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-connect-elasticsearch/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:14:02,365] INFO Added plugin 'io.confluent.connect.elasticsearch.ElasticsearchSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:02,365] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-connect-hdfs (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:14:07,500] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-connect-hdfs/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:14:07,500] INFO Added plugin 'io.confluent.connect.hdfs.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:07,501] INFO Added plugin 'io.confluent.connect.hdfs.HdfsSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:07,501] INFO Added plugin 'io.confluent.connect.storage.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:07,501] INFO Added plugin 'io.confluent.connect.avro.AvroConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:07,573] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-connect-jdbc (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:14:07,820] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-connect-jdbc/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:14:07,820] INFO Added plugin 'io.confluent.connect.jdbc.JdbcSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:07,820] INFO Added plugin 'io.confluent.connect.jdbc.JdbcSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:07,832] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-connect-s3 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:14:12,730] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-connect-s3/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:14:12,730] INFO Added plugin 'io.confluent.connect.s3.S3SinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:135)
[2018-03-14 22:14:12,767] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-connect-storage-common (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:14:17,708] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-connect-storage-common/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:14:17,788] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-rest (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:14:19,125] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-rest/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:14:19,131] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:14:19,423] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/kafka-serde-tools/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:14:19,425] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/rest-utils (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:14:19,875] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/rest-utils/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:14:19,884] INFO Loading plugin from: /Users/rajat.gangwar/apl/confluent-4.0.0/share/java/schema-registry (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
[2018-03-14 22:14:20,909] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/rajat.gangwar/apl/confluent-4.0.0/share/java/schema-registry/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:14:24,731] INFO Registered loader: sun.misc.Launcher$AppClassLoader@18b4aac2 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:202)
[2018-03-14 22:14:24,732] INFO Added aliases 'ElasticsearchSinkConnector' and 'ElasticsearchSink' to plugin 'io.confluent.connect.elasticsearch.ElasticsearchSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,733] INFO Added aliases 'HdfsSinkConnector' and 'HdfsSink' to plugin 'io.confluent.connect.hdfs.HdfsSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,733] INFO Added aliases 'JdbcSinkConnector' and 'JdbcSink' to plugin 'io.confluent.connect.jdbc.JdbcSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,733] INFO Added aliases 'JdbcSourceConnector' and 'JdbcSource' to plugin 'io.confluent.connect.jdbc.JdbcSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,734] INFO Added aliases 'S3SinkConnector' and 'S3Sink' to plugin 'io.confluent.connect.s3.S3SinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,734] INFO Added aliases 'MySqlConnector' and 'MySql' to plugin 'io.debezium.connector.mysql.MySqlConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,734] INFO Added aliases 'FileStreamSinkConnector' and 'FileStreamSink' to plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,734] INFO Added aliases 'FileStreamSourceConnector' and 'FileStreamSource' to plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,735] INFO Added aliases 'MockConnector' and 'Mock' to plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,735] INFO Added aliases 'MockSinkConnector' and 'MockSink' to plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,735] INFO Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,735] INFO Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,736] INFO Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,736] INFO Added aliases 'AvroConverter' and 'Avro' to plugin 'io.confluent.connect.avro.AvroConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,736] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,736] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,736] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:330)
[2018-03-14 22:14:24,738] INFO Added alias 'ByLogicalTableRouter' to plugin 'io.debezium.transforms.ByLogicalTableRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:327)
[2018-03-14 22:14:24,738] INFO Added alias 'UnwrapFromEnvelope' to plugin 'io.debezium.transforms.UnwrapFromEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:327)
[2018-03-14 22:14:24,739] INFO Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:327)
[2018-03-14 22:14:24,739] INFO Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:327)
[2018-03-14 22:14:24,740] INFO Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:327)
[2018-03-14 22:14:24,757] INFO DistributedConfig values:
    access.control.allow.methods =
    access.control.allow.origin =
    bootstrap.servers = [localhost:9092]
    client.id =
    config.storage.replication.factor = 1
    config.storage.topic = connect-configs
    connections.max.idle.ms = 540000
    group.id = connect-cluster
    heartbeat.interval.ms = 3000
    internal.key.converter = class org.apache.kafka.connect.json.JsonConverter
    internal.value.converter = class org.apache.kafka.connect.json.JsonConverter
    key.converter = class org.apache.kafka.connect.json.JsonConverter
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    offset.flush.interval.ms = 10000
    offset.flush.timeout.ms = 5000
    offset.storage.partitions = 25
    offset.storage.replication.factor = 1
    offset.storage.topic = connect-offsets
    plugin.path = [share/java]
    rebalance.timeout.ms = 60000
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 40000
    rest.advertised.host.name = null
    rest.advertised.port = null
    rest.host.name = null
    rest.port = 8083
    retry.backoff.ms = 100
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    status.storage.partitions = 5
    status.storage.replication.factor = 1
    status.storage.topic = connect-status
    task.shutdown.graceful.timeout.ms = 5000
    value.converter = class org.apache.kafka.connect.json.JsonConverter
    worker.sync.timeout.ms = 3000
    worker.unsync.backoff.ms = 300000
 (org.apache.kafka.connect.runtime.distributed.DistributedConfig:223)
[2018-03-14 22:14:24,923] INFO Logging initialized @28665ms (org.eclipse.jetty.util.log:186)
[2018-03-14 22:14:25,161] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109)
[2018-03-14 22:14:25,161] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110)
[2018-03-14 22:14:25,246] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109)
[2018-03-14 22:14:25,246] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110)
[2018-03-14 22:14:25,250] INFO Kafka Connect distributed worker initialization took 28335ms (org.apache.kafka.connect.cli.ConnectDistributed:91)
[2018-03-14 22:14:25,250] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect:49)
[2018-03-14 22:14:25,251] INFO Herder starting (org.apache.kafka.connect.runtime.distributed.DistributedHerder:203)
[2018-03-14 22:14:25,252] INFO Worker starting (org.apache.kafka.connect.runtime.Worker:142)
[2018-03-14 22:14:25,252] INFO Starting KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:108)
[2018-03-14 22:14:25,252] INFO Starting KafkaBasedLog with topic connect-offsets (org.apache.kafka.connect.util.KafkaBasedLog:124)
[2018-03-14 22:14:25,252] INFO Starting REST server (org.apache.kafka.connect.runtime.rest.RestServer:98)
[2018-03-14 22:14:25,257] INFO AdminClientConfig values:
    bootstrap.servers = [localhost:9092]
    client.id =
    connections.max.idle.ms = 300000
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 120000
    retries = 5
    retry.backoff.ms = 100
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
 (org.apache.kafka.clients.admin.AdminClientConfig:223)
[2018-03-14 22:14:25,276] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231)
[2018-03-14 22:14:25,276] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231)
[2018-03-14 22:14:25,276] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231)
[2018-03-14 22:14:25,276] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231)
[2018-03-14 22:14:25,277] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231)
[2018-03-14 22:14:25,277] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231)
[2018-03-14 22:14:25,277] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231)
[2018-03-14 22:14:25,277] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231)
[2018-03-14 22:14:25,277] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config.
(org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,277] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,278] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,278] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,278] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,278] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,278] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,278] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,278] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,279] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:25,279] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110) [2018-03-14 22:14:25,352] INFO jetty-9.2.22.v20170606 (org.eclipse.jetty.server.Server:327) [2018-03-14 22:14:25,615] INFO ProducerConfig values: acks = all batch.size = 16384 bootstrap.servers = [localhost:9092] buffer.memory = 33554432 client.id = compression.type = none confluent.batch.expiry.ms = 30000 connections.max.idle.ms = 540000 enable.idempotence = false interceptor.classes = null key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer 
(org.apache.kafka.clients.producer.ProducerConfig:223) [2018-03-14 22:14:25,646] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,646] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,649] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,649] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,649] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,650] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,650] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,650] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,650] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,650] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,650] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,650] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,651] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,651] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,651] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,651] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,651] WARN The configuration 'key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,651] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:25,651] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110) [2018-03-14 22:14:25,667] INFO ConsumerConfig values: auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [localhost:9092] check.crcs = true client.id = connections.max.idle.ms = 540000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = connect-cluster heartbeat.interval.ms = 3000 interceptor.classes = null internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 305000 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer (org.apache.kafka.clients.consumer.ConsumerConfig:223) [2018-03-14 22:14:25,706] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,706] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,707] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,707] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,707] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,708] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,708] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,708] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,708] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,708] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,709] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,709] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,709] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,709] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,709] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,710] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,710] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:25,710] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110) [2018-03-14 22:14:25,753] INFO [Consumer clientId=consumer-1, groupId=connect-cluster] Discovered coordinator 172.20.60.218:9092 (id: 2147483647 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:341) [2018-03-14 22:14:25,822] INFO Finished reading KafkaBasedLog for topic connect-offsets (org.apache.kafka.connect.util.KafkaBasedLog:153) [2018-03-14 22:14:25,822] INFO Started KafkaBasedLog for topic connect-offsets (org.apache.kafka.connect.util.KafkaBasedLog:155) [2018-03-14 22:14:25,824] INFO Finished reading offsets topic and starting KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:110) [2018-03-14 22:14:25,825] INFO Worker started (org.apache.kafka.connect.runtime.Worker:147) [2018-03-14 22:14:25,825] INFO Starting KafkaBasedLog with topic connect-status (org.apache.kafka.connect.util.KafkaBasedLog:124) [2018-03-14 22:14:25,825] INFO AdminClientConfig values: bootstrap.servers = [localhost:9092] client.id = connections.max.idle.ms = 300000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 120000 retries = 5 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] 
ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS (org.apache.kafka.clients.admin.AdminClientConfig:223) [2018-03-14 22:14:25,828] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,828] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,828] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,829] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,829] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,829] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,829] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,829] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,829] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,829] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,830] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,830] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,830] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,830] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,830] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,830] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,830] WARN The configuration 'key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:25,830] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:25,830] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110) [2018-03-14 22:14:25,946] INFO ProducerConfig values: acks = all batch.size = 16384 bootstrap.servers = [localhost:9092] buffer.memory = 33554432 client.id = compression.type = none confluent.batch.expiry.ms = 30000 connections.max.idle.ms = 540000 enable.idempotence = false interceptor.classes = null key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer (org.apache.kafka.clients.producer.ProducerConfig:223) [2018-03-14 22:14:25,951] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,952] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,952] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,952] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,952] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,952] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,952] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,952] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,953] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,953] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,953] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,953] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,953] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,953] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,953] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,953] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,954] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:25,954] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:25,954] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110) [2018-03-14 22:14:25,956] INFO ConsumerConfig values: auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [localhost:9092] check.crcs = true client.id = connections.max.idle.ms = 540000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = connect-cluster heartbeat.interval.ms = 3000 interceptor.classes = null internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 305000 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null 
ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer (org.apache.kafka.clients.consumer.ConsumerConfig:223) [2018-03-14 22:14:25,962] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,962] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,962] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,962] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,963] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,963] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,963] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,963] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,963] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,964] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,964] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,964] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,965] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,965] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,966] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,966] WARN The configuration 'key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:25,967] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:25,968] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110) [2018-03-14 22:14:25,993] INFO [Consumer clientId=consumer-2, groupId=connect-cluster] Discovered coordinator 172.20.60.218:9092 (id: 2147483647 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:341) [2018-03-14 22:14:26,062] INFO Finished reading KafkaBasedLog for topic connect-status (org.apache.kafka.connect.util.KafkaBasedLog:153) [2018-03-14 22:14:26,064] INFO Started KafkaBasedLog for topic connect-status (org.apache.kafka.connect.util.KafkaBasedLog:155) [2018-03-14 22:14:26,065] INFO Starting KafkaConfigBackingStore (org.apache.kafka.connect.storage.KafkaConfigBackingStore:244) [2018-03-14 22:14:26,067] INFO Starting KafkaBasedLog with topic connect-configs (org.apache.kafka.connect.util.KafkaBasedLog:124) [2018-03-14 22:14:26,071] INFO AdminClientConfig values: bootstrap.servers = [localhost:9092] client.id = connections.max.idle.ms = 300000 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 120000 retries = 5 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS (org.apache.kafka.clients.admin.AdminClientConfig:223) [2018-03-14 22:14:26,073] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,074] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,074] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,074] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,074] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,074] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,074] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. 
(org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,075] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,075] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,075] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,075] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,075] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,075] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,075] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,075] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,076] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,076] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:231) [2018-03-14 22:14:26,076] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:26,076] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110) [2018-03-14 22:14:26,186] INFO ProducerConfig values: acks = all batch.size = 16384 bootstrap.servers = [localhost:9092] buffer.memory = 33554432 client.id = compression.type = none confluent.batch.expiry.ms = 30000 connections.max.idle.ms = 540000 enable.idempotence = false interceptor.classes = null key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX 
ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer (org.apache.kafka.clients.producer.ProducerConfig:223) [2018-03-14 22:14:26,190] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,191] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,191] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,191] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,191] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,191] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,191] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,192] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,192] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,192] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,192] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,192] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,192] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,192] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,192] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,193] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,193] WARN The configuration 'key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.producer.ProducerConfig:231) [2018-03-14 22:14:26,193] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:26,193] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110) [2018-03-14 22:14:26,194] INFO ConsumerConfig values: auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [localhost:9092] check.crcs = true client.id = connections.max.idle.ms = 540000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = connect-cluster heartbeat.interval.ms = 3000 interceptor.classes = null internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 305000 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer (org.apache.kafka.clients.consumer.ConsumerConfig:223) [2018-03-14 22:14:26,197] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,197] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,198] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,198] WARN The configuration 'internal.key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,198] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,198] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,198] WARN The configuration 'internal.key.converter' was supplied but isn't a known config. 
(org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,199] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,199] WARN The configuration 'internal.value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,199] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,199] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,199] WARN The configuration 'internal.value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,199] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,199] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,199] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,200] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:231) [2018-03-14 22:14:26,200] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:26,200] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110) [2018-03-14 22:14:26,219] INFO [Consumer clientId=consumer-3, groupId=connect-cluster] Discovered coordinator 172.20.60.218:9092 (id: 2147483647 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:341) [2018-03-14 22:14:26,231] INFO Removed connector inventory-connector due to null configuration. This is usually intentional and does not indicate an issue. (org.apache.kafka.connect.storage.KafkaConfigBackingStore:513) [2018-03-14 22:14:26,233] INFO Removed connector inventory-connector due to null configuration. This is usually intentional and does not indicate an issue. 
(org.apache.kafka.connect.storage.KafkaConfigBackingStore:513)
[2018-03-14 22:14:26,234] INFO Finished reading KafkaBasedLog for topic connect-configs (org.apache.kafka.connect.util.KafkaBasedLog:153)
[2018-03-14 22:14:26,235] INFO Started KafkaBasedLog for topic connect-configs (org.apache.kafka.connect.util.KafkaBasedLog:155)
[2018-03-14 22:14:26,236] INFO Started KafkaConfigBackingStore (org.apache.kafka.connect.storage.KafkaConfigBackingStore:249)
[2018-03-14 22:14:26,236] INFO Herder started (org.apache.kafka.connect.runtime.distributed.DistributedHerder:207)
[2018-03-14 22:14:26,241] INFO [Worker clientId=connect-1, groupId=connect-cluster] Discovered coordinator 172.20.60.218:9092 (id: 2147483647 rack: null) (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:341)
[2018-03-14 22:14:26,244] INFO [Worker clientId=connect-1, groupId=connect-cluster] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:336)
[2018-03-14 22:14:26,260] INFO [Worker clientId=connect-1, groupId=connect-cluster] Successfully joined group with generation 15 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:341)
[2018-03-14 22:14:26,262] INFO Joined group and got assignment: Assignment{error=0, leader='connect-1-37da60ce-eb6c-477e-b3ba-69109c7d10ae', leaderUrl='http://172.20.60.218:8083/', offset=13, connectorIds=[inventory-connector], taskIds=[inventory-connector-0]} (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1192)
[2018-03-14 22:14:26,264] WARN Catching up to assignment's config offset. (org.apache.kafka.connect.runtime.distributed.DistributedHerder:762)
[2018-03-14 22:14:26,264] INFO Current config state offset -1 is behind group assignment 13, reading to end of config log (org.apache.kafka.connect.runtime.distributed.DistributedHerder:807)
[2018-03-14 22:14:26,355] INFO Started o.e.j.s.ServletContextHandler@7df5a248{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:744)
[2018-03-14 22:14:26,375] INFO Started ServerConnector@34f3c84{HTTP/1.1}{0.0.0.0:8083} (org.eclipse.jetty.server.ServerConnector:266)
[2018-03-14 22:14:26,377] INFO Started @30119ms (org.eclipse.jetty.server.Server:379)
[2018-03-14 22:14:26,379] INFO REST server listening at http://172.20.60.218:8083/, advertising URL http://172.20.60.218:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:150)
[2018-03-14 22:14:26,379] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:55)
[2018-03-14 22:14:26,735] INFO Finished reading to end of log and updated config snapshot, new config log offset: 13 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:811)
[2018-03-14 22:14:26,736] INFO Starting connectors and tasks using config offset 13 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:837)
[2018-03-14 22:14:26,738] INFO Starting connector inventory-connector (org.apache.kafka.connect.runtime.distributed.DistributedHerder:890)
[2018-03-14 22:14:26,738] INFO Starting task inventory-connector-0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:851)
[2018-03-14 22:14:26,739] INFO Creating task inventory-connector-0 (org.apache.kafka.connect.runtime.Worker:361)
[2018-03-14 22:14:26,740] INFO ConnectorConfig values: connector.class = io.debezium.connector.mysql.MySqlConnector key.converter = null name = inventory-connector tasks.max = 1 transforms = null value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig:223)
[2018-03-14 22:14:26,740] INFO ConnectorConfig values: connector.class
= io.debezium.connector.mysql.MySqlConnector key.converter = null name = inventory-connector tasks.max = 1 transforms = null value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig:223) [2018-03-14 22:14:26,742] INFO EnrichedConnectorConfig values: connector.class = io.debezium.connector.mysql.MySqlConnector key.converter = null name = inventory-connector tasks.max = 1 transforms = null value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:223) [2018-03-14 22:14:26,742] INFO EnrichedConnectorConfig values: connector.class = io.debezium.connector.mysql.MySqlConnector key.converter = null name = inventory-connector tasks.max = 1 transforms = null value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:223) [2018-03-14 22:14:26,742] INFO Creating connector inventory-connector of type io.debezium.connector.mysql.MySqlConnector (org.apache.kafka.connect.runtime.Worker:205) [2018-03-14 22:14:26,745] INFO TaskConfig values: task.class = class io.debezium.connector.mysql.MySqlConnectorTask (org.apache.kafka.connect.runtime.TaskConfig:223) [2018-03-14 22:14:26,745] INFO Instantiated connector inventory-connector with version 0.7.4 of type class io.debezium.connector.mysql.MySqlConnector (org.apache.kafka.connect.runtime.Worker:208) [2018-03-14 22:14:26,746] INFO Instantiated task inventory-connector-0 with version 0.7.4 of type io.debezium.connector.mysql.MySqlConnectorTask (org.apache.kafka.connect.runtime.Worker:376) [2018-03-14 22:14:26,748] INFO ProducerConfig values: acks = all batch.size = 16384 bootstrap.servers = [localhost:9092] buffer.memory = 33554432 client.id = compression.type = none confluent.batch.expiry.ms = 30000 connections.max.idle.ms = 540000 enable.idempotence = false interceptor.classes = null key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer linger.ms = 0 max.block.ms = 9223372036854775807 max.in.flight.requests.per.connection = 1 max.request.size = 1048576 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 2147483647 retries = 2147483647 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer (org.apache.kafka.clients.producer.ProducerConfig:223) [2018-03-14 22:14:26,755] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:26,756] INFO Kafka commitId : 
bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110)
[2018-03-14 22:14:26,769] INFO Finished creating connector inventory-connector (org.apache.kafka.connect.runtime.Worker:227)
[2018-03-14 22:14:26,770] INFO SourceConnectorConfig values: connector.class = io.debezium.connector.mysql.MySqlConnector key.converter = null name = inventory-connector tasks.max = 1 transforms = null value.converter = null (org.apache.kafka.connect.runtime.SourceConnectorConfig:223)
[2018-03-14 22:14:26,771] INFO EnrichedConnectorConfig values: connector.class = io.debezium.connector.mysql.MySqlConnector key.converter = null name = inventory-connector tasks.max = 1 transforms = null value.converter = null (org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:223)
[2018-03-14 22:14:26,772] INFO Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:847)
[2018-03-14 22:14:26,875] INFO Starting MySqlConnectorTask with configuration: (io.debezium.connector.common.BaseSourceTask:40)
[2018-03-14 22:14:26,880] INFO connector.class = io.debezium.connector.mysql.MySqlConnector (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,881] INFO database.user = debezium (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,881] INFO database.server.id = 1 (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,881] INFO tasks.max = 1 (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,881] INFO database.history.kafka.bootstrap.servers = localhost:9092 (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,881] INFO database.history.kafka.topic = dbhistory.inventory (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,881] INFO database.server.name = dbserver1 (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,882] INFO database.port = 3306 (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,882] INFO include.schema.changes = false (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,882] INFO table.whitelist = products (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,882] INFO task.class = io.debezium.connector.mysql.MySqlConnectorTask (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,883] INFO database.hostname = localhost (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,883] INFO database.password = dbz (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,884] INFO name = inventory-connector (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:26,884] INFO database.whitelist = inventory (io.debezium.connector.common.BaseSourceTask:42)
[2018-03-14 22:14:27,280] INFO KafkaDatabaseHistory Consumer config: {enable.auto.commit=false, value.deserializer=org.apache.kafka.common.serialization.StringDeserializer, group.id=inventory-connector-dbhistory, auto.offset.reset=earliest, session.timeout.ms=10000, bootstrap.servers=localhost:9092, client.id=inventory-connector-dbhistory, key.deserializer=org.apache.kafka.common.serialization.StringDeserializer, fetch.min.bytes=1} (io.debezium.relational.history.KafkaDatabaseHistory:163)
[2018-03-14 22:14:27,280] INFO KafkaDatabaseHistory Producer config: {bootstrap.servers=localhost:9092, value.serializer=org.apache.kafka.common.serialization.StringSerializer, buffer.memory=1048576, retries=1, key.serializer=org.apache.kafka.common.serialization.StringSerializer, client.id=inventory-connector-dbhistory, linger.ms=0, batch.size=32768, max.block.ms=10000, acks=1} (io.debezium.relational.history.KafkaDatabaseHistory:164)
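The values that BaseSourceTask prints above are the connector configuration this worker is running. For reference, a registration payload along the following lines would produce that configuration; this is a reconstruction from the logged values only (the JSON actually submitted to the REST API is not part of this log), targeted at the REST endpoint the worker advertises, http://172.20.60.218:8083/connectors:

    {
      "name": "inventory-connector",
      "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "tasks.max": "1",
        "database.hostname": "localhost",
        "database.port": "3306",
        "database.user": "debezium",
        "database.password": "dbz",
        "database.server.id": "1",
        "database.server.name": "dbserver1",
        "database.whitelist": "inventory",
        "table.whitelist": "products",
        "include.schema.changes": "false",
        "database.history.kafka.bootstrap.servers": "localhost:9092",
        "database.history.kafka.topic": "dbhistory.inventory"
      }
    }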
[2018-03-14 22:14:27,294] INFO ProducerConfig values: acks = 1 batch.size = 32768 bootstrap.servers = [localhost:9092] buffer.memory = 1048576 client.id = inventory-connector-dbhistory compression.type = none confluent.batch.expiry.ms = 30000 connections.max.idle.ms = 540000 enable.idempotence = false interceptor.classes = null key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 10000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 1 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.StringSerializer (org.apache.kafka.clients.producer.ProducerConfig:223)
[2018-03-14 22:14:27,299] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109)
[2018-03-14 22:14:27,299] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110)
[2018-03-14 22:14:27,334] INFO Found existing offset: {ts_sec=1521038006, file=mysql-bin.000003, pos=5662, row=1, server_id=1, event=2} (io.debezium.connector.mysql.MySqlConnectorTask:72)
[2018-03-14 22:14:27,345] INFO ConsumerConfig values: auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [localhost:9092] check.crcs = true client.id = inventory-connector-dbhistory connections.max.idle.ms = 540000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = inventory-connector-dbhistory heartbeat.interval.ms = 3000 interceptor.classes = null internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 305000 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd =
/usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer (org.apache.kafka.clients.consumer.ConsumerConfig:223) [2018-03-14 22:14:27,351] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:27,351] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110) [2018-03-14 22:14:27,391] INFO ConsumerConfig values: auto.commit.interval.ms = 5000 auto.offset.reset = earliest bootstrap.servers = [localhost:9092] check.crcs = true client.id = inventory-connector-dbhistory connections.max.idle.ms = 540000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = inventory-connector-dbhistory heartbeat.interval.ms = 3000 interceptor.classes = null internal.leave.group.on.close = true isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 305000 retry.backoff.ms = 100 sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLS ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer (org.apache.kafka.clients.consumer.ConsumerConfig:223) [2018-03-14 22:14:27,395] INFO Kafka version : 1.0.0-cp1 (org.apache.kafka.common.utils.AppInfoParser:109) [2018-03-14 22:14:27,395] INFO Kafka commitId : bedb2a8697fecd0d (org.apache.kafka.common.utils.AppInfoParser:110) [2018-03-14 22:14:27,408] INFO [Consumer clientId=inventory-connector-dbhistory, groupId=inventory-connector-dbhistory] Discovered coordinator 172.20.60.218:9092 (id: 2147483647 rack: null) 
(org.apache.kafka.clients.consumer.internals.AbstractCoordinator:341)
[2018-03-14 22:14:27,409] INFO [Consumer clientId=inventory-connector-dbhistory, groupId=inventory-connector-dbhistory] Revoking previously assigned partitions [] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:341)
[2018-03-14 22:14:27,409] INFO [Consumer clientId=inventory-connector-dbhistory, groupId=inventory-connector-dbhistory] (Re-)joining group (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:336)
[2018-03-14 22:14:27,419] INFO [Consumer clientId=inventory-connector-dbhistory, groupId=inventory-connector-dbhistory] Successfully joined group with generation 3 (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:341)
[2018-03-14 22:14:27,422] INFO [Consumer clientId=inventory-connector-dbhistory, groupId=inventory-connector-dbhistory] Setting newly assigned partitions [dbhistory.inventory-0] (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:341)
[2018-03-14 22:14:27,481] INFO Step 0: Get all known binlogs from MySQL (io.debezium.connector.mysql.MySqlConnectorTask:309)
[2018-03-14 22:14:27,488] INFO MySQL has the binlog file 'mysql-bin.000003' required by the connector (io.debezium.connector.mysql.MySqlConnectorTask:324)
[2018-03-14 22:14:27,511] INFO Requested thread factory for connector MySqlConnector, id = dbserver1 named = binlog-client (io.debezium.util.Threads:231)
[2018-03-14 22:14:27,525] INFO Creating thread debezium-mysqlconnector-dbserver1-binlog-client (io.debezium.util.Threads:247)
[2018-03-14 22:14:27,528] INFO Creating thread debezium-mysqlconnector-dbserver1-binlog-client (io.debezium.util.Threads:247)
[2018-03-14 22:14:27,534] INFO Connected to MySQL binlog at localhost:3306, starting at binlog file 'mysql-bin.000003', pos=5662, skipping 2 events plus 1 rows (io.debezium.connector.mysql.BinlogReader:918)
[2018-03-14 22:14:27,534] INFO WorkerSourceTask{id=inventory-connector-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask:158)
[2018-03-14 22:14:27,535] INFO Creating thread debezium-mysqlconnector-dbserver1-binlog-client (io.debezium.util.Threads:247)
[2018-03-14 22:14:27,562] ERROR Error during binlog processing.
Last offset stored = null, binlog reader near position = mysql-bin.000003/5876 (io.debezium.connector.mysql.BinlogReader:944)
[2018-03-14 22:14:27,562] ERROR Failed due to error: Error processing binlog event (io.debezium.connector.mysql.BinlogReader:165)
org.apache.kafka.connect.errors.ConnectException: Failed to parse statement 'CREATE TABLE `new_table` (id INT(11) UNSIGNED NOT NULL PRIMARY KEY AUTO_INCREMENT) '
    at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:186)
    at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:164)
    at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:443)
    at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1055)
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:913)
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:559)
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:793)
    at java.lang.Thread.run(Thread.java:745)
Caused by: io.debezium.text.ParsingException: Failed to parse statement 'CREATE TABLE `new_table` (id INT(11) UNSIGNED NOT NULL PRIMARY KEY AUTO_INCREMENT) '
    at io.debezium.relational.ddl.DdlParser.parse(DdlParser.java:292)
    at io.debezium.relational.ddl.DdlParser.parse(DdlParser.java:267)
    at io.debezium.connector.mysql.MySqlSchema.applyDdl(MySqlSchema.java:322)
    at io.debezium.connector.mysql.BinlogReader.handleQueryEvent(BinlogReader.java:614)
    at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:427)
    ... 5 more
Caused by: io.debezium.text.ParsingException: Expecting token type 128 at line 1, column 85 but found '': EY AUTO_INCREMENT) ===>>
    at io.debezium.text.TokenStream.consume(TokenStream.java:737)
    at io.debezium.relational.ddl.DdlParser.consumeStatement(DdlParser.java:568)
    at io.debezium.relational.ddl.DdlParser.parseUnknownStatement(DdlParser.java:376)
    at io.debezium.connector.mysql.MySqlDdlParser.parseNextStatement(MySqlDdlParser.java:176)
    at io.debezium.relational.ddl.DdlParser.parse(DdlParser.java:286)
    ... 9 more
[2018-03-14 22:14:27,564] INFO Error processing binlog event, and propagating to Kafka Connect so it stops this connector. Future binlog events read before connector is shutdown will be ignored. (io.debezium.connector.mysql.BinlogReader:448)
[2018-03-14 22:14:28,003] INFO WorkerSourceTask{id=inventory-connector-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:306)
[2018-03-14 22:14:28,004] INFO WorkerSourceTask{id=inventory-connector-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:323)
[2018-03-14 22:14:28,004] ERROR WorkerSourceTask{id=inventory-connector-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:172)
org.apache.kafka.connect.errors.ConnectException: Failed to parse statement 'CREATE TABLE `new_table` (id INT(11) UNSIGNED NOT NULL PRIMARY KEY AUTO_INCREMENT) '
    at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:186)
    at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:164)
    at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:443)
    at com.github.shyiko.mysql.binlog.BinaryLogClient.notifyEventListeners(BinaryLogClient.java:1055)
    at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:913)
    at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:559)
    at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:793)
    at java.lang.Thread.run(Thread.java:745)
Caused by: io.debezium.text.ParsingException: Failed to parse statement 'CREATE TABLE `new_table` (id INT(11) UNSIGNED NOT NULL PRIMARY KEY AUTO_INCREMENT) '
    at io.debezium.relational.ddl.DdlParser.parse(DdlParser.java:292)
    at io.debezium.relational.ddl.DdlParser.parse(DdlParser.java:267)
    at io.debezium.connector.mysql.MySqlSchema.applyDdl(MySqlSchema.java:322)
    at io.debezium.connector.mysql.BinlogReader.handleQueryEvent(BinlogReader.java:614)
    at io.debezium.connector.mysql.BinlogReader.handleEvent(BinlogReader.java:427)
    ... 5 more
Caused by: io.debezium.text.ParsingException: Expecting token type 128 at line 1, column 85 but found '': EY AUTO_INCREMENT) ===>>
    at io.debezium.text.TokenStream.consume(TokenStream.java:737)
    at io.debezium.relational.ddl.DdlParser.consumeStatement(DdlParser.java:568)
    at io.debezium.relational.ddl.DdlParser.parseUnknownStatement(DdlParser.java:376)
    at io.debezium.connector.mysql.MySqlDdlParser.parseNextStatement(MySqlDdlParser.java:176)
    at io.debezium.relational.ddl.DdlParser.parse(DdlParser.java:286)
    ... 9 more
[2018-03-14 22:14:28,005] ERROR WorkerSourceTask{id=inventory-connector-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:173)
[2018-03-14 22:14:28,006] INFO [Producer clientId=producer-4] Closing the Kafka producer with timeoutMillis = 30000 ms. (org.apache.kafka.clients.producer.KafkaProducer:341)
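As the WorkerTask message states, the task now sits in the FAILED state until it is restarted through the Connect REST API. A minimal sketch of checking the connector status and restarting task 0 against the REST URL this worker advertises (http://172.20.60.218:8083), assuming a Python environment with the requests package available:

    import requests  # assumed available; any HTTP client works

    CONNECT = "http://172.20.60.218:8083"   # advertised REST URL from the log
    NAME = "inventory-connector"

    # Show connector and task state; the task is expected to be FAILED,
    # with the ConnectException above reported in its trace field.
    status = requests.get(f"{CONNECT}/connectors/{NAME}/status").json()
    print(status["connector"]["state"])
    for task in status["tasks"]:
        print(task["id"], task["state"])

    # Restart the failed task (task id 0, per WorkerSourceTask{id=inventory-connector-0}).
    requests.post(f"{CONNECT}/connectors/{NAME}/tasks/0/restart").raise_for_status()

Note that a restart alone most likely fails again: the stored offset is mysql-bin.000003 pos=5662 (per the "Found existing offset" entry above), so the reader will re-encounter the same CREATE TABLE event near position 5876 unless the parsing problem itself is addressed.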
[2018-03-14 22:14:36,767] INFO WorkerSourceTask{id=inventory-connector-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:306)
[2018-03-14 22:14:36,768] INFO WorkerSourceTask{id=inventory-connector-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:323)
[2018-03-14 22:14:46,769] INFO WorkerSourceTask{id=inventory-connector-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask:306)
[2018-03-14 22:14:46,770] INFO WorkerSourceTask{id=inventory-connector-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask:323)
[2018-03-14 22:14:50,761] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:65)
[2018-03-14 22:14:50,761] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:154)
[2018-03-14 22:14:50,764] INFO Stopped ServerConnector@34f3c84{HTTP/1.1}{0.0.0.0:8083} (org.eclipse.jetty.server.ServerConnector:306)
[2018-03-14 22:14:50,774] INFO Stopped o.e.j.s.ServletContextHandler@7df5a248{/,null,UNAVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:865)
[2018-03-14 22:14:50,775] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:165)
[2018-03-14 22:14:50,775] INFO Herder stopping (org.apache.kafka.connect.runtime.distributed.DistributedHerder:389)
[2018-03-14 22:14:50,776] INFO Stopping connectors and tasks that are still assigned to this worker. (org.apache.kafka.connect.runtime.distributed.DistributedHerder:363)
[2018-03-14 22:14:50,777] INFO Stopping connector inventory-connector (org.apache.kafka.connect.runtime.Worker:304)
[2018-03-14 22:14:50,777] INFO Stopping task inventory-connector-0 (org.apache.kafka.connect.runtime.Worker:464)
[2018-03-14 22:14:50,777] INFO Stopping MySQL connector task (io.debezium.connector.mysql.MySqlConnectorTask:239)
[2018-03-14 22:14:50,777] INFO Stopping the binlog reader (io.debezium.connector.mysql.ChainedReader:115)
[2018-03-14 22:14:50,778] INFO Stopped connector inventory-connector (org.apache.kafka.connect.runtime.Worker:320)
[2018-03-14 22:14:50,779] INFO Stopped reading binlog after 0 events, no new offset was recorded (io.debezium.connector.mysql.BinlogReader:907)
[2018-03-14 22:14:50,780] INFO [Producer clientId=inventory-connector-dbhistory] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. (org.apache.kafka.clients.producer.KafkaProducer:341)
[2018-03-14 22:14:50,784] INFO Connector task finished all work and is now shutdown (io.debezium.connector.mysql.MySqlConnectorTask:269)
[2018-03-14 22:14:50,784] WARN [Worker clientId=connect-1, groupId=connect-cluster] Close timed out with 1 pending requests to coordinator, terminating client connections (org.apache.kafka.clients.consumer.internals.AbstractCoordinator:241)
[2018-03-14 22:14:50,786] INFO Stopping KafkaBasedLog for topic connect-status (org.apache.kafka.connect.util.KafkaBasedLog:159)
[2018-03-14 22:14:50,786] INFO [Producer clientId=producer-2] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. (org.apache.kafka.clients.producer.KafkaProducer:341)
[2018-03-14 22:14:50,790] INFO Stopped KafkaBasedLog for topic connect-status (org.apache.kafka.connect.util.KafkaBasedLog:185)
[2018-03-14 22:14:50,790] INFO Closing KafkaConfigBackingStore (org.apache.kafka.connect.storage.KafkaConfigBackingStore:254)
[2018-03-14 22:14:50,790] INFO Stopping KafkaBasedLog for topic connect-configs (org.apache.kafka.connect.util.KafkaBasedLog:159)
[2018-03-14 22:14:50,790] INFO [Producer clientId=producer-3] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. (org.apache.kafka.clients.producer.KafkaProducer:341)
[2018-03-14 22:14:50,793] INFO Stopped KafkaBasedLog for topic connect-configs (org.apache.kafka.connect.util.KafkaBasedLog:185)
[2018-03-14 22:14:50,793] INFO Closed KafkaConfigBackingStore (org.apache.kafka.connect.storage.KafkaConfigBackingStore:256)
[2018-03-14 22:14:50,793] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:154)
[2018-03-14 22:14:50,793] INFO Stopping KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:115)
[2018-03-14 22:14:50,794] INFO Stopping KafkaBasedLog for topic connect-offsets (org.apache.kafka.connect.util.KafkaBasedLog:159)
[2018-03-14 22:14:50,794] INFO [Producer clientId=producer-1] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms. (org.apache.kafka.clients.producer.KafkaProducer:341)
[2018-03-14 22:14:50,797] INFO Stopped KafkaBasedLog for topic connect-offsets (org.apache.kafka.connect.util.KafkaBasedLog:185)
[2018-03-14 22:14:50,797] INFO Stopped KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:117)
[2018-03-14 22:14:50,797] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:175)
[2018-03-14 22:14:50,798] INFO Herder stopped (org.apache.kafka.connect.runtime.distributed.DistributedHerder:215)
[2018-03-14 22:14:50,799] INFO Herder stopped (org.apache.kafka.connect.runtime.distributed.DistributedHerder:409)
[2018-03-14 22:14:50,799] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:70)
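The failure itself comes from the DDL parser in the Debezium MySQL connector (version 0.7.4, per the "Instantiated connector" entry above) rejecting the logged CREATE TABLE statement while building the schema history. The usual remedies are to upgrade Debezium to a release with the reworked MySQL DDL parser, or, if the deployed version supports it, to set the documented connector option database.history.skip.unparseable.ddl=true so statements the parser cannot handle are skipped, at the cost of a possibly incomplete schema history. A minimal sketch of applying the latter through the Connect REST API; verify that the 0.7.4 build in use honors this option before relying on it:

    import requests  # assumed available

    CONNECT = "http://172.20.60.218:8083"
    NAME = "inventory-connector"

    # Fetch the connector configuration as Connect currently stores it.
    config = requests.get(f"{CONNECT}/connectors/{NAME}/config").json()

    # Ask the schema-history component to skip DDL it cannot parse
    # (documented Debezium MySQL connector option; confirm support in
    # the deployed Debezium version).
    config["database.history.skip.unparseable.ddl"] = "true"

    # PUT replaces the connector configuration and reconfigures its tasks.
    requests.put(f"{CONNECT}/connectors/{NAME}/config", json=config).raise_for_status()

A task that is already FAILED may still need the explicit task restart shown earlier after the configuration change takes effect.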