I created a Debezium connector on Docker using curl in the terminal, and now I'm stuck modifying the existing connector.
My docker-compose file:
---
version: '3'
services:
  kafka-connect-02:
    image: confluentinc/cp-kafka-connect:latest
    container_name: kafka-connect-02
    ports:
      - 8083:8083
    environment:
      CONNECT_LOG4J_APPENDER_STDOUT_LAYOUT_CONVERSIONPATTERN: "[%d] %p %X{connector.context}%m (%c:%L)%n"
      CONNECT_CUB_KAFKA_TIMEOUT: 300
      CONNECT_BOOTSTRAP_SERVERS: "https://***9092"
      CONNECT_REST_ADVERTISED_HOST_NAME: 'kafka-connect-02'
      CONNECT_REST_PORT: 8083
      CONNECT_GROUP_ID: kafka-connect-group-01-v04
      CONNECT_CONFIG_STORAGE_TOPIC: _kafka-connect-group-01-v04-configs
      CONNECT_OFFSET_STORAGE_TOPIC: _kafka-connect-group-01-v04-offsets
      CONNECT_STATUS_STORAGE_TOPIC: _kafka-connect-group-01-v04-status
      CONNECT_KEY_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL: "https://***9092"
      CONNECT_KEY_CONVERTER_BASIC_AUTH_CREDENTIALS_SOURCE: "USER_INFO"
      CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_BASIC_AUTH_USER_INFO: "***:***"
      CONNECT_VALUE_CONVERTER: io.confluent.connect.avro.AvroConverter
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: "https://***9092"
      CONNECT_VALUE_CONVERTER_BASIC_AUTH_CREDENTIALS_SOURCE: "USER_INFO"
      CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_BASIC_AUTH_USER_INFO: "***:***"
      CONNECT_INTERNAL_KEY_CONVERTER: 'org.apache.kafka.connect.json.JsonConverter'
      CONNECT_INTERNAL_VALUE_CONVERTER: 'org.apache.kafka.connect.json.JsonConverter'
      CONNECT_LOG4J_ROOT_LOGLEVEL: 'INFO'
      CONNECT_LOG4J_LOGGERS: 'org.apache.kafka.connect.runtime.rest=WARN,org.reflections=ERROR'
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: '3'
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: '3'
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: '3'
      CONNECT_PLUGIN_PATH: '/usr/share/java,/usr/share/confluent-hub-components/'
      # Confluent Cloud config
      CONNECT_REQUEST_TIMEOUT_MS: "20000"
      CONNECT_RETRY_BACKOFF_MS: "500"
      CONNECT_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: "https"
      CONNECT_SASL_MECHANISM: "PLAIN"
      CONNECT_SECURITY_PROTOCOL: "SASL_SSL"
      CONNECT_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"***\" password=\"**\";"
      #
      CONNECT_CONSUMER_SECURITY_PROTOCOL: "SASL_SSL"
      CONNECT_CONSUMER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: "https"
      CONNECT_CONSUMER_SASL_MECHANISM: "PLAIN"
      CONNECT_CONSUMER_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"***\" password=\"**\";"
      CONNECT_CONSUMER_REQUEST_TIMEOUT_MS: "20000"
      CONNECT_CONSUMER_RETRY_BACKOFF_MS: "500"
      #
      CONNECT_PRODUCER_SECURITY_PROTOCOL: "SASL_SSL"
      CONNECT_PRODUCER_SSL_ENDPOINT_IDENTIFICATION_ALGORITHM: "https"
      CONNECT_PRODUCER_SASL_MECHANISM: "PLAIN"
      CONNECT_PRODUCER_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"***\" password=\"**\";"
      CONNECT_PRODUCER_REQUEST_TIMEOUT_MS: "20000"
      CONNECT_PRODUCER_RETRY_BACKOFF_MS: "500"
      # External secrets config
      # See https://docs.confluent.io/current/connect/security.html#externalizing-secrets
      CONNECT_CONFIG_PROVIDERS: 'file'
      CONNECT_CONFIG_PROVIDERS_FILE_CLASS: 'org.apache.kafka.common.config.provider.FileConfigProvider'
    command:
      - bash
      - -c
      - |
        echo "Installing connector plugins"
        confluent-hub install --no-prompt debezium/debezium-connector-sqlserver:0.10.0
        confluent-hub install --no-prompt snowflakeinc/snowflake-kafka-connector:0.5.5
        #
        echo "Launching Kafka Connect worker"
        /etc/confluent/docker/run &
        #
        sleep infinity
My Debezium connector:
curl -i -X PUT -H "Content-Type:application/json" \
  http://localhost:8083/connectors/Procura_CDC/config \
  -d '{
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "tasks.max": "1",
    "database.server.name": "***",
    "database.hostname": "***",
    "database.port": "***",
    "database.user": "Kafka",
    "database.password": "***",
    "database.dbname": "Procura_Prod",
    "database.history.kafka.bootstrap.servers": "*****",
    "database.history.kafka.topic": "dbhistory.procura",
    "table.whitelist": "dbo.CLIENTS,dbo.VISITS",
    "poll.interval.ms": "2000",
    "snapshot.fetch.size": "2000",
    "snapshot.mode": "initial",
    "snapshot.isolation.mode": "snapshot",
    "transforms": "unwrap,dropPrefix",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.drop.tombstones": "false",
    "transforms.unwrap.delete.handling.mode": "rewrite",
    "transforms.dropPrefix.type": "org.apache.kafka.connect.transforms.RegexRouter",
    "transforms.dropPrefix.regex": "procura.dbo.(.*)",
    "transforms.dropPrefix.replacement": "$1"
  }'
I'm getting an error when I try to modify the Debezium connector that was already created with the command above. Neither the PUT nor the POST method works; the request fails with one of the following:
curl: (7) Failed to connect to localhost port 8083: Connection refused
curl: (56) Recv failure: Connection reset by peer
{"error_code":500,"message":"Request timed out"}
I don't know how to modify it. I'm new to Docker, so any help would be appreciated. Thanks!
You can use PUT to both create and update connector configurations. Here's an example:
curl -i -X PUT -H "Content-Type:application/json" \
  http://localhost:8083/connectors/source-file-01/config \
  -d '{
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "file": "/tmp/totail.txt",
    "topic": "foo",
    "tasks.max": 6
  }'
This creates (or modifies) the connector called source-file-01. If you want to change its configuration, just reissue the PUT with the necessary values changed.
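One caveat: PUT against /connectors/{name}/config replaces the whole configuration rather than patching it, so always send the complete config, not just the keys you're changing. As a sketch, scaling the source-file-01 example above down to a single task means reissuing the full body with only tasks.max altered:

# Reissue the complete config; only tasks.max differs from before
curl -i -X PUT -H "Content-Type:application/json" \
  http://localhost:8083/connectors/source-file-01/config \
  -d '{
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "file": "/tmp/totail.txt",
    "topic": "foo",
    "tasks.max": 1
  }'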
This ability to re-run the same command is why I always prefer PUT over POST when creating connectors: you don't have to change how you call it depending on whether the connector already exists.
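For comparison, POST creates a connector against the /connectors endpoint, with the name and config wrapped in a different payload shape, and it fails with a 409 Conflict if the connector already exists. A sketch of the equivalent create call:

# POST creates only; re-running this against an existing connector returns 409
curl -i -X POST -H "Content-Type:application/json" \
  http://localhost:8083/connectors \
  -d '{
    "name": "source-file-01",
    "config": {
      "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
      "file": "/tmp/totail.txt",
      "topic": "foo",
      "tasks.max": 6
    }
  }'

As for the Connection refused and connection reset errors: those mean the request never reached a running worker at all. In your compose file the container installs two connector plugins before launching the worker, so the REST API on port 8083 only becomes available once startup has finished. You can check whether it's ready before issuing any PUT:

# Returns version info once the worker's REST API is up
curl -s http://localhost:8083/
# List the connectors the worker currently knows about
curl -s http://localhost:8083/connectors
# Watch the worker's startup logs (container name from your compose file)
docker logs -f kafka-connect-02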