TiCDC incremental replication to Kafka reports an error

Kafka authentication settings (no certificate):
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="user" password="password";
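These settings can be sanity-checked outside TiCDC first. A minimal sketch, assuming the stock Kafka CLI tools are on the PATH and reusing the placeholder credentials above (not real ones):

```shell
# Write a client properties file matching the provider's settings.
cat > /tmp/client.properties <<'EOF'
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="user" password="password";
EOF

# Then verify connectivity with the stock Kafka console consumer:
# kafka-console-consumer.sh --bootstrap-server kafka-rshaig1-0.kafka.com:9092 \
#   --topic das_news --consumer.config /tmp/client.properties --from-beginning
```

If the console consumer fails the same way, the problem is on the Kafka side rather than in TiCDC.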

Command: tiup ctl:v6.1.3 cdc changefeed create --pd=http://10.24.7.1:2379 --sink-uri="kafka://kafka-rshaig1-0.kafka.com:9092,kafka-rshaig1-1.kafka.com:9092,kafka-rshaig1-2.kafka.com:9092/das_news?kafka-version=2.7.1&sasl-user=user&sasl-password=password&sasl-mechanism=SCRAM-SHA-256&partition-num=3&max-message-bytes=10485760&replication-factor=3&protocol=canal-json" --changefeed-id="changefeed-vv1" --config=/home/tidb/changefeed-vv1.toml

Returned output:
Starting component ctl: /home/tidb/.tiup/components/ctl/v6.1.3/ctl cdc changefeed create --pd=http://10.24.7.1:2379 --sink-uri=kafka://kafka-rshaig1-0.kafka.com:9092,kafka-rshaig1-1.kafka.com:9092,kafka-rshaig1-2.kafka.com:9092/das_news?kafka-version=2.7.1&sasl-user=user&sasl-password=password&sasl-mechanism=SCRAM-SHA-256&partition-num=3&max-message-bytes=10485760&replication-factor=3&protocol=canal-json --changefeed-id=changefeed-vv1 --config=/home/tidb/changefeed-vv1.toml
[2023/07/10 11:03:22.674 +08:00] [WARN] [sink.go:167] ["protocol is specified in both sink URI and config file, the value in sink URI will be used"] [protocol in sink URI=canal-json] [protocol in config file=default]
[WARN] some tables are not eligible to replicate, []model.TableName{model.TableName{Schema:"news_v8", Table:"news_class", TableID:0, IsPartition:false}}
Could you agree to ignore those tables, and continue to replicate [Y/N]
Y
[2023/07/10 11:03:24.971 +08:00] [WARN] [sink.go:167] ["protocol is specified in both sink URI and config file, the value in sink URI will be used"] [protocol in sink URI=canal-json] [protocol in config file=canal-json]
Error: [CDC:ErrKafkaNewSaramaProducer]new sarama producer: Cluster authorization failed.
Usage:
cdc cli changefeed create [flags]

Flags:
  -c, --changefeed-id string              Replication task (changefeed) ID
      --config string                     Path of the configuration file
      --cyclic-filter-replica-ids uints   (Experimental) Cyclic replication filter replica ID of changefeed (default [])
      --cyclic-replica-id uint            (Experimental) Cyclic replication replica ID of changefeed
      --cyclic-sync-ddl                   (Experimental) Cyclic replication sync DDL of changefeed (default true)
      --disable-gc-check                  Disable GC safe point check
  -h, --help                              help for create
      --no-confirm                        Don't ask user whether to ignore ineligible table
      --opts key=value                    Extra options, in the key=value format
      --schema-registry string            Avro Schema Registry URI
      --sink-uri string                   sink uri
      --sort-engine string                sort engine used for data sort (default "unified")
      --start-ts uint                     Start ts of changefeed
      --sync-interval duration            (Experimental) Set the interval for syncpoint in replication (default 10min) (default 10m0s)
      --sync-point                        (Experimental) Set and Record syncpoint in replication (default off)
      --target-ts uint                    Target ts of changefeed
      --tz string                         timezone used when checking sink uri (changefeed timezone is determined by cdc server) (default "SYSTEM")

Global Flags:
      --ca string          CA certificate path for TLS connection
      --cert string        Certificate path for TLS connection
  -i, --interact           Run cdc cli with readline
      --key string         Private key path for TLS connection
      --log-level string   log level (etc: debug|info|warn|error) (default "warn")
      --pd string          PD address, use ',' to separate multiple PDs (default "http://127.0.0.1:2379")

[CDC:ErrKafkaNewSaramaProducer]new sarama producer: Cluster authorization failed.
Error: exit status 1
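One pitfall worth ruling out with this kind of failure: sasl-user and sasl-password are URI query parameters, so any character like `@`, `&`, or `/` in the password must be percent-encoded before it goes into the sink URI. A hypothetical bash sketch (the password here is made up):

```shell
# Percent-encode a string for use as a URI query value (bash).
urlencode() {
  local s="$1" out="" c i
  for (( i = 0; i < ${#s}; i++ )); do
    c="${s:i:1}"
    case "$c" in
      [a-zA-Z0-9.~_-]) out+="$c" ;;        # unreserved characters: keep as-is
      *) out+=$(printf '%%%02X' "'$c") ;;  # everything else: %XX hex escape
    esac
  done
  printf '%s\n' "$out"
}

urlencode 'p@ss&word'   # prints p%40ss%26word
```

An unencoded `&` in the password would silently truncate it and split the rest into bogus URI parameters, which also surfaces as an authentication failure.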

1. I've already checked the domain names and the account's permissions; both are fine. How should the command be changed?

You'd better confirm one thing first:

is it really plaintext, or SCRAM-SHA-256?

The parameters you configured look fine to me, yet authentication still isn't passing...

It's not plaintext. This is what the provider gave us:
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-256

Can you connect with another client tool, or some verification tool?

Yes. I specifically had someone test it: both reads and writes work.

OK, take a look at the answer in this post and give it a try:

https://pingkai.cn/tidbcommunity/forum/t/topic/1005687/3

DEBUG output shows nothing useful.
Where does the [kafka-client] section go? In cdc.conf?

No. You can specify it in the changefeed configuration file.

Does v6.1.3 support these parameters?
[sink.kafka-config]
sasl-mechanism = "SCRAM-SHA-256"
sasl-user = "user"
sasl-password = "password"

It errors with:
component TiCDC changefeed's config file /home/tidb/changefeed-news-v8.toml contained unknown configuration options: sink.kafka-config, sink.kafka-config.sasl-mechanism, sink.kafka-config.sasl-user, sink.kafka-config.sasl-password

6.1 doesn't seem to support it :rofl: :face_with_spiral_eyes:

This configuration style only seems to be supported from 6.5 onward:
https://github.com/pingcap/docs-cn/blob/release-6.5/ticdc/ticdc-changefeed-config.md


Using the command-line (sink URI) style of configuration, does it still not work for you?

$ kafka-console-producer.sh --broker-list --topic --producer.config --property "sasl.mechanism=SCRAM-SHA-256" --property "sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username= password=;"

And a command along these lines succeeds, correct?


Kafka itself reads and writes fine. The downstream Kafka version is 2.7.1.
Today I tested with kafka-console-producer.sh from a newer Kafka 3.1.2 client and got "Cluster authorization failed."
Testing with kafka-console-producer.sh from the Kafka 2.7.1 client, reads and writes work normally.
Does TiCDC's Kafka client also vary by version? TiCDC is v6.1.3; does this version support Kafka 2.7.1?

The version support is described here.

Our downstream Kafka is 2.7.1, which is supported, so why does authentication still fail?

Does it work without authentication, i.e. in plaintext?


Feels like the support is still not that good...

Without authentication it works.

Then this version probably just doesn't support it yet... If you move to 6.5.x, use 6.5.2 rather than 6.5.3; 6.5.3 has a major TiCDC bug, so watch out for that.

If plaintext works, just use plaintext with the current version and stop fighting it :upside_down_face: :upside_down_face: :upside_down_face: :upside_down_face:
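For reference, a sketch of the same sink URI with all sasl-* parameters removed, for an unauthenticated (plaintext) listener. Host names are reused from earlier in the thread; whether those brokers actually expose a plaintext listener on 9092 is an assumption:

```shell
# Same sink URI as the original command, minus sasl-user / sasl-password / sasl-mechanism.
SINK_URI="kafka://kafka-rshaig1-0.kafka.com:9092,kafka-rshaig1-1.kafka.com:9092,kafka-rshaig1-2.kafka.com:9092/das_news?kafka-version=2.7.1&partition-num=3&max-message-bytes=10485760&replication-factor=3&protocol=canal-json"
echo "$SINK_URI"

# tiup ctl:v6.1.3 cdc changefeed create --pd=http://10.24.7.1:2379 \
#   --sink-uri="$SINK_URI" --changefeed-id="changefeed-vv1" \
#   --config=/home/tidb/changefeed-vv1.toml
```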

OK, we'll try 6.5.2 in our test environment.

Originally my Kafka account only had WRITE and READ permissions. We later added:
Create and Describe permissions on the Topic resource type.
DescribeConfigs permission on the Cluster resource type.
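For anyone hitting the same error, those extra grants can be scripted with the stock kafka-acls.sh tool. A sketch only; the broker address, topic, and principal are taken from this thread, and the admin.properties path is an assumption:

```shell
# Write out the two kafka-acls.sh invocations that add the missing permissions.
# Run the generated script on a host that has the Kafka CLI tools installed.
cat > grant_ticdc_acls.sh <<'EOF'
#!/bin/sh
# Create + Describe on the topic TiCDC writes to
kafka-acls.sh --bootstrap-server kafka-rshaig1-0.kafka.com:9092 \
  --command-config admin.properties \
  --add --allow-principal User:user \
  --operation Create --operation Describe --topic das_news
# DescribeConfigs on the cluster (TiCDC checks broker configs such as message.max.bytes)
kafka-acls.sh --bootstrap-server kafka-rshaig1-0.kafka.com:9092 \
  --command-config admin.properties \
  --add --allow-principal User:user \
  --operation DescribeConfigs --cluster
EOF
chmod +x grant_ticdc_acls.sh
```

The missing cluster-level DescribeConfigs grant is consistent with the error text: sarama reports it as "Cluster authorization failed."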

It's in the official documentation.

We hadn't noticed that part. The problem is solved now. Thanks for the help.

Ah, I was just reading the 6.1.7 release notes; it fixes quite a few bugs, worth considering:
https://docs.pingcap.com/zh/tidb/stable/release-6.1.7

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.