
I'm trying to get filebeat to consume messages from kafka using the kafka input. For some reason I can't authenticate with SASL, and I'm not sure why. The documentation for both Kafka and Filebeat is somewhat lacking when it comes to using them together with SASL.
My filebeat configuration is as follows:
filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

filebeat.inputs:
- type: kafka
  hosts: 'the.kafka.server.com:9092'
  topics: 'my_topic'
  group_id: 'my_group'
  ssl.enabled: yes
  username: "$ConnectionString"
  password: "org.apache.kafka.common.security.plain.PlainLoginModule required username='my_username' password='my_password';"

processors:
- add_cloud_metadata: ~
- add_docker_metadata: ~

output.console:
  pretty: true
The output shows:
INFO input/input.go:114 Starting input of type: kafka; ID: 14409252276502564738
INFO kafka/log.go:53 kafka message: Initializing new client
INFO kafka/log.go:53 client/metadata fetching metadata for all topics from broker the.kafka.server.com:9092
INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
INFO cfgfile/reload.go:171 Config reloader started
INFO cfgfile/reload.go:226 Loading of config files completed.
INFO kafka/log.go:53 kafka message: Successful SASL handshake. Available mechanisms: %!(EXTRA []string=[PLAIN OAUTHBEARER])
INFO kafka/log.go:53 Failed to read response while authenticating with SASL to broker the.kafka.server.com:9092: EOF
INFO kafka/log.go:53 Closed connection to broker the.kafka.server.com:9092
INFO kafka/log.go:53 client/metadata got error from broker -1 while fetching metadata: EOF
I'm not sure what's going on here. I also tried adding compression: none, which didn't help, and I verified with openssl that the server certificate validates. What am I doing wrong? The kafka server in question is cloud-hosted; I can't see the server configuration, and I got the "connection string" from kafka's cloud UI.
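As a side note for anyone debugging a similar setup, the credentials can be checked independently of Filebeat with kcat (kafkacat), which speaks SASL/PLAIN via librdkafka. This is a sketch, assuming kcat is installed; the broker address and credentials are the placeholders from the config above:

```shell
# Sketch: ask the broker for cluster metadata over SASL_SSL with PLAIN auth,
# independent of Filebeat (broker/username/password are placeholders).
kcat -b the.kafka.server.com:9092 \
     -X security.protocol=SASL_SSL \
     -X sasl.mechanisms=PLAIN \
     -X sasl.username='my_username' \
     -X sasl.password='my_password' \
     -L
```

If this also fails to authenticate, the problem is with the credentials rather than the Filebeat configuration.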
Answer 1
I found the problem: the $ConnectionString syntax doesn't work with Confluent Cloud kafka clusters. The correct syntax is as follows:
filebeat.inputs:
- type: kafka
  hosts: 'the.kafka.server.com:9092'
  topics: 'my_topic'
  group_id: 'my_group'
  ssl.enabled: yes
  username: <API KEY>
  password: <API SECRET>
That was enough for it to connect and consume.
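For reference, a slightly fuller sketch of the working input. The placeholders stand in for the API key and secret from the Confluent Cloud UI, and `hosts`/`topics` are written as lists, which is the form the Filebeat reference docs use (single strings also work):

```yaml
filebeat.inputs:
- type: kafka
  hosts:
    - 'the.kafka.server.com:9092'
  topics:
    - 'my_topic'
  group_id: 'my_group'
  ssl.enabled: true
  # Confluent Cloud takes the API key/secret directly as SASL/PLAIN
  # credentials -- not a JAAS "connection string".
  username: '<API KEY>'
  password: '<API SECRET>'
```

The key point is that the JAAS-style `PlainLoginModule` string belongs in Java client configs; Filebeat's kafka input just wants the raw username and password.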