Describe the bug
I am trying to reference an environment variable for the password in fluentd, as shown below:
rdkafka_options {
"log_level" : 7,
"sasl.mechanism" : "SCRAM-SHA-512",
"security.protocol" : "sasl_ssl",
"sasl.username" : "<user>",
"sasl.password" : "#{ENV['KAFKA_PASS']}"
}
However, the variable is not expanded and authentication fails with an "Authentication failure" error, even though I can see the variable inside the fluentd container's environment. It only works if I hardcode the password in the config file, which I don't want to do.
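For reference, the `"#{...}"` syntax is plain embedded Ruby string interpolation, so the expected substitution can be sketched as follows (the password value is illustrative):

```ruby
# Minimal sketch of the substitution Fluentd's embedded Ruby should
# perform for "#{ENV['KAFKA_PASS']}"; the value is illustrative.
ENV['KAFKA_PASS'] = 's3cret'
expanded = "#{ENV['KAFKA_PASS']}"
puts expanded  # prints "s3cret"
```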
To Reproduce
Create the environment variable KAFKA_PASS, then try to use it inside the config:
rdkafka_options {
"log_level" : 7,
"sasl.mechanism" : "SCRAM-SHA-512",
"security.protocol" : "sasl_ssl",
"sasl.username" : "<user>",
"sasl.password" : "#{ENV['KAFKA_PASS']}"
}
Expected behavior
The variable should be expanded correctly.
Your Environment
- Fluentd version: 1.14.0
- TD Agent version:
- Operating system: Alpine Linux v3.13
- Kernel version: 5.15.58-flatcar

Your Configuration
<source>
@type tail
path /logs/quarkus.log
tag file.all
<parse>
@type regexp
expression /^(?<datetime>[0-9- :,]+) (?<host>[0-9a-zA-Z\-\.\+]+) (?<processname>.+?) (?<loglevel>.+) +\[(?<logger>[a-zA-Z-.]+?)\] \((?<thread>.+?)\) (?<logmessage>.+)$/
</parse>
</source>
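As a sanity check, the tail `<parse>` regexp above can be exercised directly in Ruby; the sample Quarkus-style log line below is an assumption for illustration, not taken from the issue:

```ruby
# Illustrative check of the <parse> regexp from the <source> block;
# the sample log line is an assumption, not from the issue.
pattern = /^(?<datetime>[0-9- :,]+) (?<host>[0-9a-zA-Z\-\.\+]+) (?<processname>.+?) (?<loglevel>.+) +\[(?<logger>[a-zA-Z-.]+?)\] \((?<thread>.+?)\) (?<logmessage>.+)$/
line = '2023-01-01 12:00:00,123 myhost quarkus-run.jar INFO [io.quarkus] (main) ULFFRecord: {"key":"value"}'
m = pattern.match(line)
puts m[:loglevel]    # prints "INFO"
puts m[:logmessage]  # prints 'ULFFRecord: {"key":"value"}'
```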
<match file.all>
@type rewrite_tag_filter
<rule>
key logmessage
pattern /ULFFRecord\:\ (?<ulffrecord>.+)$/
tag file.ulff
</rule>
<rule>
key logmessage
pattern /./
tag file.generic
</rule>
</match>
<filter file.ulff>
@type parser
key_name logmessage
<parse>
@type regexp
expression /^ULFFRecord\:\ (?<ulffrecord>.+)$/
</parse>
</filter>
<filter file.ulff>
@type parser
format json
key_name ulffrecord
</filter>
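The two `<filter file.ulff>` stages above amount to a two-step extraction, which can be sketched in plain Ruby (the sample payload is illustrative):

```ruby
require 'json'

# Sketch of the two filter stages: first the regexp parser pulls the
# payload out of logmessage, then the json parser decodes it.
# The sample payload is illustrative.
logmessage = 'ULFFRecord: {"orderId":42,"status":"OK"}'
payload = logmessage[/^ULFFRecord\:\ (?<ulffrecord>.+)$/, :ulffrecord]
record = JSON.parse(payload)
puts record['orderId']  # prints 42
```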
<match file.ulff>
@type rdkafka2
brokers "<broker>"
get_kafka_client_log true
default_topic ulff
flush_interval 3s
use_event_time true
rdkafka_options {
"log_level" : 7,
"sasl.mechanism" : "SCRAM-SHA-512",
"security.protocol" : "sasl_ssl",
"sasl.username" : "<user>",
"sasl.password" : "#{ENV['KAFKA_PASS']}"
}
<buffer>
flush_mode interval
flush_interval 2s
</buffer>
<format>
@type "json"
</format>
</match>
<match file.generic>
@type rdkafka2
enable_ruby
brokers "<broker>"
get_kafka_client_log true
default_topic custom
use_event_time true
rdkafka_options {
"log_level" : 7,
"sasl.mechanism" : "SCRAM-SHA-512",
"security.protocol" : "sasl_ssl",
"sasl.username" : "<user>",
"sasl.password" : "#{ENV['KAFKA_PASS']}"
}
<buffer>
flush_mode interval
flush_interval 2s
</buffer>
<format>
@type "json"
</format>
</match>
Your Error Log
```shell
SASL authentication error: Authentication failed during authentication due to invalid credentials with SASL mechanism
```
Additional context
No response