Hello there!
Our system generates a large volume of logs, each containing many unused fields. If we use the following json filter:

```
filter {
  json {
    source => "message"
    target => "parsed"
    remove_field => ["message"]
  }
}
```
then every field, including the unused ones, is unmarshaled into key/value pairs, which incurs significant CPU cost.
Is there any way to achieve something like this:

```
json {
  parse_keys => ["@timestamp", "message", "k8s_pod_namespace", "k8s_pod"]
  source => "message"
  target => "parsed"
  remove_field => ["message"]
}
```
This way, only the specified keys would be parsed, which could reduce CPU usage significantly. We believe this feature would be extremely beneficial for optimizing performance in environments with large log volumes.
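For illustration, here is a minimal Python sketch of the semantics we have in mind. The function name and `parse_keys` parameter are hypothetical; this version still parses the full document and then discards unwanted keys, whereas a real implementation inside the plugin could skip unwanted keys at tokenization time to save CPU:

```python
import json

def parse_selected_keys(message, parse_keys):
    # Hypothetical helper: parse the JSON payload, then keep only
    # whitelisted keys. A native implementation would ideally skip
    # unwanted keys during tokenization instead of building the full
    # object first.
    parsed = json.loads(message)
    return {k: v for k, v in parsed.items() if k in parse_keys}

event = '{"@timestamp": "2024-01-01T00:00:00Z", "message": "hi", "k8s_pod": "web-1", "unused": "x"}'
result = parse_selected_keys(
    event, {"@timestamp", "message", "k8s_pod_namespace", "k8s_pod"}
)
print(result)  # "unused" is dropped; missing keys are simply absent
```

Keys listed in `parse_keys` but absent from the event (like `k8s_pod_namespace` above) would simply not appear in the output, mirroring how the json filter handles missing fields today.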
Thank you for considering this request!