File watchers might not be handled properly, causing gradual increase in CPU/Memory usage #4381
Comments
Thanks for your report!
So, ...
@daipom In this case I had ... Do you want me to open a new issue just for tracking?
@uristernik I'd like to know if there is any difference between ... If there is no particular difference, we are fine with this for now.
We are facing the same issue. Error message: ... We are using ...
Memory keeps on gradually growing too! Any resolution on this?
@shadowshot-x Sorry for my late response. Thanks for your report.
Describe the bug
Fluentd's tail plugin was outputting `If you keep getting this message, please restart Fluentd`. After coming across #3614, we implemented the workaround suggested there: setting `follow_inodes` to `true` and `rotate_wait` to `0`.

Since then we are no longer seeing the original `If you keep getting this message, please restart Fluentd`, but we still see lots of `Skip update_watcher because watcher has been already updated by other inotify event`. This is paired with a pattern of memory leaking and a gradual increase in CPU usage until a restart occurs.

To mitigate this I added `pos_file_compaction_interval 20m` as suggested here, but this had no effect on the resource usage. Related to #3614, more specifically #3614 (comment).

The suspicion is that some watchers are not handled properly, thus leaking and increasing CPU/Memory consumption until the next restart. An illustrative config with these settings is sketched below.
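For reference, a minimal sketch of an `in_tail` source carrying the settings discussed above. The path, tag, pos_file, and parser here are placeholders assumed for a typical Kubernetes container-log setup, not the reporter's actual configuration:

```
# Illustrative in_tail source with the workaround settings discussed above.
# Paths, tag, and parser are placeholders, not the reporter's actual config.
<source>
  @type tail
  path /var/log/containers/*.log
  pos_file /var/log/fluentd-containers.log.pos
  tag kubernetes.*
  read_from_head true
  # Workaround from #3614: track files by inode across rotations
  follow_inodes true
  # Stop watching rotated files immediately
  rotate_wait 0
  # Periodically compact the pos file to drop stale entries
  pos_file_compaction_interval 20m
  <parse>
    @type none
  </parse>
</source>
```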
To Reproduce
Deploy fluentd (version v1.16.3-debian-forward-1.0) as a DaemonSet in a dynamic Kubernetes cluster. The cluster consists of 50-100 nodes. This is the fluentd config:
Expected behavior
CPU / Memory should stay stable.
Your Environment
Your Configuration
Additional context
#3614