
The file descriptor has not been released #18015

@xiaoertong

Description


Logstash information:

Please include the following information:

  1. Logstash version: 8.3.3
  2. Logstash installation source: docker
  3. How is Logstash being run: kubernetes

Plugins installed: (bin/logstash-plugin list --verbose)
logstash-codec-avro (3.4.0)
logstash-codec-cef (6.2.5)
logstash-codec-collectd (3.1.0)
logstash-codec-dots (3.0.6)
logstash-codec-edn (3.1.0)
logstash-codec-edn_lines (3.1.0)
logstash-codec-es_bulk (3.1.0)
logstash-codec-fluent (3.4.1)
logstash-codec-graphite (3.0.6)
logstash-codec-json (3.1.0)
logstash-codec-json_lines (3.1.0)
logstash-codec-line (3.1.1)
logstash-codec-msgpack (3.1.0)
logstash-codec-multiline (3.1.1)
logstash-codec-netflow (4.2.2)
logstash-codec-plain (3.1.0)
logstash-codec-rubydebug (3.1.0)
logstash-filter-aggregate (2.10.0)
logstash-filter-anonymize (3.0.6)
logstash-filter-cidr (3.1.3)
logstash-filter-clone (4.2.0)
logstash-filter-csv (3.1.1)
logstash-filter-date (3.1.15)
logstash-filter-de_dot (1.0.4)
logstash-filter-dissect (1.2.5)
logstash-filter-dns (3.1.5)
logstash-filter-drop (3.0.5)
logstash-filter-elasticsearch (3.12.0)
logstash-filter-fingerprint (3.4.1)
logstash-filter-geoip (7.2.12)
logstash-filter-grok (4.4.2)
logstash-filter-http (1.4.1)
logstash-filter-json (3.2.0)
logstash-filter-kv (4.7.0)
logstash-filter-memcached (1.1.0)
logstash-filter-metrics (4.0.7)
logstash-filter-mutate (3.5.6)
logstash-filter-prune (3.0.4)
logstash-filter-ruby (3.1.8)
logstash-filter-sleep (3.0.7)
logstash-filter-split (3.1.8)
logstash-filter-syslog_pri (3.1.1)
logstash-filter-throttle (4.0.4)
logstash-filter-translate (3.3.1)
logstash-filter-truncate (1.0.5)
logstash-filter-urldecode (3.0.6)
logstash-filter-useragent (3.3.3)
logstash-filter-uuid (3.0.5)
logstash-filter-xml (4.1.3)
logstash-input-azure_event_hubs (1.4.4)
logstash-input-beats (6.4.0)
└── logstash-input-elastic_agent (alias)
logstash-input-couchdb_changes (3.1.6)
logstash-input-dead_letter_queue (1.1.12)
logstash-input-elasticsearch (4.14.0)
logstash-input-exec (3.4.0)
logstash-input-file (4.4.3)
logstash-input-ganglia (3.1.4)
logstash-input-gelf (3.3.1)
logstash-input-generator (3.1.0)
logstash-input-graphite (3.0.6)
logstash-input-heartbeat (3.1.1)
logstash-input-http (3.6.0)
logstash-input-http_poller (5.3.1)
logstash-input-imap (3.2.0)
logstash-input-jms (3.2.2)
logstash-input-pipe (3.1.0)
logstash-input-redis (3.7.0)
logstash-input-s3 (3.8.4)
logstash-input-snmp (1.3.1)
logstash-input-snmptrap (3.1.0)
logstash-input-sqs (3.3.2)
logstash-input-stdin (3.4.0)
logstash-input-syslog (3.6.0)
logstash-input-tcp (6.3.0)
logstash-input-twitter (4.1.0)
logstash-input-udp (3.5.0)
logstash-input-unix (3.1.1)
logstash-integration-elastic_enterprise_search (2.2.1)
├── logstash-output-elastic_app_search
└── logstash-output-elastic_workplace_search
logstash-integration-jdbc (5.3.0)
├── logstash-input-jdbc
├── logstash-filter-jdbc_streaming
└── logstash-filter-jdbc_static
logstash-integration-kafka (10.12.0)
├── logstash-input-kafka
└── logstash-output-kafka
logstash-integration-rabbitmq (7.3.0)
├── logstash-input-rabbitmq
└── logstash-output-rabbitmq
logstash-output-cloudwatch (3.0.10)
logstash-output-csv (3.0.8)
logstash-output-elasticsearch (11.6.0)
logstash-output-email (4.1.1)
logstash-output-file (4.3.0)
logstash-output-graphite (3.1.6)
logstash-output-http (5.5.0)
logstash-output-lumberjack (3.1.9)
logstash-output-nagios (3.0.6)
logstash-output-null (3.0.5)
logstash-output-pipe (3.0.6)
logstash-output-redis (5.0.0)
logstash-output-s3 (4.3.7)
logstash-output-sns (4.0.8)
logstash-output-sqs (6.0.0)
logstash-output-stdout (3.1.4)
logstash-output-tcp (6.1.0)
logstash-output-udp (3.2.0)
logstash-output-webhdfs (3.0.6)
logstash-patterns-core (4.3.4)
JVM (e.g. java -version):

OS version (uname -a if on a Unix-like system):
Linux log-collect-prqfn 3.10.0-1160.el7.x86_64 #1 SMP Mon Oct 19 16:18:59 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
Description of the problem including expected versus actual behavior:
Hi. After a file is no longer being written to, its file handle should be released once `close_older` elapses, but the handle is never released. Only one error was found in the log.

Steps to reproduce:

Please include a minimal but complete recreation of the problem,
including (e.g.) pipeline definition(s), settings, locale, etc. The easier
you make it for us to reproduce the issue, the more likely it is that somebody
will take the time to look at it.

  1. My conf looks like:

    input {
      file {
        path => ["/host/var/lib/kubelet/pods/abc7a28a-f5c9-4fb8-bae9-a7e65b5886c8/volume-subpaths/logs/flink-main-container/0/flink-*"]
        start_position => "end"
        close_older => 150
        codec => "plain"
      }
    }
    filter {
      mutate {
        add_field => {
          "k8s_pod" => "xxxxxxxxxxx"
          "k8s_pod_namespace" => "xxxxxx"
          "k8s_container_name" => "xxxxx"
          "k8s_hostname" => "xxxxxx"
          "k8s_hostip" => "xxxxxx"
          "svc" => "xxxxxxx"
          "kafka_storage_type" => "hdfs"
        }
        remove_field => ["event"]
      }
      ruby {
        code => '
          begin
            # Compute the event size
            event_json = event.to_json
            size = event_json.bytesize
            if size > 921600 # 900KB
              logger.warn("Dropping large event",
                :size => size,
                :pod => event.get("k8s_pod"),
                :namespace => event.get("k8s_pod_namespace"),
                :message_preview => event.get("message").to_s[0..100]
              )
              event.cancel
            end
          rescue => e
            logger.error("Error processing event", :error => e.message)
            event.cancel
          end
        '
      }
    }

    output {
      kafka {
        bootstrap_servers => "xxxxxxx"
        security_protocol => "SASL_PLAINTEXT"
        sasl_mechanism => "PLAIN"
        jaas_path => "/root/kafka-client-jaas.conf"
        topic_id => "xxxxxxx"
        codec => "json"
        message_key => "%{k8s_pod}"
      }
    }
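For reference, the size check used in the ruby filter above can be exercised outside Logstash. This is a minimal sketch of the same logic with the 900 KB threshold, where a plain hash stands in for the Logstash event (an assumption for illustration only):

```ruby
require 'json'

# Threshold taken from the filter above: 900 KB.
MAX_EVENT_BYTES = 921_600

# Returns true when the serialized event exceeds the threshold
# and should be dropped (mirrors the event.cancel branch).
def oversized?(event)
  event.to_json.bytesize > MAX_EVENT_BYTES
end

small = { "message" => "ok" }
large = { "message" => "x" * 1_000_000 }

puts oversized?(small) # false
puts oversized?(large) # true
```

Note that `bytesize` is measured on the JSON string, so multi-byte UTF-8 characters count per byte, not per character, just as in the filter.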

Provide logs (if relevant):
[2025-08-20T07:29:56,033][ERROR][logstash.javapipeline ][229cab5dae63d044eea800f9cdd532291af76c28d15c369c99010d958bd7332a-2214205829][61c74adac2d4809f7194641ef12fd982dd0bef5e0e09308f468e672594eb8978] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:229cab5dae63d044eea800f9cdd532291af76c28d15c369c99010d958bd7332a-2214205829
Plugin: <LogStash::Inputs::File start_position=>"end", path=>["/host/var/lib/kubelet/pods/e483ca96-6ee2-4f47-a274-2989ef7436f5/volume-subpaths/logs/flink-main-container/0/flink-*"], codec=><LogStash::Codecs::Plain id=>"plain_a269136c-75a0-4598-a3e3-b9604fa19675", enable_metric=>true, charset=>"UTF-8">, id=>"61c74adac2d4809f7194641ef12fd982dd0bef5e0e09308f468e672594eb8978", close_older=>150.0, enable_metric=>true, stat_interval=>1.0, discover_interval=>15, sincedb_write_interval=>15.0, delimiter=>"\n", mode=>"tail", file_completed_action=>"delete", sincedb_clean_after=>1209600.0, file_chunk_size=>32768, file_chunk_count=>140737488355327, file_sort_by=>"last_modified", file_sort_direction=>"asc", exit_after_read=>false, check_archive_validity=>false>
Error: undefined method `sysseek' for nil:NilClass
Exception: NoMethodError
Stack:
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.4.3/lib/filewatch/watched_file.rb:224:in `file_seek'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.4.3/lib/filewatch/tail_mode/handlers/shrink.rb:7:in `handle_specifically'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.4.3/lib/filewatch/tail_mode/handlers/base.rb:25:in `handle'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.4.3/lib/filewatch/tail_mode/processor.rb:47:in `shrink'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.4.3/lib/filewatch/tail_mode/processor.rb:241:in `block in process_active'
org/jruby/RubyArray.java:1821:in `each'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.4.3/lib/filewatch/tail_mode/processor.rb:228:in `process_active'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.4.3/lib/filewatch/tail_mode/processor.rb:75:in `process_all_states'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.4.3/lib/filewatch/watch.rb:70:in `iterate_on_state'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.4.3/lib/filewatch/watch.rb:44:in `subscribe'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.4.3/lib/filewatch/observing_tail.rb:12:in `subscribe'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-file-4.4.3/lib/logstash/inputs/file.rb:370:in `run'
/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:410:in `inputworker'
/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:401:in `block in start_input'
[2025-08-20T07:29:57,034][INFO ][filewatch.observingtail ][229cab5dae63d044eea800f9cdd532291af76c28d15c369c99010d958bd7332a-2214205829][61c74adac2d4809f7194641ef12fd982dd0bef5e0e09308f468e672594eb8978] QUIT - closing all files and shutting down.
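The stack trace suggests the shrink handler calls `sysseek` through a file handle that is already `nil` (i.e. the handle was closed before the shrink path ran). The sketch below only illustrates that failure mode and a possible nil guard; the class and method names are hypothetical stand-ins, not logstash-input-file's actual internals:

```ruby
# Minimal illustration of the failure mode in the stack trace:
# calling sysseek through a nil IO raises NoMethodError.
# WatchedFile, file_seek and safe_file_seek here are hypothetical
# stand-ins, not the plugin's real code.
class WatchedFile
  def initialize
    @file = nil # handle already closed (e.g. after close_older fired)
  end

  def file_seek(pos)
    # Unguarded: reproduces "undefined method `sysseek' for nil"
    @file.sysseek(pos, IO::SEEK_SET)
  end

  def safe_file_seek(pos)
    # Guarded variant: skip the seek when the handle is gone
    return if @file.nil?
    @file.sysseek(pos, IO::SEEK_SET)
  end
end

wf = WatchedFile.new
begin
  wf.file_seek(0)
rescue NoMethodError => e
  puts e.message # prints a NoMethodError message like the one in the log
end
wf.safe_file_seek(0) # no error raised
```

If this reading is right, the unrescued NoMethodError kills the input worker, the plugin restarts, and any handles it was holding may never go through the normal close path, which would match the leaked-descriptor symptom.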
