Create opensearch.py #4028

Open · wants to merge 2 commits into main
Changes from 1 commit
65 changes: 65 additions & 0 deletions sos/report/plugins/opensearch.py
@@ -0,0 +1,65 @@
# Copyright (C) 2025 Henry AlOudaimy <[email protected]>
#
# This file is part of the sos project: https://github.com/sosreport/sos
#
# This copyrighted material is made available to anyone wishing to use,
# modify, copy, or redistribute it subject to the terms and conditions of
# version 2 of the GNU General Public License.
#
# See the LICENSE file in the source distribution for further information.

import re
from sos.report.plugins import Plugin, IndependentPlugin


class OpenSearch(Plugin, IndependentPlugin):

    short_desc = 'OpenSearch service'
    plugin_name = 'opensearch'
    profiles = ('services', )

    packages = ('opensearch',)

Member:

Is there a service we should monitor? If so, it may be worth adding:

    services = ('opensearch',)

Author:

I added this service.

Member:

Thank you!
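
The diff shown below is only the first commit, so the services tuple discussed above does not appear in it. A minimal sketch of how the class could look once that suggestion is applied (the exact placement and wording in the follow-up commit is an assumption, not taken from the PR):

    from sos.report.plugins import Plugin, IndependentPlugin


    class OpenSearch(Plugin, IndependentPlugin):

        short_desc = 'OpenSearch service'
        plugin_name = 'opensearch'
        profiles = ('services', )
        packages = ('opensearch',)
        # Suggested in review: also lets sos enable the plugin based on
        # the presence of the opensearch service.
        services = ('opensearch',)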

    def get_hostname_port(self, opensearch_config_file):
        """ Get hostname and port number """
        hostname = "localhost"
        port = "9200"
        try:
            with open(opensearch_config_file, encoding='UTF-8') as fread:
                for line in fread:
                    network_host = re.search(r'(^network.host):(.*)', line)
                    network_port = re.search(r'(^http.port):(.*)', line)
                    if network_host and len(network_host.groups()) == 2:
                        hostname = network_host.groups()[-1].strip()
                        hostname = re.sub(r'"|\'', '', hostname)
                        continue
                    if network_port and len(network_port.groups()) == 2:
                        port = network_port.groups()[-1].strip()
        except Exception as err:  # pylint: disable=broad-except
            self._log_info(f"Failed to parse {opensearch_config_file}: {err}")
        return hostname, port
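
    # (Editorial sketch, not part of the diff.) Hypothetical opensearch.yml
    # entries that the parsing above would match:
    #
    #   network.host: 192.168.0.10
    #   http.port: 9201
    #
    # For these lines get_hostname_port() would return ("192.168.0.10", "9201");
    # with neither key present it falls back to ("localhost", "9200").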

    def setup(self):
        opensearch_config_file = self.path_join(
            "/etc/opensearch/opensearch.yml"
        )
Comment on lines +44 to +46

Contributor:

Why not simply opensearch_config_file = "/etc/opensearch/opensearch.yml"? Is it because you open the file in get_hostname_port and the absolute filename is different inside a container (due to a changed sysroot)?

Author:

This reference path can be resolved relative to a different sysroot (rather than hardcoding the full path). Also, this is almost a replica of the elasticsearch.py plugin, which works much the same way for OpenSearch (except for the charmed versions, which might require customizations).
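
For reference, self.path_join() resolves the given path against the sysroot that sos is inspecting, which is why it is preferred over a hardcoded string here. An illustrative sketch (the /host prefix below is a hypothetical sysroot, not something configured by this PR):

    # Sketch only: how the same call resolves under different sysroots.
    #
    #   sysroot "/"      -> self.path_join("/etc/opensearch/opensearch.yml")
    #                       == "/etc/opensearch/opensearch.yml"
    #   sysroot "/host"  -> self.path_join("/etc/opensearch/opensearch.yml")
    #                       == "/host/etc/opensearch/opensearch.yml"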

        self.add_copy_spec(opensearch_config_file)

        if self.get_option("all_logs"):
            self.add_copy_spec("/var/log/opensearch/*")
        else:
            self.add_copy_spec("/var/log/opensearch/*.log")

        host, port = self.get_hostname_port(opensearch_config_file)
        endpoint = host + ":" + port
        self.add_cmd_output([
            f"curl -X GET '{endpoint}/_cluster/settings?pretty'",
            f"curl -X GET '{endpoint}/_cluster/health?pretty'",
            f"curl -X GET '{endpoint}/_cluster/stats?pretty'",
            f"curl -X GET '{endpoint}/_cat/nodes?v'",
            f"curl -X GET '{endpoint}/_cat/indices'",
            f"curl -X GET '{endpoint}/_cat/shards'",
            f"curl -X GET '{endpoint}/_cat/aliases'",
        ])
Comment on lines +56 to +64

Contributor:

How robust is it to invoke so many curl commands? If there were a timeout problem with the (local) peer - something that can happen, since sos report is called to diagnose all kinds of problems - then these curls could run for a long time until the commands or the plugin time out.

Is that acceptable behaviour? Isn't it worth decreasing the command timeout to limit this possible negative impact? (i.e. if you know a curl command usually finishes in a few seconds, set the timeout to, say, 30s.)

Author:

I believe the curl calls shouldn't take long (they are simple queries), plus this is a replica of the elasticsearch.py plugin, which works fine, so I believe there should be no issue with these APIs.
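
If capping the runtime of these queries does become desirable, one option is to pass a per-command timeout to add_cmd_output(). A minimal sketch, assuming the timeout keyword (in seconds) that sos plugins can pass to add_cmd_output(), and showing only two of the curl calls for brevity:

    def setup(self):
        opensearch_config_file = self.path_join("/etc/opensearch/opensearch.yml")
        self.add_copy_spec(opensearch_config_file)

        host, port = self.get_hostname_port(opensearch_config_file)
        endpoint = host + ":" + port
        # Cap each curl call at 30 seconds rather than the plugin-wide default.
        self.add_cmd_output([
            f"curl -X GET '{endpoint}/_cluster/health?pretty'",
            f"curl -X GET '{endpoint}/_cat/nodes?v'",
        ], timeout=30)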


# vim: set et ts=4 sw=4 :