Copyright 2016-2018 Distributed Management Task Force, Inc. All rights reserved.

# Redfish Service Conformance Check Tool
This tool checks an operational Redfish Service to see that it conforms to the normative statements from the Redfish specification (see the assertions in redfish-assertions.xlxs). Assertion coverage is growing (development is in progress), and future revisions of the tool will increase coverage of the Assertions. To see which Assertions are covered by a given revision, run the tool and look at the markup in the SUT's copy of the xlxs file after the check is run.

This program has been tested with Python 2.7.10 and 3.4.3.
## Installation and Invocation
1. Install one of the Python versions noted as 'verified' (any 2.7+ or 3.4+ "should" work, but the current release was checked only with the 'Verified/operational' ones).
2. This tool imports `openpyxl`, `requests` and `beautifulsoup4`, which are not installed by default in python. You will need to install them using 'pip'. Execute the following command:
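   A typical command (assuming `pip` is available for the Python installation you chose) is:

   `C:\rf_client_dir> pip install openpyxl requests beautifulsoup4`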
- Set the parameters for Event Subscription and related Test Event generation. Note that the Event-related assertions do not verify that a Test Event is actually delivered to the "Destination" you specify; they only create a Subscription and request that the Service issue a Test Event to the Subscription "Destination" using the Test Event parameters you set here.
5. For operational results, open a DOS box, cd to the directory where you placed the files included with this package (for example, C:\rf_client_dir), and then run rf_client.py. (Make sure openpyxl is installed for this version of Python, or the tool will error out.)

   `C:\rf_client_dir> python rf_client.py`
6. Check results:
- rf_client.py logs results to rf-assertions-log.txt (appended) and creates a \<timestamp\>_rf-assertions-run.xlxs file under the script_dir/logs/\<DisplayName\>/ folder.
- The text log is appended across all test runs for SUT \<DisplayName\>, but a new xlxs file is created each time assertions are run for \<DisplayName\>.
- For example, if properties.json has SUTs['DisplayName'] set to "Contoso_server1", then "logs/Contoso_server1/" will be created and a "\<timestamp\>_rf-assertions-run.xlxs" file will be created each time you run rf_client.py with "Contoso_server1" configured in properties.json.
- Red/Yellow/Green = Fail/Warn/Pass.
- Assertions that are not covered by the check are not color-marked in the xlxs file.
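The bullets above describe where the results end up; the following is a rough sketch of how those paths fit together. This is an illustration only, not the tool's own code, and the properties.json key layout is assumed from the SUTs['DisplayName'] reference above.

```python
# Sketch only: compose the per-SUT log locations described above.
# The properties.json layout is assumed; adjust to match your copy.
import json
import os
import time

with open("properties.json") as f:
    props = json.load(f)

display_name = props["SUTs"]["DisplayName"]     # e.g. "Contoso_server1" (key layout assumed)

timestamp = time.strftime("%Y%m%d_%H%M%S")
log_dir = os.path.join("logs", display_name)    # script_dir/logs/<DisplayName>/
xlsx_report = os.path.join(log_dir, timestamp + "_rf-assertions-run.xlxs")  # new file each run
text_log = "rf-assertions-log.txt"              # appended across runs

print("per-run report:", xlsx_report)
print("appended log:  ", text_log)
```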
## Work in progress items/limitations:
1. Work-in-progress items are annotated with either 'WIP' or 'todo'. They don't affect the completed portion of the tool, which should run successfully.
2. The current implementation for schemas found in a local directory (or retrieved remotely) does not guarantee that the SUT service is using the same version of the schema files; this is a WIP for the tool. To get correct results, make sure the schema file versions found in $metadata for the SUT match the versions of the files in the directory.
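A minimal sketch of the manual version check suggested in item 2, assuming a hypothetical SUT address and relying on the edmx/CSDL structure of a Redfish $metadata document; this is not part of the tool.

```python
# Sketch only: list the schema namespaces the SUT advertises in $metadata so
# they can be compared against the versions of the local schema files.
# The SUT address below is a placeholder.
import requests
import xml.etree.ElementTree as ET

BASE_URL = "https://192.0.2.10"          # hypothetical SUT address
EDMX = "{http://docs.oasis-open.org/odata/ns/edmx}"

resp = requests.get(BASE_URL + "/redfish/v1/$metadata", verify=False)
root = ET.fromstring(resp.text)

# Each edmx:Reference points at a schema file; its edmx:Include entries carry
# the versioned namespaces, e.g. "ServiceRoot.v1_1_0".
for reference in root.iter(EDMX + "Reference"):
    uri = reference.get("Uri")
    for include in reference.findall(EDMX + "Include"):
        print(include.get("Namespace"), "<-", uri)
```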