docs/docs/installation.md (7 additions, 7 deletions)
@@ -41,7 +41,7 @@ following this you can call any of the SDK methods and the client will performan
- `NoAPIKeyProvidedError` - If the API key is not set.
- `ValueError` - If the API key does not conform to the expected format (which looks like eight tokens separated by `-`).
-#### Using TLS Certificates
+#### Using TLS certificates
Command Centre optionally allows you to use self-signed client-side TLS certificates for authentication. You can use this alongside your API key as an additional layer of security.
@@ -74,7 +74,7 @@ The rest of the requests and operations remain the same, the library will use an
> Our test suites are configured to run with and without TLS certificates to ensure that we support both modes of operation.
-In instances (such as Github actions, where we store the certificate and key in the Github secrets manager) where you can't store the certficiate and key in the filesystem, you can use Python's `tempfile` module to create temporary files and clean up once you are done using them.
+In instances (such as GitHub Actions, where we store the certificate and key in the GitHub secrets manager) where you can't store the certificate and key in the filesystem, you can use Python's `tempfile` module to create temporary files and clean up once you are done using them.
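As a rough sketch of that pattern, assuming the certificate and key are available as environment variables; the client attribute names used at the end are assumptions, not the SDK's documented API:

```python
import os
import tempfile

from gallagher import cc  # assumed import path for the client configuration object

# Write the PEM contents held in environment variables (e.g. populated from
# GitHub secrets) out to temporary files on disk
with tempfile.NamedTemporaryFile(mode="w", suffix=".pem", delete=False) as cert_file, \
     tempfile.NamedTemporaryFile(mode="w", suffix=".pem", delete=False) as key_file:
    cert_file.write(os.environ["CC_TLS_CERTIFICATE"])
    key_file.write(os.environ["CC_TLS_PRIVATE_KEY"])

try:
    # Hypothetical attribute names -- check the SDK reference for the real ones
    cc.file_tls_certificate = cert_file.name
    cc.file_private_key = key_file.name
    # ... make your SDK calls here ...
finally:
    # Clean up the temporary files once you are done using them
    os.unlink(cert_file.name)
    os.unlink(key_file.name)
```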
This library uses [httpx](https://www.python-httpx.org) as the HTTP transport and [pydantic](https://pydantic.dev) to construct and ingest payloads. We use [taskfile](https://taskfile.dev) to run tasks. Our test suite is set up using `pytest`.
@@ -140,7 +140,7 @@ Some of the `task` targets take parameters e.g.
`task test` will run the entire test suite, while `task test -- test_cardholder.py` will run only the tests in `test_cardholder.py`.
-### Building the Docs
+### Building the docs
The documentation is built using [mkdocs](https://www.mkdocs.org) and hosted on [GitHub Pages](https://anomaly.github.io/gallagher/). The project repository is configured to build and publish the documentation on every commit to the `master` branch.
docs/docs/python-sdk.md (34 additions, 6 deletions)
@@ -26,7 +26,7 @@ poetry add gallagher
For production applications, please make sure you target a particular version of the API client to avoid breaking changes.
-## Data Transfer Objects (DTO) Premiere
+## Data Transfer Objects (DTO) premiere
The Data Transfer Objects or DTOs are the centrepiece of the Python SDK. These are built using the much loved [pydantic](https://pydantic.dev) library. The aim is strict validation of responses and request payloads to ensure that the SDK never falls out of line with Gallagher's REST API.
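As an illustration only (the class and field names below are invented, not the SDK's actual DTOs), a pydantic model enforcing that kind of strict validation looks roughly like this:

```python
from pydantic import BaseModel, HttpUrl


class HrefMixin(BaseModel):
    # Command Centre responses reference related objects via href
    href: HttpUrl


class CardholderSummaryExample(HrefMixin):
    # Illustrative fields; the real DTOs ship with the SDK package
    id: str
    first_name: str
    last_name: str
    authorised: bool = False


# pydantic raises a ValidationError if the payload drifts from this shape
payload = {
    "href": "https://commandcentre.example.com/api/cardholders/325",
    "id": "325",
    "first_name": "Ada",
    "last_name": "Lovelace",
}
cardholder = CardholderSummaryExample(**payload)
```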
@@ -47,7 +47,7 @@ In addition to DTOs, you will see a number of :
If you are fetching a `detail` then it is returned on its own as part of the response. It typically contains `href` references to related objects.
-## API Endpoint Lifecycle
+## API endpoint lifecycle
You do not need to look under the hood to work with the API client. This section was written to help you understand how we implement Gallagher's requirements for standards-based development. Each endpoint inherits from a base class called `APIEndpoint` defined in `gallagher/cc/core.py` and provides a configuration that describes the behaviour of the endpoint (in accordance with the Command Centre API).
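Purely as a sketch of that pattern (the `EndpointConfig` name, its fields, and the subclass shown are hypothetical stand-ins, not the library's actual internals), an endpoint declaring its own configuration might look like:

```python
from dataclasses import dataclass


# Hypothetical stand-ins for the SDK's internal configuration types
@dataclass
class EndpointConfig:
    endpoint: str               # path for the resource on the Command Centre API
    dto_list: type | None       # DTO used to parse list/summary responses
    dto_retrieve: type | None   # DTO used to parse detail responses


class APIEndpointSketch:
    """Each concrete endpoint declares its configuration once."""

    @classmethod
    def get_config(cls) -> EndpointConfig:
        raise NotImplementedError


class CardholderEndpointSketch(APIEndpointSketch):
    @classmethod
    def get_config(cls) -> EndpointConfig:
        return EndpointConfig(
            endpoint="cardholders",
            dto_list=None,       # e.g. a summary response DTO
            dto_retrieve=None,   # e.g. a detail DTO
        )
```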
In cases where you are targeting a local Command Centre, you can set the `api_base` to the FQDN or IP address of the Command Centre that's locally accessible on the network.
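For example, with placeholder values and an assumed import path for the `cc` configuration object:

```python
from gallagher import cc  # assumed import path

# Point the client at a locally reachable Command Centre; values are illustrative
cc.api_base = "https://192.168.1.50:8904/api"
cc.api_key = "AAAA-BBBB-CCCC-DDDD-EEEE-FFFF-GGGG-HHHH"  # eight '-' separated tokens
```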
-### Proxy Support
+### Proxy support
Thanks to `httpx`, we have proxy support built in out of the box. By default the `proxy` attribute is set to `None`, indicating that one isn't in use. If you wish to use a proxy for your use case, simply set the `proxy` attribute on the `cc` object like you would the `api_base` or `api_key`.
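A minimal sketch, with an illustrative proxy URL and the same assumed import path as above:

```python
from gallagher import cc  # assumed import path

# Route SDK traffic through a proxy; the URL below is illustrative
cc.proxy = "http://proxy.internal.example:3128"
```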
@@ -270,7 +270,7 @@ while items_summary.next:
determined from the response object. This ensures that we can update the SDK as the API changes
leaving your code intact.
-#Updates and Changes
+## Follow for changes
Entities like `Cardholders`, `Alarms`, `Items`, and `Event` provide `updates` or `changes` endpoints that can be monitored for updates. Essentially these are long-poll endpoints that:
As a breaking change in `8.90`, the operator must have the 'Create Events and Alarms' privilege in the division of the source item, if your request specifies a source item. Current versions only require that the operator has that privilege on at least one division.
## Error Handling
### Exceptions
@@ -363,6 +367,7 @@ Personal Data Definitions are fields associated to a cardholder and are defined
- children of the `personalDataFields` key in the cardholder detail
- accessible via the key name prefixed with the `@` symbol, i.e. the personal data field `Email` is accessible via the key `@Email`
!!! tip
Note that `personalDataFields` is a `list` of objects, and each object has a single key which is the name of the personal data field, with the value being the related data.
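For illustration only (the field names and values below are invented), the shape described in the tip above looks something like this once the payload is loaded into Python:

```python
# Invented example of the personalDataFields portion of a cardholder detail
personal_data_fields = [
    {"@Email": {"value": "ada@example.com"}},
    {"@City": {"value": "Sydney"}},
    {"@Cardholder UID": {"value": "A-1042"}},
]

# Each object carries exactly one key: the field name prefixed with '@'
for field in personal_data_fields:
    for name, contents in field.items():
        print(name, contents["value"])
```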
@@ -395,10 +400,33 @@ and we had used the API client to fetch the cardholder detail (partial example):
cardholder = await Cardholder.retrieve(340)
```
-you could access the `Email` field either via iterating over `cardholder.personal_data_definitions` and looking to match the `key` attribute of the object to `@Email` or using the parsed shortcut `cardholder.pdf.email`.
+`cardholder` would have two fields:
+- `personal_data_definitions` which is a list of `CardholderPersonalDataDefinition` objects
+- `pdf` which is a parsed object of the personal data fields
-The above is achieved by dynamically populating a placeholder object with dynamically generated keys. These are parsed and populate _once_ when the object has successfully parsed the `JSON` payload.
+`cardholder.personal_data_definitions` is iterable, with each instance exposing `name` and `contents` fields. Use the `value` attribute of `contents` to access the PDF value:
+```python
+for pdf in cardholder.personal_data_definitions:
+    if pdf.name == '@Email':
+        print(pdf.name, pdf.contents.value)
+```
!!! tip
See pydantic's [Model validator](https://docs.pydantic.dev/latest/concepts/validators/#model-validators) feature in v2, in particular the `@model_validator(mode='after')` decorator.
+The `cardholder` object will also expose a special attribute called `pdf`. Each instance available in the `personal_data_definitions` field is mapped to a Pythonic `snake_cased` key that lets you access the same `CardholderPersonalDataDefinition` object you would otherwise look up via the `@` prefixed key name. So the above example of accessing the `@Email` field can be done as follows:
+```python
+cardholder.pdf.email.value
+```
+The `pdf` attribute is a dynamically populated object with dynamically generated keys. Here are some examples of how PDF field names are mapped to `snake_case` keys:
+- `@Cardholder UID` would become `pdf.cardholder_uid`
+- `@City` would become `pdf.city`
+- `@Company Name` would become `pdf.company_name`
+- `@PINNumber` would become `pdf.pin_number`
+Both approaches have their merits and you should use the one that suits your use case.