Creates FHIR resources from Averbis Health Discovery NLP Annotations.
Set the required environment variables:

```sh
export AHD_URL=http://host.docker.internal:9999/health-discovery
export AHD_API_TOKEN=1bbd10e7a18f01fd51d03cb81d505e0c6cfdcd73b0fc98e8300592afa4a90148
export AHD_PROJECT=test
export AHD_PIPELINE=discharge
export IMAGE_TAG=latest # see https://github.com/miracum/ahd2fhir/releases for immutable tags
```

Launch the ahd2fhir service, which is exposed on port 8080 by default:

```sh
docker compose up -d
```

Send a FHIR DocumentReference to the service and receive a bundle of FHIR resources back:
```sh
curl -X POST \
  -H "Content-Type: application/fhir+json" \
  -d @tests/resources/fhir/documentreference.json \
  http://localhost:8080/fhir/\$analyze-document
```

The service supports both individual FHIR DocumentReference resources as well as Bundles of them.
You can also access the Swagger API documentation at http://localhost:8080/docs.
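The same request can also be issued from Python. The following is a minimal sketch using only the standard library; the endpoint path and content type mirror the curl example above, while the `document_reference` payload and the `build_analyze_request` helper are illustrative, not part of ahd2fhir:

```python
import json
import urllib.request

# Minimal, illustrative DocumentReference payload. A real one would carry
# the base64-encoded document text in content[0].attachment.data.
document_reference = {
    "resourceType": "DocumentReference",
    "status": "current",
    "content": [
        {"attachment": {"contentType": "text/plain", "data": "SGVsbG8="}}
    ],
}


def build_analyze_request(base_url: str) -> urllib.request.Request:
    """Build a POST request against the $analyze-document endpoint."""
    return urllib.request.Request(
        url=f"{base_url}/fhir/$analyze-document",
        data=json.dumps(document_reference).encode("utf-8"),
        headers={"Content-Type": "application/fhir+json"},
        method="POST",
    )


# To actually send it (requires a running ahd2fhir instance):
# with urllib.request.urlopen(build_analyze_request("http://localhost:8080")) as resp:
#     bundle = json.load(resp)
```
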
| Environment variable | Description | Default |
|---|---|---|
| `AHD_URL` | URL of the AHD installation. Should not end with a trailing `/`. | `""` |
| `AHD_API_TOKEN` | An API token to access the AHD REST API. | `""` |
| `AHD_USERNAME` | Username for username+password-based authentication against the API. | `""` |
| `AHD_PASSWORD` | Password for username+password-based authentication against the API. | `""` |
| `AHD_ENSURE_PROJECT_IS_CREATED_AND_PIPELINE_IS_STARTED` | If enabled, attempt to create the specified project and start the pipeline. Requires the use of username+password for auth. | `false` |
| `AHD_PROJECT` | Name of the AHD project. This needs to be created before ahd2fhir is started. | `""` |
| `AHD_PIPELINE` | Name of the AHD pipeline. This needs to be created before ahd2fhir is started. | `""` |
The most relevant Kafka settings are listed below; see `config.py` for the complete list.
Because the settings are composed of pydantic settings,
use the corresponding `env_prefix` value to override the defaults.
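Conceptually, `env_prefix` means each field of a settings class can be overridden by an environment variable carrying that prefix. The following is a stdlib sketch of that lookup behaviour; the real implementation lives in pydantic's settings classes, and `load_prefixed_settings` plus the defaults dict here are illustrative only:

```python
import os


def load_prefixed_settings(prefix: str, defaults: dict) -> dict:
    """Mimic pydantic's env_prefix: for each known field, prefer an
    environment variable named <PREFIX><FIELD> over the default value."""
    return {
        field: os.environ.get(f"{prefix}{field.upper()}", default)
        for field, default in defaults.items()
    }


# Illustrative defaults matching the table below
kafka_defaults = {
    "enabled": "false",
    "bootstrap_servers": "localhost:9094",
    "input_topic": "fhir.documents",
}
```

With this mechanism, setting e.g. `KAFKA_INPUT_TOPIC=my.topic` overrides only that one field while the other defaults stay in place.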
| Environment variable | Description | Default |
|---|---|---|
| `KAFKA_ENABLED` | Whether to enable support for reading resources from Apache Kafka. | `false` |
| `KAFKA_BOOTSTRAP_SERVERS` | Host and port of the Kafka bootstrap servers. | `localhost:9094` |
| `KAFKA_SECURITY_PROTOCOL` | The security protocol used to connect with the Kafka brokers. | `PLAINTEXT` |
| `KAFKA_CONSUMER_GROUP_ID` | The Kafka consumer group id. | `ahd2fhir` |
| `KAFKA_INPUT_TOPIC` | The input topic to read FHIR DocumentReferences or Bundles thereof from. | `fhir.documents` |
| `KAFKA_OUTPUT_TOPIC` | The output topic to write the extracted FHIR resources to. | `fhir.nlp-results` |
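As the table notes, a message on the input topic may be a single DocumentReference or a Bundle of them. A sketch of how such messages could be normalized into a flat list follows; this is an illustration of the accepted message shapes, not ahd2fhir's actual consumer code:

```python
import json


def extract_document_references(message_value: bytes) -> list:
    """Accept a single DocumentReference or a Bundle and return a flat
    list of DocumentReference dicts (other resource types are ignored)."""
    resource = json.loads(message_value)
    if resource.get("resourceType") == "DocumentReference":
        return [resource]
    if resource.get("resourceType") == "Bundle":
        return [
            entry["resource"]
            for entry in resource.get("entry", [])
            if entry.get("resource", {}).get("resourceType") == "DocumentReference"
        ]
    return []
```
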
```sh
pip install -r requirements-dev.txt
```

Starts an AHD server:

```sh
docker login registry.averbis.com -u "Username" -p "Password"
docker compose -f compose.dev.yml up
```

Starts both AHD and Kafka and constantly fills a `fhir.documents` topic with sample DocumentReference resources:

```sh
docker compose -f compose.dev.yml --profile=kafka up
```

> **Note**: If you set `AHD_ENSURE_PROJECT_IS_CREATED_AND_PIPELINE_IS_STARTED=true`, ahd2fhir will attempt to create the necessary project and start the pipeline on startup. You won't need to manually do the steps below.
- Open AHD at <http://localhost:9999/health-discovery/#/login> and log in as `admin` with password `admin`.
- Click on `Project Administration` -> `Create Project`.
- Set `Name` to `test`.
- Click on the newly created project `test`.
- Click on `Pipeline Configuration`.
- Select the `discharge` pipeline and click on `Start Pipeline`.
- In the top-right corner, click on `admin` -> `Manage API Token`.
- Click on `Generate` followed by `Copy to clipboard`.
- Paste the new API token in the `.env.development` file as the value for `AHD_API_TOKEN`.
```sh
export PYTHONPATH=${PWD}
uvicorn --env-file=.env.development --app-dir=ahd2fhir main:app --reload --log-level=debug
```

> **Note**: To enable reading FHIR DocumentReferences from a Kafka topic during development, make sure to set the env var `KAFKA_ENABLED=true`.
This uses the environment configuration from the `.env.development` file; you will need to modify the `AHD_` env vars for your local deployment.
Note the use of `host.docker.internal` so the running container can still access the version of AHD launched via `compose.dev.yml`.
Also use your own manually created API token below.
```sh
docker build -t ahd2fhir:local .
docker run \
  --rm -it -p 8081:8080 \
  --network=ahd2fhir_default \
  -e AHD_URL=http://health-discovery-hd:8080/health-discovery \
  -e AHD_API_TOKEN=<insert API-TOKEN here> \
  -e AHD_PROJECT=test \
  -e AHD_PIPELINE=discharge \
  -e AHD_ENSURE_PROJECT_IS_CREATED_AND_PIPELINE_IS_STARTED=true \
  -e KAFKA_ENABLED=true \
  -e KAFKA_BOOTSTRAP_SERVERS=kafka:9092 \
  ahd2fhir:local
```

Run the tests:

```sh
pytest --cov=ahd2fhir
```

If the snapshot tests fail, you may need to update them using:

```sh
pytest --snapshot-update
```

but make sure the changed snapshots are actually still valid! You can use the Firely Terminal to do so.
```sh
pre-commit install
pre-commit install --hook-type commit-msg
```

Install the package directly from the repository:

```sh
pip install git+https://github.com/miracum/ahd2fhir@master
```

```python
import json

from fhir.resources.R4B.documentreference import DocumentReference
from fhir.resources.R4B.reference import Reference

from ahd2fhir.mappers import ahd_to_condition

with open("tests/resources/ahd/payload_1.json") as json_resource:
    ahd_payload = json.load(json_resource)

# Create a Patient reference and a DocumentReference
pat = Reference(**{"reference": "Patient/f1234"})
doc = DocumentReference.construct()
doc.subject = pat
doc.date = "2020-05-14"

conditions = ahd_to_condition.get_fhir_condition(ahd_payload, doc)
```