Commit 144ac57

update inference docs (#1066)
Update to InnerEye inference service docs
1 parent: a23b19c

File tree: 1 file changed (+25 −24 lines)


docs/tre-templates/workspace-services/inner-eye.md

Lines changed: 25 additions & 24 deletions
@@ -47,6 +47,7 @@ This will provision Base Workspace, with AML service and InnerEye service, inclu
 ```cmd
 ./scripts/gitea_migrate_repo.sh -t <tre_id> -g https://github.com/microsoft/InnerEye-DeepLearning
 ./scripts/gitea_migrate_repo.sh -t <tre_id> -g https://github.com/analysiscenter/radio
+./scripts/gitea_migrate_repo.sh -t <tre_id> -g https://github.com/microsoft/InnerEye-Inference
 ```

 ### Setup the InnerEye run from AML Compute Instance
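The repository-migration step above can also be scripted as a loop. A minimal sketch, assuming a hypothetical TRE id of `mytre`; the command is echoed rather than executed, since `gitea_migrate_repo.sh` only exists inside an AzureTRE checkout:

```shell
# Sketch: migrate each required upstream repo into the workspace Gitea.
# TRE_ID value is hypothetical; the repo list is taken from the docs above.
TRE_ID="mytre"
for repo in \
  "https://github.com/microsoft/InnerEye-DeepLearning" \
  "https://github.com/analysiscenter/radio" \
  "https://github.com/microsoft/InnerEye-Inference"; do
  # Echoed instead of run, so the sketch works outside an AzureTRE checkout.
  echo ./scripts/gitea_migrate_repo.sh -t "$TRE_ID" -g "$repo"
done
```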
@@ -108,16 +109,37 @@ The workspace service provisions an App Service Plan and an App Service for host
 1. Log onto a VM in the workspace and run:

 ```cmd
-git clone https://github.com/microsoft/InnerEye-Inference
+git clone https://gitea-<TRE_ID>.azurewebsites.net/giteaadmin/InnerEye-Inference
 cd InnerEye-Inference
+```
+
+1. Create a file named "set_environment.sh" with the following variables as content:
+
+```bash
+#!/bin/bash
+export CUSTOMCONNSTR_AZUREML_SERVICE_PRINCIPAL_SECRET=<inference_sp_client_secret-from-above>
+export CUSTOMCONNSTR_API_AUTH_SECRET=<generate-a-random-guid--that-is-used-for-authentication>
+export CLUSTER=<name-of-compute-cluster>
+export WORKSPACE_NAME=<name-of-AML-workspace>
+export EXPERIMENT_NAME=<name-of-AML-experiment>
+export RESOURCE_GROUP=<name-of-resource-group>
+export SUBSCRIPTION_ID=<subscription-id>
+export APPLICATION_ID=<inference_sp_client_id-from-above>
+export TENANT_ID=<tenant-id>
+export DATASTORE_NAME=inferencedatastore
+export IMAGE_DATA_FOLDER=imagedata
+```
+
+1. Upload the configuration file to the web app:
+
+```cmd
 az webapp up --name <inference-app-name> -g <resource-group-name>
 ```

 1. Create a new container in your storage account for storing inference images called `inferencedatastore`.
 1. Create a new folder in that container called `imagedata`.
 1. Navigate to the ml.azure.com, `Datastores` and create a new datastore named `inferencedatastore` and connect it to the newly created container.
-1. The key used for authentication is the `inference_auth_key` provided as an output of the service deployment.
-1. Test the service by sending a GET or POST command using curl or Invoke-WebRequest:
+1. Test the service by sending a GET or POST command using curl or Invoke-WebRequest where API_AUTH_SECRET is the random GUID generated for CUSTOMCONNSTR_API_AUTH_SECRET above:

 Simple ping:

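The `set_environment.sh` step in the hunk above can be sketched end-to-end. This is a minimal sketch that generates the random GUID for `CUSTOMCONNSTR_API_AUTH_SECRET` and writes the file; all other values below are hypothetical placeholders, not outputs from a real deployment:

```shell
# Sketch: write set_environment.sh with a freshly generated auth GUID.
# Every value other than the GUID is a hypothetical placeholder.
API_AUTH_SECRET=$(python3 -c "import uuid; print(uuid.uuid4())")
cat > set_environment.sh <<EOF
#!/bin/bash
export CUSTOMCONNSTR_AZUREML_SERVICE_PRINCIPAL_SECRET=my-sp-secret
export CUSTOMCONNSTR_API_AUTH_SECRET=$API_AUTH_SECRET
export CLUSTER=my-compute-cluster
export WORKSPACE_NAME=my-aml-workspace
export EXPERIMENT_NAME=my-experiment
export RESOURCE_GROUP=my-resource-group
export SUBSCRIPTION_ID=00000000-0000-0000-0000-000000000000
export APPLICATION_ID=my-sp-client-id
export TENANT_ID=my-tenant-id
export DATASTORE_NAME=inferencedatastore
export IMAGE_DATA_FOLDER=imagedata
EOF
```

The unquoted heredoc delimiter lets `$API_AUTH_SECRET` expand, so the generated GUID lands in the file verbatim.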
@@ -130,24 +152,3 @@ The workspace service provisions an App Service Plan and an App Service for host
 ```cmd
 Invoke-WebRequest https://yourservicename.azurewebsites.net/v1/model/start/HelloWorld:1 -Method POST -Headers @{'Accept' = 'application/json'; 'API_AUTH_SECRET' = 'your-secret-1234-1123445'}
 ```
-
-## Alternative: Local Deployment
-
-Instead of provisioning InnerEye workspace service through AzureTRE API, you can also provision resources directly by invoking porter bundles:
-
-1. Create a copy of `templates/workspace_services/innereye/.env.sample` with the name `.env` and update the variables with the appropriate values.
-
-| Environment variable name | Description |
-| ------------------------- | ----------- |
-| `ID` | A GUID to identify the workspace service. The last 4 characters of this `ID` can be found in the resource names of the workspace service resources. |
-| `WORKSPACE_ID` | The GUID identifier used when deploying the base workspace bundle. |
-| `INFERENCE_SP_CLIENT_ID` | Service principal client ID used by the inference service to connect to Azure ML. Use the output from the step above. |
-| `INFERENCE_SP_CLIENT_SECRET` | Service principal client secret used by the inference service to connect to Azure ML. Use the output from the step above. |
-
-1. Build and install the InnerEye Deep Learning Service bundle
-
-```cmd
-make porter-build DIR=./templates/workspace_services/innereye
-make porter-publish DIR=./templates/workspace_services/innereye
-make porter-install DIR=./templates/workspace_services/innereye
-```
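For reference, a curl equivalent of the Invoke-WebRequest POST shown in the hunk above. The service name and secret are the same placeholders used in the docs, and the command is echoed rather than executed since the placeholder hostname does not resolve:

```shell
# Sketch: curl equivalent of the Invoke-WebRequest POST test above.
# Service name and secret are placeholders from the docs, not real values.
SERVICE_URL="https://yourservicename.azurewebsites.net"
API_AUTH_SECRET="your-secret-1234-1123445"
# Echoed rather than run: the placeholder hostname does not resolve.
echo curl -X POST "$SERVICE_URL/v1/model/start/HelloWorld:1" \
  -H "Accept: application/json" \
  -H "API_AUTH_SECRET: $API_AUTH_SECRET"
```

Drop the leading `echo` and substitute the real app name and GUID to run it against a deployed service.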
