
Commit ff33c60

feat(infra): workshop environment helper stack

1 parent: cd3f3d5
15 files changed: +1687 -0 lines

.workshop-infra/Makefile

+85
@@ -0,0 +1,85 @@
```makefile
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: MIT-0
STACK_NAME?="textract-transformers-workshop"
DEPLOYMENT_BUCKET_NAME?="UNDEFINED"
DEPLOYMENT_BUCKET_PREFIX?=""
TARGET_REPO?="https://github.com/aws-samples/amazon-textract-transformer-pipeline"

target:
	$(info ${HELP_MESSAGE})
	@exit 0

package: ##=> Build SAM template & assets to CloudFormation on S3
	$(info [*] Building AWS SAM stack...)
	sam build \
		--use-container \
		--template template.sam.yaml && \
	sam package \
		--s3-bucket $(DEPLOYMENT_BUCKET_NAME) \
		--s3-prefix $(DEPLOYMENT_BUCKET_PREFIX)sam \
		--use-json \
		--output-template-file template.tmp.json && \
	python sam-postproc.py template.tmp.json template.tmp.json && \
	aws s3 cp template.tmp.json \
		s3://$(DEPLOYMENT_BUCKET_NAME)/$(DEPLOYMENT_BUCKET_PREFIX)template.cf.json

# CF with --disable-rollback is faster for debugging than sam deploy
create: ##=> Create services stack (only)
	$(info [*] Deploying...)
	aws cloudformation create-stack \
		--template-body file://template.tmp.json \
		--stack-name $(STACK_NAME) \
		--capabilities CAPABILITY_IAM CAPABILITY_AUTO_EXPAND \
		--disable-rollback
	# --parameters \
	# 	ParameterKey=ParamName,ParameterValue=$(PARAM_VAR)

deploy: ##=> Deploy services (flexible create or update)
	$(info [*] Deploying...)
	sam deploy \
		--template-file template.tmp.json \
		--stack-name $(STACK_NAME) \
		--capabilities CAPABILITY_IAM CAPABILITY_AUTO_EXPAND \
		--no-fail-on-empty-changeset
	# --parameter-overrides \
	# 	ParamName=$(PARAM_VAR)

all: ##=> Build and create stack
	@$(MAKE) package
	@$(MAKE) create

delete: ##=> Delete services
	$(info [*] Deleting stack...)
	aws cloudformation delete-stack --stack-name $(STACK_NAME)


#############
#  Helpers  #
#############

define HELP_MESSAGE

	STACK_NAME: "textract-transformers-workshop"
		Description: Stack Name to deploy/redeploy to
	DEPLOYMENT_BUCKET_NAME:
		Description: Amazon S3 bucket for staging built SAM Lambda bundles and assets
	DEPLOYMENT_BUCKET_PREFIX: ""
		Description: For publishing to a prefix in your deployment bucket, instead of root. Should
			include trailing slash e.g. 'my-prefix/'
	TARGET_REPO: "https://github.com/aws-samples/amazon-textract-transformer-pipeline"
		Target repository where your workshop code lives

	Common usage:

	...::: Build all SAM based services :::...
	$ make package

	...::: Deploy or re-deploy all SAM based services :::...
	$ make deploy

	...::: Create (cannot re-deploy) all SAM based services with rollback disabled :::...
	$ make create

	...::: Delete all SAM based services :::...
	$ make delete
endef
```

.workshop-infra/README.md

+74
@@ -0,0 +1,74 @@
# Infrastructure for SageMaker Workshop with a CDK solution stack

This folder provides a helper stack which will:

- Create a SageMaker Notebook Instance with the repository cloned in
- Create an (IAM-authenticated) SageMaker Studio domain, with a user profile, with the repository cloned in (and some VPC infrastructure required to make that happen)
- Run a one-off AWS CodeBuild build to download the repository, `poetry install` the dependencies and `cdk deploy --all` stacks in the solution (a sketch for checking on this build follows below)

It's intended to help automate setting up workshops on temporary AWS accounts, with CDK-based solutions (like this one) that assume a SageMaker notebook environment will be provisioned separately.
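
The one-off CodeBuild build runs at stack creation and, per the custom resource code in this commit, the stack does not wait for it to finish, so it's worth confirming separately that the bootstrap (clone, `poetry install`, `cdk deploy --all`) actually succeeded. A minimal sketch for checking the latest run, assuming your AWS credentials are configured; the project name below is a placeholder for whatever name the helper stack actually creates:

```python
# Check the most recent run of the bootstrap CodeBuild project.
# "workshop-bootstrap-build" is a placeholder - look up the real project name
# in the CodeBuild console or in the helper stack's resources.
import boto3

codebuild = boto3.client("codebuild")
project_name = "workshop-bootstrap-build"  # hypothetical

build_ids = codebuild.list_builds_for_project(
    projectName=project_name,
    sortOrder="DESCENDING",  # newest build first
)["ids"]

if not build_ids:
    print(f"No builds found yet for {project_name}")
else:
    latest = codebuild.batch_get_builds(ids=build_ids[:1])["builds"][0]
    # buildStatus is e.g. IN_PROGRESS, SUCCEEDED, FAILED, STOPPED
    print(latest["id"], latest["buildStatus"])
```
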
## Prerequisites and Caveats

This helper stack assumes that (in your target AWS Region):

- You have not yet onboarded to SageMaker Studio
- You have a default VPC you're willing to use with standard configuration, or else would like to use a custom VPC but are comfortable checking the compatibility of the stack with your VPC configuration.

> ⚠️ This stack is oriented towards convenience of **getting started** and first exploring SageMaker Studio with the companion solution stack. It is **not recommended for long-lived environments**.
>
> In particular, **be aware that:**
>
> - The stack grants broad power user permissions to the CodeBuild job (for whatever resources the CDK deployment may need to create)
> - When you delete the stack:
>   - The SageMaker Studio setup for your target AWS Region will be deleted (and the stack should *fail* to delete if you have any users running 'apps' in Studio apart from the ones set up by the stack. You can manage these through the [SageMaker console UI](https://console.aws.amazon.com/sagemaker/home?#/studio))
>   - The CDK solution deployed by the CodeBuild project will *not* automatically be cleaned up (a sketch for finding leftover stacks follows below)
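
Because those CDK-deployed stacks survive deletion of the helper stack, clean-up means finding and deleting them yourself in CloudFormation. A rough sketch for listing candidates with boto3; which names belong to the solution depends on the CDK app, so treat the output as a list to review rather than something to delete blindly:

```python
# List active CloudFormation stacks so leftover CDK-deployed stacks can be
# reviewed and deleted manually. Review the output before deleting anything.
import boto3

cfn = boto3.client("cloudformation")
active_statuses = [
    "CREATE_COMPLETE",
    "UPDATE_COMPLETE",
    "ROLLBACK_COMPLETE",
    "UPDATE_ROLLBACK_COMPLETE",
]

for page in cfn.get_paginator("list_stacks").paginate(StackStatusFilter=active_statuses):
    for summary in page["StackSummaries"]:
        print(summary["StackName"], summary["StackStatus"])

# Once identified: cfn.delete_stack(StackName="<leftover-stack-name>")
```
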
## Developing and Deploying Locally

In addition to having an AWS account, you'll need an environment with:

- The [AWS CLI](https://aws.amazon.com/cli/)
- The [AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html)
- A Docker-compatible container runtime such as [Docker Desktop](https://www.docker.com/products/docker-desktop)
- A `make` utility such as [GNU Make](https://www.gnu.org/software/make/) - probably already installed if you have standard build tools
- *Probably* a UNIX-like (non-Windows) shell if you want things to run smoothly... But you can always give it a try and resort to translating commands from the [Makefile](Makefile) if things go wrong.

You'll also need:

- Sufficient access (log in with `aws configure`) to be able to deploy the stacks in your target region
- An *[Amazon S3](https://s3.console.aws.amazon.com/s3/home) Bucket* to use for staging deployment assets (Lambda bundles, etc) - see the sketch below if you need to create one
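
The staging bucket is just a plain S3 bucket in your target region. If you don't have one yet, a minimal sketch for creating it (the bucket name is a placeholder and must be globally unique; the AWS CLI works equally well):

```python
# Create a staging bucket for the assets `make package` uploads.
# Bucket names are globally unique, so replace the placeholder below.
import boto3

region = "us-west-2"                   # your target region
bucket = "example-sam-staging-bucket"  # placeholder - choose your own unique name

s3 = boto3.client("s3", region_name=region)
if region == "us-east-1":
    # us-east-1 rejects an explicit LocationConstraint
    s3.create_bucket(Bucket=bucket)
else:
    s3.create_bucket(
        Bucket=bucket,
        CreateBucketConfiguration={"LocationConstraint": region},
    )
```
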
**Step 1: Build the Lambda bundles and final CloudFormation template to S3 with AWS SAM**

(This command builds your assets and CloudFormation template, and stages them to your nominated Amazon S3 bucket)

```sh
make package DEPLOYMENT_BUCKET_NAME=DOC-EXAMPLE-BUCKET
```

**Step 2: Deploy (create or update) the stack**

```sh
make deploy STACK_NAME=workshopstack
```

**Alternative: Build and create the stack in one go**

(This option only *creates* stacks, and disables rollback, for easier debugging)

```sh
make all DEPLOYMENT_BUCKET_NAME=example-bucket STACK_NAME=workshopstack
```

There's also a `make delete` option to help with cleaning up.

## Preparing Templates for Multi-Region Deployment

If you'd like your template to be deployable in multiple AWS Regions:

- Set up an asset hosting bucket in each region of interest, and use the AWS Region ID (e.g. `us-east-1`) in the bucket names
- Set up cross-region replication to copy contents from your lead region to other regions
- Run the `make package` script against your lead region

The generated template will be automatically post-processed (by [sam-postproc.py](sam-postproc.py)) to tokenize S3 references to hosted assets to refer to the `${AWS::Region}` placeholder.
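
The actual post-processing lives in `sam-postproc.py`, which isn't shown in this excerpt of the commit. Purely as an illustration of the tokenization idea, rewriting region-named asset references in the packaged JSON template might look roughly like this:

```python
# Illustration only (not the real sam-postproc.py): walk a packaged CloudFormation
# JSON template and rewrite strings containing the lead region's ID (e.g. asset
# bucket names like "my-assets-us-east-1") into Fn::Sub expressions that use the
# ${AWS::Region} placeholder instead.
import json
import sys

LEAD_REGION = "us-east-1"  # assumed: the region `make package` was run against


def tokenize(node):
    if isinstance(node, str) and LEAD_REGION in node:
        return {"Fn::Sub": node.replace(LEAD_REGION, "${AWS::Region}")}
    if isinstance(node, list):
        return [tokenize(item) for item in node]
    if isinstance(node, dict):
        return {key: tokenize(value) for key, value in node.items()}
    return node


if __name__ == "__main__":
    in_path, out_path = sys.argv[1], sys.argv[2]
    with open(in_path) as f:
        template = json.load(f)
    with open(out_path, "w") as f:
        json.dump(tokenize(template), f, indent=2)
```
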
+85
@@ -0,0 +1,85 @@
```python
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: MIT-0
"""Custom CloudFormation Resource to kick off CodeBuild project builds

This custom resource expects a 'ProjectName' property, and will simply kick off a run of that AWS
CodeBuild Project on creation. It doesn't wait for the run to complete successfully, and it doesn't
do anything on resource UPDATE/DELETE.
"""
# Python Built-Ins:
import logging
import traceback

# External Dependencies:
import boto3
import cfnresponse

codebuild = boto3.client("codebuild")


def lambda_handler(event, context):
    try:
        request_type = event["RequestType"]
        if request_type == "Create":
            handle_create(event, context)
        elif request_type == "Update":
            handle_update(event, context)
        elif request_type == "Delete":
            handle_delete(event, context)
        else:
            cfnresponse.send(
                event,
                context,
                cfnresponse.FAILED,
                {},
                error=f"Unsupported CFN RequestType '{request_type}'",
            )
    except Exception as e:
        logging.error("Uncaught exception in CFN custom resource handler - reporting failure")
        traceback.print_exc()
        cfnresponse.send(
            event,
            context,
            cfnresponse.FAILED,
            {},
            error=str(e),
        )
        raise e


def handle_create(event, context):
    logging.info("**Received create request")
    resource_config = event["ResourceProperties"]
    logging.info("**Running CodeBuild Job")
    result = codebuild.start_build(
        projectName=resource_config["ProjectName"],
    )
    cfnresponse.send(
        event,
        context,
        cfnresponse.SUCCESS,
        {},
        physicalResourceId=result["build"]["arn"],
    )


def handle_delete(event, context):
    logging.info("**Received delete event - no-op")
    cfnresponse.send(
        event,
        context,
        cfnresponse.SUCCESS,
        {},
        physicalResourceId=event["PhysicalResourceId"],
    )


def handle_update(event, context):
    logging.info("**Received update event - no-op")
    cfnresponse.send(
        event,
        context,
        cfnresponse.SUCCESS,
        {},
        physicalResourceId=event["PhysicalResourceId"],
    )
```
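
A quick way to exercise this handler locally without touching AWS is to stub out `cfnresponse` and the CodeBuild client. The sketch below assumes the file above is saved as `main.py` (the real filename isn't shown in this excerpt), that a `cfnresponse` module is importable, and that an AWS region is configured, since both are touched at import time:

```python
# Local smoke-test sketch for the custom resource handler above. Nothing real is
# called: cfnresponse.send and codebuild.start_build are replaced with mocks.
# Assumption: the handler module is importable as `main` - adjust to the actual
# filename used in the Lambda bundle.
from unittest import mock

import main  # hypothetical module name for the handler file above

fake_event = {
    "RequestType": "Create",
    "ResponseURL": "https://example.com/fake-presigned-url",
    "StackId": "arn:aws:cloudformation:us-east-1:111122223333:stack/demo/abc123",
    "RequestId": "test-request-id",
    "LogicalResourceId": "RunBuild",
    "ResourceType": "Custom::CodeBuildRun",
    "ResourceProperties": {"ProjectName": "workshop-bootstrap-build"},  # placeholder
}

with mock.patch.object(main, "cfnresponse") as fake_cfnresponse, mock.patch.object(
    main, "codebuild"
) as fake_codebuild:
    fake_codebuild.start_build.return_value = {
        "build": {"arn": "arn:aws:codebuild:us-east-1:111122223333:build/fake:1"}
    }
    main.lambda_handler(fake_event, context=None)
    fake_codebuild.start_build.assert_called_once_with(projectName="workshop-bootstrap-build")
    # The handler should report SUCCESS with the build ARN as the physical resource ID
    print(fake_cfnresponse.send.call_args)
```
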
@@ -0,0 +1 @@
```
# Nothing else required beyond common Lambda layer
```
