pubsub2inbox: version 1.3.0 (#880)
* pubsub2inbox:
- Added Cloud Run support with Dockerfile and Terraform
- Improved SCC finding output processor (expand fields better).
- Fixed requirements.txt.
- Added new get_gcp_resource filter for fetching information about arbitrary GCP resources.
- Added example of sending Cloud IDS findings to SCC.
- Bumped version to 1.3.0.

* fix
rosmo authored Oct 20, 2022
1 parent 4751067 commit 1d9c3b9
Showing 13 changed files with 517 additions and 53 deletions.
49 changes: 49 additions & 0 deletions tools/pubsub2inbox/Dockerfile
@@ -0,0 +1,49 @@
# Copyright 2022 Google, LLC.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Use the official lightweight Python image.
FROM python:3.10-slim

# Allow statements and log messages to immediately appear in the Knative logs
ENV PYTHONUNBUFFERED=True
ENV CONFIG=
ENV SERVICE_ACCOUNT=
ENV LOG_LEVEL=10
ENV WEBSERVER=1
ENV PORT=8080

ENV APP_HOME /app
WORKDIR $APP_HOME
COPY main.py requirements.txt ./
# Note: brace expansion does not work in /bin/sh, so list the directories explicitly
RUN mkdir filters output processors helpers
COPY filters/*.py filters/
COPY output/*.py output/
COPY processors/*.py processors/
COPY helpers/*.py helpers/

# Install some support packages
RUN apt-get update && apt-get install -y libmagic1

# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Run as a web service using the gunicorn web server, with one worker process and 8 threads.
#
# For environments with multiple CPU cores, increase the number of workers
# to be equal to the cores available.
#
# Timeout is set to 0 to disable the timeouts of the workers to allow Cloud Run to handle
# instance scaling.
CMD exec gunicorn --bind :$PORT --workers 1 --threads 8 --timeout 0 main:app

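With the Dockerfile above, the image can be tried out locally before deploying; a sketch (the image tag is an example, and the exact contents expected in `CONFIG` and `SERVICE_ACCOUNT` depend on how `main.py` reads them):

```shell
# Build the image locally (tag is an example)
docker build -t pubsub2inbox:local .

# Run the web service locally on port 8080, using the environment
# variables declared in the Dockerfile
docker run --rm -p 8080:8080 \
  -e WEBSERVER=1 \
  -e LOG_LEVEL=10 \
  pubsub2inbox:local
```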
3 changes: 3 additions & 0 deletions tools/pubsub2inbox/PROCESSORS.md
@@ -115,6 +115,9 @@ Output parameters:
Permissions:

- Browser (`roles/browser`) to fetch project details.
- `roles/securitycenter.findingsEditor` and `roles/securitycenter.findingSecurityMarksWriter` for writing
findings to a custom SCC source.
- Network Viewer (`roles/compute.networkViewer`) for resolving Cloud IDS network IDs.

Output parameters:

32 changes: 27 additions & 5 deletions tools/pubsub2inbox/README.md
@@ -6,9 +6,9 @@ and output processors. Input processors can enrich the incoming messages with de
(for example, fetching the budget from Cloud Billing Budgets API). Multiple output
processors can be chained together.

Pubsub2Inbox is written in Python 3.8+ and can be deployed as a Cloud Function easily.
To guard credentials and other sensitive information, the tool can fetch its
YAML configuration from Google Cloud Secret Manager.
Pubsub2Inbox is written in Python 3.8+ and can be deployed as a Cloud Function or as a
Cloud Run function easily. To guard credentials and other sensitive information, the tool can
fetch its YAML configuration from Google Cloud Secret Manager.

The tool also supports templating of emails, messages and other parameters through
[Jinja2 templating](https://jinja.palletsprojects.com/en/2.10.x/templates/).
@@ -24,6 +24,7 @@ Out of the box, you'll have the following functionality:
- [How to set up programmatic notifications from billing budgets](https://cloud.google.com/billing/docs/how-to/budgets-programmatic-notifications)
- [Cloud Security Command Center](https://cloud.google.com/security-command-center)
- [Email notifications of findings](examples/scc-config.yaml) ([how to set up finding notifications from SCC](https://cloud.google.com/security-command-center/docs/how-to-notifications))
- [Create findings from Cloud IDS](examples/scc-cloud-ids.yaml)
- [Create custom findings](examples/scc-finding-config.yaml)
- [Cloud Storage notifications](examples/storage-config.yaml)
- [How to set up Cloud Storage notifications](https://cloud.google.com/storage/docs/reporting-changes)
@@ -145,9 +146,12 @@ parameters in when using as a module:
- `bucket_location` (string, optional): location of the bucket for Cloud Function archive (defaults to `EU`)
- `helper_bucket_name` (string, optional): specify an additional Cloud Storage bucket where the service account is granted `storage.objectAdmin` on
- `function_timeout` (number, optional): a timeout for the Cloud Function (defaults to `240` seconds)
- `retry_minimum_backoff` (string, optional): minimum backoff time for exponential backoff retries in Cloud Run. Defaults to 10s.
- `retry_maximum_backoff` (string, optional): maximum backoff time for exponential backoff retries in Cloud Run. Defaults to 600s.
- `cloud_run` (boolean, optional): deploy via Cloud Run instead of Cloud Function. Defaults to `false`. If set to `true`, also specify `cloud_run_container`.
- `cloud_run_container` (string, optional): container image to deploy on Cloud Run. See previous parameter.

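A minimal module invocation enabling Cloud Run deployment might look like the following sketch (the module source path and image URL are illustrative, and the other required parameters from the list above are elided):

```hcl
module "pubsub2inbox" {
  source = "./tools/pubsub2inbox" # illustrative path to the module

  # ...other module parameters from the list above...

  cloud_run           = true
  cloud_run_container = "europe-west4-docker.pkg.dev/my-project/pubsub2inbox/pubsub2inbox"
}
```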

### Deploying manually
## Deploying manually

First, we have the configuration in `config.yaml` and we're going to store the configuration for
the function as a Cloud Secret Manager secret.
@@ -207,6 +211,24 @@ gcloud functions deploy $FUNCTION_NAME \
--project $PROJECT_ID
```

## Deploying via Cloud Run

### Building the container

A [`Dockerfile`](Dockerfile) has been provided for building the container. You can build the
image locally and push it to, for example, [Artifact Registry](https://cloud.google.com/artifact-registry).

```sh
docker build -t europe-west4-docker.pkg.dev/$PROJECT_ID/pubsub2inbox/pubsub2inbox .
docker push europe-west4-docker.pkg.dev/$PROJECT_ID/pubsub2inbox/pubsub2inbox
```

### Deploying via Terraform

The provided Terraform scripts can deploy the code as a Cloud Function or on Cloud Run. To enable
Cloud Run deployment, build and push the image, then set the `cloud_run` and `cloud_run_container`
parameters (see the parameter descriptions above).

### Running tests

Run the command:
63 changes: 63 additions & 0 deletions tools/pubsub2inbox/examples/scc-cloud-ids.yaml
@@ -0,0 +1,63 @@
# Copyright 2022 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Creates findings from Cloud IDS in Cloud Security Command Center. You'll have to use the API
# to create a source first (its identifier looks like organizations/123/sources/456),
# see here for an example: https://cloud.google.com/security-command-center/docs/how-to-api-create-manage-security-sources#creating_a_source
#
# You'll also need the scc_writer permission (if deploying via Terraform). This includes the
# compute.networkViewer role, which is required to turn the network names into IDs for SCC.
#
# Create a Pub/Sub topic and use a log sink with a filter like:
# logName:"ids.googleapis.com%2Fthreat"
#
retryPeriod: 3 day ago

processors:
- genericjson

outputs:
  - type: scc
    source: organizations/382949788687/sources/5355536199717451283
    finding_id: "{{ data.insertId|hash_string('md5') }}"
    finding:
      resourceName: |
        //compute.googleapis.com/{{ (data.jsonPayload.network|get_gcp_resource("compute", "compute")).selfLinkWithId|replace("https://www.googleapis.com/compute/v1/", "") }}
      state: "ACTIVE"
      description: |
        {{ data.jsonPayload.name }}
        {{ data.jsonPayload.details }}
      category: "{{ data.jsonPayload.category|replace('-', '_')|upper }}"
      externalUri: "https://console.cloud.google.com/logs/query;cursorTimestamp={{ data.timestamp }};query=timestamp%3D%22{{ data.timestamp }}%22%0AinsertId%3D%22{{ data.insertId }}%22"
      indicator:
        ipAddresses:
          - "{{ data.jsonPayload.source_ip_address }}"
          - "{{ data.jsonPayload.destination_ip_address }}"
      sourceProperties:
        application: "{{ data.jsonPayload.application }}"
        direction: "{{ data.jsonPayload.direction }}"
        ipProtocol: "{{ data.jsonPayload.ip_protocol }}"
        destinationIpAddress: "{{ data.jsonPayload.destination_ip_address }}"
        destinationPort: "{{ data.jsonPayload.destination_port }}"
        sourceIpAddress: "{{ data.jsonPayload.source_ip_address }}"
        sourcePort: "{{ data.jsonPayload.source_port }}"
      vulnerability: |
        {% if data.jsonPayload.cves is iterable %}{% set cve = {"id":data.jsonPayload.cves[0]} %}{{ {"cve":cve}|json_encode }}{% endif %}
      eventTime: "{{ data.jsonPayload.alert_time }}"
      createTime: "{{ ''|utc_strftime('%Y-%m-%dT%H:%M:%SZ') }}"
      severity: "{{ data.jsonPayload.alert_severity }}"
      findingClass: "{{ data.jsonPayload.type|upper }}"

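The Pub/Sub topic and log sink mentioned in the configuration comments above could be created along these lines (topic and sink names are examples):

```shell
# Topic and sink names are examples; the filter matches the comment above
gcloud pubsub topics create cloud-ids-threats

gcloud logging sinks create cloud-ids-threats-sink \
  pubsub.googleapis.com/projects/$PROJECT_ID/topics/cloud-ids-threats \
  --log-filter='logName:"ids.googleapis.com%2Fthreat"'

# Remember to grant the sink's writer identity permission to publish
# to the topic (roles/pubsub.publisher on the topic).
```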
3 changes: 2 additions & 1 deletion tools/pubsub2inbox/filters/__init__.py
@@ -15,7 +15,7 @@
from .lists import split, index, merge_dict
from .strings import add_links, urlencode, generate_signed_url, json_encode, json_decode, b64decode, csv_encode, re_escape, html_table_to_xlsx, make_list, read_gcs_object, filemagic, hash_string
from .date import strftime, utc_strftime, recurring_date
from .gcp import format_cost, get_cost
from .gcp import format_cost, get_cost, get_gcp_resource
from .tests import test_contains


@@ -42,6 +42,7 @@ def get_jinja_filters():
        'html_table_to_xlsx': html_table_to_xlsx,
        'format_cost': format_cost,
        'get_cost': get_cost,
        'get_gcp_resource': get_gcp_resource,
        'recurring_date': recurring_date,
        'make_list': make_list,
        'read_gcs_object': read_gcs_object,
17 changes: 17 additions & 0 deletions tools/pubsub2inbox/filters/gcp.py
@@ -11,6 +11,12 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from helpers.base import get_branded_http
from googleapiclient.http import HttpRequest
import json


def format_cost(cost, decimals=2):
    _format = '%%.%df %%s' % decimals
    return _format % (
@@ -20,3 +26,14 @@ def format_cost(cost, decimals=2):

def get_cost(cost):
    return (float(cost['units']) + (float(cost['nanos']) / 1000000000.0))


def get_gcp_resource(resource, api_domain, api_endpoint, api_version='v1'):
    uri = "https://%s.googleapis.com/%s/%s/%s" % (api_domain, api_endpoint,
                                                  api_version, resource)
    req = HttpRequest(get_branded_http(), lambda resp, content: content, uri)
    response = req.execute()
    if response:
        parsed_response = json.loads(response.decode('utf-8'))
        return parsed_response
    return None
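As a sanity check, the URI the filter composes follows the standard `https://<domain>.googleapis.com/<endpoint>/<version>/<resource>` pattern; a small standalone sketch (the function and resource names here are illustrative):

```python
def build_gcp_resource_uri(resource, api_domain, api_endpoint, api_version='v1'):
    # Mirrors the URI construction in get_gcp_resource above.
    return "https://%s.googleapis.com/%s/%s/%s" % (api_domain, api_endpoint,
                                                   api_version, resource)

# The compute network reference used in the Cloud IDS example resolves to:
print(build_gcp_resource_uri(
    "projects/my-project/global/networks/my-net", "compute", "compute"))
# https://compute.googleapis.com/compute/v1/projects/my-project/global/networks/my-net
```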
27 changes: 26 additions & 1 deletion tools/pubsub2inbox/helpers/base.py
@@ -21,7 +21,7 @@
import google.auth
from googleapiclient import http

PUBSUB2INBOX_VERSION = '1.2.0'
PUBSUB2INBOX_VERSION = '1.3.0'


class NoCredentialsException(Exception):
@@ -133,4 +133,29 @@ def _jinja_expand_dict(self, _var, _tpl='config'):
                _var[k] = self._jinja_expand_string(v)
            else:
                _var[k] = self._jinja_expand_dict(_var[k])
        return _var

    def _jinja_expand_dict_all(self, _var, _tpl='config'):
        if not isinstance(_var, dict):
            return _var
        for k, v in _var.items():
            if not isinstance(v, dict):
                if isinstance(v, str):
                    _var[k] = self._jinja_expand_string(v)
                if isinstance(v, list):
                    for idx, lv in enumerate(_var[k]):
                        if isinstance(lv, dict):
                            _var[k][idx] = self._jinja_expand_dict_all(lv)
                        if isinstance(lv, str):
                            _var[k][idx] = self._jinja_expand_string(lv)
            else:
                _var[k] = self._jinja_expand_dict_all(_var[k])
        return _var

    def _jinja_expand_list(self, _var, _tpl='config'):
        if not isinstance(_var, list):
            return _var
        for idx, v in enumerate(_var):
            if isinstance(v, str):
                _var[idx] = self._jinja_expand_string(v)
        return _var
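The recursion added in `_jinja_expand_dict_all` can be illustrated standalone, with a trivial placeholder expander standing in for Jinja rendering (all names in this sketch are hypothetical, not part of the module):

```python
def expand_string(s, context):
    # Stand-in for Jinja rendering: expands {name} placeholders from context.
    return s.format(**context)

def expand_all(var, context):
    # Recursively expand strings inside nested dicts and lists,
    # mirroring the traversal in _jinja_expand_dict_all above.
    if isinstance(var, dict):
        return {k: expand_all(v, context) for k, v in var.items()}
    if isinstance(var, list):
        return [expand_all(v, context) for v in var]
    if isinstance(var, str):
        return expand_string(var, context)
    return var

config = {"finding": {"description": "{name}", "ips": ["{src}", "{dst}"]}}
print(expand_all(config, {"name": "alert", "src": "10.0.0.1", "dst": "10.0.0.2"}))
# {'finding': {'description': 'alert', 'ips': ['10.0.0.1', '10.0.0.2']}}
```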
