
Add code to create a BigQuery table using an Avro schema and then download a JSON schema #239


Description

@troyraen

The BigQuery table storing alerts should have a schema that matches the alert schema as closely as possible. Given an Avro schema for the alerts, the best way to create the matching BigQuery schema is to use the Avro schema to create a BigQuery table, and then download the BigQuery schema as a JSON file that we store in the repo. An example of this JSON file is in the comment that prompted this issue:

If we have an Avro schema, we can use it to create the BigQuery table, and then download this JSON version for future use. That's the easiest way to make this schema match (as much as possible) the alerts' schema, which is really what it needs to do. (And we should have a bit of code in here that will do that.)

Originally posted by @troyraen in #232 (comment)

This issue is to add a bit of code to the repo that produces the JSON schema file (Avro schema -> BigQuery table -> JSON schema), assuming we have an Avro schema for the alerts. A sketch of one possible approach is below.
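A minimal sketch of such code using the google-cloud-bigquery Python client. It assumes an Avro file containing at least one alert (BigQuery derives the table schema from the Avro schema embedded in the file); the project, dataset, table, and file paths below are hypothetical placeholders.

```python
"""Create a BigQuery table from an Avro alert file, then save its schema as JSON."""
from google.cloud import bigquery

# Hypothetical placeholders; replace with the project's actual values.
PROJECT_ID = "my-project"
DATASET_ID = "alerts_dataset"
TABLE_ID = "alerts"
AVRO_PATH = "sample_alert.avro"      # Avro file whose embedded schema defines the table
JSON_SCHEMA_PATH = "bq_schema.json"  # JSON schema file to commit to the repo

client = bigquery.Client(project=PROJECT_ID)
table_ref = f"{PROJECT_ID}.{DATASET_ID}.{TABLE_ID}"

# Load the Avro file; BigQuery creates the table and derives its schema
# from the Avro schema embedded in the file.
job_config = bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.AVRO)
with open(AVRO_PATH, "rb") as avro_file:
    load_job = client.load_table_from_file(avro_file, table_ref, job_config=job_config)
load_job.result()  # wait for the load job to finish

# Download the resulting BigQuery schema and write it out as JSON.
table = client.get_table(table_ref)
client.schema_to_json(table.schema, JSON_SCHEMA_PATH)
print(f"Wrote BigQuery schema for {table_ref} to {JSON_SCHEMA_PATH}")
```

Roughly the same result should be achievable with the bq command-line tool (a `bq load --source_format=AVRO` followed by `bq show --schema --format=prettyjson`), if a CLI approach fits the repo better.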


Labels

    Enhancement: New feature or request
    Pipeline: Admin: Administration tasks; may touch multiple pipeline areas, but not clearly owned by any of them
    Pipeline: Storage: Components whose primary function is to store data
    good first issue: Good first issue for anyone new to Pitt-Google, python, GCP, etc.
