Apache License

Stripe Airbyte

This package models Stripe data from Airbyte's connector.

Let us know which connectors you would like to see next here.

Models

This package contains staging models, with the following naming conventions across all packages:

  • Boolean fields are prefixed with is_ or has_
  • Timestamps are appended with _timestamp
  • ID primary keys are prefixed with the name of the table. For example, the campaign table's ID column is renamed campaign_id.
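
For illustration, a minimal sketch of how these conventions might look in a staging model (the column names here are hypothetical and not taken from the package's actual models):

-- stg_stripe__charges.sql (hypothetical)
select
    id       as charge_id,          -- ID primary key prefixed with the table name
    captured as is_captured,        -- boolean prefixed with is_
    created  as created_timestamp   -- timestamp appended with _timestamp
from {{ source('stripe', 'charges') }}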

DBT Metrics

This package contains configurations for DBT metrics for you to get up and running quickly with standard Stripe metrics in your existing BI tools.
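
As a rough sketch, a dbt metric definition looks something like the following. The metric name, model, and column here are hypothetical, and the exact keys depend on your dbt version (older versions use type and sql instead of calculation_method and expression); see the package's models for the metrics it actually ships.

# models/stripe_metrics.yml (hypothetical)
version: 2

metrics:
  - name: total_charge_amount
    label: Total charge amount
    model: ref('stripe_charges')
    calculation_method: sum
    expression: amount
    timestamp: created_timestamp
    time_grains: [day, week, month]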

Installation Instructions

Check dbt Hub for the latest installation instructions, or read the dbt docs for more information on installing packages.

Include in your packages.yml

packages:
  - package: cerebriumAI/dbt-stripe
    version: ["0.1.0"]
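
After adding the entry, run dbt deps to pull the package into your project.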

Configuration

Source Data Location

By default, this package looks for your Stripe data in the stripe schema of your target database. If this is not where your Stripe data lives, add the following configuration to your dbt_project.yml file:

# dbt_project.yml

...
config-version: 2

vars:
    stripe_schema: your_schema_name
    stripe_database: your_database_name 

Database Support

This package has been tested on BigQuery, Snowflake, Redshift, Postgres, and Databricks.

Databricks Dispatch Configuration

dbt v0.20.0 introduced a project-level dispatch configuration that enables an "override" setting for all dispatched macros. If you are using a Databricks destination with this package, you will need to add the dispatch configuration below (or a variation of it) to your dbt_project.yml. This is required so that the package searches for macros in the dbt-labs/spark_utils package first and then falls back to dbt-labs/dbt_utils.

# dbt_project.yml

dispatch:
  - macro_namespace: dbt_utils
    search_order: ['spark_utils', 'dbt_utils']
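
Note that the spark_utils macros can only be found if the spark_utils package itself is installed. A sketch of the extra packages.yml entry is shown below; the version range is an assumption, so check dbt Hub for the current one.

# packages.yml

packages:
  - package: dbt-labs/spark_utils
    version: [">=0.3.0", "<0.4.0"]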

Contributions

Additional contributions to this package are very welcome! Please create issues or open PRs against master. Check out this post on the best workflow for contributing to a package. Suggestions for the DBT metrics are welcome too!

Resources:

  • Provide feedback on our existing dbt packages or what you'd like to see next
