Helping customers dbt @ dbt Labs.
- Converting a dbt jinja macro to a python function for use in a python model
- Debugging dbt snapshots
- Which dbt nodes respect the `generate_schema_name` macro?
- Stopping dbt from auto-expanding column types
- A quick explainer of the "dbt was unable to infer all dependencies for the model" error
- How to get models to run only on local timezones if your scheduler only supports UTC
- How to do multi-threaded / parallel operations with dbt
- How to correctly use a macro that returns a value to a hook
- Using alternative hosts for dbt hub packages
- Using query comments to identify how things came to be
- Freshness checking models instead of sources
- Converting the timestamps in dbt
- Can dbt introspect a table created in a pre-hook if we try to query from it in the body of the model?
- What keys to use from results or graph to associate a test with a node (model/source)
- Why it's best not to use Truthy/Falsy/Boolean types with Jinja
- A pattern for moving dbt vars from `dbt_project.yml` to macros
- Unloading new rows to a Snowflake stage with hooks
- Dynamically generating `where` parameters to the `dbt_utils.union_relations` macro
- dbt Cloud Slim CI using GitHub Actions
- Extracting Snowflake variant keys and mapping them to dbt vars so we can use them as column names
- Maintaining the order of columns with Redshift and dbt
- Making a macro use the same Snowflake warehouse as the model that's calling the macro
- Executing stored procedures from dbt
- The difference between `--select` and `--selector` arguments in dbt commands
- Tidying up the SQL statements dbt generates
- Accessing private GitHub repositories (dbt packages) using GitHub Apps installation access tokens
- Live tailing a dbt Cloud run using the dbt Cloud API
- Filtering dbt's catalog query to only relations that are used in the project
- Hooks vs operations in dbt
- Why can't I upload the `run_results.json` file (using `dbt_artifacts`) in an on-run-end hook?
- Are dbt freshness checks expensive in Snowflake?
- Making dbt use a BigQuery project that is different to our production jobs when developing in the cloud IDE
- Model name validation
- Installing really old dbt versions
- Getting dbt Cloud to throw an error if a selection did not include any models
- Adding custom generic dbt tests
- Recording model run errors in a table
- Making a dbt materialization that ignores certain columns
- Customising the dbt-event-logging package
- Overriding dbt Cloud default database / schema on CI runs
- Creating jobs with the dbt Cloud API
- Customising dbt snapshots
- Building SCD-2 models using the default incremental materialization
The source of this README.md file is in `src/README.md`, and it has to run through cog to automatically generate the list of linked writings above:

```sh
git clone https://github.com/jeremyyeo/jeremyyeo.git
pip install cog requests
cog -o README.md -d src/README.md
```
This is also periodically regenerated via a GitHub workflow.
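Cog works by executing inline Python embedded between markers in the source file and splicing the printed output into the generated file. The actual generator in `src/README.md` isn't shown here, but a minimal standalone sketch of the kind of helper such a generator might use (the function name, titles, and URLs below are hypothetical placeholders, not the real post links) could look like this:

```python
# Hypothetical sketch of a README-list generator helper.
# In a real cog setup, code like this would live between [[[cog ... ]]] markers
# in src/README.md and emit lines via cog.outl(); here we just print the result.

def render_post_list(posts):
    """Render (title, url) pairs as a markdown bullet list, one link per line."""
    return "\n".join(f"- [{title}]({url})" for title, url in posts)

# Placeholder data standing in for the real list of writings.
posts = [
    ("Debugging dbt snapshots", "https://example.com/debug-snapshots"),
    ("Hooks vs operations in dbt", "https://example.com/hooks-vs-operations"),
]

print(render_post_list(posts))
```

Running `cog -o README.md -d src/README.md` would then replace the marked region with that rendered list while stripping the generator code itself (the `-d` flag).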
Last generated at: Nov 09, 2024 01:46:34 UTC