Is dbt compatible with Snowflake? Definitely! And if you combine it with an ETL/orchestration tool such as Keboola (cloud) or Mage.ai (on-premises), you've got yourself a solid data stack. Nowadays, most ETL frameworks (at least the better ones) integrate with dbt.
Local Configuration of dbt and Snowflake
In this tutorial, we assume that dbt is installed locally and configured with a Snowflake database. The process consists of several steps:
Step 1 – Install dbt-core and the Snowflake adapter – run the command "pip install dbt-snowflake". This installs dbt-core together with the dbt-snowflake adapter needed to communicate with the database.
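The install can be sketched as follows (this assumes Python 3 and pip are already available; the virtual environment is optional but keeps dbt's dependencies isolated):

```shell
# Optional: isolate dbt in a virtual environment
python -m venv dbt-env
source dbt-env/bin/activate

# Installs dbt-core plus the Snowflake adapter
pip install dbt-snowflake

# Verify the install; prints the dbt-core and dbt-snowflake versions
dbt --version
```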
Step 2 – Create a dbt project – there are two options: either start an empty project or use one that already lives in Git.
- For an empty project – navigate in the terminal to the desired location for the project and run the command “dbt init”.
- For an existing project – perform a git clone of your project from an existing dbt Git repository.
When creating a new project, you’ll be prompted to provide a project name and adapter (choose Snowflake). This will create a new dbt project.
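The two options above look roughly like this in the terminal (the paths, repository URL, and project name are placeholders, not values from this tutorial):

```shell
# Option A: create an empty project (you will be prompted for a name and adapter)
cd /path/to/your/projects
dbt init

# Option B: clone an existing dbt project from Git
git clone git@github.com:your-org/your-dbt-project.git
cd your-dbt-project
```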
Step 3 – Configure the profiles.yml and dbt_project.yml files – the dbt documentation describes what the configuration in profiles.yml should look like. Alternatively, modify the template below (fill in your credentials).
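A minimal profiles.yml for Snowflake might look like this (the top-level profile name must match the `profile:` entry in dbt_project.yml; everything in angle brackets is a placeholder for your own credentials, not a value from this tutorial):

```yaml
my_dbt_project:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: <your_account_identifier>   # e.g. <org>-<account> or a legacy locator
      user: <your_user>
      password: <your_password>
      role: <your_role>
      warehouse: <your_warehouse>
      database: DBT_DATABASE
      schema: <your_schema>
      threads: 4
```

By default, dbt looks for profiles.yml in the ~/.dbt/ directory, so place the file there (or point dbt at it with the --profiles-dir flag).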
Step 4 – Test the configuration with "dbt debug". If everything is set up correctly, all checks pass.
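Checking the setup is a single command:

```shell
# Validates dbt_project.yml, profiles.yml, and the Snowflake connection;
# on success the output ends with a line like "All checks passed!"
dbt debug
```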
Step 5 – Run dbt with "dbt run" and check the data directly in Snowflake. Two default models are processed (my_first_dbt_model.sql and my_second_dbt_model.sql).
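A successful run ends with a summary similar to the one below (the exact wording varies by dbt version):

```shell
dbt run
# ...
# Completed successfully
# Done. PASS=2 WARN=0 ERROR=0 SKIP=0 TOTAL=2
```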
The data was successfully delivered to the Snowflake database DBT_DATABASE.