Kieran Healey
1 min read · Jun 2, 2021

It depends on which sink data store you are talking about. If you are purely talking Snowflake to Blob, then it works fine for dumping out files, though I would ask why you would do that. I would consider that use case atypical, as you would generally want to do transformations once the data is in Snowflake. In this article, we are assuming the ELT paradigm rather than the ETL paradigm. With ELT you can take advantage of Snowflake's compute warehouse using something like dbt, Python, etc.
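To make the ELT point concrete, here is a minimal sketch of running the "T" inside Snowflake itself via the snowflake-connector-python package. The connection parameters, schemas, and table names are placeholders I made up for illustration, not anything from the article:

```python
# Minimal ELT sketch: the raw data has already landed in Snowflake,
# so the transformation runs on Snowflake's compute warehouse.
import snowflake.connector

# All connection details below are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="TRANSFORM_WH",  # the compute warehouse doing the work
    database="ANALYTICS",
    schema="STAGING",
)

# Transform inside Snowflake instead of in the pipeline: reshape the
# raw landing table into a clean, aggregated reporting table.
conn.cursor().execute("""
    CREATE OR REPLACE TABLE REPORTING.DAILY_SALES AS
    SELECT
        CAST(order_date AS DATE) AS order_date,
        UPPER(TRIM(region))      AS region,
        SUM(amount)              AS total_amount
    FROM STAGING.RAW_SALES
    GROUP BY 1, 2
""")
conn.close()
```

A tool like dbt does essentially the same thing, just with the SQL managed as versioned models instead of inline strings.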

There are obvious advantages to being in the same virtual network, such as privacy; however, that was outside the scope of the article, as covering it would have turned an easy-to-read tutorial into something closer to a 20-page essay. ADF is not something that can be tuned easily, as it is designed to be a "low-code" environment. There are some things you can adjust that help when importing into Snowflake, such as the size of the Parquet, CSV, or ORC files and whether parallelism is enabled.
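On the file-size point: Snowflake's load guidance favors many moderately sized files (the docs suggest roughly 100-250 MB compressed) over one giant file, so the warehouse can ingest them in parallel. Here is a rough sketch of that idea using pyarrow; the paths and the rows-per-file heuristic are assumptions you would tune against your own data:

```python
# Split one large Parquet extract into several smaller part files so a
# downstream loader (ADF or COPY INTO) can parallelize across them.
import pyarrow as pa
import pyarrow.parquet as pq

source = pq.ParquetFile("exports/big_extract.parquet")  # hypothetical path

ROWS_PER_FILE = 1_000_000  # tune so each part lands near the target size
writer, part, rows_in_part = None, 0, 0

for batch in source.iter_batches(batch_size=100_000):
    if writer is None:
        writer = pq.ParquetWriter(f"staged/part_{part:04d}.parquet", batch.schema)
    writer.write_table(pa.Table.from_batches([batch]))
    rows_in_part += batch.num_rows
    if rows_in_part >= ROWS_PER_FILE:  # close this part, start the next
        writer.close()
        writer, part, rows_in_part = None, part + 1, 0

if writer is not None:
    writer.close()
```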

Snowflake does not use your Azure Blob container as its underlying storage layer; it is merely a staging location. What ADF is doing is staging the data so that it can be loaded into the warehouse. Snowflake's SQL dialect will feel familiar to Postgres users, but at its core it is its own engine, built from the ground up as a SaaS offering with features such as the compute warehouse, improved security, and so on.
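To give a sense of what that staging step amounts to, here is a hedged sketch of the pattern ADF performs on your behalf: point a Snowflake external stage at the blob container the files landed in, then COPY INTO the target table. The stage name, SAS token, and table names are placeholders, not the literal commands ADF issues:

```python
# Sketch of the staged-copy pattern: blob container -> external stage
# -> COPY INTO the warehouse table. All identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()

# Point an external stage at the blob container the pipeline wrote into.
cur.execute("""
    CREATE OR REPLACE STAGE ADF_STAGE
    URL = 'azure://myaccount.blob.core.windows.net/staging'
    CREDENTIALS = (AZURE_SAS_TOKEN = '?sv=...')   -- placeholder token
    FILE_FORMAT = (TYPE = PARQUET)
""")

# Load the staged files; Snowflake parallelizes across files, which is
# why the file count and size discussed above matter.
cur.execute("""
    COPY INTO STAGING.RAW_SALES
    FROM @ADF_STAGE
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
""")
conn.close()
```

Once the COPY completes, the staged files in blob storage are disposable; the warehouse table holds the data.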
