Snowflake Data Warehouse

Description

Used for working with the Snowflake data warehouse.

Task Type Support

This connection type supports the following task types:

  • Data Migration

Data Migration Support

This connection can be used in a Data Migration.

  • Usable as Source
  • Usable as Target
  • Usable as Incremental Source
  • Usable as Incremental Target

Connection String Templates

Template Name Description
Snowflake     The standard Snowflake connection string. Requires details such as the host, account, role, database, and warehouse to connect to.

Connection Notes

  • The username and password fields for this connection type should be your Snowflake Data Warehouse credentials.

  • The role of your account should be provided when creating the connection. If it is not known, the default will be set to PUBLIC.

  • When using the connection string, YOUR_FULL_HOST can contain your YOUR_ACCOUNT value, as shown below:

  • Connection String Template:

 "account=YOUR_ACCOUNT;host=YOUR_FULL_HOST.snowflakecomputing.com;role=PUBLIC;db=YOUR_DATABASE;"

  • Connection String Example:
 "account=Perspective;host=perspective.ap-southeast-2.snowflakecomputing.com;role=PUBLIC;db=DEMO_WH;WAREHOUSE=DEMO_WH"

  • Reader Connection String Example:
"account=readeraccount01;host=perspective.ap-southeast-2.snowflakecomputing.com;role=PUBLIC;db=DGDatabase;WAREHOUSE=DEMO_WH"
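The templates above are semicolon-delimited key=value pairs. As a rough sketch, a hypothetical helper (the function name and defaults are illustrative, not part of the product) could assemble such a string from its parts:

```python
# Sketch: assembling a Snowflake connection string from its components.
# The field names mirror the template above; build_connection_string is a
# hypothetical helper, not part of the connector itself.
def build_connection_string(account, host, role="PUBLIC", db=None, warehouse=None):
    """Join key=value pairs with semicolons, as in the template."""
    parts = {"account": account, "host": host, "role": role}
    if db:
        parts["db"] = db
    if warehouse:
        parts["WAREHOUSE"] = warehouse
    return ";".join(f"{k}={v}" for k, v in parts.items()) + ";"

conn_str = build_connection_string(
    "Perspective",
    "perspective.ap-southeast-2.snowflakecomputing.com",
    db="DEMO_WH",
    warehouse="DEMO_WH",
)
print(conn_str)
```

This reproduces the shape of the examples above; substitute your own account, host, role, database, and warehouse values.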

  • Snowflake data migrations require a staging connection and a staging file definition: a blob-based connection (such as Azure Blob) together with a blob file definition.

  • When performing a data migration from a Snowflake source to a different Snowflake target, either a staging connection or internal staging is required. To use internal staging, SnowSQL must be installed on the same host as the Agent.

When using an internal stage for Snowflake connections, you must install SnowSQL. If SnowSQL is installed while an Agent is already running, restart the Agent afterwards.
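Before enabling internal staging, you can confirm SnowSQL is visible to the Agent host. A minimal sketch, assuming a POSIX shell on the Agent machine (the messages are illustrative):

```shell
# Check whether the snowsql CLI is on PATH for the Agent host.
# Internal staging requires SnowSQL on the same machine as the Agent.
if command -v snowsql >/dev/null 2>&1; then
  echo "snowsql found - internal staging available"
else
  echo "snowsql not found - install SnowSQL, then restart the Agent"
fi
```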

When importing double data from MySQL and SQL Server into Snowflake, Snowflake currently cannot handle the minimum double. To handle the minimum double, cast the value to a varchar before inserting it into Snowflake.
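As a sketch of the workaround (assuming the problematic value is the most negative IEEE 754 double, i.e. the negation of the largest representable double), converting the value to its string form before the insert preserves it losslessly:

```python
import sys

# The most negative finite double; the "minimum double" that can trip up
# the Snowflake insert when it arrives as a numeric value.
min_double = -sys.float_info.max

# Workaround sketch: cast to a string (varchar) before inserting into
# Snowflake. repr() round-trips the exact double value.
as_varchar = repr(min_double)

print(as_varchar)               # '-1.7976931348623157e+308'
print(float(as_varchar) == min_double)
```

The string can then be sent as a varchar column value; converting it back with `float()` recovers the identical double.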

  • When importing from Snowflake, you have the option to use a CSV or Parquet staging type. It is recommended that you use CSV unless storage size is an issue on your Agent.

We suggest using CSV over Parquet when importing timezone-sensitive data, so that you do not have any issues with timestamp handling when using the Snowflake connector.