
Snowflake Destination

Connect Snowflake data warehouse as a destination for loading data from your sources.

Prerequisites

Before you begin, ensure you have:

  • An active Snowflake account with admin privileges (Don't have one? Sign up for a free trial)
  • A warehouse configured for data loading (or ability to create one)
  • Target database and schema created
  • Appropriate role with CREATE TABLE permissions
  • Network policies configured (if applicable)

Whitelist Supaflow IP (If Required)

If your Snowflake account has IP restrictions enabled, add this IP address to your Network Access settings:

18.214.240.61

For reference: https://docs.snowflake.com/en/user-guide/network-policies

Run Script in Snowflake Warehouse

Follow these steps to set up your Snowflake environment:

  1. Log in to your Snowflake data warehouse
  2. Click the "Copy Script" button below to copy the Snowflake setup script
  3. In Snowflake, click Projects → Worksheets → + (New Worksheet)
  4. Paste the entire script into the new worksheet
  5. IMPORTANT: Modify these values as needed:
    • Set the user_password (required only if you use password authentication; the key pair script below does not define one)
    • Optionally modify role_name, user_name, warehouse_name, database_name, or schema_name
  6. Click the dropdown arrow ▼ next to the Run button and select "Run All"
note

Don't use the regular "Run" button — it only executes the current statement. Use "Run All" to execute the entire script.

Snowflake Setup Script (Key Pair Auth)
-- create variables for role / user / warehouse / database / schema (values must be uppercase for object names)
set role_name = 'SUPA_ROLE';
set user_name = 'SUPA_USER';
set warehouse_name = 'SUPA_WH';
set database_name = 'SUPA_DB';
set schema_name = 'SUPA_SCHEMA';
set fqn_schema_name = concat($database_name,'.',$schema_name);

-- change role to securityadmin for user/role steps
use role securityadmin;

-- create role for Supaflow
create role if not exists identifier($role_name);
grant role identifier($role_name) to role SYSADMIN;

-- create a user for SupaFlow
create user if not exists identifier($user_name)
type = SERVICE
default_role = $role_name
default_warehouse = $warehouse_name;

grant role identifier($role_name) to user identifier($user_name);

-- set binary_input_format to BASE64
ALTER USER identifier($user_name) SET BINARY_INPUT_FORMAT = 'BASE64';

-- change role to sysadmin for warehouse/database steps
use role sysadmin;

-- create a warehouse for Supaflow
create warehouse if not exists identifier($warehouse_name)
warehouse_size = xsmall
warehouse_type = standard
auto_suspend = 60
auto_resume = true
initially_suspended = true;

-- create database for Supaflow
create database if not exists identifier($database_name);

create schema if not exists identifier($fqn_schema_name);

-- grant supaflow role access to warehouse
grant USAGE on warehouse identifier($warehouse_name) to role identifier($role_name);

-- grant supaflow access to the database
grant ALL on database identifier($database_name) to role identifier($role_name);
grant ALL on ALL schemas in database identifier($database_name) to role identifier($role_name);
grant ALL on schema identifier($fqn_schema_name) to role identifier($role_name);

-- change role to ACCOUNTADMIN for STORAGE INTEGRATION support (only needed for Snowflake on GCP)
use role ACCOUNTADMIN;

-- transfer ownership of database and schema to SUPA_ROLE
GRANT OWNERSHIP ON DATABASE identifier($database_name)
TO ROLE identifier($role_name) COPY CURRENT GRANTS;
GRANT OWNERSHIP ON SCHEMA identifier($fqn_schema_name)
TO ROLE identifier($role_name) COPY CURRENT GRANTS;

grant CREATE INTEGRATION on account to role identifier($role_name);
grant CREATE EXTERNAL VOLUME on account to role identifier($role_name);
grant CREATE DATABASE on account to role identifier($role_name);
GRANT EXECUTE TASK ON ACCOUNT TO ROLE identifier($role_name);
GRANT EXECUTE MANAGED TASK ON ACCOUNT TO ROLE identifier($role_name);

grant all on future schemas in database identifier($database_name) to role identifier($role_name);
grant all on future tables in database identifier($database_name) to role identifier($role_name);
grant all on future views in database identifier($database_name) to role identifier($role_name);

-- grants for querying Iceberg tables via external volume + catalog integration
grant CREATE TABLE on all schemas in database identifier($database_name) to role identifier($role_name);
grant CREATE TABLE on future schemas in database identifier($database_name) to role identifier($role_name);

-- grant usage on future file formats and stages
GRANT USAGE ON FUTURE FILE FORMATS IN DATABASE identifier($database_name) TO ROLE identifier($role_name);
GRANT USAGE ON FUTURE STAGES IN DATABASE identifier($database_name) TO ROLE identifier($role_name);

Configuration

Step 1: Connection

Account Identifier*

Your full Snowflake account identifier.
Example: orgname-account_name.snowflakecomputing.com
or
Example: account_locator.cloud_region_id.cloud.snowflakecomputing.com
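As a quick illustration (not an official validation rule), a rough sanity check covering both identifier formats could look like this sketch; the regex is a heuristic of our own, not Snowflake's specification:

```python
import re

# Heuristic only: both formats end in ".snowflakecomputing.com" with one or
# more dot/hyphen-separated labels before it (org-name or locator style).
ACCOUNT_RE = re.compile(
    r"^[A-Za-z0-9_]+([.-][A-Za-z0-9_-]+)*\.snowflakecomputing\.com$"
)

def looks_like_account_identifier(value: str) -> bool:
    """Rough sanity check for the Account Identifier field."""
    return bool(ACCOUNT_RE.match(value))
```

A bare locator such as `xy12345` fails this check, which mirrors the "Invalid account identifier" troubleshooting guidance below: always include the full domain.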

Warehouse*

Compute warehouse for queries
Example: SUPA_WH (created by script)

Database*

Target database name
Example: SUPA_DB (created by script)

Schema*

Target schema name
Example: SUPA_SCHEMA (created by script)


Step 2: Authentication

Username*

Your Snowflake username
Example: SUPA_USER (created by script)

Authentication Type*

Choose authentication method
Options: keypair, basic (password), or spcs_oauth
Default: keypair

Auth Type     Use Case
keypair       Recommended for all standard connections. Uses RSA key pair authentication.
basic         Password-based authentication. Requires a Snowflake user with password login enabled.
spcs_oauth    Most secure option. Used by Native App destinations where the agent runs inside Snowflake (SPCS). No credentials are stored -- authentication uses Snowflake-injected OAuth tokens.

When keypair is selected, you have two options for providing the private key:

Private Key*
  • Create Key -- Click to have Supaflow automatically generate a key pair and passphrase. This is the easiest option.
  • Upload Key -- Click to upload an existing private key file.
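If you prefer to generate the key pair yourself before uploading, Snowflake's key-pair authentication expects a PKCS#8 private key. A sketch using openssl (the passphrase and file names here are placeholders):

```shell
# Generate a 2048-bit RSA key and encrypt it as PKCS#8 (passphrase is a placeholder)
openssl genrsa 2048 | openssl pkcs8 -topk8 -v2 aes256 -inform PEM \
  -out rsa_key.p8 -passout pass:change_me

# Derive the matching public key to register in Snowflake
openssl rsa -in rsa_key.p8 -passin pass:change_me -pubout -out rsa_key.pub
```

Upload rsa_key.p8 as the private key and enter the passphrase in the Private Key Passphrase field.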
Important: Register the Public Key in Snowflake

After clicking Create Key, Supaflow generates a key pair and displays a SQL statement:

ALTER USER SUPA_USER SET RSA_PUBLIC_KEY='MIIBIjAN...';
  1. Click Copy to copy the SQL statement
  2. Open a Snowflake worksheet as SECURITYADMIN (or ACCOUNTADMIN)
  3. Paste and run the SQL statement

You must complete this step before the connection test will succeed.
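Note that the value assigned to RSA_PUBLIC_KEY is the base64 body of the public key only, without the BEGIN/END delimiter lines. If you uploaded your own key, you can extract that value yourself; a sketch using a throwaway key (paths are illustrative):

```shell
# Throwaway key pair purely for illustration
openssl genrsa -out /tmp/demo_key.pem 2048
openssl rsa -in /tmp/demo_key.pem -pubout -out /tmp/demo_key.pub

# Strip the PEM delimiters and line breaks to get the RSA_PUBLIC_KEY value
grep -v "PUBLIC KEY" /tmp/demo_key.pub | tr -d '\n'
```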

Private Key Passphrase

Passphrase for the encrypted private key
Auto-generated when using Create Key. Only required if your uploaded key is encrypted.

Password Authentication

When basic is selected:

Password*

Your Snowflake password
Use the password set in the setup script

SPCS OAuth Authentication

When a Supaflow agent runs inside Snowflake as a Native App (via Snowpark Container Services), it uses spcs_oauth authentication. This is the most secure option:

  • Zero stored credentials -- No passwords, keys, or secrets are configured in the datasource. Authentication uses OAuth tokens that Snowflake injects directly into the running container at runtime.
  • No credential exposure risk -- Since no credentials exist in the datasource configuration, there is nothing that can be leaked or stolen, and nothing that needs to be rotated.
  • Private network path -- Data flows directly from the agent to Snowflake within the Snowflake infrastructure, without traversing the public internet.
info

After you approve a Native App agent in Settings > Agents, Supaflow shows a Create Snowflake Destination prompt. Click Create Destination to create the spcs_oauth destination. You can also create it later from the agent menu. You do not need to select this auth type manually in the destination form. See the Snowflake Native App Deployment Guide for the full setup walkthrough.


Step 3: Advanced Settings (Optional)

Role

The Snowflake role to use
Example: SUPA_ROLE (created by script)
Other options: ACCOUNTADMIN, SYSADMIN

Noop Query

Query to test connectivity
Default: SELECT 1

Query Tag

Tag to apply to all queries for tracking
Default: Supaflow

Log Query Stats

Enable query statistics logging
Default: Disabled

Schema Refresh Interval

Interval in minutes for schema metadata refresh
0 = refresh before every pipeline execution
-1 = disable schema refresh
Default: 0 (Range: -1 to 10080)


Step 4: Test & Save

After configuring all required properties, click Test & Save to verify your connection and save the destination.

Troubleshooting

Common issues and their solutions:

Invalid account identifier

Problem:

  • Connection fails with "Invalid account identifier" error
  • Cannot connect to Snowflake account

Solutions:

  1. Use the full account identifier including region and cloud provider
    • ✅ Correct: xy12345.us-east-1.snowflakecomputing.com
    • ❌ Wrong: xy12345
  2. Find your account identifier:
    • Log into Snowflake Console
    • Look at the bottom left corner
    • Copy the full identifier shown
  3. Verify format:
    • New format: orgname-account_name.snowflakecomputing.com
    • Legacy format: account_locator.region.cloud.snowflakecomputing.com

Warehouse suspended or not running

Problem:

  • Queries fail with "Warehouse not running" error
  • Connection test times out
  • Data loading fails intermittently

Solutions:

  1. Check warehouse status in Snowflake:
    • Navigate to: Admin → Warehouses
    • Verify warehouse state is "Started" or "Auto-Resume"
  2. Enable auto-resume:
    ALTER WAREHOUSE SUPA_WH SET AUTO_RESUME = TRUE;
  3. Manually start warehouse:
    ALTER WAREHOUSE SUPA_WH RESUME;
  4. Verify warehouse name:
    • Ensure the warehouse name in your config matches the name in Snowflake exactly (unquoted identifiers are stored in uppercase)

Network policy blocking connection

Problem:

  • Connection refused due to IP restrictions
  • "IP address not allowed" error
  • Connection timeout

Solutions:

  1. Add Supaflow IP to your network policy:
    -- Create a network policy (if one doesn't exist)
    CREATE NETWORK POLICY supaflow_policy
    ALLOWED_IP_LIST = ('18.214.240.61');

    -- Or modify existing policy
    ALTER NETWORK POLICY your_existing_policy
    SET ALLOWED_IP_LIST = ('your.existing.ip', '18.214.240.61');
  2. Verify network policies:
    • Go to: Admin → Security → Network Policies
    • Ensure 18.214.240.61 is in the allowed list
  3. Contact your Snowflake administrator if you don't have permission to modify network policies

Insufficient permissions

Problem:

  • Cannot create tables in target schema
  • "Access denied" or "Insufficient privileges" error
  • Data loading fails with permission error

Solutions:

  1. Re-run the setup script provided in Prerequisites section
  2. Manually verify permissions:
    -- Check current grants
    SHOW GRANTS TO ROLE SUPA_ROLE;

    -- Grant necessary permissions
    GRANT USAGE ON DATABASE SUPA_DB TO ROLE SUPA_ROLE;
    GRANT USAGE, CREATE TABLE ON SCHEMA SUPA_DB.SUPA_SCHEMA TO ROLE SUPA_ROLE;
    GRANT INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA SUPA_DB.SUPA_SCHEMA TO ROLE SUPA_ROLE;
    GRANT USAGE ON WAREHOUSE SUPA_WH TO ROLE SUPA_ROLE;
  3. Verify role assignment:
    -- Check user's default role
    SHOW USERS LIKE 'SUPA_USER';

    -- Assign role if missing
    GRANT ROLE SUPA_ROLE TO USER SUPA_USER;

Authentication failed

Problem:

  • "Invalid username or password" error
  • Key-pair authentication fails
  • Connection refused

Solutions:

  1. For password authentication:
    • Verify password matches what you set in setup script
    • Check for special characters that might need escaping
    • Reset password if needed:
      ALTER USER SUPA_USER SET PASSWORD = 'your_new_password';
  2. For key-pair authentication:
    • Ensure private key is in correct PEM format
    • Verify public key is assigned to user in Snowflake
    • Check if private key is encrypted (requires passphrase)
    • Test key pair:
      -- Assign public key to user
      ALTER USER SUPA_USER SET RSA_PUBLIC_KEY='your_public_key';
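To confirm the private key you hold matches the public key registered in Snowflake, compare fingerprints: DESCRIBE USER shows the registered key's fingerprint in the RSA_PUBLIC_KEY_FP property, and you can compute the same SHA256 fingerprint locally. A sketch with a throwaway key (substitute your real key files):

```shell
# Throwaway key pair for illustration; use your real key files in practice
openssl genrsa -out /tmp/supa_key.pem 2048
openssl rsa -in /tmp/supa_key.pem -pubout -out /tmp/supa_key.pub

# SHA256 fingerprint of the public key -- compare with RSA_PUBLIC_KEY_FP
# shown by: DESCRIBE USER SUPA_USER;
openssl rsa -pubin -in /tmp/supa_key.pub -outform DER | \
  openssl dgst -sha256 -binary | openssl enc -base64
```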

Database or schema does not exist

Problem:

  • "Object does not exist" error
  • Cannot find database or schema
  • Setup test fails

Solutions:

  1. Verify database exists:
    SHOW DATABASES LIKE 'SUPA_DB';
  2. Create database if missing:
    CREATE DATABASE IF NOT EXISTS SUPA_DB;
  3. Verify schema exists:
    SHOW SCHEMAS IN DATABASE SUPA_DB;
  4. Create schema if missing:
    CREATE SCHEMA IF NOT EXISTS SUPA_DB.SUPA_SCHEMA;
  5. Check naming:
    • Unquoted identifiers are stored in uppercase in Snowflake; only quoted identifiers are case-sensitive
    • Ensure the names in your configuration exactly match the objects in Snowflake

Connection timeout

Problem:

  • Connection times out after waiting
  • No response from Snowflake
  • Intermittent connectivity issues

Solutions:

  1. Check Snowflake service status:
    • Visit status.snowflake.com to confirm there is no ongoing incident
  2. Verify network connectivity:
    • Ensure your network allows outbound HTTPS (443)
    • Check if corporate firewall is blocking Snowflake
  3. Test from different network:
    • Try from a different internet connection
    • Verify if issue is network-specific
  4. Contact support if timeout persists

Support

Need help? Contact us at support@supa-flow.io