
Snowflake Native App Deployment Guide

A step-by-step walkthrough for deploying Supaflow as a Snowflake Native App. This guide covers everything from initial Snowflake setup to running your first pipeline.

Supaflow Snowflake Native App Architecture

Time estimate

This process takes approximately 15-20 minutes end to end.

Prerequisites

Before you begin, ensure you have:

  • A Supaflow account with admin role (Sign up if you don't have one)
  • A paid Snowflake account with ACCOUNTADMIN access (trial accounts do not support External Access Integrations, which are required for the Native App)
  • Access to Snowflake Worksheets (Snowsight)

Run the Snowflake Setup Script

A Snowflake account admin needs to run this script before you can install the Native App. If this has already been done for your account, skip to Phase 1.

Run Script in Snowflake Worksheets
  1. Log in to your Snowflake account as ACCOUNTADMIN
  2. Click Projects > Worksheets > + (New Worksheet)
  3. Copy the script below and paste it into the worksheet
  4. Optionally modify the variable values at the top: role_name, user_name, warehouse_name, database_name, or schema_name
  5. Click the dropdown arrow next to the Run button and select "Run All"
note

Don't use the regular "Run" button -- it only executes the current statement. Use "Run All" to execute the entire script.

This script creates a dedicated role (SUPA_ROLE), service user (SUPA_USER), warehouse (SUPA_WH), and database (SUPA_DB). Supaflow automatically generates and assigns the key pair for authentication -- no manual key setup is required.

Snowflake Setup Script
-- create variables for role / user / warehouse / database / schema (values must be uppercase for object identifiers)
set role_name = 'SUPA_ROLE';
set user_name = 'SUPA_USER';
set warehouse_name = 'SUPA_WH';
set database_name = 'SUPA_DB';
set schema_name = 'SUPA_SCHEMA';
set fqn_schema_name = concat($database_name,'.',$schema_name);

-- change role to securityadmin for user/role steps
use role securityadmin;

-- create role for Supaflow
create role if not exists identifier($role_name);
grant role identifier($role_name) to role SYSADMIN;

-- create a user for Supaflow
create user if not exists identifier($user_name)
type = SERVICE
default_role = $role_name
default_warehouse = $warehouse_name;

grant role identifier($role_name) to user identifier($user_name);

-- set binary_input_format to BASE64
ALTER USER identifier($user_name) SET BINARY_INPUT_FORMAT = 'BASE64';

-- change role to sysadmin for warehouse/database steps
use role sysadmin;

-- create a warehouse for Supaflow
create warehouse if not exists identifier($warehouse_name)
warehouse_size = xsmall
warehouse_type = standard
auto_suspend = 60
auto_resume = true
initially_suspended = true;

-- create database for Supaflow
create database if not exists identifier($database_name);

create schema if not exists identifier($fqn_schema_name);

-- grant supaflow role access to warehouse
grant USAGE on warehouse identifier($warehouse_name) to role identifier($role_name);

-- grant supaflow access to the database
grant ALL on database identifier($database_name) to role identifier($role_name);
grant ALL on ALL schemas in database identifier($database_name) to role identifier($role_name);
grant ALL on schema identifier($fqn_schema_name) to role identifier($role_name);

-- change role to ACCOUNTADMIN for the ownership transfers and account-level grants below
use role ACCOUNTADMIN;

-- transfer ownership of database and schema to SUPA_ROLE
GRANT OWNERSHIP ON DATABASE identifier($database_name)
TO ROLE identifier($role_name) COPY CURRENT GRANTS;
GRANT OWNERSHIP ON SCHEMA identifier($fqn_schema_name)
TO ROLE identifier($role_name) COPY CURRENT GRANTS;

grant CREATE INTEGRATION on account to role identifier($role_name);
grant CREATE EXTERNAL VOLUME on account to role identifier($role_name);
grant CREATE DATABASE on account to role identifier($role_name);
GRANT EXECUTE TASK ON ACCOUNT TO ROLE identifier($role_name);
GRANT EXECUTE MANAGED TASK ON ACCOUNT TO ROLE identifier($role_name);

grant all on future schemas in database identifier($database_name) to role identifier($role_name);
grant all on future tables in database identifier($database_name) to role identifier($role_name);
grant all on future views in database identifier($database_name) to role identifier($role_name);

-- grants for querying Iceberg tables via external volume + catalog integration
grant CREATE TABLE on all schemas in database identifier($database_name) to role identifier($role_name);
grant CREATE TABLE on future schemas in database identifier($database_name) to role identifier($role_name);

-- grant usage on future file formats and stages
GRANT USAGE ON FUTURE FILE FORMATS IN DATABASE identifier($database_name) TO ROLE identifier($role_name);
GRANT USAGE ON FUTURE STAGES IN DATABASE identifier($database_name) TO ROLE identifier($role_name);
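
Before moving on, you can sanity-check the setup with a few read-only statements in the same worksheet. This is an optional verification sketch; the object names assume you kept the script defaults.

-- Optional verification (assumes the default object names from the script)
SHOW WAREHOUSES LIKE 'SUPA_WH';
SHOW DATABASES LIKE 'SUPA_DB';
SHOW USERS LIKE 'SUPA_USER';

-- Lists every privilege the script granted to the Supaflow role
SHOW GRANTS TO ROLE SUPA_ROLE;

Each SHOW statement should return at least one row; an empty result means the corresponding CREATE or GRANT statement did not run.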

Values to Note Down

After running the script, note down the following values. You will need them when creating the Snowflake destination datasource in Phase 1.

| Value | Script Default | Where to Find |
| --- | --- | --- |
| Account Identifier | -- | Your Snowflake account URL (e.g., orgname-account_name.snowflakecomputing.com). Find it in the bottom-left corner of Snowsight. |
| Username | SUPA_USER | The user_name variable in the script |
| Role | SUPA_ROLE | The role_name variable in the script |
| Warehouse | SUPA_WH | The warehouse_name variable in the script |
| Database | SUPA_DB | The database_name variable in the script |
| Schema | SUPA_SCHEMA | The schema_name variable in the script |
note

If you customized any of the variable values at the top of the script, use your custom values instead of the defaults shown above.
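
If you are unsure of your account identifier, one way to look it up (besides the Snowsight UI) is to query it directly. This assumes the standard ORG-ACCOUNT identifier format:

-- Builds the ORG-ACCOUNT form of the account identifier
SELECT CURRENT_ORGANIZATION_NAME() || '-' || CURRENT_ACCOUNT_NAME() AS account_identifier;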


Phase 1: Create Snowflake Destination Datasource

Create a Snowflake destination datasource using the SUPA_USER created in the prerequisites. This datasource will later be assigned as the controller for the Native App agent. For full details on Snowflake destination configuration, see the Snowflake Destination docs.

Step 1: Create a Snowflake Destination

  1. In Supaflow, go to Destinations in the sidebar
  2. Click Create Destination
  3. Select Snowflake from the connector list

This opens the Snowflake destination form. Fill in the fields below using the values from the setup script. If you kept the script defaults, use the values shown in the Default column.

Destination Info

| Field | Required | Default | Description |
| --- | --- | --- | --- |
| Destination Name | Yes | Snowflake (#1) | A display name for this datasource. Choose something descriptive (e.g., Snowflake Controller). |
| API Name | Yes | Auto-generated | Auto-generated from the destination name. Used as an internal identifier. |
| Description | No | -- | Optional description for your reference. |

Authentication

| Field | Required | Default | Description |
| --- | --- | --- | --- |
| Username | Yes | -- | The Snowflake user created by the setup script. Default: SUPA_USER |
| Authentication Type | Yes | keypair | Select keypair. The other options (basic, spcs_oauth) are not used for the controller datasource. |
| Private Key | Yes | -- | Click Create Key to have Supaflow generate a key pair automatically. See the important step below. Alternatively, click Upload Key to use an existing key. |
| Private Key Passphrase | No | -- | Only required if your private key is encrypted with a passphrase. Leave blank if you used Create Key. |
Important: Register the Public Key in Snowflake

After clicking Create Key, Supaflow generates a key pair and displays a SQL statement under the heading "Run this SQL to register the public key:"

ALTER USER SUPA_USER SET RSA_PUBLIC_KEY='MIIBIjAN...';
  1. Click Copy to copy the SQL statement
  2. Open a Snowflake worksheet as SECURITYADMIN (or ACCOUNTADMIN)
  3. Paste and run the SQL statement
  4. This registers the public key with your Snowflake user so key pair authentication works

You must complete this step before the connection test will succeed.
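
To confirm the public key was registered, you can describe the user and check the key fingerprint. This assumes the default SUPA_USER name:

-- After the ALTER USER statement runs, the RSA_PUBLIC_KEY_FP property
-- should show a SHA256:... fingerprint instead of null
DESC USER SUPA_USER;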

Connection

| Field | Required | Default from Script | Description |
| --- | --- | --- | --- |
| Account Identifier | Yes | -- | Your Snowflake account identifier (e.g., orgname-account_name.snowflakecomputing.com) |
| Warehouse | Yes | SUPA_WH | The warehouse created by the setup script |
| Database | Yes | SUPA_DB | The database created by the setup script |
| Schema | Yes | SUPA_SCHEMA | The schema created by the setup script |

Advanced Settings (Optional)

Click Advanced Settings to expand. These can generally be left at their defaults.

| Field | Default | Description |
| --- | --- | --- |
| Role | -- | The Snowflake role to use. Set to SUPA_ROLE (created by the setup script). |
| Noop Query | SELECT 1 | Query used to test connectivity. No need to change. |
| Query Tag | Supaflow | Tag applied to all queries for tracking in Snowflake. |
| Log Query Stats | Disabled | Enable to log query execution statistics. |
| Query Timeout | 0 | Query timeout in seconds. 0 means no timeout. |
| Schema Refresh Interval | 30 | Interval in minutes for schema metadata refresh. |

Create Snowflake datasource

Step 2: Test and Save the Datasource

  1. Click Test & Save to verify connectivity and save the datasource
  2. The status indicator at the bottom shows the connection test progress
  3. On success, you are redirected to the Destinations list
  4. The new datasource appears with an Active state (it may briefly show "Synching Schema" first)

Destinations list

tip

If the connection test fails, check that your account identifier, username, and private key are correct. Make sure you ran the ALTER USER SQL in Snowflake to register the public key. See the Snowflake Destination docs for common troubleshooting steps.


Phase 2: Install the Snowflake Native App

Install the Supaflow Native App from the Snowflake Marketplace, link it to your Supaflow account, and deploy the agent.

Step 3: Install the Native App

  1. In Snowsight, navigate to Data Products > Marketplace
  2. Search for Supaflow
  3. Click on the Supaflow app listing
  4. Click Get to begin installation

Step 4: Configure the App

After clicking Get, Snowflake opens a configuration wizard. Complete the following steps:

  1. Review External Connections -- The first screen shows the Supaflow External Access Integration under "External connections." This integration allows the Native App to make outbound connections to Supabase, GitHub, SaaS APIs, and AWS services.
    • Click Review to inspect the list of allowed endpoints
    • Review the endpoints, then click Connect to approve the External Access Integration

External connections

Review allowed endpoints

  2. Grant Account-Level Privileges -- Grant all five required privileges:

     | Privilege | Description |
     | --- | --- |
     | CREATE COMPUTE POOL | Required to create compute pools for running the Supaflow agent |
     | CREATE WAREHOUSE | Required to create a service warehouse for activity signaling |
     | CREATE EXTERNAL ACCESS INTEGRATION | Required to allow the agent to connect to Supabase, GitHub, and SaaS APIs |
     | EXECUTE MANAGED TASK | Required to run the agent as a service |
     | BIND SERVICE ENDPOINT | Required to expose service endpoints |

  3. Click Next after all privileges show Granted.

Account privileges

All privileges granted

  4. Click Proceed to app -- This takes you to the app overview page.

App activated

  5. Click Launch app -- This opens the Supaflow Streamlit interface inside Snowflake.
tip

The Streamlit interface may take a minute or two to load on first launch. If it appears stuck on "Loading Supaflow...", go back to the app overview page and click Launch app again.

Step 5: Link Your Supaflow Account

After launching, the Streamlit interface loads and shows the Link Supaflow Account screen.

Link Supaflow Account screen

  1. Click Yes, I have an account (since you already have a Supaflow account from the prerequisites)
  2. The app generates encryption keys automatically
  3. Click Open Supaflow to Link Account -- This opens Supaflow in a new tab

Open Supaflow to Link Account

  4. In Supaflow, name your agent and click Link Agent

Link Agent in Supaflow

  5. After linking, you are redirected back to the Streamlit UI showing "Link completed!"

Link completed

Close the Streamlit tab

Close the Streamlit browser tab after linking to avoid unnecessary Snowflake warehouse credit consumption. You will reopen it briefly for agent deployment in the next step.

Step 6: Deploy the Agent

Once linked, deploy the agent service:

  1. In the Streamlit UI, click Deploy Agent

Deploy Agent page

  2. Select a Compute Configuration from the dropdown (default: Medium - 3 CPU, 13 GB)
  3. Click Deploy Agent

Compute configuration

  4. The app creates the compute pool, warehouse, and agent service
  5. Wait for all deployment steps to complete

Deployment complete

  6. Click Back to Agents to verify the agent status is Running
  7. Click the Supaflow dashboard link in the info banner to open Supaflow in a new tab

Agent running in Streamlit

Close the Streamlit tab

Close the Streamlit browser tab after confirming the agent is running. Keeping it open consumes Snowflake warehouse credits unnecessarily.


Phase 3: Approve Agent and Build Pipeline

Approve the agent in Supaflow, assign the controller datasource, and create your first pipeline. For more on agent management, see the Agents docs.

Step 7: Approve the Agent

  1. In Supaflow, go to Settings > Agents
  2. Your new agent appears with a Registered status
  3. Click the green Approve button
  4. Wait for the agent to activate -- you'll see a spinner with "Approving agent... Waiting for agent to connect..."
  5. The agent transitions to Running

Agents page with Approve button

note

Approval is a security gate -- it ensures only explicitly trusted agents can execute jobs in your environment.

Step 8: Create a Native App Destination

Immediately after approval, Supaflow displays a Create Snowflake Destination prompt. This destination uses the Native App's internal warehouse and database with SPCS OAuth authentication.

  1. Review the pre-configured details (Account, Database, Schema, Warehouse, Auth Type)
  2. Click Create Destination
  3. A success banner confirms: "Snowflake Native destination created successfully."

Agent approved with Create Destination prompt

Native App destination created

Step 9: Verify the Controller Datasource

The controller datasource is typically set automatically during deployment. Verify it points to the Snowflake datasource you created in Phase 1:

  1. On the Agents page, click the three-dot menu on your agent
  2. Select Change Controller
  3. Confirm that Snowflake Controller is selected and shows CURRENT
  4. Verify the account identifier matches your Snowflake account
  5. Click Cancel (no change needed) or Set Controller if you need to change it

Change Controller modal

Step 10: Grant the Application Role

Important: Run this in Snowflake

For the controller datasource to manage the Native App service, run this in a Snowflake worksheet as ACCOUNTADMIN:

GRANT APPLICATION ROLE <native_app_database>.app_public TO ROLE SUPA_ROLE;

-- Example (production):
GRANT APPLICATION ROLE SUPAFLOW_DI_AGENT.app_public TO ROLE SUPA_ROLE;

Without this grant, the controller datasource cannot start or stop the Native App agent service.
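
To confirm the grant took effect, you can list the grants on the role; the application role should appear among them. Names below assume the script defaults and the production app database:

-- The output should include a row granting the application role
-- SUPAFLOW_DI_AGENT.app_public to SUPA_ROLE
SHOW GRANTS TO ROLE SUPA_ROLE;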

Step 11: Create a Source Datasource

Add the data source you want to pull from:

  1. Go to Sources in the sidebar
  2. Click Create Source
  3. Choose your source connector (e.g., Salesforce, HubSpot, PostgreSQL)
  4. Configure the connection and test it

See the Sources documentation for connector-specific setup guides, or watch the video below:

Step 12: Build Your First Pipeline

With both source and destination datasources configured, create a pipeline:

  1. Go to Pipelines in the sidebar
  2. Click + Create Pipeline
  3. Walk through the 4-step wizard:
    • Choose Source -- Select the source datasource
    • Configure Pipeline -- Review and adjust sync/load/schema settings (project destination is pre-selected)
    • Choose Objects to Sync -- Select tables and fields
    • Review & Save -- Confirm details
  4. Click Create Pipeline

See the Ingestion Pipelines documentation for detailed configuration options, or watch the video below:


Verification Checklist

After completing all phases, verify:

  • Consumer account setup objects -- SUPA_ROLE, SUPA_USER, and SUPA_DB exist
  • Native App database -- SUPAFLOW_DI_AGENT exists (or your installed Native App database name)
  • Native App -- Installed and showing in Snowflake Apps
  • Agent -- Shows Running status on the Supaflow Agents page
  • Controller datasource -- Set on the agent with matching account ID
  • Application role -- <native_app_database>.app_public granted to SUPA_ROLE (for prod: SUPAFLOW_DI_AGENT.app_public)
  • Pipeline -- First sync completes successfully
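
A few read-only statements can back up this checklist from the Snowflake side. This is an optional sketch assuming the default object names:

-- Native App installed?
SHOW APPLICATIONS;

-- Agent compute pool healthy? (state should be ACTIVE or IDLE)
SHOW COMPUTE POOLS;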

Troubleshooting

Agent stuck in "Registered" after deployment

The agent may take a few minutes to start. In Snowflake, check the compute pool status:

SHOW COMPUTE POOLS LIKE '%_POOL';

-- Replace with your Native App database name:
DESCRIBE COMPUTE POOL <native_app_database>_POOL;

-- Example (production):
DESCRIBE COMPUTE POOL SUPAFLOW_DI_AGENT_POOL;

If the pool is still provisioning, wait for it to reach ACTIVE status.

Connection test fails for controller datasource

Verify that:

  • The private key matches the public key registered with SUPA_USER
  • The account identifier is correct (format: ORG-ACCOUNT, e.g., MYORG-MYACCOUNT)
  • SUPA_ROLE has the required grants

Agent shows "Running" but jobs fail

Check the agent logs in Snowflake:

SELECT SYSTEM$GET_SERVICE_LOGS('SUPA_AGENT_SERVICES.SUPAFLOW_AGENT_SERVICE', 0, 'supaflow-agent', 100);

Native App Streamlit UI not loading

Ensure the compute pool is active and the service is running:

SHOW SERVICES IN SCHEMA SUPA_AGENT_SERVICES;
SELECT SYSTEM$GET_SERVICE_STATUS('SUPA_AGENT_SERVICES.SUPAFLOW_AGENT_SERVICE');


Support

Need help? Contact us at support@supa-flow.io