Snowflake Connector
Load data from any Supaflow source into Snowflake. Supports key pair, password, and SPCS OAuth authentication with automatic schema management and internal stage loading.
Set up a Snowflake Destination in Supaflow
Why Supaflow
All connectors included
No per-connector fees. Every connector is available on every plan.
Pay for compute, not rows
Credit-based pricing. No per-row charges, no monthly active rows (MAR) surprises.
One platform
Ingestion, dbt Core transformation, reverse ETL, and orchestration in a single workspace.
Capabilities
Key Pair, Password, and SPCS OAuth Authentication
Connect via RSA key pair (recommended), username/password, or SPCS OAuth when running as a Snowflake Native App. Key pair authentication supports auto-generating the key pair and passphrase directly in Supaflow.
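The auto-generated key pair described above can be sketched with the widely used cryptography library. This is an illustration, not Supaflow's implementation; the user name supaflow_user and the 2048-bit key size are assumptions.

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

def generate_snowflake_key_pair() -> tuple[str, str]:
    """Return (private_key_pem, public_key_body) for Snowflake key pair auth."""
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    private_pem = key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.PKCS8,
        # A passphrase would use BestAvailableEncryption(b"...") instead.
        encryption_algorithm=serialization.NoEncryption(),
    ).decode()
    public_pem = key.public_key().public_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    ).decode()
    # Snowflake expects the base64 body without the PEM header/footer lines.
    body = "".join(line for line in public_pem.splitlines() if "-----" not in line)
    return private_pem, body

private_pem, public_key = generate_snowflake_key_pair()
# Registered on the Snowflake user so key pair logins are accepted.
alter_stmt = f"ALTER USER supaflow_user SET RSA_PUBLIC_KEY='{public_key}';"
```

The private key stays with the connector configuration; only the public key body is registered in Snowflake via ALTER USER.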
Internal Stage Loading
Data is loaded through Snowflake internal stages using PUT and COPY INTO. Files are automatically compressed, staged, loaded, and cleaned up.
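The stage/load/cleanup cycle above can be sketched as the three SQL statements it emits. This is an assumed shape, not Supaflow's exact code; the stage path supaflow/ and the CSV file format options are placeholders.

```python
def stage_load_statements(local_path: str, table: str, stage: str = "~") -> list[str]:
    """Return the PUT / COPY INTO / cleanup statements for one file load."""
    return [
        # PUT uploads the local file and gzip-compresses it by default.
        f"PUT file://{local_path} @{stage}/supaflow/ AUTO_COMPRESS=TRUE",
        # COPY INTO loads the staged, compressed file into the target table.
        f"COPY INTO {table} FROM @{stage}/supaflow/ "
        "FILE_FORMAT=(TYPE=CSV FIELD_OPTIONALLY_ENCLOSED_BY='\"')",
        # Staged files are removed once the load succeeds.
        f"REMOVE @{stage}/supaflow/",
    ]

stmts = stage_load_statements("/tmp/orders.csv", "ANALYTICS.PUBLIC.ORDERS")
```

Because loading goes through an internal stage rather than row-by-row inserts, large batches arrive in Snowflake as a small number of bulk COPY operations.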
Automatic Schema Management
Supaflow creates target tables, manages column types, and handles schema evolution including type promotions and new column additions in the destination database and schema you configure.
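Type promotion during schema evolution can be pictured as picking the wider of two column types. The lattice below is a simplified assumption for illustration, not Supaflow's exact promotion rules.

```python
# Widening order: anything can ultimately fall back to VARCHAR.
PROMOTION_ORDER = ["BOOLEAN", "NUMBER", "FLOAT", "VARCHAR"]

def promote(existing: str, incoming: str) -> str:
    """Return the wider of two Snowflake column types so no data is lost."""
    if existing == incoming:
        return existing
    ranks = {t: i for i, t in enumerate(PROMOTION_ORDER)}
    # Unknown or incompatible pairings fall back to VARCHAR.
    if existing not in ranks or incoming not in ranks:
        return "VARCHAR"
    return PROMOTION_ORDER[max(ranks[existing], ranks[incoming])]
```

New columns are simpler: they are added with the incoming type, since no existing data constrains them.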
Role and Warehouse Configuration
Specify the Snowflake role, warehouse, database, and schema to use. The warehouse auto-resumes for loading and can auto-suspend when idle.
Supported Objects
Configuration
Account Identifier
Snowflake account identifier in orgname-account_name format.
Warehouse
Compute warehouse used for queries and loading.
Database
Target database where tables are created.
Schema
Target schema within the database.
Role
Snowflake role used for all operations.
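The account identifier above can be sanity-checked before connecting. A minimal sketch, assuming users may paste the full .snowflakecomputing.com hostname; the regex is an approximation and ignores legacy account locator formats.

```python
import re

# Assumed pattern for orgname-account_name identifiers (illustrative only).
ACCOUNT_RE = re.compile(r"^[A-Za-z0-9_]+-[A-Za-z0-9_]+$")

def normalize_account(value: str) -> str:
    """Strip an optional .snowflakecomputing.com suffix and validate the rest."""
    ident = value.removesuffix(".snowflakecomputing.com")
    if not ACCOUNT_RE.match(ident):
        raise ValueError(f"not in orgname-account_name format: {value!r}")
    return ident
```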
How It Works
Run the Snowflake setup script
Log in to Snowflake and execute the provided setup script. The script creates a dedicated role, user, warehouse, database, and schema for Supaflow. Modify the names and password as needed before running.
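The objects the setup script creates might look like the following. All names and the password are placeholders you would change before running, as the step above notes; this is an assumed shape, not the actual provided script.

```python
# Hypothetical setup script contents, held as a string for illustration.
SETUP_SQL = """
CREATE ROLE IF NOT EXISTS SUPAFLOW_ROLE;
CREATE USER IF NOT EXISTS SUPAFLOW_USER PASSWORD='change-me' DEFAULT_ROLE=SUPAFLOW_ROLE;
GRANT ROLE SUPAFLOW_ROLE TO USER SUPAFLOW_USER;
CREATE WAREHOUSE IF NOT EXISTS SUPAFLOW_WH WAREHOUSE_SIZE='XSMALL' AUTO_SUSPEND=60 AUTO_RESUME=TRUE;
CREATE DATABASE IF NOT EXISTS SUPAFLOW_DB;
CREATE SCHEMA IF NOT EXISTS SUPAFLOW_DB.SUPAFLOW_SCHEMA;
GRANT USAGE ON WAREHOUSE SUPAFLOW_WH TO ROLE SUPAFLOW_ROLE;
GRANT ALL ON DATABASE SUPAFLOW_DB TO ROLE SUPAFLOW_ROLE;
GRANT ALL ON SCHEMA SUPAFLOW_DB.SUPAFLOW_SCHEMA TO ROLE SUPAFLOW_ROLE;
"""
# One statement per line; split on ';' for execution one at a time.
statements = [s.strip() for s in SETUP_SQL.strip().split(";") if s.strip()]
```

Each created object maps to a configuration field in the next step: the warehouse, database, schema, and role entered in Supaflow are the ones created here.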
Choose authentication and connect
Select Key Pair (recommended), Password, or SPCS OAuth. For key pair, click Create Key to auto-generate credentials, or paste your own private key. Enter your Snowflake account identifier (e.g., orgname-account_name.snowflakecomputing.com).
Configure advanced settings
Optionally set the Snowflake role, loading method, and schema refresh interval. The default settings work for most deployments.
Test and save
Click Test & Save to verify the connection. Supaflow validates the account identifier, credentials, and permissions on the target database and schema.
Use Cases
Centralize SaaS data in Snowflake
Load data from Salesforce, HubSpot, Airtable, and other sources into Snowflake for cross-system analytics and reporting.
SQL Server migration to Snowflake
Replicate SQL Server tables into Snowflake with incremental change tracking, maintaining a near-real-time copy of your operational database.
File-based data ingestion
Ingest CSV, Excel, and Google Sheets files from Google Drive into Snowflake tables with automatic schema inference and type detection.
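Type detection over string-valued file columns can be pictured as trying the narrowest type first. This toy sketch is an assumption; a real detector also handles dates, timestamps, and locale-aware numbers.

```python
def infer_type(values: list[str]) -> str:
    """Infer a Snowflake column type from sampled string values."""
    def is_int(v: str) -> bool:
        try:
            int(v)
            return True
        except ValueError:
            return False

    def is_float(v: str) -> bool:
        try:
            float(v)
            return True
        except ValueError:
            return False

    # Empty cells are treated as NULLs and do not influence the type.
    non_empty = [v for v in values if v != ""]
    if all(is_int(v) for v in non_empty):
        return "NUMBER"
    if all(is_float(v) for v in non_empty):
        return "FLOAT"
    return "VARCHAR"
```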
Snowflake-native data pipelines
Run Supaflow as a Snowflake Native App from the Snowflake Marketplace. The pipeline agent runs inside your Snowflake account using Snowpark Container Services, so your data never leaves Snowflake. Pay with existing Snowflake credits.
Frequently Asked Questions
Which authentication method should I use?
Does the warehouse need to be running?
What account identifier format does Supaflow expect?
Can I install Supaflow directly from the Snowflake Marketplace?
Need a connector we don't support yet?
Build one with AI-powered Connector Dev Skills.
Learn More About the Connector SDK