Snowflake

Add the following to a `.env` file in your Cube project:

```dotenv
CUBEJS_DB_TYPE=snowflake
CUBEJS_DB_SNOWFLAKE_ACCOUNT=XXXXXXXXX.us-east-1
CUBEJS_DB_SNOWFLAKE_WAREHOUSE=MY_SNOWFLAKE_WAREHOUSE
CUBEJS_DB_NAME=my_snowflake_database
CUBEJS_DB_USER=snowflake_user
CUBEJS_DB_PASS=**********
```

| Environment Variable | Description | Possible Values | Required | Supports multiple data sources? |
| -------------------- | ----------- | --------------- | :------: | :-----------------------------: |
| `CUBEJS_DB_SNOWFLAKE_ACCOUNT` | The Snowflake account identifier to use when connecting to the database | A valid Snowflake account ID | ✅ | ✅ |
| `CUBEJS_DB_SNOWFLAKE_REGION` | The Snowflake region to use when connecting to the database | A valid Snowflake region | ❌ | ✅ |
| `CUBEJS_DB_SNOWFLAKE_WAREHOUSE` | The Snowflake warehouse to use when connecting to the database | A valid Snowflake warehouse in the account | ✅ | ✅ |
| `CUBEJS_DB_SNOWFLAKE_ROLE` | The Snowflake role to use when connecting to the database | A valid Snowflake role in the account | ❌ | ✅ |
| `CUBEJS_DB_SNOWFLAKE_CLIENT_SESSION_KEEP_ALIVE` | If `true`, keep the Snowflake connection alive indefinitely | `true`, `false` | ❌ | ✅ |
| `CUBEJS_DB_NAME` | The name of the database to connect to | A valid database name | ✅ | ✅ |
| `CUBEJS_DB_USER` | The username used to connect to the database | A valid database username | ✅ | ✅ |
| `CUBEJS_DB_PASS` | The password used to connect to the database | A valid database password | ✅ | ✅ |
| `CUBEJS_DB_SNOWFLAKE_AUTHENTICATOR` | The type of authenticator to use with Snowflake. Use `SNOWFLAKE` with username/password, or `SNOWFLAKE_JWT` with key pairs. Defaults to `SNOWFLAKE` | `SNOWFLAKE`, `SNOWFLAKE_JWT` | ❌ | ✅ |
| `CUBEJS_DB_SNOWFLAKE_PRIVATE_KEY_PATH` | The path to the private RSA key file | A valid path to the private RSA key | ❌ | ✅ |
| `CUBEJS_DB_SNOWFLAKE_PRIVATE_KEY_PASS` | The password for the private RSA key. Only required for encrypted keys | A valid password for the encrypted private RSA key | ❌ | ✅ |
| `CUBEJS_CONCURRENCY` | The number of concurrent connections each queue has to the database. Default is `5` | A valid number | ❌ | ❌ |
| `CUBEJS_DB_MAX_POOL` | The maximum number of concurrent database connections to pool. Default is `20` | A valid number | ❌ | ✅ |
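
For example, to authenticate with a key pair instead of a password, the configuration might look like the following. This is a sketch: the key path and passphrase are placeholders, not values from this guide.

```dotenv
# Connection basics, as above
CUBEJS_DB_TYPE=snowflake
CUBEJS_DB_SNOWFLAKE_ACCOUNT=XXXXXXXXX.us-east-1
CUBEJS_DB_SNOWFLAKE_WAREHOUSE=MY_SNOWFLAKE_WAREHOUSE
CUBEJS_DB_NAME=my_snowflake_database
CUBEJS_DB_USER=snowflake_user

# Key-pair authentication replaces CUBEJS_DB_PASS
CUBEJS_DB_SNOWFLAKE_AUTHENTICATOR=SNOWFLAKE_JWT
CUBEJS_DB_SNOWFLAKE_PRIVATE_KEY_PATH=/path/to/rsa_key.p8
# Only needed if the private key is encrypted
CUBEJS_DB_SNOWFLAKE_PRIVATE_KEY_PASS=**********
```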

countDistinctApprox

Measures of type countDistinctApprox can be used in pre-aggregations when using Snowflake as a source database. To learn more about Snowflake's support for approximate aggregate functions, see the Snowflake documentation.
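
For illustration, here is a minimal sketch of a data model that defines such a measure and uses it in a pre-aggregation. The cube, table, and member names are hypothetical:

```yaml
cubes:
  - name: orders
    sql_table: public.orders

    measures:
      # Approximate distinct count, backed by Snowflake's
      # approximate aggregate functions
      - name: unique_users
        sql: user_id
        type: count_distinct_approx

    dimensions:
      - name: created_at
        sql: created_at
        type: time

    pre_aggregations:
      # Pre-aggregation that rolls up the approximate measure by day
      - name: unique_users_daily
        measures:
          - CUBE.unique_users
        time_dimension: CUBE.created_at
        granularity: day
```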

To learn more about pre-aggregation build strategies, see the pre-aggregations documentation.

| Feature       | Works with read-only mode? | Is default? |
| ------------- | :------------------------: | :---------: |
| Batching      | ✅ Yes                     | ✅ Yes      |
| Export Bucket | ✅ Yes                     | ❌ No       |

By default, Snowflake uses batching to build pre-aggregations.

Batching

No extra configuration is required to use batching with Snowflake.

Export Bucket

Snowflake supports using both AWS S3 and Google Cloud Storage for export bucket functionality.

AWS S3

Ensure the AWS credentials are correctly configured in IAM to allow reads and writes to the export bucket in S3.

```dotenv
CUBEJS_DB_EXPORT_BUCKET_TYPE=s3
CUBEJS_DB_EXPORT_BUCKET=my.bucket.on.s3
CUBEJS_DB_EXPORT_BUCKET_AWS_KEY=<AWS_KEY>
CUBEJS_DB_EXPORT_BUCKET_AWS_SECRET=<AWS_SECRET>
CUBEJS_DB_EXPORT_BUCKET_AWS_REGION=<AWS_REGION>
```

Google Cloud Storage

When using an export bucket, remember to assign the Storage Object Admin role to the Google Cloud service account that Cube uses (the one provided via `CUBEJS_DB_EXPORT_GCS_CREDENTIALS`).

Before configuring Cube, a storage integration must be created and configured in Snowflake. Take note of the integration name (`gcs_int` in Snowflake's documentation example) as you'll need it to configure Cube.
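
As a rough sketch, creating such an integration in Snowflake might look like the following; the integration name follows Snowflake's documentation example, and the bucket path is a placeholder matching the configuration below:

```sql
-- Grant Snowflake access to the GCS bucket used as the export bucket
CREATE STORAGE INTEGRATION gcs_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://snowflake-export-bucket/');
```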

Once the Snowflake integration is set up, configure Cube using the following:

```dotenv
CUBEJS_DB_EXPORT_BUCKET=snowflake-export-bucket
CUBEJS_DB_EXPORT_BUCKET_TYPE=gcp
CUBEJS_DB_EXPORT_GCS_CREDENTIALS=<BASE64_ENCODED_SERVICE_CREDENTIALS_JSON>
CUBEJS_DB_EXPORT_INTEGRATION=gcs_int
```

SSL

Cube does not require any additional configuration to enable SSL, as Snowflake connections are made over HTTPS.
