Version: 9.0

Snowflake Configuration (AWS)

Snowflake is a cloud-based data warehouse that runs on Amazon Web Services (AWS) or Microsoft Azure infrastructure. This document explains how to configure a Qrvey connection to Snowflake on AWS.

Requirements

  • Snowflake Account: A Snowflake account is required to access the Snowflake Data Cloud.
  • Snowflake User with Permissions over the Databases: A dedicated Snowflake user with sufficient permissions to interact with the necessary databases is required.
  • Snowflake Storage Integration: Snowflake storage integration allows the Snowflake account to access external cloud storage resources, in this case, AWS S3.
  • Private Key: A private key is used to securely authenticate the Snowflake user.
  • Public Key: The public key is paired with the private key to complete the key-pair authentication process in Snowflake.

Step 1: Create an IAM Policy and Role in AWS

Define a Custom IAM Policy

Navigate to the AWS IAM service and create a new policy with the following JSON configuration:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:DeleteObject",
        "s3:DeleteObjectVersion"
      ],
      "Resource": "arn:aws:s3:::${PREFIX}-dataload-drdatacommons/sql_export/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::${PREFIX}-dataload-drdatacommons",
      "Condition": {
        "StringLike": {
          "s3:prefix": [
            "sql_export/*"
          ]
        }
      }
    }
  ]
}
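A minimal sketch of preparing this policy from the command line, assuming a deployment prefix of myqrvey (the prefix, file name, and policy name below are examples, not fixed values):

```shell
# Substitute your deployment prefix for ${PREFIX}, write the policy to a
# file, and sanity-check that the result is valid JSON.
PREFIX=myqrvey
cat > snowflake-s3-policy.json <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:DeleteObject",
        "s3:DeleteObjectVersion"
      ],
      "Resource": "arn:aws:s3:::${PREFIX}-dataload-drdatacommons/sql_export/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::${PREFIX}-dataload-drdatacommons",
      "Condition": {
        "StringLike": {
          "s3:prefix": [
            "sql_export/*"
          ]
        }
      }
    }
  ]
}
EOF
python3 -m json.tool snowflake-s3-policy.json > /dev/null && echo "policy JSON OK"
```

If you prefer the CLI over the console, the resulting file can then be used with `aws iam create-policy --policy-name <name> --policy-document file://snowflake-s3-policy.json`.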

Create an IAM Role

  1. In the IAM console, navigate to Roles and select Create role.
  2. Choose AWS account as the trusted entity type and provide the required external ID (e.g., 0000).


  3. Attach the policy created in the previous step.
  4. Complete the role creation process by assigning a meaningful name and reviewing the configurations.
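The trust relationship that the console generates for these settings can be sketched as follows. This is a hedged example: the account ID and file name are placeholders, and the external ID matches the temporary 0000 used above (it is replaced with Snowflake's value in Step 3).

```shell
# Write the initial trust policy for the role and verify it parses as JSON.
# Replace <account_id> with your AWS account ID.
cat > trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<account_id>:root" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "0000" }
      }
    }
  ]
}
EOF
python3 -m json.tool trust-policy.json > /dev/null && echo "trust policy OK"
```

From the CLI, the same role could be created with `aws iam create-role --role-name <name> --assume-role-policy-document file://trust-policy.json`.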

Step 2: Configure the Snowflake Integration

Create a Storage Integration

Create a storage integration using the CREATE STORAGE INTEGRATION command. A storage integration is a Snowflake object that stores a generated service principal for your AWS cloud storage, along with an optional set of allowed or blocked storage locations (i.e., containers). Cloud provider administrators in your organization grant permissions on the storage locations to the generated service principal. This option allows users to avoid supplying credentials when creating stages or loading data.

CREATE STORAGE INTEGRATION QRVEY_MP_AWS4
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = S3
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'SNOWFLAKE-ROLE-ARN'
  STORAGE_ALLOWED_LOCATIONS = ('s3://{PREFIX}-dataload-drdatacommons/sql_export/');

  • Integration Name: Name of the integration (QRVEY_MP_AWS4 in this example).
  • Storage AWS Role ARN: The ARN of the AWS IAM role created earlier.
  • Storage Allowed Locations: The allowed storage paths. Use s3://{PREFIX}-dataload-drdatacommons/sql_export/, where {PREFIX} is your deployment prefix.

Step 3: Grant Access to the Storage Account

Retrieve Snowflake Storage Integration Details

In the Snowflake console, execute the following command to get the required storage integration details:

DESC STORAGE INTEGRATION <integration_name>;

This will return details about the integration, including STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID, which are needed in the next step.


Update IAM Role Trust Policy

  1. Navigate to the AWS IAM console and locate the role created in Step 1.
  2. Open the role, go to the Trust relationships tab, and click Edit trust policy.
  3. Update the trust policy by replacing the principal with the STORAGE_AWS_IAM_USER_ARN value and the external ID with the STORAGE_AWS_EXTERNAL_ID value returned by Snowflake.

This step ensures that Snowflake is authorized to assume the IAM role and access the designated S3 bucket.
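After the update, the trust policy should look roughly like this (a hedged example; the two placeholder values come from the DESC STORAGE INTEGRATION output):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "<STORAGE_AWS_IAM_USER_ARN>" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "<STORAGE_AWS_EXTERNAL_ID>" }
      }
    }
  ]
}
```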

Grant Access to the Storage Location

GRANT USAGE ON INTEGRATION <AWS_INT> TO ROLE <myrole>;

  • AWS_INT: Name of the integration created in the earlier steps.
  • myrole: Name of the role used by the Snowflake user configured for Qrvey.

Step 4: Secrets Creation

You need to create a private key and public key, which will be stored as secrets. You can generate these using the openssl command or similar tools.

Generate Private Key

openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out private-key.pem

Generate Public Key

openssl rsa -pubout -in private-key.pem -out public-key.pem

Assign Public Key to User

The public key must now be assigned to the user.

ALTER USER <user_name> SET RSA_PUBLIC_KEY = '<public key>';

Exclude the -----BEGIN PUBLIC KEY----- and -----END PUBLIC KEY----- delimiters and remove all line breaks from the public key when entering it in the Snowflake console.
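One way to produce that single-line value on the command line, shown end to end (the file names here are examples, and the key-generation commands repeat the steps above):

```shell
# Generate the key pair, then strip the PEM header/footer lines and join
# the remaining base64 into one line, as Snowflake expects for RSA_PUBLIC_KEY.
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out private-key.pem
openssl rsa -pubout -in private-key.pem -out public-key.pem
grep -v "PUBLIC KEY" public-key.pem | tr -d '\n' > public-key-oneline.txt
```

The contents of public-key-oneline.txt can then be pasted directly into the ALTER USER statement.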

Use Private Key and Create Secrets on Kubernetes

To use Snowflake correctly, create a Kubernetes secret containing the following keys:

  • account: Snowflake account, formatted as accountid.region.
  • username: Username to perform operations on Snowflake.
  • password: Password for the Snowflake user.
  • rsaPrivateKey: The private key to generate the token for API operations.
  • storageIntegration: Name of the previously created storage integration.

You can create the secret using the following command:

kubectl create secret generic <snowflakesecret> \
  --from-literal=account=<SNOWFLAKEACCOUNT> \
  --from-literal=username=<user> \
  --from-literal=password=<password> \
  --from-file=rsaPrivateKey=./private-key.pem \
  --from-literal=storageIntegration=<integration> \
  -n qrveyapps

Configure Connection in Qrvey

Open Qrvey Composer, create a Snowflake connection using the credentials configured above, and test that the connection succeeds.