How to Migrate from Redshift to Snowflake: A Comprehensive Guide

Published: July 23, 2024 - 7 min read

Julian Alvarado

Migrating from Redshift to Snowflake is a significant step for organizations looking to enhance their data warehousing capabilities. This guide provides a detailed walkthrough of the migration process, from initial planning to post-migration optimization.

Why Migrate from Redshift to Snowflake?

Both Amazon Redshift and Snowflake are popular cloud data warehouses, but they differ in key aspects:

  1. Architecture:
    • Redshift uses a shared-nothing architecture
    • Snowflake separates compute and storage, offering more flexibility
  2. Performance:
    • Snowflake often outperforms Redshift for complex queries and concurrent workloads
    • Snowflake’s multi-cluster, shared data architecture allows for better query optimization
  3. Scalability:
    • Snowflake offers more flexible scaling options, allowing independent scaling of compute and storage
    • Redshift requires more manual intervention for scaling
  4. Data Support:
    • Snowflake provides better support for semi-structured data (JSON, Avro, XML)
    • Snowflake offers native support for geospatial data
  5. Ease of Use:
    • Snowflake’s interface and SQL dialect are often considered more user-friendly
    • Snowflake requires less maintenance and tuning compared to Redshift

Benefits of Migrating to Snowflake

  1. Improved Performance:
    • Better handling of diverse workloads
    • Faster query execution for complex analytics
  2. Cost-effectiveness:
    • Separate billing for compute and storage
    • Pay-per-second pricing for compute resources
  3. Enhanced Features:
    • Time Travel for data recovery and reproducibility
    • Zero-copy cloning for efficient development and testing
    • Secure data sharing without data movement
  4. Scalability:
    • Instant and unlimited scaling of compute resources
    • Automatic storage scaling without performance impact
  5. Simplified Management:
    • Reduced need for manual optimization and tuning
    • Automated security and encryption features
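
Two of these features take only a line or two of SQL to exercise. A quick sketch (all object names are placeholders, not from any real environment):

```sql
-- Time Travel: query a table as it looked 30 minutes ago
SELECT * FROM your_table AT(OFFSET => -60*30);

-- Recover a table dropped within the Time Travel retention window
UNDROP TABLE your_table;

-- Zero-copy clone: instant dev copy; storage is shared until data diverges
CREATE DATABASE dev_db CLONE prod_db;
```

Cloning an entire database this way is a common pattern for spinning up test environments during a migration, since it costs no extra storage up front.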

Redshift to Snowflake: Challenges and Solutions

Handle Complex Data Types

Challenge: Redshift and Snowflake handle semi-structured data differently.

Solution: Use Snowflake’s VARIANT data type and JSON functions to work with semi-structured data effectively.
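
For example, here is a minimal sketch of the VARIANT workflow (the raw_events table and its JSON fields are hypothetical):

```sql
-- Hypothetical table with a single VARIANT column holding raw JSON
CREATE TABLE raw_events (payload VARIANT);

INSERT INTO raw_events
SELECT PARSE_JSON('{"user": {"id": 42, "name": "Ada"}, "tags": ["a", "b"]}');

-- Colon/dot notation reaches nested fields; LATERAL FLATTEN expands arrays
SELECT
    payload:user.id::NUMBER   AS user_id,
    payload:user.name::STRING AS user_name,
    t.value::STRING           AS tag
FROM raw_events,
     LATERAL FLATTEN(input => payload:tags) t;
```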

Address Performance Issues

Challenge: Queries that performed well in Redshift may be slow in Snowflake.

Solution: Analyze query plans, implement proper clustering, and use materialized views where appropriate.

Manage Costs During and After Migration

Challenge: Unexpected costs during and after migration.

Solution:

  • Use Snowflake’s resource monitors to set spending limits.
  • Implement auto-suspend for warehouses to avoid idle compute costs.
  • Regularly review and optimize storage usage.
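
The first two points can be sketched in SQL (the monitor and warehouse names, and the 100-credit quota, are illustrative values you would tune to your budget):

```sql
-- Monthly credit cap: notify at 80%, suspend assigned warehouses at 100%
CREATE RESOURCE MONITOR migration_monitor
  WITH CREDIT_QUOTA = 100
       FREQUENCY = MONTHLY
       START_TIMESTAMP = IMMEDIATELY
       TRIGGERS ON 80 PERCENT DO NOTIFY
                ON 100 PERCENT DO SUSPEND;

-- Attach the monitor and suspend the warehouse after 60 idle seconds
ALTER WAREHOUSE migration_wh SET
  RESOURCE_MONITOR = migration_monitor
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;
```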

Step 1. Plan Your Migration

Assess Your Current Redshift Environment

  1. Data volume and complexity: Analyze your current data size and query patterns.
  2. Schema structure and dependencies: Document your existing schema and identify any dependencies.
  3. Existing workflows and integrations: List all systems and processes that interact with Redshift.

For example, if you’re using Redshift with Google Sheets, you’ll need to plan for migrating these connections to Snowflake.

Set Migration Goals and Timelines

  1. Define success criteria: Establish clear objectives for performance, cost, and functionality.
  2. Create a realistic timeline: Break down the migration into phases with specific deadlines.
  3. Allocate resources and budget: Determine the team and financial resources needed for the migration.

Choose a Migration Approach

  1. Lift and shift: Migrate the entire system at once.
  2. Phased migration: Move data and processes gradually.
  3. Hybrid approach: Combine elements of both methods based on your specific needs.

Step 2. Prepare for Migration

Data Cleaning and Optimization

  1. Identify and resolve data quality issues: Run data profiling tools to detect inconsistencies.
  2. Optimize current Redshift schema: Remove unused tables and columns, and normalize data where appropriate.

Map Redshift to Snowflake Components

  1. Schema conversion considerations: Analyze differences in data types and schema structures.
  2. Data type mapping: Create a mapping document for Redshift to Snowflake data types.
  3. Handle unsupported features: Identify Redshift-specific features and plan alternatives in Snowflake.
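
As an illustration of what a mapping document captures, here is one made-up table side by side (column names and types are examples, not a complete mapping):

```sql
-- Redshift source DDL
CREATE TABLE orders (
    id         BIGINT,
    amount     DECIMAL(12,2),
    note       VARCHAR(65535),
    created_at TIMESTAMP
);

-- Snowflake target DDL: NUMBER covers both integer and decimal types,
-- VARCHAR defaults to its 16 MB maximum so no size tuning is needed,
-- and TIMESTAMP_NTZ matches Redshift's timezone-naive TIMESTAMP
CREATE TABLE orders (
    id         NUMBER(38,0),
    amount     NUMBER(12,2),
    note       VARCHAR,
    created_at TIMESTAMP_NTZ
);
```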

Set Up Snowflake Environment

  1. Create Snowflake account and database: Set up your Snowflake account and create the target database.
  2. Configure warehouses and resource allocation: Define compute warehouses based on workload requirements.
  3. Establish security measures and access controls: Set up user roles, access policies, and network security.
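
These three setup tasks might look like the following sketch (warehouse, database, role, and user names are all placeholders):

```sql
-- Compute sized to the workload, with idle suspension to control cost
CREATE WAREHOUSE etl_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;

CREATE DATABASE analytics;

-- Basic role-based access control
CREATE ROLE analyst_role;
GRANT USAGE ON WAREHOUSE etl_wh TO ROLE analyst_role;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics.public TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.public TO ROLE analyst_role;
GRANT ROLE analyst_role TO USER jane_doe;
```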

Pro-Tip: Analyze Your Database Before Migration

Before moving your data from Redshift to Snowflake, it’s crucial to thoroughly examine your current database. This step will help you plan more effectively and identify potential issues early in the process.

Coefficient can be a valuable tool in this pre-migration analysis. It allows you to pull live data from your Redshift database into spreadsheets like Google Sheets or Excel, making it easier to examine your data without interfering with your production environment.

Step 1. Connect Both Data Sources

First, ensure Coefficient has access to both Redshift and Snowflake.

For Redshift:

  • Open the Coefficient sidebar in Google Sheets or Excel.
  • Navigate to the “Connected Sources” menu.
  • Add a new connection, select Redshift, and provide the necessary details such as Host, Database Name, Username, Password, and Port.

For Snowflake:

  • In the Coefficient sidebar, go to “Connected Sources.”
  • Click “Add Connection,” then select “Connect to Snowflake.”
  • Enter your Snowflake account credentials including Account Name, Database Name, Username, Password, and Warehouse Name.

Step 2. Import Data from Redshift

Use Coefficient to pull data from Redshift:

  • From the Coefficient sidebar, select “Import from” and then “Redshift.”
  • Choose “From Tables & Columns” or use a custom SQL query based on your needs.

A Data Preview window will appear, allowing you to select the table that you want to import from.

  • Customize your import by selecting the tables, columns, and applying any necessary filters.

Click Import when finished.

Step 3. Prepare Data in Google Sheets/Excel

Once the data is imported, it will appear in your spreadsheet. Review and organize the data to ensure:

  • Proper headers are set.
  • Data accuracy is verified.
  • Necessary transformations are made to fit the right schema.

Step 4. Export Data to Snowflake

Finally, push the data to Snowflake:

  • In the Coefficient sidebar, click “Export to…” and then “Snowflake”.
  • Select the action (Insert, Update, Delete) and confirm your settings.
  • Designate the appropriate Snowflake table and complete field mappings.
  • Confirm your settings and then click “Export”.

When the export to Snowflake completes successfully, Coefficient displays the number of rows exported and skipped.


Step 3. Execute the Migration

Data Export from Redshift

  1. Use the UNLOAD command to export data to S3:

```sql
UNLOAD ('SELECT * FROM your_table')
TO 's3://your-bucket/your-prefix/'
IAM_ROLE 'arn:aws:iam::your-account-id:role/your-role'
PARALLEL OFF
ALLOWOVERWRITE
GZIP;
```

  2. Handle large datasets and partitioning: For tables exceeding 1 TB, consider partitioned unloads. Leave parallel unloading on (the default) so Redshift writes multiple files per partition, which also speeds up the parallel load into Snowflake:

```sql
UNLOAD ('SELECT * FROM your_table WHERE date_column BETWEEN ''2023-01-01'' AND ''2023-12-31''')
TO 's3://your-bucket/your-prefix/year=2023/'
IAM_ROLE 'arn:aws:iam::your-account-id:role/your-role'
ALLOWOVERWRITE
GZIP;
```

Data Import into Snowflake

  1. Use Snowflake’s COPY INTO command:

```sql
COPY INTO your_table
FROM @your_stage/your-prefix/
FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' COMPRESSION = GZIP)
ON_ERROR = 'CONTINUE';
```

  2. Leverage Snowpipe for continuous data loading:

```sql
CREATE OR REPLACE PIPE your_pipe
  AUTO_INGEST = TRUE
  AS
  COPY INTO your_table
  FROM @your_stage/your-prefix/
  FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' COMPRESSION = GZIP);
```

Validate Data Integrity

  1. Compare record counts and checksums:

```sql
-- In Redshift
SELECT COUNT(*), SUM(CAST(column1 AS BIGINT)) FROM your_table;

-- In Snowflake
SELECT COUNT(*), SUM(CAST(column1 AS NUMBER)) FROM your_table;
```

  2. Sample and detailed comparisons: Randomly select rows for in-depth comparison.
  3. Address discrepancies: Investigate and resolve any differences found during validation.

Step 4. Post-Migration Optimization

Performance Tuning in Snowflake

  1. Optimize query performance:
    • Use EXPLAIN to analyze query execution plans.
    • Implement appropriate clustering keys.
  2. Define clustering keys on large, frequently filtered tables (Snowflake partitions data automatically, so explicit partitioning is not needed):

```sql
ALTER TABLE your_table CLUSTER BY (date_column, category_column);
```
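
To check whether a clustering key is actually paying off, you can inspect clustering health; a quick sketch using the same placeholder names:

```sql
-- Returns JSON with average depth and partition-overlap statistics for the key
SELECT SYSTEM$CLUSTERING_INFORMATION('your_table', '(date_column, category_column)');
```

Lower average depth generally means better pruning; reclustering large tables consumes credits, so apply keys selectively.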

Adapt Existing Workflows

  1. Update ETL processes: Modify data loading scripts to use Snowflake-specific commands.
  2. Modify reporting and analytics tools: Update connection strings and adapt queries for Snowflake syntax.

Monitor and Maintain

  1. Set up Snowflake-specific monitoring:
    • Use Snowflake’s Account Usage views to track query performance and resource usage.
    • Implement automated alerts for long-running queries or high credit consumption.
  2. Implement cost control measures:
    • Set up resource monitors to cap warehouse usage.
    • Use auto-suspend and auto-resume features for warehouses.
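
For example, a sketch of a long-running-query check against the Account Usage views (the 7-day window and 5-minute threshold are arbitrary choices; note this view can lag real time by up to about 45 minutes):

```sql
SELECT query_id, user_name, warehouse_name,
       total_elapsed_time / 1000 AS elapsed_seconds
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
  AND total_elapsed_time > 5 * 60 * 1000   -- elapsed time is in milliseconds
ORDER BY total_elapsed_time DESC;
```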

Redshift to Snowflake: Elevating Your Data Warehouse

Migrating from Redshift to Snowflake opens up new possibilities for data analytics and scalability. A successful transition requires meticulous planning, focusing on data integrity and minimizing operational disruptions.

Post-migration, prioritize optimizing your Snowflake implementation. Leverage its unique features like Time Travel and Zero-Copy Cloning to enhance your data operations. Ensure your team is trained to fully utilize Snowflake’s capabilities.

Want to maximize your Snowflake investment? Discover how to connect your Snowflake data directly to Excel and Google Sheets for seamless analysis and reporting.


Julian Alvarado Content Marketing
Julian is a dynamic B2B marketer with 8+ years of experience creating full-funnel marketing journeys, leveraging an analytical background in biological sciences to examine customer needs.