A company needs to read multiple terabytes of data for an initial load as part of a Snowflake migration. The company can control the number and size of CSV extract files.
How does Snowflake recommend maximizing the load performance?
A. Use auto-ingest Snowpipes to load large files in a serverless model.
B. Produce the largest files possible, reducing the overall number of files to process.
C. Produce a larger number of smaller files and process the ingestion with size Small virtual warehouses.
D. Use an external tool to issue batched row-by-row inserts within BEGIN TRANSACTION and COMMIT commands.
Suggested Answer: B.
Snowflake's documentation recommends producing the largest files possible for data loading: larger files reduce the overall number of files to process and the per-file overhead that comes with handling many small files, which maximizes load performance by letting Snowflake ingest the data efficiently. References: [COF-C02] SnowPro Core Certification Exam Study Guide
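Since the company controls the number and size of the CSV extracts, one practical pre-processing step is to consolidate many small extracts into fewer, larger files before staging them for COPY INTO. The sketch below is a minimal illustration of that idea; the function name, file paths, and the target size are assumptions, not part of any Snowflake API, and the target should be tuned to your environment.

```python
import os

def combine_csv_files(paths, out_dir, target_bytes=250 * 1024 * 1024):
    """Concatenate many small CSV extracts (same schema, each with a
    header row) into fewer, larger files of roughly target_bytes each.
    Returns the list of combined file paths."""
    os.makedirs(out_dir, exist_ok=True)
    outputs = []          # paths of the combined files produced
    current = None        # file handle for the output being written
    size = 0              # bytes written to the current output so far
    idx = 0               # suffix counter for output file names
    header = None         # header row, taken from the first input file

    for path in paths:
        with open(path, newline="") as f:
            first = next(f, None)
            if first is None:
                continue          # skip empty extract files
            if header is None:
                header = first    # keep one copy of the header
            for line in f:
                # Start a new output file when none is open or the
                # current one has reached the target size.
                if current is None or size >= target_bytes:
                    if current:
                        current.close()
                    idx += 1
                    out_path = os.path.join(out_dir, f"load_{idx:04d}.csv")
                    current = open(out_path, "w", newline="")
                    current.write(header)
                    size = len(header)
                    outputs.append(out_path)
                current.write(line)
                size += len(line)

    if current:
        current.close()
    return outputs
```

Each combined file stays self-describing (it carries the header), and the output files can then be uploaded to a stage and loaded with a single COPY INTO, so the warehouse processes a handful of large files instead of thousands of tiny ones.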