
Answer-first summary for fast verification
Answer: (A) Aim to produce data files that are between 100 MB and 250 MB in size, compressed; (C) enclose fields that contain delimiter characters in single or double quotes; and (D) split large files into a greater number of smaller files to distribute the load among the compute resources in an active warehouse.
The correct answers are A, C, and D, based on Snowflake's official documentation. Option A is optimal because Snowflake recommends compressed file sizes of roughly 100-250 MB to balance parallel-processing efficiency against per-file overhead. Option C is essential for data integrity: enclosing delimiter-containing fields in quotes prevents parsing errors during loading. Option D improves throughput because splitting large files lets the load be distributed across the compute resources of an active warehouse. Option B is incorrect because loading from a cloud storage service in a different region or platform adds data-transfer (egress) costs and latency rather than saving money. Option E is suboptimal: starting with the largest warehouse may incur unnecessary cost without a proportional performance benefit. Option F is wrong because staged data should be partitioned into logical, predictable paths (for example by date or source) so that loads can target specific folders, not large folders with random paths.
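The quoting behavior behind option C can be illustrated with a minimal Python sketch (Python and the `csv` module are my choice of illustration, not anything mandated by Snowflake). With minimal quoting, only fields that actually contain the delimiter are enclosed in double quotes, which matches what a loader configured with an optional field enclosure expects:

```python
import csv
import io

# Sample rows: one field contains the comma delimiter.
rows = [
    ["id", "description"],
    ["1", "plain value"],
    ["2", "contains, a comma"],
]

# QUOTE_MINIMAL (the csv module's default) encloses only the fields
# that contain the delimiter, the quote character, or a newline.
buf = io.StringIO()
writer = csv.writer(buf, delimiter=",", quotechar='"', quoting=csv.QUOTE_MINIMAL)
writer.writerows(rows)
text = buf.getvalue()

# The delimiter-containing field is quoted; the others are left bare,
# so a parser splits each line into exactly two fields.
```

Reading the output back with a CSV parser recovers the original two-field rows, which is exactly the failure that unquoted delimiter characters would cause.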
Author: LeetQuiz Editorial Team
Which of the following are best practices for loading data into Snowflake? (Choose three.)
A
Aim to produce data files that are between 100 MB and 250 MB in size, compressed.
B
Load data from files in a cloud storage service in a different region or cloud platform from the service or region containing the Snowflake account, to save on cost.
C
Enclose fields that contain delimiter characters in single or double quotes.
D
Split large files into a greater number of smaller files to distribute the load among the compute resources in an active warehouse.
E
When planning which warehouse(s) to use for data loading, start with the largest warehouse possible.
F
Partition the staged data into large folders with random paths, allowing Snowflake to determine the best way to load each file.