Loading from gzipped CSV is several times faster than loading from ORC or Parquet, at an impressive ~15 TB/hour. While 5-6 TB/hour is decent if your data is already in ORC or Parquet, don't go out of your way to create ORC or Parquet files from CSV in the hope that they will load into Snowflake faster.
Loading data into a fully structured (columnarized) schema is ~10-20% faster than landing it in a VARIANT column.
https://community.snowflake.com/s/article/How-to-Load-Terabytes-Into-Snowflake-Speeds-Feeds-and-Techniques
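For concreteness, here is a minimal sketch of the two loading patterns the article compares: gzipped CSV into a fully structured table versus landing raw data in a single VARIANT column. The stage path (@my_stage) and table/column names are made up for illustration, not taken from the article.

-- Structured target: each CSV field maps to a typed column.
CREATE OR REPLACE TABLE orders (
    order_id    NUMBER,
    customer_id NUMBER,
    order_ts    TIMESTAMP_NTZ,
    amount      NUMBER(12,2)
);

COPY INTO orders
  FROM @my_stage/orders_csv/
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP SKIP_HEADER = 1);

-- Semi-structured target: each Parquet row lands in one VARIANT column
-- (per the article, roughly 10-20% slower than the structured load).
CREATE OR REPLACE TABLE orders_raw (v VARIANT);

COPY INTO orders_raw
  FROM @my_stage/orders_parquet/
  FILE_FORMAT = (TYPE = PARQUET);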
Should be B. While CSV (Gzipped) (option C) is a commonly used format and can be space-efficient due to compression, it is not as performant as Parquet (option B) for loading data into Snowflake.