Exam SnowPro Core topic 1 question 795 discussion

Actual exam question from Snowflake's SnowPro Core
Question #: 795
Topic #: 1

What are the recommended steps to address poor SQL query performance due to data spilling? (Choose two.)

  • A. Clone the base table.
  • B. Fetch required attributes only.
  • C. Use a larger virtual warehouse.
  • D. Process the data in smaller batches.
  • E. Add another cluster in the virtual warehouse.
Suggested Answer: CD
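For context, a rough sketch of what the two suggested remediations could look like in Snowflake SQL. All warehouse, table, and column names below are hypothetical, not taken from the question:

    -- Option C: use a larger virtual warehouse, which provides more memory
    -- and more local disk space (warehouse name is hypothetical).
    ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'LARGE';

    -- Option D: process the data in smaller batches, e.g. one month at a
    -- time, so each query's working set is more likely to fit in memory.
    INSERT INTO sales_agg
    SELECT region, SUM(amount) AS total
    FROM sales
    WHERE sale_date >= '2024-01-01' AND sale_date < '2024-02-01'
    GROUP BY region;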

Comments

Lematthew31
2 weeks, 4 days ago
Selected Answer: BC
It's B and C
upvoted 1 times
0e504b5
3 months ago
Selected Answer: BC
The spilling can't always be avoided, especially for large batches of data, but it can be decreased by:
  • Reviewing the query for query optimization, especially if it is a new query.
  • Reducing the amount of data processed, for example by trying to improve partition pruning, or projecting only the columns that are needed in the output.
  • Decreasing the number of parallel queries running in the warehouse.
  • Trying to split the processing into several steps (for example, by replacing the CTEs with temporary tables).
  • Using a larger warehouse. This effectively means more memory and more local disk space.
https://community.snowflake.com/s/article/Performance-impact-from-local-and-remote-disk-spilling
upvoted 2 times
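A minimal illustration of the "projecting only the columns that are needed" advice quoted above; the orders table and its columns are hypothetical:

    -- Spill-prone: the sort has to carry every column of every row.
    SELECT *
    FROM orders
    ORDER BY order_date;

    -- Less spill-prone: fetch required attributes only (answer B).
    SELECT order_id, order_date, total_amount
    FROM orders
    ORDER BY order_date;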
Isio05
6 months, 2 weeks ago
Selected Answer: BC
Actually, I think there are three possible options. However, only B and C are explicitly mentioned in the docs (as shown by basdas's comment). Moreover, processing data in smaller batches can be unwieldy and require more work than the other solutions. Thus I eventually vote for BC.
upvoted 1 times
basdas
6 months, 3 weeks ago
Selected Answer: BC
The spilling can't always be avoided, especially for large batches of data, but it can be decreased by:
  • Reviewing the query for query optimization, especially if it is a new query.
  • Reducing the amount of data processed, for example by trying to improve partition pruning, or projecting only the columns that are needed in the output.
  • Decreasing the number of parallel queries running in the warehouse.
  • Trying to split the processing into several steps (for example, by replacing the CTEs with temporary tables).
  • Using a larger warehouse; this effectively means more memory and more local disk space.
upvoted 2 times
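A small sketch of the "split the processing into several steps" idea from the quote above, replacing a CTE with a temporary table; all table and column names are hypothetical:

    -- Step 1: materialize the intermediate result as a temporary table
    -- instead of keeping it as a CTE, so later steps work on a smaller,
    -- pre-aggregated data set.
    CREATE TEMPORARY TABLE daily_totals AS
    SELECT customer_id, order_date, SUM(amount) AS day_total
    FROM orders
    GROUP BY customer_id, order_date;

    -- Step 2: join the compact intermediate result to the dimension table.
    SELECT c.customer_name, t.order_date, t.day_total
    FROM daily_totals t
    JOIN customers c ON c.customer_id = t.customer_id;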
Heetec
7 months, 1 week ago
Selected Answer: CD
CD is correct: https://community.snowflake.com/s/article/Performance-impact-from-local-and-remote-disk-spilling
upvoted 3 times
Rajivnb
7 months ago
It's BD. Snowflake does not encourage increasing the warehouse size if something can be done with the existing query. The link you gave also talks about "projecting only the columns that are needed in the output".
upvoted 1 times
Community vote distribution: A (35%), C (25%), B (20%), Other