Exam DP-203 topic 1 question 65 discussion

Actual exam question from Microsoft's DP-203
Question #: 65
Topic #: 1

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Storage account that contains 100 GB of files. The files contain rows of text and numerical values. 75% of the rows contain description data that has an average length of 1.1 MB.
You plan to copy the data from the storage account to an enterprise data warehouse in Azure Synapse Analytics.
You need to prepare the files to ensure that the data copies quickly.
Solution: You modify the files to ensure that each row is less than 1 MB.
Does this meet the goal?

  • A. Yes
  • B. No
Suggested Answer: A

Comments

Tj87
Highly Voted 2 years, 8 months ago
I think we had this question on a previous page, and the correct answer there was set as "compress the files".
upvoted 34 times
semauni
1 year, 9 months ago
More than one solution might be right. The question here is: if row size is reduced to under 1 MB, will loading go faster? The answer then is yes; whether compression would be even better is not relevant.
upvoted 5 times
...
dom271219
2 years, 8 months ago
Exactly, compress, because a lot of rows are longer than 1 MB.
upvoted 4 times
...
kim32
1 year, 12 months ago
The earlier variation said more than 1 MB, but here it is less than 1 MB. Since it is less, the answer is Yes.
upvoted 5 times
...
...
Phund
Highly Voted 2 years, 8 months ago
Selected Answer: A
"Ensure that each row is less than 1 MB" matches the PolyBase condition of rows under 1 MB, whatever method you use.
upvoted 16 times
...
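The 1 MB per-row PolyBase limit discussed above can be checked before loading. A minimal sketch (plain Python standard library, not from the thread; the file name in the usage example is hypothetical), assuming UTF-8 delimited text files with one row per line:

```python
# Flag rows in a delimited text file that hit PolyBase's 1 MB per-row limit.
MAX_ROW_BYTES = 1_048_576  # 1 MB, the PolyBase row-size ceiling

def oversized_rows(path):
    """Yield (line_number, byte_length) for each row at or above the limit."""
    with open(path, "rb") as f:  # binary mode: count bytes, not characters
        for lineno, raw in enumerate(f, start=1):
            if len(raw) >= MAX_ROW_BYTES:
                yield lineno, len(raw)

# Usage (hypothetical file name):
# for lineno, size in oversized_rows("descriptions.csv"):
#     print(f"row {lineno}: {size} bytes exceeds the limit")
```

If this reports no rows, the files already satisfy the "each row is less than 1 MB" condition from the question.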
ecalvo
Most Recent 1 month, 2 weeks ago
Selected Answer: B
The correct answer is B; it's the same as topic 1, question 24.
upvoted 1 times
...
imatheushenrique
1 month, 2 weeks ago
Selected Answer: B
Answer: B (No). While ensuring that each row is less than 1 MB may help in some cases, it does not fully address the performance challenges of copying data efficiently into Azure Synapse Analytics.
upvoted 1 times
...
moize
4 months ago
Selected Answer: B
Reducing row size to under 1 MB does not necessarily guarantee faster copying of the data to Azure Synapse Analytics. It would be more effective to use data-transfer optimization techniques, such as compressing the files or using performance-optimized file formats like Parquet or ORC.
upvoted 1 times
...
dgerok
1 year ago
Selected Answer: B
If you modify the files to ensure that each row is less than 1 MB, you might end up truncating or losing data from those rows. To achieve faster data copying, consider alternative approaches such as:
  • Compression: Compress the files before transferring them to Azure Synapse Analytics. This reduces the overall size of the data and improves transfer speed.
  • Parallelization: Split the data into smaller chunks and copy them in parallel to take advantage of multiple resources.
  • Optimized data types: Ensure that numerical values are stored using appropriate data types (e.g., integers, floats) to minimize storage space.
  • Batch processing: Process the data in batches rather than row by row to optimize data transfer.
The question should be read carefully and attentively: you need to prepare the files so that the data copies QUICKLY, not merely to make some improvement. I agree with rocky48: you need to do more to copy the data quickly. So, the answer is B (No).
upvoted 4 times
stornati
10 months, 1 week ago
I am with this answer. It is never good practice to modify the data at load time.
upvoted 2 times
...
...
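The compression alternative raised in the thread can be sketched with the standard library alone. A minimal example (not from the thread; the file name in the usage note is hypothetical) that writes a gzip copy of a delimited text file before transfer, since PolyBase can load gzip-compressed delimited text directly:

```python
import gzip
import shutil

def compress_file(src_path, dst_path=None):
    """Write a gzip copy of src_path and return the destination path."""
    dst_path = dst_path or src_path + ".gz"
    with open(src_path, "rb") as src, gzip.open(dst_path, "wb") as dst:
        shutil.copyfileobj(src, dst)  # stream in chunks; no full read into memory
    return dst_path

# Usage (hypothetical file name):
# compressed = compress_file("descriptions.csv")  # -> "descriptions.csv.gz"
```

For the 100 GB of text described in the question, compressing before transfer reduces the bytes moved over the network, which is the "compress the files" solution several commenters recall as the accepted variation.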
ItsAB
1 year, 2 months ago
Selected Answer: A
Given the large size of the table, I would use PolyBase for the data transfer. Additionally, considering PolyBase's constraint that it cannot load rows exceeding 1 MB, I would shrink the rows to comply with this requirement, making PolyBase the optimal choice for the transfer. => Option A: Yes.
upvoted 1 times
...
stickslinger
1 year, 2 months ago
Shouldn't this be "B: No"? Where does the question ask about PolyBase? Modifying the rows of data could affect the integrity of the data.
upvoted 1 times
...
rocky48
1 year, 2 months ago
Selected Answer: B
No, this solution does not meet the goal. While modifying the files to ensure that each row is less than 1 MB might help with individual row sizes, it won’t necessarily improve the overall data transfer speed. The total size of the files in the storage account is still 100 GB, and copying large volumes of data can be time-consuming regardless of individual row sizes. To optimize data transfer speed, consider other strategies such as parallelizing the data transfer, optimizing network bandwidth, or using appropriate data loading techniques in Azure Synapse Analytics.
upvoted 3 times
...
Charley92
1 year, 3 months ago
Selected Answer: B
No, this solution does not meet the goal. The files contain rows of text and numerical values, and 75% of the rows contain description data that has an average length of 1.1 MB. If you modify the files to ensure that each row is less than 1 MB, you may end up splitting the description data into multiple rows, which could affect the integrity of the data.
upvoted 3 times
...
ExamDestroyer69
1 year, 4 months ago
Selected Answer: A
**Variations**
  • Solution: You convert the files to compressed delimited text files. Does this meet the goal? **YES**
  • Solution: You copy the files to a table that has a columnstore index. Does this meet the goal? **NO**
  • Solution: You modify the files to ensure that each row is more than 1 MB. Does this meet the goal? **NO**
  • Solution: You modify the files to ensure that each row is less than 1 MB. Does this meet the goal? **YES**
upvoted 14 times
...
hcq31818
1 year, 5 months ago
Selected Answer: A
PolyBase enables Azure Synapse Analytics to import and export data from Azure Data Lake Store and from Azure Blob Storage, and it supports row sizes up to 1 MB. https://learn.microsoft.com/en-us/sql/relational-databases/polybase/polybase-guide?view=sql-server-ver16#:~:text=Azure%20integration,and%20from%20Azure%20Blob%20Storage. https://learn.microsoft.com/en-us/sql/relational-databases/polybase/polybase-versioned-feature-summary?view=sql-server-ver16
upvoted 1 times
...
kkk5566
1 year, 8 months ago
Selected Answer: A
A is correct
upvoted 1 times
...
auwia
1 year, 10 months ago
Selected Answer: A
Yes, with rows under 1 MB we increase load performance.
upvoted 2 times
...
e5019c6
1 year, 10 months ago
I thought that PolyBase just queries the tables and doesn't do any ETL or ELT processing.
upvoted 2 times
...
rocky48
1 year, 11 months ago
Selected Answer: A
An earlier variation said "You modify the files to ensure that each row is more than 1 MB," and the answer was "No." This question asks about "You modify the files to ensure that each row is less than 1 MB," and the answer given is "Yes."
upvoted 4 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other
