Exam DP-500 topic 1 question 20 discussion

Actual exam question from Microsoft's DP-500
Question #: 20
Topic #: 1

You have a deployment pipeline for a Power BI workspace. The workspace contains two datasets that use import storage mode.
A database administrator reports a drastic increase in the number of queries sent from the Power BI service to an Azure SQL database since the creation of the deployment pipeline.
An investigation into the issue identifies the following:
  • One of the datasets is larger than 1 GB and has a fact table that contains more than 500 million rows.
  • When publishing dataset changes to development, test, or production pipelines, a refresh is triggered against the entire dataset.
You need to recommend a solution to reduce the size of the queries sent to the database when the dataset changes are published to development, test, or production.
What should you recommend?

  • A. Turn off auto refresh when publishing the dataset changes to the Power BI service.
  • B. In the dataset, change the fact table from an import table to a hybrid table.
  • C. Enable the large dataset storage format for the workspace.
  • D. Create a dataset parameter to reduce the fact table row count in the development and test pipelines.
Suggested Answer: B

Comments

PrudenceK
Highly Voted 1 year, 11 months ago
Selected Answer: B
B is correct: By changing the fact table from an import table to a hybrid table, you can leverage the benefits of DirectQuery storage mode for the fact table. With DirectQuery, the data remains in the Azure SQL database, and Power BI sends queries directly to the database when users interact with the report. This approach can significantly reduce the amount of data transferred between Power BI and the Azure SQL database.
upvoted 8 times
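For context on PrudenceK's point: a hybrid table is normally configured through the table's incremental refresh policy, with the policy mode set to hybrid (in Power BI Desktop, the "Get the latest data in real time with DirectQuery" option in the incremental refresh settings; this requires Premium). A minimal TMSL-style sketch, assuming a hypothetical FactSales table and illustrative period settings:

```json
{
  "name": "FactSales",
  "refreshPolicy": {
    "policyType": "basic",
    "mode": "hybrid",
    "rollingWindowGranularity": "year",
    "rollingWindowPeriods": 5,
    "incrementalGranularity": "day",
    "incrementalPeriods": 7
  }
}
```

The table's sourceExpression (its M query filtered on RangeStart/RangeEnd) is omitted here. With the mode set to hybrid, historical data sits in import partitions that are only refreshed when their period changes, while the most recent period is served by a DirectQuery partition, so a refresh no longer re-imports all 500 million rows.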
TheSwedishGuy
1 year, 7 months ago
ChatGPT agrees with you. So I'll go with you and ChatGPT.
upvoted 2 times
nbagchi
Highly Voted 2 years, 5 months ago
Correct. For more info, check: https://powerbi.microsoft.com/en-za/blog/announcing-public-preview-of-hybrid-tables-in-power-bi-premium/
upvoted 5 times
ivanb94
2 years, 4 months ago
I also agree with B, mostly because with D I don't see how we justify working with fewer rows in development and production. Usually you want more data at these stages, not less, to make sure what you've developed applies to more/different data.
upvoted 2 times
ivanb94
2 years, 4 months ago
*test and production, sorry
upvoted 3 times
Sri966
Most Recent 1 year, 4 months ago
Selected Answer: B
B is the simple and correct answer
upvoted 1 times
Akin_Eren
1 year, 5 months ago
Selected Answer: B
I think it's B as well, as the DirectQuery option would be best for such a huge table. Option D does not offer a solution for the production environment.
upvoted 1 times
bigdave987
1 year, 8 months ago
Selected Answer: B
The question asks you to "reduce the size of the queries sent to the database when the dataset changes are published to development, test, or production", PRODUCTION being the key word here. Option D (create a dataset parameter to reduce the fact table row count in the development and test pipelines) does NOT reduce queries for the production environment. The correct answer is B.
upvoted 1 times
Hisayuki
1 year, 8 months ago
Selected Answer: B
With DirectQuery for the fact table.
upvoted 1 times
Deloro
1 year, 8 months ago
Selected Answer: D
D. I have applied this with clients already. If you publish a report with a large dataset, it will take forever. Limit the tables with a parameter (for example, to the top 5 rows), deploy, and then in the service turn the parameter off (set it to false); the data is refreshed in the service and you won't publish a large dataset.
upvoted 1 times
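Deloro's approach can be sketched in Power Query M. Everything here is illustrative, not from the question: a hypothetical Boolean parameter (call it SampleData) plus a hypothetical FactSales source, where the parameter decides whether the query returns a small sample or the full table:

```m
let
    // Connect to the source database (server/database names are placeholders)
    Source = Sql.Database("myserver.database.windows.net", "SalesDb"),
    Fact = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // SampleData = true in Desktop/dev: publish only a tiny sample;
    // set it to false in the service and refresh to load the full table
    Result = if SampleData then Table.FirstN(Fact, 5) else Fact
in
    Result
```

Deployment pipeline rules (or the dataset's settings page in the service) can then set the parameter value per stage, so only the stages you choose issue full-size queries against the source.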
Alborz
1 year, 9 months ago
Selected Answer: D
By creating a dataset parameter that can dynamically filter or reduce the fact table's row count during development and testing pipelines, you can limit the amount of data that needs to be refreshed and processed. This can help in reducing the load on the database during these pipeline runs and optimize the performance.
upvoted 1 times
ExamPage
1 year, 10 months ago
Correct Answer: B. https://data-mozart.com/hybrid-tables-in-power-bi-the-ultimate-guide/
upvoted 1 times
dev2dev
2 years, 1 month ago
Selected Answer: B
The issue was identified after deployment, so solution D is wrong because it only applies to development and testing. The given answer is correct.
upvoted 2 times
kelibanlangen
2 years, 1 month ago
The answer is D. Please read the question carefully: "reduce the size of the queries sent to the database when the dataset changes are published to development, test, or production".
upvoted 3 times
Cococo
2 years ago
Agree. The only option that reduces the size of the queries is to limit records (there is an "or" in "are published to development, test, or production", so it applies to any of them); a hybrid table will still get you the same 500 million rows.
upvoted 2 times
Az301301X
2 years, 2 months ago
Selected Answer: D
The issue of a drastic increase in the number of queries sent to an Azure SQL database can be addressed by reducing the amount of data transferred during a dataset refresh. Given that one of the datasets is larger than 1 GB and has a fact table with over 500 million rows, a full refresh of the entire dataset can cause a significant increase in the number of queries sent to the database. Therefore, to reduce the size of queries sent to the database during dataset refreshes, I would recommend creating a dataset parameter to filter the fact table rows based on a condition. This will enable the development and test pipelines to refresh the dataset with a smaller subset of data, reducing the overall size of the queries sent to the database.
upvoted 5 times
Az301301X
2 years, 2 months ago
Answer given by ChatGPT
upvoted 1 times
AN_78
2 years, 1 month ago
I disagree. Data in the test environment should be the same as in production, hence you can't reduce the number of rows. For me the correct answer is B.
upvoted 1 times
DarioReymago
2 years, 1 month ago
I prefer B. The requirement does not say the data must be the same in production and test/development. D could be a solution, but we have the issue in production.
upvoted 1 times
dera23
2 years, 2 months ago
D is correct. It says that a refresh is always triggered when publishing from one stage to the next; hence changing the refresh mode will not help the situation.
upvoted 1 times
solref
2 years, 2 months ago
B is correct. You can watch this video to learn more about hybrid table benefits: https://www.youtube.com/watch?v=HckuKYlx8kk
upvoted 2 times
DarioReymago
2 years, 2 months ago
Selected Answer: B
B is correct
upvoted 2 times
teedot
2 years, 2 months ago
I believe B is correct, since the volume of data going into the test environment should be as close as possible to what will be experienced in production.
upvoted 1 times
MrXBasit
2 years, 5 months ago
Selected Answer: D
Answer is correct
upvoted 2 times
Community vote distribution: A (35%), C (25%), B (20%), Other