
Exam DP-500 topic 1 question 73 discussion

Actual exam question from Microsoft's DP-500
Question #: 73
Topic #: 1

You develop a solution that uses a Power BI Premium capacity. The capacity contains a dataset that is expected to consume 50 GB of memory.
Which two actions should you perform to ensure that you can publish the model successfully to the Power BI service? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. Restart the capacity.
  • B. Publish an initial dataset that is less than 10 GB.
  • C. Increase the Max Offline Dataset Size setting.
  • D. Invoke a refresh to load historical data based on the incremental refresh policy.
  • E. Publish the complete dataset.
Suggested Answer: BD
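For context on what options B and D look like in practice: after publishing a small initial model, the refresh in option D can be triggered through the Power BI REST API. The sketch below is a minimal Python illustration, not part of the exam material; the workspace ID, dataset ID, and access token are placeholders, and it assumes an incremental refresh policy was already defined in the model before publishing.

```python
# Minimal sketch: trigger a refresh on a published dataset via the Power BI REST API
# ("Refresh Dataset In Group"), so the service loads historical data according to
# the incremental refresh policy. IDs and token below are placeholders.
import requests

ACCESS_TOKEN = "<AAD access token with Dataset.ReadWrite.All scope>"  # placeholder
GROUP_ID = "<workspace-id>"    # placeholder workspace (group) ID
DATASET_ID = "<dataset-id>"    # placeholder dataset ID

url = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"}

# Kick off the refresh; the service applies the model's incremental refresh policy.
response = requests.post(url, headers=headers, json={"notifyOption": "MailOnFailure"})
response.raise_for_status()
print("Refresh request accepted:", response.status_code)  # 202 means the refresh was queued
```

A 202 response only means the refresh was queued; the service then loads the historical partitions per the incremental refresh policy, which is how the dataset grows to its full size after the small initial upload.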

Comments

salvalcaraz
1 year, 6 months ago
Selected Answer: DE
It's in the docs: https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models#enable-large-semantic-models --> Publish the model as a semantic model to the service. --> Invoke a refresh to load historical data based on the incremental refresh policy.
upvoted 3 times
...
DarioReymago
2 years, 2 months ago
Selected Answer: BD
In practice, B and D are the best solution.
upvoted 2 times
...
Az301301X
2 years, 3 months ago
Selected Answer: CE
Chat GPT: The correct options to ensure successful publishing of a Power BI Premium dataset that is expected to consume 50 GB of memory are:
C. Increase the Max Offline Dataset Size setting: Power BI Premium capacity has a default limit of 10 GB for the maximum offline dataset size that can be published to the service. You can increase this limit to accommodate larger datasets. In this case, you should increase the limit to 50 GB to support the dataset.
E. Publish the complete dataset: To ensure that the complete dataset is published successfully, you should not publish an initial dataset that is less than 10 GB, as suggested in option B. Also, restarting the capacity, as suggested in option A, is not necessary in this scenario. Invoking a refresh to load historical data based on the incremental refresh policy, as suggested in option D, is optional and does not affect the publishing process.
Therefore, the correct actions to perform are to increase the Max Offline Dataset Size setting to 50 GB and publish the complete dataset.
upvoted 1 times
...
solref
2 years, 3 months ago
Selected Answer: DE
https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models
upvoted 3 times
...
stfglv
2 years, 5 months ago
We need to differentiate between the limit for a dataset and the limit for a data model that will be uploaded to the service. As this link (https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models) states, for Premium the option to enable large datasets does not affect the Power BI Desktop model upload size, which is still limited to 10 GB: "Instead, datasets can grow beyond that limit in the service on refresh." In conclusion, until you upload it, the model has to be smaller than 10 GB; afterwards it can grow beyond that. Incremental refresh is obviously a must.
upvoted 2 times
ThariCD
2 years, 4 months ago
The link you mention also includes the steps to enable a large dataset for a new model published to the service, which are:
1. Create a model in Power BI Desktop. If your dataset will become larger and progressively consume more memory, be sure to configure Incremental Refresh.
2. Publish the model as a dataset to the service.
3. In the service > dataset > Settings, expand Large dataset storage format, set the slider to On and select Apply.
Doesn't that mean that D & E are the correct answers? The incremental refresh is obvious from step 1, but doesn't step 2 mean you should publish the complete dataset? Nowhere in those steps does it say you need to first publish 10 GB of your dataset.
upvoted 2 times
...
...
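Following up on the workflow quoted from the large-models article in the thread above (publish a model under the 10 GB upload limit, then let it grow in the service on refresh), here is a minimal Python sketch of one way to confirm that the historical load finished, by polling the refresh history endpoint of the Power BI REST API. The workspace ID, dataset ID, and token are placeholders, and error handling is omitted.

```python
# Minimal sketch: after publishing the small initial model and triggering a refresh,
# poll the refresh history ("Get Refresh History In Group") until the latest refresh
# reaches a terminal status. IDs and token are placeholders.
import time
import requests

ACCESS_TOKEN = "<AAD access token>"  # placeholder
GROUP_ID = "<workspace-id>"          # placeholder
DATASET_ID = "<dataset-id>"          # placeholder

url = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes?$top=1"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

while True:
    latest = requests.get(url, headers=headers).json()["value"][0]
    # "Unknown" means the refresh is still running; "Completed" and "Failed" are terminal.
    if latest["status"] != "Unknown":
        print("Latest refresh finished with status:", latest["status"])
        break
    time.sleep(60)  # check again in a minute
```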
Saffar
2 years, 5 months ago
Selected Answer: BD
I think BD is correct.
upvoted 4 times
...
cherious
2 years, 6 months ago
Selected Answer: BD
BD is correct.
upvoted 4 times
...
Maazi
2 years, 6 months ago
I think D & E are the correct answers. See the section "Enable large Datasets" at https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models
upvoted 2 times
...
jeroen12345
2 years, 6 months ago
Selected Answer: BD
D + B is correct
upvoted 3 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other