
Exam DP-203 topic 10 question 1 discussion

Actual exam question from Microsoft's DP-203
Question #: 1
Topic #: 10

What should you do to improve high availability of the real-time data processing solution?

  • A. Deploy a High Concurrency Databricks cluster.
  • B. Deploy an Azure Stream Analytics job and use an Azure Automation runbook to check the status of the job and to start the job if it stops.
  • C. Set Data Lake Storage to use geo-redundant storage (GRS).
  • D. Deploy identical Azure Stream Analytics jobs to paired regions in Azure.
Suggested Answer: D 🗳️

Comments

petulda
Highly Voted 2 years, 8 months ago
There is a requirement to 'Minimize the number of Azure services'. With Event Hubs Capture (https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-capture-overview), data can be stored in the Data Lake without using Stream Analytics. In that case, only regional redundancy for the Data Lake would be needed.
upvoted 13 times
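As a rough illustration of the Event Hubs Capture approach petulda describes, the sketch below enables Capture on an event hub so that incoming events land in the storage account directly, with no Stream Analytics job in between. All resource names are placeholders, and the flags should be checked against the current az eventhubs documentation.

import subprocess

# Placeholder names; substitute your own resource group, namespace, event hub,
# storage account resource ID, and container.
RG = "litware-rg"
NAMESPACE = "litware-ehns"
EVENT_HUB = "inventory-events"
STORAGE_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/litware-rg"
    "/providers/Microsoft.Storage/storageAccounts/litwaredlsgen2"
)
CONTAINER = "inventory-staging"

# Enable Event Hubs Capture so events are written straight to the storage
# account (an ADLS Gen2 account is addressed through its blob endpoint),
# removing the need for a Stream Analytics job just to stage the data.
subprocess.run(
    [
        "az", "eventhubs", "eventhub", "update",
        "--resource-group", RG,
        "--namespace-name", NAMESPACE,
        "--name", EVENT_HUB,
        "--enable-capture", "true",
        "--destination-name", "EventHubArchive.AzureBlockBlob",
        "--storage-account", STORAGE_ID,
        "--blob-container", CONTAINER,
    ],
    check=True,
)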
subhub
8 months ago
I passed today with 820. Thoughts: about 80% of the exam questions were on ExamTopics; the other 20% were not difficult. I got the Contoso case study. Good luck.
upvoted 12 times
GDJ2022
2 years, 3 months ago
The question is asking "improve high availability of the real-time data processing solution" and not high availability of data. Hence the correct answer is D
upvoted 3 times
ian_viana
2 years, 8 months ago
Agree; they also want a staging step in Data Lake Gen2 ('Stage Inventory data in Azure Data Lake Storage Gen2'), and we don't need Stream Analytics to do that. Event Hubs enables you to automatically capture the streaming data into an Azure Blob storage or Azure Data Lake Storage Gen1 or Gen2 account of your choice, with the added flexibility of specifying a time or size interval.
upvoted 1 times
ian_viana
2 years, 8 months ago
Please disregard my answer! Event Hubs can capture data to Data Lake and Blob, but I think the key phrase in the question is 'real-time data PROCESSING solution'. Event Hubs is just for capture; Stream Analytics does the processing, so I'm going with answer D.
upvoted 9 times
Marcus1612
2 years, 8 months ago
I agree. Regional redundancy would be great for the data, but the processing would be lost. We need a solution that provides high availability for PROCESSING and DATA.
upvoted 8 times
sachabess79
2 years, 7 months ago
NB: it's an asynchronous copy.
upvoted 1 times
kkk5566
Most Recent 8 months, 2 weeks ago
Selected Answer: D
should be D
upvoted 1 times
vctrhugo
10 months, 4 weeks ago
Selected Answer: D
By deploying identical Azure Stream Analytics jobs to paired regions in Azure, you ensure redundancy and fault tolerance for the real-time data processing solution. Paired regions in Azure are geographically separated and designed to provide resilience and data protection in the event of a regional outage or failure. If one region becomes unavailable, the other paired region can seamlessly take over the processing workload, ensuring continuous availability of the real-time data processing solution.
upvoted 3 times
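To make the paired-region idea concrete, here is a minimal sketch (not part of the exam material) that deploys one and the same Stream Analytics job template into both regions of an Azure pair via the Azure CLI. The template file, its 'location' parameter, and all names are placeholders.

import subprocess

# Example region pair and a placeholder ARM template describing the ASA job
# (inputs, query, outputs). Both deployments use the identical definition.
PAIRED_REGIONS = ["eastus", "westus"]
TEMPLATE_FILE = "streamanalytics-job.json"  # assumed to expose a 'location' parameter

for region in PAIRED_REGIONS:
    rg = f"litware-asa-{region}"  # one resource group per region (placeholder naming)

    # Create (or update) the per-region resource group.
    subprocess.run(
        ["az", "group", "create", "--name", rg, "--location", region],
        check=True,
    )

    # Deploy the identical Stream Analytics job into each region. If one region
    # fails, the job in the paired region keeps processing the stream.
    subprocess.run(
        [
            "az", "deployment", "group", "create",
            "--resource-group", rg,
            "--template-file", TEMPLATE_FILE,
            "--parameters", f"location={region}",
        ],
        check=True,
    )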
Deeksha1234
1 year, 9 months ago
Selected Answer: D
answer D is correct
upvoted 4 times
StudentFromAus
1 year, 10 months ago
Selected Answer: D
Answer is correct
upvoted 3 times
GDJ2022
2 years, 3 months ago
D is correct. The question is asking "improve high availability of the real-time data processing solution" and not high availability of data. Hence the correct answer is D.
upvoted 3 times
PallaviPatel
2 years, 3 months ago
Selected Answer: D
I go with D, and the info provided by Canary_2021 is correct.
upvoted 2 times
HaBroNounen
2 years, 4 months ago
Guys, the correct answer is A. It says to limit the number of different services to use. Databricks is already being used as an analytical tool for the data scientists, so it can also be used for processing jobs.
upvoted 2 times
Davico93
1 year, 10 months ago
You are right, but a High Concurrency cluster doesn't improve one-shot processing; it works better with multiple users.
upvoted 1 times
edba
2 years, 4 months ago
I think the answer is correct!
upvoted 2 times
Canary_2021
2 years, 4 months ago
The answer should be D if the real-time data load solution moves data through Azure Data Lake Storage Gen2 into Azure SQL DB or Synapse Analytics as the analytical data store. In that case, Power BI and Azure Databricks notebooks will run queries against Azure SQL DB or Synapse Analytics. From the case study:
  • Daily inventory data comes from a Microsoft SQL server located on a private network.
  • Stage Inventory data in Azure Data Lake Storage Gen2 before loading the data into the analytical data store.
  • See inventory levels across the stores. Data must be updated as close to real time as possible.
  • Litware employs business analysts who prefer to analyze data by using Microsoft Power BI, and data scientists who prefer analyzing data in Azure Databricks notebooks.
upvoted 4 times
jx1982
2 years, 4 months ago
I think the answer C is correct, high availability of "the real-time data processing", not high availability of "the data storage"
upvoted 4 times
jx1982
2 years, 4 months ago
sorry, typo, right answer is D
upvoted 3 times
FredNo
2 years, 5 months ago
What is the correct answer?
upvoted 2 times
Community vote distribution: A (35%), C (25%), B (20%), Other