Exam DP-200 topic 6 question 116 discussion

Actual exam question from Microsoft's DP-200
Question #: 116
Topic #: 6

A company plans to store hundreds of files in an Azure Storage account and an Azure Data Lake Storage account. The files will be stored in the Parquet format. A solution must be in place that meets the following requirements:
- Provide the ability to process the data every 5 hours
- Give the ability for interactive data analysis
- Give the ability to process data using solid-state drive caching
- Make use of Directed Acyclic Graph processing mechanisms
- Provide support for REST API calls for monitoring purposes
- Ensure support for Python and Integration with Microsoft Power BI
Which of the following would you consider for the solution?

  • A. Azure SQL Data Warehouse
  • B. HDInsight Apache Storm cluster
  • C. Azure Stream Analytics
  • D. HDInsight Spark cluster
Suggested Answer: D
An HDInsight Spark cluster satisfies all of the stated requirements. The Microsoft documentation for Apache Spark on HDInsight describes the following capabilities:
• Directed Acyclic Graph (DAG) processing: Spark plans and executes jobs as DAGs of transformations.
• Interactive data analysis: Spark clusters in HDInsight include Jupyter and Zeppelin notebooks for interactive work.
• Caching on SSDs: data can be cached either in memory or in solid-state drives attached to the cluster nodes.
• REST APIs: Spark clusters in HDInsight include Apache Livy, a REST-API-based Spark job server for submitting and monitoring jobs remotely.
• Python support and Power BI integration: Spark supports Python (PySpark), and HDInsight Spark clusters ship with connectors for BI tools such as Microsoft Power BI.
• Storage integration: Spark reads Parquet files directly from Azure Storage and Azure Data Lake Storage, and batch jobs can be scheduled to run on a recurring basis, such as every 5 hours.
All of the other options are incorrect because they do not provide all of these capabilities: Azure SQL Data Warehouse and Azure Stream Analytics do not use DAG processing mechanisms, and Apache Storm is a stream-processing engine without interactive notebooks, SSD caching, or Power BI integration.
Reference:
https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-overview
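Several of the requirements (Python support, DAG processing, caching, interactive analysis of Parquet files) map directly onto Spark, the option argued for in the comments below. A minimal PySpark sketch of the workload, assuming a HDInsight Spark cluster with the files already in attached storage; the storage account, path, and column names are hypothetical:

```python
from pyspark.sql import SparkSession

# Start (or reuse) a Spark session; on HDInsight this runs against the cluster.
spark = SparkSession.builder.appName("parquet-batch").getOrCreate()

# Read Parquet files from Azure Data Lake Storage Gen2 (hypothetical path).
df = spark.read.parquet(
    "abfss://data@mystorageaccount.dfs.core.windows.net/files/"
)

# persist() keeps the dataset cached between actions; with the HDInsight
# IO Cache enabled, cached blocks can spill to local SSDs, not just memory.
df.persist()

# Spark builds a DAG of transformations and only executes it when an
# action (count, show, ...) is triggered -- e.g. from a Jupyter notebook.
df.groupBy("category").count().show()  # "category" is a hypothetical column
```

A batch job like this can be submitted on a recurring 5-hour schedule through the cluster's Livy REST endpoint, which also exposes job status for monitoring.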

Comments

massnonn
3 years, 6 months ago
"Ensure support for Python" — this seems to mean you have to use Spark
upvoted 1 times
Avinash75
3 years, 11 months ago
should be D .. Spark Cluster has all the requirements as mentioned in the reference link : https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-overview
upvoted 1 times
Community vote distribution
A (35%)
C (25%)
B (20%)
Other