
Exam DP-200 topic 2 question 21 discussion

Actual exam question from Microsoft's DP-200
Question #: 21
Topic #: 2

You need to develop a pipeline for processing data. The pipeline must meet the following requirements:
✑ Scale up and down resources for cost reduction
✑ Use an in-memory data processing engine to speed up ETL and machine learning operations
✑ Use streaming capabilities
✑ Provide the ability to code in SQL, Python, Scala, and R
✑ Integrate workspace collaboration with Git

What should you use?

  • A. HDInsight Spark Cluster
  • B. Azure Stream Analytics
  • C. HDInsight Hadoop Cluster
  • D. Azure SQL Data Warehouse
  • E. HDInsight Kafka Cluster
  • F. HDInsight Storm Cluster
Suggested Answer: A
Apache Spark is an open-source, parallel-processing framework that supports in-memory processing to boost the performance of big-data analysis applications.
HDInsight is a managed Hadoop service. Use it to deploy and manage Hadoop clusters in Azure. For batch processing, you can use Spark, Hive, Hive LLAP, or MapReduce.
Languages: R, Python, Java, Scala, SQL
You can create an HDInsight Spark cluster using an Azure Resource Manager template. The template can be found on GitHub.
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/batch-processing
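To make the reasoning concrete, here is a minimal PySpark sketch (illustrative only, not part of the exam material) showing the two requirements that rule out the other options: in-memory ETL via cache() and built-in structured streaming. The storage paths, column names, and app name are hypothetical.

```python
# Minimal PySpark sketch: in-memory batch ETL plus structured streaming,
# the capabilities that point to an HDInsight Spark cluster.
# Paths, columns, and names below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("etl-pipeline").getOrCreate()

# Batch ETL: cache() keeps the DataFrame in memory across the
# transformation and downstream ML steps (the "in-memory engine" benefit).
sales = (spark.read
         .option("header", "true")
         .csv("wasbs://data@mystorageaccount.blob.core.windows.net/sales/")
         .cache())

cleaned = sales.filter(col("amount").isNotNull())
cleaned.write.mode("overwrite").parquet(
    "wasbs://data@mystorageaccount.blob.core.windows.net/curated/sales/")

# Streaming: the same DataFrame API reads a continuous source
# (here, new files landing in a folder) and writes results incrementally.
stream = (spark.readStream
          .schema(sales.schema)
          .csv("wasbs://data@mystorageaccount.blob.core.windows.net/incoming/"))

query = (stream.groupBy("region").count()
         .writeStream
         .outputMode("complete")
         .format("console")
         .trigger(processingTime="1 minute")
         .start())
```

The same cluster also exposes SQL, Scala, and R APIs, which covers the multi-language requirement in the question.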

Comments

hoangton
3 years, 12 months ago
Answer is CORRECT. The key phrase is "in-memory data processing".
upvoted 4 times
Community vote distribution: A (35%), C (25%), B (20%), Other