Exam DP-100 topic 2 question 3 discussion

Actual exam question from Microsoft's DP-100
Question #: 3
Topic #: 2

DRAG DROP -
You are building an intelligent solution using machine learning models.
The environment must support the following requirements:
✑ Data scientists must build notebooks in a cloud environment.
✑ Data scientists must use automatic feature engineering and model building in machine learning pipelines.
✑ Notebooks must be deployed to retrain using Spark instances with dynamic worker allocation.
✑ Notebooks must be exportable to be version controlled locally.
You need to create the environment.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:

Suggested Answer:

Comments

Dasist
Highly Voted 4 years, 2 months ago
Should be: Create Azure Databricks cluster -> Install Azure ML SDK for Python -> Create and execute Jupyter notebook using AutoML -> Export Jupyter notebook to local environment. That's because you need the automatic feature engineering provided by AutoML.
upvoted 50 times
spaceykacey
3 years, 7 months ago
In case anyone still has doubts about this, refer to: https://docs.microsoft.com/en-us/azure/machine-learning/how-to-configure-databricks-automl-environment
upvoted 3 times
...
prashantjoge
4 years ago
Don't see the option to install the ML SDK for Python for Databricks.
upvoted 3 times
prashantjoge
4 years ago
You can add MLlib using a script action when you create the HDInsight service.
upvoted 3 times
...
...
bruce
4 years, 2 months ago
"Notebooks must be deployed to retrain using Spark instances with dynamic worker allocation": this condition won't be satisfied with Jupyter.
upvoted 5 times
...
...
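The four-step sequence above can be sketched as a configuration fragment using the Azure ML SDK for Python (v1) from a Databricks notebook. This is a minimal sketch, not the exam's answer key: the workspace name, subscription and resource group placeholders, datastore path, and label column are all hypothetical, and `sc` is the SparkContext that Databricks predefines in notebooks.

```python
# Minimal sketch: an AutoML run from a Databricks notebook via the Azure ML SDK (v1).
# Workspace details, the dataset path, and the label column name are hypothetical.
from azureml.core import Workspace, Dataset, Experiment
from azureml.train.automl import AutoMLConfig

ws = Workspace.get(name="my-workspace",          # hypothetical workspace
                   subscription_id="<sub-id>",
                   resource_group="<resource-group>")

# Tabular training data registered against the workspace's default datastore.
train_data = Dataset.Tabular.from_delimited_files(
    path=[(ws.get_default_datastore(), "data/train.csv")])  # hypothetical path

automl_config = AutoMLConfig(
    task="classification",
    training_data=train_data,
    label_column_name="label",        # hypothetical column
    primary_metric="AUC_weighted",
    featurization="auto",             # automatic feature engineering
    spark_context=sc,                 # SparkContext provided by Databricks
    max_concurrent_iterations=4)      # parallel child runs on cluster workers

experiment = Experiment(ws, "automl-databricks-sketch")
run = experiment.submit(automl_config, show_output=True)
```

Passing `spark_context=sc` is what lets the AutoML child runs fan out over the cluster's Spark workers, which is the part of the question about dynamic worker allocation.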
ajay_1233456
Highly Voted 2 years, 9 months ago
1. Create Azure Databricks cluster 2. Install Azure ML SDK for Python 3. Create and execute Jupyter notebook using AutoML 4. Export Jupyter notebook to local environment
upvoted 13 times
...
OdaNabunaga
Most Recent 11 months, 3 weeks ago
1. Create an Azure Databricks cluster 2. Install the Azure Machine Learning SDK for Python on the cluster 3. Create and execute a Jupyter notebook by using automated machine learning (AutoML) on the cluster 4. When the cluster is ready and has processed the notebook, export your Jupyter notebook to a local environment
upvoted 3 times
larimalarima
11 months, 2 weeks ago
I think it's the most accurate.
upvoted 2 times
...
...
PI_Team
1 year, 11 months ago
Create an Azure Databricks cluster to provide a cloud environment for data scientists to build their notebooks. Install the Azure ML SDK for Python on the cluster to enable data scientists to use automatic feature engineering and model building in machine learning pipelines. Create and execute the Zeppelin notebooks on the cluster to build and train machine learning models using Spark instances with dynamic worker allocation. When the cluster is ready, export the Zeppelin notebooks to a local environment to enable version control of the notebooks locally.
upvoted 2 times
...
phdykd
2 years, 4 months ago
Here is the most accurate sequence of actions for creating the desired environment: 1. Create an Azure Databricks cluster. 2. Install Microsoft Machine Learning for Apache Spark on the cluster. 3. Create and execute Jupyter notebooks using AutoML on the cluster. 4. When the cluster is ready and has processed the notebook, export your Jupyter notebook to a local environment for version control. This sequence of actions will allow you to take advantage of the Azure Databricks platform for cloud-based data processing, and the Microsoft Machine Learning for Apache Spark library for automating feature engineering and model building in your Jupyter notebooks. Additionally, exporting the notebooks to a local environment will allow you to version control them and collaborate with other team members.
upvoted 7 times
...
shubhangi2612
2 years, 4 months ago
https://industry40.co.in/azure-hdinsight-and-azure-databricks/
upvoted 3 times
...
ning
3 years ago
Totally agree: 1. Create Azure Databricks cluster 2. Install Azure ML SDK for Python 3. Create and execute Jupyter notebook using AutoML 4. Export Jupyter notebook to local environment
upvoted 4 times
...
DingDongSingSong
3 years, 2 months ago
Reference this link: https://docs.microsoft.com/en-us/azure/machine-learning/concept-automated-ml. The answer is as DASIST noted: 1. Create Azure Databricks cluster 2. Install Azure ML SDK for Python 3. Create and execute Jupyter notebook using AutoML 4. Export Jupyter notebook to local environment. Another link that supports this rationale is: https://industry40.co.in/azure-hdinsight-and-azure-databricks/. It clearly outlines why, for Spark-based environments, Databricks is a better option than HDInsight.
upvoted 3 times
...
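The "Install Azure ML SDK for Python" step in the answers above is typically done by attaching the SDK as a library on the Databricks cluster. A minimal sketch, assuming the package extras described in the Microsoft docs linked earlier in this thread (cluster specifics are not from this question):

```shell
# Attach the Azure ML SDK with AutoML extras to the Databricks cluster.
# In the Databricks UI: Cluster -> Libraries -> Install New -> PyPI, package name:
#   azureml-sdk[automl]
# Or as a notebook-scoped install from a notebook cell:
pip install "azureml-sdk[automl]"
```

Either way, the SDK must be installed on the cluster before the AutoML notebook is executed, which is why the install step precedes the notebook step in the voted answer.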
ajayjha123
3 years, 6 months ago
Should be: Create Azure Databricks cluster -> Install Azure ML SDK for Python -> Create and execute Jupyter notebook using AutoML -> Export Jupyter notebook to local environment. That's because you need the automatic feature engineering provided by AutoML.
upvoted 3 times
...
[Removed]
3 years, 8 months ago
I am still confused about the right answer.
upvoted 2 times
...
RyanTsai
3 years, 8 months ago
agree: Create Azure Databricks cluster -> Create and exec Jupyter notebook using AutoML -> Install Azure ML SDK for Python -> Export Jupyter to local env
upvoted 4 times
...
dija123
3 years, 9 months ago
Create Azure Databricks cluster -> Create and exec Jupyter notebook using AutoML -> Install Azure ML SDK for Python -> Export Jupyter to local env
upvoted 3 times
...
tamoor
4 years, 3 months ago
You can use only Azure HDInsight because of the conditions: with Databricks you can use only Apache Spark, and you must use Hadoop.
upvoted 1 times
...
dzzz
4 years, 5 months ago
I believe Databricks is capable, but if you choose that as the first step, there are no further actions that can be chosen; they all revolve around Zeppelin, and Databricks doesn't support Zeppelin.
upvoted 3 times
Srivathsan
4 years, 4 months ago
https://docs.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect#:~:text=Databricks%20Connect%20allows%20you%20to,applications%20to%20Azure%20Databricks%20clusters. From the above link, it can be seen that Databricks can support Zeppelin.
upvoted 1 times
...
prashantjoge
4 years ago
Databricks is an analytics platform; it does not support feature engineering.
upvoted 1 times
...
...
valkyrieShadow
4 years, 6 months ago
Azure Databricks meets all the requirements; HDInsight does not. Example: automatic feature engineering is included with AutoML, and HDInsight does not include this feature. HDInsight: https://docs.microsoft.com/en-us/azure/hdinsight/ Azure Databricks: https://docs.microsoft.com/en-us/azure/databricks/applications/machine-learning/automl-hyperparam-tuning/
upvoted 3 times
HkIsCrazY
4 years, 4 months ago
No, HDInsight also provides all the AutoML and automatic feature engineering features: https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-run-machine-learning-automl
upvoted 3 times
prashantjoge
4 years ago
the given answer is correct
upvoted 3 times
...
...
...
Karen_12321
4 years, 7 months ago
Why not Jupyter notebook?
upvoted 1 times
LakeSky
4 years, 1 month ago
Maybe because Jupyter notebook doesn't provide an interpreter for Spark like Zeppelin does? https://medium.com/ankitakumar140494/a-comprehensive-comparison-between-jupyter-notebook-and-apache-zeppelin-911501981bfb
upvoted 1 times
zehraoneexam
3 years, 2 months ago
No, it supports that too.
upvoted 1 times
...
...
...
Community vote distribution: A (35%), C (25%), B (20%), Other