
Exam DP-600 topic 1 question 54 discussion

Actual exam question from Microsoft's DP-600
Question #: 54
Topic #: 1

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a Fabric tenant that contains a new semantic model in OneLake.
You use a Fabric notebook to read the data into a Spark DataFrame.
You need to evaluate the data to calculate the min, max, mean, and standard deviation values for all the string and numeric columns.
Solution: You use the following PySpark expression:
df.explain()
Does this meet the goal?

  • A. Yes
  • B. No
Suggested Answer: B

Comments

282b85d
Highly Voted 6 months, 3 weeks ago
Selected Answer: B
The df.explain() method in PySpark is used to print the logical and physical plans of a DataFrame, which helps in understanding how Spark plans to execute the query. It does not compute any statistical values like min, max, mean, or standard deviation. To achieve the goal, you should use: df.describe().show()
upvoted 9 times
SamuComqi
Highly Voted 10 months ago
Selected Answer: B
The correct syntax is df.describe(). Sources:
  • describe → https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.DataFrame.describe.html
  • explain → https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.DataFrame.explain.html
upvoted 5 times
SamuComqi
10 months ago
Also df.summary() is a valid solution. Source: https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.DataFrame.summary.html
upvoted 1 times
stilferx
Most Recent 7 months, 1 week ago
Selected Answer: B
IMHO, no. explain() only shows the execution plan.
upvoted 1 times
a_51
9 months ago
Selected Answer: B
describe is how you get the information.
upvoted 2 times
Momoanwar
10 months ago
Selected Answer: B
No. explain() is for the execution plan.
upvoted 4 times