
Exam AWS Certified Data Engineer - Associate DEA-C01 topic 1 question 241 discussion

A company uses AWS Glue Apache Spark jobs to handle extract, transform, and load (ETL) workloads. The company has enabled logging and monitoring for all AWS Glue jobs.

One of the AWS Glue jobs begins to fail. A data engineer investigates the error and wants to examine metrics for all individual stages within the job.

How can the data engineer access the stage metrics?

  • A. Examine the AWS Glue job and stage details in the Spark UI.
  • B. Examine the AWS Glue job and stage metrics in Amazon CloudWatch.
  • C. Examine the AWS Glue job and stage logs in AWS CloudTrail logs.
  • D. Examine the AWS Glue job and stage details by using the run insights feature on the job.
Suggested Answer: A

Comments

rdiaz
1 month, 2 weeks ago
Selected Answer: A
AWS Glue uses Apache Spark under the hood for distributed data processing. When you run a Spark-based Glue ETL job, AWS Glue provides a Spark UI where you can inspect:
  • Stages
  • Tasks
  • Execution time
  • Shuffle read/write
  • Input/output records
This is the most detailed and appropriate way to examine metrics at the stage level.
upvoted 1 times
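As context for answer A: the Spark UI is only available if the job emits Spark event logs. A minimal sketch of the documented Glue special parameters that turn this on is below; the S3 bucket path and job name are placeholders, not values from the question.

```python
# Sketch: special job parameters that enable the Spark UI for an AWS Glue job.
# "--enable-spark-ui" and "--spark-event-logs-path" are documented Glue flags;
# the S3 path here is a placeholder.

spark_ui_args = {
    "--enable-spark-ui": "true",  # emit Spark event logs consumable by the Spark UI
    "--spark-event-logs-path": "s3://example-bucket/spark-event-logs/",  # placeholder
}

# These would typically be merged into the job's DefaultArguments, e.g.
# (commented out so this sketch runs without AWS credentials):
#
# import boto3
# glue = boto3.client("glue")
# job = glue.get_job(JobName="my-etl-job")["Job"]  # hypothetical job name
# args = {**job.get("DefaultArguments", {}), **spark_ui_args}
# glue.update_job(
#     JobName="my-etl-job",
#     JobUpdate={"Role": job["Role"], "Command": job["Command"],
#                "DefaultArguments": args},
# )

print(spark_ui_args["--enable-spark-ui"])
```

Once event logs are flowing to the configured S3 path, the per-stage metrics (duration, shuffle read/write, task counts) appear in the Spark UI for each job run.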
Community vote distribution: A (35%), C (25%), B (20%), Other