Exam AWS Certified Machine Learning - Specialty topic 1 question 202 discussion

A company's machine learning (ML) specialist is designing a scalable data storage solution for Amazon SageMaker. The company has an existing TensorFlow-based model that uses a train.py script. The model relies on static training data that is currently stored in TFRecord format.

What should the ML specialist do to provide the training data to SageMaker with the LEAST development overhead?

  • A. Put the TFRecord data into an Amazon S3 bucket. Use AWS Glue or AWS Lambda to reformat the data to protobuf format and store the data in a second S3 bucket. Point the SageMaker training invocation to the second S3 bucket.
  • B. Rewrite the train.py script to add a section that converts TFRecord data to protobuf format. Point the SageMaker training invocation to the local path of the data. Ingest the protobuf data instead of the TFRecord data.
  • C. Use SageMaker script mode, and use train.py unchanged. Point the SageMaker training invocation to the local path of the data without reformatting the training data.
  • D. Use SageMaker script mode, and use train.py unchanged. Put the TFRecord data into an Amazon S3 bucket. Point the SageMaker training invocation to the S3 bucket without reformatting the training data.
Suggested Answer: D
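
For context, a minimal sketch of what option D looks like with the SageMaker Python SDK; the bucket name, IAM role, instance type, and framework version below are illustrative assumptions, not values given in the question:

    from sagemaker.tensorflow import TensorFlow

    estimator = TensorFlow(
        entry_point="train.py",  # the existing script, used unchanged
        role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role
        instance_count=1,
        instance_type="ml.m5.xlarge",  # illustrative instance type
        framework_version="2.11",
        py_version="py39",
    )

    # Point the training job at the S3 prefix holding the TFRecord files;
    # the data itself is not reformatted.
    estimator.fit({"training": "s3://example-bucket/tfrecords/"})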

Comments

VinceCar
Highly Voted 1 year, 5 months ago
Selected Answer: D
Should be D. TFRecord files can be uploaded to S3 directly and used as SageMaker's data source. https://sagemaker-examples.readthedocs.io/en/latest/sagemaker_batch_transform/working_with_tfrecords/working-with-tfrecords.html#Upload-dataset-to-S3
upvoted 5 times
BTRYING
1 year, 3 months ago
Then why not C?
upvoted 1 times
drcok87
1 year, 2 months ago
Then how is a local path a "scalable data storage solution"? The answer is D.
upvoted 2 times
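
As VinceCar's linked example shows, the TFRecord files only need to land in S3. A minimal upload sketch with the SageMaker Python SDK, assuming a local data/ directory and the session's default bucket:

    import sagemaker

    session = sagemaker.Session()
    # Uploads every file under data/ to the session's default bucket beneath
    # the given key prefix and returns the resulting S3 URI.
    s3_uri = session.upload_data(path="data", key_prefix="tfrecords")
    print(s3_uri)  # e.g. s3://sagemaker-<region>-<account-id>/tfrecords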
Amit11011996
Highly Voted 1 year, 5 months ago
Selected Answer: D
It has to be option D.
upvoted 5 times
loict
Most Recent 8 months ago
Selected Answer: D
A. NO - SageMaker can use TFRecords as-is in S3.
B. NO - SageMaker can use TFRecords as-is in S3.
C. NO - SageMaker must use S3 as input; it cannot read your local data.
D. YES - SageMaker can use TFRecords as-is in S3.
upvoted 2 times
Mickey321
8 months, 4 weeks ago
Selected Answer: D
SageMaker script mode allows you to use your existing TensorFlow training scripts without any modifications. You can keep the same TFRecord data format that your model expects, and point the SageMaker training invocation to the S3 bucket where the data is stored. SageMaker will automatically download the data to a local path on the training instance and make that path available to your train.py script. You don't need to reformat the data to protobuf format or rewrite your script to convert the data.
upvoted 3 times
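
A minimal sketch of the script side of the behavior Mickey321 describes, assuming a channel named "training" and TFRecord files with a .tfrecord extension; SageMaker exposes the downloaded channel directory to the script through an SM_CHANNEL_* environment variable:

    import os
    import tensorflow as tf

    # SageMaker downloads the "training" channel from S3 to a local directory
    # on the training instance and publishes that path in an environment
    # variable before it invokes train.py.
    data_dir = os.environ.get("SM_CHANNEL_TRAINING", "/opt/ml/input/data/training")

    # The TFRecord files are consumed exactly as they were stored in S3.
    files = tf.io.gfile.glob(os.path.join(data_dir, "*.tfrecord"))
    dataset = tf.data.TFRecordDataset(files)

This is why train.py needs no changes: the script keeps reading TFRecords from a directory, and only the directory's contents now come from S3.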
Mickey321
9 months, 2 weeks ago
Selected Answer: D
This option allows the ML specialist to use the existing train.py script and TFRecord data without any changes, minimizing development overhead. By using SageMaker script mode, the specialist can run the existing TensorFlow script as-is, and by pointing the SageMaker training invocation to the S3 bucket containing the TFRecord data, the specialist can provide the training data to SageMaker without reformatting it.
upvoted 1 times
AjoseO
1 year, 2 months ago
Selected Answer: D
This option leverages SageMaker's built-in support for the TensorFlow framework and script mode. The existing train.py script can be used without any modifications. SageMaker will automatically download the training data from the specified S3 location to the instance running the training job. This option saves development time by avoiding the need to rewrite the train.py script or reformat the training data.
upvoted 3 times
Community vote distribution: A (35%), C (25%), B (20%), Other
