Exam AWS Certified AI Practitioner AIF-C01 topic 1 question 179 discussion

A company has created a custom model by fine-tuning an existing large language model (LLM) from Amazon Bedrock. The company wants to deploy the model to production and use the model to handle a steady rate of requests each minute.

Which solution meets these requirements MOST cost-effectively?

  • A. Deploy the model by using an Amazon EC2 compute optimized instance.
  • B. Use the model with on-demand throughput on Amazon Bedrock.
  • C. Store the model in Amazon S3 and host the model by using AWS Lambda.
  • D. Purchase Provisioned Throughput for the model on Amazon Bedrock.
Suggested Answer: D

Comments

a6558c7
2 weeks, 3 days ago
Selected Answer: D
D. For a custom (fine-tuned) model on Amazon Bedrock, deploying to production requires purchasing Provisioned Throughput; on-demand mode is not available for most custom models. Provisioned Throughput reserves dedicated model units and provides guaranteed, predictable capacity for a steady (not bursty) workload, with discounted pricing compared to pay-as-you-go options. Option B (on-demand throughput) is better suited to unpredictable or low-volume workloads: it charges per token and is generally unavailable for custom fine-tuned models, which typically require Provisioned Throughput for production deployment (see the sketch below this comment).
upvoted 2 times
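For context, here is a minimal sketch of how Provisioned Throughput could be purchased for a custom model and then invoked with boto3. The region, model ARN, provisioned model name, model unit count, one-month commitment, and request body are illustrative assumptions, not details from the question.

```python
import boto3

# Assumed region and placeholder identifiers for illustration only.
bedrock = boto3.client("bedrock", region_name="us-east-1")
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Purchase Provisioned Throughput for the fine-tuned (custom) model.
# commitmentDuration is optional; omitting it gives no-commitment hourly pricing.
response = bedrock.create_provisioned_model_throughput(
    provisionedModelName="my-custom-model-pt",  # placeholder name
    modelId="arn:aws:bedrock:us-east-1:111122223333:custom-model/...",  # placeholder ARN
    modelUnits=1,
    commitmentDuration="OneMonth",
)
provisioned_model_arn = response["provisionedModelArn"]

# Invoke the custom model through its Provisioned Throughput ARN.
# The request body format depends on the base model the custom model was tuned from.
result = runtime.invoke_model(
    modelId=provisioned_model_arn,
    body='{"inputText": "Hello"}',
    contentType="application/json",
)
print(result["body"].read())
```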
Community vote distribution: A (35%), C (25%), B (20%), Other