Exam AWS Certified Database - Specialty topic 1 question 76 discussion

Question #: 76
Topic #: 1

A large ecommerce company uses Amazon DynamoDB to handle the transactions on its web portal. Traffic patterns throughout the year are usually stable; however, a large event is planned. The company knows that traffic will increase by up to 10 times the normal load over the 3-day event. When sale prices are published during the event, traffic will spike rapidly.
How should a Database Specialist ensure DynamoDB can handle the increased traffic?

  • A. Ensure the table is always provisioned to meet peak needs
  • B. Allow burst capacity to handle the additional load
  • C. Set an AWS Application Auto Scaling policy for the table to handle the increase in traffic
  • D. Preprovision additional capacity for the known peaks and then reduce the capacity after the event
Suggested Answer: D

Comments

sachin
Highly Voted 3 years ago
C is correct. D is not correct because in DynamoDB, when you scale up capacity, the number of partitions increases according to your RCU and WCU, but when you scale down, the partitions remain unchanged, so the per-partition RCU and WCU will give poor performance. I think Auto Scaling is the correct approach in such a situation.
upvoted 22 times
minhntm
3 years ago
Correct. I'm surprised that no one talks about this. Once you add more capacity, it's really hard to reduce it.
upvoted 7 times
...
Mintwater
2 years, 2 months ago
From https://docs.aws.amazon.com/autoscaling/application/userguide/application-auto-scaling-tutorial.html: "After completing this tutorial, you'll know how to: Use scheduled scaling to add extra capacity to meet a heavy load before it arrives, and then remove the extra capacity when it's no longer required. Use a target tracking scaling policy to scale your application based on current resource utilization." Vote for C.
upvoted 2 times
leotoras
2 years ago
This document covers EC2 auto scaling, not DynamoDB scaling.
upvoted 1 times
kerl
1 year, 11 months ago
See https://docs.aws.amazon.com/autoscaling/application/userguide/what-is-application-auto-scaling.html and https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/AutoScaling.html: "With Application Auto Scaling, you create a scaling policy for a table or a global secondary index. The scaling policy specifies whether you want to scale read capacity or write capacity (or both), and the minimum and maximum provisioned capacity unit settings for the table or index." Answer: C
upvoted 1 times
...
...
...
...
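The Application Auto Scaling setup quoted above (a scalable target plus a target-tracking policy on a table) can be sketched as follows. This is a minimal illustration, not the exam's official answer; the table name "Transactions" and all capacity numbers are hypothetical, and the functions only build the request parameters so the boto3 calls themselves stay optional.

```python
# Sketch of DynamoDB auto scaling via Application Auto Scaling (boto3).
# Table name "Transactions" and capacity limits are hypothetical examples.

def scalable_target_params(table_name, min_cap, max_cap):
    """Parameters for application-autoscaling register_scalable_target."""
    return {
        "ServiceNamespace": "dynamodb",
        "ResourceId": f"table/{table_name}",
        "ScalableDimension": "dynamodb:table:WriteCapacityUnits",
        "MinCapacity": min_cap,
        "MaxCapacity": max_cap,
    }

def target_tracking_policy_params(table_name, target_utilization=70.0):
    """Parameters for put_scaling_policy with a target-tracking policy."""
    return {
        "PolicyName": f"{table_name}-write-scaling",
        "ServiceNamespace": "dynamodb",
        "ResourceId": f"table/{table_name}",
        "ScalableDimension": "dynamodb:table:WriteCapacityUnits",
        "PolicyType": "TargetTrackingScaling",
        "TargetTrackingScalingPolicyConfiguration": {
            "TargetValue": target_utilization,
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "DynamoDBWriteCapacityUtilization"
            },
        },
    }

# With AWS credentials configured, these would be applied with:
#   client = boto3.client("application-autoscaling")
#   client.register_scalable_target(**scalable_target_params("Transactions", 500, 5000))
#   client.put_scaling_policy(**target_tracking_policy_params("Transactions"))
```

A matching read-capacity target and policy would use `dynamodb:table:ReadCapacityUnits` and `DynamoDBReadCapacityUtilization`.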
BillyMadison
Highly Voted 3 years, 9 months ago
I'm going with D because we know about the increased traffic in advance, since it will be due to a sale. Burst capacity is fine for unknown spikes of up to 5 minutes; this event lasts 3 days. https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/bp-partition-key-design.html#bp-partition-key-throughput-bursting "DynamoDB provides some flexibility in your per-partition throughput provisioning by providing burst capacity. Whenever you're not fully using a partition's throughput, DynamoDB reserves a portion of that unused capacity for later bursts of throughput to handle usage spikes. DynamoDB currently retains up to 5 minutes (300 seconds) of unused read and write capacity. During an occasional burst of read or write activity, these extra capacity units can be consumed quickly—even faster than the per-second provisioned throughput capacity that you've defined for your table. DynamoDB can also consume burst capacity for background maintenance and other tasks without prior notice. Note that these burst capacity details might change in the future."
upvoted 15 times
...
MultiAZ
Most Recent 1 year, 6 months ago
Selected Answer: D
The answer is D, as we know about the event beforehand. Furthermore, C will have an issue reaching 10x capacity quickly enough because of the cooldown.
upvoted 1 times
...
sonu6252
1 year, 6 months ago
D. https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.ReadWriteCapacityMode.html#HowItWorks.ProvisionedThroughput.Manual
upvoted 1 times
...
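Option D, the manual pre-provisioning the linked docs describe, boils down to an `UpdateTable` call before the event and another one afterwards. A minimal sketch, with hypothetical table name and capacity figures; the function only builds the request parameters:

```python
# Option D sketch: manually raise provisioned throughput before the known
# event and lower it afterwards. "Transactions" and the numbers are hypothetical.

def provisioned_throughput_update(table_name, rcu, wcu):
    """Parameters for a dynamodb update_table call changing provisioned capacity."""
    return {
        "TableName": table_name,
        "ProvisionedThroughput": {
            "ReadCapacityUnits": rcu,
            "WriteCapacityUnits": wcu,
        },
    }

# Before the event: 10x an assumed normal load of 500 RCU / 500 WCU.
pre_event = provisioned_throughput_update("Transactions", 5000, 5000)
# After the event: scale back down (note that DynamoDB limits how often
# you can decrease a table's provisioned capacity per day).
post_event = provisioned_throughput_update("Transactions", 500, 500)

# With boto3 this would be applied as:
#   boto3.client("dynamodb").update_table(**pre_event)
```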
rrshah83
1 year, 6 months ago
Selected Answer: C
scheduled auto-scaling
upvoted 1 times
...
Santix
1 year, 8 months ago
D is correct, because you can pre-warm capacity: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/HowItWorks.ReadWriteCapacityMode.html#HowItWorks.OnDemand
upvoted 1 times
...
Germaneli
1 year, 9 months ago
Selected Answer: C
Scheduled scaling, one form of Application Auto Scaling, is available for DynamoDB tables and global secondary indexes. It allows you to "scale a resource one time only or on a recurring schedule". I understand that this is what we need for the one-time event, and it's even automated (option D is not automated). https://docs.aws.amazon.com/autoscaling/application/userguide/application-auto-scaling-scheduled-scaling.html
upvoted 1 times
...
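Scheduled scaling of the kind described in that link works through one-time `put_scheduled_action` calls that raise the scalable target's floor before the event and lower it afterwards. A hedged sketch; the table name, action names, dates, and capacities are all hypothetical, and only the request parameters are built:

```python
# Scheduled-scaling sketch: one-time Application Auto Scaling actions around
# a known event. All names, dates, and capacities are hypothetical examples.

def scheduled_action_params(table_name, action_name, when_iso, min_cap, max_cap):
    """Parameters for application-autoscaling put_scheduled_action."""
    return {
        "ServiceNamespace": "dynamodb",
        "ScheduledActionName": action_name,
        "ResourceId": f"table/{table_name}",
        "ScalableDimension": "dynamodb:table:WriteCapacityUnits",
        "Schedule": f"at({when_iso})",  # one-time action; cron(...) also supported
        "ScalableTargetAction": {"MinCapacity": min_cap, "MaxCapacity": max_cap},
    }

scale_up = scheduled_action_params(
    "Transactions", "pre-event-scale-up", "2024-11-20T00:00:00", 5000, 10000)
scale_down = scheduled_action_params(
    "Transactions", "post-event-scale-down", "2024-11-23T00:00:00", 500, 5000)

# Applied with:
#   boto3.client("application-autoscaling").put_scheduled_action(**scale_up)
```

Raising `MinCapacity` ahead of time sidesteps the cooldown concern raised elsewhere in this thread, since the table is already provisioned high when the spike hits.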
orlvas
1 year, 10 months ago
D. In summary, auto scaling requires consecutive data points where the target utilization value is breached before it scales up a DynamoDB table. For this reason, auto scaling is not recommended as a solution for spiky workloads.
upvoted 1 times
...
IhorK
1 year, 11 months ago
Selected Answer: C
Amazon DynamoDB auto scaling uses the AWS Application Auto Scaling service to dynamically adjust provisioned throughput capacity on your behalf, in response to actual traffic patterns. https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/AutoScaling.html
upvoted 1 times
...
Paulv82003
2 years, 1 month ago
Has everyone forgotten about cooldowns in AWS Application Auto Scaling policies? We know we need to increase capacity 10x; with cooldowns, it will take time to get there.
upvoted 2 times
sguinales
1 year, 8 months ago
Agreed, and the question says "traffic will spike rapidly". Auto scaling here would perform poorly compared to pre-provisioning, because you know when the spike is coming and can be prepared.
upvoted 1 times
...
...
aviathor
2 years, 1 month ago
Selected Answer: D
A. This is achieved by D, but D is more precise. B. Does DynamoDB support burst capacity? C. The question is not about the application, but about DynamoDB. D. Using provisioned capacity to meet the expected demand is one way of doing it. Using provisioned capacity with auto scaling would also work, and of course on-demand would be an option.
upvoted 1 times
...
guau
2 years, 5 months ago
Selected Answer: C
C. I will go with auto scaling. Why change the configuration twice when auto scaling is designed for exactly that?
upvoted 1 times
leotoras
2 years ago
Because the usage will spike rapidly; if you have pre-provisioned, you don't waste time scaling.
upvoted 1 times
...
...
im_not_robot
2 years, 5 months ago
D is wrong because on-demand scales to a maximum of 2 times the previous peak; it cannot scale to 10x. https://aws.amazon.com/premiumsupport/knowledge-center/on-demand-table-throttling-dynamodb/
upvoted 1 times
...
renfdo
2 years, 5 months ago
Selected Answer: C
C is correct. AWS always recommends using auto scaling when you can predict usage. I know that I expect 10x more traffic.
upvoted 1 times
...
lollyj
2 years, 6 months ago
Selected Answer: D
I'm going with D because auto scaling on the application doesn't mean the DB can accommodate the increased read/write load. Since the peak traffic is predictable, it may be best to pre-provision ahead and reduce capacity after the sale is over. I may be wrong, though.
upvoted 1 times
...
awsjjj
2 years, 8 months ago
Selected Answer: D
I go with D, although if answer D had also included auto scaling, it would have been easier to choose.
upvoted 1 times
...
SachinGoel
2 years, 9 months ago
Selected Answer: C
C. Set an AWS Application Auto Scaling policy for the table to handle the increase in traffic
upvoted 1 times
...