AWS Certified Solutions Architect - Professional, Topic 1, Question 59: Discussion

An international company has deployed a multi-tier web application that relies on DynamoDB in a single region. For regulatory reasons, they need disaster recovery capability in a separate region with a Recovery Time Objective (RTO) of 2 hours and a Recovery Point Objective (RPO) of 24 hours. They must synchronize their data on a regular basis and be able to provision the web application rapidly using CloudFormation.
The objective is to minimize changes to the existing web application, control the throughput of DynamoDB used for the synchronization of data, and synchronize only the modified elements.
Which design would you choose to meet these requirements?

  • A. Use AWS Data Pipeline to schedule a DynamoDB cross-region copy once a day; create a "Lastupdated" attribute in your DynamoDB table that represents the timestamp of the last update, and use it as a filter.
  • B. Use EMR and write a custom script to retrieve data from DynamoDB in the current region using a SCAN operation and push it to DynamoDB in the second region.
  • C. Use AWS Data Pipeline to schedule an export of the DynamoDB table to S3 in the current region once a day, then schedule another task immediately after it that imports the data from S3 to DynamoDB in the other region.
  • D. Also send each write into an SQS queue in the second region; use an Auto Scaling group behind the SQS queue to replay the writes in the second region.
Suggested Answer: A
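Option A hinges on every write carrying a "Lastupdated" timestamp that the daily cross-region copy can filter on. Below is a minimal sketch of what that write-path change might look like; the table name, key, and item contents are illustrative assumptions, not part of the question.

```python
# Hypothetical sketch: stamp a "Lastupdated" attribute on every write so a daily
# incremental copy job can skip unchanged items. Table and attribute names are
# illustrative; they are not given in the original question.
import time
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("WebAppData")  # hypothetical table name

def put_item_with_timestamp(item: dict) -> None:
    """Write an item, adding the current epoch time as Lastupdated."""
    item["Lastupdated"] = int(time.time())
    table.put_item(Item=item)

put_item_with_timestamp({"pk": "user#123", "profile": "example"})
```

If the write path is centralized in a data-access layer, this is the only application change required, which is consistent with the requirement to minimize changes to the existing web application.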

Comments

student22
7 months, 3 weeks ago
Selected Answer: A
A is the only solution that supports the requirement "synchronize only the modified elements".
upvoted 1 times
...
amministrazione
10 months, 2 weeks ago
A. Use AWS Data Pipeline to schedule a DynamoDB cross-region copy once a day; create a "Lastupdated" attribute in your DynamoDB table that represents the timestamp of the last update, and use it as a filter.
upvoted 1 times
...
tototo12
1 year, 5 months ago
Selected Answer: A
answer A
upvoted 1 times
...
TravelKo
1 year, 11 months ago
Selected Answer: A
I will go with option A. We don't know the number of changes per day from the question; it could be zero or millions. Why export the whole table?
upvoted 1 times
...
SkyZeroZx
2 years ago
Selected Answer: C
The best design for this scenario is C. This design meets all of the requirements:
• It uses AWS Data Pipeline to schedule an export of the DynamoDB table to S3 in the current region once a day, so the data is synchronized on a regular basis.
• It schedules another task immediately after it that imports the data from S3 to DynamoDB in the other region, so the data is synchronized in both regions.
• It uses a LastUpdated attribute in the DynamoDB table to represent the timestamp of the last update, which lets the Data Pipeline export only the modified elements.
• It uses S3 as a staging area, which helps control the throughput of DynamoDB and prevents the table from becoming overloaded.
upvoted 1 times
SkyZeroZx
2 years ago
This design would allow the company to meet their disaster recovery requirements while minimizing changes to the existing web application. The data would be synchronized on a regular basis, and only the modified elements would be exported. The throughput of DynamoDB would be controlled, and the data could be replayed even if the auto-scaling group is scaled down. Here are some additional considerations for this design:
• The frequency of the data synchronization should be based on the regulatory requirements.
• The size of the DynamoDB table should be considered when determining the capacity of the S3 bucket.
• The Data Pipeline job should be configured to retry if it fails.
upvoted 1 times
...
SkyZeroZx
2 years ago
Both ChatGPT and Bard agree that C seems to be the most viable option, as explained above.
upvoted 1 times
...
SkyZeroZx
2 years ago
The other designs are not as suitable for this scenario. For example, design A only synchronizes the data once a day, which is not enough for regulatory reasons, and design B uses EMR, which is more complex and expensive than Data Pipeline. Therefore, the best design for this scenario is C. Here are some additional details about the design:
• The Data Pipeline job would use the LastUpdated attribute to filter the data that is exported, ensuring that only the modified elements are exported.
• The S3 bucket would be configured for high throughput, so the data can be exported without overloading the DynamoDB table.
• The Data Pipeline job would also import the data from S3 to DynamoDB in the other region, so the data is synchronized in both regions.
upvoted 1 times
...
...
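To make the "copy only items modified since the last run" idea debated above concrete, here is a hand-rolled sketch using a filtered scan. In practice the managed Data Pipeline cross-region copy (option A) or export/import (option C) would be used instead; the region names, table name, and attribute name below are assumptions, not from the question.

```python
# Illustrative only: copy items whose "Lastupdated" is newer than the previous run
# from the source-region table to a replica table in the DR region.
# All names here are hypothetical assumptions for the sketch.
import boto3
from boto3.dynamodb.conditions import Attr

SOURCE_REGION = "us-east-1"
TARGET_REGION = "eu-west-1"
TABLE_NAME = "WebAppData"  # hypothetical

def copy_modified_items(last_run_epoch: int) -> int:
    source = boto3.resource("dynamodb", region_name=SOURCE_REGION).Table(TABLE_NAME)
    target = boto3.resource("dynamodb", region_name=TARGET_REGION).Table(TABLE_NAME)
    copied = 0
    scan_kwargs = {"FilterExpression": Attr("Lastupdated").gte(last_run_epoch)}
    # batch_writer buffers puts, which helps keep write traffic on the target bounded
    with target.batch_writer() as batch:
        while True:
            page = source.scan(**scan_kwargs)
            for item in page.get("Items", []):
                batch.put_item(Item=item)
                copied += 1
            if "LastEvaluatedKey" not in page:
                break
            scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]
    return copied
```

Note that a filtered scan still reads the entire table (the filter only reduces what is returned), which is why the managed copy or an export/import pipeline scales better; the snippet is only meant to illustrate the Lastupdated-filter discussion.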
Jesuisleon
2 years, 1 month ago
Selected Answer: A
I think A.
upvoted 1 times
...
hahaaaaa
2 years, 11 months ago
I think A. Requirement: "synchronize only the updated parts." A (cross-region copy): continuously replicates changes from a DynamoDB table. C (export/import): backs up the entire contents of the DynamoDB table.
upvoted 3 times
...
bobsmith2000
3 years, 2 months ago
Selected Answer: C
It can't be A! There's no such option in the Data Pipeline templates any more, and it isn't mentioned in the documentation either. There are two options in the templates: 1) Export DynamoDB table to S3, and 2) Import DynamoDB backup data from S3. That's it! So the only thing left is C: import and export.
upvoted 1 times
wassb
2 years, 8 months ago
I believe that won't satisfy the RPO of 24 hours.
upvoted 2 times
...
...
cldy
3 years, 6 months ago
A. Use AWS Data Pipeline to schedule a DynamoDB cross-region copy once a day; create a "Lastupdated" attribute in your DynamoDB table that represents the timestamp of the last update, and use it as a filter.
upvoted 2 times
...
acloudguru
3 years, 7 months ago
Why not use a DynamoDB global table?
upvoted 1 times
tiana528
3 years, 6 months ago
DynamoDB global tables are active-active; this question asks for a disaster recovery approach, which is not active-active.
upvoted 3 times
...
...
rockc
3 years, 7 months ago
Here are more details: https://aws.amazon.com/blogs/aws/copy-dynamodb-data-between-regions-using-the-aws-data-pipeline/
upvoted 1 times
...
nwk
3 years, 8 months ago
https://aws.amazon.com/about-aws/whats-new/2013/09/12/announcing-dynamodb-cross-region-copy-feature-in-aws-data-pipeline/
upvoted 1 times
...
01037
3 years, 8 months ago
Option A needs the web application to be modified to update the Lastupdated attribute, doesn't it?
upvoted 1 times
Yahowmy
3 years, 8 months ago
I believe the only modification here is to the DynamoDB table.
upvoted 2 times
...
01037
3 years, 8 months ago
I think C is the best option here.
upvoted 2 times
...
...
Malcnorth59
3 years, 9 months ago
A seems right.
upvoted 2 times
...
ppshein
3 years, 9 months ago
A is the exact answer, I believe.
upvoted 2 times
...
cldy
3 years, 9 months ago
A meets the requirements.
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other