
Exam AWS Certified Database - Specialty topic 1 question 131 discussion


A retail company manages a web application that stores data in an Amazon DynamoDB table. The company is undergoing account consolidation efforts. A database engineer needs to migrate the DynamoDB table from the current AWS account to a new AWS account.
Which strategy meets these requirements with the LEAST amount of administrative work?

  • A. Use AWS Glue to crawl the data in the DynamoDB table. Create a job using an available blueprint to export the data to Amazon S3. Import the data from the S3 file to a DynamoDB table in the new account.
  • B. Create an AWS Lambda function to scan the items of the DynamoDB table in the current account and write to a file in Amazon S3. Create another Lambda function to read the S3 file and restore the items of a DynamoDB table in the new account.
  • C. Use AWS Data Pipeline in the current account to export the data from the DynamoDB table to a file in Amazon S3. Use Data Pipeline to import the data from the S3 file to a DynamoDB table in the new account.
  • D. Configure Amazon DynamoDB Streams for the DynamoDB table in the current account. Create an AWS Lambda function to read from the stream and write to a file in Amazon S3. Create another Lambda function to read the S3 file and restore the items to a DynamoDB table in the new account.
Suggested Answer: C
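For a sense of what options B and D would have you hand-roll, here is a minimal boto3 sketch of a scan-and-stage export plus the matching restore. Table, bucket, and key names are placeholders, and throttling, large files, and error handling are ignored; the argument for the suggested answer is that Data Pipeline's prebuilt export/import templates do this plumbing for you.

```python
import json
import boto3

# Placeholders for illustration only; not names from the question.
SOURCE_TABLE = "retail-web-app"
DEST_TABLE = "retail-web-app"
BUCKET = "migration-staging-bucket"
KEY = "dynamodb-export/items.json"

def export_items(dynamodb, s3):
    """Scan every item in the source table and stage it as one JSON file in S3."""
    items, kwargs = [], {"TableName": SOURCE_TABLE}
    while True:
        page = dynamodb.scan(**kwargs)
        items.extend(page["Items"])
        if "LastEvaluatedKey" not in page:
            break
        kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]
    # Items keep their DynamoDB type descriptors, so they can be re-put as-is.
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=json.dumps(items))

def restore_items(dynamodb, s3):
    """Read the staged file and rewrite the items into the destination table."""
    body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()
    for item in json.loads(body):
        dynamodb.put_item(TableName=DEST_TABLE, Item=item)

# export_items() runs with the current account's credentials,
# restore_items() with the new account's credentials.
```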

Comments

novak18
Highly Voted 3 years, 8 months ago
I think the answer is C https://aws.amazon.com/premiumsupport/knowledge-center/dynamodb-cross-account-migration/
upvoted 11 times
DevoteamAnalytix
2 years, 11 months ago
For me it is A because it seems to be easier with Glue than Data Pipeline ("with the MINIMUM amount of administrative work") GLUE: https://aws.amazon.com/de/premiumsupport/knowledge-center/dynamodb-cross-account-migration/ DATA PIPELINE: https://aws.amazon.com/de/premiumsupport/knowledge-center/data-pipeline-account-access-dynamodb-s3/
upvoted 4 times
...
...
roymunson
Most Recent 1 year, 8 months ago
Selected Answer: C
1. Create a DynamoDB table in your source account.
2. Create an Amazon Simple Storage Service (Amazon S3) bucket in the destination account.
3. Attach an AWS Identity and Access Management (IAM) policy to the Data Pipeline default roles in the source account.
4. Create an S3 bucket policy in the destination account.
5. Create and activate a pipeline in the source account.
6. Create a DynamoDB table in the destination account.
7. Restore the DynamoDB export in the destination account.
https://aws.amazon.com/de/blogs/database/how-to-migrate-amazon-dynamodb-tables-from-one-aws-account-to-another-with-aws-data-pipeline/
upvoted 2 times
...
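The bucket-policy step in roymunson's list is the main cross-account wiring. A minimal sketch of what it might look like, run with the destination account's credentials; the account ID and bucket name are placeholders, and the role names are the Data Pipeline defaults the blog post refers to.

```python
import json
import boto3

# Placeholders: source account ID and bucket name are made up; the role names
# are Data Pipeline's default roles, as described in the blog post above.
SOURCE_ACCOUNT = "111111111111"
BUCKET = "dynamodb-migration-staging"

bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowSourceAccountDataPipelineRoles",
        "Effect": "Allow",
        "Principal": {"AWS": [
            f"arn:aws:iam::{SOURCE_ACCOUNT}:role/DataPipelineDefaultRole",
            f"arn:aws:iam::{SOURCE_ACCOUNT}:role/DataPipelineDefaultResourceRole",
        ]},
        "Action": ["s3:ListBucket", "s3:GetObject", "s3:PutObject"],
        "Resource": [
            f"arn:aws:s3:::{BUCKET}",
            f"arn:aws:s3:::{BUCKET}/*",
        ],
    }],
}

# Run with credentials for the destination account, which owns the bucket.
boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(bucket_policy))
```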
alexpl
1 year, 8 months ago
Selected Answer: C
The correct answer is C, because: https://aws.amazon.com/blogs/database/how-to-migrate-amazon-dynamodb-tables-from-one-aws-account-to-another-with-aws-data-pipeline/
upvoted 1 times
...
thuyeinaung
1 year, 8 months ago
Selected Answer: A
I think it is A
upvoted 1 times
...
Germaneli
1 year, 8 months ago
Selected Answer: A
Migrate your DynamoDB table to a different AWS account with one of these methods that suits your use case (https://repost.aws/knowledge-center/dynamodb-cross-account-migration):
- AWS Backup
- DynamoDB import and export to Amazon Simple Storage Service (Amazon S3)
- Amazon S3 and AWS Glue
- Amazon EMR
upvoted 1 times
...
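The second method on Germaneli's list, DynamoDB's native export to and import from S3, postdates the question but is worth a sketch. Everything below is a placeholder (ARNs, bucket, prefix, key schema); point-in-time recovery must be enabled on the source table, and the bucket policy must allow the cross-account access.

```python
import boto3

# Rough sketch of the "DynamoDB import and export to Amazon S3" path from the
# repost.aws article. All names are placeholders.

# In the current account: export the table to a bucket the new account can read.
boto3.client("dynamodb").export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:111111111111:table/retail-web-app",
    S3Bucket="dynamodb-migration-staging",
    ExportFormat="DYNAMODB_JSON",
)

# In the new account: create the table directly from the exported data.
boto3.client("dynamodb").import_table(
    S3BucketSource={"S3Bucket": "dynamodb-migration-staging",
                    "S3KeyPrefix": "AWSDynamoDB/EXPORT_ID/data/"},  # from the export
    InputFormat="DYNAMODB_JSON",
    InputCompressionType="GZIP",
    TableCreationParameters={
        "TableName": "retail-web-app",
        "AttributeDefinitions": [{"AttributeName": "pk", "AttributeType": "S"}],
        "KeySchema": [{"AttributeName": "pk", "KeyType": "HASH"}],
        "BillingMode": "PAY_PER_REQUEST",
    },
)
```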
Monknil
1 year, 10 months ago
C looks like the best option https://aws.amazon.com/premiumsupport/knowledge-center/dynamodb-cross-account-migration/
upvoted 1 times
...
milan9527
2 years ago
Selected Answer: A
A. Why not?
upvoted 2 times
...
clarksu
2 years, 1 month ago
Selected Answer: C
You can migrate your DynamoDB tables to a different AWS account by choosing one of the following methods, depending on your use case:
- AWS Backup
- DynamoDB import and export to Amazon Simple Storage Service (Amazon S3)
- Amazon S3 and AWS Glue
- AWS Data Pipeline
- Amazon EMR
upvoted 1 times
Paulv82003
2 years ago
Amazon S3 and AWS Glue is on your list, listed before AWS Data Pipeline, and yet you select C?
upvoted 2 times
...
...
redman50
2 years, 2 months ago
Selected Answer: D
Configure Amazon DynamoDB Streams for the DynamoDB table in the current account. Create an AWS Lambda function to read from the stream and write to a file in Amazon S3. Create another Lambda function to read the S3 file and restore the items to a DynamoDB table in the new account. This approach leverages DynamoDB Streams, which captures item-level modifications to a table, including creates, updates, and deletes. The DynamoDB Streams data can be used to replicate data in near real-time across different AWS accounts. The approach allows for minimal administrative work as it only requires the creation of two Lambda functions to read and write to S3 and DynamoDB tables, respectively. This approach also ensures that data consistency is maintained during the migration process.
upvoted 1 times
...
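To make the stream-consumer part of option D concrete, here is a rough Lambda handler sketch, simplified to write directly to the table in the new account rather than staging through S3 as the option describes; the table name and cross-account credential handling are placeholders, and the stream view type must include new images.

```python
import boto3

# Sketch of the stream-consumer piece of option D. The client would need
# cross-account credentials (e.g. an assumed role) for the new account.
destination = boto3.client("dynamodb")
DEST_TABLE = "retail-web-app"

def handler(event, context):
    for record in event["Records"]:
        if record["eventName"] in ("INSERT", "MODIFY"):
            # Stream images are already DynamoDB-typed JSON, so they can be
            # written to the destination table unchanged.
            destination.put_item(TableName=DEST_TABLE,
                                 Item=record["dynamodb"]["NewImage"])
        elif record["eventName"] == "REMOVE":
            destination.delete_item(TableName=DEST_TABLE,
                                    Key=record["dynamodb"]["Keys"])
```

Note that a stream only captures changes made after it is enabled, so the existing items would still need a separate backfill, which is part of why this approach is more work than the managed export options.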
sk1974
2 years, 3 months ago
https://aws.amazon.com/premiumsupport/knowledge-center/dynamodb-cross-account-migration/. I initially thought the answer was 'A' since that option mentions a blueprint, but I went for C based on the link pasted above. Scroll to the 'Data Pipeline' section there.
upvoted 1 times
...
jjyy80
2 years, 5 months ago
Data Pipeline is the correct one: "Note: The destination account can't access the DynamoDB data in S3 bucket. To work with the data, restore it to a DynamoDB table. Data Pipeline provides the easiest method to move the table with the least manual effort. However, there are fewer options for customization."
upvoted 2 times
...
novice_expert
3 years, 1 month ago
Selected Answer: C
https://aws.amazon.com/premiumsupport/knowledge-center/dynamodb-cross-account-migration/ - export the DynamoDB table to S3 in the other account, then use a Glue job (or Data Pipeline, or EMR) to import the data. C: Use AWS Data Pipeline in the current account to export the data from the DynamoDB table to a file in Amazon S3. Use Data Pipeline to import the data from the S3 file to a DynamoDB table in the new account.
upvoted 2 times
...
johnconnor
3 years, 6 months ago
Why not A? Glue being serverless, wouldn't it be easier to do it this way?
upvoted 2 times
...
ChauPhan
3 years, 7 months ago
Agree with C, LEAST amount of work.
upvoted 2 times
...
AM
3 years, 7 months ago
I agree with C, and also that the question is a bit ambiguous.
upvoted 1 times
...
Aesthet
3 years, 7 months ago
C https://aws.amazon.com/premiumsupport/knowledge-center/data-pipeline-account-access-dynamodb-s3/
upvoted 4 times
...
manan728
3 years, 7 months ago
A seems to be the answer. https://docs.aws.amazon.com/glue/latest/dg/aws-glue-programming-etl-dynamo-db-cross-account.html
upvoted 2 times
...
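The page manan728 links covers reading a DynamoDB table across accounts from a Glue ETL job by assuming a role in the other account. A rough PySpark sketch under that assumption; the role ARN and table names are invented, and the connection options are only what that page describes.

```python
from awsglue.context import GlueContext
from pyspark.context import SparkContext

glue_context = GlueContext(SparkContext.getOrCreate())

# Read the table that lives in the other account by assuming a role there;
# the role ARN and table names are placeholders.
source = glue_context.create_dynamic_frame.from_options(
    connection_type="dynamodb",
    connection_options={
        "dynamodb.input.tableName": "retail-web-app",
        "dynamodb.sts.roleArn": "arn:aws:iam::111111111111:role/CrossAccountDynamoDBRole",
    },
)

# Write the items into the table in the account where the Glue job runs.
glue_context.write_dynamic_frame_from_options(
    frame=source,
    connection_type="dynamodb",
    connection_options={"dynamodb.output.tableName": "retail-web-app"},
)
```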
Community vote distribution: A (35%), C (25%), B (20%), Other