Exam AWS Certified Data Analytics - Specialty topic 1 question 41 discussion

A mortgage company has a microservice for accepting payments. This microservice uses the Amazon DynamoDB encryption client with AWS KMS managed keys to encrypt the sensitive data before writing the data to DynamoDB. The finance team should be able to load this data into Amazon Redshift and aggregate the values within the sensitive fields. The Amazon Redshift cluster is shared with other data analysts from different business units.
Which steps should a data analyst take to accomplish this task efficiently and securely?

  • A. Create an AWS Lambda function to process the DynamoDB stream. Decrypt the sensitive data using the same KMS key. Save the output to a restricted S3 bucket for the finance team. Create a finance table in Amazon Redshift that is accessible to the finance team only. Use the COPY command to load the data from Amazon S3 to the finance table.
  • B. Create an AWS Lambda function to process the DynamoDB stream. Save the output to a restricted S3 bucket for the finance team. Create a finance table in Amazon Redshift that is accessible to the finance team only. Use the COPY command with the IAM role that has access to the KMS key to load the data from S3 to the finance table.
  • C. Create an Amazon EMR cluster with an EMR_EC2_DefaultRole role that has access to the KMS key. Create Apache Hive tables that reference the data stored in DynamoDB and the finance table in Amazon Redshift. In Hive, select the data from DynamoDB and then insert the output to the finance table in Amazon Redshift.
  • D. Create an Amazon EMR cluster. Create Apache Hive tables that reference the data stored in DynamoDB. Insert the output to the restricted Amazon S3 bucket for the finance team. Use the COPY command with the IAM role that has access to the KMS key to load the data from Amazon S3 to the finance table in Amazon Redshift.
Suggested Answer: B

Comments

awssp12345
Highly Voted 3 years, 8 months ago
Answer is B. C and D are eliminated because EMR is not needed to process DynamoDB streams; a Lambda function is good enough. Option A is wrong because it suggests decrypting the data and storing it in S3 in plaintext, which is not good since it contains sensitive fields. Option B is correct because Redshift will only decrypt the data while reading it.
upvoted 53 times
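To make the Lambda-over-streams idea above concrete, here is a minimal sketch of the record-flattening step such a function would perform: turning DynamoDB stream NEW_IMAGE records into CSV rows destined for the finance team's restricted S3 bucket. The field names (`payment_id`, `amount`) and bucket name are hypothetical, and the actual S3 upload is only indicated in a comment; this is not the question's official solution, just an illustration of the mechanics.

```python
import csv
import io

def stream_records_to_csv(records):
    """Flatten DynamoDB stream NEW_IMAGE records into a CSV string.

    Stream attribute values are typed maps like {"S": "abc"} or
    {"N": "42"}; we keep the raw stored value for each attribute.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    for rec in records:
        image = rec["dynamodb"]["NewImage"]
        row = [next(iter(v.values())) for v in image.values()]
        writer.writerow(row)
    return buf.getvalue()

# In a real Lambda handler the result would then be uploaded to the
# restricted bucket, e.g. with boto3 (bucket/key names hypothetical):
#   s3.put_object(Bucket="finance-restricted-bucket",
#                 Key="payments/batch.csv",
#                 Body=stream_records_to_csv(event["Records"]))
```

Whether the sensitive fields are decrypted before this step (option A) or left for Redshift-side handling (option B) is exactly the point the thread below argues about.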
freaky
3 years, 7 months ago
But why do we need to create DynamoDB streams? Streams are mentioned only in the answer. Also, a stream captures only new data; what about the data that is already present? One of the requirements is that the finance team should be able to aggregate the sensitive fields, but if they don't have all the data in Redshift, how will aggregation give a correct result?
upvoted 3 times
blackgamer
1 year, 4 months ago
B is wrong because the data is encrypted before it is loaded into DynamoDB, which implies client-side encryption, and Redshift doesn't support client-side encryption: https://docs.aws.amazon.com/redshift/latest/dg/c_loading-encrypted-files.html
upvoted 1 times
...
...
...
JD78780
Highly Voted 3 years, 7 months ago
Correct: A. C and D can be eliminated because this is a shared Redshift cluster, so you need to create a table accessible only to the finance team. B is wrong because the application uses DynamoDB client-side encryption (not S3 client-side encryption), which means AWS will not automatically decrypt it; the data needs manual decryption before being sent to S3 and then COPY'd into Redshift. Even if you want to use COPY ENCRYPTED to copy client-side encrypted S3 files, you need to specify credentials, not IAM roles. However, a DynamoDB stream only captures new data, so existing data won't be processed; this is not a perfect answer. https://docs.aws.amazon.com/redshift/latest/dg/c_loading-encrypted-files.html
upvoted 13 times
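The COPY step debated in this thread can be sketched as a small statement builder. Table, S3 path, and role ARN below are hypothetical placeholders; and per the docs quoted in these comments, this pattern only works when the S3 objects are plaintext or server-side encrypted, since COPY does not decrypt files that were client-side encrypted with a KMS key.

```python
def build_copy_statement(table, s3_path, iam_role_arn):
    """Build a Redshift COPY command that authorizes via an IAM role."""
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role_arn}' "
        f"FORMAT AS CSV;"
    )

# Example (all identifiers hypothetical):
#   build_copy_statement(
#       "finance.payments",
#       "s3://finance-restricted-bucket/payments/",
#       "arn:aws:iam::123456789012:role/finance-copy-role")
```

The IAM role named in `IAM_ROLE` is what would need access to the KMS key for server-side-encrypted objects, which is the reading offered by option B.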
dushmantha
2 years, 11 months ago
I think this is the best explanation.
upvoted 1 times
...
ru4aws
2 years, 10 months ago
In A the issue is that, even though the S3 bucket is restricted, the data is stored unencrypted before it is loaded into Redshift.
upvoted 2 times
...
JoellaLi
2 years, 7 months ago
Actually it will be automatically decrypted by AWS; there is no need for manual decryption. "After you create and configure the required components, the DynamoDB Encryption Client transparently encrypts and signs your table items when you add them to a table, and verifies and decrypts them when you retrieve them." https://docs.aws.amazon.com/redshift/latest/dg/c_loading-encrypted-files.html
upvoted 3 times
siju13
2 years, 5 months ago
On the link you shared above: "The COPY command doesn't support the following types of Amazon S3 encryption: ... Client-side encryption using an AWS KMS key." If client-side encryption is not supported, then decryption will not work either.
upvoted 1 times
...
...
...
chinmayj213
1 year, 2 months ago
When you load an encrypted file from Amazon S3 into Redshift, the encryption involved is neither purely client-side nor server-side using AWS KMS; it's a hybrid approach. Decryption during load: when you use the COPY command in Redshift to load the data, Redshift retrieves the data key from KMS using its IAM role or credentials. This retrieval can be considered a server-side operation from Redshift's perspective.
upvoted 1 times
...
NarenKA
1 year, 2 months ago
Selected Answer: A
When a Lambda function processes the DynamoDB stream, the sensitive data encrypted with KMS keys can be decrypted securely using the same KMS key. Storing the decrypted data in a restricted S3 bucket accessible only to the finance team ensures that sensitive information is not exposed to unauthorised users. Creating a dedicated finance table in Redshift that is accessible only to the finance team ensures that the aggregated sensitive data remains confidential and is not accessible by others. Using the COPY command to load data from the restricted S3 bucket into the finance table in Redshift is efficient. B: loading encrypted data directly into Redshift and decrypting it during the COPY process is not supported by the COPY command. C, D: these involve using EMR and Apache Hive, which add complexity and operational overhead to the data processing workflow, and decryption needs to occur before the data can be processed by EMR.
upvoted 2 times
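Both A and B lean on a "restricted S3 bucket", so a sketch of that piece may help: a bucket policy that denies every principal except the finance team's role. The bucket name, role ARN, and condition key choice (`aws:PrincipalArn` with `StringNotLike`) are illustrative assumptions, not part of the question.

```python
import json

def finance_bucket_policy(bucket, finance_role_arn):
    """Build a deny-all-but-finance S3 bucket policy document."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyAllButFinance",
            # Explicit Deny for any principal whose ARN doesn't match
            # the finance role; the role ARN itself is hypothetical.
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{bucket}",
                f"arn:aws:s3:::{bucket}/*",
            ],
            "Condition": {
                "StringNotLike": {"aws:PrincipalArn": finance_role_arn}
            },
        }],
    })
```

An explicit Deny with a principal condition is used here because explicit denies override any Allow granted elsewhere in the shared account.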
rag_mat_80
1 year, 2 months ago
COPY command does decrypt - https://docs.aws.amazon.com/redshift/latest/dg/c_loading-encrypted-files.html
upvoted 1 times
...
...
Adzz
1 year, 3 months ago
Selected Answer: B
Will go for B
upvoted 1 times
...
blackgamer
1 year, 4 months ago
Selected Answer: A
B is wrong because the data is encrypted before it is loaded into DynamoDB, which implies client-side encryption, and Redshift doesn't support client-side encryption: https://docs.aws.amazon.com/redshift/latest/dg/c_loading-encrypted-files.html
upvoted 1 times
rag_mat_80
1 year, 2 months ago
the key is this i feel - "AWS KMS managed keys to encrypt the sensitive data"
upvoted 1 times
...
...
gofavad926
1 year, 7 months ago
Selected Answer: B
B. A and B are similar, but B is the more secure option.
upvoted 1 times
...
rlnd2000
1 year, 7 months ago
Selected Answer: A
B is incorrect; that option omits the step of decrypting the data before saving it. I think A is the correct option.
upvoted 1 times
...
SMALLAM
1 year, 10 months ago
Answer A. The COPY command doesn't support the following types of Amazon S3 encryption:
  • Server-side encryption with customer-provided keys (SSE-C)
  • Client-side encryption using an AWS KMS key
  • Client-side encryption using a customer-provided asymmetric root key
upvoted 3 times
...
Hisayuki
1 year, 11 months ago
Selected Answer: A
A is the answer
upvoted 1 times
...
pk349
2 years ago
A: I passed the test
upvoted 3 times
...
anjuvinayan
2 years ago
Answer is B, as Lambda can analyze the data in DynamoDB and save it to a restricted bucket which cannot be accessed without the desired permissions. Also, to copy from an S3 bucket to Redshift, an IAM role is required.
upvoted 2 times
...
akashm99101001com
2 years, 1 month ago
Selected Answer: B
https://docs.aws.amazon.com/redshift/latest/dg/c_loading-encrypted-files.html
upvoted 2 times
...
Arjun777
2 years, 3 months ago
Option B is not the best solution because it does not address the need to decrypt the sensitive data before loading it into Amazon Redshift. The finance team needs to be able to aggregate the values within the sensitive fields, which would not be possible if the data is not decrypted before loading it into Redshift. Option A solves this problem by creating a Lambda function that processes the DynamoDB stream and decrypts the sensitive data using the same KMS key used for encryption before loading it into Redshift. The data is also saved to a restricted S3 bucket to ensure that only the finance team has access to it.
upvoted 3 times
...
ota123
2 years, 4 months ago
Selected Answer: B
The decrypting is automatically done by AWS as the data is moved from DynamoDB to S3. Before it's written to S3, it's re-encrypted using SSE. Writing to S3 and leaving it decrypted (as option A suggests) would not be a secure move. Hence B should be the right answer.
upvoted 4 times
...
henom
2 years, 5 months ago
Ans- B
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other