Exam AWS Certified Data Analytics - Specialty topic 1 question 157 discussion

A social media company is using business intelligence tools to analyze its data for forecasting. The company is using Apache Kafka to ingest the low-velocity data in near-real time. The company wants to build dynamic dashboards with machine learning (ML) insights to forecast key business trends. The dashboards must provide hourly updates from data in Amazon S3. Various teams at the company want to view the dashboards by using Amazon QuickSight with ML insights. The solution also must correct the scalability problems that the company experiences when it uses its current architecture to ingest data.
Which solution will MOST cost-effectively meet these requirements?

  • A. Replace Kafka with Amazon Managed Streaming for Apache Kafka. Ingest the data by using AWS Lambda, and store the data in Amazon S3. Use QuickSight Standard edition to refresh the data in SPICE from Amazon S3 hourly and create a dynamic dashboard with forecasting and ML insights.
  • B. Replace Kafka with an Amazon Kinesis data stream. Use an Amazon Kinesis Data Firehose delivery stream to consume the data and store the data in Amazon S3. Use QuickSight Enterprise edition to refresh the data in SPICE from Amazon S3 hourly and create a dynamic dashboard with forecasting and ML insights.
  • C. Configure the Kafka-Kinesis-Connector to publish the data to an Amazon Kinesis Data Firehose delivery stream that is configured to store the data in Amazon S3. Use QuickSight Enterprise edition to refresh the data in SPICE from Amazon S3 hourly and create a dynamic dashboard with forecasting and ML insights.
  • D. Configure the Kafka-Kinesis-Connector to publish the data to an Amazon Kinesis Data Firehose delivery stream that is configured to store the data in Amazon S3. Configure an AWS Glue crawler to crawl the data. Use an Amazon Athena data source with QuickSight Standard edition to refresh the data in SPICE hourly and create a dynamic dashboard with forecasting and ML insights.
Suggested Answer: B
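For readers who want to see what option B's ingestion path looks like in practice, here is a minimal boto3 sketch: producers write to a Kinesis data stream, and a Firehose delivery stream configured with that stream as its source batches the records into S3, where the QuickSight SPICE dataset picks them up hourly. All resource names, ARNs, and the IAM role below are hypothetical placeholders, not values from the question.

```python
import json
import boto3

kinesis = boto3.client("kinesis")
firehose = boto3.client("firehose")

# Hypothetical resource names/ARNs -- replace with your own.
STREAM_NAME = "social-media-events"
STREAM_ARN = "arn:aws:kinesis:us-east-1:111122223333:stream/social-media-events"
FIREHOSE_ROLE_ARN = "arn:aws:iam::111122223333:role/firehose-delivery-role"
BUCKET_ARN = "arn:aws:s3:::social-media-analytics-bucket"

# One-time setup: a Firehose delivery stream that reads from the Kinesis
# data stream and batches records into S3 (where QuickSight SPICE reads them).
firehose.create_delivery_stream(
    DeliveryStreamName="social-media-to-s3",
    DeliveryStreamType="KinesisStreamAsSource",
    KinesisStreamSourceConfiguration={
        "KinesisStreamARN": STREAM_ARN,
        "RoleARN": FIREHOSE_ROLE_ARN,
    },
    ExtendedS3DestinationConfiguration={
        "RoleARN": FIREHOSE_ROLE_ARN,
        "BucketARN": BUCKET_ARN,
        "Prefix": "events/",
        "BufferingHints": {"IntervalInSeconds": 300, "SizeInMBs": 64},
    },
)

# Producers replace their Kafka client with a Kinesis put_record call.
kinesis.put_record(
    StreamName=STREAM_NAME,
    Data=json.dumps({"user_id": "123", "action": "post"}).encode("utf-8"),
    PartitionKey="123",
)
```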

Comments

eccc7e6
1 year, 6 months ago
Selected Answer: B
A is incorrect because QuickSight Standard edition does not support ML Insights. C is incorrect because it does not solve the scalability problems, since it keeps part of the current architecture. D is incorrect for the same reasons as A and C. Also, Kinesis Data Streams (KDS) is cheaper than MSK.
upvoted 2 times
...
LocalHero
1 year, 8 months ago
The Kafka-Kinesis-Connector is very expensive.
upvoted 1 times
...
LocalHero
1 year, 8 months ago
A and D are not correct: QuickSight Standard edition does not support ML Insights; ML Insights requires QuickSight Enterprise edition. C is not correct: the Kafka-Kinesis-Connector is a high-cost solution.
upvoted 1 times
...
ccpmad
2 years ago
Selected Answer: B
"The solution also must correct the scalability problems": so, can't be C
upvoted 1 times
...
pk349
2 years, 2 months ago
B: I passed the test
upvoted 1 times
...
Chelseajcole
2 years, 6 months ago
Selected Answer: B
C is wrong because the question states: "The solution also must correct the scalability problems that the company experiences when it uses its current architecture to ingest data." We should go with a pure AWS solution; using the connector cannot solve the Apache Kafka scaling problem unless the company moves to MSK.
upvoted 3 times
...
nadavw
2 years, 7 months ago
Selected Answer: C
The Kafka-Kinesis-Connector for Firehose is used to publish messages from Kafka to one of the following destinations: Amazon S3, Amazon Redshift, or Amazon Elasticsearch Service, which in turn enables near-real-time analytics with existing business intelligence tools and dashboards. Amazon Kinesis Data Firehose can transform, batch, and archive messages onto S3 and retry if the destination is unavailable (https://github.com/awslabs/kinesis-kafka-connector). ML Insights is available in Enterprise edition.
upvoted 1 times
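For context on the connector described in the comment above, here is a minimal sink-connector properties sketch for the awslabs kinesis-kafka-connector. The property names follow the project README as recalled and should be verified against the repo; the connector name, topic, region, and delivery-stream values are hypothetical placeholders.

```properties
# Kafka Connect sink that forwards Kafka topic records to a Firehose delivery stream.
# Property names are assumptions based on the awslabs/kinesis-kafka-connector README.
name=firehose-sink-connector
connector.class=com.amazon.kinesis.kafka.FirehoseSinkConnector
tasks.max=1
topics=social-media-events
region=us-east-1
deliveryStream=social-media-to-s3
batch=true
batchSize=500
batchSizeInBytes=3670016
```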
...
rav009
2 years, 8 months ago
Selected Answer: C
Cost-effectiveness is the key. C saves cost, while B requires changing the producer code from Kafka to KDS.
upvoted 1 times
...
muhsin
2 years, 10 months ago
I think it is C, because we don't need to use KDS to receive the streams: Kafka can publish the data to Kinesis Data Firehose directly with the connector. https://github.com/awslabs/kinesis-kafka-connector
upvoted 1 times
muhsin
2 years, 10 months ago
Pardon, B is replacing Kafka with KDS, so it is B.
upvoted 2 times
...
...
rocky48
2 years, 11 months ago
Selected Answer: B
upvoted 1 times
...
dushmantha
3 years ago
Selected Answer: B
ML Insights is available in Enterprise edition, and hourly refresh of SPICE is also available in Enterprise edition, so it has to be B or C. Since the current solution is not scalable, it needs to be replaced. Hence B is the answer.
upvoted 3 times
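To make the hourly SPICE refresh point concrete: in Enterprise edition the hourly schedule can be configured directly on the dataset, but a refresh can also be triggered programmatically. Below is a minimal sketch of a Lambda-style handler that starts a full SPICE ingestion with boto3, assuming it is invoked hourly by an EventBridge schedule; the account ID and dataset ID are hypothetical placeholders.

```python
import uuid
import boto3

quicksight = boto3.client("quicksight")

# Hypothetical identifiers -- replace with your own.
ACCOUNT_ID = "111122223333"
DATASET_ID = "social-media-spice-dataset"

def handler(event, context):
    """Start a full SPICE refresh; invoke hourly, e.g. from an EventBridge schedule."""
    response = quicksight.create_ingestion(
        AwsAccountId=ACCOUNT_ID,
        DataSetId=DATASET_ID,
        IngestionId=str(uuid.uuid4()),  # each refresh needs a unique ingestion ID
    )
    return response["IngestionStatus"]
```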
...
jrheen
3 years, 2 months ago
Answer - B
upvoted 1 times
...
Teraxs
3 years, 2 months ago
Selected Answer: B
Enterprise edition is needed because of the ML capabilities. The current solution is not scalable and thus needs to be replaced, i.e., B.
upvoted 4 times
...
CHRIS12722222
3 years, 2 months ago
B seems okay. ML Insights is in Enterprise edition.
upvoted 2 times
...
rb39
3 years, 2 months ago
Selected Answer: B
B - Kinesis is cheaper than Kafka
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other