Exam DP-420 topic 5 question 8 discussion

Actual exam question from Microsoft's DP-420
Question #: 8
Topic #: 5

You need to configure an Apache Kafka instance to ingest data from an Azure Cosmos DB Core (SQL) API account. The data from a container named telemetry must be added to a Kafka topic named iot. The solution must store the data in a compact binary format.
Which three configuration items should you include in the solution? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. "connector.class": "com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector"
  • B. "key.converter": "org.apache.kafka.connect.json.JsonConverter"
  • C. "key.converter": "io.confluent.connect.avro.AvroConverter"
  • D. "connect.cosmos.containers.topicmap": "iot#telemetry"
  • E. "connect.cosmos.containers.topicmap": "iot"
  • F. "connector.class": "com.azure.cosmos.kafka.connect.source.CosmosDBSinkConnector"
Suggested Answer: ACD
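Putting the three selected options together, a complete source-connector configuration might look like the following sketch. The endpoint, key, database name, and Schema Registry URL are placeholder assumptions not given in the question; only the three answer properties come from the question itself:

```json
{
  "name": "cosmosdb-source-connector",
  "config": {
    "connector.class": "com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector",

    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081",

    "connect.cosmos.connection.endpoint": "https://<account>.documents.azure.com:443/",
    "connect.cosmos.master.key": "<primary-key>",
    "connect.cosmos.databasename": "<database>",
    "connect.cosmos.containers.topicmap": "iot#telemetry"
  }
}
```

Note that the Avro converters require a running Schema Registry, which is what makes Avro's compact binary encoding possible: records carry only a schema ID rather than a full schema.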

Comments

TimSss
Highly Voted 1 year, 7 months ago
Selected Answer: ACD
We want to move data from Cosmos DB to Kafka, so we need the source connector, not the sink connector.
upvoted 10 times
TRUESON
1 year ago
The source connector reads Cosmos DB data into Kafka; the sink connector writes Kafka data to Cosmos DB.
upvoted 2 times
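To illustrate the source/sink distinction: a sink connector configuration (Kafka topic into a Cosmos DB container, the opposite direction of this question) would look roughly like this. All values here are placeholder assumptions; note the class lives in the `sink` package, unlike the distractor in option F:

```json
{
  "name": "cosmosdb-sink-connector",
  "config": {
    "connector.class": "com.azure.cosmos.kafka.connect.sink.CosmosDBSinkConnector",
    "topics": "iot",
    "connect.cosmos.connection.endpoint": "https://<account>.documents.azure.com:443/",
    "connect.cosmos.master.key": "<primary-key>",
    "connect.cosmos.databasename": "<database>",
    "connect.cosmos.containers.topicmap": "iot#telemetry"
  }
}
```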
TRUESON
1 year ago
For a better understanding, watch this video from 22:30: https://www.youtube.com/live/b9L_CTUaz5Y
upvoted 1 times
AscentAcademy
Highly Voted 1 year, 9 months ago
Not sure, but shouldn't the answer include A and not F? Here we intend to use Azure Cosmos DB as the source, exporting to Kafka as the sink, meaning we should use "com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector", as stated here: https://docs.microsoft.com/en-us/azure/cosmos-db/sql/kafka-connector-source: "Kafka Connect for Azure Cosmos DB is a connector to read from and write data to Azure Cosmos DB. The Azure Cosmos DB source connector provides the capability to read data from the Azure Cosmos DB change feed and publish this data to a Kafka topic."
upvoted 6 times
Garyn
Most Recent 7 months, 2 weeks ago
Selected Answer: ACD
To configure an Apache Kafka instance to ingest data from an Azure Cosmos DB Core (SQL) API account into a Kafka topic named "iot" while storing the data in a compact binary format, include the following configuration items:
A. "connector.class": "com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector" specifies the source connector class for ingesting data from Azure Cosmos DB.
C. "key.converter": "io.confluent.connect.avro.AvroConverter" selects Avro, which is a compact binary format; using this converter for both key and value stores the data in Avro. Option B ("org.apache.kafka.connect.json.JsonConverter") is not correct because JSON is a text format, not a compact binary one.
D. "connect.cosmos.containers.topicmap": "iot#telemetry" maps the Cosmos DB container "telemetry" to the Kafka topic "iot".
So, you should include configurations A, C, and D for these requirements.
upvoted 4 times
azuredemo2022three
10 months, 3 weeks ago
Selected Answer: ACD
Answer
upvoted 1 times
Community vote distribution: A (35%), C (25%), B (20%), Other