Welcome to Pass4Success


Microsoft DP-420 Exam - Topic 2 Question 74 Discussion

Actual exam question for Microsoft's DP-420 exam
Question #: 74
Topic #: 2

You need to configure an Apache Kafka instance to ingest data from an Azure Cosmos DB Core (SQL) API account. The data from a container named telemetry must be added to a Kafka topic named iot. The solution must store the data in a compact binary format.

Which three configuration items should you include in the solution? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Suggested Answer: C, D, F

C: Avro is a binary format, while JSON is text.
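The size difference is easy to see with a tiny illustration (Python standard library only; the fixed-width packing below is a stand-in for Avro's compact encoding, not actual Avro serialization):

```python
import json
import struct

# A telemetry-style record: a device id and a temperature reading.
record = {"deviceId": 4711, "temperature": 21.5}

# JSON is human-readable text; every field name and digit costs bytes.
json_bytes = json.dumps(record).encode("utf-8")

# A binary packing: one unsigned 32-bit int + one 64-bit float = 12 bytes,
# with no field names on the wire (the schema lives outside the payload,
# as it does with Avro).
binary_bytes = struct.pack("<Id", record["deviceId"], record["temperature"])

print(len(json_bytes), len(binary_bytes))  # the binary form is far smaller
```

This is why a requirement to "store the data in a compact binary format" points to Avro rather than JSON.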

F: Kafka Connect for Azure Cosmos DB provides connectors to read data from and write data to Azure Cosmos DB. The Azure Cosmos DB sink connector lets you export data from Apache Kafka topics to an Azure Cosmos DB database. The connector polls data from Kafka and writes it to containers in the database, based on the topic subscription.

D: Create the Azure Cosmos DB sink connector in Kafka Connect. The following JSON body defines config for the sink connector.

Extract (quotes corrected to valid JSON; the Avro converter class is Confluent's io.confluent.connect.avro.AvroConverter, not the nonexistent org.apache.kafka.connect.json.AvroConverter):

    "connector.class": "com.azure.cosmos.kafka.connect.sink.CosmosDBSinkConnector",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "connect.cosmos.containers.topicmap": "hotels#kafka"
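The topicmap value maps Kafka topics to Cosmos DB containers as comma-separated topic#container pairs; for this question's scenario that pair would be iot#telemetry. A small sketch of how such a value parses (the helper name is mine, not part of the connector):

```python
def parse_topic_map(value: str) -> dict:
    """Parse a connect.cosmos.containers.topicmap value into a
    {kafka_topic: cosmos_container} dict (illustrative helper only)."""
    mapping = {}
    for pair in value.split(","):
        # Each entry has the form "topic#container".
        topic, _, container = pair.strip().partition("#")
        mapping[topic] = container
    return mapping

# The question's scenario: the Kafka topic "iot" paired with the
# Cosmos DB container "telemetry".
print(parse_topic_map("iot#telemetry"))  # {'iot': 'telemetry'}
```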

Incorrect Answers:

B: JSON is plain text.

Note, full example:

    {
      "name": "cosmosdb-sink-connector",
      "config": {
        "connector.class": "com.azure.cosmos.kafka.connect.sink.CosmosDBSinkConnector",
        "tasks.max": "1",
        "topics": [
          "hotels"
        ],
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schemas.enable": "false",
        "key.converter": "io.confluent.connect.avro.AvroConverter",
        "key.converter.schemas.enable": "false",
        "connect.cosmos.connection.endpoint": "https://<cosmosinstance-name>.documents.azure.com:443/",
        "connect.cosmos.master.key": "<cosmosdbprimarykey>",
        "connect.cosmos.databasename": "kafkaconnect",
        "connect.cosmos.containers.topicmap": "hotels#kafka"
      }
    }
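Assuming a Kafka Connect worker listening on its default REST port 8083 (the hostname and the file name below are my assumptions, not from the question), a config body like the one above is typically registered with a POST to the Connect REST API:

```shell
# Register the sink connector with the Kafka Connect REST API.
# localhost:8083 is the default Connect worker address (an assumption here);
# cosmosdb-sink.json would hold the JSON body shown above.
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d @cosmosdb-sink.json
```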


https://docs.microsoft.com/en-us/azure/cosmos-db/sql/kafka-connector-sink

https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained/

Contribute your Thoughts:

Marci
21 hours ago
I feel like we practiced a similar question where we had to choose the right connector and converters, and I think A and C might be the right choices here.
upvoted 0 times
Florinda
6 days ago
I think option D makes sense because it maps the Cosmos container to the Kafka topic, but I'm not completely confident about the syntax.
upvoted 0 times
Cordelia
11 days ago
I'm a bit unsure about the converters; I thought we needed a binary format, but I'm not sure if that means we should use Avro or JSON.
upvoted 0 times
Annelle
16 days ago
I remember we discussed the CosmosDBSourceConnector in class, so I think option A is definitely one of the correct answers.
upvoted 0 times
