Welcome to Pass4Success


Microsoft DP-420 Exam - Topic 10 Question 2 Discussion

Actual exam question for Microsoft's DP-420 exam
Question #: 2
Topic #: 10
[All DP-420 Questions]

You need to configure an Apache Kafka instance to ingest data from an Azure Cosmos DB Core (SQL) API account. The data from a container named telemetry must be added to a Kafka topic named iot. The solution must store the data in a compact binary format.

Which three configuration items should you include in the solution? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Suggested Answer: C, D, F

C: Avro is a binary format, while JSON is text.
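To illustrate the compactness argument, here is a minimal sketch using Python's stdlib `struct` as a stand-in for a schema-driven binary encoding (Avro itself requires a schema and a registry, so it is not used directly here; the field names and values are invented for illustration):

```python
import json
import struct

# A hypothetical telemetry record (field names invented for illustration).
record = {"deviceId": 42, "temperature": 21.5}

# Text encoding: JSON spells out field names and digits as characters.
as_json = json.dumps(record).encode("utf-8")

# Binary encoding: a fixed layout of one 32-bit int + one 64-bit double,
# standing in for what a schema-driven format like Avro produces.
as_binary = struct.pack("<id", record["deviceId"], record["temperature"])

print(len(as_json), len(as_binary))  # the binary form is 12 bytes, far smaller
```

The binary form carries no field names at all, because the schema (here, the `"<id"` layout) describes the structure out of band, which is exactly why Avro payloads on a Kafka topic stay compact.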

F: Kafka Connect for Azure Cosmos DB is a connector for reading data from and writing data to Azure Cosmos DB. The Azure Cosmos DB sink connector allows you to export data from Apache Kafka topics to an Azure Cosmos DB database. The connector polls data from Kafka and writes it to containers in the database based on the topics it subscribes to.

D: Create the Azure Cosmos DB sink connector in Kafka Connect. The following JSON body defines the configuration for the sink connector.

Extract:

"connector.class": "com.azure.cosmos.kafka.connect.sink.CosmosDBSinkConnector",
"key.converter": "io.confluent.connect.avro.AvroConverter",
"connect.cosmos.containers.topicmap": "hotels#kafka"
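The topicmap value follows a `topic#container` pattern, pairing a Kafka topic with a Cosmos DB container. A small sketch of how such an entry decomposes, using the `iot#telemetry` mapping this question's scenario calls for:

```python
# "topic#container" entries pair a Kafka topic with a Cosmos DB container.
def parse_topicmap(entry: str) -> tuple[str, str]:
    topic, container = entry.split("#", 1)
    return topic, container

# For this question: topic "iot", container "telemetry".
print(parse_topicmap("iot#telemetry"))  # → ('iot', 'telemetry')
```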

Incorrect Answers:

B: JSON is plain text.

Note, full example:

{
  "name": "cosmosdb-sink-connector",
  "config": {
    "connector.class": "com.azure.cosmos.kafka.connect.sink.CosmosDBSinkConnector",
    "tasks.max": "1",
    "topics": [
      "hotels"
    ],
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schemas.enable": "false",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schemas.enable": "false",
    "connect.cosmos.connection.endpoint": "https://<cosmosinstance-name>.documents.azure.com:443/",
    "connect.cosmos.master.key": "<cosmosdbprimarykey>",
    "connect.cosmos.databasename": "kafkaconnect",
    "connect.cosmos.containers.topicmap": "hotels#kafka"
  }
}
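Adapted to this question's names (topic `iot`, container `telemetry`), a hedged sketch of the same payload: the endpoint, key, and database values are placeholders you must supply, `io.confluent.connect.avro.AvroConverter` is assumed as the Avro converter class, and the Connect worker address in the comment is an assumption.

```python
import json

# Sketch only: the docs' sink example adapted to this question's scenario.
# Placeholders in angle brackets must be replaced with real account values.
connector = {
    "name": "cosmosdb-sink-connector",
    "config": {
        "connector.class": "com.azure.cosmos.kafka.connect.sink.CosmosDBSinkConnector",
        "tasks.max": "1",
        "topics": "iot",
        "key.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "connect.cosmos.connection.endpoint": "https://<account>.documents.azure.com:443/",
        "connect.cosmos.master.key": "<primary-key>",
        "connect.cosmos.databasename": "<database>",
        # Map the Kafka topic "iot" to the Cosmos DB container "telemetry".
        "connect.cosmos.containers.topicmap": "iot#telemetry",
    },
}

print(json.dumps(connector, indent=2))
# Register against a Connect worker (assumed at localhost:8083), e.g.:
#   curl -X POST -H "Content-Type: application/json" \
#        --data @connector.json http://localhost:8083/connectors
```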


https://docs.microsoft.com/en-us/azure/cosmos-db/sql/kafka-connector-sink

https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained/

Contribute your Thoughts:

Geoffrey
5 months ago
Wait, can we really use Cosmos DB with Kafka like this? Sounds too good to be true!
upvoted 0 times
...
Hester
5 months ago
'connect.cosmos.containers.topicmap': 'iot#telemetry' seems right for mapping.
upvoted 0 times
...
Emeline
5 months ago
Not sure about that, I prefer JSON for flexibility.
upvoted 0 times
...
Gerri
5 months ago
I think 'key.converter': 'io.confluent.connect.avro.AvroConverter' is the way to go.
upvoted 0 times
...
Azzie
5 months ago
Definitely need 'connector.class': 'com.azure.cosmos.kafka.connect.source.CosmosDBSourceConnector'!
upvoted 0 times
...
Layla
5 months ago
I feel like option B with the JsonConverter might not fit since we need a binary format. I think we should focus on Avro for the key.converter instead.
upvoted 0 times
...
Cherri
5 months ago
I practiced a similar question where we had to map containers to topics. I think option D, with the topic map, is crucial for linking telemetry to the iot topic.
upvoted 0 times
...
Paulina
5 months ago
I'm a bit unsure about the converters. I thought we needed a compact binary format, which might mean using Avro, but I'm not completely certain if that’s option C or D.
upvoted 0 times
...
Tommy
5 months ago
I remember that the CosmosDBSourceConnector is essential for pulling data from Cosmos DB, so I think option A is definitely one of the answers.
upvoted 0 times
...
Eulah
6 months ago
This is a tricky one. The Remote Call-In pattern seems interesting, but I'm not sure if it's the best fit for this scenario. I'll need to review the integration patterns more closely to make a confident decision.
upvoted 0 times
...
Tori
6 months ago
Alright, let's think this through. Universal Packages sounds like it could be the right solution here. It allows you to manage both public and private packages in a single feed, which is exactly what the question is asking for. I'm feeling pretty confident about that one.
upvoted 0 times
...
Selma
6 months ago
Ah yes, the critical section group is all about managing shared resources and ensuring only one process instance executes the grouped activities at a time. I've got this - C and D are my picks.
upvoted 0 times
...
Yuonne
6 months ago
Wait, I'm a little confused now. I thought the master budget was just the income statement and cash flow, not the balance sheet too. Let me re-read the question and options carefully before answering.
upvoted 0 times
...
Celestina
6 months ago
I think the Multifactor Approach sounds right since it deals with various aspects of a person's life and how they fit into terrorism.
upvoted 0 times
...
