Welcome to Pass4Success


Microsoft DP-420 Exam - Topic 8 Question 30 Discussion

Actual exam question for Microsoft's DP-420 exam
Question #: 30
Topic #: 8

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a container named container1 in an Azure Cosmos DB Core (SQL) API account.

You need to make the contents of container1 available as reference data for an Azure Stream Analytics job.

Solution: You create an Azure Data Factory pipeline that uses Azure Cosmos DB Core (SQL) API as the input and Azure Blob Storage as the output.

Does this meet the goal?

Suggested Answer: A (Yes)
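Some background on why a copy to Blob Storage can satisfy this requirement: Azure Stream Analytics reads reference data from Azure Blob Storage (or Azure SQL Database), typically as CSV or JSON, while a Cosmos DB container cannot be attached as a reference-data input directly. So a pipeline that lands container1's documents in a blob makes them consumable. The sketch below is only a hypothetical illustration of the target blob content, not the Data Factory pipeline itself: it takes mock documents standing in for items read from container1 and serializes them as line-delimited JSON, one common shape for a JSON reference-data blob.

```python
import json

def to_reference_blob(documents):
    """Serialize Cosmos DB-style documents into line-delimited JSON
    suitable for a blob used as Stream Analytics reference data.
    Cosmos system properties (keys starting with '_') are dropped."""
    lines = []
    for doc in documents:
        cleaned = {k: v for k, v in doc.items() if not k.startswith("_")}
        lines.append(json.dumps(cleaned, sort_keys=True))
    return "\n".join(lines)

# Mock documents standing in for items read from container1
# ('_rid'/'_etag' mimic Cosmos DB system properties).
docs = [
    {"id": "1", "sku": "A100", "price": 10, "_rid": "r1", "_etag": "e1"},
    {"id": "2", "sku": "B200", "price": 25, "_rid": "r2", "_etag": "e2"},
]
print(to_reference_blob(docs))
```

In the scenario from the question, the serialization and upload would be handled by the Data Factory copy activity rather than hand-written code; the point is simply that the data ends up in a blob format Stream Analytics can use.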

Contribute your Thoughts:

Rodrigo
3 months ago
Wait, can you really use Blob Storage as a middleman like that?
Aron
3 months ago
Sounds good, but what about data latency issues?
Florencia
4 months ago
I’m not so sure, isn’t there a direct way to connect Cosmos DB to Stream Analytics?
Hannah
4 months ago
Totally agree, that should work for moving data!
Becky
4 months ago
This solution uses Azure Data Factory, which is a valid approach.
Holley
4 months ago
I'm a bit confused about the requirements. If the goal is just to make the data available, then maybe this solution is fine, but I feel like there might be a more direct method.
Alpha
4 months ago
My notes mention that Stream Analytics can directly read from Cosmos DB, so I'm leaning towards saying this solution might not meet the goal.
Lisbeth
4 months ago
I remember a practice question where we had to use Azure Functions instead of Data Factory. I wonder if that's a better approach here.
Xochitl
5 months ago
I think using Azure Data Factory to move data to Blob Storage could work, but I'm not entirely sure if that's the best way to make it available for Stream Analytics.
Jeff
5 months ago
I'm a bit confused by the wording of the question. It mentions that we won't be able to return to this question, so I want to make sure I understand the requirements fully before answering. Let me re-read the question and think through any potential alternative solutions that might be worth considering.
Gracia
5 months ago
Hmm, I'm a bit unsure about this one. The question mentions that there might be multiple correct solutions, so I want to make sure I'm not missing something. I'll double-check the requirements and see if there are any other ways to approach this that might be more efficient or better align with the goals.
Lynda
5 months ago
This looks like a straightforward question. I'll start by reviewing the requirements - we need to make the contents of container1 in Cosmos DB available as reference data for an Azure Stream Analytics job. The solution provided is to create an Azure Data Factory pipeline that uses Cosmos DB as the input and Blob Storage as the output. I think this should meet the goal, so I'll select "Yes".
Eleonora
5 months ago
Okay, I think I've got a handle on this. The key is to make the Cosmos DB data available as reference data for the Stream Analytics job. The solution provided seems like a reasonable approach, as it involves moving the data from Cosmos DB to Blob Storage, which could then be used as a reference data source for the Stream Analytics job. I'll go with "Yes" on this one.
Luis
9 months ago
I bet the person who came up with this solution has a secret fetish for extra steps in their cloud architecture.
France
8 months ago
Yes
Jenise
9 months ago
No
Lizbeth
9 months ago
Yes
Vinnie
9 months ago
Haha, using Data Factory to move data from Cosmos DB to Blob Storage? That's like taking the long way around to get to the store just down the street.
Carin
9 months ago
This solution seems overly complicated. Why not just use the Cosmos DB input option in the Azure Stream Analytics job? That would be the most straightforward way to achieve the goal.
Gilma
8 months ago
I agree, using the Cosmos DB input option in Azure Stream Analytics would be simpler.
An
8 months ago
No
Mauricio
9 months ago
Yes
Keshia
10 months ago
I'm not sure if this solution meets the goal. Wouldn't it be simpler to just use the Cosmos DB input option in the Azure Stream Analytics job directly?
Teddy
9 months ago
I agree with Keshia, it would be simpler to use the Cosmos DB input option directly.
Shawnta
9 months ago
No
Cruz
9 months ago
Yes
German
10 months ago
The solution involves using Azure Data Factory to move data from Cosmos DB to Blob Storage, which doesn't seem to be the most direct way to make the data available for the Azure Stream Analytics job.
Myrtie
9 months ago
No
Tammi
9 months ago
No
Lavonda
10 months ago
Yes
Mona
10 months ago
No, I don't think that will meet the goal. We should use Azure Stream Analytics directly with Cosmos DB.
Amira
10 months ago
I think it's a good solution because Azure Data Factory can easily move data between different services.
Shenika
10 months ago
Yes, that should work.
Kristofer
10 months ago
No, I don't think so. We should use Azure Stream Analytics directly to access container1.
Earnestine
11 months ago
I think it's a good solution because Azure Data Factory can easily connect to Azure Cosmos DB and Azure Blob Storage.
Farrah
11 months ago
Yes, that should work.
