Microsoft Exam DP-420 Topic 8 Question 30 Discussion

Actual exam question for Microsoft's DP-420 exam
Question #: 30
Topic #: 8

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a container named container1 in an Azure Cosmos DB Core (SQL) API account.

You need to make the contents of container1 available as reference data for an Azure Stream Analytics job.

Solution: You create an Azure Data Factory pipeline that uses Azure Cosmos DB Core (SQL) API as the input and Azure Blob Storage as the output.

Does this meet the goal?

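For readers who want to see what the proposed approach amounts to in practice, here is a minimal Python sketch of the same copy step: read the items in container1 with the azure-cosmos SDK and land them in Blob Storage as JSON, where a Stream Analytics job can pick them up as a reference data input. The account URL, key, database name, connection string, and blob path below are placeholders, and the exam solution itself uses an Azure Data Factory pipeline rather than hand-written code.

```python
import json

from azure.cosmos import CosmosClient
from azure.storage.blob import BlobServiceClient

# All of these values are placeholders for illustration only.
COSMOS_URL = "https://<cosmos-account>.documents.azure.com:443/"
COSMOS_KEY = "<cosmos-account-key>"
DATABASE_NAME = "<database-name>"
STORAGE_CONN_STR = "<storage-account-connection-string>"

# Read every item from container1 using the Core (SQL) API.
cosmos = CosmosClient(COSMOS_URL, credential=COSMOS_KEY)
container = cosmos.get_database_client(DATABASE_NAME).get_container_client("container1")
items = list(container.read_all_items())

# Write the items to a blob as a JSON array. The Stream Analytics job can then
# be configured with this blob as a reference data input.
blob_service = BlobServiceClient.from_connection_string(STORAGE_CONN_STR)
blob = blob_service.get_blob_client(container="reference-data", blob="container1/snapshot.json")
blob.upload_blob(json.dumps(items), overwrite=True)
```

An Azure Data Factory copy activity performs essentially the same step declaratively and can be scheduled to keep the blob snapshot up to date.
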
Suggested Answer: B, C

Contribute your Thoughts:

Luis
16 days ago
I bet the person who came up with this solution has a secret fetish for extra steps in their cloud architecture.
upvoted 0 times
Lizbeth
2 days ago
Yes
upvoted 0 times
Vinnie
19 days ago
Haha, using Data Factory to move data from Cosmos DB to Blob Storage? That's like taking the long way around to get to the store just down the street.
upvoted 0 times
Carin
20 days ago
This solution seems overly complicated. Why not just use the Cosmos DB input option in the Azure Stream Analytics job? That would be the most straightforward way to achieve the goal.
upvoted 0 times
Mauricio
2 days ago
Yes
upvoted 0 times
Keshia
1 month ago
I'm not sure if this solution meets the goal. Wouldn't it be simpler to just use the Cosmos DB input option in the Azure Stream Analytics job directly?
upvoted 0 times
Teddy
1 day ago
I agree with Keshia; it would be simpler to use the Cosmos DB input option directly.
upvoted 0 times
Shawnta
16 days ago
No
upvoted 0 times
Cruz
20 days ago
Yes
upvoted 0 times
German
1 month ago
The solution involves using Azure Data Factory to move data from Cosmos DB to Blob Storage, which doesn't seem to be the most direct way to make the data available for the Azure Stream Analytics job.
upvoted 0 times
Myrtie
16 days ago
No
upvoted 0 times
Tammi
21 days ago
No
upvoted 0 times
Lavonda
1 month ago
Yes
upvoted 0 times
Mona
2 months ago
No, I don't think that will meet the goal. We should use Azure Stream Analytics directly with Cosmos DB.
upvoted 0 times
Amira
2 months ago
I think it's a good solution because Azure Data Factory can easily move data between different services.
upvoted 0 times
Shenika
2 months ago
Yes, that should work.
upvoted 0 times
Kristofer
2 months ago
No, I don't think so. We should use Azure Stream Analytics directly to access container1.
upvoted 0 times
Earnestine
2 months ago
I think it's a good solution because Azure Data Factory can easily connect to Azure Cosmos DB and Azure Blob Storage.
upvoted 0 times
Farrah
2 months ago
Yes, that should work.
upvoted 0 times
