Microsoft Exam DP-203 Topic 4 Question 76 Discussion

Actual exam question for Microsoft's DP-203 exam
Question #: 76
Topic #: 4

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure Data Lake Storage account that contains a staging zone.

You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.

Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes a mapping data flow, and then inserts the data into the data warehouse.

Does this meet the goal?

Suggested Answer: A
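For reference, the "daily process" in the proposed solution corresponds to an Azure Data Factory schedule trigger, which is defined in JSON along the lines of the sketch below (the trigger name, pipeline name, and start time are illustrative, not part of the question):

```json
{
  "name": "DailyIngestTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2024-01-01T02:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "StagingToSynapsePipeline",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```

The trigger fires the referenced pipeline once per day; the pipeline itself would contain the data flow and the sink activity that loads Azure Synapse Analytics.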

Contribute your Thoughts:

Billy
2 months ago
I'm going to have to go with 'Yes' on this one. Can't go wrong with the good old 'ETL in the cloud' approach. Now, where's the multiple choice option for 'Profit'?
Brandee
14 days ago
Definitely, it's a straightforward solution for ingesting and transforming data.
Olga
28 days ago
I agree, using Azure Data Factory for ETL processes is a solid choice.
Lelia
1 month ago
Yes
Audry
2 months ago
Ha! They're really trying to trick us with this one. Of course this meets the goal - using Azure Data Factory is a textbook solution for this scenario.
Maryrose
2 months ago
Hmm, I'm not sure about using a mapping data flow. Wouldn't a custom code activity be more flexible for the R script transformation? I'd want to dig into the details a bit more.
Lizbeth
12 days ago
That's a good point; we should consider that option as well.
Veronique
21 days ago
I agree, a custom code activity could be more flexible.
Tammi
1 month ago
I'm not so sure; a custom code activity might be more flexible for the R script transformation.
Lenna
2 months ago
I think using a mapping data flow is the way to go.
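On the mapping data flow vs. custom activity question raised above: mapping data flows transform data with ADF's visual expression language, while arbitrary R scripts are typically run through a Custom activity on an Azure Batch pool. A hedged sketch of such an activity definition follows (the activity name, linked service names, script name, and folder path are all illustrative assumptions, not from the question):

```json
{
  "name": "RunRScript",
  "type": "Custom",
  "linkedServiceName": {
    "referenceName": "AzureBatchLinkedService",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "command": "Rscript transform.R",
    "resourceLinkedService": {
      "referenceName": "StagingStorageLinkedService",
      "type": "LinkedServiceReference"
    },
    "folderPath": "scripts/custom-activity"
  }
}
```

Here the Custom activity stages the contents of `folderPath` onto a Batch compute node and runs the `command` there, which is why commenters see it as the more flexible option for script-based transformations.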
Marisha
2 months ago
I'm not sure, I think we should consider other options as well.
Rebecka
2 months ago
This solution seems straightforward and covers the key requirements - ingesting incremental data, transforming with an R script, and loading to the data warehouse. Looks good to me!
Jesusa
18 days ago
A) Yes
Veronica
19 days ago
A) Yes
Sarah
22 days ago
B) No
Karma
25 days ago
A) Yes
Cristina
3 months ago
I agree with Denae; using Azure Data Factory seems like a good approach.
Denae
3 months ago
I think the solution meets the goal.
