Welcome to Pass4Success


Microsoft DP-300 Exam - Topic 1 Question 44 Discussion

Actual exam question for Microsoft's DP-300 exam
Question #: 44
Topic #: 1
[All DP-300 Questions]

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure Data Lake Storage account that contains a staging zone.

You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.

Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes mapping data flow, and then inserts the data into the data warehouse.

Does this meet the goal?

Suggested Answer: B (No, this does not meet the goal.)

A mapping data flow cannot execute an R script. If you need to transform data in a way that Data Factory does not support natively, you can create a custom activity (not a mapping data flow) with your own data-processing logic and use that activity in the pipeline. For example, you can create a custom activity that runs R scripts on an HDInsight cluster with R installed.


https://docs.microsoft.com/en-US/azure/data-factory/transform-data
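To illustrate the shape of the alternative the explanation describes, the sketch below shows an Azure Data Factory pipeline definition that uses a Custom activity to invoke an R script. The linked-service names, folder path, and script name are placeholders for illustration only, not details from the question; a Custom activity in ADF runs its command on a compute linked service such as Azure Batch.

```json
{
  "name": "DailyIngestAndTransformPipeline",
  "properties": {
    "activities": [
      {
        "name": "RunRScriptTransform",
        "type": "Custom",
        "linkedServiceName": {
          "referenceName": "BatchComputeLinkedService",
          "type": "LinkedServiceReference"
        },
        "typeProperties": {
          "command": "Rscript transform.R",
          "folderPath": "custom-activities/r-transform",
          "resourceLinkedService": {
            "referenceName": "StagingStorageLinkedService",
            "type": "LinkedServiceReference"
          }
        }
      }
    ]
  }
}
```

A schedule trigger attached to this pipeline would handle the daily cadence; a subsequent Copy activity (omitted here) would load the transformed output into the Azure Synapse Analytics data warehouse.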

Contribute your Thoughts:

Patti
4 months ago
Not sure if mapping data flow is the best choice here.
upvoted 0 times
...
Isidra
4 months ago
Yes, this definitely meets the goal!
upvoted 0 times
...
Leoma
4 months ago
Wait, can Azure Data Factory run R scripts directly?
upvoted 0 times
...
Princess
5 months ago
I think it might miss some steps.
upvoted 0 times
...
Ora
5 months ago
Sounds like a solid plan!
upvoted 0 times
...
Hyman
5 months ago
I remember that we had a question where we had to ensure the transformation was done correctly before loading into Synapse. This solution seems a bit off to me.
upvoted 0 times
...
Beth
5 months ago
I recall that we discussed using triggers in Azure Data Factory, but I'm uncertain if mapping data flows can handle R scripts directly. Maybe we need a different activity?
upvoted 0 times
...
Billy
5 months ago
I think this is similar to a practice question we did where we had to use Azure Data Factory for data ingestion. I feel like it could work, but I'm not confident about the R script part.
upvoted 0 times
...
Tien
5 months ago
I'm not entirely sure if using a mapping data flow is the best approach for executing an R script. I remember something about needing a specific activity for that.
upvoted 0 times
...
Fabiola
5 months ago
I'm pretty sure a Kubernetes namespace is the same as a "Project" in this context, so I'll go with option D.
upvoted 0 times
...
Dulce
5 months ago
I'm not totally sure, but I vaguely remember something about static indexes too. They might play a role in searching?
upvoted 0 times
...
Kayleigh
5 months ago
Didn't we practice a similar question where strengthening equity was one of the recommended strategies? I think option C might be correct here.
upvoted 0 times
...
