
Microsoft DP-300 Exam - Topic 8 Question 42 Discussion

Actual exam question for Microsoft's DP-300 exam
Question #: 42
Topic #: 8

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have an Azure Data Lake Storage account that contains a staging zone.

You need to design a daily process to ingest incremental data from the staging zone, transform the data by executing an R script, and then insert the transformed data into a data warehouse in Azure Synapse Analytics.

Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that executes an Azure Databricks notebook, and then inserts the data into the data warehouse.

Does this meet the goal?
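For context, the proposed solution maps to two Data Factory resources: a schedule trigger that fires daily, and a pipeline containing a Databricks Notebook activity (the notebook would hold the R transformation logic and write the results to the Synapse data warehouse). A minimal ARM-template-style sketch follows; the factory, pipeline, notebook path, and linked-service names are all hypothetical placeholders, not values from the question:

```json
{
  "resources": [
    {
      "type": "Microsoft.DataFactory/factories/pipelines",
      "name": "myfactory/DailyIngest",
      "apiVersion": "2018-06-01",
      "properties": {
        "activities": [
          {
            "name": "TransformWithRNotebook",
            "type": "DatabricksNotebook",
            "typeProperties": {
              "notebookPath": "/Shared/transform_staging"
            },
            "linkedServiceName": {
              "referenceName": "DatabricksLinkedService",
              "type": "LinkedServiceReference"
            }
          }
        ]
      }
    },
    {
      "type": "Microsoft.DataFactory/factories/triggers",
      "name": "myfactory/DailySchedule",
      "apiVersion": "2018-06-01",
      "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
          "recurrence": {
            "frequency": "Day",
            "interval": 1
          }
        },
        "pipelines": [
          {
            "pipelineReference": {
              "referenceName": "DailyIngest",
              "type": "PipelineReference"
            }
          }
        ]
      }
    }
  ]
}
```

This sketch only illustrates the wiring described in the solution; whether the notebook itself can perform the R transformation and the incremental load is the point under discussion below.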

Suggested Answer: B

Contribute your Thoughts:

Shawna
4 months ago
Not sure if this is the best approach; it seems complicated.
Tamera
4 months ago
Wait, can R scripts run directly in Databricks?
Luz
4 months ago
Yes, that should work perfectly!
Dong
5 months ago
I think using Databricks is a bit overkill for this.
Bernadine
5 months ago
Sounds like a solid plan!
Lashaun
5 months ago
I believe using a notebook in Databricks is a good approach, but I wonder if there are better alternatives for the transformation step.
Elke
5 months ago
I'm a bit confused about whether the pipeline can handle incremental data ingestion properly. I hope it does!
Kattie
5 months ago
I think using Azure Data Factory with a schedule trigger makes sense for this scenario, but I'm not entirely sure if it covers all the transformation needs.
Keith
5 months ago
I remember a practice question where we had to use Databricks for data transformation, so this seems similar. I feel like it could work.
