
Microsoft DP-100 Exam - Topic 10 Question 46 Discussion

Actual exam question for Microsoft's DP-100 exam
Question #: 46
Topic #: 10

You use the Azure Machine Learning designer to create and run a training pipeline.

The pipeline must be run every night to generate predictions for a large volume of files. The folder where the files will be stored is defined as a dataset.

You need to publish the pipeline as a REST service that can be used for the nightly inferencing run.

What should you do?

Suggested Answer: A

Azure Machine Learning Batch Inference targets large inference jobs that are not time-sensitive. Batch Inference provides cost-effective inference compute scaling, with unparalleled throughput for asynchronous applications. It is optimized for high-throughput, fire-and-forget inference over large collections of data.

You can submit a batch inference job as a pipeline run, or through REST calls to a published pipeline endpoint.


https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/machine-learning-pipelines/parallel-run/README.md
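Since the suggested answer hinges on publishing a batch inference pipeline and triggering it over REST each night, a minimal sketch may help. The snippet below builds the JSON body that a published Azure ML pipeline's REST endpoint expects (an `ExperimentName` plus optional `ParameterAssignments`); the experiment name, parameter names, and endpoint are placeholders for illustration, not values from the question, and the actual POST (shown commented out) would need an Azure AD bearer token.

```python
import json

def build_pipeline_run_request(experiment_name, parameter_assignments=None):
    """Build the JSON body for triggering a published Azure ML pipeline via REST.

    The published-pipeline endpoint accepts an ExperimentName and an optional
    ParameterAssignments mapping of pipeline-parameter names to values.
    """
    body = {"ExperimentName": experiment_name}
    if parameter_assignments:
        body["ParameterAssignments"] = parameter_assignments
    return body

# Hypothetical nightly run: these names are placeholders, not from the exam question.
body = build_pipeline_run_request(
    "nightly-batch-inference",
    {"input_folder": "nightly-files"},
)
print(json.dumps(body))

# The actual call would POST this body to the published pipeline's REST endpoint
# (available as published_pipeline.endpoint in SDK v1) with an AAD bearer token:
#
# import requests
# response = requests.post(
#     published_pipeline_endpoint,
#     headers={"Authorization": f"Bearer {aad_token}"},
#     json=body,
# )
# run_id = response.json().get("Id")
```

A scheduler (cron, or an Azure ML Schedule) could issue this POST every night; because batch inference is asynchronous and fire-and-forget, the caller does not need to wait for the results.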

Contribute your Thoughts:

Christiane
4 months ago
I thought real-time pipelines were for immediate predictions, not nightly ones?
upvoted 0 times
...
Therese
4 months ago
Setting the compute target is important, but it’s not the main step here.
upvoted 0 times
...
Elouise
4 months ago
Wait, can you really publish a batch pipeline as a REST service?
upvoted 0 times
...
Olene
5 months ago
I agree, option A makes the most sense here!
upvoted 0 times
...
Mose
5 months ago
A batch inference pipeline is definitely the way to go for nightly runs.
upvoted 0 times
...
Sylvie
5 months ago
Cloning the pipeline seems like an option, but I don't see how that helps with the REST service for inferencing.
upvoted 0 times
...
Rima
5 months ago
I feel like this question is similar to one we practiced about real-time vs batch processing. I think batch inference makes more sense for nightly runs.
upvoted 0 times
...
Cruz
5 months ago
I'm not entirely sure, but I remember something about setting the compute target for the pipeline. Could that be relevant here?
upvoted 0 times
...
Hillary
5 months ago
I think we need to create a batch inference pipeline since we're dealing with a large volume of files every night.
upvoted 0 times
...
