
Microsoft Exam PL-400 Topic 8 Question 83 Discussion

Actual exam question for Microsoft's PL-400 exam
Question #: 83
Topic #: 8

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.

After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You are designing a one-way integration from Microsoft Dataverse to another system.

You must use an Azure Function to update the other system. The integration must send only newly created records to the other system. The solution must avoid data loss when a component of the integration is unavailable for more than a few seconds.

You need to design the integration solution.

Solution: Register a webhook in the Dataverse instance that connects to the Azure Function. Register a step on the webhook that runs synchronously on the record's Create message in the post-operation stage.

Does the solution meet the goal?

Suggested Answer: B (No, the solution does not meet the goal)
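For context, when a webhook step fires, Dataverse POSTs the serialized RemoteExecutionContext for the operation to the registered endpoint. Below is a minimal sketch of the receiving Azure Function using the Python v2 programming model; the route name is arbitrary, and the payload handling assumes the documented key/value serialization of InputParameters:

```python
import json
import logging

import azure.functions as func

app = func.FunctionApp()

@app.route(route="dataverse-create", auth_level=func.AuthLevel.FUNCTION)
def on_record_created(req: func.HttpRequest) -> func.HttpResponse:
    # Dataverse POSTs the RemoteExecutionContext as JSON.
    context = req.get_json()

    # InputParameters is serialized as a list of key/value pairs;
    # for a Create message, the "Target" value carries the new record.
    target = next(
        (p["value"] for p in context.get("InputParameters", [])
         if p.get("key") == "Target"),
        None,
    )
    if target is None:
        return func.HttpResponse("No Target in payload", status_code=400)

    logging.info("New record: %s", json.dumps(target))

    # With a synchronous step, any exception or timeout here fails the
    # Create operation in Dataverse itself, and there is no automatic
    # retry once the call returns.
    return func.HttpResponse(status_code=200)
```

Because the step is synchronous, the Dataverse transaction is tied to the availability of the endpoint: a downstream outage of more than a few seconds fails the operation outright, which is why this design does not satisfy the requirement. An asynchronous step, or a durable queue in between as discussed in the comments below, is the usual way to meet it.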

Contribute your Thoughts:

Cortney
2 days ago
That's a good point. Maybe a hybrid approach with both synchronous and asynchronous components would handle component unavailability better: use the webhook to trigger the Azure Function, but have the Function queue the data for processing later (a sketch of this pattern follows below).
upvoted 0 times
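A rough sketch of the hybrid pattern Cortney describes, again in the Python v2 model. The queue name `outbound-records` and the `send_to_other_system` helper are placeholders:

```python
import azure.functions as func

app = func.FunctionApp()

def send_to_other_system(payload: str) -> None:
    # Placeholder for the real call to the downstream system.
    ...

@app.route(route="dataverse-create", auth_level=func.AuthLevel.FUNCTION)
@app.queue_output(arg_name="outbound",
                  queue_name="outbound-records",
                  connection="AzureWebJobsStorage")
def enqueue_record(req: func.HttpRequest,
                   outbound: func.Out[str]) -> func.HttpResponse:
    # Persist the webhook payload to a durable queue and acknowledge
    # immediately, so the webhook call itself stays fast and reliable.
    outbound.set(req.get_body().decode("utf-8"))
    return func.HttpResponse(status_code=202)

@app.queue_trigger(arg_name="msg",
                   queue_name="outbound-records",
                   connection="AzureWebJobsStorage")
def forward_record(msg: func.QueueMessage) -> None:
    # Runs with the queue trigger's built-in retries; after repeated
    # failures the message lands in a poison queue instead of being
    # lost, which covers a downstream outage of more than a few seconds.
    send_to_other_system(msg.get_body().decode("utf-8"))
```

Storage queue triggers retry a message (five dequeues by default) before moving it to a poison queue, so a transient downstream outage does not drop records.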
Hannah
3 days ago
Hmm, I'm not sure about this one. Doesn't the requirement mention that the solution should support scenarios where a component of the integration is unavailable for more than a few seconds? I'm wondering if a synchronous approach is the best way to handle that.
upvoted 0 times
Nichelle
4 days ago
Ah, I see what you're getting at. That would be a more resilient solution. We don't want to lose any data if the connection to the other system goes down for a bit. Queuing the data for later processing is a smart way to address that.
upvoted 0 times
Ligia
5 days ago
Good call! Batching the records would be a great way to optimize the performance of the integration; we don't want the Azure Function to get bogged down and cause delays in the data transfer (see the timer-based sketch below).
upvoted 0 times
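If batching matters, one option is to drain the same hypothetical `outbound-records` queue on a timer and forward records in groups; `send_batch_to_other_system` is again a placeholder:

```python
import os

import azure.functions as func
from azure.storage.queue import QueueClient

app = func.FunctionApp()

def send_batch_to_other_system(records: list[str]) -> None:
    # Placeholder for a bulk call to the downstream system.
    ...

@app.timer_trigger(schedule="0 */5 * * * *", arg_name="timer")
def forward_batch(timer: func.TimerRequest) -> None:
    queue = QueueClient.from_connection_string(
        os.environ["AzureWebJobsStorage"], "outbound-records")
    messages = list(queue.receive_messages(max_messages=32))
    if not messages:
        return
    send_batch_to_other_system([m.content for m in messages])
    # Delete only after the batch is accepted, so a failed send leaves
    # the messages on the queue for the next timer run.
    for m in messages:
        queue.delete_message(m)
```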
