
Salesforce Certified MuleSoft Platform Integration Architect (Mule-Arch-202) Exam - Topic 6 Question 26 Discussion

Actual exam question from the Salesforce Certified MuleSoft Platform Integration Architect (Mule-Arch-202) exam
Question #: 26
Topic #: 6
[All Salesforce Certified MuleSoft Platform Integration Architect (Mule-Arch-202) Questions]

An organization is designing an integration solution to replicate financial transaction data from a legacy system into a data warehouse (DWH).

The DWH must contain a daily snapshot of financial transactions, to be delivered as a CSV file. Daily transaction volume exceeds tens of millions of records, with significant spikes in volume during popular shopping periods.

What is the most appropriate integration style for an integration solution that meets the organization's current requirements?

A) Event-driven architecture
B) Microservice architecture
C) API-led connectivity
D) Batch-triggered ETL

Suggested Answer: A
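For context on the batch-vs-event debate in the comments below, this is a minimal sketch of the batch-triggered ETL style several commenters favor: stream transaction records from the source in pages and append them to a dated CSV snapshot, so tens of millions of rows never need to sit in memory at once. All names here (`fetch_transactions`, `write_daily_snapshot`, the field list) are illustrative, not part of the exam question.

```python
import csv
import datetime
import os


def fetch_transactions(batch_size=1000):
    """Stand-in for a paged read from the legacy system.

    A real job would page through a database or file export;
    here we yield a tiny in-memory sample in batches.
    """
    sample = [
        {"txn_id": 1, "amount": "19.99", "currency": "USD"},
        {"txn_id": 2, "amount": "5.00", "currency": "EUR"},
        {"txn_id": 3, "amount": "120.50", "currency": "USD"},
    ]
    for i in range(0, len(sample), batch_size):
        yield sample[i:i + batch_size]


def write_daily_snapshot(out_dir, run_date=None):
    """Write one dated CSV snapshot, streaming batch by batch."""
    run_date = run_date or datetime.date.today()
    path = os.path.join(out_dir, f"transactions_{run_date:%Y%m%d}.csv")
    fields = ["txn_id", "amount", "currency"]
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=fields)
        writer.writeheader()
        for batch in fetch_transactions():
            writer.writerows(batch)  # constant memory per batch
    return path
```

Because the whole snapshot is produced in one scheduled run, volume spikes only lengthen the run rather than requiring always-on real-time capacity, which is the usual argument for batch ETL over event-driven delivery when the requirement is a daily file.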

Contribute your Thoughts:

Mirta
2 months ago
Definitely leaning towards Batch-triggered ETL for efficiency!
upvoted 0 times
...
Trinidad
2 months ago
Wait, are we sure tens of millions of records can be processed daily?
upvoted 0 times
...
Katina
3 months ago
Event-driven architecture might be overkill for this use case.
upvoted 0 times
...
Youlanda
3 months ago
Sounds like Batch-triggered ETL is the way to go for daily snapshots.
upvoted 0 times
...
Heike
3 months ago
I think API-led connectivity could handle the spikes better.
upvoted 0 times
...
Zana
3 months ago
Batch-triggered ETL seems to align with the requirement for a daily CSV file, but I wonder if it can handle the spikes effectively.
upvoted 0 times
...
Yoko
3 months ago
I practiced a similar question where API-led connectivity was mentioned, but I'm not convinced it's the best option for daily snapshots.
upvoted 0 times
...
Dorthy
4 months ago
I'm not entirely sure, but I feel like event-driven architecture could handle spikes in transaction volume better.
upvoted 0 times
...
Johnna
4 months ago
I remember studying about batch processing for large volumes of data, so I think Batch-triggered ETL might be a good fit here.
upvoted 0 times
...
Gracia
4 months ago
Event-driven architecture seems like an interesting option, but I'm not sure it's the most efficient way to handle the scale and frequency of the data updates. I'd lean more towards a batch-triggered ETL approach.
upvoted 0 times
...
Leigha
4 months ago
Based on the details provided, I think a batch-triggered ETL solution would be the most appropriate. It can handle the high volume and spikes, and the daily snapshot requirement fits well with a batch process.
upvoted 0 times
...
Golda
4 months ago
I'm a bit confused here. Microservices or API-led connectivity could also work, but I'd need to understand more about the legacy system and the data warehouse requirements.
upvoted 0 times
...
Alise
5 months ago
Hmm, not sure about that. With the daily snapshot requirement, an event-driven architecture might be a better fit to capture the changes in real-time.
upvoted 0 times
...
Penney
5 months ago
This looks like a classic data integration problem. I'd probably go with a batch-triggered ETL approach to handle the high transaction volume and spikes.
upvoted 0 times
...
Erinn
10 months ago
API-led connectivity? More like API-led confusion if you ask me. Batch ETL all the way, folks. It's the classic choice for a reason.
upvoted 0 times
Gwen
9 months ago
Event-driven architecture might be too complex for this scenario; batch ETL seems like the safer bet.
upvoted 0 times
...
Louvenia
9 months ago
I agree, it's a reliable and efficient choice for this kind of integration.
upvoted 0 times
...
Clare
10 months ago
Batch ETL is definitely the way to go for handling large volumes of data.
upvoted 0 times
...
...
Deeanna
10 months ago
I'm not sure why anyone would even consider microservices for this use case. That's like trying to build a skyscraper with toothpicks!
upvoted 0 times
Tommy
9 months ago
D) Batch-triggered ETL
upvoted 0 times
...
Reynalda
9 months ago
C) API-led connectivity
upvoted 0 times
...
Miles
9 months ago
A) Event-driven architecture
upvoted 0 times
...
...
Irma
10 months ago
Hmm, I don't think an event-driven architecture would work well here. The high transaction volume and need for a daily snapshot make a batch-triggered ETL the logical choice in my opinion.
upvoted 0 times
...
Cordelia
11 months ago
I'm leaning towards option D. With millions of records and spikes in volume, a batch-triggered ETL process seems like the most efficient way to handle the data load without risking performance issues.
upvoted 0 times
Mitsue
9 months ago
I agree with you. Option D seems like the best choice for handling such a large volume of data efficiently.
upvoted 0 times
...
Kallie
9 months ago
D) Batch-triggered ETL
upvoted 0 times
...
Matthew
10 months ago
C) API-led connectivity
upvoted 0 times
...
Pamella
10 months ago
B) Microservice architecture
upvoted 0 times
...
Nathalie
10 months ago
A) Event-driven architecture
upvoted 0 times
...
...
Malcom
11 months ago
That's a good point, but wouldn't Microservice architecture also be able to handle the daily snapshot requirement efficiently?
upvoted 0 times
...
Vincent
11 months ago
I disagree, I believe Event-driven architecture would be better for handling spikes in volume.
upvoted 0 times
...
Malcom
11 months ago
I think the most appropriate integration style would be Batch-triggered ETL.
upvoted 0 times
...
