Welcome to Pass4Success
Salesforce Certified Marketing Cloud Email Specialist (MC-202) Exam - Topic 7 Question 71 Discussion

Actual exam question from the Salesforce Certified Marketing Cloud Email Specialist (MC-202) exam
Question #: 71
Topic #: 7

Northern Trail Outfitters historically received one bulk data file per day from a vendor in its Marketing Cloud SFTP location. The vendor is updating its sending cadence and will deliver files over approximately eight hours throughout the day. The files will keep the same naming convention and include a timestamp.

Which update should be implemented to the automation to process the files as they are received while minimizing network impact?

A) Replace the Schedule with a File Drop and use a filename pattern.
B) Implement an API to start the automation with every file transfer.
C) Replicate the automation and schedule the copies to execute throughout the day.

Suggested Answer: B
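For context on the File Drop approach debated below: a File Drop starting source watches the SFTP folder and fires the automation for each file matching a filename pattern. A minimal sketch of how such a pattern would match the vendor's timestamped files (the filenames and the `vendor_daily_*` convention are hypothetical):

```python
from fnmatch import fnmatch

# Hypothetical naming convention: fixed base name plus a timestamp.
pattern = "vendor_daily_*.csv"

incoming = [
    "vendor_daily_20260101_0830.csv",  # morning delivery
    "vendor_daily_20260101_1615.csv",  # afternoon delivery
    "other_feed_20260101.csv",         # unrelated file, ignored
]

# Each arriving file is checked against the pattern;
# every match would kick off one automation run.
matches = [name for name in incoming if fnmatch(name, pattern)]
```

Because each matching file starts its own run, processing happens as files land, with no polling schedule and no extra network calls.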

Contribute your Thoughts:

Ethan
3 months ago
Not sure if any of these will really minimize network load.
upvoted 0 times
...
Erinn
3 months ago
Option C sounds like overkill for this situation.
upvoted 0 times
...
Virgina
4 months ago
Surprised they’re changing the cadence! Hope it works out.
upvoted 0 times
...
Adell
4 months ago
I disagree, option B seems more efficient with the API.
upvoted 0 times
...
Stevie
4 months ago
I think option A is the best choice for file drops.
upvoted 0 times
...
Timothy
4 months ago
Replicating the automation sounds like it could work, but it might not be efficient. I’m leaning towards A, but I’m not completely confident.
upvoted 0 times
...
Glen
4 months ago
I feel like we practiced a similar question where we had to choose between scheduling and file drops. I think A is definitely the way to go.
upvoted 0 times
...
Thea
5 months ago
I'm not entirely sure, but I think using an API for every file transfer could create too much overhead. Option B seems risky.
upvoted 0 times
...
Novella
5 months ago
I remember we discussed file drop options in class, so I think option A might be the right choice to handle the files as they come in.
upvoted 0 times
...
Ivan
5 months ago
This is a tricky one, but I think option A is the way to go. Replacing the Schedule with a File Drop and using a filename pattern seems like the most efficient way to handle the new file delivery schedule while minimizing network impact. The other options seem a bit more complex and potentially less effective.
upvoted 0 times
...
Rossana
5 months ago
I'm leaning towards option B - implementing an API to start the automation with each file transfer. That way, we can react to the files as they come in without having to worry about the network load. But I'm not 100% sure if that's the best approach, so I'll need to think about it some more.
upvoted 0 times
...
Sarina
5 months ago
Okay, I think I've got this. The question is asking us to choose the best way to update the automation to handle the new file delivery schedule. Based on the information provided, I'd go with option A - replacing the Schedule with a File Drop and using a filename pattern. That way, the automation can process the files as they come in without having to wait for a scheduled run.
upvoted 0 times
...
Jackie
5 months ago
Hmm, I'm a bit confused by the question. It sounds like the vendor is changing the way they deliver the files, but I'm not sure exactly what the best solution is. I'll need to think this through carefully.
upvoted 0 times
...
Juliana
5 months ago
This seems like a straightforward question about optimizing file processing. I think the key is to find a way to process the files as they come in without overloading the network.
upvoted 0 times
...
Gladis
10 months ago
Option C? Really, Anglea? That's like using a sledgehammer to crack a nut. Just keep it simple with the file drop, folks. Less is more when it comes to marketing automation.
upvoted 0 times
Markus
9 months ago
B) Implement an API to start automation with every file transfer.
upvoted 0 times
...
Juan
9 months ago
A) Replace the Schedule with File Drop and use a filename pattern
upvoted 0 times
...
...
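For anyone weighing option B: the external trigger boils down to an authenticated POST against an automation-start REST endpoint after each SFTP upload. A hedged sketch only; the subdomain, endpoint path, and IDs below are assumptions, so verify them against the current Marketing Cloud REST API reference before relying on them:

```python
def build_start_url(subdomain: str, automation_id: str) -> str:
    """Build the (assumed) REST URL that starts one automation run."""
    return (
        f"https://{subdomain}.rest.marketingcloudapis.com"
        f"/automation/v1/automations/{automation_id}/actions/start"
    )

# The vendor-side process would POST to this URL with an OAuth bearer
# token after each file upload, e.g. with any HTTP client:
#   requests.post(build_start_url("mc-sub", "auto-guid"),
#                 headers={"Authorization": f"Bearer {token}"})
```

This reacts per file just like a File Drop does, which is the trade-off the thread debates: same responsiveness, but with an auth handshake and an extra API call per transfer.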
Anglea
10 months ago
I'm going to have to go with Option C on this one. Eight hours of file drops sounds like a nightmare to manage. Replicating the automation and spreading it out across the day seems like the most elegant solution.
upvoted 0 times
Monroe
10 months ago
Exactly, Option C will help minimize network traffic and make the process more manageable.
upvoted 0 times
...
Maia
10 months ago
It definitely beats having to manage file drops every hour throughout the day.
upvoted 0 times
...
Flo
10 months ago
I agree, spreading out the automation execution over eight hours seems like the most efficient solution.
upvoted 0 times
...
Rana
10 months ago
Option C sounds like the best choice here.
upvoted 0 times
...
...
Benedict
10 months ago
Option B with an API call might work, but that feels a bit overkill for this scenario. I'd go with the file drop approach and save the API option for a more complex use case.
upvoted 0 times
...
Lizbeth
10 months ago
But option C could also work by replicating the automation and scheduling it to execute every eight hours.
upvoted 0 times
...
Samira
10 months ago
I agree with Sherman. The file drop approach sounds like the most efficient solution, especially given the new delivery schedule from the vendor. It should help minimize network usage and keep the process running smoothly.
upvoted 0 times
Elbert
9 months ago
I agree, that sounds like the best way to handle the new delivery schedule and minimize network usage.
upvoted 0 times
...
France
9 months ago
I think we should go with option A) Replace the Schedule with File Drop and use a filename pattern.
upvoted 0 times
...
...
Tamekia
10 months ago
I disagree, I believe option B is more efficient as it starts an automation with every file transfer using an API.
upvoted 0 times
...
Sherman
11 months ago
Option A seems like the obvious choice here. Using a File Drop and a filename pattern would allow the automation to process the files as they come in without the need for constant polling or API calls.
upvoted 0 times
Steffanie
10 months ago
Great, so it looks like we're all in agreement that Option A is the best update to implement for processing the files from the vendor.
upvoted 0 times
...
Cecilia
10 months ago
I agree. It's a more efficient way to handle the incoming files without putting too much strain on the network.
upvoted 0 times
...
Gail
10 months ago
That makes sense. It would definitely streamline the process and minimize network usage.
upvoted 0 times
...
...
Lizbeth
11 months ago
I think option A is the best choice because it allows for automation based on the filename pattern.
upvoted 0 times
...
