A customer has an ongoing scheduled batch dataflow (S3 source connector) that runs every 8 hours, starting at 2 PM UTC. The customer has requested a schedule update: change the start time to 3 PM UTC and run every 6 hours. What is the best solution to achieve this?
In Adobe Experience Platform, the dataflow user interface (UI) currently has limited support for modifying an active dataflow's schedule once it has been established. To update the start time and frequency of an existing batch dataflow without deleting it and losing historical context, the Flow Service API must be used.
By sending a PATCH request to the /flows/{flowId} endpoint, a developer can update the schedule settings within the dataflow's JSON definition. This approach is superior to Options A and C because it preserves the existing dataflow ID and configuration, avoiding the double ingestion or gaps that can occur when a new dataflow is created. Option B is incorrect because the current UI allows only limited schedule edits (often frequency, but not the base start time for certain connectors). Using the Flow Service API is the most efficient and cleanest solution: the S3 source connector continues to run with the updated requirements of 6-hour intervals starting at 3 PM UTC, while all lineage and monitoring history is preserved.
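As a rough illustration, the PATCH body for the Flow Service API is an array of JSON Patch operations targeting the dataflow's schedule fields. The sketch below builds such a payload in Python; the field paths (`/scheduleParams/startTime`, `/scheduleParams/frequency`, `/scheduleParams/interval`), the epoch timestamp, and the assumption that `startTime` is expressed in Unix epoch seconds should all be verified against the current Flow Service API documentation before use.

```python
import json

def build_schedule_patch(start_time_epoch: int, interval_hours: int) -> list:
    """Build a JSON Patch body for updating a dataflow schedule.

    The Flow Service API accepts an array of JSON Patch operations;
    the paths below assume the documented scheduleParams fields.
    """
    return [
        # New base start time (assumed to be Unix epoch seconds, UTC)
        {"op": "replace", "path": "/scheduleParams/startTime",
         "value": start_time_epoch},
        # Frequency unit plus interval: run every 6 hours
        {"op": "replace", "path": "/scheduleParams/frequency",
         "value": "hour"},
        {"op": "replace", "path": "/scheduleParams/interval",
         "value": interval_hours},
    ]

# Hypothetical epoch value corresponding to a 3 PM UTC start
patch_body = build_schedule_patch(start_time_epoch=1721055600,
                                  interval_hours=6)
print(json.dumps(patch_body, indent=2))
```

This payload would then be sent as a PATCH to `/flows/{flowId}` with the usual Platform authentication headers and an `If-Match` header carrying the dataflow's current etag (the API rejects updates without it, which guards against concurrent edits).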