
Amazon-DEA-C01 Exam - Topic 1 Question 26 Discussion

Actual exam question for Amazon's Amazon-DEA-C01 exam
Question #: 26
Topic #: 1

A company is building a new application that ingests CSV files into Amazon Redshift. The company has developed the frontend for the application.

The files are stored in an Amazon S3 bucket. Files are no larger than 5 MB.

A data engineer is developing the extract, transform, and load (ETL) pipeline for the CSV files. The data engineer has configured a Redshift cluster and an AWS Lambda function that copies the data from the files into the Redshift cluster.

Which additional steps should the data engineer perform to meet these requirements?

Suggested Answer: A

Option A is the most direct and operationally efficient way to trigger the existing Lambda-based load into Amazon Redshift whenever a new CSV file is uploaded to Amazon S3. The key requirement is event-driven automation on "Object Created" events, and the study material explicitly describes using Amazon EventBridge to react to S3 uploads by creating a rule on the default event bus and routing matching events to a target application.
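As a minimal sketch of what such a rule would match, the event pattern below filters the default event bus for S3 "Object Created" events on CSV objects. The bucket name is hypothetical, and the bucket must have EventBridge notifications enabled for these events to be emitted:

```python
import json

# EventBridge event pattern for S3 "Object Created" events on CSV
# uploads. The bucket name "csv-ingest-bucket" is a placeholder;
# the rule's target would be the existing Lambda load function.
event_pattern = {
    "source": ["aws.s3"],
    "detail-type": ["Object Created"],
    "detail": {
        "bucket": {"name": ["csv-ingest-bucket"]},
        # Suffix matching restricts the rule to .csv objects.
        "object": {"key": [{"suffix": ".csv"}]},
    },
}

print(json.dumps(event_pattern, indent=2))
```

This pattern would be supplied to an EventBridge rule (for example via `put_rule` in boto3 or the console), with the Lambda function registered as the rule's target.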

Compared with queue-based designs (Options B and D), Option A reduces the number of components: there is no need to manage an SQS queue, its batching and visibility-timeout behavior, or retry semantics at the consumer layer. This is especially appropriate because the files are small (no larger than 5 MB) and the Lambda function that performs the copy/load step is already implemented. AWS DMS (Option C) is not designed for "S3 object arrival → Lambda → Redshift" event triggering; it would introduce unnecessary services and operational work for a simple file-ingestion trigger.
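To make the copy/load step concrete, here is a hedged sketch of what the Lambda handler might look like. The table name and IAM role ARN are hypothetical, and the sketch assumes the function is invoked by an EventBridge S3 event and uses the Redshift Data API to issue a COPY:

```python
def build_copy_statement(bucket: str, key: str, table: str, iam_role: str) -> str:
    """Build a Redshift COPY statement that loads one CSV object from S3."""
    return (
        f"COPY {table} "
        f"FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV IGNOREHEADER 1;"
    )

def lambda_handler(event, context):
    # EventBridge S3 events carry the bucket and object key under "detail".
    bucket = event["detail"]["bucket"]["name"]
    key = event["detail"]["object"]["key"]

    # Target table and role ARN are placeholders for illustration.
    sql = build_copy_statement(
        bucket, key,
        table="staging.csv_rows",
        iam_role="arn:aws:iam::123456789012:role/redshift-copy-role",
    )

    # boto3 is imported lazily so build_copy_statement can be exercised
    # without the AWS SDK or credentials.
    import boto3
    client = boto3.client("redshift-data")
    client.execute_statement(
        ClusterIdentifier="my-redshift-cluster",   # placeholder
        Database="dev",                            # placeholder
        SecretArn="arn:aws:secretsmanager:...",    # placeholder
        Sql=sql,
    )
```

Because each file is at most 5 MB, a single synchronous COPY per object stays well within Lambda limits, which is part of why the extra queueing layer in Options B and D adds little value here.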

