
Amazon Exam SCS-C02 Topic 7 Question 19 Discussion

Actual exam question for Amazon's SCS-C02 exam
Question #: 19
Topic #: 7

A company's data scientists want to create AI/ML training models using Amazon SageMaker. The training models will use large datasets in an Amazon S3 bucket. The datasets contain sensitive information. On average, the data scientists need 30 days to train models. The S3 bucket has been secured appropriately. The company's data retention policy states that all data older than 45 days must be removed from the S3 bucket.

Which solution will meet these requirements?

A) Configure an S3 Lifecycle rule on the S3 bucket to delete objects after 45 days.
B) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an S3 event notification to invoke the Lambda function for each PutObject operation.
C) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an Amazon EventBridge rule to invoke the Lambda function each month.
D) Configure S3 Intelligent-Tiering on the S3 bucket to automatically transition objects to another storage class.
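As a rough illustration of how the 45-day removal requirement can be expressed declaratively, here is a minimal sketch of an S3 Lifecycle expiration rule built and applied with boto3. The bucket name is a placeholder, not from the question:

```python
def expiration_rule(days: int = 45) -> dict:
    """Build an S3 Lifecycle configuration that expires all objects after `days` days."""
    return {
        "Rules": [
            {
                "ID": f"delete-after-{days}-days",
                "Status": "Enabled",
                "Filter": {},  # empty filter applies the rule to every object in the bucket
                "Expiration": {"Days": days},
            }
        ]
    }

# Applying it (requires boto3 and AWS credentials; bucket name is hypothetical):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="training-datasets",
#     LifecycleConfiguration=expiration_rule(),
# )
```

With such a rule in place, S3 itself deletes expired objects without any custom code to schedule or maintain.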

Suggested Answer: C

The suggested answer pairs an AWS Lambda function with an Amazon EventBridge rule: the function checks the last-modified date of each object in the S3 bucket and deletes objects that are older than 45 days, and the EventBridge rule invokes the function on a monthly schedule. Because the data scientists need only about 30 days to train their models, removing data after 45 days satisfies the company's retention policy without interrupting training. Note that an S3 Lifecycle rule that expires objects after 45 days (option A) achieves the same outcome natively, a point several commenters raise below.
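For readers who want to see what the suggested answer might look like in practice, here is a minimal sketch of such a Lambda handler, assuming a hypothetical bucket name of `training-datasets`; an EventBridge schedule rule would invoke it monthly:

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 45  # from the company's data retention policy


def is_expired(last_modified, now=None):
    """Return True if an object's last-modified date is past the 45-day window."""
    now = now or datetime.now(timezone.utc)
    return last_modified < now - timedelta(days=RETENTION_DAYS)


def lambda_handler(event, context):
    """Scan the bucket and delete objects older than the retention window.

    Intended to be invoked by an EventBridge schedule rule.
    The bucket name below is a placeholder, not from the question.
    """
    import boto3  # provided by the Lambda runtime

    s3 = boto3.client("s3")
    deleted = 0
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket="training-datasets"):
        for obj in page.get("Contents", []):
            if is_expired(obj["LastModified"]):
                s3.delete_object(Bucket="training-datasets", Key=obj["Key"])
                deleted += 1
    return {"deleted": deleted}
```

One caveat commenters might weigh: with a monthly schedule, an object could sit for up to roughly 75 days before the next run deletes it, whereas an S3 Lifecycle rule expires each object on its own 45-day clock.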


Contribute your Thoughts:

Cathrine
13 days ago
Option F: Automate the process with a 'delete-45-day-old-data' Lambda function. That's 'S3riously' the way to go!
upvoted 0 times
...
Ashlyn
15 days ago
Option E: Delete the entire S3 bucket and start fresh every 45 days. Who needs data, anyway?
upvoted 0 times
...
Amina
16 days ago
Option D sounds fancy, but I'm not sure if the company's budget can handle the extra costs of Intelligent-Tiering.
upvoted 0 times
...
Viola
2 months ago
Option C is a neat solution, but I'm not sure if it's the most efficient way to manage the data. What if the training needs to be done more frequently?
upvoted 0 times
Lisha
23 hours ago
A) But what if the training needs to be done more frequently?
upvoted 0 times
...
Arlean
8 days ago
C) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an Amazon EventBridge rule to invoke the Lambda function each month.
upvoted 0 times
...
Charlena
10 days ago
B) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an S3 event notification to invoke the Lambda function for each PutObject operation.
upvoted 0 times
...
Denae
1 month ago
A) Configure an S3 Lifecycle rule on the S3 bucket to delete objects after 45 days.
upvoted 0 times
...
Keshia
1 month ago
D) Configure S3 Intelligent-Tiering on the S3 bucket to automatically transition objects to another storage class.
upvoted 0 times
...
Vilma
1 month ago
B) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an S3 event notification to invoke the Lambda function for each PutObject operation.
upvoted 0 times
...
...
Vernell
2 months ago
Option B looks good, but creating an S3 event notification for each PutObject operation could get expensive if there are a lot of uploads.
upvoted 0 times
...
Brett
2 months ago
Hmm, that's a good point. Option B does offer more automation which can be helpful in the long run.
upvoted 0 times
...
Argelia
2 months ago
Option A seems like the easiest solution, but it may not be flexible enough if the training needs change in the future.
upvoted 0 times
...
Kenny
2 months ago
I disagree, I believe option B is better. It provides more control and automation.
upvoted 0 times
...
Brett
2 months ago
I think option A is the best choice. It's simple and straightforward.
upvoted 0 times
...
