Welcome to Pass4Success


Amazon SCS-C02 Exam - Topic 7 Question 19 Discussion

Actual exam question for Amazon's SCS-C02 exam
Question #: 19
Topic #: 7
[All SCS-C02 Questions]

A company's data scientists want to create AI/ML training models using Amazon SageMaker. The training models will use large datasets in an Amazon S3 bucket. The datasets contain sensitive information. On average, the data scientists need 30 days to train models. The S3 bucket has been secured appropriately. The company's data retention policy states that all data older than 45 days must be removed from the S3 bucket.

Which solution will meet these requirements?

A) Configure an S3 Lifecycle rule on the S3 bucket to delete objects after 45 days.
B) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an S3 event notification to invoke the Lambda function for each PutObject operation.
C) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an Amazon EventBridge rule to invoke the Lambda function each month.
D) Configure S3 Intelligent-Tiering on the S3 bucket to automatically transition objects to another storage class.

Suggested Answer: A

An S3 Lifecycle rule with an Expiration action is the correct way to enforce this retention policy. A rule configured to expire objects after 45 days automatically and permanently deletes every object 45 days after it was created, with no custom code to write, schedule, or maintain. The 30-day training window is unaffected, because objects remain available well past the point at which training completes. The other options fall short: B evaluates each object only at upload time (PutObject), when no object is yet 45 days old; C's monthly EventBridge schedule can leave objects in the bucket for up to roughly 30 days past the 45-day limit, violating the policy between runs; and D's S3 Intelligent-Tiering only transitions objects between storage classes and never deletes them.
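For reference, here is a minimal sketch of what option A's rule looks like as a lifecycle configuration payload, in the shape accepted by boto3's `put_bucket_lifecycle_configuration`. The rule ID and bucket name are hypothetical placeholders, not values from the question:

```python
# Sketch of option A: an S3 Lifecycle rule that expires (deletes)
# objects 45 days after creation. Built as a plain dict so it can be
# inspected locally; applying it requires an AWS call (shown commented).
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "delete-after-45-days",   # hypothetical rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},       # empty prefix: apply to every object
            "Expiration": {"Days": 45},     # S3 deletes objects 45 days after creation
        }
    ]
}

# To apply the rule with boto3 (not executed here):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="training-datasets-bucket",   # hypothetical bucket name
#     LifecycleConfiguration=lifecycle_configuration,
# )
```

Because expiration runs as a managed background process inside S3, there is no Lambda function, schedule, or error handling to own, which is what makes this the lowest-overhead fit for a hard retention deadline.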


Contribute your Thoughts:

Carman
3 months ago
D won't help with deletion, just transitions.
upvoted 0 times
...
Eura
3 months ago
Wait, why not just use A? Seems like the easiest way.
upvoted 0 times
...
Dick
3 months ago
C sounds good, but do we really need a monthly check?
upvoted 0 times
...
Shaun
4 months ago
I think B is overcomplicating things.
upvoted 0 times
...
Lili
4 months ago
Option A is the simplest solution!
upvoted 0 times
...
Rory
4 months ago
I feel like the Intelligent-Tiering option is not really related to the retention policy we need to enforce.
upvoted 0 times
...
Vilma
4 months ago
I practiced a similar question where we had to automate data deletion, and I think the Lifecycle rule is definitely the right choice here.
upvoted 0 times
...
Aimee
4 months ago
I'm not entirely sure, but I think using a Lambda function might be overkill for this situation.
upvoted 0 times
...
Macy
5 months ago
I remember we discussed S3 Lifecycle rules in class, and they seem like the simplest way to handle data retention.
upvoted 0 times
...
Jaime
5 months ago
This is a good one. I think option A, setting up an S3 Lifecycle rule, is the simplest and most straightforward solution. It's automated and doesn't require any additional AWS services. Unless there's a specific reason why that wouldn't work, I'm leaning towards that.
upvoted 0 times
...
Yaeko
5 months ago
I'm a little confused by the options. Do we need to consider the 30-day training time for the models? Or is the focus solely on the data retention policy? I want to make sure I understand the requirements fully before selecting an answer.
upvoted 0 times
...
Kara
5 months ago
Okay, I've got this. The answer is clearly option B - creating a Lambda function to check the last-modified date and delete objects older than 45 days, triggered by an S3 event notification. That way, the deletion happens automatically without any manual intervention.
upvoted 0 times
...
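For anyone weighing the Lambda-based options (B and C), the core filtering logic such a function would need can be sketched as below. The function and variable names are hypothetical, and a real Lambda would call `list_objects_v2` and `delete_objects` instead of taking a prebuilt list:

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 45

def objects_to_delete(objects, now=None):
    """Return keys of objects whose LastModified is more than
    RETENTION_DAYS in the past. `objects` mimics the 'Contents'
    entries returned by S3 list_objects_v2: dicts carrying a
    'Key' and a timezone-aware 'LastModified' datetime."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [obj["Key"] for obj in objects if obj["LastModified"] < cutoff]

# Example against a fake listing: only the 60-day-old object qualifies.
now = datetime(2025, 3, 1, tzinfo=timezone.utc)
listing = [
    {"Key": "old.csv", "LastModified": now - timedelta(days=60)},
    {"Key": "fresh.csv", "LastModified": now - timedelta(days=10)},
]
```

Note the timing problem with a monthly trigger (option C): an object that crosses the 45-day mark just after one run is not deleted until the next run, up to about 30 days later, so data can persist for roughly 75 days. That gap is why a Lifecycle rule, which S3 evaluates continuously, is the cleaner fit for the policy.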
Laurel
5 months ago
This seems like a straightforward question about managing data retention in an S3 bucket. I think the key is to find the most efficient and automated way to delete objects older than 45 days.
upvoted 0 times
...
Dominga
5 months ago
Hmm, I'm a bit unsure about this one. There are a few options presented, and I'm not sure which one is the best approach. I'll need to think it through carefully.
upvoted 0 times
...
Heidy
5 months ago
Okay, let's see here. The question is asking what the added line of code achieves, so I'll need to analyze that specific line and how it might impact the search functionality.
upvoted 0 times
...
Lynna
5 months ago
Okay, I've got this. The key is to establish a quality review process for the project schedules to make sure they're properly baselined and resource-leveled. That will help ensure the overall program meets its goals.
upvoted 0 times
...
Cathrine
9 months ago
Option F: Automate the process with a 'delete-45-day-old-data' Lambda function. That's 'S3riously' the way to go!
upvoted 0 times
...
Ashlyn
9 months ago
Option E: Delete the entire S3 bucket and start fresh every 45 days. Who needs data, anyway?
upvoted 0 times
...
Amina
9 months ago
Option D sounds fancy, but I'm not sure if the company's budget can handle the extra costs of Intelligent-Tiering.
upvoted 0 times
Alisha
8 months ago
That's a good idea, let's make sure we're making the most cost-effective decision for the company.
upvoted 0 times
...
Melina
8 months ago
Maybe we can run some cost analysis to see if it's worth it in the long run.
upvoted 0 times
...
Shalon
8 months ago
I agree, we should weigh the benefits of Intelligent-Tiering against the potential extra costs.
upvoted 0 times
...
Claudia
9 months ago
Option D is definitely a great feature, but we need to consider the budget constraints.
upvoted 0 times
...
...
Viola
10 months ago
Option C is a neat solution, but I'm not sure if it's the most efficient way to manage the data. What if the training needs to be done more frequently?
upvoted 0 times
Lisha
9 months ago
A) But what if the training needs to be done more frequently?
upvoted 0 times
...
Arlean
9 months ago
C) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an Amazon EventBridge rule to invoke the Lambda function each month.
upvoted 0 times
...
Charlena
9 months ago
B) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an S3 event notification to invoke the Lambda function for each PutObject operation.
upvoted 0 times
...
Denae
10 months ago
A) Configure an S3 Lifecycle rule on the S3 bucket to delete objects after 45 days.
upvoted 0 times
...
Keshia
10 months ago
D) Configure S3 Intelligent-Tiering on the S3 bucket to automatically transition objects to another storage class.
upvoted 0 times
...
Vilma
10 months ago
B) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an S3 event notification to invoke the Lambda function for each PutObject operation.
upvoted 0 times
...
...
Vernell
10 months ago
Option B looks good, but creating an S3 event notification for each PutObject operation could get expensive if there are a lot of uploads.
upvoted 0 times
...
Brett
10 months ago
Hmm, that's a good point. Option B does offer more automation which can be helpful in the long run.
upvoted 0 times
...
Argelia
10 months ago
Option A seems like the easiest solution, but it may not be flexible enough if the training needs change in the future.
upvoted 0 times
...
Kenny
11 months ago
I disagree, I believe option B is better. It provides more control and automation.
upvoted 0 times
...
Brett
11 months ago
I think option A is the best choice. It's simple and straightforward.
upvoted 0 times
...
