
Amazon SCS-C02 Exam - Topic 6 Question 27 Discussion

Actual exam question for Amazon's SCS-C02 exam
Question #: 27
Topic #: 6

A company's data scientists want to create artificial intelligence and machine learning (AI/ML) training models by using Amazon SageMaker. The training models will use large datasets in an Amazon S3 bucket. The datasets contain sensitive information.

On average, the data scientists need 30 days to train models. The S3 bucket has been secured appropriately. The company's data retention policy states that all data that is older than 45 days must be removed from the S3 bucket.

Which action should a security engineer take to enforce this data retention policy?

Suggested Answer: B

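For context on the approaches debated below, an S3 Lifecycle expiration rule (option A in the thread) is expressed as a small configuration document. The sketch below is illustrative, not the exam's official wording; the bucket name and rule ID are hypothetical, and applying it would use boto3's `put_bucket_lifecycle_configuration` with real AWS credentials:

```python
# Sketch of an S3 Lifecycle configuration that expires (deletes) objects
# 45 days after creation, matching the retention policy in the question.
# The rule ID is an illustrative placeholder, not from the source.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "delete-after-45-days",  # hypothetical rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},      # empty prefix = all objects in the bucket
            "Expiration": {"Days": 45},    # delete 45 days after object creation
        }
    ]
}

# Applying it would look roughly like this (needs AWS credentials; not run here):
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="training-datasets-bucket",  # hypothetical bucket name
#     LifecycleConfiguration=lifecycle_configuration,
# )
```

Because expiration is evaluated by S3 itself, this approach needs no custom compute and scales with the bucket, which is why several commenters below favor it over a Lambda-based check.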


Contribute your Thoughts:

Casie
3 months ago
Wait, can you really rely on Lifecycle rules alone for sensitive data?
upvoted 0 times
...
Ceola
3 months ago
B sounds more flexible with the Lambda function.
upvoted 0 times
...
Hyman
3 months ago
I’m not so sure about A, what if there are exceptions needed?
upvoted 0 times
...
Ricarda
4 months ago
I agree, A is straightforward and efficient!
upvoted 0 times
...
Barabara
4 months ago
Option A is the simplest way to enforce the retention policy.
upvoted 0 times
...
Garry
4 months ago
I vaguely remember something about S3 Intelligent-Tiering, but it doesn't seem to directly address the retention policy. I think that rules out option D for me.
upvoted 0 times
...
Isreal
4 months ago
I practiced a similar question where we had to manage data retention, and I think the Lifecycle rule is definitely the most straightforward approach. Option A feels like the best fit here.
upvoted 0 times
...
Keith
4 months ago
I'm not entirely sure, but I think using a Lambda function for checking last-modified dates could be more flexible. Options B and C both sound reasonable, but I can't recall which one is better.
upvoted 0 times
...
Pedro
5 months ago
I remember we discussed S3 Lifecycle rules in class, and they seem like the simplest way to handle data retention. Option A might be the right choice.
upvoted 0 times
...
Lea
5 months ago
I'm a little confused by the options. The Lambda function in option B and C both seem like they could work, but I'm not sure if the S3 event notification or the EventBridge rule is the better approach. I'll need to review the pros and cons of each.
upvoted 0 times
...
Ricki
5 months ago
Okay, I think I've got it. The S3 Lifecycle rule in option A looks like the simplest and most straightforward solution. It will automatically delete objects after 45 days, which should work well with the 30-day training requirement.
upvoted 0 times
...
Nu
5 months ago
Hmm, I'm a bit unsure about this one. The question mentions that the data scientists need 30 days to train the models, so I'm wondering if we need to account for that in the solution. I'll have to think this through carefully.
upvoted 0 times
...
Diane
5 months ago
This seems like a straightforward question about enforcing a data retention policy on an S3 bucket. I think the key is to find the most efficient and automated way to delete objects older than 45 days.
upvoted 0 times
...
Mona
5 months ago
I'm leaning towards option A with the S3 Lifecycle rule. It seems like the most reliable and low-maintenance solution, and it should meet the data retention policy requirements without any additional manual intervention. I think I'll go with that one.
upvoted 0 times
...
Katy
10 months ago
Option E: Hire a team of highly trained squirrels to scurry around the S3 bucket and manually delete the old files. They'd probably do it faster than any AWS service!
upvoted 0 times
Maricela
9 months ago
That's a funny idea, but I think using AWS Lambda function is a more efficient way to enforce the data retention policy.
upvoted 0 times
...
Merissa
9 months ago
A) Configure an S3 Lifecycle rule on the S3 bucket to delete objects after 45 days.
upvoted 0 times
...
Juliann
9 months ago
B) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an S3 event notification to invoke the Lambda function for each PutObject operation.
upvoted 0 times
...
...
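For comparison, the core of option B's Lambda check is a last-modified date comparison against the 45-day window. The sketch below shows only that comparison; the function name and dates are hypothetical, and a real handler would also page through the bucket with `list_objects_v2` and delete expired keys with `delete_objects`:

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 45  # from the company's retention policy in the question


def is_expired(last_modified, now=None):
    """Return True if an object's age exceeds the 45-day retention window."""
    if now is None:
        now = datetime.now(timezone.utc)
    return now - last_modified > timedelta(days=RETENTION_DAYS)


# Illustrative check: an object uploaded 50 days ago is past retention,
# while one from a typical 30-day training run is not.
now = datetime(2024, 3, 1, tzinfo=timezone.utc)
old_obj = now - timedelta(days=50)
fresh_obj = now - timedelta(days=30)
print(is_expired(old_obj, now))    # True
print(is_expired(fresh_obj, now))  # False
```

Note the wrinkle raised by Lai further down: wiring this to an S3 event notification means the function fires on every PutObject, at which point a freshly written object is by definition not yet 45 days old, so the event-driven variant adds cost without catching expirations on its own.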
Albina
10 months ago
Option D is interesting, but I'm not sure if S3 Intelligent-Tiering is the right choice here. We need to make sure the data is completely deleted, not just transitioned to a different storage class.
upvoted 0 times
Chery
9 months ago
Yeah, we need to ensure the data is completely removed from the S3 bucket, so option A is the way to go.
upvoted 0 times
...
Altha
9 months ago
I agree, that seems like the most straightforward solution to enforce the data retention policy.
upvoted 0 times
...
Kimbery
9 months ago
I think option A is the best choice. It will automatically delete objects after 45 days.
upvoted 0 times
...
...
Darrin
10 months ago
Haha, Option C is like hiring a personal assistant to clean up your room every month. Seems a bit overkill for this use case.
upvoted 0 times
Walker
8 months ago
Agreed, keeping it simple with Option A is the way to go.
upvoted 0 times
...
Johanna
9 months ago
Option B could work too, but it does seem a bit complex for this scenario.
upvoted 0 times
...
Nan
9 months ago
Yeah, setting up an S3 Lifecycle rule would automate the process efficiently.
upvoted 0 times
...
Franchesca
10 months ago
Option A seems like the most straightforward solution.
upvoted 0 times
...
...
Lai
10 months ago
I'm not sure if Option B is the best approach. Invoking a Lambda function for every PutObject operation could get expensive and might not scale well.
upvoted 0 times
Hermila
10 months ago
Yeah, I agree. It's a more straightforward approach compared to constantly invoking a Lambda function for every PutObject operation.
upvoted 0 times
...
Cristy
10 months ago
I think Option A is the best choice. Setting up an S3 Lifecycle rule to delete objects after 45 days seems like a simple and effective solution.
upvoted 0 times
...
...
Christa
10 months ago
I see both sides, but I think option C is a good compromise. It automates the process while also being scheduled monthly.
upvoted 0 times
...
Na
11 months ago
I disagree, I believe option B is more efficient. Using a Lambda function to automatically delete old objects seems like a better solution.
upvoted 0 times
...
Aretha
11 months ago
Option A seems like the simplest and most straightforward solution. Why complicate things with Lambda functions and EventBridge when we can just set a lifecycle rule?
upvoted 0 times
Sherron
9 months ago
Yeah, keeping it simple with a lifecycle rule is definitely the way to go.
upvoted 0 times
...
Donte
9 months ago
Using Lambda functions and EventBridge might be overcomplicating things in this scenario.
upvoted 0 times
...
Nan
10 months ago
I agree, setting a lifecycle rule on the S3 bucket to delete objects after 45 days is the most efficient way to enforce the data retention policy.
upvoted 0 times
...
Bette
10 months ago
Option A seems like the simplest and most straightforward solution.
upvoted 0 times
...
...
Yuette
11 months ago
I think option A is the best choice. It's simple and directly enforces the data retention policy.
upvoted 0 times
...
