
Amazon DVA-C02 Exam - Topic 2 Question 40 Discussion

Actual exam question for Amazon's DVA-C02 exam
Question #: 40
Topic #: 2

A developer is creating an AWS Lambda function that consumes messages from an Amazon Simple Queue Service (Amazon SQS) standard queue. The developer notices that the Lambda function processes some messages multiple times.

How should the developer resolve this issue MOST cost-effectively?

Suggested Answer: D

Amazon SQS standard queues:Provide at-least-once delivery, so occasional duplicate deliveries are expected by design and consumers must tolerate them.

Amazon SQS FIFO queues:Support exactly-once processing by deduplicating messages, either through an explicit message deduplication ID or content-based deduplication.

Visibility timeout:If the Lambda function's processing time exceeds the queue's visibility timeout, in-flight messages become visible again and are delivered a second time; the visibility timeout should be longer than the function timeout.

Idempotent processing:Designing the function so that reprocessing the same message has no additional effect makes any remaining duplicates harmless.


Amazon SQS standard queues:https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/standard-queues.html

Amazon SQS FIFO queues:https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/FIFO-queues.html

Using AWS Lambda with Amazon SQS:https://docs.aws.amazon.com/lambda/latest/dg/with-sqs.html
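As a sketch of the idempotent-consumer approach to the duplicate-processing scenario in this question, the handler below skips SQS message IDs it has already handled. The in-memory set and the `process` function are illustrative assumptions; a real function would persist seen IDs in a durable store such as DynamoDB.

```python
# Sketch of an idempotent SQS consumer for AWS Lambda.
# NOTE: the in-memory set is for illustration only; it survives just for
# the life of one execution environment. A production function would
# persist seen IDs in a durable store such as DynamoDB.

processed_ids = set()

def process(body):
    """Hypothetical business logic; replace with real work."""
    return f"handled: {body}"

def handler(event, context):
    results = []
    for record in event["Records"]:  # standard Lambda/SQS event shape
        message_id = record["messageId"]
        if message_id in processed_ids:
            continue  # duplicate delivery: skip without side effects
        results.append(process(record["body"]))
        processed_ids.add(message_id)
    return results
```

With this pattern a redelivered message is recognized by its `messageId` and dropped, so the business logic runs at most once per message within the tracked window.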

Contribute your Thoughts:

Elliott
3 months ago
Kinesis? Really? Seems like overkill for just message duplication.
upvoted 0 times
...
Kiley
3 months ago
Dead-letter queues are useful, but not for this specific issue.
upvoted 0 times
...
Tracie
3 months ago
Wait, can’t you just set the concurrency limit to 1 instead?
upvoted 0 times
...
Sylvia
4 months ago
I agree, FIFO queues handle deduplication automatically!
upvoted 0 times
...
Bettina
4 months ago
Switching to a FIFO queue is the best way to avoid duplicates.
upvoted 0 times
...
Keneth
4 months ago
Using Kinesis instead of SQS seems like a big change; I’m not confident that’s necessary just to fix the duplication problem.
upvoted 0 times
...
Leontine
4 months ago
I practiced a similar question where limiting Lambda concurrency helped reduce processing overlap. Maybe that's worth considering here?
upvoted 0 times
...
Ashley
4 months ago
I think setting a dead-letter queue might be a good way to handle failed messages, but it doesn't really solve the duplication issue directly.
upvoted 0 times
...
Elvera
5 months ago
I remember reading that switching to a FIFO queue could help with message duplication, but I'm not sure if that's the most cost-effective option.
upvoted 0 times
...
Jeannine
5 months ago
Changing the message processing to use Amazon Kinesis Data Streams instead of Amazon SQS seems like a more complex solution. I'll need to evaluate if that's necessary for this specific use case.
upvoted 0 times
...
Vivienne
5 months ago
Limiting the concurrency of the Lambda function to 1 might work, but I'm not sure if that's the most cost-effective solution. I'll need to consider the trade-offs.
upvoted 0 times
...
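Several commenters weigh limiting the function's concurrency. Reserved concurrency of 1 can be set with the AWS CLI as below (the function name is a placeholder); note that this throttles throughput rather than removing duplicates, since a standard queue still delivers at least once.

```shell
# Placeholder function name; requires valid AWS credentials.
aws lambda put-function-concurrency \
  --function-name my-sqs-consumer \
  --reserved-concurrent-executions 1
```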
Lillian
5 months ago
Setting up a dead-letter queue could be a good option to handle the unprocessed messages. I'll need to research how to properly configure that.
upvoted 0 times
...
Thersa
5 months ago
Hmm, I'm a bit confused on the differences between standard and FIFO queues. I'll need to review the details on those options to determine the best approach.
upvoted 0 times
...
Magnolia
5 months ago
I think the most cost-effective solution here would be to change the Amazon SQS standard queue to an Amazon SQS FIFO queue. This should help prevent the duplicate message processing.
upvoted 0 times
...
Lashaunda
5 months ago
I remember studying about crowd sourcing, and it could provide a lot of diverse testing environments, but it might lack control over the quality of testing.
upvoted 0 times
...
Ashton
10 months ago
Option A: Change the queue type? Ain't nobody got time for that!
upvoted 0 times
Toshia
9 months ago
C: Set the maximum concurrency limit of the AWS Lambda function to 1
upvoted 0 times
...
Beula
9 months ago
B: Set up a dead-letter queue.
upvoted 0 times
...
Winfred
9 months ago
A: Change the queue type? Ain't nobody got time for that!
upvoted 0 times
...
...
Renea
10 months ago
Option D is an interesting idea, but using Kinesis Data Streams might be overkill for this use case. It could end up being more complex and expensive than necessary.
upvoted 0 times
...
Bobbye
10 months ago
Limiting the concurrency to 1 in Option C might work, but it could impact the overall performance of the Lambda function. I'm not sure if it's the best approach.
upvoted 0 times
Reita
8 months ago
A: True, but changing the message processing to use Amazon Kinesis Data Streams could be another effective option to consider.
upvoted 0 times
...
Skye
9 months ago
B: Setting up a dead-letter queue could also help in handling the duplicate messages.
upvoted 0 times
...
Deeanna
9 months ago
A: I think changing the Amazon SQS standard queue to an Amazon SQS FIFO queue with deduplication ID might be the best solution.
upvoted 0 times
...
...
Lizbeth
10 months ago
I like the idea of setting up a dead-letter queue in Option B. That way we can identify and re-process the problematic messages.
upvoted 0 times
Chauncey
8 months ago
Setting up a dead-letter queue seems like the most practical solution to prevent processing messages multiple times.
upvoted 0 times
...
Allene
9 months ago
We should definitely consider implementing a dead-letter queue to improve message processing.
upvoted 0 times
...
Dominque
9 months ago
Let's go with Option B then. It seems like the most practical solution for our problem.
upvoted 0 times
...
Isaac
9 months ago
I agree, setting up a dead-letter queue is a cost-effective way to handle these issues.
upvoted 0 times
...
Phuong
10 months ago
I agree, setting up a dead-letter queue is a cost-effective way to handle the issue.
upvoted 0 times
...
Brandon
10 months ago
Option B sounds like a good solution. It will help us identify and re-process problematic messages.
upvoted 0 times
...
Kayleigh
10 months ago
Option B sounds like a good solution. We can easily identify and re-process problematic messages.
upvoted 0 times
...
...
Brett
10 months ago
But wouldn't setting the maximum concurrency limit of the Lambda function to 1 be a more cost-effective solution?
upvoted 0 times
...
Annabelle
10 months ago
Option A seems like the easiest solution, but I'm not sure if it's the most cost-effective. FIFO queues can be more expensive than standard queues.
upvoted 0 times
Basilia
10 months ago
B: That's a good point, but changing the Amazon SQS standard queue to a FIFO queue with deduplication ID might be more cost-effective in the long run.
upvoted 0 times
...
Lakeesha
10 months ago
A: I think setting up a dead-letter queue could help prevent processing the same messages multiple times.
upvoted 0 times
...
...
Gladys
10 months ago
I disagree, setting up a dead-letter queue could also help in resolving the issue.
upvoted 0 times
...
Brett
11 months ago
I think the best option is to change the Amazon SQS standard queue to an Amazon SQS FIFO queue.
upvoted 0 times
...
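Several commenters above suggest switching to a FIFO queue with a deduplication ID. As a minimal sketch, a producer can derive a content-based `MessageDeduplicationId` by hashing the message body; the `boto3` `send_message` call is shown commented out, since it needs real AWS credentials and a FIFO queue URL (both placeholders here).

```python
import hashlib

def dedup_id(body: str) -> str:
    """Derive a content-based deduplication ID from the message body."""
    return hashlib.sha256(body.encode("utf-8")).hexdigest()

# Against a real FIFO queue with boto3 (queue URL and group ID are placeholders):
# import boto3
# sqs = boto3.client("sqs")
# sqs.send_message(
#     QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/orders.fifo",
#     MessageBody=body,
#     MessageGroupId="orders",
#     MessageDeduplicationId=dedup_id(body),
# )
```

Alternatively, enabling the queue's `ContentBasedDeduplication` attribute lets SQS compute this hash itself, so producers can omit the explicit ID.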
