Welcome to Pass4Success


Amazon Exam BDS-C00 Topic 12 Question 78 Discussion

Actual exam question for Amazon's BDS-C00 exam
Question #: 78
Topic #: 12
[All BDS-C00 Questions]

A medical record filing system for a government medical fund is using an Amazon S3 bucket to archive documents related to patients. Every patient visit to a physician creates a new file, which can add up to millions of files each month. Collection of these files from each physician is handled via a batch process that runs every night using AWS Data Pipeline. This is sensitive data, so the data and any associated metadata must be encrypted at rest.

Auditors review some files on a quarterly basis to see whether the records are maintained according to regulations. Auditors must be able to locate any physical file in the S3 bucket for a given date, patient, or physician. Auditors spend a significant amount of time locating such files.

What is the most cost- and time-efficient collection methodology in this situation?

Suggested Answer: D

Contribute your Thoughts:

Cyndy
1 month ago
I'm just glad I don't have to deal with this many files on a daily basis. That must be a real headache for the IT team!
upvoted 0 times
Krystina
2 days ago
A: I agree, managing millions of files each month sounds like a nightmare.
upvoted 0 times
Fletcher
1 month ago
Putting the metadata in Redshift instead of DynamoDB is an interesting twist. I wonder if that would be more efficient for the auditing process.
upvoted 0 times
Reta
1 month ago
Partitioning the files in DynamoDB based on the month and year could be a good way to make it easier for auditors to locate specific files. I like that approach.
upvoted 0 times
Jaime
8 days ago
C: Using Amazon Kinesis to get the data feeds directly from physicians and then storing them in Amazon S3 with folders separated per physician could also be efficient.
upvoted 0 times
Sabra
24 days ago
B: Yeah, that would definitely make it easier for auditors to locate specific files when needed.
upvoted 0 times
Lai
1 month ago
A: I think using Amazon S3 event notifications to populate an Amazon DynamoDB table with metadata and partitioning based on the month and year is a good idea.
upvoted 0 times
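The metadata-indexing approach discussed in this thread (S3 event notifications populating a DynamoDB table, partitioned by month and year) can be sketched as a small Lambda handler. This is a minimal illustration, not the exam's exact option text: the table name, key schema, and attribute names below are assumptions, and the actual DynamoDB write is left as a comment so the sketch stays self-contained.

```python
# Sketch of the metadata-indexing idea from the discussion: an AWS Lambda
# function, wired to S3 event notifications, records one DynamoDB item per
# uploaded file. All table/attribute names are illustrative assumptions.

TABLE_NAME = "PatientFileMetadata"  # hypothetical table name

def build_metadata_item(s3_event_record):
    """Turn one S3 put-event record into a DynamoDB metadata item.

    The partition key is the upload month ("YYYY-MM"), so an auditor can
    query a single partition per audit period instead of listing millions
    of S3 objects. Patient/physician attributes would be added here from
    the object key or user metadata, under whatever naming convention the
    nightly Data Pipeline batch uses (not specified in the question).
    """
    bucket = s3_event_record["s3"]["bucket"]["name"]
    key = s3_event_record["s3"]["object"]["key"]
    event_time = s3_event_record["eventTime"]  # e.g. "2024-03-15T02:10:00Z"
    return {
        "month": event_time[:7],  # partition key, "YYYY-MM"
        "s3_key": key,            # sort key
        "bucket": bucket,
        "ingested_at": event_time,
    }

def lambda_handler(event, context):
    """S3 event notification entry point (sketch only).

    A real handler would call
    boto3.resource("dynamodb").Table(TABLE_NAME).put_item(Item=item)
    for each item; the AWS call is omitted here so the sketch runs
    without credentials.
    """
    return [build_metadata_item(r) for r in event["Records"]]
```

Because the index is written at upload time by the event notification, it adds no extra nightly batch work, which is part of why commenters consider it cost-efficient.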
Hildred
2 months ago
Using Kinesis to get the data feeds directly from the physicians sounds like a good idea, but I'm not sure if it's the most cost-effective.
upvoted 0 times
Vanna
20 days ago
B: Yeah, that could help with organizing the files efficiently.
upvoted 0 times
Meaghan
1 month ago
A: I think using Amazon S3 event notifications to populate an Amazon DynamoDB table with metadata about every file loaded to Amazon S3 is a good option.
upvoted 0 times
Josephine
2 months ago
Hmm, this seems like a tricky question. I'll have to think carefully about the cost and time efficiency of each option.
upvoted 0 times
Kenda
2 months ago
That's a good point, option C could indeed save time for auditors by organizing files based on month and year.
upvoted 0 times
Shaun
2 months ago
I disagree, I believe option C is more efficient as it uses Amazon S3 event notifications to populate a DynamoDB table with metadata.
upvoted 0 times
Kenda
2 months ago
I think option A is the best choice because it directly gets data feeds from physicians and stores them in folders per physician.
upvoted 0 times
