
Amazon AIF-C01 Exam - Topic 2 Question 26 Discussion

Actual exam question for Amazon's AIF-C01 exam
Question #: 26
Topic #: 2

Why does overfitting occur in ML models?

A. The training dataset does not represent all possible input values.
B. The model contains a regularization method.
C. The model training stops early because of an early stopping criterion.
D. The training dataset contains too many features.

Suggested Answer: A

Overfitting occurs when an ML model learns the training data too well, including noise and patterns that do not generalize to new data. A key cause of overfitting is when the training dataset does not represent all possible input values, leading the model to over-specialize on the limited data it was trained on, failing to generalize to unseen data.
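To make this concrete, here is a minimal NumPy sketch (illustrative, not from the AWS material): a high-degree polynomial is fit to a narrow, noisy training range, then evaluated on a broader input range it never saw. Training error is tiny while test error explodes, which is the overfitting signature described above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Narrow training set: x covers only [0, 1], so it does not
# represent the full range of inputs the model will see later.
x_train = rng.uniform(0.0, 1.0, 20)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.1, 20)

# Broader test set: x covers [0, 2] -- inputs the model never saw.
x_test = rng.uniform(0.0, 2.0, 200)
y_test = np.sin(2 * np.pi * x_test)

# A high-degree polynomial memorizes the narrow training data,
# noise included.
coeffs = np.polyfit(x_train, y_train, deg=10)

train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)

print(f"train MSE: {train_mse:.4f}")  # small: training data memorized
print(f"test MSE:  {test_mse:.4f}")   # large: fails to generalize
```

The degree and ranges here are arbitrary; the point is only that error on unseen inputs dwarfs error on the (non-representative) training inputs.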

Exact Extract from AWS AI Documents:

From the Amazon SageMaker Developer Guide:

'Overfitting often occurs when the training dataset is not representative of the broader population of possible inputs, causing the model to memorize specific patterns, including noise, rather than learning generalizable features.'

(Source: Amazon SageMaker Developer Guide, Model Evaluation and Overfitting)

Detailed Explanation:

Option A: The training dataset does not represent all possible input values. This is the correct answer. If the training dataset lacks diversity and does not cover the range of possible inputs, the model overfits by learning patterns specific to the training data and fails to generalize.

Option B: The model contains a regularization method. Regularization methods (e.g., L2 regularization) are used to prevent overfitting, not cause it. This option is incorrect.
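As an illustration of why regularization counters overfitting (my own sketch, not from the AWS documentation), closed-form ridge regression shows how an L2 penalty shrinks the weights that an unregularized least-squares fit inflates to chase noise:

```python
import numpy as np

rng = np.random.default_rng(1)

# Few samples, noisy target: ordinary least squares chases the noise.
X = rng.normal(size=(15, 10))
true_w = np.zeros(10)
true_w[0] = 1.0                      # only one feature actually matters
y = X @ true_w + rng.normal(0, 0.5, 15)

def ridge(X, y, lam):
    """Closed-form ridge regression: w = (X^T X + lam*I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_ols = ridge(X, y, lam=0.0)     # no regularization (plain least squares)
w_l2 = ridge(X, y, lam=10.0)     # L2 penalty shrinks weights toward zero

print(np.linalg.norm(w_ols))  # larger: weights inflated to fit noise
print(np.linalg.norm(w_l2))   # smaller: penalty constrains the model
```

The ridge solution's norm is guaranteed to shrink as the penalty `lam` grows, which is exactly the constraining effect that makes option B a cure rather than a cause.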

Option C: The model training stops early because of an early stopping criterion. Early stopping is a technique to prevent overfitting by halting training when performance on a validation set degrades; it does not cause overfitting.
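The early-stopping idea can be sketched in a few lines of NumPy (all names and hyperparameters here are illustrative): train with gradient descent, track validation error each epoch, and halt once it fails to improve for `patience` consecutive epochs.

```python
import numpy as np

rng = np.random.default_rng(2)

# Tiny linear-regression problem with a held-out validation split.
X = rng.normal(size=(60, 8))
w_true = rng.normal(size=8)
y = X @ w_true + rng.normal(0, 0.3, 60)
X_tr, y_tr = X[:40], y[:40]
X_val, y_val = X[40:], y[40:]

w = np.zeros(8)
lr = 0.01
patience, best_val, since_best = 5, float("inf"), 0

for epoch in range(1000):
    # One full-batch gradient-descent step on the training split (MSE loss).
    grad = 2 * X_tr.T @ (X_tr @ w - y_tr) / len(y_tr)
    w -= lr * grad

    # Early stopping: halt when validation error stops improving.
    val_mse = np.mean((X_val @ w - y_val) ** 2)
    if val_mse < best_val:
        best_val, since_best = val_mse, 0
    else:
        since_best += 1
        if since_best >= patience:
            print(f"stopped at epoch {epoch}")
            break
```

Because the criterion watches held-out data, it stops training before the model starts fitting training-set noise, which is why option C prevents rather than causes overfitting.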

Option D: The training dataset contains too many features. While too many features can contribute to overfitting (e.g., by increasing model complexity), this is less directly tied to overfitting than a non-representative dataset. The dataset's representativeness is the primary cause.
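A quick NumPy sketch of the many-features effect mentioned under option D (illustrative, not from the source): with 50 features and only 20 samples, least squares can interpolate the training set almost exactly, yet the learned weights generalize poorly because most features are pure noise.

```python
import numpy as np

rng = np.random.default_rng(3)

n_train, n_test, n_features = 20, 200, 50   # more features than samples

X_train = rng.normal(size=(n_train, n_features))
X_test = rng.normal(size=(n_test, n_features))
# The target depends on only the first feature; the rest are noise.
y_train = X_train[:, 0] + rng.normal(0, 0.1, n_train)
y_test = X_test[:, 0]

# Underdetermined least squares: lstsq returns the minimum-norm
# solution, which fits the 20 training points essentially exactly.
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

train_mse = np.mean((X_train @ w - y_train) ** 2)
test_mse = np.mean((X_test @ w - y_test) ** 2)
print(train_mse, test_mse)  # near-zero train error, much larger test error
```

This complements the explanation above: excess features enable memorization, but whether the model generalizes still hinges on how representative the training data is.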


Amazon SageMaker Developer Guide: Model Evaluation and Overfitting (https://docs.aws.amazon.com/sagemaker/latest/dg/model-evaluation.html)

AWS AI Practitioner Learning Path: Module on Model Performance and Evaluation

AWS Documentation: Understanding Overfitting (https://aws.amazon.com/machine-learning/)

Contribute your Thoughts:

Doyle
3 months ago
Not sure about that early stopping thing, sounds a bit off.
upvoted 0 times
...
Michal
3 months ago
Totally agree, especially with too many features!
upvoted 0 times
...
Carlee
3 months ago
Wait, so regularization actually helps prevent overfitting?
upvoted 0 times
...
Tawna
4 months ago
I think it's more about the dataset not being representative.
upvoted 0 times
...
Kristel
4 months ago
Overfitting happens when the model learns noise from the training data.
upvoted 0 times
...
Oretha
4 months ago
I’m a bit confused about early stopping. I thought it was supposed to help avoid overfitting, but I can’t remember if it’s directly related to the question.
upvoted 0 times
...
Merlyn
4 months ago
I practiced a question similar to this, and I think regularization methods help prevent overfitting, so option B seems unlikely.
upvoted 0 times
...
Annice
4 months ago
I’m not entirely sure, but I feel like overfitting happens when the training dataset doesn’t cover all possible inputs. It’s like the model learns noise instead of the actual pattern.
upvoted 0 times
...
Gilbert
5 months ago
I remember discussing overfitting in class, and I think it has to do with the model being too complex for the training data. Maybe it's related to having too many features?
upvoted 0 times
...
Micheal
5 months ago
Ah, overfitting - that's when the model becomes too specialized on the training data and can't perform well on new, unseen data. I'd say Option B about regularization is the key factor that can help prevent overfitting.
upvoted 0 times
...
Jody
5 months ago
Overfitting happens when the model is too complex and fits the training data too closely, losing the ability to generalize. I think Option D about too many features is the most relevant explanation here.
upvoted 0 times
...
Kristine
5 months ago
Hmm, I'm a bit confused on this one. I know overfitting has to do with the model being too complex and not generalizing well, but I'm not sure which of these options best explains that. I'll have to think it through carefully.
upvoted 0 times
...
Rikki
5 months ago
I think overfitting occurs when the model is too complex and starts to memorize the training data instead of learning the underlying patterns. Option A seems like the most likely explanation to me.
upvoted 0 times
...
