
Microsoft DP-100 Exam - Topic 4 Question 134 Discussion

Actual exam question for Microsoft's DP-100 exam
Question #: 134
Topic #: 4

You are building a recurrent neural network to perform binary classification.

The training loss, validation loss, training accuracy, and validation accuracy for each training epoch have been provided. You need to identify whether the classification model is overfitted.

Which of the following is correct?

Suggested Answer: B

An overfit model is one where performance on the training set is good and continues to improve, whereas performance on the validation set improves to a point and then begins to degrade.


https://machinelearningmastery.com/diagnose-overfitting-underfitting-lstm-models/
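
As a concrete illustration of the pattern described above, here is a minimal Python sketch. The per-epoch loss values are hypothetical (invented for illustration), and the check simply flags the case where training loss keeps falling after the epoch at which validation loss bottoms out.

```python
# Minimal sketch: diagnosing overfitting from per-epoch loss curves.
# The loss values below are hypothetical, for illustration only.

train_loss = [0.90, 0.62, 0.45, 0.33, 0.24, 0.18, 0.13, 0.10]
val_loss = [0.88, 0.65, 0.52, 0.47, 0.46, 0.49, 0.55, 0.63]

# Find the epoch where validation loss bottoms out.
best_epoch = min(range(len(val_loss)), key=lambda i: val_loss[i])

# Overfitting signature: training loss keeps improving past that
# epoch while validation loss degrades.
overfitting = (
    train_loss[-1] < train_loss[best_epoch]
    and val_loss[-1] > val_loss[best_epoch]
)

print(f"Validation loss minimum at epoch {best_epoch + 1}")
print(f"Overfitting detected: {overfitting}")
```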

Contribute your Thoughts:

Kaitlyn
14 days ago
B is definitely the best choice. It highlights the overfitting issue clearly.
upvoted 0 times
...
Nelida
19 days ago
I feel like option C doesn't indicate overfitting either. Constant training loss is odd.
upvoted 0 times
...
Ellen
24 days ago
Option A seems wrong. If training loss increases, that's a red flag.
upvoted 0 times
...
Reyes
30 days ago
I agree with Della. It shows the model is memorizing the training data.
upvoted 0 times
...
Della
1 month ago
I think option B is correct. Training loss should decrease, but if validation loss increases, that's overfitting.
upvoted 0 times
...
Tracey
1 month ago
D sounds like a stable model, not overfitting for sure.
upvoted 0 times
...
Tammi
1 month ago
Wait, how can training loss increase? That seems odd.
upvoted 0 times
...
Lanie
2 months ago
Definitely leaning towards B, makes sense!
upvoted 0 times
...
Rosalind
2 months ago
I think A could also indicate some issues.
upvoted 0 times
...
Leana
2 months ago
B is the classic sign of overfitting.
upvoted 0 times
...
Verdell
3 months ago
Haha, this question is a piece of cake! Option B is the only one that makes sense. Overfitting is like trying to cram for an exam - you might ace the test, but you'll forget everything the next day.
upvoted 0 times
...
Nikita
3 months ago
Option B is the way to go. Overfitting is when the model performs well on the training data but fails to generalize to new, unseen data.
upvoted 0 times
...
Carmen
3 months ago
Definitely B. If the training loss is decreasing but the validation loss is increasing, that means the model is memorizing the training data instead of learning the underlying patterns.
upvoted 0 times
...
Marica
3 months ago
I'm pretty sure the answer is B. If the training loss goes down but the validation loss goes up, that's a textbook case of overfitting.
upvoted 0 times
...
Chantell
3 months ago
I recall that if the training accuracy is high but validation accuracy is low, it suggests overfitting, but I can't pinpoint which option describes that scenario.
upvoted 0 times
...
Antonio
4 months ago
I practiced a similar question, and I feel like option A might be correct, but it seems odd for training loss to increase.
upvoted 0 times
...
Quentin
4 months ago
I think if the training loss decreases and the validation loss increases, that indicates overfitting, which sounds like option B.
upvoted 0 times
...
Linwood
4 months ago
I remember that overfitting usually shows a divergence between training and validation metrics, but I'm not sure which option reflects that.
upvoted 0 times
...
Chun
4 months ago
I'm pretty confident that option B is the correct answer. When a model is overfitting, the training loss will decrease while the validation loss increases, as the model becomes too specialized on the training data.
upvoted 0 times
...
Glenna
5 months ago
Okay, let me think this through. Overfitting means the model is performing well on the training data but not generalizing well to the validation data. So the training loss going down while the validation loss goes up sounds like the right sign of overfitting. I'll go with B.
upvoted 0 times
...
Graham
5 months ago
D is also not realistic. Constant losses mean the model isn't learning at all.
upvoted 0 times
...
Daron
5 months ago
Option B is the correct answer. The training loss decreasing while the validation loss increases is a clear sign of overfitting.
upvoted 0 times
...
Shizue
5 months ago
Hmm, I'm a bit confused. I thought overfitting meant the training loss and validation loss both decrease, but the validation loss decreases at a slower rate. I'm not sure which option best describes that scenario.
upvoted 0 times
...
Jaleesa
5 months ago
I think I know the answer to this one. If the model is overfitting, the training loss should decrease while the validation loss increases, so I'll go with option B.
upvoted 0 times
Matt
4 days ago
True, option A indicates a different issue.
upvoted 0 times
...
Rikki
9 days ago
But what about option A? Isn't that a sign of underfitting?
upvoted 0 times
...
Cheryl
4 months ago
Yeah, overfitting usually shows that pattern.
upvoted 0 times
...
Ashley
4 months ago
I agree with you, option B makes sense.
upvoted 0 times
...
...
