
Amazon MLS-C01 Exam - Topic 4 Question 98 Discussion

Actual exam question for Amazon's MLS-C01 exam
Question #: 98
Topic #: 4

This graph shows the training and validation loss against the epochs for a neural network.

The network being trained is as follows:

* Two dense layers, one output neuron

* 100 neurons in each layer

* 100 epochs

* Random initialization of weights

Which technique can be used to improve model performance in terms of accuracy in the validation set?

A) Early stopping

B) Random initialization of weights with appropriate seed

C) Increasing the number of epochs

D) Adding another layer with 100 neurons
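For a sense of the model's capacity, the parameter count of the described architecture can be sketched. The input dimension is not given in the question, so `n_features = 10` below is a hypothetical stand-in:

```python
def dense_params(n_in, n_out):
    # a fully connected layer has n_in * n_out weights plus n_out biases
    return n_in * n_out + n_out

n_features = 10  # hypothetical; the question does not state the input size
total = (
    dense_params(n_features, 100)  # first dense layer, 100 neurons
    + dense_params(100, 100)       # second dense layer, 100 neurons
    + dense_params(100, 1)         # single output neuron
)
print(total)  # 11301 trainable parameters for this input size
```

Even at this modest size, a network with over ten thousand free parameters can easily overfit a small training set, which is what the loss curves suggest.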

Suggested Answer: A

Early stopping is a technique that halts training once the validation loss stops improving and restores the weights from the epoch where it was lowest. The graph shows the classic overfitting pattern: the training loss keeps decreasing across the 100 epochs while the validation loss reaches a minimum and then rises, meaning the network has started to memorize the training data instead of learning patterns that generalize. Stopping at the validation-loss minimum therefore directly improves accuracy on the validation set.

The other options do not address this pattern. Increasing the number of epochs continues training past the point where the validation loss began to rise, making the overfitting worse. Adding another layer with 100 neurons increases the model's capacity, which tends to aggravate overfitting rather than reduce it. Random initialization of weights with an appropriate seed makes training reproducible and can help avoid a poor starting point, but it does nothing about the growing gap between training and validation loss.
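Since the graph shows the validation loss rising while the training loss keeps falling, early stopping is the standard remedy. A minimal, framework-agnostic sketch of the patience-based rule follows; the loss values are made up to mimic the curve in the question:

```python
def early_stopping_epoch(val_losses, patience=3):
    """Return the index of the epoch with the lowest validation loss,
    scanning until the loss has failed to improve for `patience` epochs."""
    best_loss = float("inf")
    best_epoch = 0
    wait = 0  # epochs since the last improvement
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:
            best_loss, best_epoch, wait = loss, epoch, 0
        else:
            wait += 1
            if wait >= patience:
                break  # stop training; keep weights from best_epoch
    return best_epoch

# Hypothetical validation losses that fall and then rise, as in the graph:
losses = [0.90, 0.70, 0.50, 0.45, 0.44, 0.46, 0.50, 0.55]
print(early_stopping_epoch(losses))  # epoch 4 has the minimum (0.44)
```

In Keras the same behavior comes from the `EarlyStopping` callback with `restore_best_weights=True`; the rule is shown standalone here so it runs without any framework.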


Contribute your Thoughts:

Mickie
3 months ago
I agree, early stopping is a solid choice for improving validation accuracy!
upvoted 0 times
...
Nadine
3 months ago
Adding another layer sounds interesting, but is it really necessary?
upvoted 0 times
...
Lacey
3 months ago
Wait, can random weight initialization really make that much of a difference?
upvoted 0 times
...
Allene
4 months ago
I think increasing the number of epochs might just lead to more overfitting.
upvoted 0 times
...
Makeda
4 months ago
Early stopping can really help prevent overfitting!
upvoted 0 times
...
Markus
4 months ago
Adding another layer could potentially improve the model, but I wonder if it would really help given that we already have two dense layers.
upvoted 0 times
...
Dell
4 months ago
Increasing the number of epochs seems like a common approach, but I feel like it might just lead to overfitting instead of improving validation accuracy.
upvoted 0 times
...
Princess
4 months ago
I think random initialization with a seed could help, but we also talked about how it might not always lead to better performance.
upvoted 0 times
...
Iesha
5 months ago
I remember we discussed early stopping in class, but I'm not entirely sure how it directly affects validation accuracy.
upvoted 0 times
...
Nikita
5 months ago
Random initialization of weights with an appropriate seed could help improve the model's performance. That way, the model won't get stuck in a local minimum.
upvoted 0 times
...
Annalee
5 months ago
I'm a bit confused by the graph. The training loss keeps decreasing, but the validation loss starts increasing after a while. I'm not sure what the best approach would be.
upvoted 0 times
...
Ma
5 months ago
Hmm, this looks like a classic case of overfitting. I think early stopping would be the best way to go here.
upvoted 0 times
...
Josephine
5 months ago
Increasing the number of epochs might not be the best idea since the validation loss is already starting to increase. Adding another layer could help, but I'm not sure if that's the most efficient solution.
upvoted 0 times
...
Tamra
10 months ago
If this was a cake-baking exam, I'd say the solution is to add more sugar. But since it's a neural network, I guess I'll have to use my brain instead of my sweet tooth.
upvoted 0 times
Lavera
9 months ago
C) Increasing the number of epochs
upvoted 0 times
...
Mariann
9 months ago
B) Random initialization of weights with appropriate seed
upvoted 0 times
...
Ming
9 months ago
A) Early stopping
upvoted 0 times
...
...
Rosina
10 months ago
Random initialization of weights with an appropriate seed? Sounds like a job for the weights and biases fairy. I wonder if they take résumés.
upvoted 0 times
Torie
9 months ago
Random initialization of weights with an appropriate seed is crucial for training neural networks effectively.
upvoted 0 times
...
Raylene
9 months ago
Adding another layer with 100 neurons might also improve the model performance.
upvoted 0 times
...
Esteban
10 months ago
Early stopping could help improve accuracy in the validation set.
upvoted 0 times
...
...
Alecia
10 months ago
Adding another layer with 100 neurons? Seriously, that's like throwing more spaghetti at the wall, hoping it sticks. Not a very strategic approach.
upvoted 0 times
Alpha
9 months ago
C) Increasing the number of epochs
upvoted 0 times
...
Erick
9 months ago
B) Random initialization of weights with appropriate seed
upvoted 0 times
...
Caprice
10 months ago
A) Early stopping
upvoted 0 times
...
...
Sena
10 months ago
Increasing the number of epochs won't help here. The model has already converged, and continuing to train would just lead to more overfitting.
upvoted 0 times
...
Gail
10 months ago
I agree with Martina, increasing the number of epochs can help improve model performance.
upvoted 0 times
...
Lilli
11 months ago
The training and validation loss curves indicate that the model is overfitting. Early stopping would be the best choice to prevent overfitting and improve validation accuracy.
upvoted 0 times
Eleonore
9 months ago
Random initialization of weights with appropriate seed could also help in preventing overfitting and improving model performance.
upvoted 0 times
...
Joana
9 months ago
Increasing the number of epochs might not necessarily improve validation accuracy, early stopping is a better approach.
upvoted 0 times
...
Glen
9 months ago
I agree, adding another layer with 100 neurons might make the overfitting issue worse.
upvoted 0 times
...
Nancey
10 months ago
Early stopping would definitely help prevent overfitting in this case.
upvoted 0 times
...
Wenona
10 months ago
C) Increasing the number of epochs
upvoted 0 times
...
Lashandra
10 months ago
A) Early stopping
upvoted 0 times
...
...
Martina
11 months ago
I disagree, I believe the answer is C) Increasing the number of epochs.
upvoted 0 times
...
Launa
11 months ago
I think the answer is A) Early stopping.
upvoted 0 times
...
