
Amazon Exam MLS-C01 Topic 4 Question 98 Discussion

Actual exam question for Amazon's MLS-C01 exam
Question #: 98
Topic #: 4
[All MLS-C01 Questions]

This graph shows the training and validation loss against the epochs for a neural network.

The network being trained is as follows:

* Two dense layers, one output neuron

* 100 neurons in each layer

* 100 epochs

* Random initialization of weights

Which technique can be used to improve model performance in terms of accuracy in the validation set?

Suggested Answer: A

Stratified sampling is a technique that preserves the class distribution of the original dataset when creating a smaller or split dataset: the proportion of examples from each class is maintained in the result. This can improve the validation accuracy of the model by ensuring that the validation set is representative of the original data rather than biased toward any class, which reduces variance and overfitting and improves generalization. Stratified sampling can be combined with both oversampling and undersampling methods, depending on whether the goal is to increase or decrease the size of the dataset.

The other options are less effective at improving validation accuracy. Acquiring additional data about the majority classes would only increase the imbalance and bias the model further toward those classes. Using a smaller, randomly sampled version of the training dataset does not guarantee that the class distribution is preserved and may discard important minority-class examples. Systematic sampling likewise does not preserve the class distribution and can introduce sampling bias if the original dataset is ordered or grouped by class.
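To make the idea concrete, here is a minimal sketch of a stratified train/validation split written with NumPy only. The 90/10 class imbalance, the `stratified_split` helper name, and the split fraction are illustrative assumptions, not part of the exam material; in practice scikit-learn's `train_test_split(..., stratify=y)` does the same job.

```python
import numpy as np

def stratified_split(y, val_frac=0.2, seed=0):
    """Return (train_idx, val_idx) that preserve per-class proportions.

    Illustrative helper: each class's indices are shuffled and split
    separately, so the validation set keeps the original class ratio.
    """
    rng = np.random.default_rng(seed)
    train_idx, val_idx = [], []
    for cls in np.unique(y):
        idx = np.flatnonzero(y == cls)
        rng.shuffle(idx)
        n_val = int(round(len(idx) * val_frac))
        val_idx.extend(idx[:n_val])
        train_idx.extend(idx[n_val:])
    return np.array(train_idx), np.array(val_idx)

# Imbalanced labels: 900 negatives, 100 positives (hypothetical data)
y = np.array([0] * 900 + [1] * 100)
tr, va = stratified_split(y)
print(len(va), y[va].mean())  # 200 validation samples, 10% positive -> 0.1
```

A purely random 20% split of the same labels could easily end up with noticeably more or fewer than 20 positives, so validation accuracy would be estimated on an unrepresentative class mix.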

References:

* Stratified Sampling for Imbalanced Datasets

* Imbalanced Data

* Tour of Data Sampling Methods for Imbalanced Classification
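The loss curves described in the question (training loss falling while validation loss stops improving) are the classic overfitting pattern, which is why early stopping comes up repeatedly in the discussion below. A minimal sketch of patience-based early stopping, monitoring a recorded validation-loss curve; the function name, patience value, and loss values are illustrative assumptions, and frameworks such as Keras provide this as a built-in `EarlyStopping` callback:

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the 0-based epoch at which training would stop.

    Training stops at the first epoch where the best validation loss
    has failed to improve for `patience` consecutive epochs; if that
    never happens, it runs to the final epoch.
    """
    best = float("inf")
    best_epoch = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch
    return len(val_losses) - 1

# Hypothetical curve: validation loss falls, then rises (overfitting)
curve = [0.9, 0.7, 0.6, 0.55, 0.6, 0.68, 0.75, 0.8]
print(early_stop_epoch(curve))  # stops at epoch 6, 3 epochs past the best
```

In practice one would also restore the weights from the best epoch (epoch 3 here) rather than the final one, which is what gives the validation-accuracy improvement.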


Contribute your Thoughts:

Tamra
30 days ago
If this was a cake-baking exam, I'd say the solution is to add more sugar. But since it's a neural network, I guess I'll have to use my brain instead of my sweet tooth.
upvoted 0 times
Mariann
15 hours ago
B) Random initialization of weights with appropriate seed
upvoted 0 times
...
Ming
17 days ago
A) Early stopping
upvoted 0 times
...
...
Rosina
1 month ago
Random initialization of weights with an appropriate seed? Sounds like a job for the weights and biases fairy. I wonder if they take résumés.
upvoted 0 times
Raylene
9 days ago
Adding another layer with 100 neurons might also improve the model performance.
upvoted 0 times
...
Esteban
28 days ago
Early stopping could help improve accuracy in the validation set.
upvoted 0 times
...
...
Alecia
1 month ago
Adding another layer with 100 neurons? Seriously, that's like throwing more spaghetti at the wall, hoping it sticks. Not a very strategic approach.
upvoted 0 times
Erick
3 days ago
B) Random initialization of weights with appropriate seed
upvoted 0 times
...
Caprice
28 days ago
A) Early stopping
upvoted 0 times
...
...
Sena
1 month ago
Increasing the number of epochs won't help here. The model has already converged, and continuing to train would just lead to more overfitting.
upvoted 0 times
...
Gail
2 months ago
I agree with Martina, increasing the number of epochs can help improve model performance.
upvoted 0 times
...
Lilli
2 months ago
The training and validation loss curves indicate that the model is overfitting. Early stopping would be the best choice to prevent overfitting and improve validation accuracy.
upvoted 0 times
Eleonore
6 days ago
Random initialization of weights with appropriate seed could also help in preventing overfitting and improving model performance.
upvoted 0 times
...
Joana
6 days ago
Increasing the number of epochs might not necessarily improve validation accuracy, early stopping is a better approach.
upvoted 0 times
...
Glen
14 days ago
I agree, adding another layer with 100 neurons might make the overfitting issue worse.
upvoted 0 times
...
Nancey
23 days ago
Early stopping would definitely help prevent overfitting in this case.
upvoted 0 times
...
Wenona
30 days ago
C) Increasing the number of epochs
upvoted 0 times
...
Lashandra
1 month ago
A) Early stopping
upvoted 0 times
...
...
Martina
2 months ago
I disagree, I believe the answer is C) Increasing the number of epochs.
upvoted 0 times
...
Launa
2 months ago
I think the answer is A) Early stopping.
upvoted 0 times
...
