Welcome to Pass4Success


Google Professional Data Engineer Exam - Topic 3 Question 70 Discussion

Actual exam question for Google's Professional Data Engineer exam
Question #: 70
Topic #: 3

Which of the following are examples of hyperparameters? (Select 2 answers.)

A) Number of hidden layers
B) Number of nodes in each hidden layer
C) Biases
D) Weights

Suggested Answer: A, B
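The distinction the thread keeps circling is: hyperparameters (like the number of hidden layers and nodes per layer) are fixed before training, while parameters (weights and biases) are learned during training. A minimal pure-Python sketch of that split, using a toy single-input linear model and made-up hyperparameter values (the dict keys and the `train_step` helper are illustrative, not from any particular library):

```python
import random

# Hyperparameters: chosen before training and held fixed while it runs.
hyperparams = {
    "num_hidden_layers": 2,   # answer A
    "nodes_per_layer": 16,    # answer B
    "learning_rate": 0.01,    # another common hyperparameter
}

# Parameters: weights and biases (answers C and D) are initialized
# randomly and then *learned* during training, which is exactly why
# they are not hyperparameters.
random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(hyperparams["nodes_per_layer"])]
bias = 0.0

def train_step(x, target):
    """One toy gradient-descent step: updates weights and bias, never hyperparams."""
    global bias
    pred = sum(w * x for w in weights) + bias
    error = pred - target
    for i in range(len(weights)):
        weights[i] -= hyperparams["learning_rate"] * error * x
    bias -= hyperparams["learning_rate"] * error
    return error

for _ in range(100):
    train_step(1.0, 3.0)
```

After the loop, `weights` and `bias` have changed while `hyperparams` is untouched; that before/during-training split is the whole question.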

Contribute your Thoughts:

Pearlene
3 months ago
Yeah, A and B are the classic examples.
upvoted 0 times
...
Alton
3 months ago
Wait, are weights not hyperparameters? That’s surprising!
upvoted 0 times
...
Annmarie
4 months ago
Totally agree, A and B are the right choices!
upvoted 0 times
...
Wynell
4 months ago
I thought biases were hyperparameters too?
upvoted 0 times
...
Katy
4 months ago
A and B are definitely hyperparameters!
upvoted 0 times
...
Virgina
4 months ago
I'm a bit confused; I thought both the number of hidden layers and the number of nodes were hyperparameters, but I can't recall for sure.
upvoted 0 times
...
Eliz
4 months ago
I feel like biases and weights are more like parameters, not hyperparameters.
upvoted 0 times
...
Rikki
5 months ago
I remember practicing a question like this, and I think the number of nodes in each hidden layer is also a hyperparameter.
upvoted 0 times
...
Twana
5 months ago
I think the number of hidden layers is definitely a hyperparameter, but I'm not sure about the other one.
upvoted 0 times
...
Wei
5 months ago
Yep, I agree with Emily. Biases and weights are model parameters that get updated during training, not hyperparameters that are set beforehand.
upvoted 0 times
...
Brianne
5 months ago
I think the key is to focus on what can be adjusted before training versus what gets learned during training. The number of layers and nodes are set upfront, so those are hyperparameters.
upvoted 0 times
...
Mari
5 months ago
Wait, are biases and weights considered hyperparameters too? I'm a bit confused on that.
upvoted 0 times
...
Bambi
5 months ago
Okay, I've got this. The number of hidden layers and the number of nodes in each layer are definitely hyperparameters, since they're set before training the model.
upvoted 0 times
...
Gerri
5 months ago
Hmm, this looks like a tricky one. I'll need to think carefully about the difference between hyperparameters and model parameters.
upvoted 0 times
...
Glen
5 months ago
Hmm, I'm a bit unsure about this one. The options seem similar, and I want to make sure I understand the differences between them. Let me think this through carefully.
upvoted 0 times
...
Emilio
9 months ago
I'm going with A and B. Gotta love those hidden layers and node counts! They're like the dials on a machine learning control panel.
upvoted 0 times
Domitila
8 months ago
It's important to understand the role of each hyperparameter in order to optimize the model.
upvoted 0 times
...
Dulce
8 months ago
I think biases and weights are more like parameters that get adjusted during training.
upvoted 0 times
...
Cassi
8 months ago
Yeah, those hidden layers and node counts can really impact the performance of the model.
upvoted 0 times
...
Oliva
9 months ago
I agree, A and B are definitely hyperparameters that can be tuned.
upvoted 0 times
...
...
Tabetha
9 months ago
Haha, this is a tricky one! I bet the exam creators just wanted to see if we can distinguish between the model's parameters and its hyperparameters. Time to brush up on my ML terminology!
upvoted 0 times
Jennifer
8 months ago
B) Number of nodes in each hidden layer
upvoted 0 times
...
Sharmaine
8 months ago
A) Number of hidden layers
upvoted 0 times
...
...
Lizbeth
10 months ago
Wait, I thought hyperparameters were external to the model, like the learning rate or the batch size. Isn't that what we're supposed to be looking for here?
upvoted 0 times
Anjelica
9 months ago
B) Number of nodes in each hidden layer
upvoted 0 times
...
Gerri
9 months ago
A) Number of hidden layers
upvoted 0 times
...
...
Georgeanna
10 months ago
I think biases and weights are also hyperparameters because they are set before training the model.
upvoted 0 times
...
Brent
10 months ago
I agree with Tawanna, hyperparameters include the number of hidden layers and nodes in each layer.
upvoted 0 times
...
Frederic
11 months ago
I'm pretty sure C and D are also hyperparameters. Biases and weights are part of the model's architecture and can be adjusted during the training process.
upvoted 0 times
Gilma
9 months ago
I agree, adjusting biases and weights can have a significant impact on the model's learning process.
upvoted 0 times
...
Salley
9 months ago
Yes, biases and weights are important hyperparameters that can affect the model's performance.
upvoted 0 times
...
Becky
9 months ago
I think you're right, biases and weights are also hyperparameters that can be tuned.
upvoted 0 times
...
Linn
9 months ago
D) Weights
upvoted 0 times
...
Dierdre
9 months ago
C) Biases
upvoted 0 times
...
Annmarie
9 months ago
B) Number of nodes in each hidden layer
upvoted 0 times
...
Eden
10 months ago
A) Number of hidden layers
upvoted 0 times
...
...
Tawanna
11 months ago
A) Number of hidden layers and B) Number of nodes in each hidden layer
upvoted 0 times
...
German
11 months ago
Hmm, I think A and B are the correct hyperparameters. The number of hidden layers and nodes in each layer are crucial hyperparameters that can be tuned to optimize the model's performance.
upvoted 0 times
...
