
NVIDIA Exam NCA-GENL Topic 1 Question 1 Discussion

Actual exam question for NVIDIA's NCA-GENL exam
Question #: 1
Topic #: 1

[Fundamentals of Machine Learning and Neural Networks]

When comparing and contrasting the ReLU and sigmoid activation functions, which statement is true?

A. ReLU is a linear function, while sigmoid is non-linear.
B. ReLU is less computationally efficient than sigmoid, but it is more accurate than sigmoid.
C. ReLU and sigmoid both have a range of [0, 1].
D. ReLU is more computationally efficient than sigmoid, but sigmoid is better for predicting probabilities.

Suggested Answer: D

ReLU (Rectified Linear Unit) and sigmoid are activation functions used in neural networks. According to NVIDIA's deep learning documentation (e.g., cuDNN and TensorRT), ReLU, defined as f(x) = max(0, x), is computationally efficient because it involves simple thresholding and avoids the expensive exponential calculation required by sigmoid, f(x) = 1 / (1 + e^(-x)). Sigmoid outputs values in the range (0, 1), making it suitable for predicting probabilities in binary classification tasks. ReLU, with an unbounded positive range, is less suited to direct probability prediction but accelerates training by mitigating vanishing-gradient issues. Option A is incorrect, as ReLU is non-linear (piecewise linear). Option B is false, as ReLU is more efficient and not inherently more accurate. Option C is wrong, as ReLU's range is [0, ∞), not [0, 1].
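
To make the efficiency, range, and gradient contrast concrete, here is a minimal NumPy sketch (illustrative only, not from NVIDIA's materials; the function names and sample inputs are arbitrary):

import numpy as np

def relu(x):
    # ReLU: simple thresholding, no exponential needed
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid: requires evaluating an exponential
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(relu(x))     # [0. 0. 0. 1. 4.] -> unbounded above, range [0, inf)
print(sigmoid(x))  # values strictly between 0 and 1 -> usable as probabilities

# The derivatives show why ReLU mitigates vanishing gradients:
# ReLU's gradient is 1 wherever x > 0, while sigmoid's gradient
# s(x) * (1 - s(x)) peaks at 0.25 and shrinks toward 0 for large |x|.
s = sigmoid(x)
print((x > 0).astype(float))  # ReLU gradient: [0. 0. 0. 1. 1.]
print(s * (1.0 - s))          # sigmoid gradient: at most 0.25, tiny at |x| = 4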


NVIDIA cuDNN Documentation: https://docs.nvidia.com/deeplearning/cudnn/developer-guide/index.html

Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep Learning. MIT Press.

Contribute your Thoughts:

Judy
15 days ago
Wait, are we sure this isn't a trick question? What if the answer is 'all of the above'? Just kidding, but it's always good to double-check!
upvoted 0 times
Jeff
25 days ago
I'd say C is the correct answer. Both functions have a range of 0 to 1, right? Seems pretty straightforward to me.
upvoted 0 times
Erick
29 days ago
D for sure! ReLU is faster, but sigmoid is the way to go if you need probabilities. Easy peasy!
upvoted 0 times
Dean
1 month ago
Hmm, I'm not sure. I guess I'll go with B since it sounds like the most comprehensive answer.
upvoted 0 times
Izetta
14 days ago
I'm going with B, it seems like the most comprehensive option.
upvoted 0 times
Peggie
15 days ago
I disagree, I believe D is the right choice.
upvoted 0 times
Silva
25 days ago
I think A is the correct answer.
upvoted 0 times
Andra
1 month ago
I'm not sure, but I think B) ReLU is less computationally efficient than sigmoid, but it is more accurate than sigmoid.
upvoted 0 times
Onita
1 month ago
I disagree, I believe the answer is D) ReLU is more computationally efficient, but sigmoid is better for predicting probabilities.
upvoted 0 times
Dana
2 months ago
I think the answer is A) ReLU is a linear function while sigmoid is non-linear.
upvoted 0 times
Ryan
2 months ago
I think the answer is D. ReLU is more efficient, but sigmoid is better for probabilities.
upvoted 0 times
Katheryn
1 month ago
Oh, I see. Thanks for clarifying that!
upvoted 0 times
William
1 month ago
Actually, the correct answer is A. ReLU is a linear function while sigmoid is non-linear.
upvoted 0 times
Magda
2 months ago
I think the answer is D. ReLU is more efficient, but sigmoid is better for probabilities.
upvoted 0 times
