CertNexus AIP-210 Exam - Topic 2 Question 29 Discussion

Actual exam question for CertNexus's AIP-210 exam
Question #: 29
Topic #: 2
[All AIP-210 Questions]

Normalization is the transformation of features:

A. By subtracting from the mean and dividing by the standard deviation.
B. Into the normal distribution.
C. So that they are on a similar scale.
D. To different scales from each other.

Suggested Answer: C

Normalization is the transformation of features so that they are on a similar scale, usually between 0 and 1 or between -1 and 1. This can reduce the influence of outliers and improve the performance of machine learning algorithms that are sensitive to feature scale, such as gradient descent, k-means, or k-nearest neighbors. Reference: Feature scaling (Wikipedia); Normalization vs. Standardization (Quantitative Analysis).
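Several commenters below mix up option A (standardization) with option C (normalization). A minimal Python sketch may help separate the two; the helper names are illustrative, not from any particular library:

```python
def min_max_normalize(values):
    """Normalization (option C): rescale values to the [0, 1] range
    so all features end up on a similar scale."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def z_score_standardize(values):
    """Standardization (what option A describes): subtract the mean
    and divide by the (population) standard deviation."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(v - mean) / std for v in values]

heights_cm = [150.0, 160.0, 170.0, 180.0, 190.0]
print(min_max_normalize(heights_cm))    # [0.0, 0.25, 0.5, 0.75, 1.0]
print(z_score_standardize(heights_cm))  # roughly [-1.41, -0.71, 0.0, 0.71, 1.41]
```

Note that normalization bounds the output range but says nothing about the distribution's shape, while standardization produces mean 0 and standard deviation 1 without guaranteeing any fixed range; neither forces the data into a normal distribution, which is why option B is wrong.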


Contribute your Thoughts:

Johnna
3 months ago
C is the right answer, they need to be on a similar scale!
upvoted 0 times
...
Rosalind
3 months ago
I thought it was more about making everything comparable?
upvoted 0 times
...
Beatriz
4 months ago
Wait, isn't normalization just about the mean and std dev?
upvoted 0 times
...
Evan
4 months ago
Totally agree, it's key for model performance.
upvoted 0 times
...
Elsa
4 months ago
Normalization is all about scaling features!
upvoted 0 times
...
Lynda
4 months ago
I definitely remember that normalization is important for machine learning, but I’m confused about whether it’s about scaling or transforming into a normal distribution.
upvoted 0 times
...
Karima
5 months ago
I feel like normalization could also relate to the normal distribution, but I can't recall the exact details.
upvoted 0 times
...
Kanisha
5 months ago
I remember practicing a question where normalization was described as putting features on a similar scale, so I might lean towards option C.
upvoted 0 times
...
Terrilyn
5 months ago
I think normalization is about scaling features, but I'm not sure if it's specifically about the mean and standard deviation.
upvoted 0 times
...
Paris
5 months ago
Ah yes, normalization is all about centering and scaling the data, so subtracting the mean and dividing by the standard deviation sounds right to me. I'll choose A.
upvoted 0 times
...
Christiane
5 months ago
Hmm, I'm a bit confused on the difference between normalization and standardization. I'll have to think this through carefully.
upvoted 0 times
...
Shanice
5 months ago
I'm pretty sure normalization is about scaling the features to a common range, like 0 to 1, so I'll go with option C.
upvoted 0 times
...
Shalon
5 months ago
Okay, I remember learning that normalization transforms the features to follow a normal distribution, so I'll select B for this one.
upvoted 0 times
...
Ronald
5 months ago
I think I've seen this type of question before. Let me think through the options carefully.
upvoted 0 times
...
Serina
1 year ago
I'm feeling option C. Normalizing to a similar scale just makes good sense. Although, now I'm wondering if I should've just rolled the dice and gone with option D. Keeping things interesting, you know?
upvoted 0 times
...
Felicitas
1 year ago
D is the one for me. Normalizing to different scales? That's just asking for trouble. Variety is the spice of life, but not in my data!
upvoted 0 times
Jeffrey
1 year ago
D) To different scales from each other.
upvoted 0 times
...
Cortney
1 year ago
C) So that they are on a similar scale.
upvoted 0 times
...
Cory
1 year ago
A) By subtracting from the mean and dividing by the standard deviation.
upvoted 0 times
...
...
Ulysses
1 year ago
B, hands down. Transforming features into a normal distribution is where it's at. Gotta love those bell curves!
upvoted 0 times
Susy
1 year ago
B) Into the normal distribution.
upvoted 0 times
...
Gussie
1 year ago
C) So that they are on a similar scale.
upvoted 0 times
...
Selma
1 year ago
A) By subtracting from the mean and dividing by the standard deviation.
upvoted 0 times
...
...
Jesus
1 year ago
I believe it's option C, to make features on a similar scale.
upvoted 0 times
...
Lenny
1 year ago
Hmm, I'd go with A. Subtracting the mean and dividing by the standard deviation is the classic normalization technique, right? Keeps things nice and standardized.
upvoted 0 times
Chu
1 year ago
I would choose C as well. It's important to have features on a similar scale for accurate analysis.
upvoted 0 times
...
Beatriz
1 year ago
I agree with A. It helps keep everything on the same scale.
upvoted 0 times
...
Sina
1 year ago
Definitely, A is the classic normalization technique. It ensures consistency in the data.
upvoted 0 times
...
Noel
1 year ago
I think A is the way to go too. It helps in making sure all the features are on a similar scale.
upvoted 0 times
...
Charlene
1 year ago
I think B is also a valid option. Normalizing into a normal distribution can be useful.
upvoted 0 times
...
Tracey
1 year ago
Yes, you're correct. A is the classic normalization technique.
upvoted 0 times
...
Dalene
1 year ago
Yes, you're right! A is the correct answer. It helps in standardizing the features.
upvoted 0 times
...
...
Glendora
1 year ago
I agree with Luisa, it helps in comparing different features easily.
upvoted 0 times
...
Luisa
1 year ago
I think normalization is about making features on a similar scale.
upvoted 0 times
...
Laquanda
1 year ago
Option C is the way to go! Normalizing features to a similar scale is the way to make sure they're all playing on the same field.
upvoted 0 times
Mariann
1 year ago
Yes, it's important to have features on a similar scale for better analysis and modeling.
upvoted 0 times
...
Mica
1 year ago
I agree, normalizing features to a similar scale helps in comparing them accurately.
upvoted 0 times
...
...
