CompTIA DY0-001 Exam - Topic 3 Question 2 Discussion

Actual exam question for CompTIA's DY0-001 exam
Question #: 2
Topic #: 3
[All DY0-001 Questions]

A data scientist is analyzing a data set with categorical features and would like to make those features more useful when building a model. Which of the following data transformation techniques should the data scientist use? (Choose two.)

Suggested Answer: B, D

One-hot encoding creates binary indicator columns for each category, allowing models to treat nominal categories without implying any order.
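As a minimal illustration (using pandas; the column name and values are hypothetical, not from the original question), one-hot encoding expands a single categorical column into one binary indicator column per category:

```python
import pandas as pd

# Hypothetical categorical feature for illustration.
df = pd.DataFrame({"color": ["red", "green", "blue", "green"]})

# pd.get_dummies creates one 0/1 indicator column per distinct category.
encoded = pd.get_dummies(df, columns=["color"])
print(encoded.columns.tolist())
# ['color_blue', 'color_green', 'color_red']
```

Because no column is "greater than" another, the model sees the categories as unordered, which is exactly what you want for nominal data.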

Label encoding maps categories to integer labels, which can be useful for tree-based models or when you need a single numeric column (though you must ensure the algorithm can handle the implied ordinality appropriately).
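A quick sketch of label encoding with pandas category codes (the column name and values are illustrative assumptions; codes are assigned alphabetically by default, so pass an ordered `Categorical` when the data is truly ordinal):

```python
import pandas as pd

# Hypothetical ordinal-looking feature for illustration.
df = pd.DataFrame({"size": ["small", "large", "medium", "small"]})

# .cat.codes assigns one integer per distinct category
# (alphabetical order here: large=0, medium=1, small=2).
df["size_code"] = df["size"].astype("category").cat.codes
print(df["size_code"].tolist())
# [2, 0, 1, 2]
```

The result is a single numeric column, which is compact but does impose an ordering; tree-based models typically tolerate this, while linear models may not.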


Contribute your Thoughts:

Rolande
2 months ago
Pivoting could help in some cases, but not a primary method here.
upvoted 0 times
...
Valentin
2 months ago
Wait, is linearization even a thing for categorical data? Sounds off.
upvoted 0 times
...
Tony
2 months ago
One-hot encoding is definitely a go-to for categorical features!
upvoted 0 times
...
Tish
3 months ago
I think label encoding can be useful too, especially for ordinal data.
upvoted 0 times
...
Tamesha
3 months ago
Normalization? Not really needed for categorical features.
upvoted 0 times
...
Ardella
3 months ago
I vaguely recall something about pivoting, but I don't think that's what we need here. One-hot encoding definitely sounds right!
upvoted 0 times
...
Deeanna
3 months ago
I feel like normalization and scaling are more for numerical data, so I’m leaning towards one-hot and label encoding.
upvoted 0 times
...
Gracia
4 months ago
I'm not entirely sure, but I think label encoding might also be useful. It can convert categories to numbers, right?
upvoted 0 times
...
Arlette
4 months ago
I remember we talked about one-hot encoding in class; it seems like a good choice for categorical features.
upvoted 0 times
...
Dudley
4 months ago
One-hot encoding is definitely one of the techniques I would use. For the other, I'm torn between label encoding and normalization. I'll have to think about which one would be more appropriate here.
upvoted 0 times
...
Annamaria
4 months ago
I'm confident that one-hot encoding is the way to go for this. It creates new binary columns for each category, which should help the model work with the categorical data. Not sure about the other option though.
upvoted 0 times
...
Eveline
4 months ago
Okay, let me think this through. One-hot encoding converts categorical variables into a format that algorithms can better understand. And label encoding assigns numerical labels to each category. Those sound like the right choices for this question.
upvoted 0 times
...
Nu
5 months ago
I'm a bit unsure about this one. I know one-hot encoding is used for categorical features, but I'm not sure about the other option.
upvoted 0 times
...
Bev
5 months ago
Hmm, this seems straightforward. I think one-hot encoding and label encoding are the two techniques I should use here.
upvoted 0 times
...
Twana
7 months ago
Normalization? Scaling? What is this, a beauty pageant? Give me those one-hot and label encoders any day! Let's keep it simple, folks.
upvoted 0 times
Kristel
5 months ago
Normalization and scaling can be useful too, but sometimes simpler is better.
upvoted 0 times
...
Phillip
6 months ago
I agree, one-hot encoding and label encoding are the way to go.
upvoted 0 times
...
...
Dustin
8 months ago
I'm with Elza on this one. B and D are the way to go. Although, I do like Ellen's 'make everything linear' approach. Sounds like a real time-saver!
upvoted 0 times
Jess
7 months ago
I'm not sure about linearization, but I definitely think B and D are the way to go.
upvoted 0 times
...
Val
7 months ago
I think Ellen's idea of linearizing everything is interesting, but I still prefer B and D.
upvoted 0 times
...
Elin
7 months ago
I agree with Elza, B and D are the best choices.
upvoted 0 times
...
...
Ellen
8 months ago
Hmm, I was thinking C and F. Linearization and pivoting, you know? Who needs all that fancy one-hot stuff when you can just make everything linear, right? *wink wink*
upvoted 0 times
...
Elza
8 months ago
Definitely B and D. One-hot encoding to create binary columns, and label encoding to turn the categories into numerical values. Easy peasy!
upvoted 0 times
Peter
7 months ago
Both techniques will definitely make the categorical features more useful for modeling.
upvoted 0 times
...
Wynell
7 months ago
Label encoding is also useful for turning categories into numerical values.
upvoted 0 times
...
Aretha
7 months ago
I agree, one-hot encoding is great for creating binary columns.
upvoted 0 times
...
...
Jeniffer
8 months ago
I'm not sure about Label encoding. I think Normalization and Scaling would be more useful for building a model.
upvoted 0 times
...
Stefanie
8 months ago
I agree with Ona. One-hot encoding helps with categorical variables and Label encoding assigns a unique numerical value to each category.
upvoted 0 times
...
Ona
8 months ago
I think the data scientist should use One-hot encoding and Label encoding.
upvoted 0 times
...
Corinne
8 months ago
I'm pretty sure one-hot encoding and label encoding are the way to go here. Normalizing and scaling won't help with categorical features.
upvoted 0 times
Terrilyn
8 months ago
Pivoting and linearization wouldn't be useful for transforming categorical features.
upvoted 0 times
...
Ardella
8 months ago
Normalization and scaling are more for numerical features, not categorical ones.
upvoted 0 times
...
Kate
8 months ago
I agree, one-hot encoding and label encoding are the best choices for categorical features.
upvoted 0 times
...
...
