Databricks Machine Learning Professional Exam - Topic 12 Question 46 Discussion

Actual exam question from Databricks's Machine Learning Professional exam
Question #: 46
Topic #: 12

Which of the following is a reason for using Jensen-Shannon (JS) distance over a Kolmogorov-Smirnov (KS) test for numeric feature drift detection?

Suggested Answer: D
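As several commenters note, a commonly cited advantage of JS distance is its behavior on large datasets: the KS test produces a p-value, and with enough samples even a practically negligible shift becomes "statistically significant," whereas the JS distance is a bounded [0, 1] effect-size measure that stays near zero for trivial drift. A minimal sketch of that contrast using SciPy (the sample sizes, bin count, and 0.01 mean shift here are illustrative assumptions, not part of the original question):

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)

# Two large samples with a practically negligible mean shift (0.01 std units).
n = 1_000_000
baseline = rng.normal(0.0, 1.0, n)
current = rng.normal(0.01, 1.0, n)

# KS test: with this much data, even the trivial shift yields a tiny p-value,
# flagging "drift" that is practically irrelevant.
ks_stat, p_value = ks_2samp(baseline, current)

# JS distance: bin both samples on shared edges, then compare the resulting
# distributions (scipy normalizes the inputs internally). The value is
# bounded in [0, 1] and stays near 0 for a trivial shift.
bins = np.histogram_bin_edges(np.concatenate([baseline, current]), bins=50)
p, _ = np.histogram(baseline, bins=bins, density=True)
q, _ = np.histogram(current, bins=bins, density=True)
js = jensenshannon(p, q, base=2)

print(f"KS p-value:  {p_value:.2e}")  # tiny -> KS flags drift
print(f"JS distance: {js:.4f}")       # near 0 -> effectively no drift
```

Because JS distance is a bounded magnitude rather than a significance test, a single reusable cutoff (or none at all, for relative monitoring) can work across feature and dataset sizes, which is why commenters also point to it not needing per-test threshold tuning.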

Contribute your Thoughts:

Rhea
11 hours ago
I think option E is a big plus for JS too.
upvoted 0 times
Verlene
6 days ago
Hmm, I'll have to go with the option that doesn't mention any donuts. Can't be too careful these days.
upvoted 0 times
Vincent
11 days ago
D) JS is more robust when working with large datasets. That's crucial for real-world machine learning.
upvoted 0 times
Sylvie
16 days ago
C) None of these reasons. I'm still not convinced JS is better than KS for this use case.
upvoted 0 times
Joanne
21 days ago
E) JS does not require any manual threshold or cutoff determinations. That's a big plus for production use.
upvoted 0 times
Janessa
26 days ago
If I recall correctly, the JS distance is indeed more user-friendly since it doesn't require cutoffs, but I’m not sure if that’s the only reason we should choose it over KS.
upvoted 0 times
Julie
1 month ago
I'm a bit confused about the normalization aspect; I thought both methods had their own ways of handling that.
upvoted 0 times
An
1 month ago
I think I saw a practice question about JS being easier because it doesn't need manual thresholds, which might be a key point for this question.
upvoted 0 times
Rupert
1 month ago
I remember that JS distance is often preferred for its robustness, especially with larger datasets, but I'm not entirely sure if that's the main reason here.
upvoted 0 times
Albert
2 months ago
Ah, I remember this from class. JS is preferred because it's normalized and doesn't need any manual cutoffs. That makes it a more reliable choice, especially for large-scale applications.
upvoted 0 times
Christoper
2 months ago
I'm a bit confused on the specifics here. I know JS and KS are both used for feature drift, but I'm not sure about the advantages of one over the other. Guess I'll have to do some more research.
upvoted 0 times
Venita
2 months ago
Alright, I've got this. JS is more robust than KS when dealing with big data, and it doesn't require manual thresholds. That's gotta be the answer.
upvoted 0 times
Alease
2 months ago
D) JS is more robust when working with large datasets. That's the key advantage I've heard about.
upvoted 0 times
Chery
2 months ago
JS is definitely more robust with large datasets!
upvoted 0 times
Novella
3 months ago
Wait, JS isn't normalized? That sounds odd.
upvoted 0 times
Deonna
3 months ago
A) All of these reasons. The JS distance just seems more versatile overall.
upvoted 0 times
Reita
3 months ago
Okay, let me see. I know JS is a good option for large datasets, but I'm not sure about the other reasons. I'll have to review my notes on feature drift detection.
upvoted 0 times
Marget
3 months ago
Hmm, this seems like a tricky one. I'll need to think through the differences between JS and KS to figure out the best approach.
upvoted 0 times
Cory
2 months ago
I think JS is better for large datasets.
upvoted 0 times
