
Databricks Certified Professional Data Scientist Exam - Topic 3 Question 26 Discussion

Actual exam question from the Databricks Certified Professional Data Scientist exam
Question #: 26
Topic #: 3

Consider flipping a coin for which the probability of heads is p, where p is unknown, and our goal is to estimate p. The obvious approach is to count how many times the coin came up heads and divide by the total number of coin flips. If we flip the coin 1000 times and it comes up heads 367 times, it is very reasonable to estimate p as approximately 0.367. However, suppose we flip the coin only twice and we get heads both times. Is it reasonable to estimate p as 1.0? Intuitively, given that we only flipped the coin twice, it seems a bit rash to conclude that the coin will always come up heads, and ____________ is a way of avoiding such rash conclusions.

Suggested Answer: B
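
For anyone studying this one: the comments below point to Laplace smoothing, which appears to be what suggested answer B referss to (assumed here, since the option text isn't shown on this page). A minimal Python sketch of the idea, with illustrative function names:

def mle_estimate(heads, flips):
    # Maximum-likelihood estimate: the raw fraction of heads.
    return heads / flips

def laplace_estimate(heads, flips):
    # Laplace (add-one) smoothing: pretend we saw one extra head
    # and one extra tail before looking at the data.
    return (heads + 1) / (flips + 2)

print(mle_estimate(367, 1000))      # 0.367
print(laplace_estimate(367, 1000))  # ~0.367 -- large samples are barely affected

print(mle_estimate(2, 2))      # 1.0 -- the "rash" conclusion
print(laplace_estimate(2, 2))  # 0.75 -- pulled back toward 0.5

With plenty of data the smoothed estimate agrees with the raw count, but on two flips it refuses to conclude p = 1.0, which is exactly the behavior the question describes.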

Contribute your Thoughts:

Ashley
4 months ago
But what if the coin is rigged? Wouldn't that change everything?
upvoted 0 times
...
Catalina
4 months ago
I think Laplace smoothing is the right call here!
upvoted 0 times
...
In
4 months ago
Wait, so Laplace smoothing really helps with this? Sounds too good to be true.
upvoted 0 times
...
Anissa
4 months ago
Totally agree, can't just assume p is 1 after 2 flips.
upvoted 0 times
...
Ashley
4 months ago
Seems like a solid way to avoid overconfidence!
upvoted 0 times
...
Malcom
5 months ago
I vaguely recall that naive Bayes is more about classification, so it might not apply here. I think Laplace smoothing is definitely the way to go!
upvoted 0 times
...
Ben
5 months ago
I feel like Laplace smoothing makes sense because it helps avoid extreme estimates, especially with small sample sizes.
upvoted 0 times
...
Domingo
5 months ago
I think we practiced something similar with smoothing techniques, but I’m not entirely sure if Laplace smoothing is the right answer here.
upvoted 0 times
...
Tamesha
5 months ago
I remember discussing how estimating probabilities from very few samples can lead to overconfidence, like assuming p is 1.0 after just two heads.
upvoted 0 times
...
Pa
5 months ago
Hmm, I'm a bit unsure about this one. I'll need to review the options carefully to make sure I choose the right answer.
upvoted 0 times
...
