
Splunk Exam SPLK-2002 Topic 1 Question 78 Discussion

Actual exam question for Splunk's SPLK-2002 exam
Question #: 78
Topic #: 1

What is the best method for sizing or scaling a search head cluster?

A) Estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
B) Estimate the total number of searches per day and divide by the number of CPU cores available on the search heads.
C) Divide the number of indexers by three.
D) Estimate the maximum concurrent number of searches and divide by the number of CPU cores per search head.

(Answer choices reconstructed from the quotations in the discussion below; option C is paraphrased, as its exact wording does not appear verbatim.)

Suggested Answer: B
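To make the arithmetic behind the suggested answer (option B) concrete, here is a minimal Python sketch. Every figure in it (daily search count, cores per search head, per-core daily capacity) is a hypothetical placeholder, not Splunk sizing guidance; it only shows how the division in option B turns an estimate into a search head count.

```python
import math

# Hypothetical figures -- illustrative placeholders, not Splunk sizing guidance.
searches_per_day = 120_000         # estimated total searches per day
cores_per_search_head = 16         # CPU cores on each search head
searches_per_core_per_day = 1_000  # assumed daily search capacity of one core

# Option B: divide the total daily searches by the CPU cores available,
# then provision enough search heads to keep per-core load within capacity.
cores_needed = math.ceil(searches_per_day / searches_per_core_per_day)
search_heads_needed = math.ceil(cores_needed / cores_per_search_head)

print(f"Cores needed: {cores_needed}")                # 120
print(f"Search heads needed: {search_heads_needed}")  # 8
```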

Contribute your Thoughts:

Filiberto
1 month ago
Hmm, that's an interesting perspective. I can see how that method could also work well for sizing a search head cluster.
upvoted 0 times
Theron
1 month ago
I disagree; I believe option D) Estimate the maximum concurrent number of searches and divide by the number of CPU cores per search head is the way to go.
upvoted 0 times
Filiberto
1 month ago
I think the best method is A) Estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
upvoted 0 times
Adelina
2 months ago
I personally think option B) Estimate the total number of searches per day and divide by the number of CPU cores available on the search heads makes the most sense.
upvoted 0 times
Tom
2 months ago
Haha, C is just ridiculous. Might as well just roll a die to determine the number of search heads.
upvoted 0 times
Krissy
10 days ago
B) Estimate the total number of searches per day and divide by the number of CPU cores available on the search heads.
upvoted 0 times
Harley
12 days ago
A) Estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
upvoted 0 times
Amber
19 days ago
B) Estimate the total number of searches per day and divide by the number of CPU cores available on the search heads.
upvoted 0 times
Sina
27 days ago
A) Estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
upvoted 0 times
Fletcher
2 months ago
Hey, I was going to pick C, but that sounds like a total guess. Dividing indexers by 3? What kind of magic number is that?
upvoted 0 times
Sherly
2 months ago
I disagree; I believe option D) Estimate the maximum concurrent number of searches and divide by the number of CPU cores per search head is the way to go.
upvoted 0 times
Michell
2 months ago
I was about to choose B, but D makes more sense. Gotta account for those peak concurrency numbers, not just total volume.
upvoted 0 times
Abel
2 months ago
Hmm, D seems like the most reasonable approach. Sizing the search head cluster based on the maximum concurrent searches makes the most sense to me.
upvoted 0 times
Bulah
10 days ago
D does seem like a logical choice for scaling the search head cluster.
upvoted 0 times
Margot
24 days ago
I would go with D as well. Sizing based on concurrent searches is practical.
upvoted 0 times
Silvana
25 days ago
I agree, it seems like the most reasonable approach.
upvoted 0 times
Emerson
1 month ago
I think D is the way to go. Sizing based on maximum concurrent searches makes sense.
upvoted 0 times
Viki
2 months ago
I think the best method is A) Estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
upvoted 0 times

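Several commenters above favor option D, which sizes from peak concurrency rather than daily totals. For comparison, here is a minimal sketch of that arithmetic, again with hypothetical figures and the simplifying assumption that each running search occupies roughly one CPU core:

```python
import math

# Hypothetical figures -- illustrative placeholders, not Splunk sizing guidance.
max_concurrent_searches = 90   # estimated peak number of simultaneous searches
cores_per_search_head = 16     # CPU cores on each search head

# Option D: divide peak concurrency by the cores per search head, assuming
# (as a simplification) one CPU core per running search.
search_heads_needed = math.ceil(max_concurrent_searches / cores_per_search_head)

print(f"Search heads needed at peak: {search_heads_needed}")  # 6
```

The practical difference between the two sketches is what they protect against: option B spreads total daily work across cores, while option D guarantees headroom at the busiest moment, which is why commenters like Michell argue for accounting for peak concurrency rather than total volume.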