Splunk Exam SPLK-2002 Topic 1 Question 78 Discussion
Actual exam question for Splunk's SPLK-2002 exam
Question #: 78
Topic #: 1
What is the best method for sizing or scaling a search head cluster?
A. Estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
B. Estimate the total number of searches per day and divide by the number of CPU cores available on the search heads.
C. Divide the number of indexers by three to achieve the correct number of search heads.
D. Estimate the maximum concurrent number of searches and divide by the number of CPU cores per search head.
Suggested Answer: B
by Nicolette at Dec 12, 2023, 02:08 PM
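Both option B and option D come down to dividing an estimated search load by the CPU cores available per search head. The sketch below shows that arithmetic only as an illustration; the function name and the input figures (40 peak concurrent searches, 16 cores per search head) are hypothetical assumptions, not values taken from Splunk's sizing documentation.

```python
import math

# Hypothetical sizing sketch: divide an estimated search load by the CPU cores
# available on each search head, rounding up so capacity is not under-provisioned.
# The figures used below are illustrative assumptions, not Splunk-recommended values.
def search_heads_needed(estimated_searches: int, cores_per_search_head: int) -> int:
    return math.ceil(estimated_searches / cores_per_search_head)

# Example: an assumed peak of 40 concurrent searches on search heads with 16 cores each.
print(search_heads_needed(estimated_searches=40, cores_per_search_head=16))  # -> 3
```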
Contribute your Thoughts:
Tom
10 hours ago
Haha, C is just ridiculous. Might as well roll a die to determine the number of search heads.
upvoted 0 times
Fletcher
3 days ago
Hey, I was going to pick C, but that sounds like a total guess. Dividing indexers by 3? What kind of magic number is that?
upvoted 0 times
Sherly
4 days ago
I disagree; I believe option D (estimate the maximum concurrent number of searches and divide by the number of CPU cores per search head) is the way to go.
upvoted 0 times
Michell
4 days ago
I was about to choose B, but D makes more sense. Gotta account for those peak concurrency numbers, not just total volume.
upvoted 0 times
Abel
7 days ago
Hmm, D seems like the most reasonable approach. Sizing the search head cluster based on the maximum concurrent searches makes the most sense to me.
upvoted 0 times
Viki
12 days ago
I think the best method is A: estimate the maximum daily ingest volume in gigabytes and divide by the number of CPU cores per search head.
upvoted 0 times