Troubleshooting the Splunk Forwarder


 

This post is one of several I will write about how to troubleshoot Splunk Enterprise. This one is a step-by-step guide to what to look at when troubleshooting a Splunk forwarder.

 

First, check if Splunk is running on the forwarder

For Windows, check the Splunk service in the Services console (services.msc).
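
For example, from a command prompt you can query the service state directly (the service name SplunkForwarder below assumes a universal forwarder installed with the default name):

sc query SplunkForwarder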

For a Linux system, use the command below:

ps -ef | grep splunkd

Or you can use:

cd $SPLUNK_HOME/bin

./splunk status

 

Second, check that receiving is enabled on the indexer on port 9997 and that port 9997 is open on the indexer.

Make sure that receiving is configured as follows:

On the indexer, go to Settings >> Forwarding and receiving and check whether receiving is enabled on port 9997. If it is not enabled, enable it.
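
You can also confirm from the indexer's command line that something is listening on that port. For example, on Linux (use ss on systems where netstat is not installed):

netstat -an | grep 9997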

 

Third, check whether you are able to telnet to the indexer from the forwarder host:

telnet <indexer_name> 9997

If you are not able to telnet to the server, check whether a network or firewall issue is causing the problem.
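
If telnet is not available on the forwarder host, netcat can run the same connectivity test:

nc -vz <indexer_name> 9997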

 

Fourth, confirm on the indexer whether your file has already been indexed. You can do this by using the search query below.

In the Splunk user interface, run the following search:

index=_internal "FileInputTracker" *<path_to_file>*
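
As a complementary check, you can verify that internal data from the forwarder is reaching the indexer at all (a universal forwarder sends its own _internal logs by default); the host value below is a placeholder for your forwarder's hostname:

index=_internal host=<forwarder_host> | head 10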

 

Fifth, check whether the forwarder has completed processing the log file (i.e., the tailing process) by using the URL below:

https://<splunk_forwarder_server_name>:8089/services/admin/inputstatus/TailingProcessor:FileStatus

In the tailing processor output you can check whether the forwarder is having any issues processing the file.
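
You can also query the same REST endpoint from the forwarder host's command line. The credentials below are placeholders, and -k skips certificate validation for the default self-signed certificate:

curl -k -u admin:<password> https://localhost:8089/services/admin/inputstatus/TailingProcessor:FileStatus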

 

Sixth, check the permissions on the log file you are sending to Splunk and verify that the user running Splunk has read access to it.
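
For example, assuming the forwarder runs as a user named splunk, you can check the permissions and confirm read access like this (the path is a placeholder):

ls -l /path/to/logfile

sudo -u splunk head -n 1 /path/to/logfile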

 

Seventh, check the file's last-modification timestamp and verify that the forwarder is monitoring it.
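
For example, on Linux you can view the modification time with:

stat /path/to/logfile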

 

Eighth, verify the inputs.conf and outputs.conf files and make sure they are configured correctly.
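
As a minimal sketch of what a correct configuration typically looks like (the paths, index, sourcetype, group name, and indexer name below are placeholders):

# inputs.conf on the forwarder
[monitor:///path/to/logfile]
index = <index_name>
sourcetype = <sourcetype>

# outputs.conf on the forwarder
[tcpout:primary_indexers]
server = <indexer_name>:9997

You can also run btool on the forwarder to see the effective, merged configuration:

$SPLUNK_HOME/bin/splunk btool inputs list --debug

$SPLUNK_HOME/bin/splunk btool outputs list --debug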

 

Ninth, check the disk space availability on the indexer.

Also check splunkd.log on the forwarder, located at $SPLUNK_HOME/var/log/splunk, for any errors. For example, look for messages from 'TcpOutputProc'; they should give you an indication of what is happening when the forwarder tries to connect to the indexer.
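
For example, on Linux you can check free space on the indexer and pull the relevant connection messages out of the forwarder's splunkd.log like this:

df -h $SPLUNK_HOME

grep TcpOutputProc $SPLUNK_HOME/var/log/splunk/splunkd.log | tail -20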

 

Tenth, use tcpdump to capture traffic on port 9997 and check for any errors (replace eth0 with the network interface on your host):

tcpdump -i eth0 port 9997

 

Eleventh, if you have installed the forwarder on a Linux system, check the ulimit value and raise it to the maximum if it is too low.

ulimit is a limit set by default in Linux on the number of files a process can have open. You can check it with the following command:

ulimit -n

To set a new ulimit, use the following command:

ulimit -n <desired_limit>
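
Note that a ulimit set this way only applies to the current shell session. To make the change persistent, assuming the forwarder runs as a user named splunk, you could add entries such as the following to /etc/security/limits.conf (the value 65536 is just an example):

splunk soft nofile 65536
splunk hard nofile 65536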

 

Twelfth, check metrics.log to see whether any queue is blocked; if one is, resolve the underlying issue.
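
On the forwarder you can look for blocked queues directly in metrics.log, which records blocked=true on the affected queue lines:

grep "blocked=true" $SPLUNK_HOME/var/log/splunk/metrics.log | tail -20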

 

Thirteenth, restart the forwarder.
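
On Linux you can do this from the Splunk CLI; on Windows, restart the Splunk service from the Services console:

cd $SPLUNK_HOME/bin

./splunk restart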