
Splunk Scripted Input, or how to use scripts to collect data on the operation of your systems and analyze it in Splunk

Earlier we wrote about how to load logs into Splunk from a directory or via syslog, and how to pick up standard Windows and Linux events. But what if we need more granular information about how our systems operate?
In this case, scripts come to the rescue!



Read on to find out when, what, and how you can use scripts to get data into Splunk.

Typical Uses


Scripts are often used in cases where:

• the data is not written to a log file, but can be obtained from the output of a command or utility (for example, vmstat or iostat);
• the data has to be requested from an API, database, or message queue;
• the data needs to be filtered or transformed before it is sent to Splunk.

For a script, you can set the interval at which it will be executed and transfer data to Splunk.

As scripts, you can use shell scripts, Python scripts, Windows batch files, PowerShell, or any other utility that can generate data and write it to standard output.
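
For instance, a minimal shell equivalent of the Python script shown later in this article might look like the sketch below (the directory path is just a placeholder, and du -sb assumes GNU coreutils):

#!/bin/sh
# Print one event per run: timestamp, directory path, size in bytes
DIR=/some/dir
echo "$(date '+%Y.%m.%d %H:%M:%S') $DIR $(du -sb "$DIR" | cut -f1)"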

Example


In this article, we will walk through an example of loading data using a script.

Suppose we have a file server with a directory whose size we need to monitor, and we want to make sure it does not exceed a threshold value (45 MB for this test). Let's write a script that measures the size of this directory every 30 seconds, and create an alert that notifies us when the threshold is exceeded.
The folder size will be measured by the script below, which outputs a timestamp, the path to the folder, and its size in bytes.

import os
from datetime import datetime

dir_path = "///for_script"

# Walk the directory tree and sum the sizes of all files in it
def get_size(start_path='.'):
    total_size = 0
    for dirpath, dirnames, filenames in os.walk(start_path):
        for f in filenames:
            fp = os.path.join(dirpath, f)
            total_size += os.path.getsize(fp)
    return total_size

# One event per run: timestamp, folder path, size in bytes
time_of_event = datetime.strftime(datetime.now(), "%Y.%m.%d %H:%M:%S")
print(time_of_event, dir_path, get_size(dir_path))
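
Before handing the script over to Splunk, it is worth running it locally to make sure it prints a single well-formed line (we save it as foldersize.py, the name used in the inputs.conf stanza below):

python foldersize.py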

We discussed loading data from remote sources in more detail in previous articles (here and here), so we will only cover it briefly now.

We need:

• A remote machine with Splunk Universal Forwarder installed
• A Splunk indexer, on which we create the send-to-indexer application, move it to deployment-apps, and configure Forwarder management

Also on the Splunk indexer, we create the monitor_scripts application and move it to the deployment-apps folder. In the application, we create a local folder with an inputs.conf file containing the following:

[script://./bin/scripts/foldersize.py]
disabled = false
index = test_script
interval = 30.0
sourcetype = test_script

We also add our script to the application's bin/scripts directory, the path referenced in the stanza above.

Restart the deployment server:
.../splunk/bin/splunk reload deploy-server

And... we get the data!
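
To check that events are arriving, we can run a simple search over the new index (index and sourcetype names as configured above):

index=test_script sourcetype=test_script | head 10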

Data processing and creating an alert


Splunk automatically picked up the timestamp, but the rest of the information remained raw, so we need to extract the fields (we wrote about how to do this in a previous article). In this case, we extracted two fields: the folder path (folder_path) and the size in bytes (size).

The folder size is reported in bytes; let's convert this number to MB. (This could be done in the script itself, but we will show how to do it in Splunk.)

Create a new calculated field (Settings - Fields - Calculated fields - New).
Specify the source type of our data, the name of the new field, and the expression used to calculate it. From now on, this calculated field will be added to all events with the specified source type.
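
In props.conf terms, the same calculated field would look roughly like this (size_mb is simply the field name we chose; size is the extracted field in bytes):

[test_script]
EVAL-size_mb = round(size/1024/1024, 2)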



We now have all the fields we need, so let's create a graph that shows how the folder size changes over time and whether it reaches the threshold value.
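
A search along these lines could drive such a graph (a sketch, using the index and field names defined above):

index=test_script sourcetype=test_script | timechart max(size_mb) AS folder_size_mb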



Now let's create an alert: when the folder size exceeds 45 MB, Splunk will send us an email. We wrote in more detail about sending alerts by e-mail here, and to Slack here.

The alert will be based on a new search query, so that fields from the query can be inserted into the alert message; a sketch of such a query is shown below.
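
A minimal sketch, assuming the index and field names defined above:

index=test_script sourcetype=test_script
| stats latest(size_mb) AS size_mb BY folder_path
| where size_mb > 45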



We save the search as an alert and specify its trigger conditions:




And we receive an email:



In the alert settings, we specified that if the folder size does not decrease within 15 minutes, the alert will be sent again.
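
In savedsearches.conf terms, this throttling corresponds roughly to the following settings (the stanza name is just the name we gave our alert):

[Folder size alert]
alert.suppress = 1
alert.suppress.period = 15m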

Conclusion


In this simple example, we showed the principle of loading data into Splunk through scripts. You can create a script that solves your own problem: upload the necessary information to Splunk and quickly get results.

We hope that this information will be useful for you.

We are happy to answer your questions and comments on this topic. And if you are interested in something specific in this area, or in machine data analysis in general, we are ready to adapt existing solutions to your particular task. To do so, write about it in the comments or simply send us a request through the form on our website.

We are the official Premier Splunk Partner.



PS


On June 28, 2018, the "Splunk Getting Started" course will be held in Moscow, where over 6 hours participants will gain a theoretical foundation and practical skills for working with Splunk. Learn more about the training and register at this link.

Source: https://habr.com/ru/post/353892/

