Executable Input
Overview
This integration executes commands and parses their output, given in one of the supported formats, to collect metrics.
Setup
The exec monitoring is accomplished by the bridge plugin, which is included with the SolarWinds Snap Agent by default. Follow the directions below to enable it for an agent instance.
The bridge plugin utilizes the Telegraf Exec plugin.
Configuration
The agent provides an example task file to help you get started quickly, but it requires you to provide the correct settings for your installation. In this example, a script called /tmp/test.sh, a script called /tmp/test2.sh, and all scripts matching the glob pattern /tmp/collect_*.sh are configured for the JSON format. Glob patterns are matched on every run, so new scripts that match the pattern are picked up immediately. To enable the task:
-
Make a copy of the exec example task file task-bridge-exec.yaml.example, renaming it to task-bridge-exec.yaml.
To do this on a Windows machine, use File Explorer or open PowerShell and use the command:
copy "C:\ProgramData\SolarWinds\Snap\tasks-autoload.d\task-bridge-exec.yaml.example" "C:\ProgramData\SolarWinds\Snap\tasks-autoload.d\task-bridge-exec.yaml"
To do this on a Linux machine, use the following command in the command line:
sudo cp -p /opt/SolarWinds/Snap/etc/tasks-autoload.d/task-bridge-exec.yaml.example /opt/SolarWinds/Snap/etc/tasks-autoload.d/task-bridge-exec.yaml
-
Edit the task file with settings specific to your case, for example:
---
version: 2
schedule:
  type: cron
  interval: "0 * * * * *"
plugins:
  - plugin_name: bridge
    config:
      exec:
        ## Commands array. Please use linux path format on all platforms.
        commands:
          - "/tmp/test.sh"
          - "/usr/bin/mycollector --foo=bar"
          - "/tmp/collect_*.sh"
          - "C:/Users/user/mycollector.bat"
        ## Timeout for each command to complete.
        timeout: "5s"
        ## measurement name suffix (for separating different commands)
        name_suffix: "_mycollector"
        ## Data format to consume: "collectd", "graphite", "influx", "json", "nagios", or "value".
        ## Each data format has its own unique set of configuration options, read more about them here:
        ## https://github.com/influxdata/telegraf/blob/master/docs/DATA_FORMATS_INPUT.md
        data_format: "json"
        ## Metric name prefix
        # bridge_prefix: ""
    publish:
      - plugin_name: publisher-appoptics
-
Restart the agent:
To do this on a Windows machine, use the following command in the command line:
net stop swisnapd
net start swisnapd
To do this on a Linux machine, use the following command in the command line:
sudo service swisnapd restart
-
Enable the Exec integration in AppOptics
On the Integrations Page you will see the Exec integration available if the previous steps were successful. It may take a couple of minutes before the Exec integration is identified. Select the Exec integration to open the configuration menu in the UI, and enable it. If you do not see it, see Troubleshooting Linux.
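For reference, a collector only needs to print output in one of the supported formats to stdout. A minimal sketch, assuming a hypothetical script saved as /tmp/collect_load.sh so that it matches the /tmp/collect_*.sh glob from the example task (the load1 metric name is illustrative, not defined by the product):

```shell
#!/bin/bash
# Hypothetical collector script: emits the 1-minute load average as JSON.
# Saved as e.g. /tmp/collect_load.sh to match the /tmp/collect_*.sh glob.
load1=$(awk '{print $1}' /proc/loadavg)
echo "{\"load1\": ${load1}}"
```

Any executable that prints a supported format works the same way; the agent runs it on each schedule tick and parses whatever it writes to stdout.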
Let's say that we have the above configuration, and mycollector outputs the following JSON:
{
"a": 0.5,
"b": {
"c": 0.1,
"d": 5
}
}
The collected metrics will be stored as fields under the measurement "exec_mycollector":
exec_mycollector a=0.5,b_c=0.1,b_d=5 1452815002357578567
If using JSON, only numeric values are parsed and turned into floats. Booleans and strings will be ignored.
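As a sketch of that rule, consider a hypothetical script that emits a mix of types. With data_format: "json", only a, b_c, and b_d would become fields; the boolean ok and the string host would be dropped (a string key can still be used as a tag via the tag_keys field, described below under Tags):

```shell
#!/bin/bash
# Illustrative only: with the "json" data format, the numeric values
# (a, b_c, b_d) become fields, while the boolean "ok" and the
# string "host" are ignored unless promoted to tags via tag_keys.
echo '{"a": 0.5, "b": {"c": 0.1, "d": 5}, "ok": true, "host": "web1"}'
```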
See more examples in the Telegraf Exec Input Plugin docs.
Testing Integration
To check whether and which metrics can be collected with a given configuration, run the bridge plugin in debug mode.
To do this on a Windows machine, use the following command in the command line:
"C:\Program Files\SolarWinds\Snap\bin\snap-plugin-collector-bridge.exe" --debug-mode --plugin-config "{\"exec\": {\"commands\": [\"C:/Users/user/mycollector.bat\"], \"data_format\": \"json\"}}"
To do this on a Linux machine, use the following command in the command line:
/opt/SolarWinds/Snap/bin/snap-plugin-collector-bridge --debug-mode --plugin-config "{\"exec\": {\"commands\": [\"/usr/bin/mycollector --foo=bar\"], \"data_format\": \"json\"}}"
Metrics and Tags
The metrics that are generated depend on how this integration is configured.
Tags
Tags can be set dynamically based on property-key names defined in the tag_keys field. To do this, the executed command must return data in a format that distinguishes between a tag name, a tag value, and a metric value. In your task file, set the data format to json and define the tags to be set in the tag_keys field.
Example
This example tags metrics dynamically, resulting in a metric named exec.exec_mycollector.value2 with a value of 5 and a value tag set to 42.
In the exec command file ~/exec-collector.sh
:
#!/bin/bash
echo '{"value":42,"value2":5}'
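Before scheduling the task, you can confirm the script prints valid JSON by running it manually. A sketch using /tmp instead of the home directory (adjust the path to match your task file):

```shell
# Create the example collector and verify its output by hand
# before wiring it into the task file.
cat > /tmp/exec-collector.sh <<'EOF'
#!/bin/bash
echo '{"value":42,"value2":5}'
EOF
chmod +x /tmp/exec-collector.sh
/tmp/exec-collector.sh
# -> {"value":42,"value2":5}
```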
In the task file:
---
version: 2
schedule:
  type: cron
  interval: "0 * * * * *"
plugins:
  - plugin_name: bridge
    config:
      exec:
        ## Commands array
        commands:
          - "/Users/user.name/exec-collector.sh"
        ## Timeout for each command to complete.
        timeout: "5s"
        ## measurement name suffix (for separating different commands)
        name_suffix: "_mycollector"
        ## Data format to consume: "collectd", "graphite", "influx", "json", "nagios", or "value".
        ## Each data format has its own unique set of configuration options, read more about them here:
        ## https://github.com/influxdata/telegraf/blob/master/docs/DATA_FORMATS_INPUT.md
        data_format: "json"
        ## Metric name prefix
        # bridge_prefix: ""
        tag_keys:
          - "value"
    publish:
      - plugin_name: publisher-appoptics
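With that configuration, the parsed measurement would look roughly like the following line-protocol sketch (the timestamp is a placeholder): value is promoted to a tag by tag_keys, and value2 remains a numeric field.

```
exec_mycollector,value=42 value2=5 <timestamp>
```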
Navigation Notice: When the APM Integrated Experience is enabled, AppOptics shares a common navigation and enhanced feature set with other integrated experience products. How you navigate AppOptics and access its features may vary from these instructions.
The scripts are not supported under any SolarWinds support program or service. The scripts are provided AS IS without warranty of any kind. SolarWinds further disclaims all warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The risk arising out of the use or performance of the scripts and documentation stays with you. In no event shall SolarWinds or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the scripts or documentation.