In the Sensu observability pipeline, checks generate events, which Sensu then filters, transforms, and processes. A mutator is a component that transforms the event data. For example, a mutator can transform a Sensu event into a different JSON structure that you can send to a data platform’s API via a Sensu handler.
Traditionally, all Sensu mutators have been pipe mutators. A pipe mutator includes an executable command, usually provided by a Sensu plugin, that runs on a Sensu backend. When the backend processes an event, the pipe mutator runs as an external process that receives the event data via STDIN, transforms it, and returns the result to the backend via STDOUT.
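For illustration only, a pipe mutator executable can be any program that follows this STDIN/STDOUT contract. Here is a minimal, hypothetical sketch (not part of this guide's example) that adds a single field to the event:

```python
#!/usr/bin/env python3
"""Hypothetical pipe-mutator sketch: read a Sensu event as JSON on STDIN,
add a field, and write the transformed event to STDOUT."""
import json
import sys


def mutate(event):
    # Tag the event so a downstream handler can see it passed through here.
    event["mutated"] = True
    return event


if __name__ == "__main__":
    # Read the whole event from STDIN (skip if run interactively or with no input).
    raw = sys.stdin.read() if not sys.stdin.isatty() else ""
    if raw.strip():
        json.dump(mutate(json.loads(raw)), sys.stdout)
```

Because the backend must fork a new process like this for every event the mutator handles, pipe mutators can become a bottleneck at high event volumes.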
Performance and flexibility as you scale
In contrast, JavaScript mutators are evaluated directly in the Sensu backend's embedded JavaScript runtime, which avoids the overhead of spawning an external process for every event. In this example, you'll use a JavaScript mutator that adds a `datastore` attribute to the event, which is derived from the event status:

- If the event status is OK (`0`), add the `datastore` attribute with the value `DB_archive`.
- If the event status is non-OK (warning (`1`), critical (`2`), or any other status), add the `datastore` attribute with the value `DB_exceptions`.
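The pipeline below references a mutator named `assign_datastore`. A JavaScript mutator implementing the logic above might look like the following sketch. The exact `eval` expression and the choice to store the attribute as a check annotation are assumptions for illustration, not a verbatim definition:

```yaml
---
type: Mutator
api_version: core/v2
metadata:
  name: assign_datastore
spec:
  type: javascript
  eval: >-
    data = JSON.parse(JSON.stringify(event));
    if (data.check.status == 0) {
      data.check.annotations["datastore"] = "DB_archive";
    } else {
      data.check.annotations["datastore"] = "DB_exceptions";
    }
    return JSON.stringify(data);
```

Note that a JavaScript mutator's `eval` expression must return the mutated event as a JSON string, which the backend then passes to the handler.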
Create a handler
Now you'll need a handler to process mutated events. In a real-life setting, the handler might inspect the new `datastore` attribute to select the appropriate datastore folder for long-term event storage. For this example, however, all we want to do is look at the processed events to understand how the mutator transformed their data, so we'll create a debug handler that just prints events to a JSON file.
Add the following handler definition:
```yaml
---
type: Handler
api_version: core/v2
metadata:
  name: debug
spec:
  type: pipe
  command: cat > /var/log/sensu/debug-event.json
  timeout: 2
```
Configure a pipeline
Configure a pipeline that includes your new `debug` handler in a single workflow. Add this pipeline definition:
```yaml
---
type: Pipeline
api_version: core/v2
metadata:
  name: jsmutator_test
spec:
  workflows:
    - name: datastores
      mutator:
        name: assign_datastore
        type: Mutator
        api_version: core/v2
      handler:
        name: debug
        type: Handler
        api_version: core/v2
```
Reference a pipeline in a check
To start sending events to your new pipeline, you’ll need a check to supply the events.
If you already have a check you want to use, update your existing check definition with the following stanza, which adds the `jsmutator_test` pipeline reference:
```yaml
pipelines:
  - api_version: core/v2
    name: jsmutator_test
    type: Pipeline
```
If you do not have a check, follow these steps to add one:
- Add the Sensu CPU usage check dynamic runtime asset so it is available for your check to use:

  ```shell
  sensuctl asset add sensu/check-cpu-usage:0.2.2 -r check-cpu-usage
  ```
- Add the `system` subscription to an entity:

  ```shell
  sensuctl entity update <entity_name>
  ```

  - For `Entity Class`, press enter.
  - For `Subscriptions`, type `system` and press enter.
- Create a check definition that uses the check-cpu-usage asset and references your pipeline:
  ```yaml
  type: CheckConfig
  api_version: core/v2
  metadata:
    name: check_cpu
  spec:
    command: check-cpu-usage -w 1 -c 2
    interval: 15
    pipelines:
      - api_version: core/v2
        name: jsmutator_test
        type: Pipeline
    publish: true
    round_robin: false
    runtime_assets:
      - check-cpu-usage
    subscriptions:
      - system
  ```
NOTE: This check is based on the `check_cpu` check in Monitor server resources with checks.
Explore the results
As soon as your check starts sending events, you can print the contents of the file your debug handler creates to explore your mutator's output. Based on the file path in the debug handler definition, run `cat /var/log/sensu/debug-event.json`.
If the most recent check execution produced an event with an OK status, the event data should include `datastore: DB_archive`. If the most recent event had a warning or critical status, the event data should include `datastore: DB_exceptions` instead.
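For reference, here is an abridged, illustrative excerpt of what the event data in the debug file might contain after mutation. The placement of the `datastore` attribute under check annotations is an assumption, and real events include many more attributes:

```json
{
  "check": {
    "metadata": { "name": "check_cpu" },
    "status": 0,
    "annotations": {
      "datastore": "DB_archive"
    }
  }
}
```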
In regular usage, you probably won’t send all of your events to a debug file. Instead, you could send your mutated events to a service like Sumo Logic or InfluxDB for storage, analysis, and visualization. Read Sensu Plus to transmit your Sensu observability data to Sumo Logic or Populate metrics in InfluxDB with handlers to send data to InfluxDB.
Make sure to join our Discourse community, where you can share with and learn from other Sensu users and get updates on the latest Sensu product releases.