
Test your Configuration

The Test Event panel serves as a helper tool that can be used at various stages of building your configuration.

The tool allows you to simulate incoming events (such as monitoring events, network events, email or SMS messages) in order to check that the Processing Tree is configured properly, i.e. that the Filters' structure works as expected, or to verify that the rules you configure will match those events and correctly invoke the chosen actions.

It is also helpful for checking the correctness of the data provided and of the conditions set when extracting variables for further use in Rule creation.

Use the Test Panel button in the top right corner of the screen to open the panel.

Note

In order to check the correctness of a Draft without impacting the deployed configuration, make sure you’re running a test in Edit mode. If the Edit mode is set to OFF, the deployed configuration will be used for a test.

Test events can be created by providing data in a dedicated form:

../../_images/test-panel.png

Define the Event type, e.g. email, sms, snmptrap, etc., and the creation time of the Event. Tenant selection is only available to Admin users. Please check Tornado Multitenancy for more information on running Tornado in tenant-aware installations.
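Internally, the test event built through this form corresponds to a JSON document sent to Tornado. A minimal sketch is shown below; the top-level fields follow the Tornado event format, while the payload keys are purely illustrative, since the payload is free-form and depends on the event source:

```json
{
  "type": "email",
  "created_ms": 1718000000000,
  "payload": {
    "subject": "Disk space alert",
    "body": "Host myhost is running low on disk space"
  }
}
```

The payload is what Filters and Rules inspect via their WHERE conditions and variable extractors, so when testing you should shape it like the events your real collectors produce.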

The Enable execution of actions option should be switched OFF if you are only checking the correctness of the draft, and not matching events against the rules within a ruleset.

You can now execute a test by clicking the Run Test button.

The Event is sent to Tornado and the outcome of the operation is reported in the Processing Tree. Following the yellow line, you can see the path that the event has taken. Nodes that matched the event are marked with a full yellow lightning bolt, while partially matched nodes have an empty bolt.

../../_images/processing_tree_with_event.png

Fig. 116 A Processing Tree with an event result

When checking whether an Event matches the rules within a ruleset, the test outcome is indicated by one of the following states:

  • matched: If a rule matched the Event.

  • stopped: If a rule matched the Event and then stopped the execution flow. This happens if the continue flag of the rule is set to false.

  • partially matched: If the WHERE condition of the Rule matched, but it was not possible to process the required extracted variables.

  • not matched: If the rule did not match the Event.
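The stopped state above is driven by the rule's continue flag. The following is a hedged sketch of what such a rule definition might look like in Tornado's JSON rule format (the rule name, WHERE condition, and action are illustrative; refer to the Create a Rule section for the exact schema):

```json
{
  "description": "Match emergency emails and stop further processing",
  "continue": false,
  "active": true,
  "constraint": {
    "WHERE": {
      "type": "contains",
      "first": "${event.payload.subject}",
      "second": "EMERGENCY"
    },
    "WITH": {}
  },
  "actions": [
    {
      "id": "logger",
      "payload": {
        "subject": "${event.payload.subject}"
      }
    }
  ]
}
```

With "continue": false, an event matched by this rule is reported as stopped in the test panel, and no later rules in the same ruleset are evaluated for it.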

In the Test Result tab you can find the list of triggered actions and of extracted and non-extracted variables. Any errors will be displayed in the related Notifications panel. Unlike the graphical view in the Processing Tree, this panel shows all errors regardless of their origin, including all errors generated within an iterator node's branches. Clicking any of the listed errors automatically focuses on the variable or action in the Processing Tree that originated it. The scope of errors is global: unless an error originated within an iterator node, it is shown graphically throughout the whole Processing Tree even if no ruleset has been selected. If it originated within an iteration instead, the error is visible in the Processing Tree only when that specific iteration is selected in the Test Result panel.

Please note that when multiple actions inside a rule are tested sequentially and an error is encountered in any of them, the execution flow is stopped and the subsequent actions are not tested until the error is fixed.

../../_images/error-in-tree.png

Fig. 117 Errors are displayed all over the Processing Tree

If no ruleset is selected when running the Test Event, only errors will be shown but there will be no test results available.

Please select a Ruleset to see the results as prompted in the UI or click on an error to see where it was generated.

As soon as you select the desired ruleset, the list of extracted variables and triggered actions related to it is displayed. Click one of the extracted variables to see its details in the WITH tab of the corresponding rule in the Processing Tree.

../../_images/clicked-error.png

Fig. 118 Automatic focus on rule action or variable with errors detected

../../_images/clicked-extracted-variable.png

Fig. 119 Automatic focus on rule WITH once the variable is clicked

The same behaviour applies to action results. You can also see the complete payload by expanding an action result, and copy the payload in the expanded view.

../../_images/triggered-action.png

Fig. 120 Executed action with expanded payload

If an Iterator node or any of its children is selected, the iterations panel appears. This panel allows you to easily view all the iterations that occurred during the test and inspect them one by one.

../../_images/iteration-list.png

Fig. 121 List of iterations related to an Iterator node

When a ruleset is selected, for each iteration it is possible to see whether that iteration was matched or partially matched by the ruleset. On top of that, any error that was generated within that specific iteration will also be graphically shown in the processing tree.

../../_images/iteration-on-ruleset.png

Fig. 122 Ruleset with a selected iteration

When an iteration is selected, the extracted variables, actions, and non-extracted variables shown in the Test Result tab, as well as the errors shown in the Processing Tree, are contextual to that iteration.