What is Logsign Custom Plugin Tool?
The Custom Plugin Tool is used to customize how collected data is normalized and interpreted, as needed. Its features can be summarized as follows:
- The Custom Plugin Tool allows you to create custom parsing rules to extract log data best suited to your corporate needs.
- If there is no predefined parser for the collected event log, Logsign Unified SecOps Platform stores the information but is not able to interpret the data. Therefore, these collected events are called unknown events.
- For example, if you have a surveillance system that supports event log messages and sends log data to Logsign Unified SecOps Platform, or if you are using an electronic health record (EHR) tool, and want to parse patient IDs, login successes and failures, and EHR event types, you can create parsing rules with the Custom Plugin Tool.
How to Start?
You can access the list of existing custom plugins from the Settings > Integrations > Custom Plugin menu and start creating a new custom plugin by clicking the + Custom Plugin button.
Appearance and Basic Rules
Configuration is subject to the following rules:
- In all custom plugins created, the EventSource.Vendor value is set to CustomGenerated, and the EventSource.Product value is set to the text entered in the Product label.
- Parsing and column assignment must not be performed from the logs for the Event.SystemID, EventSource.IP, EventSource.Vendor, and EventSource.Product columns.
- After the plugin is created and the Deploy button is clicked, it should appear in the Vendor list reached via the Settings > Integrations > Data Collection > + Device button.
- Plugins prepared with the Custom Plugin Tool should always appear under the Custom_Generated vendor in the Vendor List.
- In cases where the time format is different, time should not be assigned. When no time assignment is made, the Time.Generated column shows the time at which the Logsign machine received the log.
Parse Methods
Logsign supports seven parse methods. The method is selected according to the type of log to be parsed, and the plugin is then built on it; this choice shapes the plugin's behavior for different log types.
The parse methods are defined as follows:
- Parse with rule: Logs are parsed according to a defined rule list.
- Parse with regex: Logs are parsed according to one or more regular expression rules.
- Parse with multiple rule: Logs are parsed according to more than one rule list.
- Parse with key value: Logs in Key:Value or Key=Value form are parsed.
- Parse with json: Logs in JSON data format are parsed.
- Parse with W3C: Logs received in W3C data format are parsed.
- CEF: Incoming logs are parsed in accordance with the Common Event Format standard.
Parse with rule
In this parse method, the patterns that fit the log structure are selected, in order, from the pattern list below (a small sketch of this matching idea follows the list). After the sample information is defined in the form, it is checked with the Test button. Once the preferred parsing is achieved, the work is saved with the Save button.
Pattern List
- Word: It matches a string consisting of one or more letters from A to Z (case-insensitive).
- Decimal: It matches a positive number consisting of one or more digits.
- IP: It matches a string in IPv4 format.
- MAC: It matches a string in MAC address (48- or 64-bit) format.
- Whitespace: It matches one or more whitespace characters.
- Non-whitespace: It matches one or more non-whitespace characters.
- Pass: It matches the part that comes after the pattern.
- All: It matches everything, regardless of pattern.
- %b %d %H:%M:%S: It matches a datetime in the form May 19 15:04:05.
- %b %d %H:%M:%S %Y: It matches a datetime in the form May 19 15:04:05 2023.
- %a %b %d %H:%M:%S: It matches a datetime in the form Sun May 19 15:04:05.
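To make the matching behavior concrete, here is a minimal, tool-independent sketch in Python. It is not Logsign's implementation; the sample line, the regex equivalents of the patterns, and the column names are all assumptions made for illustration.

import re

sample = "May 19 15:04:05 firewall01 192.168.1.10 login failed"

# Assumed regex equivalents of a few patterns from the list above (illustrative only).
patterns = [
    ("Time.Generated", r"[A-Z][a-z]{2} +\d{1,2} \d{2}:\d{2}:\d{2}"),  # %b %d %H:%M:%S
    ("EventSource.Name", r"\S+"),                                     # Non-whitespace
    ("Source.IP", r"\d{1,3}(?:\.\d{1,3}){3}"),                        # IP
    ("Event.Info", r".*"),                                            # All
]

rest = sample
columns = {}
for column, regex in patterns:
    match = re.match(r"\s*(" + regex + ")", rest)
    if not match:
        break                      # the rule does not fit this log
    columns[column] = match.group(1)
    rest = rest[match.end():]      # continue with the remainder of the line

print(columns)
# {'Time.Generated': 'May 19 15:04:05', 'EventSource.Name': 'firewall01',
#  'Source.IP': '192.168.1.10', 'Event.Info': 'login failed'}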
Example of parse with rule
Parse with regex
It is the parse method used to parse logs that match a regular expression. In this method, you determine which fields are assigned to which columns by writing, in the Regex field, the expression that extracts the columns.
- The log sample to be parsed is pasted into the text box and a regex suitable for this log type is written in the Regex field; whether the regex matches and how it groups the log can be seen by clicking the Check button.
Note: More than one regex can be saved to cover more than one log type.
- For log formats that can arrive in different shapes, column assignments are made over the captured groups after the regex is written, and the work is saved.
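As an illustration only (not the tool itself), the following Python sketch shows the idea behind regex parsing with capture groups; the log line, the regex, and the suggested column names are assumptions.

import re

sample = "2023-05-19 15:04:05 user=alice action=login status=failure"

regex = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"user=(?P<user>\S+) action=(?P<action>\S+) status=(?P<status>\S+)"
)

match = regex.match(sample)
if match:
    # Each group is then assigned to a column in the plugin, for example
    # user -> Source.Username, action -> Event.Action, status -> Event.Result.
    print(match.groupdict())
    # {'date': '2023-05-19 15:04:05', 'user': 'alice', 'action': 'login', 'status': 'failure'}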
Example of regex usage
Parse with multiple rule
- This parse method is essentially the same as Parse with rule, except that it works with more than one rule list. The log is parsed by whichever rule it matches.
- While creating the rules, you can test them on the sample and check that they parse correctly.
- If a rule does not match, or you want to change the column assignments, it can be edited by clicking the "Edit" button.
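A rough sketch of the "first matching rule wins" behavior described above, written as illustrative Python (not the tool's actual logic; both rules and log lines are made up):

import re

# Two hypothetical rules covering two different log shapes.
rules = [
    re.compile(r"(?P<user>\S+) logged in from (?P<ip>\d{1,3}(?:\.\d{1,3}){3})"),
    re.compile(r"connection from (?P<ip>\d{1,3}(?:\.\d{1,3}){3}) denied"),
]

def parse(line):
    for rule in rules:
        match = rule.search(line)
        if match:
            return match.groupdict()   # the first rule that matches parses the log
    return None                        # no rule matched: the event stays unknown

print(parse("alice logged in from 10.0.0.5"))    # {'user': 'alice', 'ip': '10.0.0.5'}
print(parse("connection from 10.0.0.9 denied"))  # {'ip': '10.0.0.9'}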
Example of parse with multiple rule
Parse with key-value
- In the key value method, you work with logs written as key/value structures.
- In the Key Delimiter field, you choose whether the key/value pair is separated by a colon ( : ) or an equals sign ( = ).
- While making column assignments, the key name is specified in the Match field.
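The key/value idea can be pictured with the following Python sketch, assuming an equals-sign delimiter and made-up keys (this is an illustration, not the tool's parser):

# Hypothetical key=value log line.
sample = "user=alice src_ip=10.0.0.5 action=login result=success"

delimiter = "="   # in the plugin, this is the choice between ':' and '='
pairs = {}
for token in sample.split():
    if delimiter in token:
        key, value = token.split(delimiter, 1)
        pairs[key] = value

# In the plugin, a key name such as "src_ip" goes into the Match field
# and is mapped to a column (for example Source.IP).
print(pairs)
# {'user': 'alice', 'src_ip': '10.0.0.5', 'action': 'login', 'result': 'success'}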
Example of parse with key value
Parse with json
- It is the parse method used for JSON log formats. In this method, the key information from the log is entered in the pattern field, and the column to be assigned is selected in the fields part.
- Before deploying the custom plugin, test it with the sample and check that the values are assigned correctly to the relevant columns. After these operations are completed, it is saved.
- While making column assignments, the key name is specified in the Match field.
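The following Python sketch illustrates the idea with a made-up JSON event; the key-to-column assignments are assumptions for illustration only:

import json

sample = '{"user": "alice", "src_ip": "10.0.0.5", "event": "login", "result": "failure"}'

data = json.loads(sample)

# Assumed key -> column assignments, as they would be defined in the plugin form.
assignments = {
    "user": "Source.Username",
    "src_ip": "Source.IP",
    "event": "Event.Action",
    "result": "Event.Result",
}

columns = {column: data[key] for key, column in assignments.items() if key in data}
print(columns)
# {'Source.Username': 'alice', 'Source.IP': '10.0.0.5',
#  'Event.Action': 'login', 'Event.Result': 'failure'}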
Example of parse with json
Parse with W3C
- It is the parse method used for W3C log formats. In the Fields section, the W3C field names are entered separated by commas (,) with no spaces between them.
- In the Delimiter part, the delimiter is selected after checking which character separates the fields in the log example.
- Column assignments are made over the keys specified in the Fields section: the values listed in the Fields part are entered in the Match Field part.
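A minimal sketch of the W3C idea, assuming a space delimiter and made-up field names and values (illustrative Python, not the tool's implementation):

# Field names as they would be entered in the Fields section (comma-separated, no spaces),
# and a matching log line.
fields = "date,time,c-ip,cs-method,cs-uri-stem,sc-status".split(",")
delimiter = " "   # chosen in the Delimiter part after inspecting the log example

sample = "2023-05-19 15:04:05 10.0.0.5 GET /index.html 200"

record = dict(zip(fields, sample.split(delimiter)))

# Column assignment then maps each field name (the Match Field) to a column,
# for example c-ip -> Source.IP (illustrative names).
print(record)
# {'date': '2023-05-19', 'time': '15:04:05', 'c-ip': '10.0.0.5',
#  'cs-method': 'GET', 'cs-uri-stem': '/index.html', 'sc-status': '200'}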
Example of parse with W3C
Parse with CEF
- It is the parse method used for CEF log formats. The supported CEF log format is as follows:
<%b %d %H:%M:%S> <hostname> <CEF:version|device_vendor|device_product|device_version|signature_id|name|severity|extension>
- In the Extension section, key=value pairs can be parsed by selecting the field and then adding the relevant key.
requestMethod=POST -> URL.Method | requestMethod
- Some keys carry a Label tag; these keys are parsed differently.
cs1Label=Error code cs1=-3
- These values in the log are parsed as follows:
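As a rough illustration of the extension parsing described above, the Python sketch below follows the general CEF convention in which the csNLabel key supplies the display name for the matching csN value; it is not Logsign's actual code.

import re

# Hypothetical CEF extension section.
extension = "requestMethod=POST cs1Label=Error code cs1=-3"

# Split into key=value pairs; values may contain spaces, so split only before a new key.
tokens = re.split(r" (?=\w+=)", extension)
pairs = dict(token.split("=", 1) for token in tokens)
# {'requestMethod': 'POST', 'cs1Label': 'Error code', 'cs1': '-3'}

# Keys with a Label counterpart are combined: the Label provides the name, csN the value.
parsed = {}
for key, value in pairs.items():
    if key.endswith("Label"):
        continue
    label = pairs.get(key + "Label")
    parsed[label if label else key] = value

print(parsed)
# {'requestMethod': 'POST', 'Error code': '-3'}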
Example of parse with CEF
Using Event Mapping
What is Event Mapping?
- Event Mapping is used to apply the enrichments needed to make sense of and categorize the logs. When Event Mapping is assigned, the following values are expected in the logs:
- EventMap.Context, EventMap.Type, EventMap.SubType, EventMap.Info, EventMap.ID
Static Event Mapping
- Static Event Mapping is used if the logs represent a single event type, that is, only one EventMap value should be displayed. As seen in the screenshot, a single Event Mapping assignment can be made within the created plugin by specifying the EventMap.Context, EventMap.Type, and EventMap.SubType information.
Event Mapping Rules
- EventMap columns can never be created from within the log. They can only be populated by assigning a mapping file (which must have a .json extension) or a static mapping.
- In addition to the EventMap columns listed above, the Event.VendorID, Severity.Name, and Event.Info columns can be added in the JSON file (if no assignment is made from the log).
Creating an Event Mapping File
- The Event Mapping file is created and imported as JSON. The file format is as follows:
{ "Authentication error": { "EventMap.Context": "Identity", "EventMap.Type": "User", "EventMap.SubType": "Deny" }, "Authentication successful": { "EventMap.Context": "Identity", "EventMap.Type": "User", "EventMap.SubType": "Login" } }
- The explanation of the usage example is as follows:
- The Authentication error and Authentication successful values can appear in the column selected as the Event Key.
- When the Event Key value is Authentication error, the EventMap.Context: Identity, EventMap.Type: User, EventMap.SubType: Deny values are included in the log.
- When the Event Key value is Authentication successful, the EventMap.Context: Identity, EventMap.Type: User, EventMap.SubType: Login values are included in the log.
Note: One or more columns can be selected as the Event Key. More than one column can be selected because some logs need to be categorized using more than one piece of information.
- When the Default Mapping checkbox is selected after the file is imported, it means:
- If the value of the column selected as the Event Key is not found in the mapping file, the values given statically in the default mapping are included in the log.
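Putting the mapping file, the Event Key, and the Default Mapping together, the lookup behaves roughly like the Python sketch below. It is illustrative only; the default values and the unknown Event Key value are assumptions.

import json

# The mapping file shown above.
mapping = json.loads("""
{
  "Authentication error":      {"EventMap.Context": "Identity", "EventMap.Type": "User", "EventMap.SubType": "Deny"},
  "Authentication successful": {"EventMap.Context": "Identity", "EventMap.Type": "User", "EventMap.SubType": "Login"}
}
""")

# Hypothetical static values entered as the Default Mapping.
default_mapping = {"EventMap.Context": "Identity", "EventMap.Type": "User", "EventMap.SubType": "Other"}

def enrich(event_key_value):
    # Use the file entry if the Event Key value is found, otherwise fall back to the default mapping.
    return mapping.get(event_key_value, default_mapping)

print(enrich("Authentication error"))   # values from the file (SubType: Deny)
print(enrich("Password expired"))       # default mapping values (SubType: Other)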
Example of parse with CEF, File mapping, default mapping
Custom Plugin Deployment
- After the custom plugin is created, the plugin is deployed by clicking the Settings > Integrations > Custom Plugin Tool > Deploy button.
Features of Editing
- If a completed plugin needs to be changed or updated, or if the log fields have changed after an update on the part of the integrated product, new parse rules can be created by editing the custom plugin. This feature gives the Custom Plugin Tool flexibility and adaptability.
- Fields that have previously been assigned a column can be edited by clicking the button.
- By clicking the + Add Item button, a new column is created that can be used for column assignment.
- When a plugin that uses a Mapping File is edited, the JSON file previously created and imported into the plugin can be displayed as follows.