
[Bug] CLI Command import-rules Prompts Threshold Related Values to be Submitted #3266

Closed
@terrancedejesus

Description


Related Community Thread

Overview

If a community member wants to use the import-rules CLI command to point to an NDJSON file loaded with custom rules, the rule_prompt method in cli_utils.py is called, passing along the rule contents.

Bugs

  • The issue with this is that rule_prompt, specifically for threshold rules, will ask for values to be submitted during the process and will not continue until they are provided. This is supposed to be an automated approach to converting a list of rules in JSON format to TOML files.
  • Rules that rely on the NewTermsRuleData dataclass have a different structure in TOML format versus the Kibana API (JSON) format. We programmatically handle the transformation of New Terms rules so they are valid going from TOML -> JSON; however, we do not expose a way to do the opposite (JSON -> TOML), so schema validation fails when the dataclass is loaded from the JSON object. As a result, import-rules will not currently work with New Terms rules exported from Kibana (see the sketch after this list).
  • import-rules dumps all of the TOML rule files into the detection-rules/rules/ folder. At the moment, this causes some unit tests to fail because they require rules to be placed in the appropriate subfolders under rules/. We may need to expose a global environment variable that overrides this requirement and allows custom user rules to be saved in a folder of choice.
  • Build-time fields such as related_integrations and required_fields are valid in the Kibana API JSON format, but they are built from the data within a TOML rule and are not hardcoded into it. As a result, they have to be removed when going JSON -> TOML; they are programmatically rebuilt when going TOML -> JSON through the rule loader.
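To illustrate the New Terms reshaping mentioned above, here is a rough sketch of the kind of transform a JSON -> TOML path would need. The nested layout below (a new_terms mapping with field/value entries and a nested history_window_start entry) is an assumption based on how the TOML rules are written, not the exact NewTermsRuleData schema:

# Hedged sketch: reshape the flat Kibana API fields for a New Terms rule into the
# nested structure used by the TOML rules. The nested layout shown here is an
# assumption for illustration, not the exact NewTermsRuleData schema.
def new_terms_api_to_toml(api_rule: dict) -> dict:
    rule = dict(api_rule)
    new_terms_fields = rule.pop('new_terms_fields', None)
    history_window_start = rule.pop('history_window_start', None)

    if new_terms_fields is not None:
        rule['new_terms'] = {
            'field': 'new_terms_fields',
            'value': new_terms_fields,
            'history_window_start': [
                {'field': 'history_window_start', 'value': history_window_start},
            ],
        }
    return rule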

For the time being, I played around with some proof-of-concept (PoC) code that simply points to an NDJSON file of choice and converts all of the rules to TOML files saved in detection-rules/rules/. During conversion they are loaded through the respective dataclasses, which means they are validated as well. A handful of New Terms rules fail as expected.

import json
import re
from datetime import datetime
from pathlib import Path

from detection_rules.rule import TOMLRule, TOMLRuleContents

exported_rules = Path("/Users/tdejesus/code/src/detection-rules/releases/8.12/extras/8.12-consolidated-rules.ndjson")
toml_rule_dump_path = Path("/Users/tdejesus/code/src/detection-rules/rules/")  # preferably rules/

# build-time / Kibana-managed fields that are not part of the TOML rule schema
INCOMPATIBLE_FIELD_NAMES = ['related_integrations', 'required_fields', 'version']


def main():
    default_rule_date = datetime.now().strftime("%Y/%m/%d")
    default_rule_meta = {'creation_date': default_rule_date, 'updated_date': default_rule_date, 'maturity': 'development'}

    # each line of the NDJSON export is one rule in Kibana API (JSON) format
    with exported_rules.open() as f:
        custom_rules = [json.loads(line) for line in f]

    for custom_rule in custom_rules:
        # New Terms rules currently fail schema validation when loaded from the API format
        if custom_rule['type'] == 'new_terms':
            continue
        for field_name in INCOMPATIBLE_FIELD_NAMES:
            custom_rule.pop(field_name, None)

        try:
            # loading through the dataclasses also validates the rule contents
            toml_rule_contents = TOMLRuleContents.from_dict({'rule': custom_rule, 'metadata': default_rule_meta})
            file_name = re.sub(r'[^a-zA-Z0-9]', '_', custom_rule['name']).lower() + ".toml"
            save_path = toml_rule_dump_path / file_name

            toml_rule = TOMLRule(path=save_path, contents=toml_rule_contents)
            toml_rule.save_toml()
        except Exception:
            print(f"Failed to import rule: {custom_rule['name']}")


if __name__ == "__main__":
    main()

Solution

We should revisit import-rules and adjust it so it does not rely on rule_prompt(), while also introducing a global environment variable to ignore rule placement requirements for custom rules. We will also need to solve New Terms rule conversion from JSON back to TOML.
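As a sketch of the environment-variable part of that change, the save path could be resolved from an override before falling back to rules/. The variable name CUSTOM_RULES_DIR below is hypothetical and only for illustration, not an existing detection-rules setting:

import os
from pathlib import Path

# default location, relative to the repository layout described above
DEFAULT_RULES_DIR = Path("rules")


def resolve_rules_dir() -> Path:
    # Hedged sketch: CUSTOM_RULES_DIR is a placeholder name for a global override
    # that would let custom user rules be saved in a folder of choice.
    override = os.environ.get("CUSTOM_RULES_DIR")
    return Path(override) if override else DEFAULT_RULES_DIR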

Labels

backlog · bug (Something isn't working) · python (Internal python for the repository)
