3.2 First stable demo release #6

Merged
merged 23 commits on Mar 21, 2018
Changes from 1 commit
Commits
0b59dd3
implementing new manager log screen
manuasir Mar 6, 2018
59394e4
Trying to fill index with API data
manuasir Mar 6, 2018
f284881
Testing queries and showing the info in table views
manuasir Mar 7, 2018
0793c3d
Generating events for filling new indexes
manuasir Mar 7, 2018
b10f0a6
setting credentials in text
manuasir Mar 8, 2018
cc20153
Producing some stats from logs
manuasir Mar 8, 2018
51ba0a7
Adding Filtering boxes in manager logs table
manuasir Mar 8, 2018
3b88940
Making some dynamic and working filters
manuasir Mar 8, 2018
bc2c8b8
Trying to merge two event datasets and emit them properly
manuasir Mar 9, 2018
cb58426
Adding dynamic filters
manuasir Mar 9, 2018
28c8038
Fixed typo error
manuasir Mar 12, 2018
327e7e7
Own backend endpoint to manager routes
manuasir Mar 16, 2018
e0695ae
Avoiding to index data from manager logs
manuasir Mar 16, 2018
da62b81
tabbing method properly
manuasir Mar 16, 2018
ba6b3b6
Adding new endpoint so that Splunk is able to reach it
manuasir Mar 16, 2018
4dbfdfb
Ignoring IDE stuff
manuasir Mar 19, 2018
a9d037e
Using the internal backend for fetching data
manuasir Mar 19, 2018
39da5a2
Fetching logs on demand and by own backend
manuasir Mar 20, 2018
18a5b69
Adding new structure for demo machine
manuasir Mar 21, 2018
4d9b274
Changing IPs and URLs for development
manuasir Mar 21, 2018
aa14dcf
Adapting TA for distributed architecture
manuasir Mar 21, 2018
706fef0
Deleting unnecessary inputs.conf
manuasir Mar 21, 2018
9246c35
Fixing some details in documentation
manuasir Mar 21, 2018
Adding dynamic filters
manuasir committed Mar 9, 2018
commit cb584261b937d4cd6d0effa1a6facb2b73c9e6e9
5 changes: 5 additions & 0 deletions SplunkAppForWazuh/default/addon_builder.conf
@@ -0,0 +1,5 @@
[base]
builder_version = 2.2.0
builder_build = 12
is_edited = 1

25 changes: 25 additions & 0 deletions SplunkAppForWazuh/local/data/ui/views/logs.xml
@@ -0,0 +1,25 @@
<form>
<label>Manager Logs</label>
<fieldset submitButton="false" autoRun="true">
<input type="dropdown" token="value" searchWhenChanged="true">
<label>Filter by level</label>
<choice value="WARNING">WARNING</choice>
<choice value="INFO">INFO</choice>
<choice value="*">*</choice>
<initialValue>*</initialValue>
</input>
</fieldset>
<row>
<panel>
<table>
<title>Manager logs</title>
<search>
<query>index=wazuh_api sourcetype="wazuh:api:info:basic" | table logs_timestamp, logs_tag, logs_description, logs_level | search logs_level=$value$</query>
<earliest>-24h@h</earliest>
<latest>now</latest>
</search>
<option name="drilldown">none</option>
</table>
</panel>
</row>
</form>
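
The table search depends on the logs_* fields (logs_timestamp, logs_tag, logs_description, logs_level) that the TA produces by prefixing every key of a manager log row before writing it as an event (see the Python change further down). A minimal sketch of that flattening, using a purely hypothetical example row since the exact keys returned by the Wazuh API are not shown in this diff:

import json

# Hypothetical manager log row; the real key names come from the Wazuh API response.
row = {"timestamp": "2018-03-09 10:00:00", "tag": "ossec-monitord",
       "level": "WARNING", "description": "Log rotation"}

# Prefix every key with 'logs_' so the dashboard can filter on logs_level, etc.
data = {"logs_" + key: value for key, value in row.items()}
payload = json.dumps(data)  # this JSON string becomes the body of one Splunk event

At search time Splunk substitutes the dropdown token into the query, so selecting WARNING resolves the final pipe to search logs_level=WARNING, while the default * keeps every level.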
26 changes: 26 additions & 0 deletions SplunkAppForWazuh/local/props.conf
@@ -0,0 +1,26 @@
[wazuh]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
NO_BINARY_CHECK = true
category = Application
disabled = false
pulldown_type = true
FIELDALIAS-rule.groups = "rule.groups{}" AS "rule.groups"
FIELDALIAS-dstuser = "data.dstuser" AS srcuser
FIELDALIAS-srcip = "data.srcip" AS srcip
FIELDALIAS-data.title = "data.title" AS title
FIELDALIAS-oscap.scan.id = "data.oscap.scan.id" AS "oscap.scan.id"
FIELDALIAS-oscap.scan.content = "data.oscap.scan.content" AS "oscap.scan.content"
FIELDALIAS-oscap.scan.profile.title = "data.oscap.scan.profile.title" AS "oscap.scan.profile.title"
FIELDALIAS-oscap.scan.score = "data.oscap.scan.score" AS "oscap.scan.score"
FIELDALIAS-oscap.check.title = "data.oscap.check.title" AS "oscap.check.title"
FIELDALIAS-oscap.check.result = "data.oscap.check.result" AS "oscap.check.result"
FIELDALIAS-oscap.check.severity = "data.oscap.check.severity" AS "oscap.check.severity"
FIELDALIAS-audit.exe = "data.audit.exe" AS "audit.exe"
FIELDALIAS-audit.file.mode = "data.audit.file.mode" AS "audit.file.mode"
FIELDALIAS-audit.egid = "data.audit.egid" AS "audit.egid"
FIELDALIAS-audit.euid = "data.audit.euid" AS "audit.euid"

[wazuh_api]
FIELDALIAS-agent-last-syscheck = las_syscheck AS last_syscheck
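
Each FIELDALIAS stanza above exposes a nested JSON field under a flatter name at search time, without reindexing; for example, with FIELDALIAS-srcip = "data.srcip" AS srcip, an alert whose raw JSON contains data.srcip also matches searches on srcip.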
4 changes: 4 additions & 0 deletions SplunkAppForWazuh/local/transforms.conf
@@ -0,0 +1,4 @@
[pci1.csv]
batch_index_query = 0
case_sensitive_match = 1
filename = pci1.csv
4 changes: 4 additions & 0 deletions SplunkAppForWazuh/metadata/local.meta
@@ -1364,3 +1364,7 @@ export = none
owner = admin
version = 6.6.0
modtime = 1502598678.925916000

[views/logs]
version = 7.0.2
modtime = 1520596415.232658000
23 changes: 8 additions & 15 deletions TA-wazuh-api-connector/bin/input_module_wazuh_api_info_basic.py
@@ -39,23 +39,16 @@ def collect_events(helper, ew):
     for key in agent_summary:
         data['agent_summary_' + key.lower().replace(' ', '')] = agent_summary[key]
 
-    dataLogs = {}
-    for row in logs:
-        for key in row:
-            dataLogs['logs_'+key] = row[key]
-    dataLogs = json.dumps(dataLogs)
-
     # event = helper.new_event(source=helper.get_input_type(), index=helper.get_output_index(), sourcetype=helper.get_sourcetype(), data=data)
     # ew.write_event(event)
     data = json.dumps(data)
-    mergedJson = merge_two_dicts(data,dataLogs)
+    # mergedJson = merge_two_dicts(data,dataLogs)
     event = helper.new_event(source=helper.get_input_type(), index=helper.get_output_index(), sourcetype=helper.get_sourcetype(), data=data)
     ew.write_event(event)
-
-    # for row in logs:
-    #     data = {}
-    #     for key in row:
-    #         data['logs_'+key] = row[key]
-    #     data = json.dumps(data)
-    #     event = helper.new_event(source=helper.get_input_type(), index=helper.get_output_index(), sourcetype=helper.get_sourcetype(), data=data)
-    #     ew.write_event(event)
+    for row in logs:
+        data = {}
+        for key in row:
+            data['logs_'+key] = row[key]
+        data = json.dumps(data)
+        event = helper.new_event(source=helper.get_input_type(), index=helper.get_output_index(), sourcetype=helper.get_sourcetype(), data=data)
+        ew.write_event(event)
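
The merge_two_dicts helper referenced above is presumably defined elsewhere in the module (it is not shown in this hunk). A minimal sketch of what such a helper is usually assumed to do, namely returning the union of two dicts:

def merge_two_dicts(x, y):
    """Assumed helper: return a new dict with y's keys layered over x's."""
    merged = x.copy()   # shallow copy of the first dict
    merged.update(y)    # keys from the second dict overwrite duplicates
    return merged

Note that in the replaced lines both arguments had already been passed through json.dumps(), so they were strings rather than dicts by the time the helper was called; that mismatch may be why this commit comments the merge out and instead writes one event per manager log row.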