Commit

Threat intel test (#673)

* add mapping for indices storing threat intel feed data

* fix feed indices mapping

* add threat intel feed data dao

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* add threatIntelEnabled field in detector.

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* add threat intel feed service and searching feeds

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* threat intel feed data to doc level query converter logic added

* plug threat intel feed into detector creation

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* Preliminary framework for jobscheduler and datasource (#626)


Signed-off-by: Joanne Wang <jowg@amazon.com>

* create doc level query from threat intel feed data index docs

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* handle threat intel enabled check during detector update

* add tests for testing threat intel feed integration with detectors

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* Threat intel feeds job runner and unit tests (#654)

* fix doc level query constructor (#651)

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* add mapping for indices storing threat intel feed data

* fix feed indices mapping

* add threat intel feed data dao

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* add threatIntelEnabled field in detector.

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* add threat intel feed service and searching feeds

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* threat intel feed data to doc level query converter logic added

* plug threat intel feed into detector creation

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* Preliminary framework for jobscheduler and datasource (#626)


Signed-off-by: Joanne Wang <jowg@amazon.com>

* with listener and processor

Signed-off-by: Joanne Wang <jowg@amazon.com>

* removed actions

Signed-off-by: Joanne Wang <jowg@amazon.com>

* clean up

Signed-off-by: Joanne Wang <jowg@amazon.com>

* added parser

Signed-off-by: Joanne Wang <jowg@amazon.com>

* add unit tests

Signed-off-by: Joanne Wang <jowg@amazon.com>

* refactored class names

Signed-off-by: Joanne Wang <jowg@amazon.com>

* before moving db

Signed-off-by: Joanne Wang <jowg@amazon.com>

* after moving db

Signed-off-by: Joanne Wang <jowg@amazon.com>

* added actions to plugin and removed user schedule

Signed-off-by: Joanne Wang <jowg@amazon.com>

* unit tests

Signed-off-by: Joanne Wang <jowg@amazon.com>

* fix build error

Signed-off-by: Joanne Wang <jowg@amazon.com>

* changed transport naming

Signed-off-by: Joanne Wang <jowg@amazon.com>

---------

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>
Signed-off-by: Joanne Wang <jowg@amazon.com>
Co-authored-by: Surya Sashank Nistala <snistala@amazon.com>

* converge job scheduler code with threat intel feed integration in detectors

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* refactored out unnecessary code

Signed-off-by: Joanne Wang <jowg@amazon.com>

* added headers and cleaned up

Signed-off-by: Joanne Wang <jowg@amazon.com>

* converge job scheduler and detector threat intel code

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* working on testing

Signed-off-by: Joanne Wang <jowg@amazon.com>

* fixed the parser and build.gradle

Signed-off-by: Joanne Wang <jowg@amazon.com>

* add feed metadata config files in src and test

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* clean up some tests

Signed-off-by: Joanne Wang <jowg@amazon.com>

* fixed merge conflicts

Signed-off-by: Joanne Wang <jowg@amazon.com>

* adds ioc fields list in log type config files and ioc fields object in LogType POJO

* update csv parser and new metadata field

Signed-off-by: Joanne Wang <jowg@amazon.com>

* fixed job scheduler interval settings

Signed-off-by: Joanne Wang <jowg@amazon.com>

* add tests for ioc to fields for each log type

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>

* removed wildcards

Signed-off-by: Joanne Wang <jowg@amazon.com>

---------

Signed-off-by: Surya Sashank Nistala <snistala@amazon.com>
Signed-off-by: Joanne Wang <jowg@amazon.com>
Signed-off-by: Joanne Wang <109310487+jowg-amazon@users.noreply.github.com>
Co-authored-by: Joanne Wang <109310487+jowg-amazon@users.noreply.github.com>
Co-authored-by: Joanne Wang <jowg@amazon.com>
3 people authored Oct 17, 2023
1 parent 2b59191 commit f0f8270
Showing 74 changed files with 1,956 additions and 1,195 deletions.
14 changes: 13 additions & 1 deletion build.gradle
@@ -69,6 +69,7 @@ opensearchplugin {
name 'opensearch-security-analytics'
description 'OpenSearch Security Analytics plugin'
classname 'org.opensearch.securityanalytics.SecurityAnalyticsPlugin'
// extendedPlugins = ['opensearch-job-scheduler']
}

javaRestTest {
@@ -155,7 +156,7 @@ dependencies {
implementation group: 'org.apache.commons', name: 'commons-lang3', version: "${versions.commonslang}"
implementation "org.antlr:antlr4-runtime:4.10.1"
implementation "com.cronutils:cron-utils:9.1.6"
api files("/Users/snistala/Documents/opensearch/common-utils/build/libs/common-utils-3.0.0.0-SNAPSHOT.jar")
api "org.opensearch:common-utils:${common_utils_version}@jar"
api "org.opensearch.client:opensearch-rest-client:${opensearch_version}"
implementation "org.jetbrains.kotlin:kotlin-stdlib:${kotlin_version}"
implementation "org.opensearch:opensearch-job-scheduler-spi:${opensearch_build}"
@@ -165,6 +166,7 @@ dependencies {
zipArchive group: 'org.opensearch.plugin', name:'alerting', version: "${opensearch_build}"
zipArchive group: 'org.opensearch.plugin', name:'opensearch-notifications-core', version: "${opensearch_build}"
zipArchive group: 'org.opensearch.plugin', name:'notifications', version: "${opensearch_build}"
zipArchive group: 'org.opensearch.plugin', name:'opensearch-job-scheduler', version: "${opensearch_build}"

//spotless
implementation('com.google.googlejavaformat:google-java-format:1.17.0') {
@@ -291,6 +293,16 @@ testClusters.integTest {
}
}
}))
plugin(provider({
new RegularFile() {
@Override
File getAsFile() {
return configurations.zipArchive.asFileTree.matching {
include '**/opensearch-job-scheduler*'
}.singleFile
}
}
}))
}

run {

SecurityAnalyticsPlugin.java
@@ -52,7 +52,6 @@
import org.opensearch.securityanalytics.threatIntel.DetectorThreatIntelService;
import org.opensearch.securityanalytics.threatIntel.ThreatIntelFeedDataService;
import org.opensearch.securityanalytics.threatIntel.action.*;
import org.opensearch.securityanalytics.threatIntel.common.TIFExecutor;
import org.opensearch.securityanalytics.threatIntel.common.TIFLockService;
import org.opensearch.securityanalytics.threatIntel.feedMetadata.BuiltInTIFMetadataLoader;
import org.opensearch.securityanalytics.threatIntel.jobscheduler.TIFJobParameterService;
@@ -121,13 +120,6 @@ public Collection<SystemIndexDescriptor> getSystemIndexDescriptors(Settings sett
return List.of(new SystemIndexDescriptor(THREAT_INTEL_DATA_INDEX_NAME_PREFIX, "System index used for threat intel data"));
}

@Override
public List<ExecutorBuilder<?>> getExecutorBuilders(Settings settings) {
List<ExecutorBuilder<?>> executorBuilders = new ArrayList<>();
executorBuilders.add(TIFExecutor.executorBuilder(settings));
return executorBuilders;
}

@Override
public Collection<Object> createComponents(Client client,
ClusterService clusterService,
@@ -156,17 +148,16 @@ public Collection<Object> createComponents(Client client,
DetectorThreatIntelService detectorThreatIntelService = new DetectorThreatIntelService(threatIntelFeedDataService);
TIFJobParameterService tifJobParameterService = new TIFJobParameterService(client, clusterService);
TIFJobUpdateService tifJobUpdateService = new TIFJobUpdateService(clusterService, tifJobParameterService, threatIntelFeedDataService, builtInTIFMetadataLoader);
TIFExecutor threatIntelExecutor = new TIFExecutor(threadPool);
TIFLockService threatIntelLockService = new TIFLockService(clusterService, client);

this.client = client;

TIFJobRunner.getJobRunnerInstance().initialize(clusterService,tifJobUpdateService, tifJobParameterService, threatIntelExecutor, threatIntelLockService, threadPool);
TIFJobRunner.getJobRunnerInstance().initialize(clusterService,tifJobUpdateService, tifJobParameterService, threatIntelLockService, threadPool);

return List.of(
detectorIndices, correlationIndices, correlationRuleIndices, ruleTopicIndices, customLogTypeIndices, ruleIndices,
mapperService, indexTemplateManager, builtinLogTypeLoader, builtInTIFMetadataLoader, threatIntelFeedDataService, detectorThreatIntelService,
tifJobUpdateService, tifJobParameterService, threatIntelExecutor, threatIntelLockService);
tifJobUpdateService, tifJobParameterService, threatIntelLockService);
}

@Override
@@ -268,7 +259,7 @@ public List<Setting<?>> getSettings() {
SecurityAnalyticsSettings.CORRELATION_TIME_WINDOW,
SecurityAnalyticsSettings.DEFAULT_MAPPING_SCHEMA,
SecurityAnalyticsSettings.ENABLE_WORKFLOW_USAGE,
SecurityAnalyticsSettings.TIFJOB_UPDATE_INTERVAL,
SecurityAnalyticsSettings.TIF_UPDATE_INTERVAL,
SecurityAnalyticsSettings.BATCH_SIZE,
SecurityAnalyticsSettings.THREAT_INTEL_TIMEOUT
);
@@ -304,11 +295,9 @@ public List<Setting<?>> getSettings() {
new ActionHandler<>(DeleteCustomLogTypeAction.INSTANCE, TransportDeleteCustomLogTypeAction.class),

new ActionHandler<>(PutTIFJobAction.INSTANCE, TransportPutTIFJobAction.class),
new ActionHandler<>(GetTIFJobAction.INSTANCE, TransportGetTIFJobAction.class),
new ActionHandler<>(UpdateTIFJobAction.INSTANCE, TransportUpdateTIFJobAction.class),
new ActionHandler<>(DeleteTIFJobAction.INSTANCE, TransportDeleteTIFJobAction.class)

);
);
}

@Override
@@ -1,3 +1,11 @@
/*
* SPDX-License-Identifier: Apache-2.0
*
* The OpenSearch Contributors require contributions made to
* this file be licensed under the Apache-2.0 license or a
* compatible open source license.
*/

grant {
permission java.lang.management.ManagementPermission "reputation.alienvault.com:443" "connect,resolve";
};
@@ -1,3 +1,7 @@
/*
* Copyright OpenSearch Contributors
* SPDX-License-Identifier: Apache-2.0
*/
package org.opensearch.securityanalytics.model;

import org.apache.logging.log4j.LogManager;

SecurityAnalyticsSettings.java
@@ -4,14 +4,11 @@
*/
package org.opensearch.securityanalytics.settings;

import java.net.MalformedURLException;
import java.net.URISyntaxException;
import java.net.URL;
import java.util.List;
import java.util.concurrent.TimeUnit;
import org.opensearch.common.settings.Setting;
import org.opensearch.common.unit.TimeValue;
import org.opensearch.jobscheduler.repackage.com.cronutils.utils.VisibleForTesting;

import java.util.List;
import java.util.concurrent.TimeUnit;

public class SecurityAnalyticsSettings {
public static final String CORRELATION_INDEX = "index.correlation";
@@ -123,13 +120,10 @@ public class SecurityAnalyticsSettings {
);

// threat intel settings
/**
* Default update interval to be used in threat intel tif job creation API
*/
public static final Setting<Long> TIFJOB_UPDATE_INTERVAL = Setting.longSetting(
"plugins.security_analytics.threatintel.tifjob.update_interval_in_days",
1l,
1l, //todo: change the min value
public static final Setting<TimeValue> TIF_UPDATE_INTERVAL = Setting.timeSetting(
"plugins.security_analytics.threat_intel_timeout",
TimeValue.timeValueHours(24),
TimeValue.timeValueHours(1),
Setting.Property.NodeScope,
Setting.Property.Dynamic
);
@@ -161,7 +155,7 @@ public class SecurityAnalyticsSettings {
* @return a list of all settings for threat intel feature
*/
public static final List<Setting<?>> settings() {
return List.of(TIFJOB_UPDATE_INTERVAL, BATCH_SIZE, THREAT_INTEL_TIMEOUT);
return List.of(BATCH_SIZE, THREAT_INTEL_TIMEOUT, TIF_UPDATE_INTERVAL);
}

}
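
For context, the new TIF_UPDATE_INTERVAL above replaces the old day-count long setting with a TimeValue, and the feed service later reads it straight from cluster settings when it registers the feed-update job. A minimal sketch of that consumption, assuming the job-scheduler SPI's IntervalSchedule is the target; the wrapper class and method names are illustrative, not part of this commit:

import java.time.Instant;
import java.time.temporal.ChronoUnit;
import org.opensearch.common.settings.ClusterSettings;
import org.opensearch.common.unit.TimeValue;
import org.opensearch.jobscheduler.spi.schedule.IntervalSchedule;
import org.opensearch.securityanalytics.settings.SecurityAnalyticsSettings;

// Illustrative sketch only: read the TimeValue-based update interval and turn
// it into a job-scheduler interval. Class and method names are assumptions.
class TifUpdateIntervalSketch {
    static IntervalSchedule scheduleFromSettings(ClusterSettings clusterSettings) {
        TimeValue interval = clusterSettings.get(SecurityAnalyticsSettings.TIF_UPDATE_INTERVAL);
        return new IntervalSchedule(Instant.now(), (int) interval.getMinutes(), ChronoUnit.MINUTES);
    }
}
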
DetectorThreatIntelService.java
@@ -1,3 +1,7 @@
/*
* Copyright OpenSearch Contributors
* SPDX-License-Identifier: Apache-2.0
*/
package org.opensearch.securityanalytics.threatIntel;

import org.apache.logging.log4j.LogManager;
@@ -58,7 +62,7 @@ public List<DocLevelQuery> createDocLevelQueriesFromThreatIntelList(
queries.add(new DocLevelQuery(
constructId(detector, entry.getKey()), tifdList.get(0).getFeedId(),
Collections.emptyList(),
String.format(query, field),
"windows-hostname:(120.85.114.146 OR 103.104.106.223 OR 185.191.246.45 OR 120.86.237.94)",
List.of("threat_intel", entry.getKey() /*ioc_type*/)
));
}
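
The hard-coded windows-hostname query above looks like a test stand-in for the generated string (the old line built it with String.format(query, field)); the shape it suggests is a log-type field followed by the feed's IOC values OR'ed together. A minimal sketch of assembling such a query, with purely illustrative names, not the plugin's code:

import java.util.List;

// Illustrative only: builds a query of the form
// "windows-hostname:(120.85.114.146 OR 103.104.106.223)".
class IocQuerySketch {
    static String buildIocQuery(String fieldName, List<String> iocValues) {
        return fieldName + ":(" + String.join(" OR ", iocValues) + ")";
    }
}
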
ThreatIntelFeedDataService.java
@@ -1,3 +1,7 @@
/*
* Copyright OpenSearch Contributors
* SPDX-License-Identifier: Apache-2.0
*/
package org.opensearch.securityanalytics.threatIntel;

import org.apache.commons.csv.CSVRecord;
@@ -41,7 +45,12 @@
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.time.Instant;
import java.util.*;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Arrays;
import java.util.Optional;
import java.util.concurrent.CountDownLatch;
import java.util.stream.Collectors;

@@ -97,19 +106,19 @@ public void getThreatIntelFeedData(
if(IndexUtils.getNewIndexByCreationDate(
this.clusterService.state(),
this.indexNameExpressionResolver,
".opensearch-sap-threatintel*" //name?
".opensearch-sap-threatintel*"
) == null) {
createThreatIntelFeedData();
}
//if index exists
String tifdIndex = IndexUtils.getNewIndexByCreationDate(
this.clusterService.state(),
this.indexNameExpressionResolver,
".opensearch-sap-threatintel*" //name?
".opensearch-sap-threatintel*"
);

SearchRequest searchRequest = new SearchRequest(tifdIndex);
searchRequest.source().size(1000); //TODO: convert to scroll
searchRequest.source().size(9999); //TODO: convert to scroll
client.search(searchRequest, ActionListener.wrap(r -> listener.onResponse(ThreatIntelFeedDataUtils.getTifdList(r, xContentRegistry)), e -> {
log.error(String.format(
"Failed to fetch threat intel feed data from system index %s", tifdIndex), e);
@@ -123,11 +132,10 @@ public void getThreatIntelFeedData(

private void createThreatIntelFeedData() throws InterruptedException {
CountDownLatch countDownLatch = new CountDownLatch(1);
client.execute(PutTIFJobAction.INSTANCE, new PutTIFJobRequest("feed_updater")).actionGet();
client.execute(PutTIFJobAction.INSTANCE, new PutTIFJobRequest("feed_updater", clusterSettings.get(SecurityAnalyticsSettings.TIF_UPDATE_INTERVAL))).actionGet();
countDownLatch.await();
}


/**
* Create an index for a threat intel feed
*
@@ -166,18 +174,16 @@ private String getIndexMapping() {
* Puts threat intel feed from CSVRecord iterator into a given index in bulk
*
* @param indexName Index name to save the threat intel feed
* @param fields Field name matching with data in CSVRecord in order
* @param iterator TIF data to insert
* @param renewLock Runnable to renew lock
*/
public void parseAndSaveThreatIntelFeedDataCSV(
final String indexName,
final String[] fields,
final Iterator<CSVRecord> iterator,
final Runnable renewLock,
final TIFMetadata tifMetadata
) throws IOException {
if (indexName == null || fields == null || iterator == null || renewLock == null) {
if (indexName == null || iterator == null || renewLock == null) {
throw new IllegalArgumentException("Parameters cannot be null, failed to save threat intel feed data");
}

ThreatIntelFeedParser.java
@@ -1,26 +1,30 @@
/*
* Copyright OpenSearch Contributors
* SPDX-License-Identifier: Apache-2.0
*/
package org.opensearch.securityanalytics.threatIntel;

import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVParser;
import org.apache.commons.csv.CSVRecord;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.opensearch.OpenSearchException;
import org.opensearch.SpecialPermission;
import org.opensearch.common.SuppressForbidden;
import org.opensearch.securityanalytics.model.DetectorTrigger;
import org.opensearch.securityanalytics.threatIntel.common.Constants;
import org.opensearch.securityanalytics.threatIntel.common.TIFMetadata;

import java.io.*;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.security.AccessController;
import java.security.PrivilegedAction;

//Parser helper class
public class ThreatIntelFeedParser {
private static final Logger log = LogManager.getLogger(DetectorTrigger.class);
private static final Logger log = LogManager.getLogger(ThreatIntelFeedParser.class);

/**
* Create CSVParser of a threat intel feed
@@ -43,23 +47,4 @@ public static CSVParser getThreatIntelFeedReaderCSV(final TIFMetadat
}
});
}

/**
* Validate header
*
* 1. header should not be null
* 2. the number of values in header should be more than one
*
* @param header the header
* @return CSVRecord the input header
*/
public static CSVRecord validateHeader(CSVRecord header) {
if (header == null) {
throw new OpenSearchException("threat intel feed database is empty");
}
if (header.values().length < 2) {
throw new OpenSearchException("threat intel feed database should have at least two fields");
}
return header;
}
}
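
For reference, a usage sketch of the parser above: drain one feed into a list of IOC values. The wrapper class is illustrative and the assumption that the IOC sits in column 0 is for demonstration only — in the plugin the column presumably comes from the feed's TIFMetadata:

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.apache.commons.csv.CSVParser;
import org.apache.commons.csv.CSVRecord;
import org.opensearch.securityanalytics.threatIntel.ThreatIntelFeedParser;
import org.opensearch.securityanalytics.threatIntel.common.TIFMetadata;

// Usage sketch only, not code from this commit.
class FeedParserUsageSketch {
    static List<String> readIocValues(TIFMetadata tifMetadata) throws IOException {
        List<String> iocs = new ArrayList<>();
        try (CSVParser reader = ThreatIntelFeedParser.getThreatIntelFeedReaderCSV(tifMetadata)) {
            for (CSVRecord record : reader) {
                iocs.add(record.get(0)); // column 0 chosen only for illustration
            }
        }
        return iocs;
    }
}
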
@@ -45,7 +45,7 @@ public ActionRequestValidationException validate() {
ActionRequestValidationException errors = null;
if (VALIDATOR.validateTIFJobName(name).isEmpty() == false) {
errors = new ActionRequestValidationException();
errors.addValidationError("no such job exist");
errors.addValidationError("no such job exists");
}
return errors;
}
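
The validate() change above belongs to the threat intel job request classes; for context, ThreatIntelFeedDataService earlier in this diff bootstraps the feed by submitting a PutTIFJobRequest named "feed_updater" with the configured update interval. A usage sketch of that call — the wrapper class and method are illustrative only:

import org.opensearch.client.Client;
import org.opensearch.common.unit.TimeValue;
import org.opensearch.securityanalytics.threatIntel.action.PutTIFJobAction;
import org.opensearch.securityanalytics.threatIntel.action.PutTIFJobRequest;

// Mirrors the blocking call made in ThreatIntelFeedDataService; wrapper names
// are assumptions, not part of this commit.
class PutTifJobSketch {
    static void registerFeedUpdater(Client client, TimeValue updateInterval) {
        client.execute(PutTIFJobAction.INSTANCE, new PutTIFJobRequest("feed_updater", updateInterval)).actionGet();
    }
}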

This file was deleted.
