
Commit

Fixed issue #185 : We now have a string transformer when exporting data (#186)

baubakg authored Oct 4, 2024
1 parent 93bf799 commit f15336a
Showing 4 changed files with 89 additions and 21 deletions.
14 changes: 11 additions & 3 deletions README.md
@@ -12,8 +12,7 @@ The basic method for using this library is, that you create a definition for you

## Table of contents
<!-- TOC -->
* [log-parser](#log-parser)
* [Table of contents](#table-of-contents)

* [Installation](#installation)
* [Maven](#maven)
* [Running the Log Parser](#running-the-log-parser)
@@ -33,6 +32,7 @@ The basic method for using this library is, that you create a definition for you
* [Declaring the transformation Rules in setValuesFromMap](#declaring-the-transformation-rules-in-setvaluesfrommap)
* [Declaring the Key](#declaring-the-key)
* [Declare the HeaderMap, and ValueMap](#declare-the-headermap-and-valuemap)
* [Assisting Exports](#assisting-exports)
* [Code Structure](#code-structure)
* [Searching and organizing log data](#searching-and-organizing-log-data)
* [Search and Filter Mechanisms](#search-and-filter-mechanisms)
@@ -47,6 +47,7 @@ The basic method for using this library is, that you create a definition for you
* [Exporting Parse Results](#exporting-parse-results)
* [Exporting Results to a CSV File](#exporting-results-to-a-csv-file)
* [Exporting Results to an HTML File](#exporting-results-to-an-html-file)
* [Exporting Results to an JSON File](#exporting-results-to-an-json-file)
* [Command-line Execution of the Log-Parser](#command-line-execution-of-the-log-parser)
* [Changelog](#changelog)
* [1.11.0 (next version)](#1110--next-version-)
@@ -256,6 +257,10 @@ Depending on the fields you have defined, you will want to define how the result

You will need to give names to the headers, and provide a map that extracts the values.

##### Assisting Exports
One of the added values of writing your own log data class is the possibility of using non-String objects and performing additional operations on the data. This has the drawback that we can get odd behaviors when exporting the log data. For this reason we, by default, transform all data in an entry into a map of Strings.

In some cases the default String transformation may not be to your liking. In that case you will have to override the method `Map<String, String> fetchValueMapPrintable()`. The overriding method should perform your own transformation on the results of the `fetchValueMap()` method.
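
As a minimal sketch (assuming the usual `java.time` and `java.util` imports, and a log entry class that stores a `ZonedDateTime` under the key `timeOfLog`, as in the SDK tests), such an override could look like this:

```java
@Override
protected Map<String, String> fetchValueMapPrintable() {
    Map<String, String> l_printable = new HashMap<>();
    for (Map.Entry<String, Object> lt_entry : fetchValueMap().entrySet()) {
        Object lt_value = lt_entry.getValue();
        if (lt_value instanceof ZonedDateTime) {
            // Illustrative choice: format date fields explicitly instead of relying on toString()
            l_printable.put(lt_entry.getKey(),
                    ((ZonedDateTime) lt_value).format(DateTimeFormatter.ISO_OFFSET_DATE_TIME));
        } else {
            l_printable.put(lt_entry.getKey(), lt_value == null ? "" : lt_value.toString());
        }
    }
    return l_printable;
}
```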

## Code Structure
Below is a diagram representing the class structure:
@@ -420,6 +425,8 @@ We have the possibility to export the log data results into files. Currently the

All reports are stored in the directory `log-parser-reports/export/`.

If you are using an SDK to control the log parsing, you may want to override the method `fetchValueMapPrintable` to provide a more suitable export of the data. For more information on this, please refer to the chapter [Assisting Exports](#assisting-exports).

### Exporting Results to a CSV File
We have the possibility to export the log data results into a CSV file. This is done by calling the method `LogData#exportLogDataToCSV`.
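
Assuming you already have a `LogData` object called `l_logData` (for instance created with `LogDataFactory.generateLogData`), a usage sketch could be as follows. The header names and file name below are illustrative, not part of the library:

```java
// Export only the chosen headers, in the given order, to a CSV file
File l_csvFile = l_logData.exportLogDataToCSV(Arrays.asList("verb", "path"), "myExport.csv");
```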

@@ -435,7 +442,6 @@ We have the possibility to export the log data results into an JSON file. This i

You have the possibility to define the data and the order in which it is exported, the file name, and the title of the report.
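
With the same illustrative `l_logData` object as above, a sketch of the JSON export might be:

```java
// Export the chosen headers to a JSON file; a single-argument overload taking only the file name also exists
File l_jsonFile = l_logData.exportLogDataToJSON(Arrays.asList("timeOfLog", "verb"), "myExport.json");
```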


## Command-line Execution of the Log-Parser
As of version 1.11.0 we have introduced the possibility of running the log-parser from the command line. This is done by using the executable jar file or executing the main method in maven.

@@ -480,6 +486,8 @@ All reports are stored in the directory `log-parser-reports/export/`.
- [#57](https://github.com/adobe/log-parser/issues/57) Assertions are no longer an implicit assert-equal method. We now allow Hamcrest Matchers for asserting. This can be one or more matchers.
- [#119](https://github.com/adobe/log-parser/issues/119) Cleanup of deprecated methods, and the consequences thereof.
- [#137](https://github.com/adobe/log-parser/issues/137) We can now generate an HTML report for the differences in log data.
- [#185](https://github.com/adobe/log-parser/issues/185) Resolved issue with serializing unexpected objects in SDK Log entries.


### 1.0.10
- Moved main code and tests to the package "core"
18 changes: 10 additions & 8 deletions src/main/java/com/adobe/campaign/tests/logparser/core/LogData.java
@@ -421,7 +421,7 @@ public File exportLogDataToCSV(Collection<String> in_headerSet, String in_csvFil
printer.printRecord(in_headerSet);

for (StdLogEntry lt_entry : this.getEntries().values()) {
Map lt_values = lt_entry.fetchValueMap();
Map lt_values = lt_entry.fetchValueMapPrintable();
printer.printRecord(in_headerSet.stream().map(h -> lt_values.get(h)).collect(Collectors.toList()));
}

@@ -472,7 +472,7 @@ public File exportLogDataToHTML(Collection<String> in_headerSet, String in_repor
sb.append("<tbody>");

for (StdLogEntry lt_entry : this.getEntries().values()) {
Map lt_values = lt_entry.fetchValueMap();
Map lt_values = lt_entry.fetchValueMapPrintable();
sb.append(HTMLReportUtils.ROW_START);
in_headerSet.stream().map(h -> lt_values.get(h)).forEach(j -> sb.append(HTMLReportUtils.fetchCell_TD(j)));
sb.append(HTMLReportUtils.ROW_END);
@@ -533,16 +533,18 @@ public File exportLogDataToJSON(String in_jsonFileName) throws LogDataExportToFi
* @return a JSON file containing the LogData
* @throws LogDataExportToFileException If the file could not be exported
*/
public File exportLogDataToJSON(Collection<String> in_headerSet, String in_jsonFileName) throws LogDataExportToFileException {
public File exportLogDataToJSON(Collection<String> in_headerSet, String in_jsonFileName)
throws LogDataExportToFileException {
File l_exportFile = LogParserFileUtils.createNewFile(in_jsonFileName);
List<Map<String, Object>> jsonList = new ArrayList<>();
jsonList.addAll(this.getEntries().values().stream().map(StdLogEntry::fetchValueMap).collect(Collectors.toList()));
List<Map<String, String>> jsonList = new ArrayList<>();
jsonList.addAll(this.getEntries().values().stream().map(StdLogEntry::fetchValueMapPrintable)
.collect(Collectors.toList()));

try {
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.writeValue(l_exportFile, objectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(jsonList));
}
catch (IOException e) {
objectMapper.writeValue(l_exportFile,
objectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(jsonList));
} catch (IOException e) {
throw new LogDataExportToFileException("Encountered error while exporting the log data to a JSON file.", e);
}
return l_exportFile;
12 changes: 12 additions & 0 deletions src/main/java/com/adobe/campaign/tests/logparser/core/StdLogEntry.java
@@ -133,6 +133,18 @@ public Map<String, Object> fetchValueMap() {
return valuesMap;
}

/**
* For serialization purposes, when people define their own classes we will need to have a map of Strings. This
* avoids problems when exporting data. In cases where this cannot be done automatically, please override this
* method.
*
* @return a map of Strings representing the values of the log entry
*/
protected Map<String, String> fetchValueMapPrintable() {
return fetchValueMap().entrySet().stream()
.collect(Collectors.toMap(Map.Entry::getKey, e -> Optional.ofNullable(e.getValue()).orElse("").toString()));
}

/**
* Increments the frequence
*
66 changes: 56 additions & 10 deletions src/test/java/com/adobe/campaign/tests/logparser/core/SDKTests.java
@@ -8,27 +8,22 @@
*/
package com.adobe.campaign.tests.logparser.core;


import com.adobe.campaign.tests.logparser.data.SDKCaseBadDefConstructor;
import com.adobe.campaign.tests.logparser.data.SDKCasePrivateDefConstructor;
import com.adobe.campaign.tests.logparser.data.SDKCaseNoDefConstructor;
import com.adobe.campaign.tests.logparser.data.SDKCaseSTD;
import com.adobe.campaign.tests.logparser.data.*;
import com.adobe.campaign.tests.logparser.exceptions.IncorrectParseDefinitionException;
import com.adobe.campaign.tests.logparser.exceptions.LogDataExportToFileException;
import com.adobe.campaign.tests.logparser.exceptions.LogParserSDKDefinitionException;
import com.adobe.campaign.tests.logparser.exceptions.StringParseException;
import org.hamcrest.Matcher;
import org.hamcrest.Matchers;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.testng.Assert;
import org.testng.annotations.Test;

import java.io.File;
import java.io.IOException;
import java.time.ZonedDateTime;
import java.util.Arrays;
import java.util.Map;

import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.is;
import static org.hamcrest.Matchers.*;

public class SDKTests {

@@ -195,4 +190,55 @@ public void testSimpleLogACC_SDK_negativePrivateConstructor() {

}


@Test
public void testFetchValueMapToString() throws StringParseException {
ParseDefinition l_pDefinition = SDKTests.getTestParseDefinition();
l_pDefinition.setStoreFileName(true);

String l_file = "src/test/resources/sdk/useCase1.log";

LogData<SDKCase2> l_entries = LogDataFactory.generateLogData(Arrays.asList(l_file), l_pDefinition,
SDKCase2.class);

SDKCase2 l_entry = l_entries.getEntries().values().iterator().next();

assertThat("Checking the original assumptions",l_entry.fetchValueMap().get("timeOfLog"), instanceOf(
ZonedDateTime.class));

assertThat("Checking the original assumptions",l_entry.fetchValueMapPrintable().get("timeOfLog"), instanceOf(
String.class));


}

@Test
public void testSimpleLogACC_SDK_exportDateTimeJSON() throws StringParseException, IOException {
ParseDefinition l_pDefinition = SDKTests.getTestParseDefinition();
l_pDefinition.setStoreFileName(true);

String l_file = "src/test/resources/sdk/useCase1.log";

LogData<SDKCase2> l_entries = LogDataFactory.generateLogData(Arrays.asList(l_file), l_pDefinition,
SDKCase2.class);

File l_exportedFile = l_entries.exportLogDataToJSON("jsonTest.json");

assertThat("We successfully created the file", l_exportedFile, notNullValue());
assertThat("We successfully created the file", l_exportedFile.exists());
assertThat("We successfully created the file correctly", l_exportedFile.isFile());
assertThat("Created JSON file is no Empty", l_exportedFile.length() > 0);

try {
ObjectMapper objectMapper = new ObjectMapper();
String values = objectMapper.readValue(l_exportedFile, String.class);

assertThat("JSON file contains correct verb definition", values.contains("\"timeStamp\" : \"2024-06-13T03:00:19.727Z\""));

} finally {
l_exportedFile.delete();
}

}

}
