DFP pipeline module #510

Merged: 64 commits, Jan 24, 2023
1427674
dfp modules testing
bsuryadevara Nov 14, 2022
3d769ae
Merge branch 'nv-morpheus:branch-22.11' into dfp-training-module
bsuryadevara Nov 14, 2022
df5066b
modules stage work
bsuryadevara Nov 18, 2022
ebb2db8
Added module stage implementation
bsuryadevara Nov 22, 2022
c736828
morpehus modules integration
bsuryadevara Nov 22, 2022
1d12aee
trivial changes
bsuryadevara Nov 22, 2022
3376ca7
renamed configuration file
bsuryadevara Nov 23, 2022
4253d34
added tests
bsuryadevara Nov 28, 2022
b5ded8b
update module factory
bsuryadevara Nov 28, 2022
4b13b99
update module factory
bsuryadevara Nov 28, 2022
4789d23
Updated linear modules stage
bsuryadevara Nov 29, 2022
5cdfd68
Updated linear modules stage
bsuryadevara Nov 29, 2022
d15322a
Merge branch 'nv-morpheus:branch-23.01' into dfp-pipeline-module
bsuryadevara Nov 29, 2022
bea98a7
Updated linear modules stage
bsuryadevara Nov 29, 2022
5a4aa4a
Merge branch 'dfp-pipeline-module' of github.com:bsuryadevara/Morpheu…
bsuryadevara Nov 29, 2022
f0268b6
renamed mlflow model writer module
bsuryadevara Nov 29, 2022
25d1c5a
added functiontools wraps to a decorator func
bsuryadevara Nov 29, 2022
e4a73f2
created dfp pipeline preprocessing and training modules
bsuryadevara Nov 30, 2022
e5b770f
created dfp pipeline preprocessing and training modules
bsuryadevara Nov 30, 2022
1575fcd
created dfp pipeline preprocessing and training modules
bsuryadevara Nov 30, 2022
70fc4cf
created dfp pipeline preprocessing and training modules
bsuryadevara Nov 30, 2022
5159b0b
used dill to persist source and preprocess schema
bsuryadevara Nov 30, 2022
9619c0e
used dill to persist source and preprocess schema
bsuryadevara Nov 30, 2022
88d03b1
renamed files
bsuryadevara Dec 1, 2022
899df5c
Updated dfp pipleines with modules
bsuryadevara Dec 7, 2022
6a0bcdd
Added tests
bsuryadevara Dec 7, 2022
e85ee77
resolved merge conflicts
bsuryadevara Dec 7, 2022
dd9f2d2
created dfp (azure, duo) training pipelines with modules
bsuryadevara Dec 7, 2022
0bacf6e
style correction
bsuryadevara Dec 7, 2022
d9983dd
style correction
bsuryadevara Dec 7, 2022
2185886
style correction
bsuryadevara Dec 8, 2022
e50d1d1
added dask and distributed packages to requirements.txt
bsuryadevara Dec 8, 2022
4fdd4c9
Merge branch 'branch-23.01' into dfp-pipeline-module
bsuryadevara Dec 8, 2022
0e6d5cf
Resloved conflicts
bsuryadevara Dec 8, 2022
9158130
addedd missing tests file
bsuryadevara Dec 9, 2022
e0e7309
Merge remote-tracking branch 'upstream/branch-23.01' into dfp-pipelin…
bsuryadevara Dec 9, 2022
c91aa7f
fix to failing test
bsuryadevara Dec 10, 2022
a9d3e9e
fix to failing test
bsuryadevara Dec 10, 2022
516e746
fix tests
bsuryadevara Dec 10, 2022
bdf384a
addressed feedback comments
bsuryadevara Dec 13, 2022
c6c49a9
Update examples/digital_fingerprinting/production/morpheus/dfp_azure_…
bsuryadevara Dec 13, 2022
c93ac48
Update examples/digital_fingerprinting/production/morpheus/dfp_duo_mo…
bsuryadevara Dec 13, 2022
7e30dea
input and output port names from a module as params
bsuryadevara Dec 13, 2022
8aa84f5
Merge branch 'branch-23.01' into dfp-pipeline-module
bsuryadevara Dec 13, 2022
2fb7a5e
Update morpheus/stages/general/linear_modules_stage.py
bsuryadevara Dec 14, 2022
a4ce99e
Merge branch 'branch-23.01' into dfp-pipeline-module
bsuryadevara Dec 14, 2022
6e35679
updated readme, filenames
bsuryadevara Dec 15, 2022
7da7189
Merge branch 'branch-23.01' into dfp-pipeline-module
bsuryadevara Dec 15, 2022
ae9393b
Updated DFP readme.md
bsuryadevara Dec 15, 2022
4059466
Merge remote-tracking branch 'upstream/branch-23.01' into dfp-pipelin…
bsuryadevara Dec 20, 2022
2fc2916
moved from srf to mrc
bsuryadevara Dec 20, 2022
62149a7
Merge branch 'branch-23.01' into dfp-pipeline-module
bsuryadevara Jan 4, 2023
e0867c4
changes related to feedback
bsuryadevara Jan 18, 2023
c1af70a
Merge remote-tracking branch 'origin/branch-23.01' into dfp-pipeline-…
bsuryadevara Jan 18, 2023
0a98249
updated header and logging format
bsuryadevara Jan 18, 2023
dd8d69a
fix incomplete headers
bsuryadevara Jan 18, 2023
3d5e592
Removed commented code
bsuryadevara Jan 19, 2023
6d30b13
changed module fucntion name to align with filename
bsuryadevara Jan 23, 2023
31538f0
Merge branch 'dfp-pipeline-module' of https://github.com/bsuryadevara…
bsuryadevara Jan 23, 2023
b0a42b8
added distributed dependency to dfp example
bsuryadevara Jan 23, 2023
904a92e
Merge remote-tracking branch 'upstream/branch-23.01' into dfp-pipelin…
bsuryadevara Jan 23, 2023
ba56921
Merge branch-23.01 to dfp-pipeline-module
bsuryadevara Jan 23, 2023
6d6e030
Merge branch 'branch-23.01' into dfp-pipeline-module
bsuryadevara Jan 23, 2023
5af5813
resolved merge conflicts
bsuryadevara Jan 23, 2023
changes related to feedback
bsuryadevara committed Jan 18, 2023
commit e0867c45b6584422156b975b2a563e71952353d3
@@ -0,0 +1,82 @@
# Copyright (c) 2022-2023, NVIDIA CORPORATION.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import logging
import pickle
import time

import mrc
from mrc.core import operators as ops

from morpheus.utils.column_info import process_dataframe
from morpheus.utils.module_ids import MODULE_NAMESPACE
from morpheus.utils.module_utils import get_module_config
from morpheus.utils.module_utils import register_module

from ..messages.multi_dfp_message import MultiDFPMessage
from ..utils.module_ids import DFP_DATA_PREP

logger = logging.getLogger(__name__)


@register_module(DFP_DATA_PREP, MODULE_NAMESPACE)
def dfp_preprocessing(builder: mrc.Builder):
"""
This module function produces preprocessed data for either inference or model training.

Parameters
----------
builder : mrc.Builder
Pipeline builder instance.
"""

config = get_module_config(DFP_DATA_PREP, builder)

schema_config = config.get("schema", None)
schema_str = schema_config.get("schema_str", None)
encoding = schema_config.get("encoding", None)
timestamp_column_name = config.get("timestamp_column_name", None)

schema = pickle.loads(bytes(schema_str, encoding))

def process_features(message: MultiDFPMessage):
if (message is None):
return None

start_time = time.time()

# Process the columns
df_processed = process_dataframe(message.get_meta_dataframe(), schema)

# Apply the new dataframe, only the rows in the offset
message.set_meta_dataframe(list(df_processed.columns), df_processed)

if logger.isEnabledFor(logging.DEBUG):
duration = (time.time() - start_time) * 1000.0

logger.debug("Preprocessed %s data for logs in %s to %s in %s ms",
message.mess_count,
message.get_meta(timestamp_column_name).min(),
message.get_meta(timestamp_column_name).max(),
duration)

return message

def node_fn(obs: mrc.Observable, sub: mrc.Subscriber):
obs.pipe(ops.map(process_features)).subscribe(sub)

node = builder.make_node_full(DFP_DATA_PREP, node_fn)

builder.register_module_input("input", node)
builder.register_module_output("output", node)
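The module above reconstructs its schema with `pickle.loads(bytes(schema_str, encoding))`, which implies the pipeline-construction side serialized the schema object into a plain string inside the module config. A minimal sketch of that producing side, under stated assumptions (the helper name `make_schema_config`, the stand-in schema class, and the `latin1` encoding choice are illustrative, not part of this PR — the commit log notes the PR ultimately uses `dill` for this):

```python
import pickle


class DataFrameInputSchema:
    """Toy stand-in for morpheus.utils.column_info.DataFrameInputSchema."""

    def __init__(self, column_info=None):
        self.column_info = column_info or []


def make_schema_config(schema, encoding="latin1"):
    # Serialize the schema to a str so it can travel inside a plain-dict
    # module config; "latin1" maps all 256 byte values one-to-one, so the
    # bytes round-trip losslessly through a Python str.
    return {"schema_str": pickle.dumps(schema).decode(encoding), "encoding": encoding}


config = {
    "schema": make_schema_config(DataFrameInputSchema()),
    "timestamp_column_name": "timestamp",
}

# Module-side deserialization, mirroring dfp_preprocessing above.
schema_config = config["schema"]
restored = pickle.loads(bytes(schema_config["schema_str"], schema_config["encoding"]))
```

The string round-trip is what lets the schema ride along in the same nested-dict config that carries scalar options like `timestamp_column_name`.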
@@ -1,4 +1,4 @@
# Copyright (c) 2022, NVIDIA CORPORATION.
# Copyright (c) 2022-2023, NVIDIA CORPORATION.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -18,30 +18,34 @@
import mrc

import morpheus.modules.mlflow_model_writer # noqa: F401
from morpheus.utils.module_ids import MLFLOW_MODEL_WRITER
from morpheus.utils.module_ids import MODULE_NAMESPACE
from morpheus.utils.module_utils import get_module_config
from morpheus.utils.module_utils import load_module
from morpheus.utils.module_utils import register_module

logger = logging.getLogger(f"morpheus.{__name__}")
from ..utils.module_ids import DFP_MODEL_TRAIN_DEPLOY
from ..utils.module_ids import DFP_TRAINING

logger = logging.getLogger(__name__)

@register_module("DFPPipelineTraining", "morpheus_modules")
def dfp_pipeline_training(builder: mrc.Builder):

@register_module(DFP_MODEL_TRAIN_DEPLOY, MODULE_NAMESPACE)
def dfp_model_train_deploy(builder: mrc.Builder):
"""
This module function allows for the consolidation of multiple dfp pipeline training modules into a single module.
This module function allows for the consolidation of multiple dfp training and mlflow model deployment modules into
a single module.

Parameters
----------
builder : mrc.Builder
Pipeline builder instance.
"""

module_id = "DFPPipelineTraining"

config = get_module_config(module_id, builder)
config = get_module_config(DFP_MODEL_TRAIN_DEPLOY, builder)

dfp_training_conf = config.get("DFPTraining", None)
mlflow_model_writer_conf = config.get("MLFlowModelWriter", None)
dfp_training_conf = config.get(DFP_TRAINING, None)
mlflow_model_writer_conf = config.get(MLFLOW_MODEL_WRITER, None)

dfp_training_module = load_module(dfp_training_conf, builder=builder)
mlflow_model_writer_module = load_module(mlflow_model_writer_conf, builder=builder)
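The child-module configs here are looked up by module ID, so the consolidated module's config is a dict nested one level deep. A hedged sketch of that shape (the ID string values and the option names inside each child config are assumptions drawn from typical DFP examples, not from this diff):

```python
# Module-ID constants as referenced in the diff; the string values are assumed.
DFP_TRAINING = "DFPTraining"
MLFLOW_MODEL_WRITER = "MLFlowModelWriter"

model_train_deploy_conf = {
    DFP_TRAINING: {
        "module_id": DFP_TRAINING,
        # Hypothetical training option:
        "epochs": 30,
    },
    MLFLOW_MODEL_WRITER: {
        "module_id": MLFLOW_MODEL_WRITER,
        # Hypothetical model-naming option:
        "model_name_formatter": "DFP-{user_id}",
    },
}

# Mirrors the config.get(<MODULE_ID>, None) lookups in the module above.
dfp_training_conf = model_train_deploy_conf.get(DFP_TRAINING, None)
mlflow_model_writer_conf = model_train_deploy_conf.get(MLFLOW_MODEL_WRITER, None)
```

Keying children by module ID keeps the parent module generic: it can hand each sub-config straight to `load_module` without knowing the child's internals.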

This file was deleted.

@@ -1,4 +1,4 @@
# Copyright (c) 2022, NVIDIA CORPORATION.
# Copyright (c) 2022-2023, NVIDIA CORPORATION.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -13,70 +13,63 @@
# limitations under the License.

import logging
import pickle
import time

import dfp.modules.dfp_data_prep # noqa: F401
import dfp.modules.dfp_rolling_window # noqa: F401
import dfp.modules.dfp_split_users # noqa: F401
import dfp.modules.dfp_training # noqa: F401
import mrc
from mrc.core import operators as ops

from morpheus.utils.column_info import process_dataframe
import morpheus.modules.file_batcher # noqa: F401
import morpheus.modules.file_to_df # noqa: F401
from morpheus.utils.module_ids import FILE_BATCHER
from morpheus.utils.module_ids import FILE_TO_DATAFRAME
from morpheus.utils.module_ids import MODULE_NAMESPACE
from morpheus.utils.module_utils import get_module_config
from morpheus.utils.module_utils import load_module
from morpheus.utils.module_utils import register_module

from ..messages.multi_dfp_message import MultiDFPMessage
from ..utils.module_ids import DFP_DATA_PREP
from ..utils.module_ids import DFP_PREPROCESSING
from ..utils.module_ids import DFP_ROLLING_WINDOW
from ..utils.module_ids import DFP_SPLIT_USERS

logger = logging.getLogger(f"morpheus.{__name__}")
logger = logging.getLogger(__name__)


@register_module("DFPPreprocessing", "morpheus_modules")
def dfp_preprocessing(builder: mrc.Builder):
@register_module(DFP_PREPROCESSING, MODULE_NAMESPACE)
def dfp_pipeline_preprocessing(builder: mrc.Builder):
"""
Preprocessed data are produced by this module function for either inference or model training.
This module function allows for the consolidation of multiple dfp pipeline preprocessing modules
into a single module.

Parameters
----------
builder : mrc.Builder
Pipeline builder instance.
"""

module_id = "DFPPreprocessing"

config = get_module_config(module_id, builder)

schema_config = config.get("schema", None)
schema_str = schema_config.get("schema_str", None)
encoding = schema_config.get("encoding", None)
timestamp_column_name = config.get("timestamp_column_name", None)

schema = pickle.loads(bytes(schema_str, encoding))

def process_features(message: MultiDFPMessage):
if (message is None):
return None

start_time = time.time()

# Process the columns
df_processed = process_dataframe(message.get_meta_dataframe(), schema)

# Apply the new dataframe, only the rows in the offset
message.set_meta_dataframe(list(df_processed.columns), df_processed)

if logger.isEnabledFor(logging.DEBUG):
duration = (time.time() - start_time) * 1000.0

logger.debug("Preprocessed %s data for logs in %s to %s in %s ms",
message.mess_count,
message.get_meta(timestamp_column_name).min(),
message.get_meta(timestamp_column_name).max(),
duration)

return message

def node_fn(obs: mrc.Observable, sub: mrc.Subscriber):
obs.pipe(ops.map(process_features)).subscribe(sub)

node = builder.make_node_full(module_id, node_fn)

builder.register_module_input("input", node)
builder.register_module_output("output", node)
config = get_module_config(DFP_PREPROCESSING, builder)

file_batcher_conf = config.get(FILE_BATCHER, None)
file_to_df_conf = config.get(FILE_TO_DATAFRAME, None)
dfp_split_users_conf = config.get(DFP_SPLIT_USERS, None)
dfp_rolling_window_conf = config.get(DFP_ROLLING_WINDOW, None)
dfp_data_prep_conf = config.get(DFP_DATA_PREP, None)

# Load modules
file_batcher_module = load_module(file_batcher_conf, builder=builder)
file_to_dataframe_module = load_module(file_to_df_conf, builder=builder)
dfp_split_users_modules = load_module(dfp_split_users_conf, builder=builder)
dfp_rolling_window_module = load_module(dfp_rolling_window_conf, builder=builder)
dfp_data_prep_module = load_module(dfp_data_prep_conf, builder=builder)

# Make edge between the modules.
builder.make_edge(file_batcher_module.output_port("output"), file_to_dataframe_module.input_port("input"))
builder.make_edge(file_to_dataframe_module.output_port("output"), dfp_split_users_modules.input_port("input"))
builder.make_edge(dfp_split_users_modules.output_port("output"), dfp_rolling_window_module.input_port("input"))
builder.make_edge(dfp_rolling_window_module.output_port("output"), dfp_data_prep_module.input_port("input"))

# Register input and output port for a module.
builder.register_module_input("input", file_batcher_module.input_port("input"))
builder.register_module_output("output", dfp_data_prep_module.output_port("output"))
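The wiring above is strictly linear: each module's "output" port feeds the next module's "input" port, and only the outermost ports are re-exported as the composite module's "input" and "output". A toy stand-in showing just that topology (the `Module`/`Builder` classes below are illustrative sketches, not mrc's actual API):

```python
class Module:
    """Toy stand-in for a loaded mrc module with named ports."""

    def __init__(self, name):
        self.name = name


class Builder:
    """Toy stand-in recording edges, analogous to mrc.Builder.make_edge."""

    def __init__(self):
        self.edges = []

    def make_edge(self, upstream, downstream):
        self.edges.append((upstream.name, downstream.name))


names = ("file_batcher", "file_to_df", "split_users", "rolling_window", "data_prep")
modules = [Module(n) for n in names]

builder = Builder()
# Chain each module's output to the next module's input, as in the diff above.
for upstream, downstream in zip(modules, modules[1:]):
    builder.make_edge(upstream, downstream)
```

Because only the first module's input and the last module's output are exported, callers see the five-module chain as a single preprocessing module.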
@@ -1,4 +1,4 @@
# Copyright (c) 2022, NVIDIA CORPORATION.
# Copyright (c) 2022-2023, NVIDIA CORPORATION.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -23,16 +23,18 @@
from dfp.utils.logging_timer import log_time
from mrc.core import operators as ops

from morpheus.utils.module_ids import MODULE_NAMESPACE
from morpheus.utils.module_utils import get_module_config
from morpheus.utils.module_utils import register_module

from ..messages.multi_dfp_message import DFPMessageMeta
from ..messages.multi_dfp_message import MultiDFPMessage
from ..utils.module_ids import DFP_ROLLING_WINDOW

logger = logging.getLogger(f"morpheus.{__name__}")
logger = logging.getLogger(__name__)


@register_module("DFPRollingWindow", "morpheus_modules")
@register_module(DFP_ROLLING_WINDOW, MODULE_NAMESPACE)
def dfp_rolling_window(builder: mrc.Builder):
"""
This module function establishes a rolling window to maintain history.
@@ -43,9 +45,7 @@ def dfp_rolling_window(builder: mrc.Builder):
Pipeline builder instance.
"""

module_id = "DFPRollingWindow"

config = get_module_config(module_id, builder)
config = get_module_config(DFP_ROLLING_WINDOW, builder)

timestamp_column_name = config.get("timestamp_column_name", None)
min_history = config.get("min_history", None)
@@ -76,9 +76,6 @@ def get_user_cache(user_id: str):

yield user_cache

# # When it returns, make sure to save
# user_cache.save()

def build_window(message: DFPMessageMeta) -> MultiDFPMessage:

user_id = message.user_id
@@ -158,7 +155,7 @@ def on_data(message: DFPMessageMeta):
def node_fn(obs: mrc.Observable, sub: mrc.Subscriber):
obs.pipe(ops.map(on_data), ops.filter(lambda x: x is not None)).subscribe(sub)

node = builder.make_node_full(module_id, node_fn)
node = builder.make_node_full(DFP_ROLLING_WINDOW, node_fn)

builder.register_module_input("input", node)
builder.register_module_output("output", node)
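The `node_fn` in this module maps `on_data` over the stream and then filters out `None` results, so rolling windows that have not yet accumulated enough history are silently dropped rather than emitted downstream. A plain-Python sketch of that same map-then-filter contract (assuming mrc's `ops.map`/`ops.filter` behave analogously to their ReactiveX namesakes):

```python
def map_then_filter(fn, items):
    # Apply fn to each item, then drop None results: the pattern used by
    # obs.pipe(ops.map(on_data), ops.filter(lambda x: x is not None)).
    return [result for result in (fn(item) for item in items) if result is not None]


def on_data(count):
    # Toy stand-in for the real on_data: "emit" a window only once
    # enough history (here, count >= 2) has accumulated.
    return count * 2 if count >= 2 else None
```

Returning `None` from the map step is the module's way of suppressing output for a message without terminating the stream.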