Pipelines for both training and inference requests #1579
-
Good morning. We would like to set up a "training" pipeline for our models, as well as another pipeline for inference requests. The idea is to receive data via Kafka (the messages are snapshots sent by osquery), train the relevant model according to the type of data received, and pass it to MLflow for insertion into the model repository (new model versions for inference requests). We're using TensorFlow SavedModels. Put simply, we want to do continuous training. The question we're asking is how to switch training context according to the input data type. Is that possible? A sort of "dispatcher" stage that would route messages to dedicated stages for training on a specific data type? Basically, we would like to follow this pattern: https://docs.nvidia.com/morpheus/developer_guide/guides/6_digital_fingerprinting_reference.html
Replies: 1 comment
-
@nuxwin Yes, Morpheus allows you to fork your pipeline, and you can then use control messages to control message flow across the parallel branches. This is done in the Modular DFP Pipeline Example, where we use a Broadcast node to fork the pipeline into parallel branches for training and inference. The Control Message Filter module is used to filter message flow on each branch based on task type; in your case, you would filter on data type.

If you choose to build your pipeline using stages instead of modules, you can use a split stage, as done here, to fork your pipeline.
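To make the broadcast-then-filter pattern concrete, here is a minimal plain-Python sketch of the idea. Note this is illustrative only and does not use the real Morpheus API: `Message`, `make_type_filter`, and `broadcast` are hypothetical stand-ins for a ControlMessage, a filter module, and a Broadcast node.

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for a control message: a payload plus
# metadata identifying the type of data it carries.
@dataclass
class Message:
    data_type: str          # e.g. "process_events", "socket_events"
    payload: dict = field(default_factory=dict)

def make_type_filter(accepted_type):
    """The 'dispatcher' role: pass messages of the accepted data
    type through unchanged, and drop everything else."""
    def type_filter(msg):
        return msg if msg.data_type == accepted_type else None
    return type_filter

def broadcast(msg, branches):
    """Fan the message out to every branch. Each branch is a list of
    callables applied in order; processing stops on a branch as soon
    as a filter drops the message (returns None)."""
    results = {}
    for name, stages in branches.items():
        current = msg
        for stage in stages:
            current = stage(current)
            if current is None:
                break
        results[name] = current
    return results

# Two parallel training branches, each guarded by its own type filter.
branches = {
    "train_process": [make_type_filter("process_events"),
                      lambda m: f"trained process model on {m.payload}"],
    "train_socket":  [make_type_filter("socket_events"),
                      lambda m: f"trained socket model on {m.payload}"],
}

out = broadcast(Message("process_events", {"rows": 128}), branches)
print(out["train_process"])  # reaches the process-events trainer
print(out["train_socket"])   # dropped by the socket branch's filter -> None
```

In Morpheus itself, the Broadcast node plays the role of `broadcast` (every branch sees every message) and the Control Message Filter module plays the role of `make_type_filter`, so each training branch only ever acts on its own data type.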