524 doc remove usage of buffer stage from examples.rst (nv-morpheus#528)
- Removed `buffer` stage usage
- Updated graph images

Authors:
  - Bhargav Suryadevara (https://github.com/bsuryadevara)
  - https://github.com/bsuryadev

Approvers:
  - David Gardner (https://github.com/dagardner-nv)

URL: nv-morpheus#528
bsuryadevara authored Dec 10, 2022
1 parent e1e3b47 commit d80fc19
Showing 3 changed files with 19 additions and 21 deletions.
32 changes: 15 additions & 17 deletions docs/source/basics/examples.rst
@@ -27,7 +27,7 @@ This example will copy the values from Kafka into ``out.jsonlines``.
.. code-block:: bash
morpheus run pipeline-nlp --viz_file=basic_usage_img/simple_identity.png \
-   from-kafka --input_topic test_pcap \
+   from-kafka --bootstrap_servers localhost:9092 --input_topic test_pcap \
deserialize \
serialize \
to-file --filename out.jsonlines
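
The example above copies whatever is already sitting in the `test_pcap` topic, so the topic needs data before the pipeline runs. A minimal way to seed it for a quick test, assuming the stock Kafka console tools are on the `PATH` and a broker is listening on `localhost:9092` (older Kafka releases take `--broker-list` instead of `--bootstrap-server`):

    # Publish the sample records used by the other examples into the input topic
    kafka-console-producer.sh \
        --bootstrap-server localhost:9092 \
        --topic test_pcap < examples/data/pcap_dump.jsonlines

    # Once the pipeline has drained the topic, the copied records land in out.jsonlines
    wc -l out.jsonlines
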
@@ -63,16 +63,15 @@ This example will report the throughput on the command line.
monitor --description "Lines Throughput" --smoothing 0.1 --unit "lines" \
serialize \
to-file --filename out.jsonlines
-   Configuring Pipeline via CLI
-   Starting pipeline via CLI... Ctrl+C to Quit
-   Pipeline visualization saved to basic_usage_img/monitor_throughput.png
-   Lines Throughput: 88064lines [00:11, 7529.37lines/s]
+   Configuring Pipeline via CLI
+   Starting pipeline via CLI... Ctrl+C to Quit
+   Lines Throughput[Complete]: 93085 lines [00:04, 19261.06 lines/s]
+   Pipeline visualization saved to basic_usage_img/monitor_throughput.png
Multi-Monitor Throughput
^^^^^^^^^^^^^^^^^^^^^^^^

-   This example will report the throughput for each stage independently. Keep in mind, ``buffer`` stages are necessary to
-   decouple one stage from the next. Without the buffers, all monitoring would show the same throughput.
+   This example will report the throughput for each stage independently.

.. image:: img/multi_monitor_throughput.png
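
For the `Lines Throughput` example earlier in this hunk, the total printed by the `monitor` stage should line up with the number of records written by `to-file`. A quick post-run check, assuming the default paths from the example:

    # The updated sample output reports 93085 lines; the serialized output
    # file should contain the same number of records.
    wc -l out.jsonlines
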

@@ -81,20 +80,18 @@ decouple one stage from the next. Without the buffers, all monitoring would show the same throughput.
$ morpheus run pipeline-nlp --viz_file=basic_usage_img/multi_monitor_throughput.png \
from-file --filename examples/data/pcap_dump.jsonlines \
monitor --description "From File Throughput" \
-   buffer \
deserialize \
monitor --description "Deserialize Throughput" \
-   buffer \
serialize \
monitor --description "Serialize Throughput" \
-   buffer \
to-file --filename out.jsonlines --overwrite
Configuring Pipeline via CLI
Starting pipeline via CLI... Ctrl+C to Quit
+   From File Throughput[Complete]: 93085 messages [00:00, 93852.05 messages/s]
+   Deserialize Throughput[Complete]: 93085 messages [00:05, 16898.32 messages/s]
+   Serialize Throughput[Complete]: 93085 messages [00:08, 11110.10 messages/s]
+   Pipeline visualization saved to basic_usage_img/multi_monitor_throughput.png
-   From File Throughput: 93085messages [00:09, 83515.94messages/s]
-   Deserialize Throughput: 93085messages [00:20, 9783.56messages/s]
-   Serialize Throughput: 93085messages [00:20, 9782.07messages/s]
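
Stitched together from the unchanged and added lines of this hunk, the multi-monitor pipeline is now a single command with no `buffer` stages between the monitors:

    morpheus run pipeline-nlp --viz_file=basic_usage_img/multi_monitor_throughput.png \
        from-file --filename examples/data/pcap_dump.jsonlines \
        monitor --description "From File Throughput" \
        deserialize \
        monitor --description "Deserialize Throughput" \
        serialize \
        monitor --description "Serialize Throughput" \
        to-file --filename out.jsonlines --overwrite

Each `monitor` still reports the full 93085-message total, as the updated sample output above shows.
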
NLP Kitchen Sink
^^^^^^^^^^^^^^^^
@@ -108,17 +105,18 @@ This example shows an NLP Pipeline which uses most stages available in Morpheus.
$ morpheus run --num_threads=8 --pipeline_batch_size=1024 --model_max_batch_size=32 \
pipeline-nlp --viz_file=basic_usage_img/nlp_kitchen_sink.png \
from-file --filename examples/data/pcap_dump.jsonlines \
-   buffer --count=500 \
deserialize \
preprocess \
-   buffer \
inf-triton --model_name=sid-minibert-onnx --server_url=localhost:8001 \
monitor --description "Inference Rate" --smoothing=0.001 --unit "inf" \
add-class \
filter --threshold=0.8 \
serialize --include 'timestamp' --exclude '^_ts_' \
-   to-kafka --output_topic "inference_output"
+   to-kafka --bootstrap_servers localhost:9092 --output_topic "inference_output" \
+   monitor --description "ToKafka Rate" --smoothing=0.001 --unit "msg"
Configuring Pipeline via CLI
Starting pipeline via CLI... Ctrl+C to Quit
+   Inference Rate[Complete]: 93085 inf [00:07, 12334.49 inf/s]
+   ToKafka Rate[Complete]: 93085 msg [00:07, 13297.85 msg/s]
+   Pipeline visualization saved to basic_usage_img/nlp_kitchen_sink.png
-   Inference Rate: 16384inf [19:50, 13.83inf/s]
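
Likewise, the post-change kitchen-sink command, read without the removed `buffer` stages and with the new `to-kafka` monitor appended, becomes:

    morpheus run --num_threads=8 --pipeline_batch_size=1024 --model_max_batch_size=32 \
        pipeline-nlp --viz_file=basic_usage_img/nlp_kitchen_sink.png \
        from-file --filename examples/data/pcap_dump.jsonlines \
        deserialize \
        preprocess \
        inf-triton --model_name=sid-minibert-onnx --server_url=localhost:8001 \
        monitor --description "Inference Rate" --smoothing=0.001 --unit "inf" \
        add-class \
        filter --threshold=0.8 \
        serialize --include 'timestamp' --exclude '^_ts_' \
        to-kafka --bootstrap_servers localhost:9092 --output_topic "inference_output" \
        monitor --description "ToKafka Rate" --smoothing=0.001 --unit "msg"

The classified results can then be checked straight off the output topic, again assuming the standard Kafka console tools and a broker on `localhost:9092`:

    # Peek at the first few classified records published by the pipeline
    kafka-console-consumer.sh \
        --bootstrap-server localhost:9092 \
        --topic inference_output --from-beginning --max-messages 10
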
4 changes: 2 additions & 2 deletions docs/source/basics/img/multi_monitor_throughput.png
4 changes: 2 additions & 2 deletions docs/source/basics/img/nlp_kitchen_sink.png
