[Tutorial] - Fixed lightweight component tutorial with bad metadata usage #3186

Merged · 2 commits · Mar 4, 2020
7 changes: 7 additions & 0 deletions samples/tutorials/mnist/00_Kubeflow_Cluster_Setup.ipynb
@@ -113,6 +113,13 @@
"! tar -xvf kfctl_v0.7.0_linux.tar.gz"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**If you are using AI Platform Notebooks**, your environment is already authenticated. Skip the following cell."
]
},
{
"cell_type": "code",
"execution_count": null,
58 changes: 20 additions & 38 deletions samples/tutorials/mnist/01_Lightweight_Python_Components.ipynb
@@ -186,9 +186,10 @@
"from typing import NamedTuple\n",
"\n",
"def my_divmod(dividend: float, \n",
" divisor: float, \n",
" output_dir: str = './'\n",
" ) -> NamedTuple('MyDivmodOutput', [('quotient', float), ('remainder', float)]):\n",
" divisor: float,\n",
" ) -> NamedTuple('MyDivmodOutput', [('quotient', float), ('remainder', float), \n",
" ('mlpipeline_ui_metadata', 'UI_metadata'), \n",
" ('mlpipeline_metrics', 'Metrics')]):\n",
" \n",
" '''Divides two numbers and calculate the quotient and remainder'''\n",
" \n",
@@ -201,7 +202,6 @@
"\n",
" (quotient, remainder) = divmod_helper(dividend, divisor)\n",
"\n",
" from tensorflow.python.lib.io import file_io\n",
" import json\n",
" \n",
" # Exports a sample tensorboard:\n",
@@ -211,8 +211,6 @@
" 'source': 'gs://ml-pipeline-dataset/tensorboard-train',\n",
" }]\n",
" }\n",
" with open(output_dir + 'mlpipeline-ui-metadata.json', 'w') as f:\n",
" json.dump(metadata, f)\n",
"\n",
" # Exports two sample metrics:\n",
" metrics = {\n",
@@ -224,12 +222,10 @@
" 'numberValue': float(remainder),\n",
" }]}\n",
"\n",
" with file_io.FileIO(output_dir + 'mlpipeline-metrics.json', 'w') as f:\n",
" json.dump(metrics, f)\n",
"\n",
" from collections import namedtuple\n",
" divmod_output = namedtuple('MyDivmodOutput', ['quotient', 'remainder'])\n",
" return divmod_output(quotient, remainder)"
" divmod_output = namedtuple('MyDivmodOutput', \n",
" ['quotient', 'remainder', 'mlpipeline_ui_metadata', 'mlpipeline_metrics'])\n",
" return divmod_output(quotient, remainder, json.dumps(metadata), json.dumps(metrics))"
]
},
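For reference, a sketch of the fixed component with the visible hunks above stitched together. It is reconstructed from this diff, not copied from the final file: the body of `divmod_helper` and the metric names are collapsed in the diff and are assumed here from the surrounding tutorial context.

```python
from typing import NamedTuple

def my_divmod(dividend: float,
              divisor: float,
             ) -> NamedTuple('MyDivmodOutput',
                             [('quotient', float), ('remainder', float),
                              ('mlpipeline_ui_metadata', 'UI_metadata'),
                              ('mlpipeline_metrics', 'Metrics')]):
    '''Divides two numbers and calculates the quotient and remainder.'''

    # Imports must live inside the function body for lightweight components.
    import numpy as np
    import json

    def divmod_helper(dividend, divisor):
        return np.divmod(dividend, divisor)   # assumed: collapsed in the diff

    (quotient, remainder) = divmod_helper(dividend, divisor)

    # Metadata for a sample TensorBoard visualization.
    metadata = {
        'outputs': [{
            'type': 'tensorboard',
            'source': 'gs://ml-pipeline-dataset/tensorboard-train',
        }]
    }

    # Two sample metrics (names assumed: collapsed in the diff).
    metrics = {
        'metrics': [{
            'name': 'quotient',
            'numberValue': float(quotient),
        }, {
            'name': 'remainder',
            'numberValue': float(remainder),
        }]
    }

    # Instead of writing mlpipeline-ui-metadata.json and mlpipeline-metrics.json
    # to disk, the metadata and metrics are now returned as JSON strings through
    # the named-tuple outputs.
    from collections import namedtuple
    divmod_output = namedtuple(
        'MyDivmodOutput',
        ['quotient', 'remainder', 'mlpipeline_ui_metadata', 'mlpipeline_metrics'])
    return divmod_output(quotient, remainder, json.dumps(metadata), json.dumps(metrics))
```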
{
Expand All @@ -247,7 +243,8 @@
"metadata": {},
"outputs": [],
"source": [
"divmod_op = comp.func_to_container_op(my_divmod)"
"divmod_op = comp.func_to_container_op(func=my_divmod, \n",
" base_image=\"tensorflow/tensorflow:1.15.0-py3\")"
Contributor:
Does it really need the tensorflow image?

Member Author:
Not really. Any base container image with Python 3.5+ and NumPy installed should work, I think. Previously, TensorFlow's file IO was used to dump the metadata to a file; the TensorFlow image still serves the purpose, though.
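A minimal sketch of that point, not what the PR uses: the base image name below is an assumption, and any image shipping Python 3.5+ and NumPy would do, since the component no longer relies on tensorflow's file_io to write the metadata files.

```python
# Sketch only: "python:3.7" is an assumed example image, not the one in this PR.
# Whatever image is chosen must already have NumPy available for my_divmod.
divmod_op = comp.func_to_container_op(
    func=my_divmod,
    base_image="python:3.7",  # assumed; needs NumPy installed in the image
)
```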

]
},
{
Expand Down Expand Up @@ -278,7 +275,7 @@
" \n",
" #Passing a task output reference as operation arguments\n",
" #For an operation with a single return value, the output reference can be accessed using `task.output` or `task.outputs['output_name']` syntax\n",
" divmod_task = divmod_op(add_task.output, b, '/')\n",
" divmod_task = divmod_op(add_task.output, b)\n",
"\n",
" #For an operation with a multiple return values, the output references can be accessed using `task.outputs['output_name']` syntax\n",
" result_task = add_op(divmod_task.outputs['quotient'], c)"
@@ -288,7 +285,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Compile the pipeline"
"## Submit the pipeline"
]
},
{
@@ -297,26 +294,7 @@
"metadata": {},
"outputs": [],
"source": [
"pipeline_func = calc_pipeline\n",
"pipeline_filename = pipeline_func.__name__ + '.pipeline.zip'\n",
"import kfp.compiler as compiler\n",
"compiler.Compiler().compile(pipeline_func, pipeline_filename)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"! ls ."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Submit the pipeline"
"pipeline_func = calc_pipeline"
]
},
{
@@ -325,14 +303,18 @@
"metadata": {},
"outputs": [],
"source": [
"experiment_name = 'python-functions'\n",
"\n",
"#Specify pipeline argument values\n",
"arguments = {'a': '7', 'b': '8'}\n",
"\n",
"experiment = client.create_experiment('python-functions')\n",
"\n",
"#Submit a pipeline run\n",
"run_name = pipeline_func.__name__ + ' run'\n",
"run_result = client.run_pipeline(experiment.id, run_name, pipeline_filename, arguments)"
"\n",
"# Submit pipeline directly from pipeline function\n",
"run_result = client.create_run_from_pipeline_func(pipeline_func, \n",
" experiment_name=experiment_name, \n",
" run_name=run_name, \n",
" arguments=arguments)"
]
},
{