
[AWS SageMaker] Integ test to check CloudWatch logs print feature #4056

Merged: 4 commits, Jul 9, 2020

Conversation

@akartsky (Contributor)

Integration test checks if there is any error in fetching logs

@kubeflow-bot

This change is Reviewable

@RedbackThomson (Contributor) left a comment


So this integration test just checks to make sure that we didn't see an error in the logs? We also need to test to ensure that we DID see the correct stdout in the logging.

@akartsky (Contributor, Author) left a comment


> So this integration test just checks to make sure that we didn't see an error in the logs? We also need to test to ensure that we DID see the correct stdout in the logging.

This is already covered by the unit tests, which verify that all logs are printed.

```python
output = utils.run_command(
    f"argo logs {workflow_name} -n {utils.get_kfp_namespace()}"
)
output = get_workflow_logs(workflow_name)
```
Contributor

nit: Ideally there should be some error handling here if the workflow name is not found, but I guess argo will take care of returning the right error.
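A hedged sketch of what that error handling could look like, assuming hypothetical `run_command` / `get_workflow_logs` helpers shaped like the test utilities rather than the PR's actual code:

```python
# Hypothetical sketch, not the PR's code: surface the CLI's error
# instead of silently treating its stderr/output as the logs.
import subprocess


def run_command(args):
    """Run a CLI command; raise with the command's stderr on failure."""
    result = subprocess.run(args, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(
            f"Command {' '.join(args)} failed: {result.stderr.strip()}"
        )
    return result.stdout


def get_workflow_logs(workflow_name, namespace="kubeflow"):
    # With the returncode check above, a missing workflow raises a
    # RuntimeError rather than returning argo's error text as "logs".
    return run_command(["argo", "logs", workflow_name, "-n", namespace])
```

This keeps the test utility honest even if argo's own error message changes format.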

Contributor

What is the response from get_workflow_logs if there is a bad input? I imagine it would be a garbled error message that we are then using as the name.
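One hedged way to guard against that bad-input case, using a hypothetical `validate_workflow_name` helper that is not part of this PR: reject anything that doesn't look like a Kubernetes object name before shelling out, so a garbled error string can never be reused as a name.

```python
# Hypothetical guard, not part of the PR: reject inputs that are not
# plausible Kubernetes object names before passing them to argo.
import re

# Kubernetes object names: lowercase alphanumerics and '-', up to 253 chars,
# starting and ending with an alphanumeric.
_NAME_RE = re.compile(r"^[a-z0-9]([-a-z0-9]*[a-z0-9])?$")


def validate_workflow_name(workflow_name):
    if (not isinstance(workflow_name, str)
            or len(workflow_name) > 253
            or not _NAME_RE.match(workflow_name)):
        raise ValueError(f"Invalid workflow name: {workflow_name!r}")
    return workflow_name
```

An error string like `"Error: workflow not found"` contains spaces and uppercase letters, so it fails the check immediately instead of propagating into the next command.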

@RedbackThomson (Contributor)

/approve
/lgtm

@k8s-ci-robot (Contributor)

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: RedbackThomson

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@k8s-ci-robot k8s-ci-robot merged commit 799db47 into kubeflow:master Jul 9, 2020
Jeffwan pushed a commit to Jeffwan/pipelines that referenced this pull request Dec 9, 2020
…beflow#4056)

* Integ test for cw logs

* Update license file version to 0.5.3

* update version in yaml

* add changelog

6 participants