Feature Area
/area frontend
What feature would you like to see?
For runs that output many metrics, the current experiment page only displays a limited number of metrics.
It would be helpful to add a selector that lets users choose which metrics to display, so the selected metrics can be used for sorting and further analysis.
Example (screenshots):
- A run that generates many metrics
- Only 2 metrics are displayed on the experiment page
What is the use case or pain point?
To analyze runs that output many metrics, and to reduce the "noisy" information displayed on the run comparison page.
Is there a workaround currently?
The "Compare runs" function partially achieves this, but an overview with sorting over the selected "focus" metrics would be better.
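To make the request concrete, here is a minimal sketch in plain Python of the behavior being asked for (the data shapes and function names are hypothetical, not the actual frontend code): each run reports many metrics, but the experiment table shows only the user-selected ones and sorts by one of them.

```python
# Hypothetical run records: each run outputs many metrics,
# but the table should display only the selected "focus" metrics.
runs = [
    {"name": "run-a", "metrics": {"accuracy": 0.91, "auc": 0.88, "loss": 0.30}},
    {"name": "run-b", "metrics": {"accuracy": 0.95, "auc": 0.85, "loss": 0.25}},
]

def display_rows(runs, selected, sort_by):
    """Project each run onto the selected metrics and sort by one of them."""
    rows = [
        {"name": r["name"], **{m: r["metrics"].get(m) for m in selected}}
        for r in runs
    ]
    return sorted(rows, key=lambda row: row[sort_by], reverse=True)

# A metric selector in the UI would drive `selected` and `sort_by`.
rows = display_rows(runs, selected=["accuracy", "auc"], sort_by="accuracy")
```

Unselected metrics (here `loss`) are dropped from the rows entirely, which is what reduces the noise on the comparison page.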
Love this idea? Give it a 👍. We prioritize fulfilling features with the most 👍.
We are thinking about implementing this feature in KFP v2, because it has a basic requirement:
In v2, we envision that each metric can be a separate artifact, and the pipeline DSL can allow users to return the artifacts they select.
With these two new features, it will be possible to select pipeline-level output artifacts, so that only the metrics returned as pipeline outputs are associated with the pipeline.
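The idea above can be sketched in plain Python (hypothetical names, not the real KFP v2 DSL): a step emits each metric as a separate artifact, and only the artifacts the pipeline explicitly returns become pipeline-level outputs.

```python
def train_step():
    # In v2, each of these would be a separate metric artifact.
    return {"accuracy": 0.95, "auc": 0.91, "loss": 0.25}

def pipeline():
    artifacts = train_step()
    # The DSL would let the user select which artifacts to return;
    # only these are associated with the pipeline as outputs.
    return {name: artifacts[name] for name in ("accuracy", "auc")}

outputs = pipeline()
```

With that selection in place, the experiment page could display and sort by exactly the pipeline-level metric outputs, instead of every metric every step produces.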
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.