[Search] Inference Management FTR Tests #188438
base: main
Conversation
I still need to pull this down and test them, but if we're enabling the feature for ES3 then the tests should not be disabled in MKI IMO
const ml = getService('ml');

describe('Serverless Inference Management UI', function () {
  this.tags('skipMKI');
if we're enabling the feature flag then we want to remove this tag. Enabling the feature means we WANT to run these against MKI.
We should verify the tests against MKI before merging this.
@@ -17,17 +17,17 @@ xpack.osquery.enabled: false
 ## Fine-tune the search solution feature privileges. Also, refer to `serverless.yml` for the project-agnostic overrides.
 xpack.features.overrides:
   ### Dashboards feature is moved from Analytics category to the Search one.
-  dashboard.category: "enterpriseSearch"
+  dashboard.category: 'enterpriseSearch'
do we need these quote changes?
We do not need these quote changes. They got added when I ran the lint command, so I assumed something changed in the ESLint configuration that forced the file to update.
Flaky Test Runner Stats
🎉 All tests passed! - kibana-flaky-test-suite-runner#6558
✅ x-pack/test_serverless/functional/test_suites/search/config.ts: 25/25 tests passed.
@elasticmachine merge upstream
💛 Build succeeded, but was flaky
@TattdCodeMonkey: can you please give this another review?
@@ -65,7 +65,7 @@ data_visualizer.resultLinks.fileBeat.enabled: false
 xpack.searchPlayground.ui.enabled: true

 # Search InferenceEndpoints
-xpack.searchInferenceEndpoints.ui.enabled: false
+xpack.searchInferenceEndpoints.ui.enabled: true
Based on recommendation #5, we may need to hide the Inference Endpoints page in the serverless environment.
yes, we will not be merging this PR as the requirements changed
@Samiul-TheSoccerFan we can merge this PR; we just need to not enable the FF. Instead, run the new FTRs in the feature-flag config and enable it only there.
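As a sketch of that suggestion (the file location and naming are illustrative; the actual feature-flag FTR config for the search project may differ), the flag would stay `false` in the default serverless config and be turned on only in the feature-flag variant:

```yaml
# Hypothetical feature-flag-only override; the real config file name and
# location under x-pack/test_serverless depend on the current FTR setup.
xpack.searchInferenceEndpoints.ui.enabled: true
```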
      num_threads: 1,
    },
  };
  await ml.api.createInferenceEndpoint('endpoint-1', 'sparse_embedding', modelConfig);
This endpoint was created in the ‘before’ block. We may need to write some cleanup code in the ‘after’ block to delete the endpoint and remove the underlying trained model if it was added during the creation of the endpoint.
I’m suggesting this because we previously had a CI/CD failure due to not cleaning up created resources. See this PR as an example of cleanup.
In the last test, we removed the inference endpoint from the UI. Do we still need to do the cleanup, or will it be handled by the application itself?
Please check whether creation of inference endpoint adds any new trained model. If so, we need to clean up the trained model.
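One possible shape for that cleanup, as a hedged sketch: the two delete helpers below (`deleteInferenceEndpoint`, `deleteTrainedModel`) are hypothetical stand-ins for whatever the FTR `ml` service actually exposes; only `createInferenceEndpoint` is confirmed by this PR. A stub replaces the real service so the sketch is self-contained, and it only demonstrates the intended ordering: delete the endpoint first, then any trained model its creation pulled in.

```typescript
// Hedged sketch of an `after`-block cleanup. The method names below are
// hypothetical; check the real FTR `ml` service for the actual helpers.
interface InferenceCleanup {
  deleteInferenceEndpoint(id: string): Promise<void>;
  deleteTrainedModel(id: string): Promise<void>;
}

// Stub standing in for the FTR `ml.api` service; records call order.
const calls: string[] = [];
const mlApiStub: InferenceCleanup = {
  async deleteInferenceEndpoint(id) {
    calls.push(`endpoint:${id}`);
  },
  async deleteTrainedModel(id) {
    calls.push(`model:${id}`);
  },
};

// Remove the endpoint first, then the trained model it may have added.
async function cleanupInference(api: InferenceCleanup, modelId: string) {
  await api.deleteInferenceEndpoint('endpoint-1');
  await api.deleteTrainedModel(modelId);
}

// 'trained-model-id' is a placeholder for whatever model the endpoint created.
cleanupInference(mlApiStub, 'trained-model-id').then(() => {
  console.log(calls.join(','));
  // prints "endpoint:endpoint-1,model:trained-model-id"
});
```

The same ordering would go into the suite's `after` hook, mirroring the `before` block that called `ml.api.createInferenceEndpoint`.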
      num_threads: 1,
    },
  };
  await ml.api.createInferenceEndpoint('endpoint-1', 'sparse_embedding', modelConfig);
Similar to this comment, we need to write some clean-up in the 'after' block.
Deprioritizing this PR as the plugin flow is expected to change. We will revisit this PR once we decide to move forward in Serverless.
Could you switch this back to a draft PR?
Summary
FTR tests for the Inference Management UI. Due to a race condition between license subscription and app registration, these tests are skipped for Stack only.
Checklist
Delete any items that are not applicable to this PR.