Regenerate monitor code #19375
Merged: 9 commits into Azure:main on Jun 23, 2021

Conversation

@rakshith91 (Contributor)

Fixes #19274
Fixes #19273

@rakshith91 (Contributor Author)

/azp run python - monitor - tests

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@check-enforcer

This pull request is protected by Check Enforcer.

What is Check Enforcer?

Check Enforcer helps ensure all pull requests are covered by at least one check-run (typically an Azure Pipeline). When all check-runs associated with this pull request pass then Check Enforcer itself will pass.

Why am I getting this message?

You are getting this message because Check Enforcer did not detect any check-runs being associated with this pull request within five minutes. This may indicate that your pull request is not covered by any pipelines, and so Check Enforcer is correctly blocking the pull request from being merged.

What should I do now?

If the check-enforcer check-run is not passing and all other check-runs associated with this PR are passing (excluding license-cla) then you could try telling Check Enforcer to evaluate your pull request again. You can do this by adding a comment to this pull request as follows:
/check-enforcer evaluate
Typically, evaluation only takes a few seconds. If you know that your pull request is not covered by a pipeline and this is expected, you can override Check Enforcer using the following command:
/check-enforcer override
Note that using the override command triggers alerts so that follow-up investigations can occur (PRs still need to be approved as normal).

What if I am onboarding a new service?

Often, new services do not have validation pipelines associated with them. To bootstrap pipelines for a new service, you can issue the following command as a pull request comment:
/azp run prepare-pipelines
This will run a pipeline that analyzes the source tree and creates the pipelines necessary to build and validate your pull request. Once the pipeline has been created, you can trigger it using the following comment:
/azp run python - [service] - ci

@rakshith91 (Contributor Author)

/azp run python - monitor - tests

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@rakshith91 (Contributor Author)

/azp run python - monitor - tests

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@rakshith91 (Contributor Author)

/azp run python - monitor - tests

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@rakshith91 (Contributor Author)

/azp run python - monitor - tests

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@rakshith91 (Contributor Author)

/azp run python - monitor - tests

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

Comment on lines 78 to 80
self.tables = kwargs.get("tables", None)
self.errors = kwargs.get("errors", None)
self.statistics = kwargs.pop("statistics", None)
self.render = kwargs.pop("render", None)
Member

should the last two also be get for consistency? how does it affect what is being passed in? or does it not matter?
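The difference the question turns on can be shown with a small standalone sketch (hypothetical function names, not code from this PR): kwargs.get leaves the keyword in kwargs, so it is still forwarded to any downstream call that receives **kwargs, while kwargs.pop consumes it so it is not passed along.

def downstream(**kwargs):
    # Receives whatever the handler did not consume.
    return sorted(kwargs)

def handler(**kwargs):
    tables = kwargs.get("tables", None)          # stays in kwargs
    statistics = kwargs.pop("statistics", None)  # removed from kwargs
    return downstream(**kwargs)

print(handler(tables=1, statistics=2))  # ['tables'] -- 'statistics' was popped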

prefer += " "
prefer += "include-render=true"

headers = kwargs.get("headers", None)
Member

why get vs. pop?
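For context, the quoted snippet appears to fold optional query preferences into a single Prefer header value before the request is sent. A minimal, hedged sketch of that idea follows; the helper name, the option names other than include-render=true, and the separator are illustrative assumptions rather than this PR's implementation.

def build_prefer_header(server_timeout=None, include_statistics=False, include_render=False):
    # Collect the requested preferences as key=value tokens.
    parts = []
    if server_timeout is not None:
        parts.append("wait={}".format(server_timeout))  # assumed option name
    if include_statistics:
        parts.append("include-statistics=true")         # assumed option name
    if include_render:
        parts.append("include-render=true")             # shown in the snippet above
    return ",".join(parts)

headers = {"Prefer": build_prefer_header(server_timeout=60, include_render=True)}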

workspace=os.environ['LOG_WORKSPACE_ID']
query="AppRequests",
workspace=os.environ['LOG_WORKSPACE_ID'],
include_statistics=True
Member

should sample also have include_render, server_timeout or nah?

Contributor Author

We have a different sample for server timeout; for render, we are avoiding it for now.
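A hedged sketch of how the discussed sample might look with both options in one place, combining include_statistics from the quoted snippet with the server_timeout mentioned above. The client and method names follow the azure-monitor-query preview package of the time and, like the server_timeout keyword, are assumptions rather than code from this PR.

import os

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient  # assumed client name

client = LogsQueryClient(DefaultAzureCredential())

response = client.query(                       # assumed method name
    workspace=os.environ['LOG_WORKSPACE_ID'],  # keyword shown in the quoted snippet
    query="AppRequests",
    include_statistics=True,                   # ask the service for query statistics
    server_timeout=60,                         # assumed keyword; covered by a separate sample
)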

@rakshith91 (Contributor Author)

/azp run python - monitor - tests

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@rakshith91 (Contributor Author)

/azp run python - monitor - tests

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@rakshith91 (Contributor Author)

/azp run python - monitor - tests

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

@rakshith91 merged commit 676b80e into Azure:main on Jun 23, 2021
iscai-msft added a commit to iscai-msft/azure-sdk-for-python that referenced this pull request Jun 24, 2021
…into get_testserver_working

* 'main' of https://github.com/Azure/azure-sdk-for-python: (45 commits)
  ignore coretestserver readme (Azure#19436)
  Add Ubuntu 20 to local dns bypass template (Azure#19432)
  Sync eng/common directory with azure-sdk-tools for PR 1729 (Azure#19415)
  Async/BearerTokenCredentialPolicy consistently calls on_exception (Azure#19195)
  [EventHubs] Fix bug in sending stress test code and update default stress test settings (Azure#19429)
  [EventHubs] Get IoT Hub Name from Redirect Address in sample (Azure#19314)
  [textanalytics] regen on v3.1 (Azure#19193)
  Adapt EG to arm template (Azure#19262)
  [Key Vault] Extend pipeline test timeout (Azure#19404)
  Update platform matrix to ubuntu 20 (Azure#19296)
  [AppConfig] Add lock to SyncTokenPolicy (Azure#19395)
  Regenerate monitor code (Azure#19375)
  Increment version for keyvault releases (Azure#19402)
  Aggregation should be a list (Azure#19381)
  [azure-mgmt-monitor] skip test to unblock ci (Azure#19390)
  Cloud event should parse smaller ms precisions (Azure#19259)
  Update release date (Azure#19399)
  [Communication]: use x-ms-date for hmac (Azure#19396)
  [Key Vault] Performance tests for certificates, keys, and secrets (Azure#19002)
  Deprecate azure-monitor (Azure#19384)
  ...

Successfully merging this pull request may close these issues.

regenerate code with the latest commit statistics, render should be supported