Test failure: test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import Unknown\nbackend = RuntimeBackend()\ntry:\n grants = backend.fetch("SHOW GRANTS ON METASTORE")\n print("FAILED")\nexcept Unknown:\n print("PASSED")\n] #489

Description
❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import Unknown\nbackend = RuntimeBackend()\ntry:\n grants = backend.fetch("SHOW GRANTS ON METASTORE")\n print("FAILED")\nexcept Unknown:\n print("PASSED")\n]: AssertionError: assert 'FAILED' == 'PASSED' (25.084s)
AssertionError: assert 'FAILED' == 'PASSED'
  
  - PASSED
  + FAILED
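The snippet embedded in the test ID drives this check: `RuntimeBackend.fetch("SHOW GRANTS ON METASTORE")` is expected to raise `databricks.sdk.errors.Unknown`, and the harness compares the command's stdout against "PASSED". A minimal sketch of that contract, using stand-in classes (not the real databricks-sdk, which only runs on a cluster):

```python
# Hypothetical stand-ins for databricks.sdk.errors.Unknown and RuntimeBackend,
# for illustration only -- the real test executes on a Databricks cluster.
class Unknown(Exception):
    """Stand-in for databricks.sdk.errors.Unknown."""

class StubRuntimeBackend:
    def __init__(self, raises: bool):
        self._raises = raises

    def fetch(self, sql: str):
        # The real backend submits `sql` to the cluster; here we simulate
        # either the expected error mapping or the observed behavior.
        if self._raises:
            raise Unknown(sql)
        return iter([])  # query "succeeded" -- the failure mode in this issue

def run_check(backend) -> str:
    try:
        backend.fetch("SHOW GRANTS ON METASTORE")
        return "FAILED"   # no exception: error was not surfaced as Unknown
    except Unknown:
        return "PASSED"   # expected path

# Expected behavior vs. what the log below shows:
assert run_check(StubRuntimeBackend(raises=True)) == "PASSED"
assert run_check(StubRuntimeBackend(raises=False)) == "FAILED"
```

Note that the command result in the log is the literal text "FAILED" rather than a traceback, which suggests `fetch()` returned normally: either `SHOW GRANTS ON METASTORE` now succeeds for this identity, or the failure is no longer raised as `Unknown`.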
05:17 DEBUG [databricks.sdk] Loaded from environment
05:17 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
05:17 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
05:17 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
05:17 INFO [databricks.sdk] Using Databricks Metadata Service authentication
[gw1] linux -- Python 3.10.19 /home/runner/work/lsql/lsql/.venv/bin/python
05:17 DEBUG [databricks.sdk] Loaded from environment
05:17 DEBUG [databricks.sdk] Ignoring pat auth, because metadata-service is preferred
05:17 DEBUG [databricks.sdk] Ignoring basic auth, because metadata-service is preferred
05:17 DEBUG [databricks.sdk] Attempting to configure auth: metadata-service
05:17 INFO [databricks.sdk] Using Databricks Metadata Service authentication
05:17 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/153383108335587",
<       "display": "users",
<       "type": "direct",
<       "value": "**REDACTED**"
<     },
<     "... (1 additional elements)"
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
05:17 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmp0lxnye0g/working-copy in /tmp/tmp0lxnye0g
05:17 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.OVoy/wheels/databricks_labs_lsql-0.16.1+320260219051747-py3-none-any.whl
05:17 DEBUG [databricks.sdk] Retry disabled for non-seekable stream: type=<class 'dict'>
05:17 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.OVoy/wheels) does not exist."
< }
05:17 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.OVoy/wheels
05:17 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.OVoy/wheels"
> }
< 200 OK
< {}
05:17 DEBUG [databricks.sdk] Retry disabled for non-seekable stream: type=<class 'dict'>
05:17 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 3768182285747818
< }
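The 404 → mkdirs → retry sequence above is a common upload pattern: attempt the workspace import, and if the parent folder is missing, create it and retry once. A generic sketch of that flow, with hypothetical stand-in functions (not the blueprint installer's actual API):

```python
# Hypothetical sketch of the create-parent-then-retry pattern seen in the log.
import posixpath

class ResourceDoesNotExist(Exception):
    """Stand-in for the 404 RESOURCE_DOES_NOT_EXIST error."""

def upload_with_mkdirs(path: str, data: bytes, import_file, mkdirs) -> None:
    """Try the import; on a missing-parent 404, create folders and retry once."""
    try:
        import_file(path, data)
    except ResourceDoesNotExist:
        mkdirs(posixpath.dirname(path))
        import_file(path, data)  # second attempt after creating the parent

# Tiny in-memory fake workspace to exercise the flow:
folders, files = {"/"}, {}

def mkdirs(folder: str) -> None:
    folders.add(folder)

def import_file(path: str, data: bytes) -> None:
    if posixpath.dirname(path) not in folders:
        raise ResourceDoesNotExist(path)
    files[path] = data

upload_with_mkdirs("/Users/me/.install/wheels/pkg.whl", b"...", import_file, mkdirs)
```

Retrying only once is deliberate here: if the import still fails after the parent folder exists, the error is something other than a missing folder and should propagate.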
05:17 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
05:17 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.OVoy/version.json
05:17 DEBUG [databricks.sdk] Retry disabled for non-seekable stream: type=<class 'dict'>
05:17 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 3768182285747820
< }
05:17 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "ON_DEMAND_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 8.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 32768,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "liran.bareket@databricks.com",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "liran.bareket@databricks.com",
<     "DatabricksInstanceGroupId": "-7571316921879686317",
<     "DatabricksInstancePoolCreatorId": "6779888502363704",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "labs-oss@databricks.com",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.12.11",
<     "instance_id": "3c075f30c7ac461aac3bed23b75df53d",
<     "ngrok_endpoint_base_domain": "green.mux.ngrok-dataplane.wildcard",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "32e8f19b9eac40f08e3ae86351bf3f4a",
<     "node_type_id": "Standard_D8ads_v6",
<     "private_ip": "10.179.14.11",
<     "public_dns": "",
<     "start_timestamp": 1771478184840
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D8ads_v6",
<   "effective_spark_version": "16.4.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1771478207227,
<   "last_restarted_time": 1771478235069,
<   "last_state_loss_time": 1771478235025,
<   "node_type_id": "Standard_D8ads_v6",
<   "num_workers": 0,
<   "pinned_by_user_name": "6779888502363704",
<   "release_version": "16.4.17",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 7456092220667130223,
<   "spark_version": "16.4.x-scala2.12",
<   "spec": {
<     "autotermination_minutes": 60,
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "16.4.x-scala2.12"
<   },
<   "start_time": 1759339672984,
<   "state": "RUNNING",
<   "state_message": ""
< }
05:17 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "2423109679674500450"
< }
05:17 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=2423109679674500450
< 200 OK
< {
<   "id": "2423109679674500450",
<   "status": "Pending"
< }
05:17 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=2423109679674500450: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
05:17 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=2423109679674500450
< 200 OK
< {
<   "id": "2423109679674500450",
<   "status": "Pending"
< }
05:17 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=2423109679674500450: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
05:17 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=2423109679674500450
< 200 OK
< {
<   "id": "2423109679674500450",
<   "status": "Running"
< }
05:17 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (110 more bytes)",
>   "contextId": "2423109679674500450",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "17df7a397b89499a9901aa54dd8b297e"
< }
05:17 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=17df7a397b89499a9901aa54dd8b297e&contextId=2423109679674500450
< 200 OK
< {
<   "id": "17df7a397b89499a9901aa54dd8b297e",
<   "results": null,
<   "status": "Running"
< }
05:17 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=17df7a397b89499a9901aa54dd8b297e, context_id=2423109679674500450: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
05:17 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=17df7a397b89499a9901aa54dd8b297e&contextId=2423109679674500450
< 200 OK
< {
<   "id": "17df7a397b89499a9901aa54dd8b297e",
<   "results": null,
<   "status": "Running"
< }
05:17 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=17df7a397b89499a9901aa54dd8b297e, context_id=2423109679674500450: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
05:18 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=17df7a397b89499a9901aa54dd8b297e&contextId=2423109679674500450
< 200 OK
< {
<   "id": "17df7a397b89499a9901aa54dd8b297e",
<   "results": null,
<   "status": "Running"
< }
05:18 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=17df7a397b89499a9901aa54dd8b297e, context_id=2423109679674500450: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
05:18 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=17df7a397b89499a9901aa54dd8b297e&contextId=2423109679674500450
< 200 OK
< {
<   "id": "17df7a397b89499a9901aa54dd8b297e",
<   "results": null,
<   "status": "Running"
< }
05:18 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=17df7a397b89499a9901aa54dd8b297e, context_id=2423109679674500450: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~4s)
05:18 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=17df7a397b89499a9901aa54dd8b297e&contextId=2423109679674500450
< 200 OK
< {
<   "id": "17df7a397b89499a9901aa54dd8b297e",
<   "results": {
<     "data": "Processing /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.OVoy/wheels/databricks_labs_ls... (5292 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
05:18 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "from databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import Unkno... (145 more bytes)",
>   "contextId": "2423109679674500450",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "656f72a7d5a14681ab2aaee5bb0a9105"
< }
05:18 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=656f72a7d5a14681ab2aaee5bb0a9105&contextId=2423109679674500450
< 200 OK
< {
<   "id": "656f72a7d5a14681ab2aaee5bb0a9105",
<   "results": {
<     "data": "FAILED",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
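The log above follows the Command Execution API 1.2 flow: create an execution context, submit a command, then poll `commands/status` with a growing sleep (~1s, ~2s, ...) until the status leaves `Pending`/`Running`. A generic sketch of that polling loop, with a stand-in `get_status` callable rather than the SDK's actual client:

```python
import time
from typing import Callable

def poll_until_finished(get_status: Callable[[], str],
                        max_sleep: float = 10.0) -> str:
    """Poll with a linearly growing, capped sleep, as the SDK log shows."""
    sleep = 1.0
    while True:
        status = get_status()
        if status not in ("Pending", "Running"):
            return status  # e.g. "Finished", "Error", "Cancelled"
        time.sleep(min(sleep, max_sleep))
        sleep += 1.0

# Fake status source: "Running" twice, then "Finished".
statuses = iter(["Running", "Running", "Finished"])
result = poll_until_finished(lambda: next(statuses), max_sleep=0.01)
```

The cap on the sleep keeps the loop responsive for long-running commands; the terminal status is returned to the caller, which then fetches `results.data` (here, the "FAILED" text that trips the assertion).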

Running from nightly #464

Labels: bug (Something isn't working)