- Run `build/sbt clean package publishLocal` to publish the Spark connector to the local Maven cache
- These tests currently assume you have existing cloud resources set up (e.g. an S3 bucket and IAM role for S3)
First, update the integration test `server.properties` (an example snippet follows the list below):
- For S3: set `s3.bucketPath.0`, `s3.region.0`, `s3.awsRoleArn.0`, `s3.accessKey.0`, and `s3.secretKey.0`
- For GCP: set `gcs.bucketPath.0` and `gcs.jsonKeyFilePath.0`
- For Azure: set `adls.storageAccountName.0`, `adls.tenantId.0`, `adls.clientId.0`, and `adls.clientSecret.0`
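
For illustration, an S3 entry in `server.properties` might look like the sketch below; the property keys come from the list above, but the bucket, region, role ARN, and credential values are placeholders to replace with your own resources (the GCP and Azure keys are filled in analogously):

```properties
# placeholder values for illustration only; substitute your own bucket, role, and credentials
s3.bucketPath.0=s3://my-test-bucket
s3.region.0=us-west-2
s3.awsRoleArn.0=arn:aws:iam::123456789012:role/my-uc-test-role
s3.accessKey.0=<access-key-id>
s3.secretKey.0=<secret-access-key>
```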
Next, run the UC server to test against:
```sh
# run from the integration-tests dir to use the testing configurations
cd integration-tests
../bin/start-uc-server
```
In a separate shell, ensure a catalog is created for testing:
```sh
bin/uc catalog create --name unity
```
Alternatively, if an existing Unity Catalog instance is already running externally, point the tests at it by setting:
```sh
export CATALOG_URI=https://<my-uc-instance>/
export CATALOG_AUTH_TOKEN=<my-access-token>
export CATALOG_NAME=<my-catalog-name>
```
By default, tests will run against the local filesystem. To run against cloud storage, set the following optional environment variables:
```sh
export S3_BASE_LOCATION=s3://<my-bucket>/<optional>/<path>/
export GS_BASE_LOCATION=gs://<my-bucket>/<optional>/<path>/
export ABFSS_BASE_LOCATION=abfss://<container>@<account_name>.dfs.core.windows.net/<optional>/<path>/
```
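
As a worked example, a run that exercises only S3 combines the export above with the test command from the last step; the bucket name and path here are placeholders:

```sh
# placeholder S3 location; point this at your own bucket and prefix
export S3_BASE_LOCATION=s3://my-test-bucket/uc-integration/
build/sbt integrationTests/test
```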
Finally, run the tests:
```sh
build/sbt integrationTests/test
```