Accelerate large file downloads using concurrent chunks for lakectl do… #2
esti.yaml
on: push
Job | Duration
---|---
Check if secrets are available. | 4s
Generate code from latest lakeFS app | 5m 18s
Test lakeFS rclone export functionality | 0s
Test lakeFS Hadoop FileSystem | 0s
Test lakeFS multipart upload with Hadoop S3A | 0s
Test metastore client commands using trino | 0s
Run latest lakeFS app on AWS S3 DynamoDB KV | 0s
Run latest lakeFS app on Google Cloud Platform and Google Cloud Storage | 0s
Run latest lakeFS app on Azure with Azure blobstore | 0s
Run latest lakeFS app on Azure with Azure Data Lake Storage Gen2 and CosmosDB | 0s
Test lakeFS against the python wrapper client | 0s
E2E - DynamoDB Local - Local Block Adapter | 0s
Quickstart | 0s
Run latest lakeFS app on AWS S3 + Basic Auth | 0s
Matrix: Run latest lakeFS app on AWS S3 |
Matrix: spark |
Test lakeFS metadata client export with Spark 3.x | 0s
Test unified gc | 0s
Annotations: 1 warning

Build lakeFS HadoopFS
Error: Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved.
Artifacts
Produced during runtime

Name | Size | Digest
---|---|---
generated-code | 27.5 MB | sha256:0dba395bf862a4258dbd3c0ae19b297157d83eccd34cd301842fa134e4c31afc
lakefs-hadoopfs | 5.68 MB | sha256:5f6dc951aa1742fe18a116139717fea711445c643a39501b12af76790ba79ef6
spark-apps | 313 KB | sha256:5956f0ff201c4bcd168835e2ddf9d6f1797a0f3c9c1dbdabe32cb3005ae30fc2