diff --git a/sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md b/sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md
index 149e8fe356de2..94ab1886315d4 100644
--- a/sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md
+++ b/sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md
@@ -14,9 +14,9 @@ https://github.com/Azure/azure-sdk-for-java/issues/new
 
 ## Documentation
 
-- [Getting started](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/quick-start.md)
-- [Catalog API](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/catalog-api.md)
-- [Configuration Parameter Reference](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md)
+- [Getting started](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/quick-start.md)
+- [Catalog API](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/catalog-api.md)
+- [Configuration Parameter Reference](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md)
 
 [//]: # (//TODO: moderakh add more sections)
 [//]: # (//TODO: moderakh Enable Client Logging)
@@ -52,7 +52,7 @@ To suggest a new feature or changes that could be made, file an issue the same w
 ## License
 This project is under MIT license and uses and repackages other third party libraries as an uber jar.
-See [NOTICE.txt](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/NOTICE.txt).
+See [NOTICE.txt](https://github.com/Azure/azure-sdk-for-java/blob/master/NOTICE.txt).
 
 ## Contributing
diff --git a/sdk/cosmos/azure-cosmos-spark_3-1_2-12/dev/README.md b/sdk/cosmos/azure-cosmos-spark_3-1_2-12/dev/README.md
index 59e61864c9b80..c0c5bd6ad576e 100644
--- a/sdk/cosmos/azure-cosmos-spark_3-1_2-12/dev/README.md
+++ b/sdk/cosmos/azure-cosmos-spark_3-1_2-12/dev/README.md
@@ -63,7 +63,7 @@ Use the partner release pipeline to release.
 For each release we need to go over the OSS compliance steps:
 1) Ensure the branch is tracked by Component Governance: https://dev.azure.com/azure-sdk/internal/_componentGovernance/106501?_a=components&typeId=6129920&alerts-view-option=active
-   master branch is automatically tracked. For the feature/cosmos/spark30 branch you need to manually trigger the run.
+   master branch is automatically tracked. For any feature branch you need to manually trigger the run.
 2) To add OSS components (or 3rd party source code) missed by CG, create a cgmanifest.json file.
    This typically happens if you manually copy code without adding maven dependency.
    howto https://docs.opensource.microsoft.com/tools/cg/cgmanifest.html.
 3) Check for "legal alerts" and "security alerts". If you do not see a "legal alert"
diff --git a/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/quick-start.md b/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/quick-start.md
index 4a7870da78328..55dd0045e3ae3 100644
--- a/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/quick-start.md
+++ b/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/quick-start.md
@@ -15,7 +15,7 @@ You can use any other Spark 3.1.1 spark offering as well, also you should be able
 - An active Azure account. If you don't have one, you can sign up for a
   [free account](https://azure.microsoft.com/try/cosmosdb/). Alternatively, you can use the
-  [use Azure Cosmos DB Emulator](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/local-emulator.md) for development and testing.
+  [Azure Cosmos DB Emulator](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/local-emulator.md) for development and testing.
 
 - [Azure Databricks](https://docs.microsoft.com/azure/databricks/release-notes/runtime/8.0) Runtime 8.0 with Spark 3.1.1.
 - (Optional) [SLF4J binding](https://www.slf4j.org/manual.html) is used to associate a
@@ -51,7 +51,7 @@ cfg = {
 }
 ```
 
-see [General Configuration](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#Generic-Configuration) for more detail.
+See [General Configuration](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#Generic-Configuration) for more detail.
 
 You can use the new Catalog API to create a Cosmos DB Database and Container through Spark.
 Configure Catalog Api to be used
@@ -75,7 +75,7 @@ spark.sql("CREATE TABLE IF NOT EXISTS cosmosCatalog.{}.{} using cosmos.items TBL
 ```
 
 Cosmos Catalog API for creating container supports setting throughput and partition-key-path for the container to be created.
-see [Catalog API](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/catalog-api.md) for more detail.
+See [Catalog API](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/catalog-api.md) for more detail.
 
 ### Ingest Data to Cosmos DB
 
@@ -92,7 +92,7 @@ spark.createDataFrame((("cat-alive", "Schrodinger cat", 2, True), ("cat-dead", "
 ```
 
 Note that `id` is a mandatory field for Cosmos DB.
-see [Write Configuration](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#write-config) for more detail.
+See [Write Configuration](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#write-config) for more detail.
 
 ### Query Cosmos DB
 
@@ -109,12 +109,12 @@ df.filter(col("isAlive") == True)\
  .show()
 ```
 
-see [Query Configuration](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#query-config) for more detail.
+See [Query Configuration](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#query-config) for more detail.
 
 Note when running queries unless if are interested to get back the raw json payload
 we recommend setting `spark.cosmos.read.inferSchemaEnabled` to be `true`.
 
-see [Schema Inference Configuration](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#schema-inference-config) for more detail.
+See [Schema Inference Configuration](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#schema-inference-config) for more detail.
 
 ### See the Schema of Data Ingested in Cosmos DB Container
 
@@ -128,5 +128,5 @@ df = spark.read.format("cosmos.items").options(**cfg)\
 df.printSchema()
 ```
 
-see [Schema Inference Configuration](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#schema-inference-config) for more detail.
+See [Schema Inference Configuration](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#schema-inference-config) for more detail.
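
For reviewers, the quick-start hunks above only show the generic connector configuration collapsed to `cfg = {` … `}`. A minimal sketch of that dictionary, using the `spark.cosmos.*` option keys from the connector's configuration reference — all values below are hypothetical placeholders, not real account details:

```python
# Sketch of the generic Cosmos DB Spark connector configuration referenced
# throughout the quick-start. Endpoint, key, database, and container values
# are placeholders -- substitute your own account details.
cosmosEndpoint = "https://REPLACEME.documents.azure.com:443/"
cosmosMasterKey = "REPLACEME"
cosmosDatabaseName = "sampleDB"
cosmosContainerName = "sampleContainer"

cfg = {
    "spark.cosmos.accountEndpoint": cosmosEndpoint,
    "spark.cosmos.accountKey": cosmosMasterKey,
    "spark.cosmos.database": cosmosDatabaseName,
    "spark.cosmos.container": cosmosContainerName,
}
```

The same dictionary is then unpacked into each reader and writer via `.options(**cfg)`, as in the `spark.read.format("cosmos.items").options(**cfg)` lines visible in the query and schema hunks.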