Commit

updated links to point to master branch (#20211)
spark updated links to point to master branch instead of feature/cosmos/spark30
moderakh authored Mar 29, 2021
1 parent 852122a commit 767628a
Showing 3 changed files with 12 additions and 12 deletions.
8 changes: 4 additions & 4 deletions sdk/cosmos/azure-cosmos-spark_3-1_2-12/README.md
Original file line number Diff line number Diff line change
Expand Up @@ -14,9 +14,9 @@ https://github.com/Azure/azure-sdk-for-java/issues/new

## Documentation

- [Getting started](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/quick-start.md)
- [Catalog API](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/catalog-api.md)
- [Configuration Parameter Reference](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md)
- [Getting started](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/quick-start.md)
- [Catalog API](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/catalog-api.md)
- [Configuration Parameter Reference](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md)

[//]: # (//TODO: moderakh add more sections)
[//]: # (//TODO: moderakh Enable Client Logging)
Expand Down Expand Up @@ -52,7 +52,7 @@ To suggest a new feature or changes that could be made, file an issue the same w

## License
This project is under the MIT license and uses and repackages other third-party libraries as an uber jar.
See [NOTICE.txt](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/NOTICE.txt).
See [NOTICE.txt](https://github.com/Azure/azure-sdk-for-java/blob/master/NOTICE.txt).

## Contributing

Expand Down
2 changes: 1 addition & 1 deletion sdk/cosmos/azure-cosmos-spark_3-1_2-12/dev/README.md
Original file line number Diff line number Diff line change
Expand Up @@ -63,7 +63,7 @@ Use the partner release pipeline to release.
For each release we need to go over the OSS compliance steps:

1) Ensure the branch is tracked by Component Governance: https://dev.azure.com/azure-sdk/internal/_componentGovernance/106501?_a=components&typeId=6129920&alerts-view-option=active
master branch is automatically tracked. For the feature/cosmos/spark30 branch you need to manually trigger the run.
The master branch is automatically tracked; for other branches you need to manually trigger the run.
2) To add OSS components (or third-party source code) missed by CG, create a cgmanifest.json file. This is typically needed if you manually copy code
without adding a Maven dependency. See the how-to: https://docs.opensource.microsoft.com/tools/cg/cgmanifest.html.
3) Check for "legal alerts" and "security alerts". If you do not see a "legal alert"
Expand Down
14 changes: 7 additions & 7 deletions sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/quick-start.md
Original file line number Diff line number Diff line change
Expand Up @@ -15,7 +15,7 @@ You can use any other Spark 3.1.1 spark offering as well, also you should be abl
- An active Azure account. If you don't have one, you can sign up for a
[free account](https://azure.microsoft.com/try/cosmosdb/).
Alternatively, you can use the
[use Azure Cosmos DB Emulator](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/local-emulator.md) for development and testing.
[Azure Cosmos DB Emulator](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/local-emulator.md) for development and testing.
- [Azure Databricks](https://docs.microsoft.com/azure/databricks/release-notes/runtime/8.0)
Runtime 8.0 with Spark 3.1.1.
- (Optional) [SLF4J binding](https://www.slf4j.org/manual.html) is used to associate a
Expand Down Expand Up @@ -51,7 +51,7 @@ cfg = {
}
```

see [General Configuration](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#Generic-Configuration) for more detail.
see [General Configuration](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#Generic-Configuration) for more detail.
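The truncated `cfg` snippet above is a plain Python dictionary of connector options. A minimal sketch follows, with placeholder endpoint and key values; the option names (`spark.cosmos.accountEndpoint`, `spark.cosmos.accountKey`, `spark.cosmos.database`, `spark.cosmos.container`) are taken from the configuration reference:

```python
# Minimal connector configuration; endpoint and key values are placeholders.
cfg = {
    "spark.cosmos.accountEndpoint": "https://REPLACEME.documents.azure.com:443/",
    "spark.cosmos.accountKey": "REPLACEME",
    "spark.cosmos.database": "SampleDB",
    "spark.cosmos.container": "SampleContainer",
}
# The same dictionary is passed to readers and writers via .options(**cfg).
```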

You can use the new Catalog API to create a Cosmos DB database and container through Spark.
Configure the Catalog API to be used:
Expand All @@ -75,7 +75,7 @@ spark.sql("CREATE TABLE IF NOT EXISTS cosmosCatalog.{}.{} using cosmos.items TBL
```
Cosmos Catalog API for creating container supports setting throughput and partition-key-path for the container to be created.

see [Catalog API](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/catalog-api.md) for more detail.
see [Catalog API](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/catalog-api.md) for more detail.
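As a sketch, the `CREATE TABLE` statement can carry the container's partition key path and throughput in `TBLPROPERTIES`. The property names below (`partitionKeyPath`, `manualThroughput`) are assumptions based on the catalog documentation, and the statement is only built as a string here rather than executed against a live Spark session:

```python
database, container = "SampleDB", "SampleContainer"  # hypothetical names

# Build the Catalog API DDL; partitionKeyPath/manualThroughput are assumed property names.
create_table_sql = (
    f"CREATE TABLE IF NOT EXISTS cosmosCatalog.{database}.{container} "
    "using cosmos.items "
    "TBLPROPERTIES(partitionKeyPath = '/id', manualThroughput = '400')"
)
# In a notebook this would be executed with: spark.sql(create_table_sql)
```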

### Ingest Data to Cosmos DB

Expand All @@ -92,7 +92,7 @@ spark.createDataFrame((("cat-alive", "Schrodinger cat", 2, True), ("cat-dead", "
```
Note that `id` is a mandatory field for Cosmos DB.

see [Write Configuration](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#write-config) for more detail.
see [Write Configuration](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#write-config) for more detail.
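Because `id` is mandatory, every row written to Cosmos DB must carry one. A minimal sketch of shaping records before a write, in plain Python with no Spark session required (the field names are illustrative):

```python
import uuid

def with_id(record):
    """Return a copy of the record, generating an `id` if it is missing."""
    out = dict(record)
    out.setdefault("id", str(uuid.uuid4()))
    return out

rows = [
    {"id": "cat-alive", "name": "Schrodinger cat", "age": 2, "isAlive": True},
    {"name": "unnamed cat", "age": 2, "isAlive": False},  # id will be generated
]
prepared = [with_id(r) for r in rows]
```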


### Query Cosmos DB
Expand All @@ -109,12 +109,12 @@ df.filter(col("isAlive") == True)\
.show()
```

see [Query Configuration](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#query-config) for more detail.
see [Query Configuration](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#query-config) for more detail.

Note that when running queries, unless you are interested in getting back the raw JSON payload,
we recommend setting `spark.cosmos.read.inferSchemaEnabled` to `true`.

see [Schema Inference Configuration](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#schema-inference-config) for more detail.
see [Schema Inference Configuration](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#schema-inference-config) for more detail.
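A sketch of enabling schema inference on a read, assuming a base `cfg` dictionary of connector options (endpoint and key values are placeholders); only the dictionary merge is shown, since the actual `spark.read` call needs a live session:

```python
cfg = {
    "spark.cosmos.accountEndpoint": "https://REPLACEME.documents.azure.com:443/",
    "spark.cosmos.accountKey": "REPLACEME",
    "spark.cosmos.database": "SampleDB",
    "spark.cosmos.container": "SampleContainer",
}

# Overlay the read-time option without mutating the base config.
read_cfg = {**cfg, "spark.cosmos.read.inferSchemaEnabled": "true"}
# In a notebook: df = spark.read.format("cosmos.items").options(**read_cfg).load()
```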


### See the Schema of Data Ingested in Cosmos DB Container
Expand All @@ -128,5 +128,5 @@ df = spark.read.format("cosmos.items").options(**cfg)\
df.printSchema()
```

see [Schema Inference Configuration](https://github.com/Azure/azure-sdk-for-java/blob/feature/cosmos/spark30/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#schema-inference-config) for more detail.
see [Schema Inference Configuration](https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/cosmos/azure-cosmos-spark_3-1_2-12/docs/configuration-reference.md#schema-inference-config) for more detail.
