
Commit: MicroBatchWriterFactory

jaceklaskowski committed Nov 19, 2022
1 parent 914ee37 commit df98b4d
Showing 4 changed files with 29 additions and 4 deletions.
2 changes: 1 addition & 1 deletion docs/StreamingDataWriterFactory.md
@@ -1,6 +1,6 @@
# StreamingDataWriterFactory

-`StreamingDataWriterFactory` is an [abstraction](#contract) of [factories](#implementations) that [create a DataWriter](#createWriter).
+`StreamingDataWriterFactory` is an [abstraction](#contract) of [factories](#implementations) to [create a DataWriter](#createWriter).

## Contract

2 changes: 1 addition & 1 deletion docs/micro-batch-execution/MicroBatchWrite.md
@@ -31,7 +31,7 @@ commit(

`commit` requests the [StreamingWrite](#writeSupport) to [commit](../StreamingWrite.md#commit).
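The delegation above can be sketched with simplified stand-in types (illustrative only, not Spark's actual `StreamingWrite` / `WriterCommitMessage` interfaces): `MicroBatchWrite` captures the epoch ID at construction time and forwards it on `commit`.

```scala
// Minimal self-contained sketch of MicroBatchWrite.commit delegating to the
// underlying StreamingWrite. The types below are simplified stand-ins for
// Spark's actual interfaces.
trait WriterCommitMessage

trait StreamingWrite {
  def commit(epochId: Long, messages: Array[WriterCommitMessage]): Unit
}

class MicroBatchWrite(epochId: Long, writeSupport: StreamingWrite) {
  // Batch-style commit: forwards the captured epoch ID to the streaming write
  def commit(messages: Array[WriterCommitMessage]): Unit =
    writeSupport.commit(epochId, messages)
}
```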

-## <span id="createBatchWriterFactory"> Creating Batch DataWriterFactory
+## <span id="createBatchWriterFactory"> Creating DataWriterFactory for Batch Write

```scala
createBatchWriterFactory(
  ...
```
27 changes: 26 additions & 1 deletion docs/micro-batch-execution/MicroBatchWriterFactory.md
@@ -1,3 +1,28 @@
# MicroBatchWriterFactory

-`MicroBatchWriterFactory` is...FIXME
+`MicroBatchWriterFactory` is a `DataWriterFactory` ([Spark SQL]({{ book.spark_sql }}/connector/DataWriterFactory)).

## Creating Instance

`MicroBatchWriterFactory` takes the following to be created:

* <span id="epochId"> Epoch ID
* <span id="streamingWriterFactory"> [StreamingDataWriterFactory](../StreamingDataWriterFactory.md)

`MicroBatchWriterFactory` is created when:

* `MicroBatchWrite` is requested to [create a DataWriterFactory for batch write](MicroBatchWrite.md#createBatchWriterFactory)

## <span id="createWriter"> Creating DataWriter

```scala
createWriter(
partitionId: Int,
taskId: Long): DataWriter[InternalRow]
```

`createWriter` is part of the `DataWriterFactory` ([Spark SQL]({{ book.spark_sql }}/connector/DataWriterFactory#createWriter)) abstraction.

---

`createWriter` requests the [StreamingDataWriterFactory](#streamingWriterFactory) for a [DataWriter](../StreamingDataWriterFactory.md#createWriter).
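Taken together, the creation and `createWriter` behavior can be sketched as follows, using simplified stand-ins for the Spark SQL interfaces (illustrative only, not Spark's actual `DataWriterFactory` / `StreamingDataWriterFactory` contracts): the factory captures the epoch ID at construction time and supplies it on every `createWriter` call, adapting the streaming contract to the batch one.

```scala
// Minimal self-contained sketch of the delegation pattern described above.
// The traits below are simplified stand-ins, not Spark's actual interfaces.
trait DataWriter[T] {
  def write(record: T): Unit
}

// Stand-in for Spark SQL's DataWriterFactory (batch write contract)
trait DataWriterFactory[T] {
  def createWriter(partitionId: Int, taskId: Long): DataWriter[T]
}

// Stand-in for StreamingDataWriterFactory (streaming, epoch-aware contract)
trait StreamingDataWriterFactory[T] {
  def createWriter(partitionId: Int, taskId: Long, epochId: Long): DataWriter[T]
}

// MicroBatchWriterFactory adapts the streaming factory to the batch contract
// by capturing the epoch ID when the factory is created.
class MicroBatchWriterFactory[T](
    epochId: Long,
    streamingWriterFactory: StreamingDataWriterFactory[T])
  extends DataWriterFactory[T] {

  override def createWriter(partitionId: Int, taskId: Long): DataWriter[T] =
    streamingWriterFactory.createWriter(partitionId, taskId, epochId)
}
```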
2 changes: 1 addition & 1 deletion mkdocs.yml
@@ -189,6 +189,7 @@ nav:
 - ProgressReporter: ProgressReporter.md
 - SQLConf: SQLConf.md
 - StreamExecution: StreamExecution.md
+- StreamingDataWriterFactory: StreamingDataWriterFactory.md
 - StreamingQueryListenerBus: StreamingQueryListenerBus.md
 - StreamMetadata: StreamMetadata.md
 - TriggerExecutors:
@@ -197,7 +198,6 @@
 - Misc:
 - AcceptsLatestSeenOffsetHandler: AcceptsLatestSeenOffsetHandler.md
 - AvailableNowDataStreamWrapper: AvailableNowDataStreamWrapper.md
-- StreamingDataWriterFactory: StreamingDataWriterFactory.md
 - StreamingQueryWrapper: StreamingQueryWrapper.md
 - UnsupportedOperationChecker: UnsupportedOperationChecker.md
 - Features:
