fix(firestore-bigquery-export): rollback backfill feature #1845

Merged · 1 commit · Nov 27, 2023
4 changes: 4 additions & 0 deletions firestore-bigquery-export/CHANGELOG.md
```diff
@@ -1,3 +1,7 @@
+## Version 0.1.41
+
+fix - rollback backfill feature
+
 ## Version 0.1.40
 
 fix - correct default value for use collection group query param
```
58 changes: 1 addition & 57 deletions firestore-bigquery-export/extension.yaml
```diff
@@ -13,7 +13,7 @@
 # limitations under the License.
 
 name: firestore-bigquery-export
-version: 0.1.40
+version: 0.1.41
 specVersion: v1beta
 
 displayName: Stream Firestore to BigQuery
@@ -338,62 +338,6 @@ params:
         value: no
     default: no
     required: true
-
-  - param: DO_BACKFILL
-    label: Import existing Firestore documents into BigQuery?
-    description: >-
-      Do you want to import existing documents from your Firestore collection into BigQuery? These documents
-      will each have a special changelog with the operation of `IMPORT` and the timestamp of epoch.
-      This ensures that any operation on an imported document supersedes the import record.
-    type: select
-    required: true
-    options:
-      - label: Yes
-        value: yes
-      - label: No
-        value: no
-
-  - param: IMPORT_COLLECTION_PATH
-    label: Existing documents collection
-    description: >-
-      What is the path of the Cloud Firestore collection you would like to import from?
-      (This may or may not be the same collection for which you plan to mirror changes.)
-      If you want to use a collectionGroup query, provide the collection name value here,
-      and set 'Use Collection Group query' to true.
-    type: string
-    validationRegex: "^[^/]+(/[^/]+/[^/]+)*$"
-    validationErrorMessage: Firestore collection paths must be an odd number of segments separated by slashes, e.g. "path/to/collection".
-    example: posts
-    required: false
-
-  - param: USE_COLLECTION_GROUP_QUERY
-    label: Use Collection Group query
-    description: >-
-      Do you want to use a [collection group](https://firebase.google.com/docs/firestore/query-data/queries#collection-group-query) query for importing existing documents?
-      Warning: A collectionGroup query will target every collection in your Firestore project that matches the 'Existing documents collection' value.
-      For example, if you have 10,000 documents each with a sub-collection named `landmarks`, this will query every document in 10,000 landmarks collections.
-    type: select
-    default: no
-    options:
-      - label: Yes
-        value: yes
-      - label: No
-        value: no
-
-  - param: DOCS_PER_BACKFILL
-    label: Docs per backfill
-    description: >-
-      When importing existing documents, how many should be imported at once?
-      The default value of 200 should be fine for most users.
-      If you are using a transform function or have very large documents, you may need to set this to a lower number.
-      If the lifecycle event function times out, lower this value.
-    type: string
-    example: 200
-    validationRegex: "^[1-9][0-9]*$"
-    validationErrorMessage: Must be a positive integer.
-    default: 200
-    required: true
-
 
   - param: KMS_KEY_NAME
     label: Cloud KMS key name
```
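For readers tracing what was rolled back: together these parameters described a batched import in which existing documents are read `DOCS_PER_BACKFILL` at a time (optionally via a collectionGroup query) and recorded in the changelog as `IMPORT` operations timestamped at the Unix epoch, so any real event on the same document supersedes the import row. Below is a minimal sketch of that flow using the firebase-admin SDK; the function name, config shape, and row shape are illustrative assumptions, not the extension's actual implementation.

```ts
import * as admin from "firebase-admin";

admin.initializeApp();

// Illustrative sketch of the rolled-back backfill flow (hypothetical helper,
// not the extension's real code): page through the source collection
// docsPerBackfill documents at a time and build an IMPORT changelog row for
// each, timestamped at the Unix epoch so that any later CREATE/UPDATE/DELETE
// event on the same document supersedes the import record.
async function backfillSketch(config: {
  doBackfill: boolean;
  importCollectionPath: string;
  useCollectionGroupQuery: boolean;
  docsPerBackfill: number;
}): Promise<void> {
  if (!config.doBackfill) return; // after this PR, doBackfill is always false

  const db = admin.firestore();
  // A collectionGroup query matches *every* collection with this ID anywhere
  // in the project, which is exactly the warning in the removed description.
  const base = config.useCollectionGroupQuery
    ? db.collectionGroup(config.importCollectionPath)
    : db.collection(config.importCollectionPath);

  let last: admin.firestore.QueryDocumentSnapshot | undefined;
  for (;;) {
    let query = base.limit(config.docsPerBackfill);
    if (last) query = query.startAfter(last); // resume after the previous page
    const page = await query.get();
    if (page.empty) break;

    const rows = page.docs.map((doc) => ({
      timestamp: new Date(0).toISOString(), // epoch: any real event wins
      operation: "IMPORT",
      documentName: doc.ref.path,
      data: doc.data(),
    }));
    // The real extension streams rows like these into the BigQuery changelog
    // table; that plumbing is omitted here.
    console.log(`would import ${rows.length} documents`);

    last = page.docs[page.docs.length - 1];
  }
}
```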
2 changes: 1 addition & 1 deletion firestore-bigquery-export/functions/src/config.ts
```diff
@@ -35,7 +35,7 @@ export default {
   bqProjectId: process.env.BIGQUERY_PROJECT_ID,
   collectionPath: process.env.COLLECTION_PATH,
   datasetId: process.env.DATASET_ID,
-  doBackfill: process.env.DO_BACKFILL === "yes",
+  doBackfill: false,
   docsPerBackfill: parseInt(process.env.DOCS_PER_BACKFILL) || 200,
   tableId: process.env.TABLE_ID,
   location: process.env.LOCATION,
```
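With `doBackfill` hard-coded to `false`, setting `DO_BACKFILL=yes` in the environment no longer has any effect: any guard on the flag now short-circuits. A sketch of a hypothetical consumer (the handler name is illustrative, not the extension's actual function):

```ts
import config from "./config";

// Hypothetical guard (illustrative name): with doBackfill hard-coded to
// false, this always returns early, no matter what DO_BACKFILL is set to.
export async function runBackfillIfEnabled(): Promise<void> {
  if (!config.doBackfill) {
    return; // backfill rolled back in 0.1.41
  }
  // ...the batched import sketched above would run here...
}
```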