service-catalog/rust-assets-backup/README.md
6 additions & 38 deletions
@@ -80,53 +80,21 @@ Here we have one Google [Object Storage](https://cloud.google.com/storage?hl=en)
 For the objects:

-Set the [storage class](https://cloud.google.com/storage/docs/storage-classes) to "archive" for all buckets.
+The [storage class](https://cloud.google.com/storage/docs/storage-classes) is set to "archive" for all buckets.
 This is the cheapest class for infrequent access.
-Enable [object-versioning](https://cloud.google.com/storage/docs/object-versioning) and [soft-delete](https://cloud.google.com/storage/docs/soft-delete),
+[object-versioning](https://cloud.google.com/storage/docs/object-versioning) and [soft-delete](https://cloud.google.com/storage/docs/soft-delete) are enabled,
 so that we can recover updates and deletes.
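The bucket settings described above could be applied with `gcloud` roughly as follows. This is a sketch, not the project's actual provisioning: the bucket name and the soft-delete retention period are assumptions.

```shell
# Hypothetical bucket name; the real backup buckets are managed elsewhere.
BUCKET=gs://rust-crates-io-backup

# "Archive" is the cheapest storage class for rarely-read data.
gcloud storage buckets update "$BUCKET" --default-storage-class=ARCHIVE

# Keep noncurrent object versions so overwrites can be recovered.
gcloud storage buckets update "$BUCKET" --versioning

# Retain deleted data for a while (30 days is an assumed retention period).
gcloud storage buckets update "$BUCKET" --soft-delete-duration=30d
```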
 We use [Storage Transfer](https://cloud.google.com/storage-transfer/docs/overview) to automatically transfer the content of the S3 bucket into the Google Object Storage.
 This is a service managed by Google. We'll use it to download the S3 buckets from CloudFront to perform a daily incremental transfer. The transfers only move files that are new, updated, or deleted since the last transfer, minimizing the amount of data transferred.
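A daily incremental job of this kind could be created with the Storage Transfer CLI roughly as below. Bucket names, the credentials file path, and the exact flag values are assumptions, not the project's real configuration.

```shell
# Hypothetical source and destination buckets.
# AWS credentials for the source come from a creds file (path is illustrative).
gcloud transfer jobs create \
  s3://crates-io \
  gs://rust-crates-io-backup \
  --source-creds-file=aws-creds.json \
  --schedule-repeats-every=1d \
  --delete-from=destination-if-unique
```

With `--delete-from=destination-if-unique`, deletes in the source propagate to the destination, where soft-delete and versioning still preserve the data for recovery.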
-### Monitoring 🕵️
+## Explanations

-To check that the backups are working:
+- [FAQ](./faq.md)

-- Ensure the number of files and the size of the GCP buckets is the same as the respective AWS buckets by looking at the metrics
-- Ensure that only authorized people have access to the account

+## How-to Guides
-You can also run the following test:
-
-- Upload a file in an AWS S3 bucket and check that it appears in GCP.
-- Edit the file in AWS and check that you can recover the previous version from GCP.
-- Delete the file in AWS and check that you can recover all previous versions from GCP.
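The first consistency check above (matching file counts and total sizes between the AWS and GCP buckets) can be sketched in Python. The listings here are hypothetical stand-ins for what the cloud metrics or a real bucket listing would return:

```python
def listings_match(aws_objects: dict[str, int], gcp_objects: dict[str, int]) -> bool:
    """Compare two bucket listings (object name -> size in bytes).

    The backup is considered healthy when both buckets hold the same
    number of objects and the same total size.
    """
    same_count = len(aws_objects) == len(gcp_objects)
    same_size = sum(aws_objects.values()) == sum(gcp_objects.values())
    return same_count and same_size


# Hypothetical listings.
aws = {"crates/foo-1.0.0.crate": 1024, "crates/bar-0.2.0.crate": 2048}
gcp = dict(aws)  # an up-to-date backup mirrors the source exactly

assert listings_match(aws, gcp)
assert not listings_match(aws, {})
```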
-In the future, we might want to create alerts in:
-
-_Datadog_: to monitor if the transfer job fails.
-_Wiz_: to monitor if the access control changes.
-### Backup maintenance 🧹
-
-If a crate version is deleted from the crates-io bucket (e.g. for GDPR reasons), an admin needs to delete it from the GCP bucket as well.
-Even though the delete will propagate to GCP, the `soft-delete` feature will preserve the data, so we need to delete it manually.
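Purging a crate version from the backup bucket could look roughly like this with `gsutil`; the bucket and object path are hypothetical. Note that copies still inside the soft-delete retention window are retained by GCS until that window expires.

```shell
# Remove the live object and all noncurrent versions (-a).
# Bucket name and path are illustrative only.
gsutil rm -a gs://rust-crates-io-backup/crates/foo/foo-1.0.0.crate
```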
-### FAQ 🤔
-
-#### Do we need a multi-region backup for the object storage?
-
-No. [Multi-region](https://cloud.google.com/storage/docs/availability-durability#cross-region-redundancy) only helps if we want to serve this data in real time and want a fallback mechanism if a GCP region fails. We just need this object storage for backup purposes, so we don't need to pay more 👍
-
-#### Why did you choose the `europe-west1` GCP region?
-
-It's far from the `us-west-1` region where the AWS S3 buckets are located. This protects us from geographical disasters.
-The con is that the latency of the transfer job is higher when compared to a region in the US.
-Also, the cost calculator indicates that this region has "Low CO2" and it's among the cheapest regions.
-
-#### Why GCP?
-
-Both the Rust Foundation and the Rust project have a good working relationship with Google, and it is where the Rust Foundation's Security Initiative hosts its infrastructure.
-Due to the good collaboration with Google, we expect that we can cover the costs of the backup with credits provided by Google.
service-catalog/rust-assets-backup/faq.md

+## Do we need a multi-region backup for the object storage?
+
+No. [Multi-region](https://cloud.google.com/storage/docs/availability-durability#cross-region-redundancy) only helps if we want to serve this data in real time and want a fallback mechanism if a GCP region fails. We just need this object storage for backup purposes, so we don't need to pay more 👍
+
+## Why did you choose the `europe-west1` GCP region?
+
+It's far from the `us-west-1` region where the AWS S3 buckets are located. This protects us from geographical disasters.
+The con is that the latency of the transfer job is higher when compared to a region in the US.
+Also, the cost calculator indicates that this region has "Low CO2" and it's among the cheapest regions.
+
+## Why GCP?
+
+Both the Rust Foundation and the Rust project have a good working relationship with Google, and it is where the Rust Foundation's Security Initiative hosts its infrastructure.
+Due to the good collaboration with Google, we expect that we can cover the costs of the backup with credits provided by Google.