Support profile for AWS-based credentials for restic #2033
Description
What steps did you take and what happened:
It seems Velero uses the restic CLI under the hood for taking data backups/restores. This works fine with a single backup location, but it does not work when multiple (S3-based) storage locations with different credentials are present, i.e. multiple AWS profiles in the AWS_SHARED_CREDENTIALS_FILE that the restic pod mounts at /credentials/cloud.

I guess the root of the problem is that, while executing the restic command, Velero does not set AWS_PROFILE, which would help restic load the correct credentials.
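For illustration (the bucket and repository path below are placeholders, not values from this cluster): restic resolves S3 credentials through the usual AWS credential chain, so when the mounted file contains more than one profile, exporting AWS_PROFILE before the restic invocation is what would make it pick up the non-default credentials, roughly like this:

```sh
# Credentials file that Velero already mounts into the restic pod
export AWS_SHARED_CREDENTIALS_FILE=/credentials/cloud
# Select the non-default profile; without this, only [default] is used
export AWS_PROFILE=testc1

# Placeholder restic invocation against an S3 repository
restic -r s3:s3.amazonaws.com/my-bucket/restic/my-namespace snapshots
```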
What did you expect to happen:
Restore/backup with restic should work with multiple AWS credential profiles.
The output of the following commands will help us better understand what's going on:
(Pasting long output into a GitHub gist or other pastebin is fine.)
- kubectl logs deployment/velero -n velero (Output)
- velero restore describe r2 --details (Output)
- velero backup describe b1 (Output)
- velero backup logs b1 (outputs some junk)
- velero restore logs r2 (outputs some junk)
- velero restic repo get -ojson (Output)
Anything else you would like to add:
As you can see above, the restic repo is not reaching the Ready state because of the credentials problem (the repository phase is visible in the JSON output, as sketched below).
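For anyone reproducing this, the phase can be pulled out of that JSON quickly; a healthy ResticRepository reports Ready:

```sh
# Grep the phase field from the repository JSON (here it never reaches "Ready")
velero restic repo get -ojson | grep -i '"phase"'
```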
velero get backup-locations -ojson (Output)
Note that it has two backup locations, with the profile attribute set for testc1 (see the sketch below).
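For reference, a second S3 storage location pointing at a non-default profile is typically defined along these lines (the location name, bucket, and region here are hypothetical placeholders, not the exact values from this cluster):

```sh
# Hypothetical additional BackupStorageLocation selecting the testc1 AWS profile
velero backup-location create secondary \
  --provider aws \
  --bucket my-second-bucket \
  --config region=us-east-1,profile=testc1
```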
Here is the full credentials secret:
kubectl ksd -n velero get secrets cloud-credentials -ojson (Output)
Environment:
- Velero version (use velero version): v1.1.0
- Velero features (use velero client config get features): features: <NOT SET>
- Kubernetes version (use kubectl version): Client v1.12.1, Server v1.15.4-k3s.1
- Kubernetes installer & version:
- Cloud provider or hardware configuration: k3d/k3s
- OS (e.g. from /etc/os-release):