
App of apps being overwritten by image-updater #896

Closed
mehdicopter opened this issue Oct 24, 2024 · 42 comments · Fixed by #918
Labels
blocker bug Something isn't working

Comments

@mehdicopter

mehdicopter commented Oct 24, 2024

Screenshot 2024-10-24 at 19 28 26
Screenshot 2024-10-24 at 19 28 48

Describe the bug

I am using ArgoCD with the “App of Apps” pattern. After updating argo-cd-image-updater to version 0.15.0, I encountered an unexpected side effect.

When updating the image of a child application, ArgoCD also updates the parent application (“App of Apps”). This causes a resource conflict, as both the child application (“myapp”) and the parent application (“root”) end up managing the same resources.
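
For reference, a minimal sketch of what the parent app looks like in this pattern (the repo URL and path here are placeholders, not my actual setup):

# Parent "root" Application; its source path contains the child Application manifests.
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: root
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://example.com/org/argocd.git  # placeholder
    path: apps                                   # directory of child Application YAMLs
  destination:
    server: https://kubernetes.default.svc
    namespace: argocd
  syncPolicy:
    automated:
      prune: true
      selfHeal: true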

To Reproduce

  1. Use ArgoCD with the “App of Apps” pattern.
  2. Update argo-cd-image-updater to version 0.15.0.
  3. Perform an image update for a child application.
  4. Observe that the parent application also attempts to manage the same resources.

Expected behavior

Only the child application (“myapp”) should be updated when its image is changed, without the parent application (“root”) taking control over the same resources.

Additional context

This issue was not present with the previous version of argo-cd-image-updater (0.14.x).

Version

{
    "Version": "v2.12.6+4dab5bd",
    "BuildDate": "2024-10-18T17:39:26Z",
    "GitCommit": "4dab5bd6a60adea12e084ad23519e35b710060a2",
    "GitTreeState": "clean",
    "GoVersion": "go1.22.4",
    "Compiler": "gc",
    "Platform": "linux/amd64",
    "KustomizeVersion": "v5.4.2 2024-05-22T15:19:38Z",
    "HelmVersion": "v3.15.2+g1a500d5",
    "KubectlVersion": "v0.29.6",
    "JsonnetVersion": "v0.20.0"
}
@mehdicopter mehdicopter added the bug Something isn't working label Oct 24, 2024
@chengfang
Collaborator

It's not obvious which commits may cause this regression in v0.15.0. #854 looks a bit suspicious.

Is it possible to filter out the parent app via command line options --match-application-label --match-application-name?

@mehdicopter
Author

I am trying to use those filters but I am getting this error:

klf argocd-image-updater-5bdb94f977-56hcv
Error: unknown flag: --match-application-label app.company.com/name

What am I doing wrong? 😬

@mehdicopter
Author

mehdicopter commented Oct 24, 2024

I am using kustomize to update the deployment of argocd-image-updater

apiVersion: apps/v1
kind: Deployment
metadata:
  name: argocd-image-updater
spec:
  selector:
    matchLabels:
      app.kubernetes.io/name: argocd-image-updater
  template:
    spec:
      volumes:
        - name: scripts
          configMap:
            name: argocd-image-updater-scripts
            defaultMode: 0777
      containers:
        - name: argocd-image-updater
          args:
            - run
            - --match-application-label app.company.com/name=myapp
          volumeMounts:
            - name: scripts
              mountPath: /scripts

@jannfis
Contributor

jannfis commented Oct 24, 2024

In your snippet above, the command line switch and its parameter are being passed as a single argument.

To fix it, you can either use

          args:
            - run
            - --match-application-label=app.company.com/name=myapp

(note the equal sign between the parameter and the value) or

          args:
            - run
            - --match-application-label
            - app.company.com/name=myapp

@mehdicopter
Author

Even with the matching label it does the same... look at the screenshot.
Screenshot 2024-10-25 at 00 43 33

@mehdicopter
Author

The root app-of-apps is behaving as if it were the child app itself, resulting in two Argo apps responsible for the same resources, which causes a SharedResourceWarning.
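
For anyone who wants to check their own cluster: the warning is recorded as a condition on the Application's status, so something like this (the app name is a placeholder) should surface it:

kubectl get application myapp -n argocd \
  -o jsonpath='{range .status.conditions[?(@.type=="SharedResourceWarning")]}{.message}{"\n"}{end}'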

@mehdicopter
Author

mehdicopter commented Oct 24, 2024

According to the logs, it does not even update the root application.

time="2024-10-24T22:54:12Z" level=debug msg="Applications listed: 12"
time="2024-10-24T22:54:12Z" level=info msg="Starting image update cycle, considering 1 annotated application(s) for update"
time="2024-10-24T22:54:12Z" level=debug msg="Processing application argocd/myapp-staging"
time="2024-10-24T22:54:12Z" level=debug msg="Considering this image for update" alias=adserver application=myapp-staging image_name=xxxx/oci-xxx/adserver image_tag="sha256:1a15f767519b1ce8d73130cc7b6e6a8787c12482f3a61dbe4c6d7bbc5d5d6c27" registry=europe-west1-docker.pkg.dev
time="2024-10-24T22:54:12Z" level=debug msg="Using version constraint 'staging' when looking for a new tag" alias=adserver application=myapp-staging image_name=xxxx/oci-xxx/adserver image_tag="sha256:1a15f767519b1ce8d73130cc7b6e6a8787c12482f3a61dbe4c6d7bbc5d5d6c27" registry=europe-west1-docker.pkg.dev
time="2024-10-24T22:54:13Z" level=debug msg="found 1 from 1 tags eligible for consideration" image="europe-west1-docker.pkg.dev/xxxx/oci-xxx/adserver@sha256:1a15f767519b1ce8d73130cc7b6e6a8787c12482f3a61dbe4c6d7bbc5d5d6c27"
time="2024-10-24T22:54:13Z" level=info msg="Setting new image to europe-west1-docker.pkg.dev/xxxx/oci-xxx/adserver:staging@sha256:63f7bebb86c43d8a1a71a8394f6f576731acf08a97dd0279a830d4bba8406c36" alias=adserver application=myapp-staging image_name=xxxx/oci-xxx/adserver image_tag="sha256:1a15f767519b1ce8d73130cc7b6e6a8787c12482f3a61dbe4c6d7bbc5d5d6c27" registry=europe-west1-docker.pkg.dev
time="2024-10-24T22:54:13Z" level=info msg="Successfully updated image 'europe-west1-docker.pkg.dev/xxxx/oci-xxx/adserver@sha256:1a15f767519b1ce8d73130cc7b6e6a8787c12482f3a61dbe4c6d7bbc5d5d6c27' to 'europe-west1-docker.pkg.dev/xxxx/oci-xxx/adserver:staging@sha256:63f7bebb86c43d8a1a71a8394f6f576731acf08a97dd0279a830d4bba8406c36', but pending spec update (dry run=false)" alias=adserver application=myapp-staging image_name=xxxx/oci-xxx/adserver image_tag="sha256:1a15f767519b1ce8d73130cc7b6e6a8787c12482f3a61dbe4c6d7bbc5d5d6c27" registry=europe-west1-docker.pkg.dev
time="2024-10-24T22:54:13Z" level=debug msg="Using commit message: "
time="2024-10-24T22:54:13Z" level=info msg="Committing 1 parameter update(s) for application myapp-staging" application=myapp-staging
time="2024-10-24T22:54:13Z" level=debug msg="Getting application myapp-staging across all namespaces"
time="2024-10-24T22:54:13Z" level=debug msg="Applications listed: 39"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: argo-cd in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: autopilot-bootstrap in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: blackbox-exporter in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: cluster-resources-in-cluster in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: xxxx-ui-back-dev in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: xxxx-ui-front-dev in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-dns-xxxx-dev in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-dns-in-cluster in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-dns-xxx-preprod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-dns-xxx-prod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-dns-xxx-staging in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-dns-xxxx-prod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-dns-xxxx-stng in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-dns-xxxx-test in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-secrets-xxxx-dev in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-secrets-gitlab-runner in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-secrets-in-cluster in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-secrets-xxx-preprod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-secrets-xxx-prod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-secrets-xxx-staging in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-secrets-xxxx-prod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-secrets-xxxx-stng in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: external-secrets-xxxx-test in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: gitlab-runner in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: grafana in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: prometheus in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: myapp-preprod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: myapp-prod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: myapp-staging in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Application myapp-staging matches the pattern"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: xxx-backend-preprod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: xxx-backend-prod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: xxx-backend-staging in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: xxx-frontend-preprod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: xxx-frontend-prod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: xxx-frontend-staging in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: xxx-fusionauth-preprod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: xxx-fusionauth-prod in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: xxx-fusionauth-staging in namespace argocd"
time="2024-10-24T22:54:13Z" level=debug msg="Found application: root in namespace argocd"
time="2024-10-24T22:54:13Z" level=info msg="Successfully updated the live application spec" application=myapp-staging
time="2024-10-24T22:54:13Z" level=info msg="Processing results: applications=1 images_considered=1 images_skipped=0 images_updated=1 errors=0"

@mehdicopter
Author

Do the logs help you, @chengfang?

@jannfis
Contributor

jannfis commented Oct 28, 2024

What are the SharedResourceWarning's details?

@mehdicopter
Author

What are the SharedResourceWarning's details?

time="2024-10-24T02:59:51Z" level=info msg="Normalized app spec: {\"status\":{\"conditions\":[{\"lastTransitionTime\":\"2024-10-24T02:59:50Z\",\"message\":\"ConfigMap/datawiz-ui-front is part of applications argocd/datawiz-ui-front-dev and root\",\"type\":\"SharedResourceWarning\"},{\"lastTransitionTime\":\"2024-10-24T02:59:50Z\",\"message\":\"Deployment/datawiz-ui-front is part of applications argocd/datawiz-ui-front-dev and root\",\"type\":\"SharedResourceWarning\"},{\"lastTransitionTime\":\"2024-10-24T02:59:50Z\",\"message\":\"Service/datawiz-ui-front is part of applications argocd/datawiz-ui-front-dev and root\",\"type\":\"SharedResourceWarning\"}]}}" app-namespace=argocd app-qualified-name=argocd/datawiz-ui-front-dev application=datawiz-ui-front-dev project=datawiz

@jannfis
Contributor

jannfis commented Oct 28, 2024

At this point, I highly doubt that this has to do with the Image Updater. Image Updater itself doesn't manage or manipulate resources such as ConfigMaps, or other types.

Are you using a mono repo for all your apps, including the root app by any chance?

@mehdicopter
Author

At this point, I highly doubt that this has to do with the Image Updater. Image Updater itself doesn't manage or manipulate resources such as ConfigMaps, or other types.

Are you using a mono repo for all your apps, including the root app by any chance?

OK, I understand. But then why does it work in 0.14.0?
I am using the same cluster and the same repo, which contains all the ArgoCD apps.

@jannfis
Contributor

jannfis commented Oct 28, 2024

Can you post your root application's spec here?

@mehdicopter
Author

Can you post your root application's spec here?

apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  finalizers:
    - resources-finalizer.argocd.argoproj.io
  labels:
    app.kubernetes.io/managed-by: argocd-autopilot
    app.kubernetes.io/name: root
  name: root
  namespace: argocd
spec:
  destination:
    namespace: argocd
    server: https://kubernetes.default.svc
  ignoreDifferences:
    - group: argoproj.io
      jsonPointers:
        - /status
      kind: Application
  project: default
  source:
    path: projects
    repoURL: https://gitlab.com/xxx/argocd.git
  syncPolicy:
    automated:
      allowEmpty: true
      prune: true
      selfHeal: true
    syncOptions:
      - allowEmpty=true
status:
  health: {}
  summary: {}
  sync:
    comparedTo:
      destination: {}
      source:
        repoURL: ""
    status: ""

@jannfis
Contributor

jannfis commented Oct 28, 2024

I assume that is the version stored in Git, I am more interested in the live resource in the cluster.

I suspect something (maybe image updater) might have fiddled with the source block there.
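
For reference, the live spec (as opposed to the one stored in Git) can be pulled straight from the cluster, e.g.:

kubectl get application root -n argocd -o yaml
# or via the Argo CD CLI:
argocd app get root -o yaml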

@mehdicopter
Author

mehdicopter commented Oct 28, 2024

I assume that is the version stored in Git, I am more interested in the live resource in the cluster.

I suspect something (maybe image updater) might have fiddled with the source block there.

I'll update to 0.15.0 and try to get the live version. I say "try" because when I update, the YAML becomes the same as another app's, so it will be difficult to get the live spec of the root app itself.

@LGLN-LS

LGLN-LS commented Oct 30, 2024

I have encountered a similar problem. But I am not using the "App of Apps" pattern.

My setup looks like this: I deploy multiple nginx servers with different Docker images/tags in different Kubernetes namespaces. They all use the same Helm chart for deployment, but with different values.yaml files. Each application has its own ArgoCD Application resource, which is applied manually to Kubernetes.

Application        Docker image   Kubernetes namespace
App-A-Development  app-a:dev      app-a-dev
App-A-Staging      app-a:staging  app-a-staging
App-B-Development  app-b:dev      app-b-dev
App-B-Staging      app-b:staging  app-b-staging

With argocd-image-updater v0.14.0 everything works as intended.

After I updated argocd-image-updater to v0.15.0 something strange happened. Our monitoring issued an alert because no more metrics could be collected from App-B-Staging. I started to investigate and noticed that the namespace app-b-staging was empty. The deployment was gone. Then I checked argocd and I got a SharedResourceWarning for App-B-Staging.

ArgoCD was trying to deploy App-B-Staging to the app-b-dev namespace with the app-b:dev Docker image. Somehow the configurations must have gotten mixed up.
I tried to delete and reinstall App-B-Staging and it worked. But then App-A-Staging had the exact same issue!

After that, I downgraded argocd-image-updater back to v0.14.0, reapplied the ArgoCD application resources, and everything worked as expected.
I hope this is understandable and helps in finding the issue.

@JeromeMSD

JeromeMSD commented Oct 30, 2024

Same issue for a week with the v0.15.0 version. It breaks one app in a set of 125. Application specs of some apps that use argocd-image-updater appear to drift to other applications that also use the argocd-image-updater annotations.
Deleting the broken app just shifts the configuration to another one.

I tried a hard refresh, restarting most of ArgoCD's components, and even manually cleaning argocd-redis.
This behavior only stops when argocd-image-updater is not running.

Note

Update - Same as @LGLN-LS, downgrading argocd-image-updater to v0.14.0 fixes the issue.

@jannfis
Contributor

jannfis commented Oct 30, 2024

Thanks everyone! Appreciate the insights here. I assume y'all who are hitting this problem are using the default argocd update method, and not Git write-back, right?
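
(For anyone unsure which method they're on: image-updater selects the write-back method per application through an annotation, and defaults to argocd when the annotation is absent. A sketch:)

metadata:
  annotations:
    # default; patches the Application resource in the cluster
    argocd-image-updater.argoproj.io/write-back-method: argocd
    # alternative; commits the change back to Git instead
    # argocd-image-updater.argoproj.io/write-back-method: git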

@mehdicopter
Author

mehdicopter commented Oct 30, 2024

Thanks everyone! Appreciate the insights here. I assume y'all who are hitting this problem are using the default argocd update method, and not Git write-back, right?

I am using both methods. But the one which fails is the default one indeed.

@JeromeMSD

Thanks everyone! Appreciate the insights here. I assume y'all who are hitting this problem are using the default argocd update method, and not Git write-back, right?

Indeed, it's the specs of recently auto-updated apps without the Git write-back method that overwrite the specifications of other apps. In my case, "recently auto-updated apps" refers to those that have received a new image since the upgrade to v0.15.0.

@christian-schlichtherle

I found this issue because version 0.15.0 overwrote my application, too. In our case, one of our own Application instances dumped its resources into another Application instance.

@AmitBenAmi

I observed the same behavior with the app-of-apps pattern in 2 different environments with 2 different ArgoCD versions; unfortunately, argocd-image-updater was configured with the latest tag, so it was certainly running v0.15.0 or later.

I managed to see that within the app of apps, the last application in the list (lexicographically) had its repoURL and path overwritten, causing that application to point to the wrong GitHub repo and path and eventually take ownership of a different application entirely outside the app of apps.

So eventually I had:

  1. Application called infrastructure with all of the really important resources
  2. Application called dummy app that took over the infrastructure resources

It also removed those resources at some point, effectively causing my cluster to be in a bad state.

It happened both with a regular app of apps and with an ApplicationSet setup.

I opened an issue with ArgoCD (argoproj/argo-cd#20440) that explains some of the behavior I saw, but that was before I found this issue.

@HuseyinCoinmerce

I also had this problem, and in my case I was using the app of apps pattern. Funnily enough, after deploying the new version of image-updater, it replaced the Application for our Prometheus instance with an app from the pattern :S

@tokongs

tokongs commented Nov 4, 2024

This bug is particularly bad if you have any apps with prune: true!

  1. App A is overwritten to point to App B's resources
  2. App A is overwritten to point to App C's resources. App C has prune: true, which means App A also has it.
  3. Watch as ArgoCD prunes all of App B's resources.

We just had a bunch of resources deleted because of this failure mode.
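
For context, the flag in question lives in the app's syncPolicy (minimal sketch below); once App A inherits a spec containing it, anything the new source no longer renders gets deleted:

syncPolicy:
  automated:
    prune: true    # delete resources no longer rendered by the source
    selfHeal: true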

@chengfang
Collaborator

Thanks for all the inputs. I've tested with a sample app but couldn't reproduce the issue. The updates were correctly written to the git repo's kustomization.yaml for each child app, and kubectl describe for each of app1, app2, and root shows the correct state. For simplicity, all apps are in the control plane namespace; not sure if separating them into different namespaces will make a difference. I'll also try the argocd write-back method, which was mentioned in some previous comments. Any ideas on how to reproduce it with this sample app are welcome.

@tokongs

tokongs commented Nov 5, 2024

@chengfang I don't know if it makes a difference, but we don't use the git write-back. After a brief glance at the code, it looks like the spec is only updated when you use the WriteBackApplication method.

@chengfang
Collaborator

chengfang commented Nov 5, 2024

After changing the write-back-method to argocd, I was able to reproduce it with the sample app. After the image-update run, the 2 child apps (app1, app2) got removed, and the root app was erroneously updated to be like app2. Will keep looking.

kubectl describe -n argocd app root

  History:
    Deploy Started At:  2024-11-05T19:14:05Z
    Deployed At:        2024-11-05T19:14:05Z
    Id:                 0
    Initiated By:
      Automated:  true
    Revision:     e64ea670f0a64d8f5671c261c68fe52ff0b1c7c3
    Source:
      Path:             app-of-apps/apps
      Repo URL:         https://github.com/chengfang/image-updater-examples.git
      Target Revision:  main
    
    Deploy Started At:  2024-11-05T19:16:24Z
    Deployed At:        2024-11-05T19:16:24Z
    Id:                 1
    Initiated By:
      Automated:  true
    Revision:     e64ea670f0a64d8f5671c261c68fe52ff0b1c7c3
    Source:
      Kustomize:
        Images:
          nginx:1.12.2
      Path:             app-of-apps/source/overlays/app2
      Repo URL:         https://github.com/chengfang/image-updater-examples.git
      Target Revision:  main

@fletch3555

I wish I had seen this issue before we updated to v0.15.0 this morning. Spent several hours trying to track down very weird behavior. We also follow the App of Apps (of Apps) pattern and were seeing one "leaf" app update to match the resources of another and couldn't figure out what was going on. Like others, reverting back to v0.14.0 resolved it.

All apps are git-hosted Helm charts, and the Apps/AppSets often override Helm values inline and make use of image-updater with argo writeback.

I'm not sure how much help I would be on this, but happy to help test/troubleshoot however possible.

chengfang added a commit to chengfang/argocd-image-updater that referenced this issue Nov 6, 2024
Signed-off-by: Cheng Fang <cfang@redhat.com>
chengfang added a commit to chengfang/argocd-image-updater that referenced this issue Nov 6, 2024
Signed-off-by: Cheng Fang <cfang@redhat.com>
@chengfang
Collaborator

With the linked PR, my sample app of apps now works and updates correctly. It would be great if we could get some more testing with your real apps.

@HuseyinCoinmerce

@chengfang if an image is available, I can give it a try.

@mehdicopter
Author

With the linked PR, my sample app of apps now works and updates correctly. It would be great if we could get some more testing with your real apps.

If you could upload an image, I will test it.

@paolofacchinetti

I'm not using app of apps, and yet two different ArgoCD applications running on different clusters have a bunch of "SharedResourceWarning"s after updating image-updater to 0.15.0.

I would just remove the 0.15.0 release and make it unavailable before other people stumble upon it. The blast radius of this issue is exceptionally large and may break production for a lot of users.

@chengfang
Collaborator

I pushed the fixed image (my local build) to my personal quay repo for testing purposes: https://quay.io/repository/cfang/argocd-image-updater?tab=tags
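
If you deploy image-updater with kustomize (as shown earlier in this thread), swapping in the test build should just be an image override in kustomization.yaml; the tag below is a placeholder, use whichever tag is on the quay repo:

# kustomization.yaml
images:
  - name: quay.io/argoprojlabs/argocd-image-updater
    newName: quay.io/cfang/argocd-image-updater
    newTag: latest  # placeholder; pick the actual tag from the quay repo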

@jamct

jamct commented Nov 6, 2024

Thanks @chengfang for your effort. I can confirm that the problem existed in version 0.15.0 (App of Apps, no write-back to Git). With your version 0.15.1 everything works as expected.

@ognyan-lazarov-cloudoffice

I can confirm as well that with @chengfang's version 0.15.1 everything works correctly.
Thank you, @chengfang!

@yo-l1982

yo-l1982 commented Nov 8, 2024

Make sure to clear out any added resources after rolling back to 0.14.0.
I had a duplicate of everything in the app namespace that got overwritten (not removed by Argo or visible in Argo).

@jannfis jannfis closed this as completed in 8146cf1 Nov 8, 2024
chengfang added a commit to chengfang/argocd-image-updater that referenced this issue Nov 8, 2024
jannfis pushed a commit that referenced this issue Nov 8, 2024
…5] (#920)

Signed-off-by: Pasha Kostohrys <pavel@codefresh.io>
Signed-off-by: Cheng Fang <cfang@redhat.com>
Co-authored-by: pasha-codefresh <pavel@codefresh.io>
@christian-schlichtherle

I agree that 0.15.0 should be recalled. In our case, it basically destroyed the deployed app along with all its volumes. It took me hours to recover from that, including restoring a backup. If I had no backup, I would have lost some data forever.

@mehdicopter
Author

I agree that 0.15.0 should be recalled. In our case, it basically destroyed the deployed app along with all its volumes. It took me hours to recover from that, including restoring a backup. If I had no backup, I would have lost some data forever.

Friendly reminder....
Screenshot 2024-11-08 at 21 56 51

@chengfang
Collaborator

The patch release v0.15.1 was just released: https://github.com/argoproj-labs/argocd-image-updater/releases/tag/v0.15.1
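
For the default manifests-based install, upgrading should just be a matter of re-applying the pinned manifest (the path below assumes the project's usual manifest layout):

kubectl apply -n argocd -f https://raw.githubusercontent.com/argoproj-labs/argocd-image-updater/v0.15.1/manifests/install.yaml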

@yo-l1982

yo-l1982 commented Nov 9, 2024

I agree that 0.15.0 should be recalled. In our case, it basically destroyed the deployed app along with all its volumes. It took me hours to recover from that, including restoring a backup. If I had no backup, I would have lost some data forever.

Friendly reminder.... Screenshot 2024-11-08 at 21 56 51

Even in dev/test environments, 0.15.0 can be very destructive.

I don't see any reason to keep the release public.

@RmStorm

RmStorm commented Nov 11, 2024

I agree that 0.15.0 should be recalled. In our case, it basically destroyed the deployed app along with all its volumes. It took me hours to recover from that, including restoring a backup. If I had no backup, I would have lost some data forever.

Friendly reminder.... Screenshot 2024-11-08 at 21 56 51

I don't really think this holds up when ArgoCD itself literally points to argocd-image-updater in the top 3 of its recommended blog posts/presentations. The fact is that a lot of people who use ArgoCD also use image-updater, because keeping control over which images actually enter the cluster is important! And they do it in production. I think the correct course of action is to pull 0.15.0. No need to leave that release public when it can be so incredibly destructive.

Furthermore, I think argocd-image-updater should acknowledge that a great many people who use ArgoCD also use argocd-image-updater, and that it would be proper to start treating it as a production-critical project by now. For example, the warning cautions about potentially many breaking changes, but the previous release with breaking changes was 0.12.0, which is more than two and a half years old! Of course, with that kind of stability, people will start to consider argocd-image-updater stable even if you keep a big warning on the project 🙈.

I agree with you that they shouldn't do that and that the warning is unambiguous, but in my experience this is not how people actually behave.

edit: ps.: I made an issue for this!
