
bug: Vercel deployment fails #419

Closed · 1 task done
technophile-04 opened this issue Jul 13, 2023 · 10 comments

@technophile-04 (Collaborator) commented Jul 13, 2023

Is there an existing issue for this?

Current Behavior

Running: `yarn vercel`

[screenshot of the failed Vercel build]

Steps To Reproduce

Alternatively, to reproduce locally:

1. Update viem and wagmi to the latest version: `yarn workspace @se-2/nextjs up wagmi viem`
2. Run check-types: `yarn next:check-types`

However, through the above steps you get different type errors than on Vercel:

[screenshot of the local type errors]

Anything else?

I think the reason it's failing on the Vercel deployment and not locally is that we have `^` for viem and wagmi in package.json, so during `yarn install` on Vercel it pulls the latest patch versions of both viem and wagmi, and it seems that in the latest viem and wagmi it's broken.

I think this problem is again related to -> wevm/wagmi#2421 (comment)
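For illustration, one way to guard against this is to pin exact versions instead of caret ranges, so Vercel can't silently pick up a newer patch release. A minimal sketch of the relevant part of `packages/nextjs/package.json` (the version numbers below are placeholders, not the actual pinned ones):

```json
{
  "dependencies": {
    "viem": "1.2.8",
    "wagmi": "1.3.2"
  }
}
```

Without the `^`, `yarn install` resolves the same releases everywhere, lockfile or not.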

@carletex (Member)

Thanks for reporting this, Shiv.

What I don't understand is why Vercel installs a different version of the packages. Even if we have ^ in package.json, it should take the exact version from yarn.lock (that's the goal of this file!)
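For context on why this expectation is reasonable: a Yarn Berry lockfile records an exact resolution for every caret range in package.json. An illustrative entry (the version numbers are examples, not the real ones):

```yaml
# yarn.lock (Yarn Berry format) -- illustrative entry
"viem@npm:^1.2.0":
  version: 1.2.0
  resolution: "viem@npm:1.2.0"
  languageName: node
  linkType: hard
```

As long as this file is present during the install, `viem@^1.2.0` keeps resolving to 1.2.0 rather than the latest patch.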

@technophile-04 (Collaborator, Author)

> What I don't understand is why Vercel installs a different version of the packages. Even if we have ^ in package.json, it should take the exact version from yarn.lock (that's the goal of this file!)

Yeah this makes sense!!

I tried going through the deployment logs and found this:

[screenshot from the deployment logs]

Also if you look at the source uploaded, it doesn't have yarn.lock (I think because we are just uploading the nextjs workspace and yarn.lock is present at the root?)

[screenshot of the uploaded source, with no yarn.lock]
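To spell out the layout (roughly, for the SE-2 monorepo): the only yarn.lock lives at the repo root, while `yarn vercel` uploads just the nextjs workspace:

```text
scaffold-eth-2/
├── package.json
├── yarn.lock          <- only lockfile, at the repo root
└── packages/
    ├── hardhat/
    └── nextjs/        <- what gets uploaded to Vercel (no yarn.lock here)
```

With no lockfile in the upload, Vercel resolves the `^` ranges fresh, which is how the broken latest patches get in.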

@carletex (Member)

Oh damn, you are right @technophile-04!! Thanks for looking into it.

Maybe we could explore if there is a way to make a dedicated yarn.lock for each package?? (this will also be great for the CLI)

I think we can merge PRs #420 and #421 for now.

@carletex (Member) commented Jul 14, 2023

Another option is to make the root yarn.lock available on the Vercel build (I think we looked at something similar when playing with moving the scaffold.config.ts to the root of the repo).
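A hypothetical sketch of that option (untested, and the `deploy:vercel` script name is made up): copy the root lockfile into the workspace right before deploying, so it ends up in the upload:

```json
{
  "scripts": {
    "deploy:vercel": "cp ../../yarn.lock . && vercel"
  }
}
```

This would live in `packages/nextjs/package.json`; the trade-off is an extra copied file that can drift from the root lockfile.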

@technophile-04 (Collaborator, Author)

> Maybe we could explore if there is a way to make a dedicated yarn.lock for each package?? (this will also be great for the CLI)

Yes yes, will research this properly!

> Another option is to make the root yarn.lock available on the Vercel build (I think we looked at something similar when playing with moving the scaffold.config.ts to the root of the repo)

Yup, but with this option we'd have to set the root directory manually during deployment :( check out point 1 of #231 (comment)

I have also opened some discussions on Vercel's side regarding this:
vercel/vercel#10211
https://github.com/orgs/vercel/discussions/2145

Nevertheless, I like the first option more (having a dedicated yarn.lock) if that's feasible, but let's see 🙌

@technophile-04 (Collaborator, Author) commented Jul 16, 2023

> Maybe we could explore if there is a way to make a dedicated yarn.lock for each package?? (this will also be great for the CLI)

I found this -> yarnpkg/berry#1223. I think there is no direct way to do this in yarn :( There are some hacky solutions mentioned in the issue using third-party plugins, but I went through them and didn't find any that are stable, well maintained, and popular.

cc @rin-st @sverps in case they have a better approach, or in case I am missing something and there is already an easy solution (because I feel this should have been present in yarn already).

Also just mentioning pnpm here 😅 (its workspaces have an option to create separate lock files). I tried it out and it's really good; we might also benefit from using it in our CLI due to its somewhat faster installs.
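For reference, the pnpm option alluded to here is `shared-workspace-lockfile`: when set to `false`, pnpm writes a separate `pnpm-lock.yaml` inside each workspace package instead of a single root lockfile. A minimal sketch of the relevant monorepo config:

```ini
# .npmrc at the monorepo root
# false => one pnpm-lock.yaml per workspace package
shared-workspace-lockfile=false
```

```yaml
# pnpm-workspace.yaml at the monorepo root
packages:
  - "packages/*"
```

With that in place, deploying `packages/nextjs` alone would carry its own lockfile along.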

carletex pushed a commit that referenced this issue Jul 17, 2023
carletex pushed a commit that referenced this issue Jul 17, 2023
@carletex (Member)

> Also just mentioning pnpm here 😅 (its workspaces have an option to create separate lock files). I tried it out and it's really good; we might also benefit from using it in our CLI due to its somewhat faster installs.

We kept using yarn for historical reasons and because it was easy to set up. But I wouldn't be opposed to switching to pnpm if we have a good reason for it (like the per-package lock files). It supports monorepos too and it seems to be super fast.

But `yarn chain` etc. is a classic in SE, so it looks like a big deal to me haha. I just texted Austin to get his take.

But also let's make sure that there isn't an easier solution that doesn't imply switching.

@rin-st (Member) commented Jul 18, 2023

I believe pnpm is a good solution.

@carletex (Member)

@technophile-04 could you draft a PR (against main) for the pnpm "tests" that you made when you have the chance? So we can also see how pnpm looks on SE-2.

I really like the individual lock files, and it seems that's a modern and performant solution.

@technophile-04 (Collaborator, Author)

> @technophile-04 could you draft a PR (against main) for the pnpm "tests" that you made when you have the chance? So we can also see how pnpm looks on SE-2.

For sure!!!! lol I deleted all the changes, but the migration is pretty much straightforward and easy, so will draft a PR soon 🙌
