
Issues when running multiple bundlers concurrently. #1903

Open · gdborton opened this issue Aug 16, 2018 · 7 comments

@gdborton
🐛 bug report

Two issues occur when you run multiple builds concurrently, or even sequentially, in the same process.

First, module resolution appears to run only once and is shared across both builds, which creates a problem if, for example, you're building a browser bundle and a server bundle.

Second, Parcel keeps the process open even after completing its work when building with two bundlers.

🎛 Configuration (.babelrc, package.json, cli command)

See this simple repository: https://github.com/gdborton/parcel-concurrent-build

🤔 Expected Behavior

Building bundles for both server and browser should include the correct modules for each (the browser vs. main fields in package.json).
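
For context, a package that ships separate Node and browser entry points declares them roughly like this in its package.json (a generic illustration, not copied from any particular package):

{
  "name": "some-dual-package",
  "main": "./src/index.js",
  "browser": "./src/browser.js"
}

A browser-targeted bundle should resolve the package to src/browser.js, while a Node-targeted bundle should resolve it to src/index.js.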

😯 Current Behavior

Parcel appears to share the same module resolution across bundlers, despite one targeting Node and the other targeting browsers.

💁 Possible Solution

For the first problem, scope module resolution to each bundler; hopefully this is fairly straightforward.

For the second, perhaps a singleton wrapper around the worker farm API that tracks open and closed workers, along the lines of the sketch below.
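
As an illustration only (SharedWorkerFarm and the createFarm/end names here are assumptions, not Parcel's actual internal API), a reference-counting singleton wrapper might look like:

// Hypothetical sketch: a reference-counted wrapper around the worker farm.
// All names here are illustrative, not Parcel internals.
class SharedWorkerFarm {
  constructor(createFarm) {
    this.createFarm = createFarm; // factory for the underlying farm
    this.farm = null;
    this.refs = 0; // number of bundlers currently using the farm
  }

  acquire() {
    if (this.refs === 0) {
      this.farm = this.createFarm(); // lazily start the farm on first use
    }
    this.refs++;
    return this.farm;
  }

  release() {
    this.refs--;
    if (this.refs === 0) {
      this.farm.end(); // last user gone: stop workers so the process can exit
      this.farm = null;
    }
  }
}

Each bundler would call acquire() when it starts and release() when it finishes, so the process only stays open while at least one bundler is active.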

🔦 Context

I think #1771 is a related issue, though I'm not sure it's a 100% duplicate.

💻 Code Sample

See this simple repository: https://github.com/gdborton/parcel-concurrent-build
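
The concurrent setup in that repository looks roughly like this (entry file names and options are illustrative; see the repo for the exact code):

const Bundler = require('parcel-bundler');

const clientBundler = new Bundler('src/client.js', { target: 'browser', watch: false });
const serverBundler = new Bundler('src/server.js', { target: 'node', watch: false });

// Running both concurrently triggers the shared-resolution issue,
// and the process stays open after both bundles complete.
Promise.all([clientBundler.bundle(), serverBundler.bundle()])
  .then(() => console.log('Both bundles finished'));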

🌍 Your Environment

Software Version(s)
Parcel
Node
npm/Yarn
Operating System
@DeMoorJasper (Member) commented Aug 16, 2018

Changing your Promise.all over the bundle calls to this async IIFE should solve your issue, until we figure out how to improve the workerFarm to handle this.

// Assumes `clientBundler` and `serverBundler` are already-constructed
// parcel-bundler Bundler instances, as in the repro repository.
const assert = require('assert');

(async function bundleEverything() {
  console.log('Start bundling...');

  // Bundle sequentially so the two bundlers never share a live worker farm.
  let clientBundle = await clientBundler.bundle();
  console.log('Client bundled...');
  let serverBundle = await serverBundler.bundle();
  console.log('Server bundled...');

  // debug/src/browser is the "browser" entry for the debug package,
  // so the client bundle should contain it if resolution was correct.
  const debugBrowserLocation = require.resolve('debug/src/browser');
  const containsBrowserFile = !!Array.from(clientBundle.assets).find(
    item => item.name === debugBrowserLocation
  );
  assert(containsBrowserFile);

  console.log('Bundling finished!');
})();

EDIT: This script cleanly starts up and shuts down an entire workerFarm for each bundle; that way the options don't get overwritten and the process exits cleanly.

This should not be an issue as long as you do production builds without watching. The moment you start watching, you have no option other than running them in parallel, so this bug definitely needs to be resolved.

@DeMoorJasper (Member)

So the issue is that getShared gets called, and if a shared worker farm already exists, it overwrites the options of the entire workerFarm, blocking all workers while their options are updated and re-enabling them once they are up to date with the latest options. (So browser => Node.)

If you flipped the order inside the Promise.all, you would get the opposite result.

We could solve this by assigning an ID to each workerFarm based on its options and entry point, or by making the workerFarm part of the bundler object, so we don't overwrite options. I'm not sure; both solutions seem a bit too complex to me.
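
As a rough sketch of the first idea (getFarm, farmKey, and createFarm are hypothetical names, not Parcel internals), farms could be looked up by a hash of their options and entry point:

const crypto = require('crypto');

const farms = new Map();

function farmKey(entryPoint, options) {
  return crypto
    .createHash('md5')
    .update(entryPoint + JSON.stringify(options))
    .digest('hex');
}

function getFarm(entryPoint, options, createFarm) {
  const key = farmKey(entryPoint, options);
  if (!farms.has(key)) {
    // Each distinct (entry point, options) pair gets its own farm,
    // so one bundler can never overwrite another's options.
    farms.set(key, createFarm(options));
  }
  return farms.get(key);
}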

@gdborton (Author)

I'm not entirely sure that I'm following your solution, but I think ideally we'd still use the same set of workers with differing configs, to avoid creating too many threads.

This could be done by passing the config with each request (potentially slow due to IPC, with the cost largely determined by message size), or by initializing workers with a config and then passing a reference to that config with each message (the reference could be a hash of the config object); see the sketch below.
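
A rough sketch of the second approach (all names here are hypothetical, not Parcel's actual worker API): register each config once, then send only its hash with every work request.

const crypto = require('crypto');

// Worker-side registry: hash -> full config object.
const configs = new Map();

function hashConfig(config) {
  return crypto.createHash('md5').update(JSON.stringify(config)).digest('hex');
}

// Called once per bundler, shipping the full config over IPC a single time.
function registerConfig(config) {
  const hash = hashConfig(config);
  if (!configs.has(hash)) {
    configs.set(hash, config);
  }
  return hash;
}

// Every subsequent request carries only the small hash, not the whole config.
function handleRequest({ configHash, assetPath }) {
  const config = configs.get(configHash);
  // ...process the asset at assetPath using `config`...
}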

@DeMoorJasper (Member) commented Aug 17, 2018

Bundler.bundle starts and stops the entire workerFarm, so it does not overwrite the configs at all. (This will not work in watch mode.)

As for the solution, I guess adding a config hash would solve the issue.

@jamiebuilds (Member)

Note: Parcel 2 will be able to solve this by running a single instance of Parcel with two entry points to the same file, each with different configs.

@gdborton (Author)

@jamiebuilds do you have a roadmap/timeline for parcel v2?

@jamiebuilds (Member)

#1952
