
Batched Requests #892

Closed
tobyjaguar opened this issue Jun 16, 2020 · 21 comments

Comments

@tobyjaguar

Does ethers have a way to do serialized batch requests?

re:
https://web3js.readthedocs.io/en/v1.2.9/web3-eth.html#batchrequest

@tobyjaguar
Author

I am going to close this one out, as the desired behavior would not necessarily execute in a batched request. A transaction queue that executes off of mined transactions is probably the solution here.

thanks

@ricmoo
Member

ricmoo commented Apr 18, 2021

I've added ethers.providers.JsonRpcBatchProvider in 5.1.1. Try it out and let me know if it works for you.

@tobyjaguar
Author

oh, this is great! thank you very much. will report back

@tobyjaguar
Author

Am I giving JsonRpcBatchProvider.prototype.send(method, params) an array, or is there another way to instantiate the pendingBatch pool?

https://github.com/ethers-io/ethers.js/blob/master/packages/providers/lib/json-rpc-batch-provider.js#L28

Not quite sure how to schedule a series of transactions

@ricmoo
Member

ricmoo commented Apr 29, 2021

No, you use it like a normal JsonRpcProvider, with the standard Provider API. There is nothing special about how you interact with it. It will implicitly batch all requests made within a single event loop for you.

const provider = new JsonRpcBatchProvider();

// Begin queuing...
const promiseA = provider.getBlockNumber();
const promiseB = provider.getGasPrice();

// This line won't finish until both A and B are complete, and both A and B are sent as a batch
const [ blockNumber, gasPrice ] = await Promise.all([ promiseA, promiseB ]);

// Queue some new things...
const promises = [ ];
for (let j = 0; j < 10; j++) {
    promises.push(provider.call(getTx(j)));
}

// This line won't complete until the 10 above calls are complete, all of which will be sent as a single batch
const results = await Promise.all(promises);

Otherwise it won't be compatible with any existing libraries or features that use a standard ethers Provider.

I am considering adding a .flush operation that would force it to send all queued requests and clear the pending queue, or allowing a maximum batch size parameter to force a flush if someone tries to queue too many items in one batch. But for now I want to see if anyone uses it first. I'm also considering adding a timer which would batch all requests within a timeframe instead of basing it entirely on event loops, but for the way most people want to batch (by making a bunch of initial requests up front), this should work.

Make sense?

@tobyjaguar
Author

hmm, this line for me:
tx.push(contract.catchDataCall(price, { gasLimit: 50000000 }));

seems to execute regardless of whether I await it. My experience with web3 batching is that with their API you add requests to a batch array and then send them all with a batch send call. Here you are just queuing them up, and whether they have begun doesn't matter; you are just waiting for them all to complete?

@tobyjaguar
Author

also I am sure I don't have this quite right, but would the nonce get incremented using the batch provider?

[
  {
    nonce: 2,
    gasPrice: BigNumber { _hex: '0x00', _isBigNumber: true },
    gasLimit: BigNumber { _hex: '0x02faf080', _isBigNumber: true },
    to: '0xE4A176c6aC45CC8B4E9486Abbdd4CA5e93678e64',
    value: BigNumber { _hex: '0x00', _isBigNumber: true },
    data: '0x1477e7880000000000000000000000000000000000000000000000000000000000000005',
    chainId: 212984383488152,
    v: 425968766976339,
    r: '0x2410e0703f2c2330ca8d3afc40d24554fa6b8a35ec18ef889fa5b44b1a3a6a63',
    s: '0x1329fc6ffb3b41ea166e99209f017670aad52d18cacf4ee62817126954060972',
    from: '0x0Ec2996B99B39b52369853254182D813E56f0769',
    hash: '0x95dbdc33fb2034e96ac3e4dd16724b6abbce8df34952d09a974b338a7082818f',
    type: null,
    wait: [Function]
  },
  {
    nonce: 2,
    gasPrice: BigNumber { _hex: '0x00', _isBigNumber: true },
    gasLimit: BigNumber { _hex: '0x02faf080', _isBigNumber: true },
    to: '0xE4A176c6aC45CC8B4E9486Abbdd4CA5e93678e64',
    value: BigNumber { _hex: '0x00', _isBigNumber: true },
    data: '0x1477e7880000000000000000000000000000000000000000000000000000000000000006',
    chainId: 212984383488152,
    v: 425968766976339,
    r: '0xaa17202875d29a45b90c204bf2cbebe18cad430d919e822064c4ba50c673dcea',
    s: '0x7fdf5976de22ab132a7c523c5e05b34ed3d01d9974ee6156d01251bc4bdfb879',
    from: '0x0Ec2996B99B39b52369853254182D813E56f0769',
    hash: '0xcd93201e844b457005cd59cf1c85418e4f5263290a92654646eb15ddc6c993e7',
    type: null,
    wait: [Function]
  },
  {
    nonce: 2,
    gasPrice: BigNumber { _hex: '0x00', _isBigNumber: true },
    gasLimit: BigNumber { _hex: '0x02faf080', _isBigNumber: true },
    to: '0xE4A176c6aC45CC8B4E9486Abbdd4CA5e93678e64',
    value: BigNumber { _hex: '0x00', _isBigNumber: true },
    data: '0x1477e7880000000000000000000000000000000000000000000000000000000000000007',
    chainId: 212984383488152,
    v: 425968766976339,
    r: '0xebd195d86e0d348c1798a8a3ca9ffed9ded9652d0c7b0bac6b55ea6f4b42ecdb',
    s: '0x52a1a4a77c8faf63406869a095ab62145efc20db40706551174b6d004ef14b47',
    from: '0x0Ec2996B99B39b52369853254182D813E56f0769',
    hash: '0x44a0cbc2aaa0ae388538ea4cb716e004e44a976afb092fe149398844cc9bc38c',
    type: null,
    wait: [Function]
  }
]

@ricmoo
Member

ricmoo commented Apr 29, 2021

Keep in mind batching is not part of the Ethereum spec or Web3, per se. It is part of the JSON-RPC specification, so Ethereum and Web3 inherited it incidentally as a result of that.

It actually makes very little sense in Ethereum, since the outputs of one call are often required as inputs to other calls, and JSON-RPC batching doesn't support those types of inter-dependent connections.

The only real reason to use it is to reduce the number of HTTP connections.

That line will execute immediately, but the call to send will buffer it internally, and if it is the first send of that event loop, it will schedule a send-request for the next event loop. Each subsequent send will not schedule additional send-requests, but will append to the pending queue. Once the current event loop is complete, the next event loop triggers, which causes that scheduled send-request to fire off all pending requests and clear the queue, preparing it for the next event loop.
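The buffering mechanics described above can be sketched in a few lines. This is an illustrative toy, not the ethers source: `MiniBatcher` and `sendBatch` are made-up names, and the real provider schedules differently, but the queue-then-flush-on-next-event-loop shape is the same.

```javascript
// Illustrative sketch (not ethers internals): queue every send() made in
// the same event loop, then flush them together as one batch.
class MiniBatcher {
  constructor(sendBatch) {
    this.sendBatch = sendBatch; // (requests[]) => Promise<results[]>
    this.pending = null;
  }
  send(request) {
    if (this.pending === null) {
      this.pending = [];
      // First send of this event loop: schedule a flush for the next one.
      // Every send() before the flush appends to the same pending batch.
      setTimeout(() => {
        const batch = this.pending;
        this.pending = null;
        this.sendBatch(batch.map((p) => p.request)).then((results) => {
          batch.forEach((p, i) => p.resolve(results[i]));
        });
      }, 0);
    }
    return new Promise((resolve) => {
      this.pending.push({ request, resolve });
    });
  }
}
```

Two `send()` calls in the same event loop reach `sendBatch` as a single two-element array; a `send()` made after an `await` lands in a later batch.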

Now, keep in mind, as I mentioned above, JSON-RPC batching does not support dependencies within the same batch, and ethers implicitly populates things like gas price, gas limit, nonce, etc. if you do not specify them. Populating these things will occur in the next event loop, which means the calls depending on those values are at least 2 event loops away, so they will not get batched together, unless you specify all the properties that Signer.populateTransaction is responsible for (JsonRpcSigners do not do this, though, as they expect the connected node to populate those values, so for the most part those are safe). Many of those intermediate calls will get batched together, though, and some calls will even get deduped within a given event loop (the degree of caching will go up in the next major version).

The JsonRpcBatchProvider will bundle things up as efficiently as possible and ensure dependency order is adhered to, but due to dependencies, there isn’t actually that much to gain in many cases…

Does that make sense?

@ricmoo
Member

ricmoo commented Apr 29, 2021

Batches cannot increment the nonce, because you must look it up, then include it; but a batch must be fully formed at send time, so it cannot know the value. You cannot do a batch that looks like [ x = getNonce, sendTx(x), y = getNonce, sendTx(y), z = getNonce, sendTx(z) ] because data returned within a batch cannot be used within that same batch. You must have a dependency tree of 2 batches: 1) [ x = getNonce ] 2) [ sendTx(x), sendTx(x + 1), sendTx(x + 2) ].

The NonceManager should handle that for you, but (I cannot recall for sure) it may not handle it synchronously, which means each lookup would occur in the next event loop, spreading the calls over multiple batches. If the NonceManager does not return pending promises without an await, this is easy to adjust, but I think it should already do what you want to achieve the above dependency tree.

You can use provider.on("debug", console.log) to check.
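The two-batch dependency tree above can be sketched with a mocked RPC layer. The helper name and the mocks here are made up for illustration; in real code, ethers' NonceManager wrapped around a JsonRpcBatchProvider plays this role.

```javascript
// Sketch of the dependency tree described above, with mocked RPC calls:
// batch 1 is a single nonce lookup; batch 2 sends every transaction with a
// locally incremented nonce, so all sends can go out together.
async function sendWithManagedNonces(getNonce, sendTx, txs) {
  // Batch 1: one nonce lookup shared by all transactions.
  const base = await getNonce();
  // Batch 2: all sends fire in the same event loop, with base, base+1, ...
  return Promise.all(txs.map((tx, i) => sendTx({ ...tx, nonce: base + i })));
}
```

Because the sends share one looked-up value instead of each doing their own `getNonce`, they land in a single batch and the nonces come out sequential, matching the nonce: 4, 5, 6 result later in this thread.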

@tobyjaguar
Author

Okay, I have a successful test:

[
  {
    nonce: 4,
    gasPrice: BigNumber { _hex: '0x00', _isBigNumber: true },
    gasLimit: BigNumber { _hex: '0x02faf080', _isBigNumber: true },
    to: '0xE4A176c6aC45CC8B4E9486Abbdd4CA5e93678e64',
    value: BigNumber { _hex: '0x00', _isBigNumber: true },
    data: '0x1477e7880000000000000000000000000000000000000000000000000000000000000005',
    chainId: 212984383488152,
    v: 425968766976339,
    r: '0x1f2b645ed270d3dcb6bd7c4a194d1b962e1d07a0e176dedd0cb40b5278302892',
    s: '0x79e69c198b7bd46075884e602e4577cfd46c44d04925a4c2d1ac126bd469fe5d',
    from: '0x0Ec2996B99B39b52369853254182D813E56f0769',
    hash: '0x5a3f07d61f59d9a0d9df4b9ad01bde3ca12087cb51bcd5f5e53cde951b8b39c7',
    type: null,
    wait: [Function]
  },
  {
    nonce: 5,
    gasPrice: BigNumber { _hex: '0x00', _isBigNumber: true },
    gasLimit: BigNumber { _hex: '0x02faf080', _isBigNumber: true },
    to: '0xE4A176c6aC45CC8B4E9486Abbdd4CA5e93678e64',
    value: BigNumber { _hex: '0x00', _isBigNumber: true },
    data: '0x1477e7880000000000000000000000000000000000000000000000000000000000000006',
    chainId: 212984383488152,
    v: 425968766976340,
    r: '0x58db73c9729109d1ef28d90a5fe8de3cf081ee779498ac2b31e488bb9d5fc6ad',
    s: '0x6c6b7c2cea1540a8477b75c60ed56328de2eff5d1383f2da74b4f56c5f6bde51',
    from: '0x0Ec2996B99B39b52369853254182D813E56f0769',
    hash: '0xb0220afb6850ddc3c6220b50ec9855241f6495b47ba461085a0d4d1686cf2bc5',
    type: null,
    wait: [Function]
  },
  {
    nonce: 6,
    gasPrice: BigNumber { _hex: '0x00', _isBigNumber: true },
    gasLimit: BigNumber { _hex: '0x02faf080', _isBigNumber: true },
    to: '0xE4A176c6aC45CC8B4E9486Abbdd4CA5e93678e64',
    value: BigNumber { _hex: '0x00', _isBigNumber: true },
    data: '0x1477e7880000000000000000000000000000000000000000000000000000000000000007',
    chainId: 212984383488152,
    v: 425968766976340,
    r: '0xdb8e2e67cf68b4386fa3eceac551b787e3c7e3ff8761e50b6aeae50dd1d18f0e',
    s: '0x547ad0cfbbdb000e3038600ebc2e5f9ea72290e59a73e1031da8d589403d74b3',
    from: '0x0Ec2996B99B39b52369853254182D813E56f0769',
    hash: '0xf62d47701f6c0e1bbe82ad8424f4dcf0748ef15b834fd73b3ea3c84d0b48d731',
    type: null,
    wait: [Function]
  }
]

updated the contract correctly. I'll play around with it. Thanks for the hard work.

@ricmoo
Member

ricmoo commented Apr 29, 2021

No problem. I think performance can improve by allowing a time-slice-based batch instead of an event-loop-based solution, but that requires some experimentation, as each time-slice will put an extra delay on each step of the dependencies.

It's still a work in progress. :)

@tobyjaguar
Author

I was using a websocket for this service (and use it for many others), but this doesn't need to be connected all the time. If I use the batch RPC provider to wake up, establish an RPC connection, and then go back to sleep, what is the best way to destroy the provider so I don't build up connections on every wake-up? (Infura often goes stale, so it would be nice to make and break the connection.)

@tobyjaguar
Author

so far so good with the batching provider

@jamesryan83

Will using JsonRpcBatchProvider send only a single HTTP request? I've been using web3 batch requests, and they seem to send only a single request for multiple web3 calls (see the browser devtools payload below). I've found I can batch up to about 5,000 web3 requests into one batch and they will be sent in a single HTTP request. I noticed MetaMask recommends using ethers.js now and thought I'd change over from web3, but I'm using a lot of batch requests with web3 in my app and wasn't sure if ethers.js batch requests work the same way. I'm just concerned I might exceed rate limits doing one HTTP request per call.

[screenshot: browser devtools showing multiple web3 calls sent as a single batched HTTP payload]

@ricmoo
Member

ricmoo commented May 18, 2021

@jamesryan83

The JsonRpcBatchProvider will batch any requests made during the same event loop. Many operations in ethers perform lookups for you, so dependency calls will be made (and batched) as best as possible, and then the dependent calls made.

If you fully describe the call (i.e. include a gasLimit, gasPrice, from, etc) and place them in the same event loop, they will be batched into one call. Otherwise, the dependencies (looking up nonces, etc) will be made in one event loop, and depending on the response order and timing, additional batches will be made with those values.
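On the wire, a JSON-RPC batch is just a JSON array of request objects sent in one HTTP POST, which is why fully described calls queued in the same event loop cost a single request. A minimal sketch of building that payload (the helper name is made up; the envelope fields follow the JSON-RPC 2.0 spec):

```javascript
// Sketch: assemble one JSON-RPC 2.0 batch payload from queued requests.
// Each request becomes one entry in the array, with a unique id so the
// responses (which may arrive in any order) can be matched back up.
function buildBatchPayload(requests) {
  return requests.map((req, i) => ({
    jsonrpc: "2.0",
    id: i + 1,
    method: req.method,
    params: req.params,
  }));
}
```

For example, queuing `eth_blockNumber` and `eth_gasPrice` together yields a two-element array posted as a single body, rather than two separate HTTP requests.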

Future changes I will likely make to the JsonRpcBatchProvider include using a timer to decide what to batch (relaxing the event-loop restriction), and enabling some sort of explicit mode, although this could possibly cause apps to terminate in some environments, since it would allow for unresolvable Promise-Promise dependencies.

In v6, the JsonRpcProvider will also more aggressively cache certain calls made during the same event loop, which will be added to the JsonRpcBatchProvider too.

But try it out, and let me know if it works for you.

@alexbakers

alexbakers commented Feb 11, 2023

But try it out, and let me know if it works for you.

@ricmoo

Could you give an example, or show in the documentation, how to call multiple functions in a contract at once, so that it happens in one transaction?

  const provider = new ethers.providers.JsonRpcBatchProvider(rpc);
  const signer = new ethers.Wallet(privateKey, provider);
  const one = new ethers.Contract(addr1, abi1, signer);
  const two = new ethers.Contract(addr2, abi2, signer);

  const { wait: waitFunc1 } = await one.Func1();
  await waitFunc1();

  const { wait: waitFunc2 } = await two.Func2();
  await waitFunc2();

@ricmoo
Member

ricmoo commented Feb 11, 2023

@alexbakers You could use:

const txs = await Promise.all([ one.Func1(), two.Func2() ]);
const receipts = await Promise.all(txs.map((t) => t.wait()));

Keep in mind that ethers will populate things like the nonce for you, so any of those async properties (which form a dependency tree) will delay the transaction to the next event loop.

So, the event loop immediately after those transactions were made would look like [ getNonce, estimateGas1, estimateGas2 ], and once those came back (assuming they come back close enough to the same time) the next would look like [ sendTx1, sendTx2 ]. I'm assuming you would be using the NonceManager; otherwise there would be two getNonce calls, which would likely step on each other. But if estimateGas1 comes back early and estimateGas2 takes its sweet time, sendTx1 will probably go out before sendTx2.

It implicitly forms a dependency tree, and will batch things as efficiently as possible.

If you need more control, you can just make sure you supply the necessary overrides into the Func(overrides) calls, so ethers doesn't need to look up gas estimates or nonces for you.

Does that make sense?

@Alexandra-Popa

@alexbakers were you able to make it work? I'm trying to make a multicall which should fail if one of the write functions fails. Could you please share a sample code if you made it work? Cheers!

@hairyf

hairyf commented Feb 2, 2024

Using ethers-batch-request, you can perform some simple contract queries.

@ricmoo
Member

ricmoo commented Feb 2, 2024

Batch requests are built-in and enabled by default in Ethers v6, and can be adjusted for max batch size and max request count.

If you wish to use multicall for aggregating many read operations, there is also the MulticallProvider which will use a deploy-less technique to merge many calls into a single call.
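The v6 batching knobs mentioned above are passed as provider options. A hedged sketch — the option names below (`batchMaxCount`, `batchMaxSize`, `batchStallTime`) follow the ethers v6 JsonRpcProvider documentation as I understand it, so verify them against your installed version:

```javascript
// Sketch: batching options for an ethers v6 JsonRpcProvider (assumed
// names per the v6 docs; check your installed version).
const batchOptions = {
  batchMaxCount: 100,     // max requests per batch; set to 1 to disable batching
  batchMaxSize: 1 << 20,  // max batch payload size, in bytes
  batchStallTime: 10,     // ms to wait for more requests before flushing
};
// Usage (requires ethers v6 installed; url is a placeholder):
// const provider = new ethers.JsonRpcProvider(url, undefined, batchOptions);
```

Unlike the v5 JsonRpcBatchProvider, this means batching behavior is tunable per provider instance rather than tied strictly to the event loop.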

@macaujack

macaujack commented Apr 16, 2024

Hi @ricmoo, by "Batch requests are built-in and enabled by default in Ethers v6", do you mean that when you await Promise.all(transactionPromises), the JsonRpcProvider will try to batch as many of the transactions as possible? If yes, then what's the difference between ethers.js v6's native JsonRpcProvider and the MulticallProvider in the @ethers-ext/provider-multicall package, other than that MulticallProvider only supports read-only calls while JsonRpcProvider supports read-write calls?
