Add a JsonRpcBatchProvider (using JSON-RPC batch calls) #62
Hi @doraemondrian, I don't think ethers has batch requests.
Thanks @GFJHogue - Do you know if this is intentional, or if there's a plan to implement it in the future? As far as I know, technically it just involves

so I think it can be done without too much hassle, but I was wondering whether this was the direction this library wanted to go, or if there were other ideas.

P.S. How do people normally handle cases where batch requests would be needed? I'm pretty much a newbie with Ethereum, and as far as I have learned so far, there currently doesn't seem to be a way to return a full array of structs from the blockchain, so the only option is to return an array of addresses, iterate through each, and make N subsequent requests to the blockchain to get more data. This works fine if you're running a full node, since each request is local, but nowadays most DApps use things like MetaMask, which makes calls to Infura, so each request becomes an HTTP request. (This was why I was happy to discover batch requests in the spec.) It feels really unscalable. How do most people deal with programming DApps without using batch requests? Depending on existing solutions I may not need batch requests after all, but I would appreciate any help. Thank you!
Heya! So, I had not considered batch requests, but it is something I could transparently add to a JsonRpcBatchProvider, and you wouldn't even need to know it was batching calls. It would just defer requests for 10ms and create batch calls up to some max size within that window. The only method that would need to be overridden is send.

That said, the goal is to keep things simple and have a straightforward provider interface that is kind of a lowest common denominator. For example, if we had a special batch request method, things like Etherscan (which does not have a concept of batching) would need to have something magic done.

Have you found you require batching? For performance? There are better ways, I think, to solve this. Creating multiple connections to multiple providers, for example.

The v2 of the ABI is supported in ethers.js, but it is considered an experimental feature in Solidity, so you have to enable the pragma to use it, and it should not be used for production. This allows you to return structs, and arrays of structs.
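A minimal, hypothetical sketch of that deferred-batching idea (this is not the eventual ethers implementation; the class name and fields are made up, and it assumes the ethers v5 JsonRpcProvider API plus ethers.utils.fetchJson):

```js
// Hypothetical sketch only: defer calls for ~10ms and flush them as a single
// JSON-RPC 2.0 batch. NaiveBatchProvider, _pendingBatch, etc. are invented names.
const { ethers } = require("ethers");

class NaiveBatchProvider extends ethers.providers.JsonRpcProvider {
  constructor(url) {
    super(url);
    this._pendingBatch = [];
    this._flushTimer = null;
    this._idCounter = 1;
  }

  send(method, params) {
    return new Promise((resolve, reject) => {
      this._pendingBatch.push({
        request: { jsonrpc: "2.0", id: this._idCounter++, method, params },
        resolve,
        reject,
      });

      if (this._flushTimer) { return; }

      // Everything queued within this ~10ms window goes out as one HTTP request.
      this._flushTimer = setTimeout(() => {
        const batch = this._pendingBatch;
        this._pendingBatch = [];
        this._flushTimer = null;

        const body = JSON.stringify(batch.map((entry) => entry.request));
        ethers.utils.fetchJson(this.connection, body).then((responses) => {
          // Per the JSON-RPC 2.0 spec, responses may come back in any order,
          // so match them to requests by id.
          const byId = new Map(responses.map((r) => [r.id, r]));
          batch.forEach((entry) => {
            const response = byId.get(entry.request.id);
            if (!response) {
              entry.reject(new Error("missing response"));
            } else if (response.error) {
              entry.reject(new Error(response.error.message));
            } else {
              entry.resolve(response.result);
            }
          });
        }, (error) => {
          batch.forEach((entry) => entry.reject(error));
        });
      }, 10);
    });
  }
}
```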
This is a really great idea! Most people who would want a batch request don't care about what goes on underneath but just want to achieve something like what I described above, so I agree it makes sense to abstract this out.
This may sound clueless and I apologize in advance, but aren't batch requests part of Ethereum's JSON-RPC standard? https://github.com/ethereum/wiki/wiki/JSON-RPC#json-rpc-support I guess I'm kind of confused about why a batch request would be considered a special method. Could you explain?
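For reference, the batch form that spec defines is simply an array of ordinary request objects sent in one HTTP POST; a sketch with illustrative method names and ids:

```js
// One HTTP POST body carrying two JSON-RPC calls (illustrative values only).
const batchRequestBody = [
  { jsonrpc: "2.0", id: 1, method: "eth_blockNumber", params: [] },
  { jsonrpc: "2.0", id: 2, method: "eth_gasPrice", params: [] },
];
// The node answers with an array of response objects, each tagged with the
// id of the request it belongs to (the order is not guaranteed).
```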
Could you elaborate on this? So, for example, on the Solidity end I have
Yes, in fact I've come across it in another issue thread on this repo and am looking forward to it! It's just that since it's an experimental feature, I would like to be a bit conservative until it becomes official, which was why I was looking into batch requests in the first place. BTW, thank you so much for the library!
Batch is actually part of the JSON-RPC 2.0 specification (not specific to just Ethereum). However, not all Ethereum providers are JSON-RPC, and I want to keep the idea of a Provider as general as possible. A person should be able to write code against any Provider, and it should just work. You can imagine if a person wrote a library that used a batching API on a provider, which worked fine against their local Ethereum node, but in the wild failed against an EtherscanProvider. So, it's about keeping the API abstract. Batching is not a thing a normal developer should even need to know or think about.

So, for now, depending on the rate at which you are querying, there are two options. If the volume is somewhat low and you don't mind fetching all results simultaneously:

```js
var reqs = [];
for (var j = 0; j < 100; j++) {
    reqs.push(contract.someFunction(params));
}
Promise.all(reqs).then(function(results) {
    console.log(results);
});
```

Or, if you want something more serial (rather than parallel) for a larger number of requests:

```js
var results = [];
var seq = Promise.resolve();
for (var j = 0; j < 100; j++) {
    seq = seq.then(function() {
        return contract.someFunction(params).then(function(result) {
            results.push(result);
        });
    });
}
seq.then(function() {
    console.log(results);
});
```

There may be typos in the above examples, but they give you the general idea of how you can simulate the fetching you are trying to do using the built-in Promise paradigms.

Definitely, experimental features are quite experimental. For now, you can just have a standard getter on your contract that takes an index and returns that result. For very complex applications, I expect apps will likely have centralized databases to facilitate indexing, and then the authoritative result can be verified against the blockchain.

Does that all make sense? I haven't had enough caffeine today, and it was an early morning. :)
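A common middle ground between those two patterns (not from the thread, just a frequently used variant, reusing the same hypothetical contract, someFunction, and params) is to cap how many calls are in flight at once:

```js
// Hypothetical helper: run the tasks with at most `limit` calls in flight.
function runWithLimit(tasks, limit) {
    var results = [];
    var index = 0;

    function next() {
        if (index >= tasks.length) { return Promise.resolve(); }
        var i = index++;
        return tasks[i]().then(function(result) {
            results[i] = result;   // keep results in the original order
            return next();
        });
    }

    var workers = [];
    for (var w = 0; w < limit; w++) { workers.push(next()); }
    return Promise.all(workers).then(function() { return results; });
}

// Usage: wrap each call in a function so it only fires when a worker is free.
var tasks = [];
for (var j = 0; j < 100; j++) {
    tasks.push(function() { return contract.someFunction(params); });
}
runWithLimit(tasks, 10).then(function(results) {
    console.log(results);
});
```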
Yup, totally makes sense, thanks for the detailed answer! 👍
I wanted to return to this for a very specific use case. I have to figure out a way to get 25k transactions into the transaction pool as fast as possible over the JSON-RPC interface. Even with concurrency, a single RPC call will take a few milliseconds (take 10 for easy calculation), which would limit the speed at which I can get transactions into the chain to 100/second (1000ms / 10ms). While I can spread out my infrastructure to, e.g., 2 nodes taking in transactions (they do not scale linearly, but even if they did), I would be at 200. It is not feasible to put up 250 nodes just to take this load.

While I have not been able to test this hypothesis, since we are about halfway through our migration to ethers, batching would allow me to do e.g. 100 calls in 1 RPC request, and 100 RPC requests in 1 second, so I'm already at 10,000 tx/second into the transaction pool. My nodes have been tuned to handle these amounts in the blocks, so it would work out great. As I said, this is still theory, since I do not know how the clients handle incoming batch requests, but it would at least save me the network IO.
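Spelled out, the back-of-the-envelope math in that argument (the 10 ms round-trip cost is the commenter's own assumption):

```js
var msPerRpcCall = 10;                        // assumed cost of one JSON-RPC round trip

// One transaction per call, done one after another:
var txPerSecondUnbatched = 1000 / msPerRpcCall;                  // 100 tx/s

// 100 transactions per batched call, same round-trip cost:
var txPerBatch = 100;
var txPerSecondBatched = (1000 / msPerRpcCall) * txPerBatch;     // 10,000 tx/s
```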
It is certainly easy to cobble together a very quick and dirty one from the existing JSON-RPC provider, if you need this sooner than I have time for it. But let me know the outcome of your tests first, since there are a lot of other features I would like to prioritize as more frequently used.
Did you ever look into the numbers? I recently learned that batch requests are generally slower, because they are performed serially, so batching will likely reduce the speed of your script, by quite a lot actually. A colleague has been playing with it.

I think I'm going to close this issue, since the WebSocketProvider will likely be better suited to your needs, or better yet, Nick Johnson's GraphQL support that he has added to Geth (100x to 1000x performance improvements for receipt retrieval, for example). See: ethereum/go-ethereum#17903

To discuss further, please feel free to re-open. Thanks! :)
I've added JsonRpcBatchProvider.
Hi, I can't see documentation for this anywhere. Can you provide a brief example? Thanks. The only code I see from the source code is

In my case I want to make about 1000 contract calls (read-only), and somehow our ETH node (Geth) would hang if multiple requests are sent at the same time.
The send method aggregates requests and sends them all every 10 milliseconds. I apologize for my bad English :)
@datvm Yes, @spb-web is correct. You just use it like a normal ethers provider and it will aggregate the calls for you. There isn't currently a way to force a batch to begin or end; it just aggregates calls within the same event loop (with some fuzzing of 10ms). Any call with a dependency will be postponed into the first event loop after all its dependencies have responded.

I'm curious whether 1 million calls would work, though; that might be a tall order. I have been planning to add maximum batch size and count properties to it. If your workload is too big for it, I can add those sooner rather than later.

But long story short, there is nothing special you need to do. :)
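A small usage sketch of that behaviour (the URL, address, ABI, and method name below are placeholders): because all the reads are issued in the same event loop, the batch provider aggregates them into JSON-RPC batch requests behind the scenes.

```js
import { ethers } from 'ethers';

// Placeholders for illustration only.
const url = 'http://localhost:8545';
const contractAddress = '0x0000000000000000000000000000000000000000';
const abi = ['function someFunction(uint256) view returns (uint256)'];

const provider = new ethers.providers.JsonRpcBatchProvider(url);
const contract = new ethers.Contract(contractAddress, abi, provider);

async function readAll() {
  // All of these calls are created in the same event loop, so the provider
  // groups them into batch requests rather than 1000 individual HTTP requests.
  const calls = [];
  for (let i = 0; i < 1000; i++) {
    calls.push(contract.someFunction(i));
  }
  return Promise.all(calls);
}
```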
I see this in the code, but it's not exported in the NPM package? Do you need to pull from source to use it?
@austinbv It's in the package, at least in the latest version.
Yeah, it's available in the package; please see the following examples. You can use whichever fits your style ;)

```js
import { ethers } from 'ethers';
const provider = new ethers.providers.JsonRpcBatchProvider('url');
```

```js
import { JsonRpcBatchProvider } from '@ethersproject/providers';
const provider = new JsonRpcBatchProvider('url');
```
Hi, thanks for the library, it really is a great, simple alternative to Web3.
I was trying to port an existing web3-based app to this library and couldn't find a batch request either in the documentation or in the code.
Does this library have a batch request functionality?