jsonbig doesn't seem to be as performant #114
Comments
Thx for letting us know. It would be great if we could get a smoking gun on this: is it really size (of the whole set, or of one of the fields?), or rather depth/complexity, or maybe one specific subtype of data that is causing this? Having a sample of JSON and/or a test case that fails would be most helpful. Is there a sample you can share? I'm also quite sure @sidorares will like to know about this to fix it in https://github.com/sidorares/json-bigint
Thanks @ryanwilliamquinn. Like @marc-portier said, it would be really helpful to have an example with which JSONbig.parse hangs. To be honest I haven't tried with a 50MB sample; maybe size alone is enough to reproduce it. I also need to benchmark JSON vs JSONbig: if the performance is not similar (and this problem is not a bug), we should allow the user to switch between the two and document the problem.
@ryanwilliamquinn can you post example data somewhere? (link to Dropbox etc.) Feel free to open https://github.com/sidorares/json-bigint/issues/new
@lbdremy it's definitely slower. Built-in
2322420 bytes JSON:
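To compare the two parsers, a minimal timing harness along these lines would do. This is a sketch, not the benchmark run above: the payload is synthetic (its shape loosely mimics a Solr response), and the json-bigint line is commented out since it is a third-party dependency.

```javascript
// Time a parse function on a given text and report milliseconds.
function timeParse(label, parseFn, text) {
  const start = process.hrtime.bigint();
  const result = parseFn(text);
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${ms.toFixed(1)}ms`);
  return result;
}

// Build a synthetic JSON document of a couple of megabytes
// (field names like _version_ mimic Solr, values are made up).
const docs = [];
for (let i = 0; i < 20000; i++) {
  docs.push({ id: i, _version_: '1478196206189813760', text: 'x'.repeat(60) });
}
const payload = JSON.stringify({ response: { numFound: docs.length, docs } });

const native = timeParse('JSON.parse', JSON.parse, payload);
// Swap in the big-number parser to compare:
// const big = timeParse('JSONbig.parse', require('json-bigint').parse, payload);
```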
I'm not sure I can post the data; I will talk to my boss about it and get back to you. I tried manually parsing the data in the Node REPL: I downloaded the JSON data from Solr, read it from the file, and tried both parsers. JSON.parse works fine, in about 2 seconds. The json-bigint parser ends up throwing an error after a second, saying that the data has no method 'charAt'.
I talked to my boss and unfortunately we can't distribute the data. I can run any script you want on the data though. |
Can you make sure the input is a string? (It looks like it's a Buffer.) I might add an extra
readFile returns a Buffer unless you pass an encoding as an extra argument.
Oops, wrong click, adding back the message... This is the code where the response is deserialized: https://github.com/lbdremy/solr-node-client/blob/master/lib/solr.js#L715-L743
Ah got it, sorry about that. Here is the latest test I ran,
and here is the response:
json.parse: 1047ms
FATAL ERROR: JS Allocation failed - process out of memory
The error occurred after about 10 minutes. I could try with a smaller dataset if that would be helpful.
3.5M data:
17M data:
35M data:
52M:
71M:
I think I need to rewrite it to have a first pass producing valid JSON with big numbers replaced by strings plus some sentinel value, and a second pass of native JSON.parse plus a reviver.
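The two-pass idea described above can be sketched roughly like this. The sentinel prefix, the regex, and the BigInt conversion are all illustrative choices, not json-bigint's actual implementation (the naive regex, for instance, does not guard against digit runs inside string values):

```javascript
const SENTINEL = '@bigint:';

function parseWithBigInts(text) {
  // Pass 1: quote any integer literal of 16+ digits (beyond Number's
  // safe integer range), tagging it with the sentinel prefix.
  const prepared = text.replace(
    /:\s*(-?\d{16,})(\s*[,}\]])/g,
    (_, digits, tail) => `: "${SENTINEL}${digits}"${tail}`
  );
  // Pass 2: native JSON.parse, with a reviver that restores tagged
  // values, here as BigInt.
  return JSON.parse(prepared, (key, value) =>
    typeof value === 'string' && value.startsWith(SENTINEL)
      ? BigInt(value.slice(SENTINEL.length))
      : value
  );
}

const doc = parseWithBigInts('{"_version_": 1478196206189813760, "n": 5}');
console.log(doc._version_); // 1478196206189813760n
console.log(doc.n);         // 5
```

Because pass 2 is the native parser, the bulk of the work runs at native speed, which is why this approach should get close to JSON.parse performance.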
Ok thanks @sidorares.
The first thing to do is to allow the user to use the native JSON.parse. Also @ryanwilliamquinn, do you need support for big numbers in your response, or
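A client-level switch like the one proposed could look roughly like this. The option name and structure are hypothetical, not solr-node-client's actual API: default to the fast native parser, and opt into a big-number-aware parser only when asked.

```javascript
// Hypothetical client factory: `bigint` option chooses the parser.
function createClient(options = {}) {
  const parse = options.bigint
    ? (text) => {
        // In a real client this would be require('json-bigint').parse;
        // this stand-in just records that the slow path was chosen.
        const data = JSON.parse(text);
        data.__parser = 'json-bigint';
        return data;
      }
    : JSON.parse; // fast native path, the default

  return {
    deserialize(body) {
      return parse(body);
    },
  };
}

const fast = createClient();
console.log(fast.deserialize('{"numFound": 3}').numFound); // 3
const big = createClient({ bigint: true });
console.log(big.deserialize('{"numFound": 3}').__parser);  // json-bigint
```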
@lbdremy I immediately (2-3 weeks ago) flagged the bignumber issue in Solr land as https://issues.apache.org/jira/browse/SOLR-6364. There is already a conceptual solution there: introduce a param (&json.long) with a value ('long' = default, or 'string' probably) to return these bigints in the JSON as strings. But as far as I can see they have no active plan to implement and release such a thing (even though their own JS-based Solr UI is affected by this as well). Obviously, providing a patch might enhance their enthusiasm. On short notice I think a client-options switch to choose between classic JSON and json-bigint is our safest bet.
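As proposed in SOLR-6364, the parameter would change the response roughly as follows. This is a sketch of the proposal, not actual Solr behavior (the parameter was not implemented at the time of this thread, and the core name and values are illustrative):

```
# default (proposed 'long'): big ids come back as bare JSON numbers,
# which lose precision in JavaScript's Number type
GET /solr/mycore/select?q=*:*&wt=json
{"response": {"docs": [{"_version_": 1478196206189813760}]}}

# proposed 'string': the same values quoted, safe for plain JSON.parse
GET /solr/mycore/select?q=*:*&wt=json&json.long=string
{"response": {"docs": [{"_version_": "1478196206189813760"}]}}
```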
@lbdremy I do not need support for big numbers; JSON.parse is fine for me.
Alright, considering the difference of performance between
@lbdremy I'll let you know when the two-pass version of JSONbig is available (it should have nearly native performance).
Ok thanks @sidorares |
@lbdremy 100% ok with the approach, makes sense. And also: if #90 can slide into 0.4.0 that would be great, but if it needs more work we should not keep others waiting... Thx.
Ok, I think we are done with this,
I'm parsing large JSON responses (about 50MB). It works fine in 0.2.9, but in 0.3 it hangs on JSONbig.parse(text). If I change that back to JSON.parse it works again. Maybe the answer is for me to get less data per batch; just thought I would let you know.