Other: Revamped benchmark, now also covers Google's JS implementation
dcodeIO committed Apr 7, 2017
1 parent 45356be commit f88dad0
Showing 17 changed files with 1,981 additions and 630 deletions.
73 changes: 32 additions & 41 deletions README.md
@@ -121,7 +121,7 @@ In a nutshell, the wire format writer understands the following types:

| Field type | Expected JS type (create, encode) | Conversion (fromObject)
|------------|-----------------------------------|------------------------
- | s-/u-/int32<br />s-/fixed32 | `number` (32 bit integer) | `value | 0` if signed<br /> `value >>> 0` if unsigned
+ | s-/u-/int32<br />s-/fixed32 | `number` (32 bit integer) | `value | 0` if signed<br />`value >>> 0` if unsigned
| s-/u-/int64<br />s-/fixed64 | `Long`-like (optimal)<br />`number` (53 bit integer) | `Long.fromValue(value)` with long.js<br />`parseInt(value, 10)` otherwise
| float<br />double | `number` | `Number(value)`
| bool | `boolean` | `Boolean(value)`
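
For illustration, a minimal sketch of the conversions listed above in plain JavaScript (the variable names are made up for this example, and long.js is assumed to be installed for the 64 bit case):

```js
var Long = require("long"); // only relevant for 64 bit fields

// s-/u-/int32, s-/fixed32
var signed32   = "-1" | 0;  // -1
var unsigned32 = -1 >>> 0;  // 4294967295

// s-/u-/int64, s-/fixed64
var exact64    = Long.fromValue("9007199254740993");  // Long-like, exact
var lossy64    = parseInt("9007199254740993", 10);    // 9007199254740992 (53 bit limit)

// float / double and bool
var dbl  = Number("1.5");   // 1.5
var flag = Boolean(1);      // true
```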
@@ -695,61 +695,52 @@ Additional documentation
Performance
-----------
- The package includes a benchmark that tries to compare performance to native JSON as far as this is possible. On an i7-2600K running node 6.9.1 it yields:
+ The package includes a benchmark that compares protobuf.js performance to native JSON (as far as this is possible) and [Google's JS implementation](https://github.com/google/protobuf/tree/master/js). On an i7-2600K running node 6.9.1 it yields:
```
benchmarking encoding performance ...

- Type.encode to buffer x 547,361 ops/sec ±0.27% (94 runs sampled)
- JSON.stringify to string x 310,848 ops/sec ±0.73% (92 runs sampled)
- JSON.stringify to buffer x 173,608 ops/sec ±1.51% (86 runs sampled)
+ protobuf.js (reflect) x 547,366 ops/sec ±1.29% (90 runs sampled)
+ protobuf.js (static) x 525,722 ops/sec ±1.17% (91 runs sampled)
+ JSON (string) x 311,180 ops/sec ±0.67% (93 runs sampled)
+ JSON (buffer) x 183,724 ops/sec ±0.69% (92 runs sampled)
+ google-protobuf x 76,337 ops/sec ±0.73% (91 runs sampled)

- Type.encode to buffer was fastest
- JSON.stringify to string was 43.5% slower
- JSON.stringify to buffer was 68.7% slower
+ protobuf.js (reflect) was fastest
+ protobuf.js (static) was 3.8% slower
+ JSON (string) was 42.8% slower
+ JSON (buffer) was 66.2% slower
+ google-protobuf was 86.0% slower

benchmarking decoding performance ...

- Type.decode from buffer x 1,294,378 ops/sec ±0.86% (90 runs sampled)
- JSON.parse from string x 291,944 ops/sec ±0.72% (92 runs sampled)
- JSON.parse from buffer x 256,325 ops/sec ±1.50% (90 runs sampled)
+ protobuf.js (reflect) x 1,401,958 ops/sec ±0.78% (93 runs sampled)
+ protobuf.js (static) x 1,391,017 ops/sec ±0.78% (90 runs sampled)
+ JSON (string) x 301,749 ops/sec ±0.88% (93 runs sampled)
+ JSON (buffer) x 268,792 ops/sec ±0.84% (90 runs sampled)
+ google-protobuf x 186,727 ops/sec ±0.81% (90 runs sampled)

- Type.decode from buffer was fastest
- JSON.parse from string was 77.4% slower
- JSON.parse from buffer was 80.3% slower
+ protobuf.js (reflect) was fastest
+ protobuf.js (static) was 0.8% slower
+ JSON (string) was 78.5% slower
+ JSON (buffer) was 80.8% slower
+ google-protobuf was 86.7% slower

benchmarking combined performance ...

- Type to/from buffer x 254,126 ops/sec ±1.13% (91 runs sampled)
- JSON to/from string x 122,896 ops/sec ±1.29% (90 runs sampled)
- JSON to/from buffer x 88,005 ops/sec ±0.87% (89 runs sampled)
+ protobuf.js (reflect) x 274,685 ops/sec ±0.99% (89 runs sampled)
+ protobuf.js (static) x 278,352 ops/sec ±1.00% (90 runs sampled)
+ JSON (string) x 129,638 ops/sec ±0.83% (91 runs sampled)
+ JSON (buffer) x 90,904 ops/sec ±0.93% (87 runs sampled)
+ google-protobuf x 43,327 ops/sec ±0.89% (90 runs sampled)

- Type to/from buffer was fastest
- JSON to/from string was 51.7% slower
- JSON to/from buffer was 65.3% slower
-
- benchmarking verifying performance ...
-
- Type.verify x 6,246,765 ops/sec ±2.00% (87 runs sampled)
-
- benchmarking message from object performance ...
-
- Type.fromObject x 2,892,973 ops/sec ±0.70% (92 runs sampled)
-
- benchmarking message to object performance ...
-
- Type.toObject x 3,601,738 ops/sec ±0.72% (93 runs sampled)
+ protobuf.js (static) was fastest
+ protobuf.js (reflect) was 1.3% slower
+ JSON (string) was 53.3% slower
+ JSON (buffer) was 67.3% slower
+ google-protobuf was 84.4% slower
```
Note that JSON is a native binding nowadays and as such is about as fast as it possibly can get. So, how can protobuf.js be faster?
* The benchmark is [somewhat flawed](https://github.com/dcodeIO/protobuf.js/blob/master/bench/index.js).
* Reader and writer interfaces configure themselves according to the environment to eliminate redundant conditionals.
* Node-specific reader and writer subclasses benefit from node's buffer binding.
* Reflection has built-in code generation that builds type-specific encoders, decoders and verifiers at runtime.
* Encoders and decoders do not implicitly call `verify` on messages to avoid unnecessary overhead where messages are already known to be valid. It's up to the user to call `verify` where necessary (see the sketch below).
* Quite a bit of V8-specific profiling accounts for everything else.
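
For example, because `encode` never verifies implicitly, it is up to application code to call `verify` before encoding data that may be invalid. A minimal sketch using the reflection API (the `.proto` file and message names are placeholders):

```js
var protobuf = require("protobufjs");

// "awesome.proto" and "awesomepackage.AwesomeMessage" are placeholder names.
protobuf.load("awesome.proto", function(err, root) {
    if (err)
        throw err;

    // Reflection builds type-specific encoders, decoders and verifiers at runtime:
    var AwesomeMessage = root.lookupType("awesomepackage.AwesomeMessage");

    var payload = { awesomeField: "hello" };

    // encode/decode do not call verify implicitly, so do it where it matters:
    var error = AwesomeMessage.verify(payload);
    if (error)
        throw Error(error);

    var buffer  = AwesomeMessage.encode(AwesomeMessage.create(payload)).finish();
    var message = AwesomeMessage.decode(buffer);
});
```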
You can also run [the benchmark](https://github.com/dcodeIO/protobuf.js/blob/master/bench/index.js) ...
47 changes: 0 additions & 47 deletions bench/alloc.js

This file was deleted.

File renamed without changes.
File renamed without changes.