Addition to benchmarks #6
Hey, first of all, good work on this module here!
I was reviewing your module, as I was interested in using it to speed up communication within microservice networks, and I also compared it to major solutions like msgpack. While looking at your benchmarks I quickly noticed that they don't take strings into account yet. I slightly modified the benchmark to also process some strings, and these are the results:
Encoding came out 17% slower than JSON.stringify, but decoding is still fast. Still pretty good, though.
Just wanted to share that; I'll do one-to-one comparisons next.
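For illustration, here's roughly the kind of string-heavy variant I mean (msgpack-lite, the payload shape, and the iteration count are just placeholders; the module under test would get an extra `bench()` call with its own encode/decode functions):

```js
// Sketch of the modified benchmark: same measurement loop, but with a
// payload that is mostly strings. msgpack-lite stands in for "msgpack".
const msgpack = require('msgpack-lite');

const payload = {
  id: 12345,
  name: 'some-service-name',
  tags: ['alpha', 'beta', 'gamma'],
  description: 'a longer free-text field that pads the payload with string data. '.repeat(10),
  request: { path: '/api/v1/resource', method: 'GET' }
};

function bench(label, encode, decode) {
  const runs = 100000;
  let encoded;

  let start = process.hrtime.bigint();
  for (let i = 0; i < runs; i++) encoded = encode(payload);
  const encodeMs = Number(process.hrtime.bigint() - start) / 1e6;

  start = process.hrtime.bigint();
  for (let i = 0; i < runs; i++) decode(encoded);
  const decodeMs = Number(process.hrtime.bigint() - start) / 1e6;

  console.log(`${label} encode: ${encodeMs.toFixed(1)} ms, decode: ${decodeMs.toFixed(1)} ms`);
}

bench('JSON   ', JSON.stringify, JSON.parse);
bench('msgpack', msgpack.encode, msgpack.decode);
```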
Here are the timings:
Benchmark without strings:
Benchmark with strings:
Run on an Intel(R) Core(TM) i7-7820HK CPU @ 2.90GHz (real clock while benchmarking: ~3.4GHz).
So msgpack in both of the major libraries is significantly slower. Your module is a bit slower than JSON.stringify on the encode side.
I also ran another test with a bit more real-world data and smaller payloads, which gives:
It seems like msgpack is a bad choice in all cases, unless wire size is the only thing that matters. Your module still delivers good performance here, this time on the encoding side as well. Surprisingly enough, this payload mainly consists of strings, and yet it performs better again.
All in all, it seems like this module could be a good choice to improve performance, on the wire as well as on the encoding/decoding side.
Facts on the median:
For me, dealing with 177 kB of JSON + Buffers, I saw that it was slower than plain JSON+blob for both encode and decode. It did reduce the JSON size a bit, though.
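By "plain JSON+blob" I mean roughly this kind of baseline: the Buffers get pulled out of the document, the rest goes through JSON.stringify, and the raw Buffers are shipped alongside. A simplified sketch:

```js
// Sketch of the "plain JSON+blob" baseline: Buffers are pulled out of the
// document, the rest is JSON-encoded, and the raw Buffers travel next to
// the JSON string.
function encodeJsonPlusBlob(doc) {
  const blobs = [];
  const json = JSON.stringify(doc, function (key, value) {
    const raw = this[key]; // original value, before Buffer#toJSON kicks in
    if (Buffer.isBuffer(raw)) {
      blobs.push(raw);
      return { $blob: blobs.length - 1 }; // placeholder pointing at the blob
    }
    return value;
  });
  return { json, blobs };
}

function decodeJsonPlusBlob({ json, blobs }) {
  return JSON.parse(json, (key, value) =>
    value && typeof value === 'object' && '$blob' in value ? blobs[value.$blob] : value
  );
}
```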
@jdalton Seems to depend; I've had no problems yet in the simulations. Do you have an example of how your payloads were structured?
Arrays of objects and a buffer property.
Ok, I had these in my tests too, with documents as big as 4 MiB, so it could be a construction issue or something. How deep do your objects go? How big are your buffers? It would be good to know this to see where the limitations are, since msgpack seems to be a really bad option compared to V8's JSON encoding/decoding capabilities.
Just one level deep.
Do you have an example object? I would then write a generator function for it and run some tests against it.
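Roughly the kind of generator I have in mind, based on what you described so far (an array of flat objects with a buffer property); the field names, item count, and buffer size are just placeholders to be tuned towards the real payloads:

```js
const crypto = require('crypto');

// Generates an array of flat (one level deep) objects, each carrying a
// buffer property, mimicking the payload shape described above. Counts and
// sizes are guesses and would be tuned to match the real data
// (e.g. towards the ~177 kB total mentioned earlier).
function generatePayload(items = 100, bufferBytes = 1024) {
  const out = [];
  for (let i = 0; i < items; i++) {
    out.push({
      id: i,
      name: `item-${i}`,
      createdAt: Date.now(),
      active: i % 2 === 0,
      data: crypto.randomBytes(bufferBytes) // the buffer property
    });
  }
  return out;
}

const payload = generatePayload();
```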
And thanks for sharing your experience here, @jdalton :)