From 315ad4f8e290a644d9e3d4333c68421ec30a1ebd Mon Sep 17 00:00:00 2001
From: Sergio Pedri
Date: Fri, 1 Nov 2019 17:57:11 +0100
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index c6a625b..a838816 100644
--- a/README.md
+++ b/README.md
@@ -3,8 +3,8 @@
 **BinaryPack** is a binary serialization library inspired by `MessagePack`, but even faster, more efficient and producing smaller files. The goal of this project is to be able to use **BinaryPack** as a drop-in replacement for JSON, XML, `MessagePack` or `BinaryFormatter` serialization, when the serialized models don't need to be shared with other applications or with web services. **BinaryPack** is built to be as fast and memory efficient as possible: it uses virtually no memory allocations, and the serialized data is packed to take up as little space as possible.
 
 Whenever you're using either JSON, `MessagePack`, XML or some other format to cache data for your apps, to send data between clients or to save data that is not critical, you can try using **BinaryPack** over your previous serialization library - it will provide the same basic functionalities to serialize and deserialize models, but with much higher performance and less memory usage.
-### Alright, it's "faster and more efficient", but how much?
 ![BinaryPack-benchmark](https://i.imgur.com/WJYuBXK.png)
+
 This benchmark was performed with the [JsonResponseModel](https://github.com/Sergio0694/BinaryPack/blob/master/unit/BinaryPack.Models/JsonResponseModel.cs) class available in the repository, which contains a number of `string`, `int`, `double` and `DateTime` properties, as well as a collection of other nested models, representing an example of a JSON response from a REST API. This README also includes a number of benchmarks that were performed on a number of different models. The benchmark code and all the models used can be found in this repository as well. To summarize:
 - **BinaryPack** was consistently the fastest library, both during serialization and deserialization. The performance difference ranged from **7.6x** faster than `Newtonsoft.Json`, **7x** than [`Utf8Json`](https://github.com/neuecc/Utf8Json) and **1.9x** than `MessagePack` when serializing a small JSON response, to **245x** faster than `Newtonsoft.Json`, **129x** than `Utf8Json` and **3.9x** than `MessagePack` when dealing with mostly binary data (eg. a model with a large `float[]` array).
 - The memory usage was on average on par or better than `Utf8Json`, except when deserializing mostly binary data, in which case **BinaryPack** used **1/100** the memory of `Utf8Json`, and **1/2** that of `MessagePack`. **BinaryPack** also almost always resulted in the lowest number of GC collections during serialization and deserialization of models.
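
The README text in this patch describes **BinaryPack** as a drop-in replacement that only needs basic serialize/deserialize calls on plain model classes. A minimal usage sketch of what that looks like, assuming the library's entry points are `BinaryConverter.Serialize` and `BinaryConverter.Deserialize<T>`; the `ExampleModel` class below is a hypothetical placeholder, not a type from the repository:

```csharp
using System;
using BinaryPack;

// Hypothetical model used only for illustration: a plain class with public
// properties and a parameterless constructor, the kind of POCO the README
// refers to as a "model".
public class ExampleModel
{
    public string Name { get; set; }
    public int Count { get; set; }
    public DateTime Timestamp { get; set; }
}

public static class Demo
{
    public static void Run()
    {
        var model = new ExampleModel
        {
            Name = "sample",
            Count = 42,
            Timestamp = DateTime.UtcNow
        };

        // Serialize the model to a compact binary buffer (assumed API surface)
        byte[] data = BinaryConverter.Serialize(model);

        // Deserialize the buffer back into a new model instance (assumed API surface)
        ExampleModel roundTripped = BinaryConverter.Deserialize<ExampleModel>(data);

        Console.WriteLine($"{roundTripped.Name}, {roundTripped.Count}, {roundTripped.Timestamp}");
    }
}
```

If the actual method names differ, the shape of the calls should still match the drop-in-replacement claim made in the README: one call to produce a `byte[]` from a model, and one call to rebuild the model from that buffer.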