# json-ext

[![NPM version](https://img.shields.io/npm/v/@discoveryjs/json-ext.svg)](https://www.npmjs.com/package/@discoveryjs/json-ext)
[![Build Status](https://travis-ci.org/discoveryjs/json-ext.svg?branch=master)](https://travis-ci.org/discoveryjs/json-ext)
[![Coverage Status](https://coveralls.io/repos/github/discoveryjs/json-ext/badge.svg?branch=master)](https://coveralls.io/github/discoveryjs/json-ext?)
[![NPM Downloads](https://img.shields.io/npm/dm/@discoveryjs/json-ext.svg)](https://www.npmjs.com/package/@discoveryjs/json-ext)

A set of utilities that extend the use of JSON, designed to be fast and memory efficient.

Features:

- [x] `parseChunked()` – Parse JSON that comes in chunks (e.g. from a FS readable stream or a fetch response stream)
- [x] `stringifyStream()` – Stringify a value into a stream (Node.js)
- [x] `stringifyInfo()` – Get the estimated size and other facts about `JSON.stringify()` output without converting the value to a string
- [ ] **TBD** Support for circular references
- [ ] **TBD** Binary representation [branch](https://github.com/discoveryjs/json-ext/tree/binary)
- [ ] **TBD** WHATWG [Streams](https://streams.spec.whatwg.org/) support

## Install

```bash
npm install @discoveryjs/json-ext
```

## API

- [parseChunked(chunkEmitter)](#parsechunkedchunkemitter)
- [stringifyStream(value[, replacer[, space]])](#stringifystreamvalue-replacer-space)
- [stringifyInfo(value[, replacer[, space[, options]]])](#stringifyinfovalue-replacer-space-options)
    - [Options](#options)
        - [async](#async)
        - [continueOnCircular](#continueoncircular)
- [version](#version)

### parseChunked(chunkEmitter)

Works the same as [`JSON.parse()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse), but takes a `chunkEmitter` instead of a string and returns a [Promise](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise).

> NOTE: The `reviver` parameter is not supported yet, but will be added in future releases.
> NOTE: WHATWG streams aren't supported yet.

When to use:
- The main thread must not freeze while parsing a big JSON payload; chunked parsing lets the work be spread out over time
- A huge JSON payload needs to be parsed (e.g. >500MB on Node.js)
- Memory pressure needs to be reduced. `JSON.parse()` requires the entire JSON string before it can start parsing. With `parseChunked()` you can parse JSON as its first bytes arrive, which avoids holding a huge string in memory at a single point in time and the garbage collection that follows.

[Benchmark](https://github.com/discoveryjs/json-ext/tree/master/benchmarks#parse-chunked)

Usage:

```js
const { parseChunked } = require('@discoveryjs/json-ext');

// as a regular Promise
parseChunked(chunkEmitter)
    .then(data => {
        /* data is parsed JSON */
    });

// using await (keep in mind that not every runtime supports top-level await)
const data = await parseChunked(chunkEmitter);
```

The `chunkEmitter` parameter can be:
- A [`ReadableStream`](https://nodejs.org/dist/latest-v14.x/docs/api/stream.html#stream_readable_streams) (Node.js only)
```js
const fs = require('fs');
const { parseChunked } = require('@discoveryjs/json-ext');

parseChunked(fs.createReadStream('path/to/file.json'))
```
- A generator, async generator or function that returns an iterable of chunks. A chunk might be a `string`, `Uint8Array` or `Buffer` (Node.js only):
```js
const { parseChunked } = require('@discoveryjs/json-ext');
const encoder = new TextEncoder();

// generator
parseChunked(function*() {
    yield '{ "hello":';
    yield Buffer.from(' "wor'); // Node.js only
    yield encoder.encode('ld" }'); // returns Uint8Array(5) [ 108, 100, 34, 32, 125 ]
});

// async generator
parseChunked(async function*() {
    for await (const chunk of someAsyncSource) {
        yield chunk;
    }
});

// function that returns iterable
parseChunked(() => ['{ "hello":', ' "world"}'])
```

Using with [fetch()](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API):

```js
async function loadData(url) {
    const response = await fetch(url);
    const reader = response.body.getReader();

    return parseChunked(async function*() {
        while (true) {
            const { done, value } = await reader.read();

            if (done) {
                break;
            }

            yield value;
        }
    });
}

loadData('https://example.com/data.json')
    .then(data => {
        /* data is parsed JSON */
    })
```

### stringifyStream(value[, replacer[, space]])

Works the same as [`JSON.stringify()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify), but returns an instance of [`ReadableStream`](https://nodejs.org/dist/latest-v14.x/docs/api/stream.html#stream_readable_streams) instead of a string.

> NOTE: WHATWG Streams aren't supported yet, so the function is available for Node.js only for now.

Departures from `JSON.stringify()`:
- Outputs `null` when `JSON.stringify()` would return `undefined`, since streams may not emit `undefined` (see the sketch after this list)
- A promise is resolved and its resolved value is stringified as a regular value
- A stream in non-object mode is piped to the output as is
- A stream in object mode is piped to the output as an array of objects
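
For example, the first point means that stringifying a value for which `JSON.stringify()` would return `undefined` still produces output. A minimal sketch of that behavior, collecting chunks with plain event handlers:

```js
const { stringifyStream } = require('@discoveryjs/json-ext');

// JSON.stringify(undefined) returns undefined, but a Readable stream
// can't emit undefined, so the stream produces the string "null" instead
let output = '';
stringifyStream(undefined)
    .on('data', chunk => output += chunk)
    .on('end', () => console.log(output)); // expected output: null
```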

When to use:
- A huge JSON payload needs to be generated (e.g. >500MB on Node.js)
- Memory pressure needs to be reduced. `JSON.stringify()` has to generate the entire JSON string before it can be sent or written anywhere. With `stringifyStream()` you can start sending the result as soon as its first bytes appear, which avoids holding a huge string in memory at a single point in time.
- The object being serialized contains Promises or Streams (see Usage for examples)

[Benchmark](https://github.com/discoveryjs/json-ext/tree/master/benchmarks#stream-stringifying)

Usage:

```js
const { stringifyStream } = require('@discoveryjs/json-ext');

// handle events
stringifyStream(data)
    .on('data', chunk => console.log(chunk))
    .on('error', error => console.error(error))
    .on('finish', () => console.log('DONE!'));

// pipe into a stream
stringifyStream(data)
    .pipe(writableStream);
```

Using a Promise or ReadableStream inside the object being serialized:

```js
const fs = require('fs');
const { stringifyStream } = require('@discoveryjs/json-ext');

// output will be
// {"name":"example","willSerializeResolvedValue":42,"fromFile":[1, 2, 3],"at":{"any":{"level":"promise!"}}}
stringifyStream({
    name: 'example',
    willSerializeResolvedValue: Promise.resolve(42),
    fromFile: fs.createReadStream('path/to/file.json'), // suppose the file content is "[1, 2, 3]"; it is inserted as is
    at: {
        any: {
            level: new Promise(resolve => setTimeout(() => resolve('promise!'), 100))
        }
    }
})

// when several async requests are used in an object, it's preferable to put
// the fastest requests first: values are stringified in key order, so a faster
// value doesn't have to wait behind a slower one that appears earlier in the output
stringifyStream({
    foo: fetch('http://example.com/request_takes_2s').then(req => req.json()),
    bar: fetch('http://example.com/request_takes_5s').then(req => req.json())
});
```

Using with [`WritableStream`](https://nodejs.org/dist/latest-v14.x/docs/api/stream.html#stream_writable_streams) (Node.js only):

```js
const fs = require('fs');
const { stringifyStream } = require('@discoveryjs/json-ext');

// pipe into a console
stringifyStream(data)
    .pipe(process.stdout);

// pipe into a file
stringifyStream(data)
    .pipe(fs.createWriteStream('path/to/file.json'));

// wrapping into a Promise
new Promise((resolve, reject) => {
    stringifyStream(data)
        .on('error', reject)
        .pipe(stream)
        .on('error', reject)
        .on('finish', resolve);
});
```
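
On newer Node.js versions (15+, an assumption beyond the Node.js 14 docs linked above), the promisified `stream.pipeline()` is an alternative to wiring up events manually, since it propagates an error from any stream in the chain. A sketch using a hypothetical `writeJsonFile()` helper:

```js
const fs = require('fs');
const { pipeline } = require('stream/promises'); // Node.js 15+
const { stringifyStream } = require('@discoveryjs/json-ext');

async function writeJsonFile(data, path) {
    // rejects if either the stringify stream or the file stream errors
    await pipeline(
        stringifyStream(data),
        fs.createWriteStream(path)
    );
}
```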

### stringifyInfo(value[, replacer[, space[, options]]])

The `value`, `replacer` and `space` arguments are the same as for `JSON.stringify()`.

The result is an object:

```js
{
    minLength: Number, // minimal length in bytes when the value is stringified
    circular: [...], // list of circular references
    duplicate: [...], // list of objects that occur more than once
    async: [...] // list of async values, i.e. promises and streams
}
```

Example:

```js
const { stringifyInfo } = require('@discoveryjs/json-ext');

console.log(
    stringifyInfo({ test: true }).minLength
);
// > 13
// that equals '{"test":true}'.length
```
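
Since `minLength` is a lower-bound estimate of the stringified size, one possible use is rough progress reporting while streaming. This is a sketch, not part of the library; `reportProgress()` is a hypothetical callback:

```js
const { stringifyInfo, stringifyStream } = require('@discoveryjs/json-ext');

// estimate the output size up front, then compare it with the bytes
// emitted so far (reportProgress() is hypothetical)
const { minLength } = stringifyInfo(data);
let written = 0;

stringifyStream(data)
    .on('data', chunk => {
        written += chunk.length;
        reportProgress(Math.min(written / minLength, 1));
    })
    .pipe(process.stdout);
```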

#### Options

##### async

Type: `Boolean`
Default: `false`

Whether to collect async values (promises and streams).
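
For instance, a minimal sketch assuming the options object is passed as the fourth argument, as in the signature above:

```js
const { stringifyInfo } = require('@discoveryjs/json-ext');

// with { async: true } the promises and streams found in the value
// are collected into the result's `async` array
const info = stringifyInfo(
    { answer: Promise.resolve(42) },
    null, // replacer
    0,    // space
    { async: true }
);

console.log(info.async); // expected to list the promise
```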

##### continueOnCircular

Type: `Boolean`
Default: `false`

Whether to continue collecting info for a value when a circular reference is found. Setting the option to `true` allows finding all circular references.
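
A minimal sketch of the difference, using the `circular` array from the result object described above:

```js
const { stringifyInfo } = require('@discoveryjs/json-ext');

// two separate self-references in one structure
const a = { name: 'a' };
const b = { name: 'b' };
a.self = a;
a.b = b;
b.self = b;

// with the default (false) collecting stops once a circular reference is found;
// with continueOnCircular: true traversal keeps going and can report all of them
console.log(stringifyInfo(a).circular.length);
console.log(stringifyInfo(a, null, 0, { continueOnCircular: true }).circular.length);
```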

### version

The version of the library, e.g. `"0.3.1"`.
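
It can be read like any other export; a trivial sketch:

```js
const { version } = require('@discoveryjs/json-ext');

console.log(version); // e.g. "0.3.1"
```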

## License

MIT