feat: allow emitting partial tokens & values
juanjoDiaz committed Nov 28, 2023
1 parent 53e77c6 commit 1234524
Showing 32 changed files with 2,465 additions and 84 deletions.
29 changes: 26 additions & 3 deletions packages/node/README.md
@@ -52,6 +52,7 @@ The available options are:
  stringBufferSize: <number>, // set to 0 to disable buffering. Minimum valid value is 4.
  numberBufferSize: <number>, // set to 0 to disable buffering.
  separator: <string>, // separator between objects. For example `\n` for ND-JSON.
  emitPartialTokens: <boolean>, // whether to emit tokens while they are still being parsed.
}
```
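As a sketch of how these tokenizer options fit together (every value below is an illustrative placeholder for this sketch, not a recommendation from the library):

```javascript
// Illustrative option set for the tokenizer described above.
// All values are examples chosen for this sketch.
const tokenizerOptions = {
  stringBufferSize: 64,    // buffer string chunks up to 64 bytes before emitting
  numberBufferSize: 64,    // likewise for numbers
  separator: '\n',         // input is newline-delimited JSON
  emitPartialTokens: true, // emit string/number tokens before they complete
};
```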

@@ -82,6 +83,7 @@ The available options are:
paths: <string[]>,
  keepStack: <boolean>, // whether to keep all the properties in the stack
  separator: <string>, // separator between objects. For example `\n` for ND-JSON. If left empty or undefined, the token parser ends after parsing the first object. To parse multiple objects without any delimiter, set it to the empty string `''`.
  emitPartialValues: <boolean>, // whether to emit values while they are still being parsed.
}
```
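Whatever the transport, a consumer of partial emissions typically branches on the `partial` flag. A minimal self-contained sketch (the `formatEmission` helper is hypothetical, not part of the library; the `{ value, partial }` shape mirrors what the parser callbacks receive):

```javascript
// Hypothetical helper showing how a consumer might distinguish a partial
// emission (value seen so far) from a final, fully parsed value.
function formatEmission({ value, partial }) {
  if (partial) {
    return `(partial) ${String(value)}...`; // still being parsed
  }
  return `(done) ${JSON.stringify(value)}`; // complete value
}

console.log(formatEmission({ value: 'Once upon a mid', partial: true }));
// (partial) Once upon a mid...
console.log(formatEmission({ value: 'Once upon a midnight dreary', partial: false }));
// (done) "Once upon a midnight dreary"
```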

@@ -108,7 +110,6 @@ const tokenParser = new TokenParser();
const jsonParser = tokenizer.pipeThrough(tokenParser);
```


You can subscribe to the resulting data using the

```javascript
@@ -138,7 +139,7 @@ Imagine an endpoint that sends a large number of JSON objects one after the other

const response = await fetch('http://example.com/');
const reader = response.body.pipe(parser);
- reader.on('data', value => /* process element */)
+ reader.on('data', value => /* process element */);
```
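To make the framing concrete, ND-JSON is just JSON documents joined by a separator, and setting the parser's `separator` to `'\n'` tells it to expect that layout. A non-streaming equivalent can be sketched without the library (the `parseNdjson` helper is hypothetical and assumes the whole payload is already in memory):

```javascript
// Hypothetical, non-streaming equivalent of separator-based parsing:
// split the payload on the separator and parse each complete document.
function parseNdjson(payload, separator = '\n') {
  return payload
    .split(separator)
    .filter((line) => line.trim() !== '') // skip the trailing empty segment
    .map((line) => JSON.parse(line));
}

const docs = parseNdjson('{"id":1}\n{"id":2}\n');
console.log(docs.length); // 2
```

The streaming parser does the same job incrementally, so it never needs the whole payload in memory at once.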

### Stream-parsing a fetch request returning a JSON array
@@ -152,11 +153,33 @@ Imagine an endpoint that sends a large number of JSON objects one after the other

const response = await fetch('http://example.com/');

- const reader = response.body.pipe(parse)getReader();
+ const reader = response.body.pipe(parse).getReader();

reader.on('data', ({ value, key, parent, stack }) => /* process element */)
```

### Stream-parsing a fetch request returning a very long string, with previews of the string as it arrives

Imagine an endpoint that sends a single very long JSON string (`"Once upon a midnight <...>"`).

```js
import { JSONParser } from '@streamparser/json-node';

const parser = new JSONParser({ stringBufferSize: undefined, paths: ['$.*'], keepStack: false, emitPartialValues: true }); // emitPartialValues is needed for the partial flag below

const response = await fetch('http://example.com/');

const reader = response.body.pipe(parser);

reader.on('data', ({ value, key, parent, stack, partial }) => {
if (partial) {
console.log(`Parsing value: ${value}... (still parsing)`);
} else {
console.log(`Value parsed: ${value}`);
}
});
```

## License

See [LICENSE.md].
29 changes: 26 additions & 3 deletions packages/node/dist/deno/README.md
(The diff for this file is identical to the packages/node/README.md changes above.)
16 changes: 0 additions & 16 deletions packages/node/src/utils.ts

This file was deleted.
