json-stream-stringify's People

Contributors

envek, faleij, mmsqe, slavikpr, yuripetusko

json-stream-stringify's Issues

Add usage example with Koa+Koa Router

app.get('/api/users', (ctx) => {
  ctx.type = 'json'; // Required for proper handling by test frameworks and some clients
  ctx.body = new JsonStreamStringify(Users.find().stream());
});
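For reference, a fuller sketch assuming @koa/router and a Mongoose-style Users model (as in the snippet above); the router and app wiring here are illustrative:

const Koa = require('koa');
const Router = require('@koa/router'); // assumed router package
const { JsonStreamStringify } = require('json-stream-stringify');

const app = new Koa();
const router = new Router();

router.get('/api/users', (ctx) => {
  ctx.type = 'json'; // Koa cannot infer a content type from a piped stream
  ctx.body = new JsonStreamStringify(Users.find().stream()); // Users is assumed to be a Mongoose model
});

app.use(router.routes());
app.listen(3000);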

How to test streaming responses in Express.js?

Hi! Thank you very much for this library: it works like a charm!

But I can't get my Express.js routing tests to pass after switching to streaming responses. Am I missing something in my setup?

The problem

  1. My end-to-end tests don't pass anymore (it seems like they don't wait for all the data):

    expect((await request(app).get('/streaming')).body).toEqual({some: 'result'}); // But it gets {} instead of {"some":"result"}
  2. The logs report the wrong request runtime (a request that took 10 seconds is logged as taking 1 second, and no response body size is present; I suppose what gets logged is the time to the first streamed chunk):

    without streaming:
    GET /sync 200 14.797 ms - 43
    with streaming (it executes for more than a second and writes the same 43 bytes):
    GET /streaming 200 6.513 ms - -
    

Environment
Node.js: 12.18.3
express.js: 4.16.4
json-stream-stringify: 2.0.2

Application code

router.get('/streaming', async function(request, response, next) {
  const immediate = Promise.resolve("Now");
  const postponed = new Promise((resolve) => setTimeout(() => resolve("or never!"), 1000));
  new JsonStreamStringify({ immediate, postponed }).pipe(response);
});

Test

describe('GET /streaming', () => {
  it('should return response', async () => {
    const res = await request(app).get('/streaming')
    expect(res.body).toEqual({ immediate: 'Now', postponed: 'or never!' });
  });
});

I'm using Jest and Supertest.

And it fails:

    expect(received).toEqual(expected) // deep equality

    - Expected  - 4
    + Received  + 1

    - Object {
    -   "immediate": "Now",
    -   "postponed": "or never!",
    - }
    + Object {}

      13 |   it('should return response', async () => {
      14 |     const res = await request(app).get('/streaming')
    > 15 |     expect(res.body).toEqual({ immediate: 'Now', postponed: 'or never!' });
         |                      ^
      16 |   });
      17 | });

Example app
To test and reproduce: https://github.com/Envek/express-streaming-test

Thank you in advance! 🙏
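A hedged note on the likely cause: supertest only JSON-parses a response body when the Content-Type header says it is JSON, and piping a stream to the response sets no Content-Type (compare the ctx.type = 'json' line in the Koa example above). A minimal sketch of the fix:

router.get('/streaming', function(request, response) {
  response.type('json'); // set Content-Type so supertest (and clients) parse the body as JSON
  const immediate = Promise.resolve("Now");
  const postponed = new Promise((resolve) => setTimeout(() => resolve("or never!"), 1000));
  new JsonStreamStringify({ immediate, postponed }).pipe(response);
});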

Error on importing with TypeScript

Hi, I use TypeScript; the code is similar to:

import JsonStreamStringify from 'json-stream-stringify';

new JsonStreamStringify(data, undefined, 2);

This is the error:

TypeError: json_stream_stringify_1.default is not a constructor

Can anyone explain why and how to resolve this?
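A hedged note: the error suggests the package's CommonJS build has no default export, so the compiled json_stream_stringify_1.default is undefined. Depending on the version, one of these two import forms (pick one, not both) should work; newer versions use a named export, as in the require example further down this page:

// Newer versions (named export):
import { JsonStreamStringify } from 'json-stream-stringify';

// Older CommonJS builds (TypeScript import-equals syntax):
import JsonStreamStringify = require('json-stream-stringify');

new JsonStreamStringify(data, undefined, 2);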

Trailing comma when replacing last key

If you return undefined for the last key in an object, a trailing comma is rendered in the output JSON.

For example:

var s = new JsonStreamStringify({a: 12, b: 14, c: 13}, function (k, v) { if (k == 'c') return undefined; return v; });
s.pipe(process.stdout); // outputs {"a":12,"b":14,}
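For comparison, JSON.stringify simply omits keys for which the replacer returns undefined, with no trailing comma:

JSON.stringify({a: 12, b: 14, c: 13}, function (k, v) { if (k == 'c') return undefined; return v; });
// => '{"a":12,"b":14}'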

Repeated reference misattributed as circular reference

Hi - the serialization algorithm appears to misidentify a previously seen object as a circular reference, and replaces it with a $ref value. Is this behavior intentional?

const JSONStreamStringify = require('json-stream-stringify');
const foo = { foo: 'a' };
JSONStreamStringify([
    foo, foo
]).pipe(process.stdout);

// => [{"foo":"a"},{"$ref":"$[0]"}]
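For comparison, JSON.stringify serializes a repeated (non-circular) reference in full each time it appears:

JSON.stringify([foo, foo]);
// => '[{"foo":"a"},{"foo":"a"}]'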

Any suggestions for reducing CPU usage?

Hi there,

First off, thank you so much for making this. I'm seeing event-loop delay in the 0.x ms range, which is amazing (compared, e.g., to ~150 ms of loop delay with the same data structure under bfj).

I am, however, seeing sustained CPU usage of > 100% when either constructing a string of the result in memory or piping the result to a file. Would you have any quick suggestions for reducing CPU load?

And thanks again :)

Critical | Unable to handle blank keys in JSON object

Please find below a sample JSON object with an empty key:

{'': 'data1', 'key1': ''}

const JsonStreamStringify = require('json-stream-stringify');

function replacer(key, value) {
  if (!key) {
    console.log('YES', key, value);
  }
  return value;
}

let str = '';
const stream = new JsonStreamStringify({'': 'data1', 'key1': ''}, replacer, 0, false);
stream.on('data', (data) => {
  str += data.toString();
});
stream.on('end', function () {
  console.log('PARSING NOW ', str);
});

On the end event, the output is
// PARSING NOW {"data1","key1":""}

Please note the JSON is malformed.
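For reference, JSON.stringify handles the empty key correctly, which is the output expected here:

JSON.stringify({'': 'data1', 'key1': ''});
// => '{"":"data1","key1":""}'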

readAsPromised - stuck if stream already "end"

Please add a check: if (stream.readableEnded) endListener();

Case: when I write an object with a Readable field that can contain another big object, readAsPromised gets stuck once the stream has emitted all of its data:

function readAsPromised(stream: Readable, size: number) {
    const value = stream.read(size);

    if (value === null) {
        return new Promise((resolve, reject) => {
            const endListener = () => resolve(null);
            stream.once('end', endListener);
            stream.once('error', reject);
            stream.once('readable', () => {
                stream.removeListener('end', endListener);
                stream.removeListener('error', reject);
                resolve(stream.read());
            });
            // FIX: Resolve promise when stream already "end" 
            if (stream.readableEnded) {
                endListener();
            }
            // End fix
        });
    }
    return Promise.resolve(value);
}

Readable stream reads 16,384 objects in objectMode

In objectMode, the readable stream reads 16,384 objects in a single read() call, ignoring the highWaterMark setting.

I think the problem is here:

   // function - setReadableObjectItem(input: Readable, parent: Item)
   const item = <any>{
      type: 'readable object',
      async read(size: number) {
        try {
          let out = '';
          // by default size = 16384
          const data = await readAsPromised(input, size);
          if (data === null) {
            if (i && that.indent) {
              out += `\n${parent.indent}`;
            }

My suggestion:

   // function - setReadableObjectItem(input: Readable, parent: Item)
   const item = <any>{
      type: 'readable object',
      async read() {
        try {
          let out = '';
          // if send size as undefined ReadebleStream will use highWaterMark
          const data = await readAsPromised(input);
          if (data === null) {
            if (i && that.indent) {
              out += `\n${parent.indent}`;
            }

Support for promises returned by replacer()

It would be great if it was possible to return a Promise from the replacer function.

JSONStreamStringify(1, function (key, value) {
  return new Promise((resolve) => {
    resolve(2)
  })
})

Currently, the RecursiveIterable drops the returned promise as if it were undefined.
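A possible workaround in the meantime, relying on the promise support shown elsewhere on this page (promise values in the input data are awaited): put the promise in the data instead of returning it from the replacer:

new JSONStreamStringify({ value: Promise.resolve(2) });
// => {"value":2}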

Transpilation issues

This package ships ES2015+ source without transpiling it to ES5, which makes it difficult to bundle via webpack or browserify - it is common to exclude node_modules from parsing with Babel or other transpilers.

I suggest adding a prepublish script that transpiles the source to ES5 into a /lib/ folder, and setting main in package.json to lib/jsonStreamify.js.
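A minimal sketch of the suggested setup, assuming Babel CLI and a src/ folder (the paths and script name are illustrative):

{
  "main": "lib/jsonStreamify.js",
  "scripts": {
    "prepublish": "babel src --out-dir lib"
  }
}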

Throws: Uncaught RangeError: Invalid string length

Thanks for the library!

I tried using this library to bypass this JSON.stringify bug: nodejs/help#1061
But the library is throwing the same error.

I don't have time to debug it, but I thought I'd flag it. I wonder whether this library uses JSON.stringify at some point instead of navigating recursively into the object?

How to reproduce (it takes a while, maybe you'll be able to get it with a smaller array):

> JSON.stringify( Array.from({length: 10000000}).fill({a: 'streaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringify'}))
Uncaught RangeError: Invalid string length
    at JSON.stringify (<anonymous>)
> new JSONStreamStringify(Array.from({length: 10000000}).fill({a: 'streaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringifystreaming-json-stringify'})).then(console.log)
Promise { <pending> }
Uncaught RangeError: Invalid string length

Stringify in batches (respect stream read size)

I'm sending a huge array of numbers (around 100 MB as a JSON string, and I know this is not an optimal format for the task). This module helped enormously in cutting memory consumption, but an issue remains: serialization now takes forever because the JSON stream yields only one token per tick.

In my case the consumer repeatedly calls read with a size argument of 16384. If the JSON stream respected this argument (which is currently ignored), it would speed things up considerably while preserving reasonable memory consumption. The current implementation:

_read(n) {
  super._read(n);
  if (this._done) {
    return false;
  }
  if (!this._running) {
    this._running = true;
    this._handle(this._generator.next());
  }
  return !this._done;
}
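Until read sizes are respected upstream, a possible downstream mitigation is a buffering Transform, sketched below; it coalesces the many tiny chunks into ~16 KiB writes but does not change how the stringifier produces tokens (the batchChunks helper is hypothetical):

const { Transform } = require('stream');

// Hypothetical helper: buffer incoming chunks and emit them in ~batchSize pieces.
function batchChunks(batchSize = 16384) {
  let buffered = [];
  let length = 0;
  return new Transform({
    transform(chunk, encoding, callback) {
      buffered.push(chunk); // chunks arrive as Buffers on the writable side
      length += chunk.length;
      if (length >= batchSize) {
        callback(null, Buffer.concat(buffered));
        buffered = [];
        length = 0;
      } else {
        callback();
      }
    },
    flush(callback) {
      callback(null, Buffer.concat(buffered)); // emit whatever is left
    },
  });
}

// usage: jsonStream.pipe(batchChunks()).pipe(destination);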

Empty arrays detected as cycle

{
  raw: { reference: '/items' },
  partiallyEvaluated: { resolved: [] },
  evaluated: []
}

results in the error Error: Converting circular structure to JSON

If cycle is set to true, it produces a result like:
{"raw":{"reference":"/items"},"partiallyEvaluated":{"resolved":[]},"evaluated":{"$ref":"$[\"input\"][\"partiallyEvaluated\"][\"resolved\"]"}}

It only occurs if the same array is used twice.

Full code example:

const { JsonStreamStringify } = require("json-stream-stringify");

const array = [];

const val = {
  raw: { reference: "/items" },
  partiallyEvaluated: { resolved: array },
  evaluated: array,
};

const stream = new JsonStreamStringify(val);

stream.on("data", (chunk) => {
  console.log("data", chunk);
});

stream.on("error", (err) => {
  console.log("err", err);
});

stream.on("close", () => {
  console.log("close");
});
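A possible workaround until this is fixed, given that the problem only occurs when the same array instance is used twice: give each occurrence its own copy.

// Hypothetical workaround: copy the array so no instance repeats.
const val = {
  raw: { reference: "/items" },
  partiallyEvaluated: { resolved: [...array] },
  evaluated: [...array],
};
// => {"raw":{"reference":"/items"},"partiallyEvaluated":{"resolved":[]},"evaluated":[]}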

How to implement graceful handling of source streams' errors?

TL;DR: I want to handle errors in source streams, properly terminate the JSON, and also append error info to the JSON itself. How can I do this with syntax like this:

new JsonStreamStringify({ result: dataStream, error: wasErrorPromise });
// and to get
// {"result":[{"some":"data"},{"test":"data"}],"error":{"message":"Boom!"}}
// instead of
// {"result":[{"some":"data"},{"test":

Currently, if a source stream emits an error, json-stream-stringify immediately emits an error too, here:

realValue.once('error', (err) => {
  this.error = true;
  this.emit('error', err);
});

In real life (when piping to an HTTP response object) this means the connection to the client is reset immediately with partial data sent (and "partial" can mean "interrupted in the middle of a string"). I want to handle it gracefully and send valid JSON to the client with the partial data and some error message.

So, basically, I want to do something like this:

const { Readable } = require('stream');
const JsonStreamStringify = require('json-stream-stringify');

const dataStream = new Readable({ objectMode: true });
dataStream._read = () => {};

const wasError = new Promise(
  (resolve, reject) => {
    dataStream.on('error', resolve);
    dataStream.on('end', resolve); // will be resolved with undefined eliminating error key from result
  }
);

const jsonStream = new JsonStreamStringify({ result: dataStream, error: wasError });
let result = '';
jsonStream.on('data', (data) => result += data);
jsonStream.on('error', (error) => console.error(error));

dataStream.push({ some: 'data' });
dataStream.push({ more: 'data' });
dataStream.emit('error', { message: 'Boom!' });

console.log(result);

Here I want to get the following result:

{"result":[{"some":"data"},{"more":"data"}],"error":{"message":"Boom!"}}

or, if json-stream-stringify hasn't drained dataStream yet (it is totally fine for simplicity):

{"result":[{"some":"data"}],"error":{"message":"Boom!"}}

But I get

{"result":[{"some":"data"},{"more":"data"}

And in real life, for example when I'm streaming data from a Postgres database with pg-query-stream and the server or connection dies mid-stream, the data can be truncated between a key and a value in the response (and the client gets an error because of the connection reset):

{"result":[{"book_ref":"005F0D","book_date":"2017-02-12T00:12:00.000Z"},{"book_ref":
// curl: (18) transfer closed with outstanding read data remaining

I'm wondering about some kind of wrapper I could create for the source stream(s) that would re-emit all events except error (on which it would just close itself somehow) and delegate all methods to the source stream.

Or maybe this kind of error handling could be included in json-stream-stringify itself?

Do you have any thoughts, or suggestions? Thanks!
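A sketch of such a wrapper, under the assumption that ending the wrapped stream cleanly on error is acceptable (the swallowErrors name and the objectMode choice are illustrative):

const { Readable } = require('stream');

// Hypothetical wrapper: forwards data from the source stream, but turns an
// 'error' into a clean end so the enclosing JSON stays well-formed.
function swallowErrors(source) {
  const wrapped = new Readable({
    objectMode: true,
    read() { source.resume(); }, // pull more data when the consumer is ready
  });
  let ended = false;
  const end = () => {
    if (!ended) {
      ended = true;
      wrapped.push(null);
    }
  };
  source.on('data', (chunk) => {
    if (!wrapped.push(chunk)) source.pause(); // respect backpressure
  });
  source.on('end', end);
  source.on('error', end); // end cleanly instead of propagating the error
  return wrapped;
}

// usage, combined with the wasError promise above:
// new JsonStreamStringify({ result: swallowErrors(dataStream), error: wasError });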

setPromiseItem does not await the input promise

The async read() method resolves before the input promise settles, because input.then() is neither awaited nor returned. Current implementation:

setPromiseItem(input: Promise<any>, parent: Item, key) {
        const that = this;
        let read = false;
        this.item = {
            async read() {
                if (read) {
                    return;
                }
                read = true;
                input.then(
                    (v) => that.setItem(v, parent, key),
                    (err) => {
                        that.emit('error', err);
                        that.destroy();
                    },
                );
            },
        };
    }

Suggested fix (await the input promise inside read()):

setPromiseItem(input: Promise<any>, parent: Item, key) {
        const that = this;
        let read = false;
        this.item = {
            async read() {
                if (read) {
                    return;
                }
                try {
                    read = true;
                    that.setItem(await input, parent, key);
                } catch (err) {
                    that.emit('error', err);
                    that.destroy();
                }
            },
        };
    }
