
Serious performance issues #12

Open
andreasanta opened this issue Apr 16, 2018 · 0 comments

Comments

@andreasanta

I attempted to compress objects with more than 5k key->value pairs, and the use of arrays plus the repeated iterations over them slows processing down terribly.

I would suggest optimizing the following (a rough sketch follows the list):

  • contains: use an object (hash lookup) instead of an array
  • unique: same as above; just stick all the keys into an object and read them back with Object.keys()
  • _getKeys: use === undefined instead of hasOwnProperty (line 149)
  • _compressOther: do not use JSON.parse/JSON.stringify; replace the key names in place instead of going over the whole string
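
A minimal sketch of the object-based lookup suggested in the first two points. The names contains, unique and _getKeys come from the library, but the bodies below are only illustrative, not its actual implementation:

```js
// Rough sketch, assuming `unique` currently does a linear `contains`
// scan over an array for every key. Collecting keys as properties of a
// prototype-less object turns each membership check into a hash lookup.
function unique(keys) {
  var seen = Object.create(null); // no prototype, so inherited names can't collide
  for (var i = 0; i < keys.length; i++) {
    seen[keys[i]] = true;
  }
  // Object.keys() returns the de-duplicated key list in one pass.
  return Object.keys(seen);
}

// Membership test as a plain property read; `!== undefined` stands in for
// the hasOwnProperty call (safe here because every stored value is `true`).
function contains(seen, key) {
  return seen[key] !== undefined;
}
```

With an object (or a Set) the deduplication of n keys drops from O(n²) array scans to roughly O(n) hash lookups, which is presumably where most of the time goes once the object has a few thousand keys.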

I wanted to use this library for a huge number of objects, but it keeps my Node CLI running forever with just over 4k keys in my object.

Alamantus added a commit to Alamantus/JSON-Compress that referenced this issue Dec 4, 2019
_correctCollision addresses issue:
tcorral#15

Some minor performance tweaks address some points in:
tcorral#12