proposal: encoding/json: avoid massive escape costs #68203
Comments
Let's focus further encoding/json optimization discussions on encoding/json/v2. #63397

Ah, works for me!
Proposal Details
It seems the JSON encoder and decoder have significant overhead when escaping strings. I've attached a bunch of benchmarks at the end of this report. In short, I have a large string (hex in this case) that I would like to insert into a JSON field. My benchmarks simply JSON-encode that single hex value.
I would expect the performance to be near the speed of copying the data. However, Go seems to do a lot of extra processing. This report questions various parts of that:
One of them is the internal appendString call, which performs the escape checks (note that the noescape flag only disables HTML escape checking, not ASCII escape checking). I'm not even entirely sure what the solution to the various issues is.