Hi everyone,
I was searching for a good way to parse (concatenated) JSON incrementally about 3 months ago and found a couple of GitHub issues on this repository mentioning it, using the `>>` operator on a `std::stringstream`. This works great, but since I was parsing as fast as I could from a TCP stream, some JSON objects would get cut in half. This throws an exception, and I use that exception to prepend the leftover JSON data on the next parsing loop.
I implemented it like this:
```cpp
// Parser.cpp contains internal_ss_ and last_parse_pos_.
// internal_ss_ gets appended to before this is called.
std::vector<json> Parser::extract_json_blobs_from_internal_string_stream_new() {
    std::vector<json> return_vec;
    internal_ss_.clear();
    while (internal_ss_.peek() != EOF) {
        json json_obj_buf;
        try {
            internal_ss_ >> json_obj_buf;
            return_vec.emplace_back(json_obj_buf);
            last_parse_pos_ = internal_ss_.tellg();
        } catch (json::parse_error &e) {
            internal_ss_.clear();
            internal_ss_.seekg(last_parse_pos_);
            if (internal_ss_.eof()) break; // IMPORTANT: if it's empty, we get EXC_BAD_ACCESS on read
            while (internal_ss_.peek() != EOF) {
                prepend_string_.push_back(static_cast<char>(internal_ss_.get()));
            }
            std::stringstream(prepend_string_).swap(internal_ss_); // reset internal string stream
            prepend_string_ = ""; // reset prepend
            last_parse_pos_ = 0;
            break;
        }
    }
    return return_vec;
}
```
As you can see, I do nothing with the exception itself; I just check at what position in the stringstream it happened, take the rest of the stringstream, and prepend it for the next parsing call. This function works as advertised, with no packets lost if data is continuously being streamed in.
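An exception-free alternative to the approach above is to split the byte stream into complete top-level JSON values yourself, before handing anything to the parser, by tracking brace depth and string state. This is only a sketch, and it assumes every top-level message is a JSON object (not a bare array or number); `extract_complete_blobs` is a hypothetical name, not part of nlohmann/json:

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Split a buffer of concatenated JSON objects into complete top-level
// blobs. A trailing partial object is left in `buffer` for the next call,
// so nothing is lost between TCP reads and no exceptions are needed.
std::vector<std::string> extract_complete_blobs(std::string &buffer) {
    std::vector<std::string> blobs;
    std::size_t start = 0;
    int depth = 0;
    bool in_string = false;
    bool escaped = false;
    for (std::size_t i = 0; i < buffer.size(); ++i) {
        char c = buffer[i];
        if (escaped) { escaped = false; continue; }
        if (in_string) {
            if (c == '\\') escaped = true;       // skip the escaped character
            else if (c == '"') in_string = false;
            continue;
        }
        if (c == '"') in_string = true;
        else if (c == '{') ++depth;
        else if (c == '}') {
            if (--depth == 0) {                  // a top-level object just closed
                blobs.emplace_back(buffer.substr(start, i - start + 1));
                start = i + 1;
            }
        }
    }
    buffer.erase(0, start);                      // keep only the partial tail
    return blobs;
}
```

Each returned blob is then guaranteed to be a syntactically balanced object, so a subsequent `json::parse` on it should only throw for genuinely malformed data, not for truncation.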
My question is this: is this slow? Using exceptions like this and parsing this way?
For context, this is for a multiplayer game that I am working on, and I thought that incremental JSON parsing could work pretty well in this context, but I might be wrong!
If it is slow and not viable, could someone point me in the right direction, whatever that is? I'm welcome to any advice or criticism.
Thanks!
EDIT: I'm not asking if the library is slow; I'm asking whether basing my whole parsing loop on these exceptions is slow. I'm well aware that nlohmann/json is fast!
EDIT 2: I now realize that I can parse from a stringstream with the library without throwing exceptions, but what would this look like? I also need to know where the error occurred (I saw that #2597 might have my answer, but as was mentioned by nlohmann, they couldn't guarantee it would remain stable). It's also obvious that using exceptions for general control flow is not a great idea if they happen as often as they do in my case, so knowing how to do this without them would be great.
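One way to sidestep the boundary problem entirely, which may fit a game protocol, is to frame each message on the wire with a length prefix: the receiver only parses payloads it knows are complete, so the parser never sees a truncated value and exceptions are never part of the control flow. A minimal sketch, assuming a 4-byte big-endian length prefix (`frame_message` and `deframe` are hypothetical names, not library API):

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Sender side: prepend a 4-byte big-endian length to the JSON payload.
std::string frame_message(const std::string &payload) {
    std::uint32_t n = static_cast<std::uint32_t>(payload.size());
    std::string out;
    out.push_back(static_cast<char>((n >> 24) & 0xFF));
    out.push_back(static_cast<char>((n >> 16) & 0xFF));
    out.push_back(static_cast<char>((n >> 8) & 0xFF));
    out.push_back(static_cast<char>(n & 0xFF));
    out += payload;
    return out;
}

// Receiver side: pull every complete frame out of `buffer`; a partial
// frame (header or payload not fully arrived) stays buffered for later.
std::vector<std::string> deframe(std::string &buffer) {
    std::vector<std::string> messages;
    while (buffer.size() >= 4) {
        std::uint32_t n = (static_cast<std::uint8_t>(buffer[0]) << 24) |
                          (static_cast<std::uint8_t>(buffer[1]) << 16) |
                          (static_cast<std::uint8_t>(buffer[2]) << 8)  |
                           static_cast<std::uint8_t>(buffer[3]);
        if (buffer.size() < 4 + n) break;    // payload not fully arrived yet
        messages.emplace_back(buffer.substr(4, n));
        buffer.erase(0, 4 + n);
    }
    return messages;
}
```

Each string that `deframe` returns is a complete JSON document, so it can be handed to `json::parse` directly; any parse failure then indicates corrupt data rather than an incomplete read.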