
Add jvmargs to gradle.properties to provide more RAM when building #642

Merged · 4 commits · Jul 7, 2020

Conversation

@anuraaga (Contributor) commented Jul 7, 2020

I got an out-of-heap-space error when working on the project in IntelliJ, and it suggests raising the RAM like this. I have a feeling that previously the metaspace flag of 1G was implicitly causing Gradle to use more RAM, whereas now it's the default, which I think is just hundreds of MB.

I went with 2G somewhat arbitrarily: in other large projects I've had a better experience with it than with 1G, and haven't needed more.
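For reference, the project-level setting under discussion would look something like this (a sketch; the exact value is whatever the PR finally settled on):

```properties
# gradle.properties (checked into the project root)
# Raise the Gradle daemon's max heap; 2g is the value proposed above.
org.gradle.jvmargs=-Xmx2g
```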

@iNikem (Contributor) commented Jul 7, 2020

I think this is better put into the local ~/.gradle/gradle.properties file. How much memory one can give to Gradle may differ from environment to environment.
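A user-level override of the kind suggested here would be a sketch like the following (the 4g value is purely illustrative):

```properties
# ~/.gradle/gradle.properties (per-user; for org.gradle.jvmargs this
# takes precedence over the project's gradle.properties)
org.gradle.jvmargs=-Xmx4g
```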

@anuraaga (Contributor, Author) commented Jul 7, 2020

How about setting it to 1g then, so it at least matches the previous behavior? That's probably enough to avoid crashes, which is a good contributor experience (I never had a crash before), and users can still tune it up in their home directory if they want.

@iNikem (Contributor) commented Jul 7, 2020

How does it “match”? :) It was a metaspace option, now it is heap :)

@anuraaga (Contributor, Author) commented Jul 7, 2020

It's just a guess, but because metaspace is part of the heap I'm assuming Java defaults the heap flag to be at least as high as the metaspace option, which would make sense. It never crashed before, but it did right away today :)

Default is an awesome 64MB
https://github.com/open-telemetry/opentelemetry-java-instrumentation/blob/master/gradlew#L47

@iNikem (Contributor) commented Jul 7, 2020

$ java -XX:+PrintFlagsFinal -version | grep MaxHeapSize
   size_t MaxHeapSize                              = 4294967296        {product} {ergonomic}
openjdk version "11.0.7" 2020-04-14 LTS
OpenJDK Runtime Environment (build 11.0.7+10-LTS)
OpenJDK 64-Bit Server VM (build 11.0.7+10-LTS, mixed mode)

$ java -XX:+PrintFlagsFinal -XX:MaxMetaspaceSize=1g -version | grep MaxHeapSize
   size_t MaxHeapSize                              = 4294967296        {product} {ergonomic}
openjdk version "11.0.7" 2020-04-14 LTS
OpenJDK Runtime Environment (build 11.0.7+10-LTS)
OpenJDK 64-Bit Server VM (build 11.0.7+10-LTS, mixed mode)

Metaspace does not affect max heap size.

@iNikem (Contributor) commented Jul 7, 2020

> Default is an awesome 64MB
> https://github.com/open-telemetry/opentelemetry-java-instrumentation/blob/master/gradlew#L47

I expect all actual work to happen in Gradle daemons, not in the main process. Look here: https://docs.gradle.org/current/userguide/build_environment.html#sec:configuring_jvm_memory
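Per the linked docs, when `org.gradle.jvmargs` is unset the daemon picks its own memory defaults rather than the JVM's ergonomic ones, and an explicit setting replaces those defaults entirely. A sketch making both limits explicit (values illustrative, not from this PR):

```properties
# When unset, the Gradle daemon defaults to roughly
# -Xmx512m -XX:MaxMetaspaceSize=256m (per the linked build_environment docs).
# Setting org.gradle.jvmargs replaces those defaults wholesale:
org.gradle.jvmargs=-Xmx512m -XX:MaxMetaspaceSize=512m
```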

@iNikem (Contributor) commented Jul 7, 2020

But I have also had an OOM in my IDEA recently. That's interesting...

@anuraaga (Contributor, Author) commented Jul 7, 2020

Thanks for the link, I didn't realize the daemon had different defaults. It sounds like our previous flag was effectively setting the metaspace limit to 512MB (the heap size) instead of Gradle's default of 256MB. And because the build started crashing today right after that flag change, I think it's a good guess that this is a threshold for our build. How about if we just restore the metaspace flag?
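Restoring the earlier metaspace limit would look roughly like this (the 1g value is taken from the PR description above; the actual merged value is not shown in this thread):

```properties
# gradle.properties — hypothetical restoration of the previous metaspace flag
org.gradle.jvmargs=-XX:MaxMetaspaceSize=1g
```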

@trask (Member) commented Jul 7, 2020

Oh yes this makes sense now. Thanks @anuraaga @iNikem 👍

@iNikem iNikem merged commit 817094a into master Jul 7, 2020
@iNikem iNikem deleted the anuraaga-patch-1 branch July 7, 2020 05:50