[SPARK-37729][SQL] Fix SparkSession.setLogLevel that is not working in Spark Shell

### What changes were proposed in this pull request?

This patch fixes a regression introduced by the upgrade to Log4j 2: `SparkSession.setLogLevel` should work in spark-shell.

### Why are the changes needed?

Currently, `SparkSession.setLogLevel` does not work in spark-shell.

With the Log4j 1.x API, calling `setLevel` on the root logger was enough to change the effective logging level. With Log4j 2.x, a level set directly on the root logger is not picked up by the other loggers: the level must be set on the root `LoggerConfig`, and the `LoggerContext` must then be asked to propagate the updated configuration to all loggers.
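To illustrate the difference, here is a minimal sketch of the two equivalent ways to change the root level under Log4j 2.x. It assumes `log4j-api` and `log4j-core` are on the classpath (as they are in Spark after the upgrade); the `Configurator` helper is not what the patch uses, but it performs the same steps internally.

```scala
import org.apache.logging.log4j.{Level, LogManager}
import org.apache.logging.log4j.core.LoggerContext
import org.apache.logging.log4j.core.config.Configurator

object RootLevelSketch {
  // The explicit form used by the patch: mutate the root LoggerConfig and
  // then push the updated configuration to every live Logger.
  def setExplicitly(level: Level): Unit = {
    val ctx = LogManager.getContext(false).asInstanceOf[LoggerContext]
    val loggerConfig =
      ctx.getConfiguration.getLoggerConfig(LogManager.ROOT_LOGGER_NAME)
    loggerConfig.setLevel(level)
    ctx.updateLoggers() // without this, existing loggers keep their old level
  }

  // log4j-core's convenience helper wraps the same context/config/update dance.
  def setViaConfigurator(level: Level): Unit =
    Configurator.setRootLevel(level)
}
```

Setting the level on the `Logger` object alone, as the old code did, only mutates that one logger; other loggers resolve their effective level from the configuration, which is why `updateLoggers()` is required.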

### Does this PR introduce _any_ user-facing change?

No, the Log4j 2 upgrade has not been released yet.

### How was this patch tested?

Manual test and unit test.
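A manual check in spark-shell might look like the following sketch. It assumes a Spark build containing this patch; `sc` is the SparkContext that spark-shell pre-creates, and its String-based `setLogLevel` ultimately calls the fixed `Utils.setLogLevel`.

```scala
// Run inside spark-shell built with this patch.
sc.setLogLevel("ERROR") // before the fix, INFO output kept appearing
sc.setLogLevel("INFO")  // restore verbose logging
```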

Closes apache#35012 from viirya/SPARK-37729.

Authored-by: Liang-Chi Hsieh <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
viirya authored and HyukjinKwon committed Dec 24, 2021
1 parent 472629a commit 8bf2515
Showing 2 changed files with 12 additions and 5 deletions.
14 changes: 9 additions & 5 deletions core/src/main/scala/org/apache/spark/util/Utils.scala
@@ -57,6 +57,8 @@ import org.apache.hadoop.io.compress.{CompressionCodecFactory, SplittableCompres
 import org.apache.hadoop.security.UserGroupInformation
 import org.apache.hadoop.util.{RunJar, StringUtils}
 import org.apache.hadoop.yarn.conf.YarnConfiguration
+import org.apache.logging.log4j.{Level, LogManager}
+import org.apache.logging.log4j.core.LoggerContext
 import org.eclipse.jetty.util.MultiException
 import org.slf4j.Logger

@@ -2423,11 +2425,13 @@ private[spark] object Utils extends Logging {
   /**
    * configure a new log4j level
    */
-  def setLogLevel(l: org.apache.logging.log4j.Level): Unit = {
-    val rootLogger = org.apache.logging.log4j.LogManager.getRootLogger()
-      .asInstanceOf[org.apache.logging.log4j.core.Logger]
-    rootLogger.setLevel(l)
-    rootLogger.get().setLevel(l)
+  def setLogLevel(l: Level): Unit = {
+    val ctx = LogManager.getContext(false).asInstanceOf[LoggerContext]
+    val config = ctx.getConfiguration()
+    val loggerConfig = config.getLoggerConfig(LogManager.ROOT_LOGGER_NAME)
+    loggerConfig.setLevel(l)
+    ctx.updateLoggers()
 
     // Setting threshold to null as rootLevel will define log level for spark-shell
     Logging.sparkShellThresholdLevel = null
   }
3 changes: 3 additions & 0 deletions core/src/test/scala/org/apache/spark/util/UtilsSuite.scala
@@ -690,8 +690,11 @@ class UtilsSuite extends SparkFunSuite with ResetSystemProperties with Logging {
     try {
       Utils.setLogLevel(org.apache.logging.log4j.Level.ALL)
       assert(rootLogger.getLevel == org.apache.logging.log4j.Level.ALL)
+      assert(log.isInfoEnabled())
       Utils.setLogLevel(org.apache.logging.log4j.Level.ERROR)
       assert(rootLogger.getLevel == org.apache.logging.log4j.Level.ERROR)
+      assert(!log.isInfoEnabled())
+      assert(log.isErrorEnabled())
     } finally {
       // Best effort at undoing changes this test made.
       Utils.setLogLevel(current)
