
Logging

Spark uses log4j for logging.

Logging Levels

The valid logging levels are log4j’s Levels (from most specific to least):

  • OFF (most specific, no logging)

  • FATAL (most specific, little data)

  • ERROR

  • WARN

  • INFO

  • DEBUG

  • TRACE (least specific, a lot of data)

  • ALL (least specific, all data)

conf/log4j.properties

You can set up the default logging for the Spark shell in conf/log4j.properties. Use conf/log4j.properties.template as a starting point.
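A minimal sketch of what that file could contain, assuming the log4j 1.x properties format used by the template (the WARN level and the console appender settings below are illustrative, not the template's exact contents):

    # Log everything at WARN or above to the console.
    log4j.rootCategory=WARN, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n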

Setting Default Log Level Programmatically
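One common way to do this is SparkContext.setLogLevel, which overrides any user-defined log settings at runtime. A minimal sketch, assuming a SparkSession named spark is already in scope (as in spark-shell):

    // Accepts ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE or WARN.
    spark.sparkContext.setLogLevel("WARN")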

Setting Log Levels in Spark Applications

In standalone Spark applications, or in an interactive Spark shell session, use the following approach:
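A sketch of the usual log4j 1.x approach (the logger names "org" and "akka" and the OFF level here are just examples):

    import org.apache.log4j.{Level, Logger}

    // Silence the chattiest logger hierarchies used by Spark.
    Logger.getLogger("org").setLevel(Level.OFF)
    Logger.getLogger("akka").setLevel(Level.OFF)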

sbt

When running a Spark application from within sbt using the run task, you can use the following build.sbt settings to configure logging levels:
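A sketch of the settings involved, assuming sbt 0.13-style keys and a log4j 1.x setup (the exact options are illustrative):

    // Run the application in a forked JVM so the javaOptions below take effect.
    fork in run := true

    // Ask log4j to print its configuration debugging and name the file to load.
    javaOptions in run ++= Seq(
      "-Dlog4j.debug=true",
      "-Dlog4j.configuration=log4j.properties")

    // Forward the forked JVM's output to sbt's standard output.
    outputStrategy := Some(StdoutOutput)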

With the above configuration, the log4j.properties file should be on the CLASSPATH. It can be placed in the src/main/resources directory, which is included on the CLASSPATH by default.

When run starts, you should see log4j's own configuration output in sbt, showing which log4j.properties file was picked up.

Disabling Logging

Use the following conf/log4j.properties to disable logging completely:
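A one-line sketch, assuming the log4j 1.x properties format, that switches off the root logger and thereby all log output:

    # Turning off the root logger disables all logging.
    log4j.rootCategory=OFF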
