Logging

Spark uses log4j for logging.

Logging Levels

The valid logging levels are log4j’s Levels (from most specific to least):

  • OFF (most specific, no logging)

  • FATAL (most specific, little data)

  • ERROR

  • WARN

  • INFO

  • DEBUG

  • TRACE (least specific, a lot of data)

  • ALL (least specific, all data)

conf/log4j.properties

You can set up the default logging for Spark shell in conf/log4j.properties. Use conf/log4j.properties.template as a starting point.
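For orientation, the console part of the template looks like the following (a sketch mirroring Spark's conf/log4j.properties.template for the log4j 1.x era; exact contents vary across Spark versions):

```properties
# Set everything to be logged to the console
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Copy the template to conf/log4j.properties and raise or lower log4j.rootCategory (or add per-logger overrides) as needed.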

Setting Default Log Level Programmatically

Setting Log Levels in Spark Applications

In standalone Spark applications or in a Spark shell session, use the following:
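The snippet that followed here is missing; a typical way to do it with the log4j 1.x API that Spark uses (a sketch, and the logger names are examples) is:

```scala
import org.apache.log4j.{Level, Logger}

// Silence the chatty Spark and Akka loggers (log4j 1.x API);
// "org" and "akka" are example logger-name prefixes.
Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)
```

In a Spark shell you can also call sc.setLogLevel("WARN") (with any of the level names listed above), which adjusts the runtime log level without touching any log4j configuration file.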

sbt

When running a Spark application using sbt's run task, you can use the following build.sbt to configure logging levels:
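The build.sbt fragment that followed is missing; a sketch using the classic (pre-slash-syntax) sbt keys is:

```scala
// build.sbt (sketch): fork the run task so the JVM options below apply
fork in run := true

// Ask log4j to report its own initialization and load log4j.properties
// from the classpath
javaOptions in run ++= Seq(
  "-Dlog4j.debug=true",
  "-Dlog4j.configuration=log4j.properties")

// Send the forked JVM's output straight to stdout
outputStrategy := Some(StdoutOutput)
```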

With the above configuration, the log4j.properties file should be on the CLASSPATH, e.g. in the src/main/resources directory (which is on the CLASSPATH by default).

When run starts, you should see the logging output, as configured, in the sbt console.

Disabling Logging

Use the following conf/log4j.properties to disable logging completely:
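The properties file that followed is missing here; a minimal version (a sketch in log4j 1.x syntax) is:

```properties
# conf/log4j.properties -- turn logging off entirely
log4j.rootLogger=OFF
```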

BasicWriteJobStatsTracker

BasicWriteJobStatsTracker is a concrete WriteJobStatsTracker.

BasicWriteJobStatsTracker is created when DataWritingCommand and Spark Structured Streaming’s FileStreamSink are requested for one.

When requested for a new WriteTaskStatsTracker, BasicWriteJobStatsTracker creates a new BasicWriteTaskStatsTracker.

Creating BasicWriteJobStatsTracker Instance

BasicWriteJobStatsTracker takes the following when created:

  • Serializable Hadoop Configuration

  • Metrics (Map[String, SQLMetric])

WriteJobStatsTracker

Table 1. WriteJobStatsTracker Contract

  • newTaskInstance: Creates a new WriteTaskStatsTracker. Used when EmptyDirectoryWriteTask, SingleDirectoryWriteTask and DynamicPartitionWriteTask are requested for the statsTrackers.

  • processStats: Used when…​FIXME

Note
BasicWriteJobStatsTracker is the one and only known implementation of the WriteJobStatsTracker Contract in Apache Spark.
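The contract can be sketched as a pair of Scala traits. This is a simplified stand-in for illustration only; the real definitions are internal to Spark SQL (org.apache.spark.sql.execution.datasources) and the exact signatures, which involve types such as SQLMetric, are assumptions here:

```scala
// Marker for the statistics a single write task produces
trait WriteTaskStats extends Serializable

// Per-task collector of write statistics
trait WriteTaskStatsTracker {
  // The final statistics computed by this tracker so far
  def getFinalStats(): WriteTaskStats
}

// Job-level tracker: hands out per-task trackers and aggregates
// their results when the write job finishes
trait WriteJobStatsTracker extends Serializable {
  // Creates a new WriteTaskStatsTracker (one per write task)
  def newTaskInstance(): WriteTaskStatsTracker

  // Processes the statistics collected by all per-task trackers
  def processStats(stats: Seq[WriteTaskStats]): Unit
}
```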

BasicWriteTaskStatsTracker

BasicWriteTaskStatsTracker is a concrete WriteTaskStatsTracker.

BasicWriteTaskStatsTracker is created exclusively when BasicWriteJobStatsTracker is requested for a new WriteTaskStatsTracker.

BasicWriteTaskStatsTracker takes a Hadoop Configuration when created.

Getting Final WriteTaskStats — getFinalStats Method

Note
getFinalStats is part of the WriteTaskStatsTracker Contract to get the final WriteTaskStats statistics computed so far.

getFinalStats…​FIXME

WriteTaskStatsTracker

WriteTaskStatsTracker is the abstraction of WriteTaskStatsTrackers that collect the statistics of the number of buckets, files, partitions and rows processed.

Table 1. WriteTaskStatsTracker Contract

  • getFinalStats: The final WriteTaskStats statistics computed so far. Used when EmptyDirectoryWriteTask, SingleDirectoryWriteTask and DynamicPartitionWriteTask are requested to execute.

  • newBucket: Used when…​FIXME

  • newFile: Used when…​FIXME

  • newPartition: Used when…​FIXME

  • newRow: Used when…​FIXME

Note
BasicWriteTaskStatsTracker is the one and only known implementation of the WriteTaskStatsTracker Contract in Apache Spark.
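The contract's callbacks can be sketched as a trait like the one below (a simplified stand-in; the parameter types are assumptions, e.g. Spark actually passes InternalRow values for partitions and rows):

```scala
// Marker for the statistics a single write task produces
trait WriteTaskStats extends Serializable

// Notified as a write task makes progress; accumulates statistics
trait WriteTaskStatsTracker {
  def newBucket(bucketId: Int): Unit            // a new bucket is written
  def newFile(filePath: String): Unit           // a new output file is opened
  def newPartition(partitionValues: Any): Unit  // a new partition is started
  def newRow(row: Any): Unit                    // a row is written out
  def getFinalStats(): WriteTaskStats           // the stats computed so far
}
```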

BasicWriteTaskStats

BasicWriteTaskStats is a basic WriteTaskStats that carries the following statistics:

  • numPartitions

  • numFiles

  • numBytes

  • numRows

BasicWriteTaskStats is created exclusively when BasicWriteTaskStatsTracker is requested for getFinalStats.
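Conceptually, BasicWriteTaskStats is a plain holder of the four statistics above. A self-contained sketch (the real class is private to Spark SQL; the field types are assumptions):

```scala
// Marker trait for task-level write statistics
trait WriteTaskStats extends Serializable

// Carrier of the four basic statistics listed above
case class BasicWriteTaskStats(
    numPartitions: Int,
    numFiles: Int,
    numBytes: Long,
    numRows: Long)
  extends WriteTaskStats
```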

SQLAppStatusPlugin

SQLAppStatusPlugin is an AppStatusPlugin…​FIXME

setupUI Method

Note
setupUI is part of AppStatusPlugin Contract to…​FIXME.

setupUI…​FIXME

SQLAppStatusListener Spark Listener

SQLAppStatusListener is a SparkListener that…​FIXME
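The callbacks described below come from Spark's public SparkListener API. For orientation, a minimal custom listener (a sketch; requires spark-core on the classpath) looks like:

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, SparkListenerJobStart}

// A minimal SparkListener that reacts to job lifecycle events
class MyListener extends SparkListener {
  override def onJobStart(jobStart: SparkListenerJobStart): Unit =
    println(s"Job ${jobStart.jobId} started")

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
    println(s"Job ${jobEnd.jobId} ended")
}

// Register it with an active SparkContext:
// sc.addSparkListener(new MyListener)
```

SQLAppStatusListener overrides the same callbacks to maintain SQL execution state for the UI.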

Table 1. SQLAppStatusListener’s Internal Properties (e.g. Registries, Counters and Flags)

  • liveUpdatePeriodNs

  • liveExecutions

  • stageMetrics

  • uiInitialized

onExecutionStart Internal Method

onExecutionStart…​FIXME

Note
onExecutionStart is used exclusively when SQLAppStatusListener handles a SparkListenerSQLExecutionStart event.

onJobStart Callback

Note
onJobStart is part of SparkListener Contract to…​FIXME

onJobStart…​FIXME

onStageSubmitted Callback

Note
onStageSubmitted is part of SparkListener Contract to…​FIXME

onStageSubmitted…​FIXME

onJobEnd Callback

Note
onJobEnd is part of SparkListener Contract to…​FIXME

onJobEnd…​FIXME

onExecutorMetricsUpdate Callback

Note
onExecutorMetricsUpdate is part of SparkListener Contract to…​FIXME

onExecutorMetricsUpdate…​FIXME

onTaskEnd Callback

Note
onTaskEnd is part of SparkListener Contract to…​FIXME

onTaskEnd…​FIXME

Handling SparkListenerEvent — onOtherEvent Callback

Note
onOtherEvent is part of SparkListener Contract to…​FIXME

onOtherEvent…​FIXME
