LocalSparkCluster — Single-JVM Spark Standalone Cluster

LocalSparkCluster is a Spark Standalone cluster whose master and workers all run in a single JVM. It is what backs the local-cluster master URL.

Note
A local-cluster master URL matches the local-cluster[numWorkers,coresPerWorker,memoryPerWorker] pattern, where numWorkers, coresPerWorker and memoryPerWorker are all numbers separated by commas.
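
For instance, the sketch below shows the pattern at work. The regex is paraphrased from org.apache.spark.SparkMasterRegex and should be treated as an approximation, not verbatim source:

  import scala.util.matching.Regex

  // Approximation of the pattern Spark uses to recognize local-cluster master URLs
  val LOCAL_CLUSTER_REGEX: Regex =
    """local-cluster\[\s*([0-9]+)\s*,\s*([0-9]+)\s*,\s*([0-9]+)\s*]""".r

  "local-cluster[2,1,1024]" match {
    case LOCAL_CLUSTER_REGEX(numWorkers, coresPerWorker, memoryPerWorker) =>
      println(s"$numWorkers workers, $coresPerWorker cores and $memoryPerWorker MB per worker")
    case other =>
      println(s"$other is not a local-cluster master URL")
  }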

LocalSparkCluster can be particularly useful to test distributed operation and fault recovery without spinning up a lot of processes.

LocalSparkCluster is created when SparkContext is created with a local-cluster master URL (and is then requested to create the SchedulerBackend and the TaskScheduler).
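
For example, a hypothetical application like the sketch below gets a LocalSparkCluster simply by using a local-cluster master URL (run it via spark-submit or spark-shell from a Spark distribution, since local-cluster mode still launches executor processes from the distribution):

  import org.apache.spark.{SparkConf, SparkContext}

  // Hypothetical demo: the local-cluster master URL makes SparkContext
  // create and start a LocalSparkCluster behind the scenes.
  val conf = new SparkConf()
    .setAppName("LocalSparkClusterDemo")
    .setMaster("local-cluster[2,1,1024]") // 2 workers, 1 core and 1024 MB each

  val sc = new SparkContext(conf)

  // A small distributed job that runs on executors launched by the in-process workers
  println(sc.parallelize(1 to 100, numSlices = 4).sum())

  sc.stop() // stopping the SparkContext also stops the LocalSparkCluster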

Table 1. LocalSparkCluster’s Internal Properties (e.g. Registries, Counters and Flags)

  Name            Description
  localHostname   FIXME. Used when…FIXME
  masterRpcEnvs   FIXME. Used when…FIXME
  workerRpcEnvs   FIXME. Used when…FIXME
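
Until the descriptions above are filled in, here is a paraphrased sketch of how these properties are declared in org.apache.spark.deploy.LocalSparkCluster (a sketch based on the Spark sources, not verbatim):

  import scala.collection.mutable.ArrayBuffer
  import org.apache.spark.rpc.RpcEnv
  import org.apache.spark.util.Utils

  // localHostname: the hostname the in-process master and workers bind to
  private val localHostname = Utils.localHostName()

  // masterRpcEnvs and workerRpcEnvs: RpcEnvs created by start and shut down by stop
  private val masterRpcEnvs = ArrayBuffer[RpcEnv]()
  private val workerRpcEnvs = ArrayBuffer[RpcEnv]()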

Tip

Enable INFO logging level for the org.apache.spark.deploy.LocalSparkCluster logger to see what happens inside.

Add the following line to conf/log4j.properties:
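
  log4j.logger.org.apache.spark.deploy.LocalSparkCluster=INFO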

Refer to Logging.

Creating LocalSparkCluster Instance

LocalSparkCluster takes the following when created:

  • Number of workers

  • CPU cores per worker

  • Memory per worker

  • SparkConf

LocalSparkCluster initializes the internal registries and counters.
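
LocalSparkCluster is private[spark], so you do not instantiate it yourself; SparkContext does when it handles a local-cluster master URL. A hypothetical sketch of that call, with the numbers taken from local-cluster[2,1,1024]:

  import org.apache.spark.SparkConf
  import org.apache.spark.deploy.LocalSparkCluster

  // Hypothetical: roughly what SparkContext does for "local-cluster[2,1,1024]"
  val cluster = new LocalSparkCluster(
    numWorkers = 2,         // number of workers
    coresPerWorker = 1,     // CPU cores per worker
    memoryPerWorker = 1024, // memory per worker, in MB
    conf = new SparkConf())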

start Method

start…​FIXME
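
Pending the description above, here is a paraphrased sketch of what start does in the Spark sources (exact helper signatures are an assumption): it starts one Master RpcEnv, starts numWorkers Worker RpcEnvs that register with that master, records them in masterRpcEnvs and workerRpcEnvs, and returns the master URL(s).

  def start(): Array[String] = {
    logInfo(s"Starting a local Spark cluster with $numWorkers workers.")

    // Start the in-process Master and remember its RpcEnv
    val (masterEnv, _, _) = Master.startRpcEnvAndEndpoint(localHostname, 0, 0, conf)
    masterRpcEnvs += masterEnv
    val masterUrl = "spark://" + localHostname + ":" + masterEnv.address.port

    // Start numWorkers in-process Workers, all registering with that master
    for (workerNum <- 1 to numWorkers) {
      val workerEnv = Worker.startRpcEnvAndEndpoint(localHostname, 0, 0,
        coresPerWorker, memoryPerWorker, Array(masterUrl), null, Some(workerNum), conf)
      workerRpcEnvs += workerEnv
    }

    // SparkContext hands these URLs to StandaloneSchedulerBackend to connect to
    Array(masterUrl)
  }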

Note
start is used when…​FIXME

stop Method

stop…​FIXME
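
Pending the description above, here is a paraphrased sketch of what stop does in the Spark sources (treat the details as an assumption): it shuts down the worker RpcEnvs first, then the master RpcEnvs, and clears both registries.

  def stop(): Unit = {
    logInfo("Shutting down local Spark cluster.")

    // Stop the workers before the master so they do not get upset that it disconnected
    workerRpcEnvs.foreach(_.shutdown())
    masterRpcEnvs.foreach(_.shutdown())

    masterRpcEnvs.clear()
    workerRpcEnvs.clear()
  }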

Note
stop is used when…​FIXME