Class SparkExecutionContext.SparkClusterConfig
java.lang.Object
    org.apache.sysds.runtime.controlprogram.context.SparkExecutionContext.SparkClusterConfig

Enclosing class:
SparkExecutionContext
public static class SparkExecutionContext.SparkClusterConfig
extends Object

Captures relevant Spark cluster configuration properties, e.g., memory budgets and degree of parallelism. This configuration abstracts over legacy (< Spark 1.6) and current configurations, providing a unified view.
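The typical entry point is to construct the config from a SparkConf and query the derived budgets. The following is a minimal sketch assuming a local master; the class name and property values (SparkClusterConfigExample, "2g", "2") are illustrative placeholders, not recommendations:

    import org.apache.spark.SparkConf;
    import org.apache.sysds.runtime.controlprogram.context.SparkExecutionContext.SparkClusterConfig;

    public class SparkClusterConfigExample {
        public static void main(String[] args) {
            // Illustrative settings; real deployments usually set executor
            // memory and cores via spark-submit or cluster defaults.
            SparkConf sconf = new SparkConf()
                .setMaster("local[4]")
                .setAppName("SparkClusterConfigExample")
                .set("spark.executor.memory", "2g")
                .set("spark.executor.cores", "2");

            // Derive the unified view of memory budgets and parallelism.
            SparkClusterConfig cc = new SparkClusterConfig(sconf);

            System.out.println("broadcast budget:    " + cc.getBroadcastMemoryBudget());
            System.out.println("data budget:         " + cc.getDataMemoryBudget(false, false));
            System.out.println("default parallelism: " + cc.getDefaultParallelism(false));
            System.out.println("num executors:       " + cc.getNumExecutors());
        }
    }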
Field Summary

Modifier and Type    Field
static long          RESERVED_SYSTEM_MEMORY_BYTES
Constructor Summary

Constructor
SparkClusterConfig()
SparkClusterConfig(org.apache.spark.SparkConf sconf)
Method Summary

Modifier and Type    Method
void                 analyzeSparkConfiguation(org.apache.spark.SparkConf conf)
void                 analyzeSparkConfiguationLegacy(org.apache.spark.SparkConf conf)
long                 getBroadcastMemoryBudget()
long                 getDataMemoryBudget(boolean min, boolean refresh)
int                  getDefaultParallelism(boolean refresh)
int                  getNumExecutors()
String               toString()
Field Detail

RESERVED_SYSTEM_MEMORY_BYTES

public static final long RESERVED_SYSTEM_MEMORY_BYTES

See Also:
Constant Field Values

Method Detail
getBroadcastMemoryBudget
public long getBroadcastMemoryBudget()
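The source provides no per-method description. As a hedged sketch, the returned budget can gate broadcast-based strategies; the helper class and the caller-supplied size estimate below are hypothetical:

    import org.apache.sysds.runtime.controlprogram.context.SparkExecutionContext.SparkClusterConfig;

    public class BroadcastGate {
        // Use a broadcast-based strategy only if the estimated object
        // size fits within the configured broadcast memory budget.
        public static boolean fitsBroadcastBudget(SparkClusterConfig cc, long estimatedSizeBytes) {
            return estimatedSizeBytes <= cc.getBroadcastMemoryBudget();
        }
    }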
 
getDataMemoryBudget
public long getDataMemoryBudget(boolean min, boolean refresh) 
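The min and refresh flags are not documented here; a plausible reading, stated as an assumption, is that min selects the conservative (minimum) budget and refresh re-reads live cluster state, which matters under dynamic executor allocation. The probe class below is hypothetical:

    import org.apache.sysds.runtime.controlprogram.context.SparkExecutionContext.SparkClusterConfig;

    public class DataBudgetProbe {
        // Compare flag combinations; the flag semantics described above
        // are an assumption, not taken from the source documentation.
        public static void printBudgets(SparkClusterConfig cc) {
            System.out.println("max, cached:    " + cc.getDataMemoryBudget(false, false));
            System.out.println("min, refreshed: " + cc.getDataMemoryBudget(true, true));
        }
    }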
getNumExecutors
public int getNumExecutors()
 
getDefaultParallelism
public int getDefaultParallelism(boolean refresh)
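One natural use, sketched under the assumption that refresh=true re-derives the value from the current cluster state rather than a cached snapshot, is as a repartitioning hint; the helper class is hypothetical:

    import org.apache.spark.api.java.JavaRDD;
    import org.apache.sysds.runtime.controlprogram.context.SparkExecutionContext.SparkClusterConfig;

    public class ParallelismHint {
        // Repartition an RDD to match the cluster's degree of parallelism.
        public static <T> JavaRDD<T> repartitionToCluster(JavaRDD<T> rdd, SparkClusterConfig cc) {
            return rdd.repartition(cc.getDefaultParallelism(true));
        }
    }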
 
analyzeSparkConfiguationLegacy
public void analyzeSparkConfiguationLegacy(org.apache.spark.SparkConf conf)
 
analyzeSparkConfiguation
public void analyzeSparkConfiguation(org.apache.spark.SparkConf conf)
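Per the class description, the legacy variant covers pre-Spark-1.6 configurations while this method handles current ones. Below is a sketch of re-analyzing after a configuration change; the memory values are placeholders and the printed budgets depend on the actual deployment:

    import org.apache.spark.SparkConf;
    import org.apache.sysds.runtime.controlprogram.context.SparkExecutionContext.SparkClusterConfig;

    public class Reanalyze {
        public static void main(String[] args) {
            SparkConf sconf = new SparkConf()
                .setMaster("local[2]")
                .setAppName("Reanalyze")
                .set("spark.executor.memory", "1g");

            SparkClusterConfig cc = new SparkClusterConfig(sconf);
            System.out.println("before: " + cc.getDataMemoryBudget(false, false));

            // Re-derive the unified view after a configuration change.
            sconf.set("spark.executor.memory", "4g");
            cc.analyzeSparkConfiguation(sconf);
            System.out.println("after:  " + cc.getDataMemoryBudget(false, false));
        }
    }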
 