public abstract class SparkCommand extends AbstractSyntaxTree
A Spark command corresponds to a MapPhysicalNode, which in turn corresponds to a MapConnectorPoint in a MapComponent. For example, filter, join, and aggregate components each have just one output connector point, and each corresponds to one SparkCommand used to generate code for that component. A splitter component may have more than one output connector point, so multiple SparkCommand objects are created, one for each output connector point, in order to generate code reflecting the semantics of those connector points.
A SparkCommand normally generates multiple lines of PySpark code.
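The one-command-per-output-connector-point relationship can be sketched with a toy model. ToyCommand, the connector-point names, and the generated lines below are hypothetical illustrations, not part of the ODI API:

```java
import java.util.*;

// Toy model, not the ODI API: each output connector point of a
// component gets its own command object, and each command emits
// its own block of generated PySpark lines.
class ToyCommand {
    final String outputPoint; // name of the output connector point

    ToyCommand(String outputPoint) {
        this.outputPoint = outputPoint;
    }

    // A real SparkCommand normally generates multiple lines of PySpark.
    List<String> generate() {
        return List.of(
            outputPoint + " = upstream.filter(...)",
            outputPoint + ".cache()");
    }
}

public class SplitterDemo {
    public static void main(String[] args) {
        // A filter component has one output point -> one command;
        // a splitter with two output points -> two commands.
        List<ToyCommand> splitterCommands = List.of(
            new ToyCommand("SPLIT_OUT1"), new ToyCommand("SPLIT_OUT2"));
        for (ToyCommand c : splitterCommands)
            c.generate().forEach(System.out::println);
    }
}
```

This mirrors the rule described above: the number of SparkCommand objects created for a component equals its number of output connector points.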
Nested classes inherited from class AbstractSyntaxTree: AbstractSyntaxTree.LogCounter
Modifier and Type | Method and Description
---|---
java.util.List | convertExprs(MapConnectorPoint scopingInputPoint, java.util.List exprs, boolean referenceOutputFields): Convert a list of mapping expressions into a list of expression texts.
java.util.List | convertExprsDF(MapConnectorPoint scopingInputPoint, java.util.List exprs, boolean referenceOutputFields, boolean qualified): Convert a list of mapping expressions into a list of expression texts, specific to DataFrames.
java.util.List | getChildren(): Get the list of child objects owned by this substitution API object.
int | getCommandIndex(): Get the index of this command.
java.lang.String | getExprText(MapConnectorPoint scopingInputPoint, MapExpression expr): Return the expression text for a given map expression.
java.lang.String | getExprTextDF(MapConnectorPoint scopingInputPoint, MapExpression expr): Return the expression text for a given map expression, specific to DataFrames.
java.lang.String | getFlexField(java.lang.String flexFieldName, java.lang.String defaultFieldValue): Get the flexfield value by using the flexfield name.
java.util.HashMap | getModulePaths(): Get the import module paths; each context has a corresponding module path.
java.lang.String | getOptionValue(java.lang.String optionName): Get the option value from a physical node by using the option name.
SparkScript | getParentScript(): Get the parent script.
MapPhysicalNode | getPhysicalNode(): Return the physical node referenced by this command.
java.lang.String[] | getScriptLevelCommonContext(): Get the common context at the script level.
java.lang.String | getSourceAlias(): Get the source alias.
java.util.List | getSourceAliases(): Get the source aliases of this BigData command.
java.lang.String | getSparkContextName()
java.lang.String | getSparkContextType()
java.lang.String | getStrctType(java.lang.String builtinType): Get the Spark-valid struct DataType for a built-in type.
boolean | getStreamEnableWindowing(): Get the enable-windowing flag.
java.lang.Integer | getStreamSlideInterval(): Get the stream slide interval; used only in streaming mode.
java.lang.Integer | getStreamWindowLength(): Get the length of the stream window.
java.lang.String | getTargetAlias(): Get the target alias for this command.
java.util.Map | getTemplateSubstitutionMap(): Get a hash map whose keys are the built-in template substitution variable names and whose values are the substitution variable values.
java.lang.String | getText(): Return the text of the Spark command.
java.lang.String | getType(): Get the value of SparkCommandType.
abstract SparkCommandType | getTypeEnum(): Get the enum value of SparkCommandType.
boolean | isStreamingMode(): Return true if the corresponding script is in streaming mode.
void | setSourceAliases(java.util.List sourceAliases): Set the source aliases to a new list of Strings.
java.lang.String | toString(): Return basic information about the SparkCommand with a description of its contents.
boolean | useSQLExpressions(): Whether SQL expressions are used.
Methods inherited from class AbstractSyntaxTree: getCodeGenerationTemplate, getCodeGenerationTemplateName, getCustomTemplate, getKMName, getLeafLevelChildren, getMapPhysicalNode, getOrder, getParentAST, getParentOfType, getPropertyValue, getSourceLanguage, getSourceLocation, getSourceTechnology, getSourceText, getTargetLanguage, getTargetLocation, getTargetTechnology, getTargetText, hasCustomTemplate, hasSourceAndTargetText, isLeafLevelNode, isPushFromSource
public MapPhysicalNode getPhysicalNode()
public SparkScript getParentScript()
Returns: the parent SparkScript
public java.util.List getSourceAliases()
public void setSourceAliases(java.util.List sourceAliases)
Parameters: sourceAliases - the new source aliases to set
public java.lang.String getSourceAlias()
public java.lang.String getTargetAlias()
public boolean useSQLExpressions()
public abstract SparkCommandType getTypeEnum()
public java.lang.String getType()
Overrides: getType in class AbstractSyntaxTree
public java.util.List getChildren()
Description copied from class: AbstractSyntaxTree
Overrides: getChildren in class AbstractSyntaxTree
public java.util.HashMap getModulePaths()
public boolean isStreamingMode()
public java.lang.String getOptionValue(java.lang.String optionName) throws GenerationException
Parameters: optionName - the given option name
Throws: GenerationException
public java.lang.String[] getScriptLevelCommonContext()
public java.lang.String getFlexField(java.lang.String flexFieldName, java.lang.String defaultFieldValue)
Parameters: flexFieldName - the given flexfield name
defaultFieldValue - the default flexfield value
public boolean getStreamEnableWindowing()
public java.lang.Integer getStreamWindowLength()
public java.lang.Integer getStreamSlideInterval()
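The window length and slide interval follow standard sliding-window semantics: a new window starts every slide interval and covers one window length, so consecutive windows overlap when the slide interval is shorter than the window length. A minimal illustration in plain Java (a sketch of the semantics only, not generated PySpark):

```java
import java.util.*;

public class SlidingWindows {
    // Return the [start, end) ranges of the first n sliding windows,
    // given a window length and slide interval (e.g. in seconds):
    // window i covers [i * slideInterval, i * slideInterval + windowLength).
    static List<int[]> windows(int windowLength, int slideInterval, int n) {
        List<int[]> result = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            int start = i * slideInterval;
            result.add(new int[] { start, start + windowLength });
        }
        return result;
    }

    public static void main(String[] args) {
        // windowLength=30, slideInterval=10: overlapping 30-unit
        // windows starting every 10 units.
        for (int[] w : windows(30, 10, 3))
            System.out.println("[" + w[0] + ", " + w[1] + ")");
    }
}
```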
public java.util.Map getTemplateSubstitutionMap()
Description copied from class: AbstractSyntaxTree
Overrides: getTemplateSubstitutionMap in class AbstractSyntaxTree
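The substitution map drives template expansion: each built-in variable name appearing in a code-generation template is replaced with its mapped value. A minimal sketch of that mechanism (the `${...}` syntax, variable names, and `substitute` helper below are hypothetical, not ODI's actual template engine):

```java
import java.util.*;

public class TemplateSubstitution {
    // Replace each occurrence of ${name} in the template with the
    // value mapped to "name"; unmapped variables are left untouched.
    static String substitute(String template, Map<String, String> vars) {
        String out = template;
        for (Map.Entry<String, String> e : vars.entrySet())
            out = out.replace("${" + e.getKey() + "}", e.getValue());
        return out;
    }

    public static void main(String[] args) {
        // Hypothetical substitution variables.
        Map<String, String> vars = new HashMap<>();
        vars.put("TARGET_ALIAS", "OUT_DF");
        vars.put("SOURCE_ALIAS", "IN_DF");
        System.out.println(substitute(
            "${TARGET_ALIAS} = ${SOURCE_ALIAS}.filter(cond)", vars));
        // -> OUT_DF = IN_DF.filter(cond)
    }
}
```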
public java.lang.String getText() throws GenerationException
Overrides: getText in class AbstractSyntaxTree
Throws: GenerationException
public java.lang.String getExprText(MapConnectorPoint scopingInputPoint, MapExpression expr) throws GenerationException
Parameters:
scopingInputPoint - if not null, specifies a mapping path that includes the specified input connector point. The scoping point is an input point owned by the same component that owns the expression.
expr - the MapExpression
Throws: GenerationException
public java.lang.String getExprTextDF(MapConnectorPoint scopingInputPoint, MapExpression expr) throws GenerationException
Parameters:
scopingInputPoint - if not null, specifies a mapping path that includes the specified input connector point. The scoping point is an input point owned by the same component that owns the expression.
expr - the MapExpression
Throws: GenerationException
public java.util.List convertExprs(MapConnectorPoint scopingInputPoint, java.util.List exprs, boolean referenceOutputFields) throws GenerationException
Parameters:
scopingInputPoint - if not null, specifies a mapping path that includes the specified input connector point. The scoping point is an input point owned by the same component that owns the expression.
exprs - the list of mapping expressions
referenceOutputFields - if true, the expressions reference output fields; if false, only input fields are referenced.
Throws: GenerationException
public java.util.List convertExprsDF(MapConnectorPoint scopingInputPoint, java.util.List exprs, boolean referenceOutputFields, boolean qualified) throws GenerationException
Parameters:
scopingInputPoint - if not null, specifies a mapping path that includes the specified input connector point. The scoping point is an input point owned by the same component that owns the expression.
exprs - the list of mapping expressions
referenceOutputFields - if true, the expressions reference output fields; if false, only input fields are referenced.
qualified - if true, fields referenced in the expression will be prefixed with the DataFrame name.
Throws: GenerationException
public java.lang.String getStrctType(java.lang.String builtinType)
public java.lang.String getSparkContextType()
public java.lang.String getSparkContextName()
public java.lang.String toString()
Overrides: toString in class java.lang.Object
public int getCommandIndex()