Modifier and Type | Method and Description
---|---
TypeInformation<Row> | HBaseRowInputFormat.getProducedType()
TypeInformation<Row> | HBaseTableSource.getReturnType()
Modifier and Type | Method and Description
---|---
TypeInformation[] | DataDistribution.getKeyTypes(): Gets the types of the key by which the data set is partitioned.
Modifier and Type | Field and Description
---|---
protected TypeInformation<IN> | UnaryOperatorInformation.inputType: Input type of the operator.
protected TypeInformation<IN1> | BinaryOperatorInformation.inputType1: Input type of the first input.
protected TypeInformation<IN2> | BinaryOperatorInformation.inputType2: Input type of the second input.
protected TypeInformation<OUT> | OperatorInformation.outputType: Output type of the operator.
Modifier and Type | Method and Description
---|---
TypeInformation<IN1> | BinaryOperatorInformation.getFirstInputType()
TypeInformation<IN> | UnaryOperatorInformation.getInputType()
TypeInformation<T> | Keys.SelectorFunctionKeys.getInputType()
abstract TypeInformation<?>[] | Keys.getKeyFieldTypes()
TypeInformation<?>[] | Keys.SelectorFunctionKeys.getKeyFieldTypes()
TypeInformation<?>[] | Keys.ExpressionKeys.getKeyFieldTypes()
TypeInformation<K> | Keys.SelectorFunctionKeys.getKeyType()
abstract TypeInformation<?>[] | Keys.getOriginalKeyFieldTypes()
TypeInformation<?>[] | Keys.SelectorFunctionKeys.getOriginalKeyFieldTypes()
TypeInformation<?>[] | Keys.ExpressionKeys.getOriginalKeyFieldTypes()
TypeInformation<OUT> | OperatorInformation.getOutputType(): Gets the return type of the user code function.
TypeInformation<IN2> | BinaryOperatorInformation.getSecondInputType()
Modifier and Type | Method and Description
---|---
static boolean | Keys.ExpressionKeys.isSortKey(int fieldPos, TypeInformation<?> type)
static boolean | Keys.ExpressionKeys.isSortKey(String fieldExpr, TypeInformation<?> type)
abstract <E> void | Keys.validateCustomPartitioner(Partitioner<E> partitioner, TypeInformation<E> typeInfo)
<E> void | Keys.SelectorFunctionKeys.validateCustomPartitioner(Partitioner<E> partitioner, TypeInformation<E> typeInfo)
<E> void | Keys.ExpressionKeys.validateCustomPartitioner(Partitioner<E> partitioner, TypeInformation<E> typeInfo)
Constructor and Description
---
BinaryOperatorInformation(TypeInformation<IN1> inputType1, TypeInformation<IN2> inputType2, TypeInformation<OUT> outputType)
ExpressionKeys(int[] keyPositions, TypeInformation<T> type): Create int-based (non-nested) field position keys on a tuple type.
ExpressionKeys(int[] keyPositions, TypeInformation<T> type, boolean allowEmpty): Create int-based (non-nested) field position keys on a tuple type.
ExpressionKeys(int keyPosition, TypeInformation<T> type): Create int-based (non-nested) field position keys on a tuple type.
ExpressionKeys(String[] keyExpressions, TypeInformation<T> type): Create String-based (nested) field expression keys on a composite type.
ExpressionKeys(String keyExpression, TypeInformation<T> type): Create String-based (nested) field expression keys on a composite type.
ExpressionKeys(TypeInformation<T> type): ExpressionKeys that is defined by the full data type.
IncompatibleKeysException(TypeInformation<?> typeInformation, TypeInformation<?> typeInformation2)
OperatorInformation(TypeInformation<OUT> outputType)
SelectorFunctionKeys(KeySelector<T,K> keyExtractor, TypeInformation<T> inputType, TypeInformation<K> keyType)
UnaryOperatorInformation(TypeInformation<IN> inputType, TypeInformation<OUT> outputType)
Constructor and Description
---
AggregatingStateDescriptor(String name, AggregateFunction<IN,ACC,OUT> aggFunction, TypeInformation<ACC> stateType): Creates a new AggregatingStateDescriptor with the given name and aggregate function.
FoldingStateDescriptor(String name, ACC initialValue, FoldFunction<T,ACC> foldFunction, TypeInformation<ACC> typeInfo): Deprecated. Creates a new FoldingStateDescriptor with the given name and default value.
ListStateDescriptor(String name, TypeInformation<T> elementTypeInfo): Creates a new ListStateDescriptor with the given name and list element type.
MapStateDescriptor(String name, TypeInformation<UK> keyTypeInfo, TypeInformation<UV> valueTypeInfo): Creates a new MapStateDescriptor with the given name and the given type information.
ReducingStateDescriptor(String name, ReduceFunction<T> reduceFunction, TypeInformation<T> typeInfo): Creates a new ReducingStateDescriptor with the given name and reduce function.
StateDescriptor(String name, TypeInformation<T> typeInfo, T defaultValue): Creates a new StateDescriptor with the given name and the given type information.
ValueStateDescriptor(String name, TypeInformation<T> typeInfo): Creates a new ValueStateDescriptor with the given name and type.
ValueStateDescriptor(String name, TypeInformation<T> typeInfo, T defaultValue): Deprecated. Use ValueStateDescriptor(String, TypeInformation) instead and manually manage the default value by checking whether the contents of the state is null.
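As a hedged sketch of the constructors above (assuming a Flink 1.x dependency on the classpath), state descriptors pair a name with explicit TypeInformation for the state's values:

```java
// Sketch only: creating state descriptors with explicit TypeInformation.
// Assumes org.apache.flink:flink-core (1.x) is available.
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;

public class StateDescriptorExample {
    public static void main(String[] args) {
        // ValueStateDescriptor(String name, TypeInformation<T> typeInfo)
        ValueStateDescriptor<Long> count =
                new ValueStateDescriptor<>("count", BasicTypeInfo.LONG_TYPE_INFO);

        // MapStateDescriptor(String name, TypeInformation<UK> keyTypeInfo,
        //                    TypeInformation<UV> valueTypeInfo)
        MapStateDescriptor<String, Long> perKey =
                new MapStateDescriptor<>("per-key",
                        BasicTypeInfo.STRING_TYPE_INFO,
                        BasicTypeInfo.LONG_TYPE_INFO);

        System.out.println(count.getName() + " / " + perKey.getName());
    }
}
```

Passing TypeInformation directly (rather than a Class) avoids a later type-extraction step and is required when the state type is generic.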
Modifier and Type | Class and Description
---|---
class | BasicArrayTypeInfo<T,C>
class | BasicTypeInfo<T>: Type information for primitive types (int, long, double, byte, ...), String, Date, Void, BigInteger, and BigDecimal.
class | FractionalTypeInfo<T>: Type information for numeric fractional primitive types (double, float).
class | IntegerTypeInfo<T>: Type information for numeric integer primitive types: int, long, byte, short, character.
class | NothingTypeInfo: Placeholder type information for the Nothing type.
class | NumericTypeInfo<T>: Type information for numeric primitive types: int, long, double, byte, short, float, char.
class | PrimitiveArrayTypeInfo<T>: A TypeInformation for arrays of primitive types (int, long, double, ...).
class | SqlTimeTypeInfo<T>: Type information for Java SQL Date/Time/Timestamp.
Modifier and Type | Method and Description
---|---
abstract TypeInformation<T> | TypeInfoFactory.createTypeInfo(Type t, Map<String,TypeInformation<?>> genericParameters): Creates type information for the type the factory is targeted for.
TypeInformation<C> | BasicArrayTypeInfo.getComponentInfo()
TypeInformation<?> | PrimitiveArrayTypeInfo.getComponentType(): Gets the type information of the component type.
TypeInformation<T> | TypeHint.getTypeInfo(): Gets the type information described by this TypeHint.
static <T> TypeInformation<T> | TypeInformation.of(Class<T> typeClass): Creates a TypeInformation for the type described by the given class.
static <T> TypeInformation<T> | TypeInformation.of(TypeHint<T> typeHint): Creates a TypeInformation for a generic type via a utility "type hint".
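The two TypeInformation.of overloads above differ in how much generic information they can carry; a minimal sketch (assuming a Flink 1.x dependency):

```java
// Sketch only: TypeInformation.of(Class) vs. TypeInformation.of(TypeHint).
// A plain Class loses generic parameters to erasure; a TypeHint subclass keeps them.
import java.util.List;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;

public class TypeInfoOfExample {
    public static void main(String[] args) {
        // Non-generic type: the Class overload is sufficient.
        TypeInformation<String> stringInfo = TypeInformation.of(String.class);

        // Generic type: the anonymous TypeHint subclass preserves List<String>,
        // which List.class alone could not express.
        TypeInformation<List<String>> listInfo =
                TypeInformation.of(new TypeHint<List<String>>() {});

        System.out.println(stringInfo + " / " + listInfo);
    }
}
```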
Modifier and Type | Method and Description
---|---
Map<String,TypeInformation<?>> | TypeInformation.getGenericParameters(): Optional method for giving Flink's type extraction system information about the mapping of a generic type parameter to the type information of a subtype.
Modifier and Type | Method and Description
---|---
static RowTypeInfo | Types.ROW_NAMED(String[] fieldNames, TypeInformation<?>... types): Generates a RowTypeInfo with fields of the given types and with given names.
static RowTypeInfo | Types.ROW(TypeInformation<?>... types): Generates a RowTypeInfo with fields of the given types.
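A RowTypeInfo like the ones Types.ROW and Types.ROW_NAMED produce can also be built directly from its constructors; a minimal sketch (assuming a Flink 1.x dependency, and using the RowTypeInfo constructors rather than the Types helpers, whose package varies across Flink versions):

```java
// Sketch only: building RowTypeInfo with positional and with named fields.
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.RowTypeInfo;

public class RowTypeInfoExample {
    public static void main(String[] args) {
        // Positional fields, auto-named f0, f1 (what Types.ROW produces).
        RowTypeInfo positional = new RowTypeInfo(
                BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.INT_TYPE_INFO);

        // Explicitly named fields (what Types.ROW_NAMED produces).
        RowTypeInfo named = new RowTypeInfo(
                new TypeInformation<?>[] {
                        BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.INT_TYPE_INFO},
                new String[] {"name", "age"});

        System.out.println(positional + " / " + named);
    }
}
```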
Modifier and Type | Method and Description
---|---
abstract TypeInformation<T> | TypeInfoFactory.createTypeInfo(Type t, Map<String,TypeInformation<?>> genericParameters): Creates type information for the type the factory is targeted for.
Modifier and Type | Class and Description
---|---
class | CompositeType<T>: Base type information class for Tuple and POJO types. The class also takes care of serialization of and comparators for tuples.
Modifier and Type | Method and Description
---|---
TypeInformation<?> | CompositeType.FlatFieldDescriptor.getType()
abstract <X> TypeInformation<X> | CompositeType.getTypeAt(int pos): Returns the type of the (unnested) field at the given field position.
abstract <X> TypeInformation<X> | CompositeType.getTypeAt(String fieldExpression): Returns the type of the (nested) field at the given field expression position.
Constructor and Description
---
FlatFieldDescriptor(int keyPosition, TypeInformation<?> type)
Modifier and Type | Method and Description
---|---
TypeInformation<T> | DataSet.getType(): Returns the TypeInformation for the type of this DataSet.
Modifier and Type | Method and Description
---|---
<X> DataSource<X> | ExecutionEnvironment.createInput(InputFormat<X,?> inputFormat, TypeInformation<X> producedType): Generic method to create an input DataSet with an InputFormat.
protected void | DataSet.fillInType(TypeInformation<T> typeInfo): Tries to fill in the type information.
<X> DataSource<X> | ExecutionEnvironment.fromCollection(Collection<X> data, TypeInformation<X> type): Creates a DataSet from the given non-empty collection.
<X> DataSource<X> | ExecutionEnvironment.fromCollection(Iterator<X> data, TypeInformation<X> type): Creates a DataSet from the given iterator.
<X> DataSource<X> | ExecutionEnvironment.fromParallelCollection(SplittableIterator<X> iterator, TypeInformation<X> type): Creates a new data set that contains elements in the iterator.
static <T> String | Utils.getSerializerTree(TypeInformation<T> ti): Debugging utility to understand the hierarchy of serializers created by the Java API.
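The fromCollection overloads above accept explicit TypeInformation for cases where the element type cannot be extracted from the collection itself (e.g. generic elements); a minimal sketch, assuming the Flink Java DataSet API is on the classpath:

```java
// Sketch only: supplying explicit TypeInformation when creating a DataSet
// from a collection. Assumes org.apache.flink:flink-java (1.x).
import java.util.Arrays;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class FromCollectionExample {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // The explicit type info fixes the element type up front, so no
        // reflective extraction on the collection's elements is needed.
        DataSet<String> words = env.fromCollection(
                Arrays.asList("to", "be", "or", "not"),
                BasicTypeInfo.STRING_TYPE_INFO);

        words.print();
    }
}
```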
Constructor and Description
---
DataSet(ExecutionEnvironment context, TypeInformation<T> typeInfo)
Modifier and Type | Method and Description
---|---
static DualInputSemanticProperties | SemanticPropUtil.createProjectionPropertiesDual(int[] fields, boolean[] isFromFirst, TypeInformation<?> inType1, TypeInformation<?> inType2)
static DualInputSemanticProperties | SemanticPropUtil.getSemanticPropsDual(Set<Annotation> set, TypeInformation<?> inType1, TypeInformation<?> inType2, TypeInformation<?> outType)
static void | SemanticPropUtil.getSemanticPropsDualFromString(DualInputSemanticProperties result, String[] forwardedFirst, String[] forwardedSecond, String[] nonForwardedFirst, String[] nonForwardedSecond, String[] readFieldsFirst, String[] readFieldsSecond, TypeInformation<?> inType1, TypeInformation<?> inType2, TypeInformation<?> outType)
static void | SemanticPropUtil.getSemanticPropsDualFromString(DualInputSemanticProperties result, String[] forwardedFirst, String[] forwardedSecond, String[] nonForwardedFirst, String[] nonForwardedSecond, String[] readFieldsFirst, String[] readFieldsSecond, TypeInformation<?> inType1, TypeInformation<?> inType2, TypeInformation<?> outType, boolean skipIncompatibleTypes)
static SingleInputSemanticProperties | SemanticPropUtil.getSemanticPropsSingle(Set<Annotation> set, TypeInformation<?> inType, TypeInformation<?> outType)
static void | SemanticPropUtil.getSemanticPropsSingleFromString(SingleInputSemanticProperties result, String[] forwarded, String[] nonForwarded, String[] readSet, TypeInformation<?> inType, TypeInformation<?> outType)
static void | SemanticPropUtil.getSemanticPropsSingleFromString(SingleInputSemanticProperties result, String[] forwarded, String[] nonForwarded, String[] readSet, TypeInformation<?> inType, TypeInformation<?> outType, boolean skipIncompatibleTypes)
Modifier and Type | Method and Description
---|---
TypeInformation<Tuple2<K,V>> | HadoopInputFormat.getProducedType()
Modifier and Type | Method and Description
---|---
TypeInformation<Tuple2<K,V>> | HadoopInputFormat.getProducedType()
Modifier and Type | Method and Description
---|---
TypeInformation<T> | TypeSerializerInputFormat.getProducedType()
TypeInformation<Row> | RowCsvInputFormat.getProducedType()
TypeInformation<E> | AvroInputFormat.getProducedType()
Modifier and Type | Method and Description
---|---
void | TypeSerializerOutputFormat.setInputType(TypeInformation<?> type, ExecutionConfig executionConfig)
void | LocalCollectionOutputFormat.setInputType(TypeInformation<?> type, ExecutionConfig executionConfig)
void | CsvOutputFormat.setInputType(TypeInformation<?> type, ExecutionConfig executionConfig): The purpose of this method is solely to check whether the data type to be processed is in fact a tuple type.
Constructor and Description
---
RowCsvInputFormat(Path filePath, TypeInformation[] fieldTypes)
RowCsvInputFormat(Path filePath, TypeInformation[] fieldTypes, boolean emptyColumnAsNull)
RowCsvInputFormat(Path filePath, TypeInformation[] fieldTypes, int[] selectedFields)
RowCsvInputFormat(Path filePath, TypeInformation[] fieldTypes, String lineDelimiter, String fieldDelimiter)
RowCsvInputFormat(Path filePath, TypeInformation[] fieldTypes, String lineDelimiter, String fieldDelimiter, int[] selectedFields)
RowCsvInputFormat(Path filePath, TypeInformation[] fieldTypeInfos, String lineDelimiter, String fieldDelimiter, int[] selectedFields, boolean emptyColumnAsNull)
SplitDataProperties(TypeInformation<T> type): Creates SplitDataProperties for the given data types.
TypeSerializerInputFormat(TypeInformation<T> resultType)
Modifier and Type | Method and Description
---|---
static <T,K> TypeInformation<Tuple2<K,T>> | KeyFunctions.createTypeWithKey(Keys.SelectorFunctionKeys<T,K> key)
static <T,K1,K2> TypeInformation<Tuple3<K1,K2,T>> | KeyFunctions.createTypeWithKey(Keys.SelectorFunctionKeys<T,K1> key1, Keys.SelectorFunctionKeys<T,K2> key2)
TypeInformation<IN1> | TwoInputOperator.getInput1Type(): Gets the type information of the data type of the first input data set.
TypeInformation<IN2> | TwoInputOperator.getInput2Type(): Gets the type information of the data type of the second input data set.
TypeInformation<IN> | SingleInputOperator.getInputType(): Gets the type information of the data type of the input data set.
TypeInformation<OUT> | Operator.getResultType(): Returns the type of the result of this operator.
TypeInformation<T> | DataSink.getType()
TypeInformation<WT> | DeltaIterationResultSet.getWorksetType()
Modifier and Type | Method and Description
---|---
O | TwoInputUdfOperator.returns(TypeInformation<OUT> typeInfo): Adds a type information hint about the return type of this operator.
O | SingleInputUdfOperator.returns(TypeInformation<OUT> typeInfo): Adds a type information hint about the return type of this operator.
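The returns(...) hint above is the usual remedy when a lambda's generic return type is erased and the extractor cannot recover it; a minimal sketch, assuming the Flink Java DataSet API is on the classpath:

```java
// Sketch only: giving an operator an explicit return-type hint via returns(...).
// Assumes org.apache.flink:flink-java (1.x).
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;

public class ReturnsHintExample {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        DataSet<Tuple2<String, Integer>> pairs = env
                .fromElements("a", "b", "a")
                .map(s -> new Tuple2<>(s, 1))
                // Without this hint the type extractor cannot recover
                // Tuple2<String, Integer> from the erased lambda signature.
                .returns(TypeInformation.of(new TypeHint<Tuple2<String, Integer>>() {}));

        pairs.print();
    }
}
```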
Constructor and Description
---
CoGroupOperator(DataSet<I1> input1, DataSet<I2> input2, Keys<I1> keys1, Keys<I2> keys2, CoGroupFunction<I1,I2,OUT> function, TypeInformation<OUT> returnType, List<org.apache.commons.lang3.tuple.Pair<Integer,Order>> groupSortKeyOrderFirst, List<org.apache.commons.lang3.tuple.Pair<Integer,Order>> groupSortKeyOrderSecond, Partitioner<?> customPartitioner, String defaultName)
CoGroupOperator(DataSet<I1> input1, DataSet<I2> input2, Keys<I1> keys1, Keys<I2> keys2, CoGroupFunction<I1,I2,OUT> function, TypeInformation<OUT> returnType, Partitioner<?> customPartitioner, String defaultName)
CoGroupRawOperator(DataSet<I1> input1, DataSet<I2> input2, Keys<I1> keys1, Keys<I2> keys2, CoGroupFunction<I1,I2,OUT> function, TypeInformation<OUT> returnType, String defaultName)
CrossOperator(DataSet<I1> input1, DataSet<I2> input2, CrossFunction<I1,I2,OUT> function, TypeInformation<OUT> returnType, CrossOperatorBase.CrossHint hint, String defaultName)
DataSink(DataSet<T> data, OutputFormat<T> format, TypeInformation<T> type)
DataSource(ExecutionEnvironment context, InputFormat<OUT,?> inputFormat, TypeInformation<OUT> type, String dataSourceLocationName): Creates a new data source.
DeltaIteration(ExecutionEnvironment context, TypeInformation<ST> type, DataSet<ST> solutionSet, DataSet<WT> workset, Keys<ST> keys, int maxIterations)
EquiJoin(DataSet<I1> input1, DataSet<I2> input2, Keys<I1> keys1, Keys<I2> keys2, FlatJoinFunction<I1,I2,OUT> generatedFunction, JoinFunction<I1,I2,OUT> function, TypeInformation<OUT> returnType, JoinOperatorBase.JoinHint hint, String joinLocationName)
EquiJoin(DataSet<I1> input1, DataSet<I2> input2, Keys<I1> keys1, Keys<I2> keys2, FlatJoinFunction<I1,I2,OUT> generatedFunction, JoinFunction<I1,I2,OUT> function, TypeInformation<OUT> returnType, JoinOperatorBase.JoinHint hint, String joinLocationName, JoinType type)
EquiJoin(DataSet<I1> input1, DataSet<I2> input2, Keys<I1> keys1, Keys<I2> keys2, FlatJoinFunction<I1,I2,OUT> function, TypeInformation<OUT> returnType, JoinOperatorBase.JoinHint hint, String joinLocationName)
EquiJoin(DataSet<I1> input1, DataSet<I2> input2, Keys<I1> keys1, Keys<I2> keys2, FlatJoinFunction<I1,I2,OUT> function, TypeInformation<OUT> returnType, JoinOperatorBase.JoinHint hint, String joinLocationName, JoinType type)
FlatMapOperator(DataSet<IN> input, TypeInformation<OUT> resultType, FlatMapFunction<IN,OUT> function, String defaultName)
GroupCombineOperator(DataSet<IN> input, TypeInformation<OUT> resultType, GroupCombineFunction<IN,OUT> function, String defaultName): Constructor for a non-grouped reduce (all reduce).
GroupCombineOperator(Grouping<IN> input, TypeInformation<OUT> resultType, GroupCombineFunction<IN,OUT> function, String defaultName): Constructor for a grouped reduce.
GroupReduceOperator(DataSet<IN> input, TypeInformation<OUT> resultType, GroupReduceFunction<IN,OUT> function, String defaultName): Constructor for a non-grouped reduce (all reduce).
GroupReduceOperator(Grouping<IN> input, TypeInformation<OUT> resultType, GroupReduceFunction<IN,OUT> function, String defaultName): Constructor for a grouped reduce.
IterativeDataSet(ExecutionEnvironment context, TypeInformation<T> type, DataSet<T> input, int maxIterations)
JoinOperator(DataSet<I1> input1, DataSet<I2> input2, Keys<I1> keys1, Keys<I2> keys2, TypeInformation<OUT> returnType, JoinOperatorBase.JoinHint hint, JoinType type)
MapOperator(DataSet<IN> input, TypeInformation<OUT> resultType, MapFunction<IN,OUT> function, String defaultName)
MapPartitionOperator(DataSet<IN> input, TypeInformation<OUT> resultType, MapPartitionFunction<IN,OUT> function, String defaultName)
NoOpOperator(DataSet<IN> input, TypeInformation<IN> resultType)
Operator(ExecutionEnvironment context, TypeInformation<OUT> resultType)
PartitionOperator(DataSet<T> input, Keys<T> pKeys, Partitioner<P> customPartitioner, TypeInformation<P> partitionerTypeInfo, String partitionLocationName)
SingleInputOperator(DataSet<IN> input, TypeInformation<OUT> resultType)
SingleInputUdfOperator(DataSet<IN> input, TypeInformation<OUT> resultType): Creates a new operator with the given data set as input.
TwoInputOperator(DataSet<IN1> input1, DataSet<IN2> input2, TypeInformation<OUT> resultType)
TwoInputUdfOperator(DataSet<IN1> input1, DataSet<IN2> input2, TypeInformation<OUT> resultType): Creates a new operator with the two given data sets as inputs.
Modifier and Type | Method and Description
---|---
static TaggedValue | UdfAnalyzerUtils.convertTypeInfoToTaggedValue(TaggedValue.Input input, TypeInformation<?> typeInfo, String flatFieldExpr, List<CompositeType.FlatFieldDescriptor> flatFieldDesc, int[] groupedKeys)
Constructor and Description
---
UdfAnalyzer(Class<?> baseClass, Class<?> udfClass, String externalUdfName, TypeInformation<?> in1Type, TypeInformation<?> in2Type, TypeInformation<?> outType, Keys<?> keys1, Keys<?> keys2, boolean throwErrorExceptions)
Modifier and Type | Class and Description
---|---
class | AvroTypeInfo<T extends org.apache.avro.specific.SpecificRecordBase>: Special type information to generate an AvroTypeInfo for Avro POJOs (implementing SpecificRecordBase, the typed Avro POJOs). It uses a regular POJO type analysis and replaces all GenericType<CharSequence> with GenericType<avro.Utf8>.
class | EitherTypeInfo<L,R>: A TypeInformation for the Either type of the Java API.
class | EnumTypeInfo<T extends Enum<T>>: A TypeInformation for Java enumeration types.
class | GenericTypeInfo<T>
class | ListTypeInfo<T>: A TypeInformation for the list types of the Java API.
class | MapTypeInfo<K,V>: Special TypeInformation used by MapStateDescriptor.
class | MissingTypeInfo: A special type information signifying that the type extraction failed.
class | ObjectArrayTypeInfo<T,C>
class | PojoTypeInfo<T>: TypeInformation for "Java Beans"-style types.
class | RowTypeInfo: TypeInformation for Row.
class | TupleTypeInfo<T extends Tuple>: A TypeInformation for the tuple types of the Java API.
class | TupleTypeInfoBase<T>
class | ValueTypeInfo<T extends Value>: Type information for data types that extend the Value interface.
class | WritableTypeInfo<T extends org.apache.hadoop.io.Writable>: Type information for data types that extend Hadoop's Writable interface.
Modifier and Type | Field and Description
---|---
protected TypeInformation<?>[] | TupleTypeInfoBase.types
Modifier and Type | Method and Description |
---|---|
protected <OUT,IN1,IN2> |
TypeExtractor.analyzePojo(Class<OUT> clazz,
ArrayList<Type> typeHierarchy,
ParameterizedType parameterizedType,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <T> TypeInformation<T> |
TypeExtractor.createHadoopWritableTypeInfo(Class<T> clazz) |
static <IN1,IN2,OUT> |
TypeExtractor.createTypeInfo(Class<?> baseClass,
Class<?> clazz,
int returnParamPos,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <T> TypeInformation<T> |
TypeExtractor.createTypeInfo(Class<T> type) |
static <OUT> TypeInformation<OUT> |
TypeExtractor.createTypeInfo(Object instance,
Class<?> baseClass,
Class<?> clazz,
int returnParamPos)
Creates a
TypeInformation from the given parameters. |
static TypeInformation<?> |
TypeExtractor.createTypeInfo(Type t) |
TypeInformation<Either<L,R>> |
EitherTypeInfoFactory.createTypeInfo(Type t,
Map<String,TypeInformation<?>> genericParameters) |
static <IN,ACC> TypeInformation<ACC> |
TypeExtractor.getAggregateFunctionAccumulatorType(AggregateFunction<IN,ACC,?> function,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getAggregateFunctionReturnType(AggregateFunction<IN,?,OUT> function,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN1,IN2,OUT> |
TypeExtractor.getBinaryOperatorReturnType(Function function,
Class<?> baseClass,
int input1TypeArgumentIndex,
int input2TypeArgumentIndex,
int outputTypeArgumentIndex,
int[] lambdaInput1TypeArgumentIndices,
int[] lambdaInput2TypeArgumentIndices,
int[] lambdaOutputTypeArgumentIndices,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing)
Returns the binary operator's return type.
|
static <IN1,IN2,OUT> |
TypeExtractor.getCoGroupReturnTypes(CoGroupFunction<IN1,IN2,OUT> coGroupInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.getCoGroupReturnTypes(CoGroupFunction<IN1,IN2,OUT> coGroupInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
TypeInformation<C> |
ObjectArrayTypeInfo.getComponentInfo() |
static <IN1,IN2,OUT> |
TypeExtractor.getCrossReturnTypes(CrossFunction<IN1,IN2,OUT> crossInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.getCrossReturnTypes(CrossFunction<IN1,IN2,OUT> crossInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
TypeInformation<T> |
ListTypeInfo.getElementTypeInfo()
Gets the type information for the elements contained in the list
|
TypeInformation<?>[] |
RowTypeInfo.getFieldTypes()
Returns the field types of the row.
|
static <IN1,IN2,OUT> |
TypeExtractor.getFlatJoinReturnTypes(FlatJoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.getFlatJoinReturnTypes(FlatJoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatMapReturnTypes(FlatMapFunction<IN,OUT> flatMapInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatMapReturnTypes(FlatMapFunction<IN,OUT> flatMapInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFoldReturnTypes(FoldFunction<IN,OUT> foldInterface,
TypeInformation<IN> inType)
Deprecated.
will be removed in a future version
|
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFoldReturnTypes(FoldFunction<IN,OUT> foldInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing)
Deprecated.
will be removed in a future version
|
static <X> TypeInformation<X> |
TypeExtractor.getForClass(Class<X> clazz)
Creates type information from a given Class such as Integer, String[] or POJOs.
|
static <X> TypeInformation<X> |
TypeExtractor.getForObject(X value) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupCombineReturnTypes(GroupCombineFunction<IN,OUT> combineInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupCombineReturnTypes(GroupCombineFunction<IN,OUT> combineInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupReduceReturnTypes(GroupReduceFunction<IN,OUT> groupReduceInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupReduceReturnTypes(GroupReduceFunction<IN,OUT> groupReduceInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN> TypeInformation<IN> |
TypeExtractor.getInputFormatTypes(InputFormat<IN,?> inputFormatInterface) |
static <IN1,IN2,OUT> |
TypeExtractor.getJoinReturnTypes(JoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.getJoinReturnTypes(JoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getKeySelectorTypes(KeySelector<IN,OUT> selectorInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getKeySelectorTypes(KeySelector<IN,OUT> selectorInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
TypeInformation<K> |
MapTypeInfo.getKeyTypeInfo()
Gets the type information for the keys in the map
|
TypeInformation<L> |
EitherTypeInfo.getLeftType() |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapPartitionReturnTypes(MapPartitionFunction<IN,OUT> mapPartitionInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapPartitionReturnTypes(MapPartitionFunction<IN,OUT> mapPartitionInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapReturnTypes(MapFunction<IN,OUT> mapInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapReturnTypes(MapFunction<IN,OUT> mapInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <T> TypeInformation<T> |
TypeExtractor.getPartitionerTypes(Partitioner<T> partitioner) |
static <T> TypeInformation<T> |
TypeExtractor.getPartitionerTypes(Partitioner<T> partitioner,
String functionName,
boolean allowMissing) |
TypeInformation<T> |
ResultTypeQueryable.getProducedType()
Gets the data type (as a
TypeInformation ) produced by this function or input format. |
TypeInformation<R> |
EitherTypeInfo.getRightType() |
<X> TypeInformation<X> |
TupleTypeInfoBase.getTypeAt(int pos) |
<X> TypeInformation<X> |
PojoTypeInfo.getTypeAt(int pos) |
<X> TypeInformation<X> |
TupleTypeInfoBase.getTypeAt(String fieldExpression) |
<X> TypeInformation<X> |
RowTypeInfo.getTypeAt(String fieldExpression) |
<X> TypeInformation<X> |
PojoTypeInfo.getTypeAt(String fieldExpression) |
TypeInformation<?> |
PojoField.getTypeInformation() |
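The `getTypeAt(String fieldExpression)` variants resolve dotted expressions such as `"address.city"` to the type of the innermost field. A stdlib-only sketch of that navigation using plain reflection (the classes are illustrative, not Flink's implementation):

```java
import java.lang.reflect.Field;

public class FieldExpressionDemo {
    static class Address { String city; }
    static class User { String name; Address address; }

    // Resolve a dotted field expression to the type of the innermost field.
    static Class<?> typeAt(Class<?> root, String fieldExpression) throws Exception {
        Class<?> current = root;
        for (String part : fieldExpression.split("\\.")) {
            Field f = current.getDeclaredField(part);
            current = f.getType();
        }
        return current;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(typeAt(User.class, "address.city").getName());
    }
}
```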
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getUnaryOperatorReturnType(Function function,
Class<?> baseClass,
int inputTypeArgumentIndex,
int outputTypeArgumentIndex,
int[] lambdaInputTypeArgumentIndices,
int[] lambdaOutputTypeArgumentIndices,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing)
Returns the unary operator's return type.
|
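The `lambdaInputTypeArgumentIndices`/`lambdaOutputTypeArgumentIndices` parameters and the `allowMissing` flag exist because lambdas, unlike named classes, do not carry their generic signature at runtime. A stdlib-only demonstration of the difference (the interface is a local stand-in, not Flink's):

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;

public class LambdaErasureDemo {
    interface MapFunction<IN, OUT> { OUT map(IN value); }

    static class Named implements MapFunction<String, Integer> {
        public Integer map(String value) { return value.length(); }
    }

    // True if any implemented interface still exposes its type arguments.
    static boolean hasGenericSignature(Object fn) {
        for (Type t : fn.getClass().getGenericInterfaces()) {
            if (t instanceof ParameterizedType) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        MapFunction<String, Integer> lambda = s -> s.length();
        // A named class keeps its type arguments; the lambda's synthetic
        // class implements only the raw interface.
        System.out.println(hasGenericSignature(new Named()));
        System.out.println(hasGenericSignature(lambda));
    }
}
```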
TypeInformation<V> |
MapTypeInfo.getValueTypeInfo()
Gets the type information for the values in the map.
|
static <X> TypeInformation<X> |
TypeInfoParser.parse(String infoString)
Generates an instance of
TypeInformation by parsing a type
information string. |
Modifier and Type | Method and Description |
---|---|
Map<String,TypeInformation<?>> |
TupleTypeInfo.getGenericParameters() |
Map<String,TypeInformation<?>> |
EitherTypeInfo.getGenericParameters() |
Modifier and Type | Method and Description |
---|---|
protected <OUT,IN1,IN2> TypeInformation<OUT> |
TypeExtractor.analyzePojo(Class<OUT> clazz,
ArrayList<Type> typeHierarchy,
ParameterizedType parameterizedType,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.createTypeInfo(Class<?> baseClass,
Class<?> clazz,
int returnParamPos,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN,ACC> TypeInformation<ACC> |
TypeExtractor.getAggregateFunctionAccumulatorType(AggregateFunction<IN,ACC,?> function,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getAggregateFunctionReturnType(AggregateFunction<IN,?,OUT> function,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getBinaryOperatorReturnType(Function function,
Class<?> baseClass,
int input1TypeArgumentIndex,
int input2TypeArgumentIndex,
int outputTypeArgumentIndex,
int[] lambdaInput1TypeArgumentIndices,
int[] lambdaInput2TypeArgumentIndices,
int[] lambdaOutputTypeArgumentIndices,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing)
Returns the binary operator's return type.
|
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getCoGroupReturnTypes(CoGroupFunction<IN1,IN2,OUT> coGroupInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getCoGroupReturnTypes(CoGroupFunction<IN1,IN2,OUT> coGroupInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getCrossReturnTypes(CrossFunction<IN1,IN2,OUT> crossInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getCrossReturnTypes(CrossFunction<IN1,IN2,OUT> crossInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatJoinReturnTypes(FlatJoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatJoinReturnTypes(FlatJoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatMapReturnTypes(FlatMapFunction<IN,OUT> flatMapInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatMapReturnTypes(FlatMapFunction<IN,OUT> flatMapInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFoldReturnTypes(FoldFunction<IN,OUT> foldInterface,
TypeInformation<IN> inType)
Deprecated.
will be removed in a future version
|
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFoldReturnTypes(FoldFunction<IN,OUT> foldInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing)
Deprecated.
will be removed in a future version
|
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupCombineReturnTypes(GroupCombineFunction<IN,OUT> combineInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupCombineReturnTypes(GroupCombineFunction<IN,OUT> combineInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupReduceReturnTypes(GroupReduceFunction<IN,OUT> groupReduceInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupReduceReturnTypes(GroupReduceFunction<IN,OUT> groupReduceInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <T,C> ObjectArrayTypeInfo<T,C> |
ObjectArrayTypeInfo.getInfoFor(Class<T> arrayClass,
TypeInformation<C> componentInfo) |
static <T,C> ObjectArrayTypeInfo<T,C> |
ObjectArrayTypeInfo.getInfoFor(TypeInformation<C> componentInfo)
Creates a new
ObjectArrayTypeInfo from a
TypeInformation for the component type. |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getJoinReturnTypes(JoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getJoinReturnTypes(JoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getKeySelectorTypes(KeySelector<IN,OUT> selectorInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getKeySelectorTypes(KeySelector<IN,OUT> selectorInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapPartitionReturnTypes(MapPartitionFunction<IN,OUT> mapPartitionInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapPartitionReturnTypes(MapPartitionFunction<IN,OUT> mapPartitionInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapReturnTypes(MapFunction<IN,OUT> mapInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapReturnTypes(MapFunction<IN,OUT> mapInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getUnaryOperatorReturnType(Function function,
Class<?> baseClass,
int inputTypeArgumentIndex,
int outputTypeArgumentIndex,
int[] lambdaInputTypeArgumentIndices,
int[] lambdaOutputTypeArgumentIndices,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing)
Returns the unary operator's return type.
|
void |
InputTypeConfigurable.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig)
Method that is called on an
OutputFormat when it is passed to
the DataSet's output method. |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Either<L,R>> |
EitherTypeInfoFactory.createTypeInfo(Type t,
Map<String,TypeInformation<?>> genericParameters) |
Constructor and Description |
---|
EitherTypeInfo(TypeInformation<L> leftType,
TypeInformation<R> rightType) |
ListTypeInfo(TypeInformation<T> elementTypeInfo) |
MapTypeInfo(TypeInformation<K> keyTypeInfo,
TypeInformation<V> valueTypeInfo) |
NamedFlatFieldDescriptor(String name,
int keyPosition,
TypeInformation<?> type) |
PojoField(Field field,
TypeInformation<?> type) |
RowTypeInfo(TypeInformation<?>... types) |
RowTypeInfo(TypeInformation<?>[] types,
String[] fieldNames) |
TupleTypeInfo(Class<T> tupleType,
TypeInformation<?>... types) |
TupleTypeInfo(TypeInformation<?>... types) |
TupleTypeInfoBase(Class<T> tupleType,
TypeInformation<?>... types) |
Modifier and Type | Method and Description |
---|---|
static void |
Serializers.recursivelyRegisterType(TypeInformation<?> typeInfo,
ExecutionConfig config,
Set<Class<?>> alreadySeen) |
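`recursivelyRegisterType` visits the given type and, recursively, the types of its fields, using the `alreadySeen` set to terminate on cyclic (self-referential) types. A stdlib-only sketch of that traversal pattern (not Flink's implementation, which registers serializers with the `ExecutionConfig`):

```java
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.LinkedHashSet;
import java.util.Set;

public class RecursiveRegisterDemo {
    static class Node { Node next; String label; }  // self-referential type

    // Visit the class and, recursively, its instance field classes exactly once.
    static void register(Class<?> type, Set<Class<?>> alreadySeen) {
        if (type == null || type.isPrimitive() || !alreadySeen.add(type)) {
            return; // duplicate or cycle: the seen-set stops the recursion
        }
        if (type.getName().startsWith("java.")) {
            return; // treat JDK types as leaves
        }
        for (Field f : type.getDeclaredFields()) {
            if (!Modifier.isStatic(f.getModifiers())) {
                register(f.getType(), alreadySeen);
            }
        }
    }

    public static void main(String[] args) {
        Set<Class<?>> seen = new LinkedHashSet<>();
        register(Node.class, seen);
        System.out.println(seen); // Node and String, each once, no infinite loop
    }
}
```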
Modifier and Type | Method and Description |
---|---|
static TypeInformation<T> |
CrossDataSet.getType() |
TypeInformation<T> |
DataSet.getType()
Returns the TypeInformation for the elements of this DataSet.
|
Modifier and Type | Method and Description |
---|---|
<O> DataSet<O> |
CoGroupDataSet.apply(CoGroupFunction<L,R,O> coGrouper,
TypeInformation<O> evidence$5,
scala.reflect.ClassTag<O> evidence$6)
Creates a new
DataSet by passing each pair of co-grouped element lists to the given
function. |
<O> DataSet<O> |
CrossDataSet.apply(CrossFunction<L,R,O> crosser,
TypeInformation<O> evidence$3,
scala.reflect.ClassTag<O> evidence$4)
Creates a new
DataSet by passing each pair of values to the given function. |
<O> DataSet<O> |
JoinDataSet.apply(FlatJoinFunction<L,R,O> joiner,
TypeInformation<O> evidence$5,
scala.reflect.ClassTag<O> evidence$6)
Creates a new
DataSet by passing each pair of joined values to the given function. |
<O> DataSet<O> |
JoinFunctionAssigner.apply(FlatJoinFunction<L,R,O> fun,
TypeInformation<O> evidence$24,
scala.reflect.ClassTag<O> evidence$25) |
<O> DataSet<O> |
CoGroupDataSet.apply(scala.Function2<scala.collection.Iterator<L>,scala.collection.Iterator<R>,O> fun,
TypeInformation<O> evidence$1,
scala.reflect.ClassTag<O> evidence$2)
Creates a new
DataSet where the result for each pair of co-grouped element lists is the
result of the given function. |
<O> DataSet<O> |
CrossDataSet.apply(scala.Function2<L,R,O> fun,
TypeInformation<O> evidence$1,
scala.reflect.ClassTag<O> evidence$2)
Creates a new
DataSet where the result for each pair of elements is the result
of the given function. |
<O> DataSet<O> |
JoinDataSet.apply(scala.Function2<L,R,O> fun,
TypeInformation<O> evidence$1,
scala.reflect.ClassTag<O> evidence$2)
Creates a new
DataSet where the result for each pair of joined elements is the result
of the given function. |
<O> DataSet<O> |
JoinFunctionAssigner.apply(scala.Function2<L,R,O> fun,
TypeInformation<O> evidence$20,
scala.reflect.ClassTag<O> evidence$21) |
<O> DataSet<O> |
CoGroupDataSet.apply(scala.Function3<scala.collection.Iterator<L>,scala.collection.Iterator<R>,Collector<O>,scala.runtime.BoxedUnit> fun,
TypeInformation<O> evidence$3,
scala.reflect.ClassTag<O> evidence$4)
Creates a new
DataSet where the result for each pair of co-grouped element lists is the
result of the given function. |
<O> DataSet<O> |
JoinDataSet.apply(scala.Function3<L,R,Collector<O>,scala.runtime.BoxedUnit> fun,
TypeInformation<O> evidence$3,
scala.reflect.ClassTag<O> evidence$4)
Creates a new
DataSet by passing each pair of joined values to the given function. |
<O> DataSet<O> |
JoinFunctionAssigner.apply(scala.Function3<L,R,Collector<O>,scala.runtime.BoxedUnit> fun,
TypeInformation<O> evidence$22,
scala.reflect.ClassTag<O> evidence$23) |
<O> DataSet<O> |
JoinDataSet.apply(JoinFunction<L,R,O> fun,
TypeInformation<O> evidence$7,
scala.reflect.ClassTag<O> evidence$8)
Creates a new
DataSet by passing each pair of joined values to the given function. |
<O> DataSet<O> |
JoinFunctionAssigner.apply(JoinFunction<L,R,O> fun,
TypeInformation<O> evidence$26,
scala.reflect.ClassTag<O> evidence$27) |
<R> DataSet<R> |
GroupedDataSet.combineGroup(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$10,
scala.reflect.ClassTag<R> evidence$11)
Applies a CombineFunction on a grouped
DataSet . |
static <R> DataSet<R> |
CrossDataSet.combineGroup(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$26,
scala.reflect.ClassTag<R> evidence$27) |
<R> DataSet<R> |
DataSet.combineGroup(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$26,
scala.reflect.ClassTag<R> evidence$27)
Applies a GroupCombineFunction on a grouped
DataSet . |
<R> DataSet<R> |
GroupedDataSet.combineGroup(GroupCombineFunction<T,R> combiner,
TypeInformation<R> evidence$12,
scala.reflect.ClassTag<R> evidence$13)
Applies a CombineFunction on a grouped
DataSet . |
static <R> DataSet<R> |
CrossDataSet.combineGroup(GroupCombineFunction<T,R> combiner,
TypeInformation<R> evidence$24,
scala.reflect.ClassTag<R> evidence$25) |
<R> DataSet<R> |
DataSet.combineGroup(GroupCombineFunction<T,R> combiner,
TypeInformation<R> evidence$24,
scala.reflect.ClassTag<R> evidence$25)
Applies a GroupCombineFunction on a grouped
DataSet . |
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.createHadoopInput(org.apache.hadoop.mapred.InputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
org.apache.hadoop.mapred.JobConf job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Deprecated.
|
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.createHadoopInput(org.apache.hadoop.mapreduce.InputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
org.apache.hadoop.mapreduce.Job job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Deprecated.
|
<T> DataSet<T> |
ExecutionEnvironment.createInput(InputFormat<T,?> inputFormat,
scala.reflect.ClassTag<T> evidence$7,
TypeInformation<T> evidence$8)
Generic method to create an input DataSet with an
InputFormat . |
static <K> DataSet<T> |
CrossDataSet.distinct(scala.Function1<T,K> fun,
TypeInformation<K> evidence$28) |
<K> DataSet<T> |
DataSet.distinct(scala.Function1<T,K> fun,
TypeInformation<K> evidence$28)
Creates a new DataSet containing the distinct elements of this DataSet.
|
<K> O |
HalfUnfinishedKeyPairOperation.equalTo(scala.Function1<R,K> fun,
TypeInformation<K> evidence$2)
Specify the key selector function for the right side of the key based operation.
|
static <R> DataSet<R> |
CrossDataSet.flatMap(FlatMapFunction<T,R> flatMapper,
TypeInformation<R> evidence$12,
scala.reflect.ClassTag<R> evidence$13) |
<R> DataSet<R> |
DataSet.flatMap(FlatMapFunction<T,R> flatMapper,
TypeInformation<R> evidence$12,
scala.reflect.ClassTag<R> evidence$13)
Creates a new DataSet by applying the given function to every element and flattening
the results.
|
static <R> DataSet<R> |
CrossDataSet.flatMap(scala.Function1<T,scala.collection.TraversableOnce<R>> fun,
TypeInformation<R> evidence$16,
scala.reflect.ClassTag<R> evidence$17) |
<R> DataSet<R> |
DataSet.flatMap(scala.Function1<T,scala.collection.TraversableOnce<R>> fun,
TypeInformation<R> evidence$16,
scala.reflect.ClassTag<R> evidence$17)
Creates a new DataSet by applying the given function to every element and flattening
the results.
|
static <R> DataSet<R> |
CrossDataSet.flatMap(scala.Function2<T,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$14,
scala.reflect.ClassTag<R> evidence$15) |
<R> DataSet<R> |
DataSet.flatMap(scala.Function2<T,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$14,
scala.reflect.ClassTag<R> evidence$15)
Creates a new DataSet by applying the given function to every element and flattening
the results.
|
<T> DataSet<T> |
ExecutionEnvironment.fromCollection(scala.collection.Iterable<T> data,
scala.reflect.ClassTag<T> evidence$10,
TypeInformation<T> evidence$11)
Creates a DataSet from the given non-empty
Iterable . |
<T> DataSet<T> |
ExecutionEnvironment.fromCollection(scala.collection.Iterator<T> data,
scala.reflect.ClassTag<T> evidence$12,
TypeInformation<T> evidence$13)
Creates a DataSet from the given
Iterator . |
<T> DataSet<T> |
ExecutionEnvironment.fromElements(scala.collection.Seq<T> data,
scala.reflect.ClassTag<T> evidence$14,
TypeInformation<T> evidence$15)
Creates a new data set that contains the given elements.
|
<T> DataSet<T> |
ExecutionEnvironment.fromParallelCollection(SplittableIterator<T> iterator,
scala.reflect.ClassTag<T> evidence$16,
TypeInformation<T> evidence$17)
Creates a new data set that contains elements in the iterator.
|
static <K> GroupedDataSet<T> |
CrossDataSet.groupBy(scala.Function1<T,K> fun,
TypeInformation<K> evidence$29) |
<K> GroupedDataSet<T> |
DataSet.groupBy(scala.Function1<T,K> fun,
TypeInformation<K> evidence$29)
Creates a
GroupedDataSet which provides operations on groups of elements. |
static <R> DataSet<R> |
CrossDataSet.map(scala.Function1<T,R> fun,
TypeInformation<R> evidence$4,
scala.reflect.ClassTag<R> evidence$5) |
<R> DataSet<R> |
DataSet.map(scala.Function1<T,R> fun,
TypeInformation<R> evidence$4,
scala.reflect.ClassTag<R> evidence$5)
Creates a new DataSet by applying the given function to every element of this DataSet.
|
static <R> DataSet<R> |
CrossDataSet.map(MapFunction<T,R> mapper,
TypeInformation<R> evidence$2,
scala.reflect.ClassTag<R> evidence$3) |
<R> DataSet<R> |
DataSet.map(MapFunction<T,R> mapper,
TypeInformation<R> evidence$2,
scala.reflect.ClassTag<R> evidence$3)
Creates a new DataSet by applying the given function to every element of this DataSet.
|
static <R> DataSet<R> |
CrossDataSet.mapPartition(scala.Function1<scala.collection.Iterator<T>,scala.collection.TraversableOnce<R>> fun,
TypeInformation<R> evidence$10,
scala.reflect.ClassTag<R> evidence$11) |
<R> DataSet<R> |
DataSet.mapPartition(scala.Function1<scala.collection.Iterator<T>,scala.collection.TraversableOnce<R>> fun,
TypeInformation<R> evidence$10,
scala.reflect.ClassTag<R> evidence$11)
Creates a new DataSet by applying the given function to each parallel partition of the
DataSet.
|
static <R> DataSet<R> |
CrossDataSet.mapPartition(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$8,
scala.reflect.ClassTag<R> evidence$9) |
<R> DataSet<R> |
DataSet.mapPartition(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$8,
scala.reflect.ClassTag<R> evidence$9)
Creates a new DataSet by applying the given function to each parallel partition of the
DataSet.
|
static <R> DataSet<R> |
CrossDataSet.mapPartition(MapPartitionFunction<T,R> partitionMapper,
TypeInformation<R> evidence$6,
scala.reflect.ClassTag<R> evidence$7) |
<R> DataSet<R> |
DataSet.mapPartition(MapPartitionFunction<T,R> partitionMapper,
TypeInformation<R> evidence$6,
scala.reflect.ClassTag<R> evidence$7)
Creates a new DataSet by applying the given function to each parallel partition of the
DataSet.
|
static <K> DataSet<T> |
CrossDataSet.partitionByHash(scala.Function1<T,K> fun,
TypeInformation<K> evidence$35) |
<K> DataSet<T> |
DataSet.partitionByHash(scala.Function1<T,K> fun,
TypeInformation<K> evidence$35)
Partitions a DataSet using the specified key selector function.
|
static <K> DataSet<T> |
CrossDataSet.partitionByRange(scala.Function1<T,K> fun,
TypeInformation<K> evidence$36) |
<K> DataSet<T> |
DataSet.partitionByRange(scala.Function1<T,K> fun,
TypeInformation<K> evidence$36)
Range-partitions a DataSet using the specified key selector function.
|
static <K> DataSet<T> |
CrossDataSet.partitionCustom(Partitioner<K> partitioner,
scala.Function1<T,K> fun,
TypeInformation<K> evidence$39) |
<K> DataSet<T> |
DataSet.partitionCustom(Partitioner<K> partitioner,
scala.Function1<T,K> fun,
TypeInformation<K> evidence$39)
Partitions a DataSet on the key returned by the selector, using a custom partitioner.
|
static <K> DataSet<T> |
CrossDataSet.partitionCustom(Partitioner<K> partitioner,
int field,
TypeInformation<K> evidence$37) |
<K> DataSet<T> |
DataSet.partitionCustom(Partitioner<K> partitioner,
int field,
TypeInformation<K> evidence$37)
Partitions a tuple DataSet on the specified key fields using a custom partitioner.
|
static <K> DataSet<T> |
CrossDataSet.partitionCustom(Partitioner<K> partitioner,
String field,
TypeInformation<K> evidence$38) |
<K> DataSet<T> |
DataSet.partitionCustom(Partitioner<K> partitioner,
String field,
TypeInformation<K> evidence$38)
Partitions a POJO DataSet on the specified key fields using a custom partitioner.
|
<T> DataSet<T> |
ExecutionEnvironment.readCsvFile(String filePath,
String lineDelimiter,
String fieldDelimiter,
Character quoteCharacter,
boolean ignoreFirstLine,
String ignoreComments,
boolean lenient,
int[] includedFields,
String[] pojoFields,
scala.reflect.ClassTag<T> evidence$1,
TypeInformation<T> evidence$2)
Creates a DataSet by reading the given CSV file.
|
<T> DataSet<T> |
ExecutionEnvironment.readFile(FileInputFormat<T> inputFormat,
String filePath,
scala.reflect.ClassTag<T> evidence$5,
TypeInformation<T> evidence$6)
Creates a new DataSource by reading the specified file using the custom
FileInputFormat . |
<T> DataSet<T> |
ExecutionEnvironment.readFileOfPrimitives(String filePath,
String delimiter,
scala.reflect.ClassTag<T> evidence$3,
TypeInformation<T> evidence$4)
Creates a DataSet that represents the primitive type produced by reading the
given file in a delimited way. This method is similar to
readCsvFile with a
single field, but it produces a DataSet of the primitive type directly, not through a Tuple. |
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.readHadoopFile(org.apache.hadoop.mapred.FileInputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
org.apache.hadoop.mapred.JobConf job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Deprecated.
|
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.readHadoopFile(org.apache.hadoop.mapreduce.lib.input.FileInputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
org.apache.hadoop.mapreduce.Job job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Deprecated.
|
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.readHadoopFile(org.apache.hadoop.mapred.FileInputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
Deprecated.
|
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.readHadoopFile(org.apache.hadoop.mapreduce.lib.input.FileInputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
Deprecated.
|
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.readSequenceFile(Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
Deprecated.
Please use
HadoopInputs.readSequenceFile(java.lang.Class<K>, java.lang.Class<V>, java.lang.String, org.apache.flink.api.common.typeinfo.TypeInformation<scala.Tuple2<K, V>>)
from the flink-hadoop-compatibility module. |
<R> DataSet<R> |
GroupedDataSet.reduceGroup(scala.Function1<scala.collection.Iterator<T>,R> fun,
TypeInformation<R> evidence$4,
scala.reflect.ClassTag<R> evidence$5)
Creates a new
DataSet by passing for each group (elements with the same key) the list
of elements to the group reduce function. |
static <R> DataSet<R> |
CrossDataSet.reduceGroup(scala.Function1<scala.collection.Iterator<T>,R> fun,
TypeInformation<R> evidence$22,
scala.reflect.ClassTag<R> evidence$23) |
<R> DataSet<R> |
DataSet.reduceGroup(scala.Function1<scala.collection.Iterator<T>,R> fun,
TypeInformation<R> evidence$22,
scala.reflect.ClassTag<R> evidence$23)
Creates a new
DataSet by passing all elements in this DataSet to the group reduce function. |
<R> DataSet<R> |
GroupedDataSet.reduceGroup(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$6,
scala.reflect.ClassTag<R> evidence$7)
Creates a new
DataSet by passing for each group (elements with the same key) the list
of elements to the group reduce function. |
static <R> DataSet<R> |
CrossDataSet.reduceGroup(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$20,
scala.reflect.ClassTag<R> evidence$21) |
<R> DataSet<R> |
DataSet.reduceGroup(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$20,
scala.reflect.ClassTag<R> evidence$21)
Creates a new
DataSet by passing all elements in this DataSet to the group reduce function. |
<R> DataSet<R> |
GroupedDataSet.reduceGroup(GroupReduceFunction<T,R> reducer,
TypeInformation<R> evidence$8,
scala.reflect.ClassTag<R> evidence$9)
Creates a new
DataSet by passing for each group (elements with the same key) the list
of elements to the GroupReduceFunction . |
static <R> DataSet<R> |
CrossDataSet.reduceGroup(GroupReduceFunction<T,R> reducer,
TypeInformation<R> evidence$18,
scala.reflect.ClassTag<R> evidence$19) |
<R> DataSet<R> |
DataSet.reduceGroup(GroupReduceFunction<T,R> reducer,
TypeInformation<R> evidence$18,
scala.reflect.ClassTag<R> evidence$19)
Creates a new
DataSet by passing all elements in this DataSet to the group reduce function. |
<K> GroupedDataSet<T> |
GroupedDataSet.sortGroup(scala.Function1<T,K> fun,
Order order,
TypeInformation<K> evidence$2)
Adds a secondary sort key to this
GroupedDataSet . |
static <K> DataSet<T> |
CrossDataSet.sortPartition(scala.Function1<T,K> fun,
Order order,
TypeInformation<K> evidence$40) |
<K> DataSet<T> |
PartitionSortedDataSet.sortPartition(scala.Function1<T,K> fun,
Order order,
TypeInformation<K> evidence$2) |
<K> DataSet<T> |
DataSet.sortPartition(scala.Function1<T,K> fun,
Order order,
TypeInformation<K> evidence$40)
Locally sorts the partitions of the DataSet on the extracted key in the specified order.
|
<K> HalfUnfinishedKeyPairOperation<L,R,O> |
UnfinishedKeyPairOperation.where(scala.Function1<L,K> fun,
TypeInformation<K> evidence$1)
Specify the key selector function for the left side of the key based operation.
|
<K> GroupedDataSet<T> |
GroupedDataSet.withPartitioner(Partitioner<K> partitioner,
TypeInformation<K> evidence$3)
Sets a custom partitioner for the grouping.
|
<K> JoinDataSet<L,R> |
JoinDataSet.withPartitioner(Partitioner<K> partitioner,
TypeInformation<K> evidence$9) |
<K> CoGroupDataSet<L,R> |
CoGroupDataSet.withPartitioner(Partitioner<K> partitioner,
TypeInformation<K> evidence$7) |
<K> JoinFunctionAssigner<L,R> |
JoinFunctionAssigner.withPartitioner(Partitioner<K> part,
TypeInformation<K> evidence$19) |
Modifier and Type | Method and Description |
---|---|
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkArrayTypeInfo(TypeDescriptors.ArrayDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$10) |
<T extends scala.Product> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkCaseClassTypeInfo(TypeDescriptors.CaseClassDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$4) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkEitherTypeInfo(TypeDescriptors.EitherDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$5) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkEnumValueTypeInfo(TypeDescriptors.EnumValueDescriptor d,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$6) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkGenericTypeInfo(TypeDescriptors.UDTDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$15) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkJavaTuple(TypeDescriptors.JavaTupleDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$13) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkOptionTypeInfo(TypeDescriptors.OptionDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$8) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkPojo(TypeDescriptors.PojoDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$14) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkPrimitiveTypeInfo(scala.reflect.macros.Context.universe tpe,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$17) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkTraversableTypeInfo(TypeDescriptors.TraversableDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$9) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkTryTypeInfo(TypeDescriptors.TryDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$7) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkTypeInfo(TypeDescriptors.UDTDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$2) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkTypeInfo(scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$1) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkTypeInfoFromFactory(TypeDescriptors.FactoryTypeDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$3) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkTypeParameter(TypeDescriptors.TypeParameterDescriptor typeParameter,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$16) |
<T extends Value> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkValueTypeInfo(TypeDescriptors.UDTDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$11) |
<T extends org.apache.hadoop.io.Writable> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkWritableTypeInfo(TypeDescriptors.UDTDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$12) |
Modifier and Type | Method and Description |
---|---|
<R> DataSet<R> |
OnGroupedDataSet.combineGroupWith(scala.Function1<scala.collection.immutable.Stream<T>,R> fun,
TypeInformation<R> evidence$4,
scala.reflect.ClassTag<R> evidence$5)
Same as a reducing operation, but acts only locally;
ideal for performing pre-aggregation before a reduction.
|
<R> DataSet<R> |
OnDataSet.flatMapWith(scala.Function1<T,scala.collection.TraversableOnce<R>> fun,
TypeInformation<R> evidence$5,
scala.reflect.ClassTag<R> evidence$6)
Applies a function
fun to each item of the dataset, producing a collection of items
that will be flattened in the resulting data set. |
<K> GroupedDataSet<T> |
OnDataSet.groupingBy(scala.Function1<T,K> fun,
TypeInformation<K> evidence$9)
Groups the items according to a grouping function fun. |
<K> O |
OnHalfUnfinishedKeyPairOperation.isEqualTo(scala.Function1<R,K> fun,
TypeInformation<K> evidence$1)
Initiates a join or co-group operation, defining the second half of
the where clause with an equality over the right data set items.
|
<R> DataSet<R> |
OnDataSet.mapPartitionWith(scala.Function1<scala.collection.immutable.Stream<T>,R> fun,
TypeInformation<R> evidence$3,
scala.reflect.ClassTag<R> evidence$4)
Applies a function fun to a partition as a whole. |
<R> DataSet<R> |
OnDataSet.mapWith(scala.Function1<T,R> fun,
TypeInformation<R> evidence$1,
scala.reflect.ClassTag<R> evidence$2)
Applies a function fun to each item of the data set. |
<O> DataSet<O> |
OnCrossDataSet.projecting(scala.Function2<L,R,O> fun,
TypeInformation<O> evidence$1,
scala.reflect.ClassTag<O> evidence$2)
Starting from a cross data set, uses the function fun to project elements from both input data sets into the resulting data set. |
<O> DataSet<O> |
OnJoinFunctionAssigner.projecting(scala.Function2<L,R,O> fun,
TypeInformation<O> evidence$1,
scala.reflect.ClassTag<O> evidence$2)
Joins the data sets, using the function fun to project elements from both into the resulting data set. |
<O> DataSet<O> |
OnCoGroupDataSet.projecting(scala.Function2<scala.collection.immutable.Stream<L>,scala.collection.immutable.Stream<R>,O> fun,
TypeInformation<O> evidence$1,
scala.reflect.ClassTag<O> evidence$2)
Co-groups the data sets, using the function fun to project elements from both into the resulting data set. |
<R> DataSet<R> |
OnDataSet.reduceGroupWith(scala.Function1<scala.collection.immutable.Stream<T>,R> fun,
TypeInformation<R> evidence$7,
scala.reflect.ClassTag<R> evidence$8)
Applies a reducer fun to a grouped data set. |
<R> DataSet<R> |
OnGroupedDataSet.reduceGroupWith(scala.Function1<scala.collection.immutable.Stream<T>,R> fun,
TypeInformation<R> evidence$2,
scala.reflect.ClassTag<R> evidence$3)
Reduces the data set group-wise with a reducer fun. |
<K> GroupedDataSet<T> |
OnGroupedDataSet.sortGroupWith(Order order,
scala.Function1<T,K> fun,
TypeInformation<K> evidence$1)
Sorts a group using a sorting function fun and an Order. |
<K> HalfUnfinishedKeyPairOperation<L,R,O> |
OnUnfinishedKeyPairOperation.whereClause(scala.Function1<L,K> fun,
TypeInformation<K> evidence$1)
Initiates a join or co-group operation, defining the first half of
the where clause with the items of the left data set that will be
checked for equality with the ones provided by the second half.
|
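The acceptor methods above take function literals directly, with the TypeInformation and ClassTag evidence parameters resolved implicitly. A minimal sketch, assuming the flink-scala extensions are on the classpath (the data set `wordCounts` is illustrative only):

```scala
import org.apache.flink.api.scala._
import org.apache.flink.api.scala.extensions._

object ExtensionsSketch {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val wordCounts: DataSet[(String, Int)] = env.fromElements(("a", 1), ("b", 2))

    // mapWith accepts a function literal directly, so tuple destructuring works:
    val shifted = wordCounts.mapWith { case (word, n) => (word, n + 1) }

    // groupingBy groups by a key extracted with a function:
    val grouped = wordCounts.groupingBy { case (word, _) => word }

    shifted.print()
  }
}
```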
Modifier and Type | Method and Description |
---|---|
void |
ScalaCsvOutputFormat.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig)
The purpose of this method is solely to check whether the data type to be processed
is in fact a tuple type.
|
Constructor and Description |
---|
AggregatingUdf(TypeInformation<T> typeInfo,
AggregationFunction<Object>[] aggFunctions,
int[] fieldPositions) |
Modifier and Type | Class and Description |
---|---|
class |
CaseClassTypeInfo<T extends scala.Product>
TypeInformation for Case Classes.
|
class |
EnumValueTypeInfo<E extends scala.Enumeration>
TypeInformation for
Enumeration values. |
class |
OptionTypeInfo<A,T extends scala.Option<A>>
TypeInformation for Option. |
class |
ScalaNothingTypeInfo |
class |
TraversableTypeInfo<T extends scala.collection.TraversableOnce<E>,E>
TypeInformation for Scala Collections.
|
class |
TryTypeInfo<A,T extends scala.util.Try<A>>
TypeInformation for Try. |
class |
UnitTypeInfo |
Modifier and Type | Method and Description |
---|---|
TypeInformation<E> |
TraversableTypeInfo.elementTypeInfo() |
TypeInformation<A> |
TryTypeInfo.elemTypeInfo() |
TypeInformation<A> |
OptionTypeInfo.getElemTypeInfo() |
<X> TypeInformation<X> |
CaseClassTypeInfo.getTypeAt(String fieldExpression) |
TypeInformation<A> |
EitherTypeInfo.leftTypeInfo() |
TypeInformation<B> |
EitherTypeInfo.rightTypeInfo() |
TypeInformation<?>[] |
CaseClassTypeInfo.typeParamTypeInfos() |
Modifier and Type | Method and Description |
---|---|
static <T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeUtils.createTypeInfo(scala.reflect.macros.Context c,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$1) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeUtils$.createTypeInfo(scala.reflect.macros.Context c,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$1) |
Map<String,TypeInformation<?>> |
TryTypeInfo.getGenericParameters() |
Map<String,TypeInformation<?>> |
CaseClassTypeInfo.getGenericParameters() |
Map<String,TypeInformation<?>> |
OptionTypeInfo.getGenericParameters() |
Map<String,TypeInformation<?>> |
EnumValueTypeInfo.getGenericParameters() |
Map<String,TypeInformation<?>> |
EitherTypeInfo.getGenericParameters() |
Map<String,TypeInformation<?>> |
TraversableTypeInfo.getGenericParameters() |
Constructor and Description |
---|
CaseClassTypeInfo(Class<T> clazz,
TypeInformation<?>[] typeParamTypeInfos,
scala.collection.Seq<TypeInformation<?>> fieldTypes,
scala.collection.Seq<String> fieldNames) |
EitherTypeInfo(Class<T> clazz,
TypeInformation<A> leftTypeInfo,
TypeInformation<B> rightTypeInfo) |
OptionTypeInfo(TypeInformation<A> elemTypeInfo) |
TraversableTypeInfo(Class<T> clazz,
TypeInformation<E> elementTypeInfo) |
TryTypeInfo(TypeInformation<A> elemTypeInfo) |
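These type information classes are normally not constructed by hand; the Scala API's `createTypeInformation` macro derives them from the static type. A minimal sketch (the `Point` case class is hypothetical):

```scala
import org.apache.flink.api.scala._

case class Point(x: Double, y: Double)

object TypeInfoSketch {
  def main(args: Array[String]): Unit = {
    // The macro inspects the type at compile time and assembles the matching
    // TypeInformation: here an OptionTypeInfo wrapping a CaseClassTypeInfo[Point].
    val ti = createTypeInformation[Option[Point]]
    println(ti)
  }
}
```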
Constructor and Description |
---|
CaseClassTypeInfo(Class<T> clazz,
TypeInformation<?>[] typeParamTypeInfos,
scala.collection.Seq<TypeInformation<?>> fieldTypes,
scala.collection.Seq<String> fieldNames) |
Modifier and Type | Method and Description |
---|---|
<R> SingleOutputStreamOperator<R> |
PatternStream.flatSelect(PatternFlatSelectFunction<T,R> patternFlatSelectFunction,
TypeInformation<R> outTypeInfo)
Applies a flat select function to the detected pattern sequence.
|
<R> SingleOutputStreamOperator<R> |
PatternStream.select(PatternSelectFunction<T,R> patternSelectFunction,
TypeInformation<R> outTypeInfo)
Applies a select function to the detected pattern sequence.
|
Modifier and Type | Method and Description |
---|---|
<R> DataStream<R> |
PatternStream.flatSelect(scala.Function2<scala.collection.Map<String,scala.collection.Iterable<T>>,Collector<R>,scala.runtime.BoxedUnit> patternFlatSelectFun,
TypeInformation<R> evidence$10)
Applies a flat select function to the detected pattern sequence.
|
<L,R> DataStream<scala.util.Either<L,R>> |
PatternStream.flatSelect(scala.Function3<scala.collection.Map<String,scala.collection.Iterable<T>>,Object,Collector<L>,scala.runtime.BoxedUnit> patternFlatTimeoutFunction,
scala.Function2<scala.collection.Map<String,scala.collection.Iterable<T>>,Collector<R>,scala.runtime.BoxedUnit> patternFlatSelectFunction,
TypeInformation<L> evidence$11,
TypeInformation<R> evidence$12)
Applies a flat select function to the detected pattern sequence.
|
<R> DataStream<R> |
PatternStream.flatSelect(PatternFlatSelectFunction<T,R> patternFlatSelectFunction,
TypeInformation<R> evidence$4)
Applies a flat select function to the detected pattern sequence.
|
<L,R> DataStream<scala.util.Either<L,R>> |
PatternStream.flatSelect(PatternFlatTimeoutFunction<T,L> patternFlatTimeoutFunction,
PatternFlatSelectFunction<T,R> patternFlatSelectFunction,
TypeInformation<L> evidence$5,
TypeInformation<R> evidence$6)
Applies a flat select function to the detected pattern sequence.
|
<R> DataStream<R> |
PatternStream.select(scala.Function1<scala.collection.Map<String,scala.collection.Iterable<T>>,R> patternSelectFun,
TypeInformation<R> evidence$7)
Applies a select function to the detected pattern sequence.
|
<L,R> DataStream<scala.util.Either<L,R>> |
PatternStream.select(scala.Function2<scala.collection.Map<String,scala.collection.Iterable<T>>,Object,L> patternTimeoutFunction,
scala.Function1<scala.collection.Map<String,scala.collection.Iterable<T>>,R> patternSelectFunction,
TypeInformation<L> evidence$8,
TypeInformation<R> evidence$9)
Applies a select function to the detected pattern sequence.
|
<R> DataStream<R> |
PatternStream.select(PatternSelectFunction<T,R> patternSelectFunction,
TypeInformation<R> evidence$1)
Applies a select function to the detected pattern sequence.
|
<L,R> DataStream<scala.util.Either<L,R>> |
PatternStream.select(PatternTimeoutFunction<T,L> patternTimeoutFunction,
PatternSelectFunction<T,R> patternSelectFunction,
TypeInformation<L> evidence$2,
TypeInformation<R> evidence$3)
Applies a select function to the detected pattern sequence.
|
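In the Scala CEP API, the select function receives a map from pattern names to the matched events, and the output TypeInformation is supplied implicitly. A sketch with a hypothetical `Event` type:

```scala
import org.apache.flink.cep.scala.CEP
import org.apache.flink.cep.scala.pattern.Pattern
import org.apache.flink.streaming.api.scala._

case class Event(id: Int, name: String)

object CepSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val input: DataStream[Event] =
      env.fromElements(Event(1, "start"), Event(1, "end"))

    val pattern = Pattern.begin[Event]("start").where(_.name == "start")
      .followedBy("end").where(_.name == "end")

    // select maps each detected sequence (pattern name -> matched events)
    // to a single output element:
    val alerts: DataStream[String] =
      CEP.pattern(input, pattern).select { matched =>
        s"alert for id ${matched("start").head.id}"
      }

    alerts.print()
    env.execute()
  }
}
```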
Modifier and Type | Method and Description |
---|---|
TypeInformation<Tweet> |
SimpleTweetInputFormat.getProducedType() |
Modifier and Type | Method and Description |
---|---|
<T> DataSet<T> |
Graph.groupReduceOnEdges(EdgesFunction<K,EV,T> edgesFunction,
EdgeDirection direction,
TypeInformation<T> typeInfo)
Groups by vertex and computes a GroupReduce transformation over the edge values of each vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnEdges(EdgesFunctionWithVertexValue<K,VV,EV,T> edgesFunction,
EdgeDirection direction,
TypeInformation<T> typeInfo)
Groups by vertex and computes a GroupReduce transformation over the edge values of each vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnNeighbors(NeighborsFunction<K,VV,EV,T> neighborsFunction,
EdgeDirection direction,
TypeInformation<T> typeInfo)
Groups by vertex and computes a GroupReduce transformation over the neighbors (both edges and vertices)
of each vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnNeighbors(NeighborsFunctionWithVertexValue<K,VV,EV,T> neighborsFunction,
EdgeDirection direction,
TypeInformation<T> typeInfo)
Groups by vertex and computes a GroupReduce transformation over the neighbors (both edges and vertices)
of each vertex.
|
<NV> Graph<K,VV,NV> |
Graph.mapEdges(MapFunction<Edge<K,EV>,NV> mapper,
TypeInformation<Edge<K,NV>> returnType)
Apply a function to the attribute of each edge in the graph.
|
<NV> Graph<K,NV,EV> |
Graph.mapVertices(MapFunction<Vertex<K,VV>,NV> mapper,
TypeInformation<Vertex<K,NV>> returnType)
Apply a function to the attribute of each vertex in the graph.
|
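The Java signatures above require an explicit returnType; in the Scala Gelly API the same transformations take a function and derive the TypeInformation implicitly. A minimal sketch:

```scala
import org.apache.flink.api.scala._
import org.apache.flink.graph.scala.Graph
import org.apache.flink.graph.{Edge, Vertex}

object MapVerticesSketch {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val vertices = env.fromElements(
      new Vertex[Long, String](1L, "a"), new Vertex[Long, String](2L, "b"))
    val edges = env.fromElements(new Edge[Long, Double](1L, 2L, 0.5))

    val graph = Graph.fromDataSet(vertices, edges, env)

    // mapVertices rewrites each vertex value; the returnType parameter of the
    // Java API is filled in implicitly here:
    val upper = graph.mapVertices(v => v.getValue.toUpperCase)
    upper.getVertices.print()
  }
}
```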
Modifier and Type | Method and Description |
---|---|
TypeInformation<VV> |
LabelPropagation.SendNewLabelToNeighbors.getProducedType() |
TypeInformation<VV> |
ConnectedComponents.CCMessenger.getProducedType() |
Constructor and Description |
---|
CCMessenger(TypeInformation<VV> typeInformation) |
SendNewLabelToNeighbors(TypeInformation<VV> typeInformation) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Tuple2<K,Either<NullValue,Message>>> |
VertexCentricIteration.MessageCombinerUdf.getProducedType() |
Modifier and Type | Method and Description |
---|---|
<K,EV> Graph<K,NullValue,EV> |
Graph$.fromCollection(scala.collection.Seq<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$23,
scala.reflect.ClassTag<K> evidence$24,
TypeInformation<EV> evidence$25,
scala.reflect.ClassTag<EV> evidence$26)
Creates a Graph from a Seq of edges.
|
static <K,EV> Graph<K,NullValue,EV> |
Graph.fromCollection(scala.collection.Seq<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$23,
scala.reflect.ClassTag<K> evidence$24,
TypeInformation<EV> evidence$25,
scala.reflect.ClassTag<EV> evidence$26)
Creates a Graph from a Seq of edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromCollection(scala.collection.Seq<Edge<K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$27,
scala.reflect.ClassTag<K> evidence$28,
TypeInformation<VV> evidence$29,
scala.reflect.ClassTag<VV> evidence$30,
TypeInformation<EV> evidence$31,
scala.reflect.ClassTag<EV> evidence$32)
Creates a graph from a Seq of edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromCollection(scala.collection.Seq<Edge<K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$27,
scala.reflect.ClassTag<K> evidence$28,
TypeInformation<VV> evidence$29,
scala.reflect.ClassTag<VV> evidence$30,
TypeInformation<EV> evidence$31,
scala.reflect.ClassTag<EV> evidence$32)
Creates a graph from a Seq of edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromCollection(scala.collection.Seq<Vertex<K,VV>> vertices,
scala.collection.Seq<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$17,
scala.reflect.ClassTag<K> evidence$18,
TypeInformation<VV> evidence$19,
scala.reflect.ClassTag<VV> evidence$20,
TypeInformation<EV> evidence$21,
scala.reflect.ClassTag<EV> evidence$22)
Creates a Graph from a Seq of vertices and a Seq of edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromCollection(scala.collection.Seq<Vertex<K,VV>> vertices,
scala.collection.Seq<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$17,
scala.reflect.ClassTag<K> evidence$18,
TypeInformation<VV> evidence$19,
scala.reflect.ClassTag<VV> evidence$20,
TypeInformation<EV> evidence$21,
scala.reflect.ClassTag<EV> evidence$22)
Creates a Graph from a Seq of vertices and a Seq of edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromCsvReader(ExecutionEnvironment env,
String pathEdges,
String pathVertices,
String lineDelimiterVertices,
String fieldDelimiterVertices,
Character quoteCharacterVertices,
boolean ignoreFirstLineVertices,
String ignoreCommentsVertices,
boolean lenientVertices,
int[] includedFieldsVertices,
String lineDelimiterEdges,
String fieldDelimiterEdges,
Character quoteCharacterEdges,
boolean ignoreFirstLineEdges,
String ignoreCommentsEdges,
boolean lenientEdges,
int[] includedFieldsEdges,
MapFunction<K,VV> vertexValueInitializer,
TypeInformation<K> evidence$55,
scala.reflect.ClassTag<K> evidence$56,
TypeInformation<VV> evidence$57,
scala.reflect.ClassTag<VV> evidence$58,
TypeInformation<EV> evidence$59,
scala.reflect.ClassTag<EV> evidence$60)
Creates a Graph from a CSV file of edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromCsvReader(ExecutionEnvironment env,
String pathEdges,
String pathVertices,
String lineDelimiterVertices,
String fieldDelimiterVertices,
Character quoteCharacterVertices,
boolean ignoreFirstLineVertices,
String ignoreCommentsVertices,
boolean lenientVertices,
int[] includedFieldsVertices,
String lineDelimiterEdges,
String fieldDelimiterEdges,
Character quoteCharacterEdges,
boolean ignoreFirstLineEdges,
String ignoreCommentsEdges,
boolean lenientEdges,
int[] includedFieldsEdges,
MapFunction<K,VV> vertexValueInitializer,
TypeInformation<K> evidence$55,
scala.reflect.ClassTag<K> evidence$56,
TypeInformation<VV> evidence$57,
scala.reflect.ClassTag<VV> evidence$58,
TypeInformation<EV> evidence$59,
scala.reflect.ClassTag<EV> evidence$60)
Creates a Graph from a CSV file of edges.
|
<K,EV> Graph<K,NullValue,EV> |
Graph$.fromDataSet(DataSet<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$7,
scala.reflect.ClassTag<K> evidence$8,
TypeInformation<EV> evidence$9,
scala.reflect.ClassTag<EV> evidence$10)
Creates a Graph from a DataSet of edges.
|
static <K,EV> Graph<K,NullValue,EV> |
Graph.fromDataSet(DataSet<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$7,
scala.reflect.ClassTag<K> evidence$8,
TypeInformation<EV> evidence$9,
scala.reflect.ClassTag<EV> evidence$10)
Creates a Graph from a DataSet of edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromDataSet(DataSet<Edge<K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$11,
scala.reflect.ClassTag<K> evidence$12,
TypeInformation<VV> evidence$13,
scala.reflect.ClassTag<VV> evidence$14,
TypeInformation<EV> evidence$15,
scala.reflect.ClassTag<EV> evidence$16)
Creates a graph from a DataSet of edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromDataSet(DataSet<Edge<K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$11,
scala.reflect.ClassTag<K> evidence$12,
TypeInformation<VV> evidence$13,
scala.reflect.ClassTag<VV> evidence$14,
TypeInformation<EV> evidence$15,
scala.reflect.ClassTag<EV> evidence$16)
Creates a graph from a DataSet of edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromDataSet(DataSet<Vertex<K,VV>> vertices,
DataSet<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$1,
scala.reflect.ClassTag<K> evidence$2,
TypeInformation<VV> evidence$3,
scala.reflect.ClassTag<VV> evidence$4,
TypeInformation<EV> evidence$5,
scala.reflect.ClassTag<EV> evidence$6)
Creates a Graph from a DataSet of vertices and a DataSet of edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromDataSet(DataSet<Vertex<K,VV>> vertices,
DataSet<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$1,
scala.reflect.ClassTag<K> evidence$2,
TypeInformation<VV> evidence$3,
scala.reflect.ClassTag<VV> evidence$4,
TypeInformation<EV> evidence$5,
scala.reflect.ClassTag<EV> evidence$6)
Creates a Graph from a DataSet of vertices and a DataSet of edges.
|
<K> Graph<K,NullValue,NullValue> |
Graph$.fromTuple2DataSet(DataSet<scala.Tuple2<K,K>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$49,
scala.reflect.ClassTag<K> evidence$50)
Creates a Graph from a DataSet of Tuple2s representing the edges.
|
static <K> Graph<K,NullValue,NullValue> |
Graph.fromTuple2DataSet(DataSet<scala.Tuple2<K,K>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$49,
scala.reflect.ClassTag<K> evidence$50)
Creates a Graph from a DataSet of Tuple2s representing the edges.
|
<K,VV> Graph<K,VV,NullValue> |
Graph$.fromTuple2DataSet(DataSet<scala.Tuple2<K,K>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$51,
scala.reflect.ClassTag<K> evidence$52,
TypeInformation<VV> evidence$53,
scala.reflect.ClassTag<VV> evidence$54)
Creates a Graph from a DataSet of Tuple2s representing the edges.
|
static <K,VV> Graph<K,VV,NullValue> |
Graph.fromTuple2DataSet(DataSet<scala.Tuple2<K,K>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$51,
scala.reflect.ClassTag<K> evidence$52,
TypeInformation<VV> evidence$53,
scala.reflect.ClassTag<VV> evidence$54)
Creates a Graph from a DataSet of Tuple2s representing the edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromTupleDataSet(DataSet<scala.Tuple2<K,VV>> vertices,
DataSet<scala.Tuple3<K,K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$33,
scala.reflect.ClassTag<K> evidence$34,
TypeInformation<VV> evidence$35,
scala.reflect.ClassTag<VV> evidence$36,
TypeInformation<EV> evidence$37,
scala.reflect.ClassTag<EV> evidence$38)
Creates a graph from DataSets of tuples for vertices and for edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromTupleDataSet(DataSet<scala.Tuple2<K,VV>> vertices,
DataSet<scala.Tuple3<K,K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$33,
scala.reflect.ClassTag<K> evidence$34,
TypeInformation<VV> evidence$35,
scala.reflect.ClassTag<VV> evidence$36,
TypeInformation<EV> evidence$37,
scala.reflect.ClassTag<EV> evidence$38)
Creates a graph from DataSets of tuples for vertices and for edges.
|
<K,EV> Graph<K,NullValue,EV> |
Graph$.fromTupleDataSet(DataSet<scala.Tuple3<K,K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$39,
scala.reflect.ClassTag<K> evidence$40,
TypeInformation<EV> evidence$41,
scala.reflect.ClassTag<EV> evidence$42)
Creates a Graph from a DataSet of Tuples representing the edges.
|
static <K,EV> Graph<K,NullValue,EV> |
Graph.fromTupleDataSet(DataSet<scala.Tuple3<K,K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$39,
scala.reflect.ClassTag<K> evidence$40,
TypeInformation<EV> evidence$41,
scala.reflect.ClassTag<EV> evidence$42)
Creates a Graph from a DataSet of Tuples representing the edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromTupleDataSet(DataSet<scala.Tuple3<K,K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$43,
scala.reflect.ClassTag<K> evidence$44,
TypeInformation<VV> evidence$45,
scala.reflect.ClassTag<VV> evidence$46,
TypeInformation<EV> evidence$47,
scala.reflect.ClassTag<EV> evidence$48)
Creates a Graph from a DataSet of Tuples representing the edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromTupleDataSet(DataSet<scala.Tuple3<K,K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$43,
scala.reflect.ClassTag<K> evidence$44,
TypeInformation<VV> evidence$45,
scala.reflect.ClassTag<VV> evidence$46,
TypeInformation<EV> evidence$47,
scala.reflect.ClassTag<EV> evidence$48)
Creates a Graph from a DataSet of Tuples representing the edges.
|
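The fromTupleDataSet overloads accept edges as (source, target, value) tuples; the variants taking a vertexValueInitializer derive each vertex value from its ID. A hedged sketch of the initializer variant (types and values here are illustrative only):

```scala
import org.apache.flink.api.common.functions.MapFunction
import org.apache.flink.api.scala._
import org.apache.flink.graph.scala.Graph

object FromTupleSketch {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Edges arrive as (source, target, edgeValue) Tuple3s.
    val edges: DataSet[(Long, Long, Double)] =
      env.fromElements((1L, 2L, 0.5), (2L, 3L, 1.5))

    // Each vertex value is initialized from the vertex ID by this
    // MapFunction<K, VV>.
    val initializer = new MapFunction[Long, String] {
      override def map(id: Long): String = s"vertex-$id"
    }

    val graph: Graph[Long, String, Double] =
      Graph.fromTupleDataSet(edges, initializer, env)
  }
}
```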
<T> DataSet<T> |
Graph.groupReduceOnEdges(EdgesFunction<K,EV,T> edgesFunction,
EdgeDirection direction,
TypeInformation<T> evidence$97,
scala.reflect.ClassTag<T> evidence$98)
Compute an aggregate over the edges of each vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnEdges(EdgesFunctionWithVertexValue<K,VV,EV,T> edgesFunction,
EdgeDirection direction,
TypeInformation<T> evidence$95,
scala.reflect.ClassTag<T> evidence$96)
Compute an aggregate over the edges of each vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnNeighbors(NeighborsFunction<K,VV,EV,T> neighborsFunction,
EdgeDirection direction,
TypeInformation<T> evidence$101,
scala.reflect.ClassTag<T> evidence$102)
Compute an aggregate over the neighbors (edges and vertices) of each
vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnNeighbors(NeighborsFunctionWithVertexValue<K,VV,EV,T> neighborsFunction,
EdgeDirection direction,
TypeInformation<T> evidence$99,
scala.reflect.ClassTag<T> evidence$100)
Compute an aggregate over the neighbors (edges and vertices) of each
vertex.
|
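groupReduceOnEdges hands each vertex's adjacent edges (restricted by the EdgeDirection) to a user function that may emit zero or more results. A sketch, assuming a Graph[Long, Double, Double] named graph from earlier and the iterateEdges callback shape of the Java Gelly EdgesFunction interface:

```scala
import org.apache.flink.api.java.tuple.Tuple2
import org.apache.flink.graph.{Edge, EdgeDirection, EdgesFunction}
import org.apache.flink.util.Collector

// Sum the Double values of all incoming edges per vertex, emitting a
// (vertexId, sum) pair for each vertex that has at least one in-edge.
val inEdgeSums = graph.groupReduceOnEdges(
  new EdgesFunction[Long, Double, (Long, Double)] {
    override def iterateEdges(
        edges: java.lang.Iterable[Tuple2[Long, Edge[Long, Double]]],
        out: Collector[(Long, Double)]): Unit = {
      var id = -1L
      var sum = 0.0
      val it = edges.iterator()
      while (it.hasNext) {
        val pair = it.next()
        id = pair.f0            // the vertex the edges are grouped by
        sum += pair.f1.getValue // the edge value
      }
      if (id != -1L) out.collect((id, sum))
    }
  },
  EdgeDirection.IN)
```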
<T> Graph<K,VV,EV> |
Graph.joinWithEdges(DataSet<scala.Tuple3<K,K,T>> inputDataSet,
EdgeJoinFunction<EV,T> edgeJoinFunction,
TypeInformation<T> evidence$89)
Joins the edge DataSet with an input DataSet on the composite key of both
source and target IDs and applies a user-defined transformation on the values
of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithEdges(DataSet<scala.Tuple3<K,K,T>> inputDataSet,
scala.Function2<EV,T,EV> fun,
TypeInformation<T> evidence$90)
Joins the edge DataSet with an input DataSet on the composite key of both
source and target IDs and applies a user-defined transformation on the values
of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithEdgesOnSource(DataSet<scala.Tuple2<K,T>> inputDataSet,
EdgeJoinFunction<EV,T> edgeJoinFunction,
TypeInformation<T> evidence$91)
Joins the edge DataSet with an input Tuple2 DataSet and applies a user-defined transformation
on the values of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithEdgesOnSource(DataSet<scala.Tuple2<K,T>> inputDataSet,
scala.Function2<EV,T,EV> fun,
TypeInformation<T> evidence$92)
Joins the edge DataSet with an input Tuple2 DataSet and applies a user-defined transformation
on the values of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithEdgesOnTarget(DataSet<scala.Tuple2<K,T>> inputDataSet,
EdgeJoinFunction<EV,T> edgeJoinFunction,
TypeInformation<T> evidence$93)
Joins the edge DataSet with an input Tuple2 DataSet and applies a user-defined transformation
on the values of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithEdgesOnTarget(DataSet<scala.Tuple2<K,T>> inputDataSet,
scala.Function2<EV,T,EV> fun,
TypeInformation<T> evidence$94)
Joins the edge DataSet with an input Tuple2 DataSet and applies a user-defined transformation
on the values of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithVertices(DataSet<scala.Tuple2<K,T>> inputDataSet,
scala.Function2<VV,T,VV> fun,
TypeInformation<T> evidence$88)
Joins the vertex DataSet of this graph with an input Tuple2 DataSet and applies
a user-defined transformation on the values of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithVertices(DataSet<scala.Tuple2<K,T>> inputDataSet,
VertexJoinFunction<VV,T> vertexJoinFunction,
TypeInformation<T> evidence$87)
Joins the vertex DataSet of this graph with an input Tuple2 DataSet and applies
a user-defined transformation on the values of the matched records.
|
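The joinWith* family matches a Tuple2 (or Tuple3, for joinWithEdges) DataSet against vertex or edge keys and merges the values of matched records; unmatched elements keep their current value. A sketch of the Function2 variant of joinWithVertices, assuming a Graph[Long, Double, Double] named graph and an ExecutionEnvironment env from earlier:

```scala
import org.apache.flink.api.scala._

// Pairs of (vertexId, delta) to fold into the existing vertex values.
val deltas: DataSet[(Long, Double)] = env.fromElements((1L, 0.9), (2L, 0.1))

// For each matched vertex, the Function2 receives the current vertex
// value and the joined value and returns the new vertex value.
val rescored = graph.joinWithVertices(
  deltas,
  (currentValue: Double, delta: Double) => currentValue + delta)
```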
<NV> Graph<K,VV,NV> |
Graph.mapEdges(scala.Function1<Edge<K,EV>,NV> fun,
TypeInformation<NV> evidence$73,
scala.reflect.ClassTag<NV> evidence$74)
Apply a function to the attribute of each edge in the graph.
|
<NV> Graph<K,VV,NV> |
Graph.mapEdges(MapFunction<Edge<K,EV>,NV> mapper,
TypeInformation<NV> evidence$71,
scala.reflect.ClassTag<NV> evidence$72)
Apply a function to the attribute of each edge in the graph.
|
<NV> Graph<K,NV,EV> |
Graph.mapVertices(scala.Function1<Vertex<K,VV>,NV> fun,
TypeInformation<NV> evidence$69,
scala.reflect.ClassTag<NV> evidence$70)
Apply a function to the attribute of each vertex in the graph.
|
<NV> Graph<K,NV,EV> |
Graph.mapVertices(MapFunction<Vertex<K,VV>,NV> mapper,
TypeInformation<NV> evidence$67,
scala.reflect.ClassTag<NV> evidence$68)
Apply a function to the attribute of each vertex in the graph.
|
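mapVertices and mapEdges rewrite the vertex or edge value, possibly changing its type; the new type's TypeInformation and ClassTag evidence are again implicit. A sketch using the Scala Function1 overload, assuming a Graph[Long, Double, Double] named graph from earlier:

```scala
import org.apache.flink.graph.scala.Graph

// Replace each Double vertex value with a descriptive String; the new
// vertex-value type (String) is carried by implicit evidence.
val labelled: Graph[Long, String, Double] =
  graph.mapVertices(v => s"id=${v.getId}, value=${v.getValue}")
```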
<T> T |
Graph.run(GraphAlgorithm<K,VV,EV,T> algorithm,
TypeInformation<T> evidence$103,
scala.reflect.ClassTag<T> evidence$104) |
<T> GraphAnalytic<K,VV,EV,T> |
Graph.run(GraphAnalytic<K,VV,EV,T> analytic,
TypeInformation<T> evidence$105,
scala.reflect.ClassTag<T> evidence$106)
A GraphAnalytic is similar to a GraphAlgorithm but is terminal and results
are retrieved via accumulators.
|
<NEW> Graph<K,VV,NEW> |
Graph.translateEdgeValues(scala.Function2<EV,NEW,NEW> fun,
TypeInformation<NEW> evidence$85,
scala.reflect.ClassTag<NEW> evidence$86)
Translate edge values using the given function.
|
<NEW> Graph<K,VV,NEW> |
Graph.translateEdgeValues(TranslateFunction<EV,NEW> translator,
TypeInformation<NEW> evidence$83,
scala.reflect.ClassTag<NEW> evidence$84)
Translate edge values using the given MapFunction.
|
<NEW> Graph<NEW,VV,EV> |
Graph.translateGraphIds(scala.Function2<K,NEW,NEW> fun,
TypeInformation<NEW> evidence$77,
scala.reflect.ClassTag<NEW> evidence$78)
Translate vertex and edge IDs using the given function.
|
<NEW> Graph<NEW,VV,EV> |
Graph.translateGraphIds(TranslateFunction<K,NEW> translator,
TypeInformation<NEW> evidence$75,
scala.reflect.ClassTag<NEW> evidence$76)
Translate vertex and edge IDs using the given MapFunction.
|
<NEW> Graph<K,NEW,EV> |
Graph.translateVertexValues(scala.Function2<VV,NEW,NEW> fun,
TypeInformation<NEW> evidence$81,
scala.reflect.ClassTag<NEW> evidence$82)
Translate vertex values using the given function.
|
<NEW> Graph<K,NEW,EV> |
Graph.translateVertexValues(TranslateFunction<VV,NEW> translator,
TypeInformation<NEW> evidence$79,
scala.reflect.ClassTag<NEW> evidence$80)
Translate vertex values using the given MapFunction.
|
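The translate* methods differ from map* in that they follow the TranslateFunction contract: the function receives the current value plus a reusable output object (which may be null on the first call). A sketch of the Function2 overload of translateGraphIds, assuming a Graph[Long, Double, Double] named graph from earlier; the reuse parameter is ignored here:

```scala
import org.apache.flink.graph.scala.Graph

// Rewrite every vertex and edge ID from Long to a prefixed String.
// The second parameter is the reusable output object from the
// translate API; this sketch allocates a fresh String instead.
val prefixed: Graph[String, Double, Double] =
  graph.translateGraphIds((id: Long, _: String) => s"v$id")
```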
Constructor and Description |
---|
Graph(Graph<K,VV,EV> jgraph,
TypeInformation<K> evidence$61,
scala.reflect.ClassTag<K> evidence$62,
TypeInformation<VV> evidence$63,
scala.reflect.ClassTag<VV> evidence$64,
TypeInformation<EV> evidence$65,
scala.reflect.ClassTag<EV> evidence$66) |
Modifier and Type | Class and Description |
---|---|
class |
ValueArrayTypeInfo<T>
A TypeInformation for the ValueArray type. |
Modifier and Type | Method and Description |
---|---|
TypeInformation<ValueArray<T>> |
ValueArrayTypeInfoFactory.createTypeInfo(Type t,
Map<String,TypeInformation<?>> genericParameters) |
Modifier and Type | Method and Description |
---|---|
Map<String,TypeInformation<?>> |
ValueArrayTypeInfo.getGenericParameters() |
Modifier and Type | Method and Description |
---|---|
TypeInformation<ValueArray<T>> |
ValueArrayTypeInfoFactory.createTypeInfo(Type t,
Map<String,TypeInformation<?>> genericParameters) |
Constructor and Description |
---|
ValueArrayTypeInfo(TypeInformation<T> valueType) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<O> |
GraphUtils.MapTo.getProducedType() |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Tuple2<KEYOUT,VALUEOUT>> |
HadoopReduceFunction.getProducedType() |
TypeInformation<Tuple2<KEYOUT,VALUEOUT>> |
HadoopReduceCombineFunction.getProducedType() |
TypeInformation<Tuple2<KEYOUT,VALUEOUT>> |
HadoopMapFunction.getProducedType() |
Modifier and Type | Method and Description |
---|---|
<K,V> HadoopInputFormat<K,V> |
HadoopInputs$.createHadoopInput(org.apache.hadoop.mapred.InputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
org.apache.hadoop.mapred.JobConf job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop InputFormat. |
static <K,V> HadoopInputFormat<K,V> |
HadoopInputs.createHadoopInput(org.apache.hadoop.mapred.InputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
org.apache.hadoop.mapred.JobConf job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop InputFormat. |
<K,V> HadoopInputFormat<K,V> |
HadoopInputs$.createHadoopInput(org.apache.hadoop.mapreduce.InputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
org.apache.hadoop.mapreduce.Job job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop InputFormat. |
static <K,V> HadoopInputFormat<K,V> |
HadoopInputs.createHadoopInput(org.apache.hadoop.mapreduce.InputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
org.apache.hadoop.mapreduce.Job job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop InputFormat. |
<K,V> HadoopInputFormat<K,V> |
HadoopInputs$.readHadoopFile(org.apache.hadoop.mapred.FileInputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
org.apache.hadoop.mapred.JobConf job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop FileInputFormat. |
static <K,V> HadoopInputFormat<K,V> |
HadoopInputs.readHadoopFile(org.apache.hadoop.mapred.FileInputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
org.apache.hadoop.mapred.JobConf job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop FileInputFormat. |
<K,V> HadoopInputFormat<K,V> |
HadoopInputs$.readHadoopFile(org.apache.hadoop.mapreduce.lib.input.FileInputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
org.apache.hadoop.mapreduce.Job job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop FileInputFormat. |
static <K,V> HadoopInputFormat<K,V> |
HadoopInputs.readHadoopFile(org.apache.hadoop.mapreduce.lib.input.FileInputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
org.apache.hadoop.mapreduce.Job job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop FileInputFormat. |
<K,V> HadoopInputFormat<K,V> |
HadoopInputs$.readHadoopFile(org.apache.hadoop.mapred.FileInputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop FileInputFormat. |
<K,V> HadoopInputFormat<K,V> |
HadoopInputs$.readHadoopFile(org.apache.hadoop.mapreduce.lib.input.FileInputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop FileInputFormat. |
static <K,V> HadoopInputFormat<K,V> |
HadoopInputs.readHadoopFile(org.apache.hadoop.mapred.FileInputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop FileInputFormat. |
static <K,V> HadoopInputFormat<K,V> |
HadoopInputs.readHadoopFile(org.apache.hadoop.mapreduce.lib.input.FileInputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that wraps the given Hadoop FileInputFormat. |
<K,V> HadoopInputFormat<K,V> |
HadoopInputs$.readSequenceFile(Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that reads a Hadoop sequence file with the given key and value classes. |
static <K,V> HadoopInputFormat<K,V> |
HadoopInputs.readSequenceFile(Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a Flink InputFormat that reads a Hadoop sequence file with the given key and value classes. |
Modifier and Type | Method and Description |
---|---|
TypeInformation<T> |
HCatInputFormatBase.getProducedType() |
Constructor and Description |
---|
ListStateDescriptor(String name,
TypeInformation<T> typeInfo)
Deprecated.
Creates a new ListStateDescriptor with the given name and list element type. |
Modifier and Type | Method and Description |
---|---|
static <T> DataSet<Block<T>> |
FlinkMLTools.block(DataSet<T> input,
int numBlocks,
scala.Option<Partitioner<Object>> partitionerOption,
TypeInformation<T> evidence$31,
scala.reflect.ClassTag<T> evidence$32)
Groups the DataSet input into numBlocks blocks.
|
<T> DataSet<Block<T>> |
FlinkMLTools$.block(DataSet<T> input,
int numBlocks,
scala.Option<Partitioner<Object>> partitionerOption,
TypeInformation<T> evidence$31,
scala.reflect.ClassTag<T> evidence$32)
Groups the DataSet input into numBlocks blocks.
|
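FlinkMLTools.block splits a DataSet into a fixed number of Block containers, which FlinkML uses for block-wise algorithms. A sketch with no custom partitioner (so None is passed for partitionerOption); values are illustrative:

```scala
import org.apache.flink.api.scala._
import org.apache.flink.ml.common.FlinkMLTools

object BlockSketch {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment

    // Group 100 Int elements into 4 blocks; the TypeInformation and
    // ClassTag evidence parameters are supplied implicitly.
    val data: DataSet[Int] = env.fromCollection(1 to 100)
    val blocked = FlinkMLTools.block(data, 4, None)
  }
}
```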
static <A,B,C,D,E> |
FlinkMLTools.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
DataSet<D> ds4,
DataSet<E> ds5,
String path1,
String path2,
String path3,
String path4,
String path5,
scala.reflect.ClassTag<A> evidence$21,
TypeInformation<A> evidence$22,
scala.reflect.ClassTag<B> evidence$23,
TypeInformation<B> evidence$24,
scala.reflect.ClassTag<C> evidence$25,
TypeInformation<C> evidence$26,
scala.reflect.ClassTag<D> evidence$27,
TypeInformation<D> evidence$28,
scala.reflect.ClassTag<E> evidence$29,
TypeInformation<E> evidence$30)
Writes multiple DataSets to the specified paths and returns them as DataSources for subsequent operations. |
<A,B,C,D,E> |
FlinkMLTools$.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
DataSet<D> ds4,
DataSet<E> ds5,
String path1,
String path2,
String path3,
String path4,
String path5,
scala.reflect.ClassTag<A> evidence$21,
TypeInformation<A> evidence$22,
scala.reflect.ClassTag<B> evidence$23,
TypeInformation<B> evidence$24,
scala.reflect.ClassTag<C> evidence$25,
TypeInformation<C> evidence$26,
scala.reflect.ClassTag<D> evidence$27,
TypeInformation<D> evidence$28,
scala.reflect.ClassTag<E> evidence$29,
TypeInformation<E> evidence$30)
Writes multiple DataSets to the specified paths and returns them as DataSources for subsequent operations. |
static <A,B,C,D> scala.Tuple4<DataSet<A>,DataSet<B>,DataSet<C>,DataSet<D>> |
FlinkMLTools.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
DataSet<D> ds4,
String path1,
String path2,
String path3,
String path4,
scala.reflect.ClassTag<A> evidence$13,
TypeInformation<A> evidence$14,
scala.reflect.ClassTag<B> evidence$15,
TypeInformation<B> evidence$16,
scala.reflect.ClassTag<C> evidence$17,
TypeInformation<C> evidence$18,
scala.reflect.ClassTag<D> evidence$19,
TypeInformation<D> evidence$20)
Writes multiple DataSets to the specified paths and returns them as DataSources for subsequent operations. |
<A,B,C,D> scala.Tuple4<DataSet<A>,DataSet<B>,DataSet<C>,DataSet<D>> |
FlinkMLTools$.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
DataSet<D> ds4,
String path1,
String path2,
String path3,
String path4,
scala.reflect.ClassTag<A> evidence$13,
TypeInformation<A> evidence$14,
scala.reflect.ClassTag<B> evidence$15,
TypeInformation<B> evidence$16,
scala.reflect.ClassTag<C> evidence$17,
TypeInformation<C> evidence$18,
scala.reflect.ClassTag<D> evidence$19,
TypeInformation<D> evidence$20)
Writes multiple DataSets to the specified paths and returns them as DataSources for subsequent operations. |
static <A,B,C> scala.Tuple3<DataSet<A>,DataSet<B>,DataSet<C>> |
FlinkMLTools.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
String path1,
String path2,
String path3,
scala.reflect.ClassTag<A> evidence$7,
TypeInformation<A> evidence$8,
scala.reflect.ClassTag<B> evidence$9,
TypeInformation<B> evidence$10,
scala.reflect.ClassTag<C> evidence$11,
TypeInformation<C> evidence$12)
Writes multiple DataSets to the specified paths and returns them as DataSources for subsequent operations. |
<A,B,C> scala.Tuple3<DataSet<A>,DataSet<B>,DataSet<C>> |
FlinkMLTools$.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
String path1,
String path2,
String path3,
scala.reflect.ClassTag<A> evidence$7,
TypeInformation<A> evidence$8,
scala.reflect.ClassTag<B> evidence$9,
TypeInformation<B> evidence$10,
scala.reflect.ClassTag<C> evidence$11,
TypeInformation<C> evidence$12)
Writes multiple DataSets to the specified paths and returns them as DataSources for subsequent operations. |
static <A,B> scala.Tuple2<DataSet<A>,DataSet<B>> |
FlinkMLTools.persist(DataSet<A> ds1,
DataSet<B> ds2,
String path1,
String path2,
scala.reflect.ClassTag<A> evidence$3,
TypeInformation<A> evidence$4,
scala.reflect.ClassTag<B> evidence$5,
TypeInformation<B> evidence$6)
Writes multiple DataSets to the specified paths and returns them as DataSources for subsequent operations. |
<A,B> scala.Tuple2<DataSet<A>,DataSet<B>> |
FlinkMLTools$.persist(DataSet<A> ds1,
DataSet<B> ds2,
String path1,
String path2,
scala.reflect.ClassTag<A> evidence$3,
TypeInformation<A> evidence$4,
scala.reflect.ClassTag<B> evidence$5,
TypeInformation<B> evidence$6)
Writes multiple DataSets to the specified paths and returns them as DataSources for subsequent operations. |
static <T> DataSet<T> |
FlinkMLTools.persist(DataSet<T> dataset,
String path,
scala.reflect.ClassTag<T> evidence$1,
TypeInformation<T> evidence$2)
Writes a DataSet to the specified path and returns it as a DataSource for subsequent operations. |
<T> DataSet<T> |
FlinkMLTools$.persist(DataSet<T> dataset,
String path,
scala.reflect.ClassTag<T> evidence$1,
TypeInformation<T> evidence$2)
Writes a DataSet to the specified path and returns it as a DataSource for subsequent operations. |
Modifier and Type | Method and Description |
---|---|
static <T extends Vector> |
KNN.fitKNN(TypeInformation<T> evidence$1)
FitOperation which trains a KNN based on the given training data set. |
<T extends Vector> |
KNN$.fitKNN(TypeInformation<T> evidence$1)
FitOperation which trains a KNN based on the given training data set. |
static <T extends Vector> |
KNN.predictValues(scala.reflect.ClassTag<T> evidence$2,
TypeInformation<T> evidence$3)
PredictDataSetOperation which calculates k-nearest neighbors of the given testing data
set. |
<T extends Vector> |
KNN$.predictValues(scala.reflect.ClassTag<T> evidence$2,
TypeInformation<T> evidence$3)
PredictDataSetOperation which calculates k-nearest neighbors of the given testing data
set. |
Modifier and Type | Method and Description |
---|---|
static <T extends Vector> |
StochasticOutlierSelection.transformVectors(BreezeVectorConverter<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
TransformDataSetOperation that applies the stochastic outlier selection algorithm on a
Vector, transforming the high-dimensional input to a single Double output. |
<T extends Vector> |
StochasticOutlierSelection$.transformVectors(BreezeVectorConverter<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
TransformDataSetOperation that applies the stochastic outlier selection algorithm on a
Vector, transforming the high-dimensional input to a single Double output. |
Modifier and Type | Method and Description |
---|---|
<T extends Transformer<T>,P extends Predictor<P>,Testing,Intermediate,PredictionValue> |
ChainedPredictor$.chainedEvaluationOperation(TransformDataSetOperation<T,Testing,Intermediate> transformOperation,
EvaluateDataSetOperation<P,Intermediate,PredictionValue> evaluateOperation,
TypeInformation<Testing> testingTypeInformation,
TypeInformation<PredictionValue> predictionValueTypeInformation) |
static <T extends Transformer<T>,P extends Predictor<P>,Testing,Intermediate,PredictionValue> |
ChainedPredictor.chainedEvaluationOperation(TransformDataSetOperation<T,Testing,Intermediate> transformOperation,
EvaluateDataSetOperation<P,Intermediate,PredictionValue> evaluateOperation,
TypeInformation<Testing> testingTypeInformation,
TypeInformation<PredictionValue> predictionValueTypeInformation) |
<Instance extends Estimator<Instance>,Model,Testing,PredictionValue> |
Predictor$.defaultEvaluateDataSetOperation(PredictOperation<Instance,Model,Testing,PredictionValue> predictOperation,
TypeInformation<Testing> testingTypeInformation,
TypeInformation<PredictionValue> predictionValueTypeInformation)
Default
EvaluateDataSetOperation which takes a PredictOperation to calculate a tuple
of true label value and predicted label value. |
<Instance extends Estimator<Instance>,Model,Testing,PredictionValue> |
Predictor$.defaultPredictDataSetOperation(PredictOperation<Instance,Model,Testing,PredictionValue> predictOperation,
TypeInformation<Testing> testingTypeInformation,
TypeInformation<PredictionValue> predictionValueTypeInformation)
Default
PredictDataSetOperation which takes a PredictOperation to calculate a tuple
of testing element and its prediction value. |
<Instance extends Estimator<Instance>,Model,Input,Output> |
Transformer$.defaultTransformDataSetOperation(TransformOperation<Instance,Model,Input,Output> transformOperation,
TypeInformation<Output> outputTypeInformation,
scala.reflect.ClassTag<Output> outputClassTag) |
Modifier and Type | Method and Description |
---|---|
<T extends Vector> |
StandardScaler$.fitLabelVectorTupleStandardScaler(BreezeVectorConverter<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
Trains the StandardScaler by learning the mean and standard deviation of the training
data, which is of type (Vector, Double). |
static <T extends Vector> |
StandardScaler.fitLabelVectorTupleStandardScaler(BreezeVectorConverter<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
Trains the StandardScaler by learning the mean and standard deviation of the training
data, which is of type (Vector, Double). |
<T> Splitter.TrainTestDataSet<T>[] |
Splitter$.kFoldSplit(DataSet<T> input,
int kFolds,
long seed,
TypeInformation<T> evidence$9,
scala.reflect.ClassTag<T> evidence$10)
Splits a DataSet into an array of TrainTestDataSets.
|
static <T> Splitter.TrainTestDataSet<T>[] |
Splitter.kFoldSplit(DataSet<T> input,
int kFolds,
long seed,
TypeInformation<T> evidence$9,
scala.reflect.ClassTag<T> evidence$10)
Splits a DataSet into an array of TrainTestDataSets.
|
<T> DataSet<T>[] |
Splitter$.multiRandomSplit(DataSet<T> input,
double[] fracArray,
long seed,
TypeInformation<T> evidence$7,
scala.reflect.ClassTag<T> evidence$8)
Splits a DataSet by the probability fraction of each element of a vector.
|
static <T> DataSet<T>[] |
Splitter.multiRandomSplit(DataSet<T> input,
double[] fracArray,
long seed,
TypeInformation<T> evidence$7,
scala.reflect.ClassTag<T> evidence$8)
Splits a DataSet by the probability fraction of each element of a vector.
|
<T> DataSet<T>[] |
Splitter$.randomSplit(DataSet<T> input,
double fraction,
boolean precise,
long seed,
TypeInformation<T> evidence$5,
scala.reflect.ClassTag<T> evidence$6)
Splits a DataSet by the probability fraction of each element.
|
static <T> DataSet<T>[] |
Splitter.randomSplit(DataSet<T> input,
double fraction,
boolean precise,
long seed,
TypeInformation<T> evidence$5,
scala.reflect.ClassTag<T> evidence$6)
Splits a DataSet by the probability fraction of each element.
|
<T> Splitter.TrainTestHoldoutDataSet<T> |
Splitter$.trainTestHoldoutSplit(DataSet<T> input,
scala.Tuple3<Object,Object,Object> fracTuple,
long seed,
TypeInformation<T> evidence$13,
scala.reflect.ClassTag<T> evidence$14)
A wrapper for multiRandomSplit that yields a TrainTestHoldoutDataSet.
|
static <T> Splitter.TrainTestHoldoutDataSet<T> |
Splitter.trainTestHoldoutSplit(DataSet<T> input,
scala.Tuple3<Object,Object,Object> fracTuple,
long seed,
TypeInformation<T> evidence$13,
scala.reflect.ClassTag<T> evidence$14)
A wrapper for multiRandomSplit that yields a TrainTestHoldoutDataSet.
|
<T> Splitter.TrainTestDataSet<T> |
Splitter$.trainTestSplit(DataSet<T> input,
double fraction,
boolean precise,
long seed,
TypeInformation<T> evidence$11,
scala.reflect.ClassTag<T> evidence$12)
A wrapper for randomSplit that yields a TrainTestDataSet.
|
static <T> Splitter.TrainTestDataSet<T> |
Splitter.trainTestSplit(DataSet<T> input,
double fraction,
boolean precise,
long seed,
TypeInformation<T> evidence$11,
scala.reflect.ClassTag<T> evidence$12)
A wrapper for randomSplit that yields a TrainTestDataSet
|
<T extends Vector> |
StandardScaler$.transformTupleVectorDouble(BreezeVectorConverter<T> evidence$10,
TypeInformation<T> evidence$11,
scala.reflect.ClassTag<T> evidence$12)
|
static <T extends Vector> |
StandardScaler.transformTupleVectorDouble(BreezeVectorConverter<T> evidence$10,
TypeInformation<T> evidence$11,
scala.reflect.ClassTag<T> evidence$12)
|
static <T extends Vector> |
PolynomialFeatures.transformVectorIntoPolynomialBase(VectorBuilder<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
TransformDataSetOperation to map a Vector into the
polynomial feature space. |
<T extends Vector> |
PolynomialFeatures$.transformVectorIntoPolynomialBase(VectorBuilder<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
TransformDataSetOperation to map a Vector into the
polynomial feature space. |
<T extends Vector> |
StandardScaler$.transformVectors(BreezeVectorConverter<T> evidence$7,
TypeInformation<T> evidence$8,
scala.reflect.ClassTag<T> evidence$9)
TransformOperation to transform Vector types |
<T extends Vector> |
MinMaxScaler$.transformVectors(BreezeVectorConverter<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
TransformDataSetOperation which scales input data of subtype of Vector with respect to
the calculated minimum and maximum of the training data. |
static <T extends Vector> |
MinMaxScaler.transformVectors(BreezeVectorConverter<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
TransformDataSetOperation which scales input data of subtype of Vector with respect to
the calculated minimum and maximum of the training data. |
static <T extends Vector> |
StandardScaler.transformVectors(BreezeVectorConverter<T> evidence$7,
TypeInformation<T> evidence$8,
scala.reflect.ClassTag<T> evidence$9)
TransformOperation to transform Vector types |
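The Splitter methods listed above take an explicit seed plus implicit ClassTag and TypeInformation evidence. A sketch of a train/test split in the Scala API (the fraction, seed, and data are illustrative):

```scala
import org.apache.flink.api.scala._
import org.apache.flink.ml.preprocessing.Splitter

object SplitExample {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val data: DataSet[Double] = env.fromCollection((1 to 100).map(_.toDouble))

    // 80% training / 20% testing; `precise = false` samples each element
    // independently, so the split sizes are only approximate.
    val split = Splitter.trainTestSplit(data, 0.8, precise = false, seed = 42L)

    split.training.print()
    split.testing.print()
  }
}
```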
Constructor and Description |
---|
StandardScalerTransformOperation(TypeInformation<T> evidence$4,
scala.reflect.ClassTag<T> evidence$5) |
TrainTestDataSet(DataSet<T> training,
DataSet<T> testing,
TypeInformation<T> evidence$1,
scala.reflect.ClassTag<T> evidence$2) |
TrainTestHoldoutDataSet(DataSet<T> training,
DataSet<T> testing,
DataSet<T> holdout,
TypeInformation<T> evidence$3,
scala.reflect.ClassTag<T> evidence$4) |
Constructor and Description |
---|
NoOpBinaryUdfOp(TypeInformation<OUT> type) |
Modifier and Type | Field and Description |
---|---|
TypeInformation<?> |
PythonOperationInfo.types |
Modifier and Type | Method and Description |
---|---|
TypeInformation<OUT> |
PythonMapPartition.getProducedType() |
TypeInformation<OUT> |
PythonCoGroup.getProducedType() |
Constructor and Description |
---|
PythonCoGroup(Configuration config,
int envID,
int setID,
TypeInformation<OUT> typeInformation) |
PythonMapPartition(Configuration config,
int envId,
int setId,
TypeInformation<OUT> typeInformation) |
Modifier and Type | Method and Description |
---|---|
static TypeInformation<?>[] |
StreamProjection.extractFieldTypes(int[] fields,
TypeInformation<?> inType) |
TypeInformation<T> |
WindowedStream.getInputType() |
TypeInformation<T> |
AllWindowedStream.getInputType() |
TypeInformation<KEY> |
KeyedStream.getKeyType()
Gets the type of the key by which the stream is partitioned.
|
TypeInformation<T> |
DataStream.getType()
Gets the type of the stream.
|
TypeInformation<IN1> |
ConnectedStreams.getType1()
Gets the type of the first input.
|
TypeInformation<IN2> |
ConnectedStreams.getType2()
Gets the type of the second input.
|
Modifier and Type | Method and Description |
---|---|
<ACC,R> SingleOutputStreamOperator<R> |
WindowedStream.aggregate(AggregateFunction<T,ACC,R> function,
TypeInformation<ACC> accumulatorType,
TypeInformation<R> resultType)
Applies the given aggregation function to each window.
|
<ACC,R> SingleOutputStreamOperator<R> |
AllWindowedStream.aggregate(AggregateFunction<T,ACC,R> function,
TypeInformation<ACC> accumulatorType,
TypeInformation<R> resultType)
Applies the given
AggregateFunction to each window. |
<ACC,V,R> SingleOutputStreamOperator<R> |
AllWindowedStream.aggregate(AggregateFunction<T,ACC,V> aggregateFunction,
AllWindowFunction<V,R,W> windowFunction,
TypeInformation<ACC> accumulatorType,
TypeInformation<V> aggregateResultType,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<ACC,V,R> SingleOutputStreamOperator<R> |
AllWindowedStream.aggregate(AggregateFunction<T,ACC,V> aggregateFunction,
ProcessAllWindowFunction<V,R,W> windowFunction,
TypeInformation<ACC> accumulatorType,
TypeInformation<V> aggregateResultType,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<ACC,V,R> SingleOutputStreamOperator<R> |
WindowedStream.aggregate(AggregateFunction<T,ACC,V> aggregateFunction,
ProcessWindowFunction<V,R,K,W> windowFunction,
TypeInformation<ACC> accumulatorType,
TypeInformation<V> aggregateResultType,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<ACC,V,R> SingleOutputStreamOperator<R> |
WindowedStream.aggregate(AggregateFunction<T,ACC,V> aggregateFunction,
WindowFunction<V,R,K,W> windowFunction,
TypeInformation<ACC> accumulatorType,
TypeInformation<V> aggregateResultType,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<R> SingleOutputStreamOperator<R> |
AllWindowedStream.apply(AllWindowFunction<T,R,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<T> DataStream<T> |
CoGroupedStreams.WithWindow.apply(CoGroupFunction<T1,T2,T> function,
TypeInformation<T> resultType)
Completes the co-group operation with the user function that is executed
for windowed groups.
|
<T> DataStream<T> |
JoinedStreams.WithWindow.apply(FlatJoinFunction<T1,T2,T> function,
TypeInformation<T> resultType)
Completes the join operation with the user function that is executed
for each combination of elements with the same key in a window.
|
<T> DataStream<T> |
JoinedStreams.WithWindow.apply(JoinFunction<T1,T2,T> function,
TypeInformation<T> resultType)
Completes the join operation with the user function that is executed
for each combination of elements with the same key in a window.
|
<R> SingleOutputStreamOperator<R> |
AllWindowedStream.apply(ReduceFunction<T> reduceFunction,
AllWindowFunction<T,R,W> function,
TypeInformation<R> resultType)
Deprecated.
|
<R> SingleOutputStreamOperator<R> |
WindowedStream.apply(ReduceFunction<T> reduceFunction,
WindowFunction<T,R,K,W> function,
TypeInformation<R> resultType)
Deprecated.
|
<R> SingleOutputStreamOperator<R> |
AllWindowedStream.apply(R initialValue,
FoldFunction<T,R> foldFunction,
AllWindowFunction<R,R,W> function,
TypeInformation<R> resultType)
Deprecated.
|
<R> SingleOutputStreamOperator<R> |
WindowedStream.apply(R initialValue,
FoldFunction<T,R> foldFunction,
WindowFunction<R,R,K,W> function,
TypeInformation<R> resultType)
Deprecated.
|
<R> SingleOutputStreamOperator<R> |
WindowedStream.apply(WindowFunction<T,R,K,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
static TypeInformation<?>[] |
StreamProjection.extractFieldTypes(int[] fields,
TypeInformation<?> inType) |
<ACC,R> SingleOutputStreamOperator<R> |
AllWindowedStream.fold(ACC initialValue,
FoldFunction<T,ACC> foldFunction,
AllWindowFunction<ACC,R,W> function,
TypeInformation<ACC> foldAccumulatorType,
TypeInformation<R> resultType)
|
<ACC,R> SingleOutputStreamOperator<R> |
AllWindowedStream.fold(ACC initialValue,
FoldFunction<T,ACC> foldFunction,
ProcessAllWindowFunction<ACC,R,W> function,
TypeInformation<ACC> foldAccumulatorType,
TypeInformation<R> resultType)
|
<R,ACC> SingleOutputStreamOperator<R> |
WindowedStream.fold(ACC initialValue,
FoldFunction<T,ACC> foldFunction,
ProcessWindowFunction<ACC,R,K,W> windowFunction,
TypeInformation<ACC> foldResultType,
TypeInformation<R> windowResultType)
|
<ACC,R> SingleOutputStreamOperator<R> |
WindowedStream.fold(ACC initialValue,
FoldFunction<T,ACC> foldFunction,
WindowFunction<ACC,R,K,W> function,
TypeInformation<ACC> foldAccumulatorType,
TypeInformation<R> resultType)
|
<R> SingleOutputStreamOperator<R> |
WindowedStream.fold(R initialValue,
FoldFunction<T,R> function,
TypeInformation<R> resultType)
Deprecated.
|
<R> SingleOutputStreamOperator<R> |
AllWindowedStream.fold(R initialValue,
FoldFunction<T,R> function,
TypeInformation<R> resultType)
Deprecated.
|
<R> SingleOutputStreamOperator<R> |
ConnectedStreams.process(CoProcessFunction<IN1,IN2,R> coProcessFunction,
TypeInformation<R> outputType)
Applies the given
CoProcessFunction on the connected input streams,
thereby creating a transformed output stream. |
<R> SingleOutputStreamOperator<R> |
AllWindowedStream.process(ProcessAllWindowFunction<T,R,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<R> SingleOutputStreamOperator<R> |
KeyedStream.process(ProcessFunction<T,R> processFunction,
TypeInformation<R> outputType)
Applies the given
ProcessFunction on the input stream, thereby
creating a transformed output stream. |
<R> SingleOutputStreamOperator<R> |
DataStream.process(ProcessFunction<T,R> processFunction,
TypeInformation<R> outputType)
Applies the given
ProcessFunction on the input stream, thereby
creating a transformed output stream. |
<R> SingleOutputStreamOperator<R> |
WindowedStream.process(ProcessWindowFunction<T,R,K,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<R> SingleOutputStreamOperator<R> |
AllWindowedStream.reduce(ReduceFunction<T> reduceFunction,
AllWindowFunction<T,R,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<R> SingleOutputStreamOperator<R> |
AllWindowedStream.reduce(ReduceFunction<T> reduceFunction,
ProcessAllWindowFunction<T,R,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<R> SingleOutputStreamOperator<R> |
WindowedStream.reduce(ReduceFunction<T> reduceFunction,
ProcessWindowFunction<T,R,K,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<R> SingleOutputStreamOperator<R> |
WindowedStream.reduce(ReduceFunction<T> reduceFunction,
WindowFunction<T,R,K,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
SingleOutputStreamOperator<T> |
SingleOutputStreamOperator.returns(TypeInformation<T> typeInfo)
Adds a type information hint about the return type of this operator.
|
<R> SingleOutputStreamOperator<R> |
KeyedStream.transform(String operatorName,
TypeInformation<R> outTypeInfo,
OneInputStreamOperator<T,R> operator) |
<R> SingleOutputStreamOperator<R> |
DataStream.transform(String operatorName,
TypeInformation<R> outTypeInfo,
OneInputStreamOperator<T,R> operator)
Method for passing user-defined operators along with the type information that will transform the DataStream.
|
<R> SingleOutputStreamOperator<R> |
ConnectedStreams.transform(String functionName,
TypeInformation<R> outTypeInfo,
TwoInputStreamOperator<IN1,IN2,R> operator) |
<T> SingleOutputStreamOperator<T> |
CoGroupedStreams.WithWindow.with(CoGroupFunction<T1,T2,T> function,
TypeInformation<T> resultType)
Deprecated.
This method will be removed once the
CoGroupedStreams.WithWindow.apply(CoGroupFunction, TypeInformation)
method is fixed in the next major version of Flink (2.0). |
<T> SingleOutputStreamOperator<T> |
JoinedStreams.WithWindow.with(FlatJoinFunction<T1,T2,T> function,
TypeInformation<T> resultType)
Deprecated.
This method will be replaced by
JoinedStreams.WithWindow.apply(FlatJoinFunction, TypeInformation) in Flink 2.0;
use that method instead. |
<T> SingleOutputStreamOperator<T> |
JoinedStreams.WithWindow.with(JoinFunction<T1,T2,T> function,
TypeInformation<T> resultType)
Deprecated.
This method will be removed once the
JoinedStreams.WithWindow.apply(JoinFunction, TypeInformation)
method is fixed in the next major version of Flink (2.0). |
<F> IterativeStream.ConnectedIterativeStreams<T,F> |
IterativeStream.withFeedbackType(TypeInformation<F> feedbackType)
Changes the feedback type of the iteration and allows the user to apply
co-transformations on the input and feedback stream, as in a
ConnectedStreams . |
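Most of the explicit-TypeInformation overloads above exist because the Java API cannot recover generic types through erasure; in the Scala API the same information is derived implicitly. A hedged sketch of the aggregate pattern from the table above (the AggregateFunction, window size, and data are illustrative):

```scala
import org.apache.flink.api.common.functions.AggregateFunction
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time.Time

// Average of the second tuple field: the accumulator is (sum, count).
class AvgAggregate extends AggregateFunction[(String, Long), (Long, Long), Double] {
  override def createAccumulator(): (Long, Long) = (0L, 0L)
  override def add(in: (String, Long), acc: (Long, Long)): (Long, Long) =
    (acc._1 + in._2, acc._2 + 1)
  override def getResult(acc: (Long, Long)): Double = acc._1.toDouble / acc._2
  override def merge(a: (Long, Long), b: (Long, Long)): (Long, Long) =
    (a._1 + b._1, a._2 + b._2)
}

object AggregateExample {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.fromElements(("a", 1L), ("a", 3L), ("b", 5L))
      .keyBy(_._1)
      .timeWindow(Time.seconds(5))
      .aggregate(new AvgAggregate) // ACC and R TypeInformation supplied implicitly
      .print()
    env.execute("aggregate sketch")
  }
}
```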
Modifier and Type | Method and Description |
---|---|
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.addSource(SourceFunction<OUT> function,
String sourceName,
TypeInformation<OUT> typeInfo)
Adds a data source with custom type information, thus opening a DataStream. |
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.addSource(SourceFunction<OUT> function,
TypeInformation<OUT> typeInfo)
Adds a data source with custom type information, thus opening a DataStream. |
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.createInput(InputFormat<OUT,?> inputFormat,
TypeInformation<OUT> typeInfo)
Generic method to create an input data stream with
InputFormat . |
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.fromCollection(Collection<OUT> data,
TypeInformation<OUT> typeInfo)
Creates a data stream from the given non-empty collection.
|
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.fromCollection(Iterator<OUT> data,
TypeInformation<OUT> typeInfo)
Creates a data stream from the given iterator.
|
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.fromParallelCollection(SplittableIterator<OUT> iterator,
TypeInformation<OUT> typeInfo)
Creates a new data stream that contains elements in the iterator.
|
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.readFile(FileInputFormat<OUT> inputFormat,
String filePath,
FileProcessingMode watchType,
long interval,
TypeInformation<OUT> typeInformation)
Reads the contents of the user-specified
filePath based on the given FileInputFormat . |
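The StreamExecutionEnvironment methods above accept an explicit TypeInformation for cases where it cannot be extracted from the source itself. In the Scala API the implicit can also be passed explicitly, mirroring those Java overloads (a sketch; the data is illustrative):

```scala
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.streaming.api.scala._

object SourceTypeInfoExample {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // Normally the macro-generated implicit suffices; passing the
    // TypeInformation explicitly mirrors the Java overloads above.
    val info: TypeInformation[(Int, String)] = createTypeInformation[(Int, String)]
    val stream: DataStream[(Int, String)] =
      env.fromCollection(Seq((1, "a"), (2, "b")))(info)

    stream.print()
    env.execute("source sketch")
  }
}
```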
Constructor and Description |
---|
ComparableAggregator(int positionToAggregate,
TypeInformation<T> typeInfo,
AggregationFunction.AggregationType aggregationType,
boolean first,
ExecutionConfig config) |
ComparableAggregator(int positionToAggregate,
TypeInformation<T> typeInfo,
AggregationFunction.AggregationType aggregationType,
ExecutionConfig config) |
ComparableAggregator(String field,
TypeInformation<T> typeInfo,
AggregationFunction.AggregationType aggregationType,
boolean first,
ExecutionConfig config) |
SumAggregator(int pos,
TypeInformation<T> typeInfo,
ExecutionConfig config) |
SumAggregator(String field,
TypeInformation<T> typeInfo,
ExecutionConfig config) |
Modifier and Type | Method and Description |
---|---|
void |
OutputFormatSinkFunction.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig) |
Modifier and Type | Method and Description |
---|---|
void |
ContinuousFileReaderOperator.setOutputType(TypeInformation<OUT> outTypeInfo,
ExecutionConfig executionConfig) |
Constructor and Description |
---|
InputFormatSourceFunction(InputFormat<OUT,?> format,
TypeInformation<OUT> typeInfo) |
MessageAcknowledgingSourceBase(TypeInformation<UId> idTypeInfo)
Creates a new MessageAcknowledgingSourceBase for IDs of the given type.
|
MultipleIdsMessageAcknowledgingSourceBase(TypeInformation<UId> idTypeInfo)
Creates a new MultipleIdsMessageAcknowledgingSourceBase for IDs of the given type.
|
Modifier and Type | Method and Description |
---|---|
void |
FoldApplyWindowFunction.setOutputType(TypeInformation<R> outTypeInfo,
ExecutionConfig executionConfig)
Deprecated.
|
void |
FoldApplyProcessWindowFunction.setOutputType(TypeInformation<R> outTypeInfo,
ExecutionConfig executionConfig)
Deprecated.
|
void |
FoldApplyProcessAllWindowFunction.setOutputType(TypeInformation<R> outTypeInfo,
ExecutionConfig executionConfig)
Deprecated.
|
void |
FoldApplyAllWindowFunction.setOutputType(TypeInformation<R> outTypeInfo,
ExecutionConfig executionConfig)
Deprecated.
|
Constructor and Description |
---|
FoldApplyAllWindowFunction(ACC initialValue,
FoldFunction<T,ACC> foldFunction,
AllWindowFunction<ACC,R,W> windowFunction,
TypeInformation<ACC> accTypeInformation)
Deprecated.
|
FoldApplyProcessAllWindowFunction(ACC initialValue,
FoldFunction<T,ACC> foldFunction,
ProcessAllWindowFunction<ACC,R,W> windowFunction,
TypeInformation<ACC> accTypeInformation)
Deprecated.
|
FoldApplyProcessWindowFunction(ACC initialValue,
FoldFunction<T,ACC> foldFunction,
ProcessWindowFunction<ACC,R,K,W> windowFunction,
TypeInformation<ACC> accTypeInformation)
Deprecated.
|
FoldApplyWindowFunction(ACC initialValue,
FoldFunction<T,ACC> foldFunction,
WindowFunction<ACC,R,K,W> windowFunction,
TypeInformation<ACC> accTypeInformation)
Deprecated.
|
Modifier and Type | Method and Description |
---|---|
<IN1,IN2,OUT> |
StreamGraph.addCoOperator(Integer vertexID,
String slotSharingGroup,
TwoInputStreamOperator<IN1,IN2,OUT> taskOperatorObject,
TypeInformation<IN1> in1TypeInfo,
TypeInformation<IN2> in2TypeInfo,
TypeInformation<OUT> outTypeInfo,
String operatorName) |
<IN,OUT> void |
StreamGraph.addOperator(Integer vertexID,
String slotSharingGroup,
StreamOperator<OUT> operatorObject,
TypeInformation<IN> inTypeInfo,
TypeInformation<OUT> outTypeInfo,
String operatorName) |
<IN,OUT> void |
StreamGraph.addSink(Integer vertexID,
String slotSharingGroup,
StreamOperator<OUT> operatorObject,
TypeInformation<IN> inTypeInfo,
TypeInformation<OUT> outTypeInfo,
String operatorName) |
<IN,OUT> void |
StreamGraph.addSource(Integer vertexID,
String slotSharingGroup,
StreamOperator<OUT> operatorObject,
TypeInformation<IN> inTypeInfo,
TypeInformation<OUT> outTypeInfo,
String operatorName) |
<OUT> void |
StreamGraph.setOutType(Integer vertexID,
TypeInformation<OUT> outType) |
Modifier and Type | Method and Description |
---|---|
void |
StreamGroupedFold.setOutputType(TypeInformation<OUT> outTypeInfo,
ExecutionConfig executionConfig)
Deprecated.
|
void |
OutputTypeConfigurable.setOutputType(TypeInformation<OUT> outTypeInfo,
ExecutionConfig executionConfig)
Is called by the
StreamGraph.addOperator(Integer, String, StreamOperator, TypeInformation, TypeInformation, String)
method when the StreamGraph is generated. |
void |
AbstractUdfStreamOperator.setOutputType(TypeInformation<OUT> outTypeInfo,
ExecutionConfig executionConfig) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<T> |
DataStream.dataType()
Returns the TypeInformation for the elements of this DataStream.
|
TypeInformation<K> |
KeyedStream.getKeyType()
Gets the type of the key by which this stream is keyed.
|
TypeInformation<K> |
KeySelectorWithType.getProducedType() |
TypeInformation<T> |
DataStream.getType()
Deprecated.
Use
dataType instead. |
static TypeInformation<T> |
OutputTag.getTypeInfo() |
Modifier and Type | Method and Description |
---|---|
<T> DataStream<T> |
StreamExecutionEnvironment.addSource(scala.Function1<SourceFunction.SourceContext<T>,scala.runtime.BoxedUnit> function,
TypeInformation<T> evidence$10)
Create a DataStream using a user defined source function for arbitrary
source functionality.
|
<T> DataStream<T> |
StreamExecutionEnvironment.addSource(SourceFunction<T> function,
TypeInformation<T> evidence$9)
Create a DataStream using a user defined source function for arbitrary
source functionality.
|
<ACC,R> DataStream<R> |
AllWindowedStream.aggregate(AggregateFunction<T,ACC,R> aggregateFunction,
TypeInformation<ACC> evidence$5,
TypeInformation<R> evidence$6)
Applies the given aggregation function to each window.
|
<ACC,R> DataStream<R> |
WindowedStream.aggregate(AggregateFunction<T,ACC,R> aggregateFunction,
TypeInformation<ACC> evidence$5,
TypeInformation<R> evidence$6)
Applies the given aggregation function to each window and key.
|
<ACC,V,R> DataStream<R> |
AllWindowedStream.aggregate(AggregateFunction<T,ACC,V> preAggregator,
AllWindowFunction<V,R,W> windowFunction,
TypeInformation<ACC> evidence$7,
TypeInformation<V> evidence$8,
TypeInformation<R> evidence$9)
Applies the given window function to each window.
|
<ACC,V,R> DataStream<R> |
AllWindowedStream.aggregate(AggregateFunction<T,ACC,V> preAggregator,
scala.Function3<W,scala.collection.Iterable<V>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<ACC> evidence$13,
TypeInformation<V> evidence$14,
TypeInformation<R> evidence$15)
Applies the given window function to each window.
|
<ACC,V,R> DataStream<R> |
WindowedStream.aggregate(AggregateFunction<T,ACC,V> preAggregator,
scala.Function4<K,W,scala.collection.Iterable<V>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<ACC> evidence$10,
TypeInformation<V> evidence$11,
TypeInformation<R> evidence$12)
Applies the given window function to each window.
|
<ACC,V,R> DataStream<R> |
AllWindowedStream.aggregate(AggregateFunction<T,ACC,V> preAggregator,
ProcessAllWindowFunction<V,R,W> windowFunction,
TypeInformation<ACC> evidence$10,
TypeInformation<V> evidence$11,
TypeInformation<R> evidence$12)
Applies the given window function to each window.
|
<ACC,V,R> DataStream<R> |
WindowedStream.aggregate(AggregateFunction<T,ACC,V> preAggregator,
ProcessWindowFunction<V,R,K,W> windowFunction,
TypeInformation<ACC> evidence$13,
TypeInformation<V> evidence$14,
TypeInformation<R> evidence$15)
Applies the given window function to each window.
|
<ACC,V,R> DataStream<R> |
WindowedStream.aggregate(AggregateFunction<T,ACC,V> preAggregator,
WindowFunction<V,R,K,W> windowFunction,
TypeInformation<ACC> evidence$7,
TypeInformation<V> evidence$8,
TypeInformation<R> evidence$9)
Applies the given window function to each window.
|
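The aggregate overloads above all take an AggregateFunction plus TypeInformation evidence for the accumulator and result types. As a plain-Java sketch of the accumulate contract (createAccumulator/add/getResult/merge), without a Flink dependency — in Flink the class would implement org.apache.flink.api.common.functions.AggregateFunction, and the averaging logic and the AvgAcc helper type here are illustrative assumptions:

```java
// Sketch of the AggregateFunction contract used by WindowedStream.aggregate.
// Plain Java, no Flink dependency; AvgAcc and the averaging logic are
// illustrative assumptions, not Flink code.
class AvgAggregate {

    /** Accumulator: running sum and element count. */
    static final class AvgAcc {
        long sum;
        long count;
    }

    AvgAcc createAccumulator() {
        return new AvgAcc();
    }

    AvgAcc add(long value, AvgAcc acc) {
        acc.sum += value;
        acc.count += 1;
        return acc;
    }

    Double getResult(AvgAcc acc) {
        return acc.count == 0 ? 0.0 : (double) acc.sum / acc.count;
    }

    /** Invoked when window panes are merged (e.g. session windows). */
    AvgAcc merge(AvgAcc a, AvgAcc b) {
        a.sum += b.sum;
        a.count += b.count;
        return a;
    }
}
```

In the Scala API the evidence$ parameters are filled in implicitly from the accumulator and result types of such a function.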
<R> DataStream<R> |
AllWindowedStream.apply(AllWindowFunction<T,R,W> function,
TypeInformation<R> evidence$27)
Applies the given window function to each window.
|
<T> DataStream<T> |
CoGroupedStreams.Where.EqualTo.WithWindow.apply(CoGroupFunction<T1,T2,T> function,
TypeInformation<T> evidence$4)
Completes the co-group operation with the user function that is executed
for windowed groups.
|
<T> DataStream<T> |
JoinedStreams.Where.EqualTo.WithWindow.apply(FlatJoinFunction<T1,T2,T> function,
TypeInformation<T> evidence$5)
Completes the join operation with the user function that is executed
for windowed groups.
|
<O> DataStream<O> |
CoGroupedStreams.Where.EqualTo.WithWindow.apply(scala.Function2<scala.collection.Iterator<T1>,scala.collection.Iterator<T2>,O> fun,
TypeInformation<O> evidence$2)
Completes the co-group operation with the user function that is executed
for windowed groups.
|
<O> DataStream<O> |
JoinedStreams.Where.EqualTo.WithWindow.apply(scala.Function2<T1,T2,O> fun,
TypeInformation<O> evidence$2)
Completes the join operation with the user function that is executed
for windowed groups.
|
<R> DataStream<R> |
AllWindowedStream.apply(scala.Function2<T,T,T> preAggregator,
scala.Function3<W,scala.collection.Iterable<T>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<R> evidence$30)
Deprecated.
Use
reduce(ReduceFunction, AllWindowFunction) instead. |
<R> DataStream<R> |
WindowedStream.apply(scala.Function2<T,T,T> preAggregator,
scala.Function4<K,W,scala.collection.Iterable<T>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<R> evidence$30)
Deprecated.
Use
reduce(ReduceFunction, WindowFunction) instead. |
<O> DataStream<O> |
CoGroupedStreams.Where.EqualTo.WithWindow.apply(scala.Function3<scala.collection.Iterator<T1>,scala.collection.Iterator<T2>,Collector<O>,scala.runtime.BoxedUnit> fun,
TypeInformation<O> evidence$3)
Completes the co-group operation with the user function that is executed
for windowed groups.
|
<O> DataStream<O> |
JoinedStreams.Where.EqualTo.WithWindow.apply(scala.Function3<T1,T2,Collector<O>,scala.runtime.BoxedUnit> fun,
TypeInformation<O> evidence$3)
Completes the join operation with the user function that is executed
for windowed groups.
|
<R> DataStream<R> |
AllWindowedStream.apply(scala.Function3<W,scala.collection.Iterable<T>,Collector<R>,scala.runtime.BoxedUnit> function,
TypeInformation<R> evidence$28)
Applies the given window function to each window.
|
<R> DataStream<R> |
WindowedStream.apply(scala.Function4<K,W,scala.collection.Iterable<T>,Collector<R>,scala.runtime.BoxedUnit> function,
TypeInformation<R> evidence$28)
Applies the given window function to each window.
|
<T> DataStream<T> |
JoinedStreams.Where.EqualTo.WithWindow.apply(JoinFunction<T1,T2,T> function,
TypeInformation<T> evidence$4)
Completes the join operation with the user function that is executed
for windowed groups.
|
<R> DataStream<R> |
AllWindowedStream.apply(ReduceFunction<T> preAggregator,
AllWindowFunction<T,R,W> windowFunction,
TypeInformation<R> evidence$29)
Deprecated.
Use
reduce(ReduceFunction, AllWindowFunction) instead. |
<R> DataStream<R> |
WindowedStream.apply(ReduceFunction<T> preAggregator,
WindowFunction<T,R,K,W> function,
TypeInformation<R> evidence$29)
Deprecated.
Use
reduce(ReduceFunction, WindowFunction) instead. |
<R> DataStream<R> |
AllWindowedStream.apply(R initialValue,
FoldFunction<T,R> preAggregator,
AllWindowFunction<R,R,W> windowFunction,
TypeInformation<R> evidence$31)
Deprecated.
Use
fold(R, FoldFunction, AllWindowFunction) instead. |
<R> DataStream<R> |
WindowedStream.apply(R initialValue,
FoldFunction<T,R> foldFunction,
WindowFunction<R,R,K,W> function,
TypeInformation<R> evidence$31)
Deprecated.
Use
fold(R, FoldFunction, WindowFunction) instead. |
<R> DataStream<R> |
AllWindowedStream.apply(R initialValue,
scala.Function2<R,T,R> preAggregator,
scala.Function3<W,scala.collection.Iterable<R>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<R> evidence$32)
Deprecated.
Use
fold(R, FoldFunction, AllWindowFunction) instead. |
<R> DataStream<R> |
WindowedStream.apply(R initialValue,
scala.Function2<R,T,R> foldFunction,
scala.Function4<K,W,scala.collection.Iterable<R>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<R> evidence$32)
Deprecated.
Use
fold(R, FoldFunction, WindowFunction) instead. |
<T> OutputTag<T> |
OutputTag$.apply(String id,
TypeInformation<T> evidence$2) |
static <T> OutputTag<T> |
OutputTag.apply(String id,
TypeInformation<T> evidence$2) |
<R> DataStream<R> |
WindowedStream.apply(WindowFunction<T,R,K,W> function,
TypeInformation<R> evidence$27)
Applies the given window function to each window.
|
<T> DataStream<T> |
StreamExecutionEnvironment.createInput(InputFormat<T,?> inputFormat,
TypeInformation<T> evidence$8)
Generic method to create an input data stream with a specific input format.
|
<S> DataStream<T> |
KeyedStream.filterWithState(scala.Function2<T,scala.Option<S>,scala.Tuple2<Object,scala.Option<S>>> fun,
TypeInformation<S> evidence$4)
Creates a new DataStream that contains only the elements satisfying the given stateful filter
predicate.
|
<R> DataStream<R> |
ConnectedStreams.flatMap(CoFlatMapFunction<IN1,IN2,R> coFlatMapper,
TypeInformation<R> evidence$4)
Applies a CoFlatMap transformation on these connected streams.
|
<R> DataStream<R> |
DataStream.flatMap(FlatMapFunction<T,R> flatMapper,
TypeInformation<R> evidence$9)
Creates a new DataStream by applying the given function to every element and flattening
the results.
|
<R> DataStream<R> |
ConnectedStreams.flatMap(scala.Function1<IN1,scala.collection.TraversableOnce<R>> fun1,
scala.Function1<IN2,scala.collection.TraversableOnce<R>> fun2,
TypeInformation<R> evidence$6)
Applies a CoFlatMap transformation on the connected streams.
|
<R> DataStream<R> |
DataStream.flatMap(scala.Function1<T,scala.collection.TraversableOnce<R>> fun,
TypeInformation<R> evidence$11)
Creates a new DataStream by applying the given function to every element and flattening
the results.
|
<R> DataStream<R> |
ConnectedStreams.flatMap(scala.Function2<IN1,Collector<R>,scala.runtime.BoxedUnit> fun1,
scala.Function2<IN2,Collector<R>,scala.runtime.BoxedUnit> fun2,
TypeInformation<R> evidence$5)
Applies a CoFlatMap transformation on the connected streams.
|
<R> DataStream<R> |
DataStream.flatMap(scala.Function2<T,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$10)
Creates a new DataStream by applying the given function to every element and flattening
the results.
|
<R,S> DataStream<R> |
KeyedStream.flatMapWithState(scala.Function2<T,scala.Option<S>,scala.Tuple2<scala.collection.TraversableOnce<R>,scala.Option<S>>> fun,
TypeInformation<R> evidence$7,
TypeInformation<S> evidence$8)
Creates a new DataStream by applying the given stateful function to every element and
flattening the results.
|
<ACC,R> DataStream<R> |
AllWindowedStream.fold(ACC initialValue,
FoldFunction<T,ACC> preAggregator,
AllWindowFunction<ACC,R,W> windowFunction,
TypeInformation<ACC> evidence$18,
TypeInformation<R> evidence$19)
Deprecated.
use [[aggregate()]] instead. Since .
|
<ACC,R> DataStream<R> |
AllWindowedStream.fold(ACC initialValue,
FoldFunction<T,ACC> preAggregator,
ProcessAllWindowFunction<ACC,R,W> windowFunction,
TypeInformation<ACC> evidence$20,
TypeInformation<R> evidence$21)
Deprecated.
use [[aggregate()]] instead. Since .
|
<R,ACC> DataStream<R> |
WindowedStream.fold(ACC initialValue,
FoldFunction<T,ACC> foldFunction,
ProcessWindowFunction<ACC,R,K,W> function,
TypeInformation<R> evidence$24,
TypeInformation<ACC> evidence$25)
Deprecated.
use [[aggregate()]] instead. Since .
|
<ACC,R> DataStream<R> |
WindowedStream.fold(ACC initialValue,
FoldFunction<T,ACC> foldFunction,
WindowFunction<ACC,R,K,W> function,
TypeInformation<ACC> evidence$18,
TypeInformation<R> evidence$19)
Deprecated.
use [[aggregate()]] instead. Since .
|
<ACC,R> DataStream<R> |
AllWindowedStream.fold(ACC initialValue,
scala.Function2<ACC,T,ACC> preAggregator,
scala.Function3<W,scala.collection.Iterable<ACC>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<ACC> evidence$22,
TypeInformation<R> evidence$23)
Deprecated.
use [[aggregate()]] instead. Since .
|
<ACC,R> DataStream<R> |
WindowedStream.fold(ACC initialValue,
scala.Function2<ACC,T,ACC> foldFunction,
scala.Function4<K,W,scala.collection.Iterable<ACC>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<ACC> evidence$20,
TypeInformation<R> evidence$21)
Deprecated.
use [[aggregate()]] instead. Since .
|
<ACC,R> DataStream<R> |
AllWindowedStream.fold(ACC initialValue,
scala.Function2<ACC,T,ACC> preAggregator,
ProcessAllWindowFunction<ACC,R,W> windowFunction,
TypeInformation<ACC> evidence$24,
TypeInformation<R> evidence$25)
Deprecated.
use [[aggregate()]] instead. Since .
|
<R,ACC> DataStream<R> |
WindowedStream.fold(ACC initialValue,
scala.Function2<ACC,T,ACC> foldFunction,
ProcessWindowFunction<ACC,R,K,W> function,
TypeInformation<R> evidence$22,
TypeInformation<ACC> evidence$23)
Deprecated.
use [[aggregate()]] instead. Since .
|
<R> DataStream<R> |
AllWindowedStream.fold(R initialValue,
FoldFunction<T,R> function,
TypeInformation<R> evidence$16)
Deprecated.
use [[aggregate()]] instead. Since .
|
<R> DataStream<R> |
WindowedStream.fold(R initialValue,
FoldFunction<T,R> function,
TypeInformation<R> evidence$16)
Deprecated.
use [[aggregate()]] instead. Since .
|
<R> DataStream<R> |
KeyedStream.fold(R initialValue,
FoldFunction<T,R> folder,
TypeInformation<R> evidence$2)
Deprecated.
will be removed in a future version. Since .
|
<R> DataStream<R> |
AllWindowedStream.fold(R initialValue,
scala.Function2<R,T,R> function,
TypeInformation<R> evidence$17)
Deprecated.
use [[aggregate()]] instead. Since .
|
<R> DataStream<R> |
WindowedStream.fold(R initialValue,
scala.Function2<R,T,R> function,
TypeInformation<R> evidence$17)
Deprecated.
use [[aggregate()]] instead. Since .
|
<R> DataStream<R> |
KeyedStream.fold(R initialValue,
scala.Function2<R,T,R> fun,
TypeInformation<R> evidence$3)
Deprecated.
will be removed in a future version. Since .
|
<T> DataStream<T> |
StreamExecutionEnvironment.fromCollection(scala.collection.Iterator<T> data,
TypeInformation<T> evidence$3)
Creates a DataStream from the given
Iterator . |
<T> DataStream<T> |
StreamExecutionEnvironment.fromCollection(scala.collection.Seq<T> data,
TypeInformation<T> evidence$2)
Creates a DataStream from the given non-empty
Seq . |
<T> DataStream<T> |
StreamExecutionEnvironment.fromElements(scala.collection.Seq<T> data,
TypeInformation<T> evidence$1)
Creates a DataStream that contains the given elements.
|
<T> DataStream<T> |
StreamExecutionEnvironment.fromParallelCollection(SplittableIterator<T> data,
TypeInformation<T> evidence$4)
Creates a DataStream from the given
SplittableIterator . |
<X> DataStream<X> |
DataStream.getSideOutput(OutputTag<X> tag,
TypeInformation<X> evidence$1) |
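OutputTag.apply and DataStream.getSideOutput together give typed side outputs: a transformation diverts some elements under a tag, and they are retrieved separately from the main stream. A plain-Java stand-in for that idea — the routing rule (negatives to an "errors" tag) and the class name are illustrative assumptions, and Flink additionally matches tags by element type, which this sketch omits:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Stand-in for the side-output pattern: non-negative values stay on the main
// path, negatives are diverted under the tag id "errors" and can be fetched
// afterwards. Illustrative only; not Flink's OutputTag implementation.
class SideOutputSketch {
    private final Map<String, List<Integer>> side = new HashMap<>();

    List<Integer> process(List<Integer> input) {
        List<Integer> main = new ArrayList<>();
        for (int x : input) {
            if (x >= 0) {
                main.add(x);
            } else {
                side.computeIfAbsent("errors", k -> new ArrayList<>()).add(x);
            }
        }
        return main;
    }

    List<Integer> getSideOutput(String tagId) {
        return side.getOrDefault(tagId, List.of());
    }
}
```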
<R,F> DataStream<R> |
DataStream.iterate(scala.Function1<ConnectedStreams<T,F>,scala.Tuple2<DataStream<F>,DataStream<R>>> stepFunction,
long maxWaitTimeMillis,
TypeInformation<F> evidence$6)
Initiates an iterative part of the program that creates a loop by feeding
back data streams.
|
<K1,K2> ConnectedStreams<IN1,IN2> |
ConnectedStreams.keyBy(scala.Function1<IN1,K1> fun1,
scala.Function1<IN2,K2> fun2,
TypeInformation<K1> evidence$7,
TypeInformation<K2> evidence$8)
Keys the two connected streams together.
|
<K> KeyedStream<T,K> |
DataStream.keyBy(scala.Function1<T,K> fun,
TypeInformation<K> evidence$2)
Groups the elements of a DataStream by the given K key to
be used with grouped operators like grouped reduce or grouped aggregations.
|
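The keyBy overloads above take a selector function plus TypeInformation evidence for the key type. A plain-Java grouping stand-in for what the selector does — every element is routed by the key the function extracts; KeyBySketch and groupBy are illustrative names, and Flink routes to parallel partitions rather than building an in-memory map:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Grouping stand-in for the key-selector idea behind keyBy; no Flink
// dependency. LinkedHashMap keeps first-seen key order for readability.
class KeyBySketch {
    static <T, K> Map<K, List<T>> groupBy(List<T> input, Function<T, K> keyOf) {
        Map<K, List<T>> groups = new LinkedHashMap<>();
        for (T element : input) {
            groups.computeIfAbsent(keyOf.apply(element), k -> new ArrayList<>()).add(element);
        }
        return groups;
    }
}
```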
<R> DataStream<R> |
ConnectedStreams.map(CoMapFunction<IN1,IN2,R> coMapper,
TypeInformation<R> evidence$2)
Applies a CoMap transformation on these connected streams.
|
<R> DataStream<R> |
ConnectedStreams.map(scala.Function1<IN1,R> fun1,
scala.Function1<IN2,R> fun2,
TypeInformation<R> evidence$1)
Applies a CoMap transformation on the connected streams.
|
<R> DataStream<R> |
DataStream.map(scala.Function1<T,R> fun,
TypeInformation<R> evidence$7)
Creates a new DataStream by applying the given function to every element of this DataStream.
|
<R> DataStream<R> |
DataStream.map(MapFunction<T,R> mapper,
TypeInformation<R> evidence$8)
Creates a new DataStream by applying the given function to every element of this DataStream.
|
<R,S> DataStream<R> |
KeyedStream.mapWithState(scala.Function2<T,scala.Option<S>,scala.Tuple2<R,scala.Option<S>>> fun,
TypeInformation<R> evidence$5,
TypeInformation<S> evidence$6)
Creates a new DataStream by applying the given stateful function to every element of this
DataStream.
|
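mapWithState's user function maps (element, Optional state) to (result, new state), with Flink maintaining the state per key and fault-tolerantly. A plain-Java sketch of that contract, simply threading the state through a loop — the running-count logic is an illustrative assumption:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

// Sketch of the mapWithState contract: (element, state) -> (result, newState).
// Flink keys and checkpoints the state; here it is threaded through a loop.
class MapWithStateSketch {
    /** For each element, emits how many elements have been seen so far. */
    static List<Long> runningCount(List<String> elements) {
        Optional<Long> state = Optional.empty();
        List<Long> out = new ArrayList<>();
        for (String ignored : elements) {
            long next = state.orElse(0L) + 1;  // fun: (elem, state) -> (next, Some(next))
            out.add(next);
            state = Optional.of(next);
        }
        return out;
    }
}
```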
<IN,OUT> DataStream<OUT> |
AsyncDataStream$.orderedWait(DataStream<IN> input,
AsyncFunction<IN,OUT> asyncFunction,
long timeout,
TimeUnit timeUnit,
int capacity,
TypeInformation<OUT> evidence$5)
Apply an asynchronous function on the input data stream.
|
static <IN,OUT> DataStream<OUT> |
AsyncDataStream.orderedWait(DataStream<IN> input,
AsyncFunction<IN,OUT> asyncFunction,
long timeout,
TimeUnit timeUnit,
int capacity,
TypeInformation<OUT> evidence$5)
Apply an asynchronous function on the input data stream.
|
<IN,OUT> DataStream<OUT> |
AsyncDataStream$.orderedWait(DataStream<IN> input,
AsyncFunction<IN,OUT> asyncFunction,
long timeout,
TimeUnit timeUnit,
TypeInformation<OUT> evidence$6)
Apply an asynchronous function on the input data stream.
|
static <IN,OUT> DataStream<OUT> |
AsyncDataStream.orderedWait(DataStream<IN> input,
AsyncFunction<IN,OUT> asyncFunction,
long timeout,
TimeUnit timeUnit,
TypeInformation<OUT> evidence$6)
Apply an asynchronous function on the input data stream.
|
<IN,OUT> DataStream<OUT> |
AsyncDataStream$.orderedWait(DataStream<IN> input,
long timeout,
TimeUnit timeUnit,
scala.Function2<IN,AsyncCollector<OUT>,scala.runtime.BoxedUnit> asyncFunction,
TypeInformation<OUT> evidence$8)
Apply an asynchronous function on the input data stream.
|
static <IN,OUT> DataStream<OUT> |
AsyncDataStream.orderedWait(DataStream<IN> input,
long timeout,
TimeUnit timeUnit,
scala.Function2<IN,AsyncCollector<OUT>,scala.runtime.BoxedUnit> asyncFunction,
TypeInformation<OUT> evidence$8)
Apply an asynchronous function on the input data stream.
|
<IN,OUT> DataStream<OUT> |
AsyncDataStream$.orderedWait(DataStream<IN> input,
long timeout,
TimeUnit timeUnit,
int capacity,
scala.Function2<IN,AsyncCollector<OUT>,scala.runtime.BoxedUnit> asyncFunction,
TypeInformation<OUT> evidence$7)
Apply an asynchronous function on the input data stream.
|
static <IN,OUT> DataStream<OUT> |
AsyncDataStream.orderedWait(DataStream<IN> input,
long timeout,
TimeUnit timeUnit,
int capacity,
scala.Function2<IN,AsyncCollector<OUT>,scala.runtime.BoxedUnit> asyncFunction,
TypeInformation<OUT> evidence$7)
Apply an asynchronous function on the input data stream.
|
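The distinguishing property of orderedWait (versus unorderedWait below) is that results are emitted in input order even when the asynchronous calls complete out of order. A plain-Java sketch of that guarantee using CompletableFuture — Flink's operator additionally enforces the timeout and the in-flight capacity limit, which this sketch omits, and the x * 10 "lookup" is an illustrative assumption:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.stream.Collectors;

// Sketch of the ordering guarantee behind orderedWait: fire all async calls,
// then join them in submission order, so output order == input order.
class OrderedWaitSketch {
    static List<Integer> orderedWait(List<Integer> input) {
        List<CompletableFuture<Integer>> pending = input.stream()
                .map(x -> CompletableFuture.supplyAsync(() -> x * 10))
                .collect(Collectors.toList());
        // Join in submission order, not completion order.
        return pending.stream().map(CompletableFuture::join).collect(Collectors.toList());
    }
}
```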
<K> DataStream<T> |
DataStream.partitionCustom(Partitioner<K> partitioner,
scala.Function1<T,K> fun,
TypeInformation<K> evidence$5)
Partitions a DataStream on the key returned by the selector, using a custom partitioner.
|
<K> DataStream<T> |
DataStream.partitionCustom(Partitioner<K> partitioner,
int field,
TypeInformation<K> evidence$3)
Partitions a tuple DataStream on the specified key fields using a custom partitioner.
|
<K> DataStream<T> |
DataStream.partitionCustom(Partitioner<K> partitioner,
String field,
TypeInformation<K> evidence$4)
Partitions a POJO DataStream on the specified key fields using a custom partitioner.
|
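Each partitionCustom overload pairs a key (selector function, tuple field position, or POJO field name) with a Partitioner that maps the key and the partition count to a channel index. A plain-Java sketch of that contract — in Flink the class would implement org.apache.flink.api.common.functions.Partitioner, and the hash-mod scheme here is an illustrative assumption:

```java
// Sketch of the Partitioner contract behind partitionCustom: map a key and
// numPartitions to an index in [0, numPartitions). Illustrative hash-mod scheme.
class ModPartitioner {
    int partition(String key, int numPartitions) {
        // Mask the sign bit so negative hashCodes still map into range.
        return (key.hashCode() & Integer.MAX_VALUE) % numPartitions;
    }
}
```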
<R> DataStream<R> |
ConnectedStreams.process(CoProcessFunction<IN1,IN2,R> coProcessFunction,
TypeInformation<R> evidence$3)
Applies the given
CoProcessFunction on the connected input streams,
thereby creating a transformed output stream. |
<R> DataStream<R> |
AllWindowedStream.process(ProcessAllWindowFunction<T,R,W> function,
TypeInformation<R> evidence$26)
Applies the given window function to each window.
|
<R> DataStream<R> |
DataStream.process(ProcessFunction<T,R> processFunction,
TypeInformation<R> evidence$12)
Applies the given
ProcessFunction on the input stream, thereby
creating a transformed output stream. |
<R> DataStream<R> |
KeyedStream.process(ProcessFunction<T,R> processFunction,
TypeInformation<R> evidence$1)
Applies the given
ProcessFunction on the input stream, thereby
creating a transformed output stream. |
<R> DataStream<R> |
WindowedStream.process(ProcessWindowFunction<T,R,K,W> function,
TypeInformation<R> evidence$26)
Applies the given window function to each window.
|
<T> DataStream<T> |
StreamExecutionEnvironment.readFile(FileInputFormat<T> inputFormat,
String filePath,
FileProcessingMode watchType,
long interval,
FilePathFilter filter,
TypeInformation<T> evidence$6)
Deprecated.
Use
FileInputFormat#setFilesFilter(FilePathFilter) to set a filter and
StreamExecutionEnvironment#readFile(FileInputFormat, String, FileProcessingMode, long) |
<T> DataStream<T> |
StreamExecutionEnvironment.readFile(FileInputFormat<T> inputFormat,
String filePath,
FileProcessingMode watchType,
long interval,
TypeInformation<T> evidence$7)
Reads the contents of the user-specified path based on the given
FileInputFormat . |
<T> DataStream<T> |
StreamExecutionEnvironment.readFile(FileInputFormat<T> inputFormat,
String filePath,
TypeInformation<T> evidence$5)
Reads the given file with the given input format.
|
<R> DataStream<R> |
AllWindowedStream.reduce(scala.Function2<T,T,T> preAggregator,
scala.Function3<W,scala.collection.Iterable<T>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<R> evidence$2)
Applies the given window function to each window.
|
<R> DataStream<R> |
WindowedStream.reduce(scala.Function2<T,T,T> preAggregator,
scala.Function4<K,W,scala.collection.Iterable<T>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<R> evidence$2)
Applies the given window function to each window.
|
<R> DataStream<R> |
AllWindowedStream.reduce(scala.Function2<T,T,T> preAggregator,
ProcessAllWindowFunction<T,R,W> windowFunction,
TypeInformation<R> evidence$4)
Applies the given window function to each window.
|
<R> DataStream<R> |
WindowedStream.reduce(scala.Function2<T,T,T> preAggregator,
ProcessWindowFunction<T,R,K,W> function,
TypeInformation<R> evidence$3)
Applies the given reduce function to each window.
|
<R> DataStream<R> |
AllWindowedStream.reduce(ReduceFunction<T> preAggregator,
AllWindowFunction<T,R,W> windowFunction,
TypeInformation<R> evidence$1)
Applies the given window function to each window.
|
<R> DataStream<R> |
AllWindowedStream.reduce(ReduceFunction<T> preAggregator,
ProcessAllWindowFunction<T,R,W> windowFunction,
TypeInformation<R> evidence$3)
Applies the given window function to each window.
|
<R> DataStream<R> |
WindowedStream.reduce(ReduceFunction<T> preAggregator,
ProcessWindowFunction<T,R,K,W> function,
TypeInformation<R> evidence$4)
Applies the given reduce function to each window.
|
<R> DataStream<R> |
WindowedStream.reduce(ReduceFunction<T> preAggregator,
WindowFunction<T,R,K,W> function,
TypeInformation<R> evidence$1)
Applies the given window function to each window.
|
<R> DataStream<R> |
DataStream.transform(String operatorName,
OneInputStreamOperator<T,R> operator,
TypeInformation<R> evidence$13)
Transforms the
DataStream by using a custom OneInputStreamOperator. |
<R> DataStream<R> |
ConnectedStreams.transform(String functionName,
TwoInputStreamOperator<IN1,IN2,R> operator,
TypeInformation<R> evidence$9) |
<IN,OUT> DataStream<OUT> |
AsyncDataStream$.unorderedWait(DataStream<IN> input,
AsyncFunction<IN,OUT> asyncFunction,
long timeout,
TimeUnit timeUnit,
int capacity,
TypeInformation<OUT> evidence$1)
Apply an asynchronous function on the input data stream.
|
static <IN,OUT> DataStream<OUT> |
AsyncDataStream.unorderedWait(DataStream<IN> input,
AsyncFunction<IN,OUT> asyncFunction,
long timeout,
TimeUnit timeUnit,
int capacity,
TypeInformation<OUT> evidence$1)
Apply an asynchronous function on the input data stream.
|
<IN,OUT> DataStream<OUT> |
AsyncDataStream$.unorderedWait(DataStream<IN> input,
AsyncFunction<IN,OUT> asyncFunction,
long timeout,
TimeUnit timeUnit,
TypeInformation<OUT> evidence$2)
Apply an asynchronous function on the input data stream.
|
static <IN,OUT> DataStream<OUT> |
AsyncDataStream.unorderedWait(DataStream<IN> input,
AsyncFunction<IN,OUT> asyncFunction,
long timeout,
TimeUnit timeUnit,
TypeInformation<OUT> evidence$2)
Apply an asynchronous function on the input data stream.
|
<IN,OUT> DataStream<OUT> |
AsyncDataStream$.unorderedWait(DataStream<IN> input,
long timeout,
TimeUnit timeUnit,
scala.Function2<IN,AsyncCollector<OUT>,scala.runtime.BoxedUnit> asyncFunction,
TypeInformation<OUT> evidence$4)
Apply an asynchronous function on the input data stream.
|
static <IN,OUT> DataStream<OUT> |
AsyncDataStream.unorderedWait(DataStream<IN> input,
long timeout,
TimeUnit timeUnit,
scala.Function2<IN,AsyncCollector<OUT>,scala.runtime.BoxedUnit> asyncFunction,
TypeInformation<OUT> evidence$4)
Apply an asynchronous function on the input data stream.
|
<IN,OUT> DataStream<OUT> |
AsyncDataStream$.unorderedWait(DataStream<IN> input,
long timeout,
TimeUnit timeUnit,
int capacity,
scala.Function2<IN,AsyncCollector<OUT>,scala.runtime.BoxedUnit> asyncFunction,
TypeInformation<OUT> evidence$3)
Apply an asynchronous function on the input data stream.
|
static <IN,OUT> DataStream<OUT> |
AsyncDataStream.unorderedWait(DataStream<IN> input,
long timeout,
TimeUnit timeUnit,
int capacity,
scala.Function2<IN,AsyncCollector<OUT>,scala.runtime.BoxedUnit> asyncFunction,
TypeInformation<OUT> evidence$3)
Apply an asynchronous function on the input data stream.
|
<KEY> CoGroupedStreams.Where<KEY> |
CoGroupedStreams.where(scala.Function1<T1,KEY> keySelector,
TypeInformation<KEY> evidence$1)
Specifies a
KeySelector for elements from the first input. |
<KEY> JoinedStreams.Where<KEY> |
JoinedStreams.where(scala.Function1<T1,KEY> keySelector,
TypeInformation<KEY> evidence$1)
Specifies a
KeySelector for elements from the first input. |
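As a rough illustration of the `unorderedWait` variants listed above, here is a minimal Java sketch. It assumes the Flink 1.3/1.4-era async I/O API, in which `AsyncFunction` reports results through an `AsyncCollector`; the "lookup" itself is hypothetical and completed inline for illustration:

```java
import java.util.Collections;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

import org.apache.flink.streaming.api.datastream.AsyncDataStream;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.async.AsyncFunction;
import org.apache.flink.streaming.api.functions.async.collector.AsyncCollector;

public class AsyncLookupExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> ids = env.fromElements("a", "b", "c");

        // unorderedWait emits results as soon as they complete;
        // input order is NOT preserved (use orderedWait for that).
        DataStream<String> enriched = AsyncDataStream.unorderedWait(
            ids,
            new AsyncFunction<String, String>() {
                @Override
                public void asyncInvoke(String id, AsyncCollector<String> collector) {
                    // Hypothetical non-blocking lookup, completed immediately here.
                    CompletableFuture
                        .supplyAsync(() -> id + "-resolved")
                        .thenAccept(v -> collector.collect(Collections.singletonList(v)));
                }
            },
            1000, TimeUnit.MILLISECONDS,  // timeout per in-flight request
            100);                         // capacity: max concurrent requests

        enriched.print();
        env.execute("async-lookup");
    }
}
```

The overloads without a `capacity` argument fall back to a default number of concurrent requests; the `timeout` applies per element.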
Constructor and Description |
---|
KeySelectorWithType(scala.Function1<IN,K> fun,
TypeInformation<K> info) |
OutputTag(String id,
TypeInformation<T> evidence$1) |
Where(KeySelector<T1,KEY> keySelector1,
TypeInformation<KEY> keyType) |
Where(KeySelector<T1,KEY> keySelector1,
TypeInformation<KEY> keyType) |
Modifier and Type | Method and Description |
---|---|
<ACC,R> DataStream<R> |
OnWindowedStream.applyWith(ACC initialValue,
scala.Function2<ACC,T,ACC> foldFunction,
scala.Function3<K,W,scala.collection.immutable.Stream<ACC>,scala.collection.TraversableOnce<R>> windowFunction,
TypeInformation<ACC> evidence$2,
TypeInformation<R> evidence$3)
Applies the given window function to each window.
|
<ACC,R> DataStream<R> |
OnWindowedStream.applyWith(ACC initialValue,
scala.Function2<ACC,T,ACC> foldFunction,
scala.Function3<K,W,scala.collection.immutable.Stream<ACC>,scala.collection.TraversableOnce<R>> windowFunction,
TypeInformation<ACC> evidence$2,
TypeInformation<R> evidence$3)
Applies the given window function to each window.
|
<R> DataStream<R> |
OnConnectedStream.flatMapWith(scala.Function1<IN1,scala.collection.TraversableOnce<R>> flatMap1,
scala.Function1<IN2,scala.collection.TraversableOnce<R>> flatMap2,
TypeInformation<R> evidence$2)
Applies a CoFlatMap transformation on the connected streams.
|
<R> DataStream<R> |
OnDataStream.flatMapWith(scala.Function1<T,scala.collection.TraversableOnce<R>> fun,
TypeInformation<R> evidence$2)
Applies a function
fun to each item of the stream, producing a collection of items
that will be flattened in the resulting stream |
<R> DataStream<R> |
OnKeyedStream.foldWith(R initialValue,
scala.Function2<R,T,R> fun,
TypeInformation<R> evidence$1)
Folds the stream over a zero element with a reducer
fun |
<R> DataStream<R> |
OnWindowedStream.foldWith(R initialValue,
scala.Function2<R,T,R> function,
TypeInformation<R> evidence$1)
Applies the given fold function to each window.
|
<K1,K2> ConnectedStreams<IN1,IN2> |
OnConnectedStream.keyingBy(scala.Function1<IN1,K1> key1,
scala.Function1<IN2,K2> key2,
TypeInformation<K1> evidence$3,
TypeInformation<K2> evidence$4)
Keys the two connected streams together.
|
<K1,K2> ConnectedStreams<IN1,IN2> |
OnConnectedStream.keyingBy(scala.Function1<IN1,K1> key1,
scala.Function1<IN2,K2> key2,
TypeInformation<K1> evidence$3,
TypeInformation<K2> evidence$4)
Keys the two connected streams together.
|
<K> KeyedStream<T,K> |
OnDataStream.keyingBy(scala.Function1<T,K> fun,
TypeInformation<K> evidence$3)
Keys the items according to a keying function
fun |
<R> DataStream<R> |
OnConnectedStream.mapWith(scala.Function1<IN1,R> map1,
scala.Function1<IN2,R> map2,
TypeInformation<R> evidence$1)
Applies a CoMap transformation on the connected streams.
|
<R> DataStream<R> |
OnDataStream.mapWith(scala.Function1<T,R> fun,
TypeInformation<R> evidence$1)
Applies a function
fun to each item of the stream |
<O> DataStream<O> |
OnJoinedStream.projecting(scala.Function2<L,R,O> fun,
TypeInformation<O> evidence$1)
Completes the join operation with the user function that is executed
for windowed groups.
|
Modifier and Type | Field and Description |
---|---|
protected TypeInformation<T> |
StreamTransformation.outputType |
Modifier and Type | Method and Description |
---|---|
TypeInformation<IN> |
OneInputTransformation.getInputType()
Returns the
TypeInformation for the elements of the input. |
TypeInformation<IN1> |
TwoInputTransformation.getInputType1()
Returns the
TypeInformation for the elements from the first input. |
TypeInformation<IN2> |
TwoInputTransformation.getInputType2()
Returns the
TypeInformation for the elements from the second input. |
TypeInformation<T> |
StreamTransformation.getOutputType()
Returns the output type of this
StreamTransformation as a TypeInformation. |
TypeInformation<?> |
TwoInputTransformation.getStateKeyType() |
TypeInformation<?> |
SinkTransformation.getStateKeyType() |
TypeInformation<?> |
OneInputTransformation.getStateKeyType() |
Modifier and Type | Method and Description |
---|---|
void |
StreamTransformation.setOutputType(TypeInformation<T> outputType)
Tries to fill in the type information.
|
void |
TwoInputTransformation.setStateKeyType(TypeInformation<?> stateKeyType) |
void |
SinkTransformation.setStateKeyType(TypeInformation<?> stateKeyType) |
void |
OneInputTransformation.setStateKeyType(TypeInformation<?> stateKeyType) |
Constructor and Description |
---|
CoFeedbackTransformation(int parallelism,
TypeInformation<F> feedbackType,
Long waitTime)
Creates a new
CoFeedbackTransformation from the given input. |
OneInputTransformation(StreamTransformation<IN> input,
String name,
OneInputStreamOperator<IN,OUT> operator,
TypeInformation<OUT> outputType,
int parallelism)
Creates a new
OneInputTransformation from the given input and operator. |
SourceTransformation(String name,
StreamSource<T,?> operator,
TypeInformation<T> outputType,
int parallelism)
Creates a new
SourceTransformation from the given operator. |
StreamTransformation(String name,
TypeInformation<T> outputType,
int parallelism)
Creates a new
StreamTransformation with the given name, output type and parallelism. |
TwoInputTransformation(StreamTransformation<IN1> input1,
StreamTransformation<IN2> input2,
String name,
TwoInputStreamOperator<IN1,IN2,OUT> operator,
TypeInformation<OUT> outputType,
int parallelism)
Creates a new
TwoInputTransformation from the given inputs and operator. |
Modifier and Type | Method and Description |
---|---|
<S extends Serializable> |
Trigger.TriggerContext.getKeyValueState(String name,
TypeInformation<S> stateType,
S defaultState)
Deprecated.
|
Modifier and Type | Field and Description |
---|---|
protected TypeInformation<IN> |
CassandraSink.CassandraSinkBuilder.typeInfo |
Constructor and Description |
---|
CassandraPojoSinkBuilder(DataStream<IN> input,
TypeInformation<IN> typeInfo,
TypeSerializer<IN> serializer) |
CassandraSinkBuilder(DataStream<IN> input,
TypeInformation<IN> typeInfo,
TypeSerializer<IN> serializer) |
CassandraTupleSinkBuilder(DataStream<IN> input,
TypeInformation<IN> typeInfo,
TypeSerializer<IN> serializer) |
Modifier and Type | Method and Description |
---|---|
void |
AvroKeyValueSinkWriter.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig) |
void |
SequenceFileWriter.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig) |
void |
RollingSink.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig)
Deprecated.
|
Modifier and Type | Method and Description |
---|---|
void |
BucketingSink.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig) |
Modifier and Type | Field and Description |
---|---|
protected TypeInformation[] |
KafkaTableSink.fieldTypes |
Modifier and Type | Method and Description |
---|---|
TypeInformation<?>[] |
KafkaTableSink.getFieldTypes() |
TypeInformation<Row> |
KafkaTableSink.getOutputType() |
TypeInformation<T> |
FlinkKafkaConsumerBase.getProducedType() |
TypeInformation<Row> |
KafkaTableSource.getReturnType() |
Modifier and Type | Method and Description |
---|---|
KafkaTableSink |
KafkaTableSink.configure(String[] fieldNames,
TypeInformation<?>[] fieldTypes) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<OUT> |
RMQSource.getProducedType() |
Modifier and Type | Method and Description |
---|---|
<S extends Serializable> |
WindowOperator.Context.getKeyValueState(String name,
TypeInformation<S> stateType,
S defaultState) |
Modifier and Type | Method and Description |
---|---|
static <T> void |
StreamingFunctionUtils.setOutputType(Function userFunction,
TypeInformation<T> outTypeInfo,
ExecutionConfig executionConfig) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Tuple> |
KeySelectorUtil.ComparableKeySelector.getProducedType() |
TypeInformation<Tuple> |
KeySelectorUtil.ArrayKeySelector.getProducedType() |
Modifier and Type | Method and Description |
---|---|
static <X> KeySelectorUtil.ArrayKeySelector<X> |
KeySelectorUtil.getSelectorForArray(int[] positions,
TypeInformation<X> typeInfo) |
static <X> KeySelector<X,Tuple> |
KeySelectorUtil.getSelectorForKeys(Keys<X> keys,
TypeInformation<X> typeInfo,
ExecutionConfig executionConfig) |
static <X,K> KeySelector<X,K> |
KeySelectorUtil.getSelectorForOneKey(Keys<X> keys,
Partitioner<K> partitioner,
TypeInformation<X> typeInfo,
ExecutionConfig executionConfig) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Tuple2<K,V>> |
TypeInformationKeyValueSerializationSchema.getProducedType() |
TypeInformation<T> |
KeyedDeserializationSchemaWrapper.getProducedType() |
TypeInformation<Row> |
JsonRowDeserializationSchema.getProducedType() |
TypeInformation<com.fasterxml.jackson.databind.node.ObjectNode> |
JSONKeyValueDeserializationSchema.getProducedType() |
TypeInformation<T> |
TypeInformationSerializationSchema.getProducedType() |
TypeInformation<String> |
SimpleStringSchema.getProducedType() |
TypeInformation<T> |
AbstractDeserializationSchema.getProducedType() |
Constructor and Description |
---|
JsonRowDeserializationSchema(TypeInformation<Row> typeInfo)
Creates a JSON deserialization schema for the given fields and types.
|
TypeInformationKeyValueSerializationSchema(TypeInformation<K> keyTypeInfo,
TypeInformation<V> valueTypeInfo,
ExecutionConfig ec)
Creates a new de-/serialization schema for the given types.
|
TypeInformationKeyValueSerializationSchema(TypeInformation<K> keyTypeInfo,
TypeInformation<V> valueTypeInfo,
ExecutionConfig ec)
Creates a new de-/serialization schema for the given types.
|
TypeInformationSerializationSchema(TypeInformation<T> typeInfo,
ExecutionConfig ec)
Creates a new de-/serialization schema for the given type.
|
Modifier and Type | Field and Description |
---|---|
protected TypeInformation |
FieldAccessor.fieldType |
Modifier and Type | Method and Description |
---|---|
TypeInformation<F> |
FieldAccessor.getFieldType()
Gets the TypeInformation for the type of the field.
|
Modifier and Type | Method and Description |
---|---|
static <T,F> FieldAccessor<T,F> |
FieldAccessorFactory.getAccessor(TypeInformation<T> typeInfo,
int pos,
ExecutionConfig config)
Creates a
FieldAccessor for the given field position, which can be used to get and set
the specified field on instances of this type. |
static <T,F> FieldAccessor<T,F> |
FieldAccessorFactory.getAccessor(TypeInformation<T> typeInfo,
String field,
ExecutionConfig config)
Creates a
FieldAccessor for the field that is given by a field expression,
which can be used to get and set the specified field on instances of this type. |
Modifier and Type | Method and Description |
---|---|
TypeInformation<?>[] |
TableEnvironment$.getFieldTypes(TypeInformation<?> inputType)
Returns field types for a given
TypeInformation. |
static TypeInformation<?>[] |
TableEnvironment.getFieldTypes(TypeInformation<?> inputType)
Returns field types for a given
TypeInformation. |
TypeInformation<?>[] |
TableSchema.getTypes()
Returns all type information as an array.
|
static TypeInformation<?> |
Types.MAP(TypeInformation<?> keyType,
TypeInformation<?> valueType)
Generates type information for a Java HashMap.
|
TypeInformation<?> |
Types$.MAP(TypeInformation<?> keyType,
TypeInformation<?> valueType)
Generates type information for a Java HashMap.
|
static TypeInformation<?> |
Types.OBJECT_ARRAY(TypeInformation<?> elementType)
Generates type information for an array consisting of Java object elements.
|
TypeInformation<?> |
Types$.OBJECT_ARRAY(TypeInformation<?> elementType)
Generates type information for an array consisting of Java object elements.
|
static TypeInformation<?> |
Types.PRIMITIVE_ARRAY(TypeInformation<?> elementType)
Generates type information for an array consisting of Java primitive elements.
|
TypeInformation<?> |
Types$.PRIMITIVE_ARRAY(TypeInformation<?> elementType)
Generates type information for an array consisting of Java primitive elements.
|
static TypeInformation<Row> |
Types.ROW(scala.collection.Seq<TypeInformation<?>> types)
Generates row type information.
|
TypeInformation<Row> |
Types$.ROW(scala.collection.Seq<TypeInformation<?>> types)
Generates row type information.
|
static TypeInformation<Row> |
Types.ROW(String[] names,
TypeInformation<?>[] types)
Generates row type information.
|
TypeInformation<Row> |
Types$.ROW(String[] names,
TypeInformation<?>[] types)
Generates row type information.
|
static TypeInformation<Row> |
Types.ROW(TypeInformation<?>... types)
Generates row type information.
|
TypeInformation<Row> |
Types$.ROW(TypeInformation<?>... types)
Generates row type information.
|
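The `Types.ROW`, `Types.MAP`, and `Types.OBJECT_ARRAY` factories above can be combined to describe nested types. A minimal Java sketch, assuming the flink-table `Types` class of this release; the primitive factories `STRING()`, `LONG()`, `INT()` are assumed from the same class, and the field names are made up:

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.table.api.Types;
import org.apache.flink.types.Row;

public class RowTypeExample {
    // Builds a named row type: the names and types arrays must have equal length.
    static TypeInformation<Row> userType() {
        return Types.ROW(
            new String[] {"name", "visits", "tags"},
            new TypeInformation<?>[] {
                Types.STRING(),
                Types.LONG(),
                Types.OBJECT_ARRAY(Types.STRING())  // array of boxed String elements
            });
    }

    public static void main(String[] args) {
        TypeInformation<Row> userType = userType();
        // Map type describing a Java HashMap keyed by String.
        TypeInformation<?> counters = Types.MAP(Types.STRING(), Types.INT());
        System.out.println(userType);
        System.out.println(counters);
    }
}
```

The unnamed `Types.ROW(TypeInformation<?>...)` overload assigns default field names (`f0`, `f1`, ...) instead.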
Modifier and Type | Method and Description |
---|---|
scala.Option<TypeInformation<?>> |
TableSchema.getType(int columnIndex)
Returns the specified type information for the given column index.
|
scala.Option<TypeInformation<?>> |
TableSchema.getType(String columnName)
Returns the specified type information for the given column name.
|
Modifier and Type | Method and Description |
---|---|
protected <OUT> GeneratedFunction<MapFunction<Row,OUT>,OUT> |
TableEnvironment.generateRowConverterFunction(TypeInformation<Row> inputTypeInfo,
RowSchema schema,
TypeInformation<OUT> requestedTypeInfo,
String functionName) |
protected <OUT> GeneratedFunction<MapFunction<Row,OUT>,OUT> |
TableEnvironment.generateRowConverterFunction(TypeInformation<Row> inputTypeInfo,
RowSchema schema,
TypeInformation<OUT> requestedTypeInfo,
String functionName) |
protected <IN,OUT> scala.Option<MapFunction<IN,OUT>> |
BatchTableEnvironment.getConversionMapper(TypeInformation<IN> physicalTypeInfo,
RowSchema schema,
TypeInformation<OUT> requestedTypeInfo,
String functionName)
Creates a final converter that maps the internal row type to the external type.
|
protected <IN,OUT> scala.Option<MapFunction<IN,OUT>> |
BatchTableEnvironment.getConversionMapper(TypeInformation<IN> physicalTypeInfo,
RowSchema schema,
TypeInformation<OUT> requestedTypeInfo,
String functionName)
Creates a final converter that maps the internal row type to the external type.
|
protected <IN,OUT> MapFunction<IN,OUT> |
StreamTableEnvironment.getConversionMapper(TypeInformation<IN> physicalTypeInfo,
RowSchema schema,
TypeInformation<OUT> requestedTypeInfo,
String functionName)
Creates a final converter that maps the internal row type to the external type.
|
protected <IN,OUT> MapFunction<IN,OUT> |
StreamTableEnvironment.getConversionMapper(TypeInformation<IN> physicalTypeInfo,
RowSchema schema,
TypeInformation<OUT> requestedTypeInfo,
String functionName)
Creates a final converter that maps the internal row type to the external type.
|
int[] |
TableEnvironment$.getFieldIndices(TypeInformation<?> inputType)
Returns field indexes for a given
TypeInformation. |
static int[] |
TableEnvironment.getFieldIndices(TypeInformation<?> inputType)
Returns field indexes for a given
TypeInformation. |
protected <A> scala.Tuple2<String[],int[]> |
TableEnvironment.getFieldInfo(TypeInformation<A> inputType)
Returns field names and field positions for a given
TypeInformation. |
protected <A> scala.Tuple2<String[],int[]> |
TableEnvironment.getFieldInfo(TypeInformation<A> inputType,
Expression[] exprs)
Returns field names and field positions for a given
TypeInformation and an array of
Expression. |
<A> String[] |
TableEnvironment$.getFieldNames(TypeInformation<A> inputType)
Returns field names for a given
TypeInformation. |
static <A> String[] |
TableEnvironment.getFieldNames(TypeInformation<A> inputType)
Returns field names for a given
TypeInformation. |
TypeInformation<?>[] |
TableEnvironment$.getFieldTypes(TypeInformation<?> inputType)
Returns field types for a given
TypeInformation. |
static TypeInformation<?>[] |
TableEnvironment.getFieldTypes(TypeInformation<?> inputType)
Returns field types for a given
TypeInformation. |
static TypeInformation<?> |
Types.MAP(TypeInformation<?> keyType,
TypeInformation<?> valueType)
Generates type information for a Java HashMap.
|
static TypeInformation<?> |
Types.MAP(TypeInformation<?> keyType,
TypeInformation<?> valueType)
Generates type information for a Java HashMap.
|
TypeInformation<?> |
Types$.MAP(TypeInformation<?> keyType,
TypeInformation<?> valueType)
Generates type information for a Java HashMap.
|
TypeInformation<?> |
Types$.MAP(TypeInformation<?> keyType,
TypeInformation<?> valueType)
Generates type information for a Java HashMap.
|
static TypeInformation<?> |
Types.OBJECT_ARRAY(TypeInformation<?> elementType)
Generates type information for an array consisting of Java object elements.
|
TypeInformation<?> |
Types$.OBJECT_ARRAY(TypeInformation<?> elementType)
Generates type information for an array consisting of Java object elements.
|
static TypeInformation<?> |
Types.PRIMITIVE_ARRAY(TypeInformation<?> elementType)
Generates type information for an array consisting of Java primitive elements.
|
TypeInformation<?> |
Types$.PRIMITIVE_ARRAY(TypeInformation<?> elementType)
Generates type information for an array consisting of Java primitive elements.
|
<T,ACC> void |
TableEnvironment.registerAggregateFunctionInternal(String name,
AggregateFunction<T,ACC> function,
TypeInformation<T> evidence$2)
Registers an
AggregateFunction under a unique name. |
<T> void |
TableEnvironment.registerTableFunctionInternal(String name,
TableFunction<T> function,
TypeInformation<T> evidence$1)
Registers a
TableFunction under a unique name. |
static TypeInformation<Row> |
Types.ROW(String[] names,
TypeInformation<?>[] types)
Generates row type information.
|
TypeInformation<Row> |
Types$.ROW(String[] names,
TypeInformation<?>[] types)
Generates row type information.
|
static TypeInformation<Row> |
Types.ROW(TypeInformation<?>... types)
Generates row type information.
|
TypeInformation<Row> |
Types$.ROW(TypeInformation<?>... types)
Generates row type information.
|
protected <A> DataStream<A> |
StreamTableEnvironment.translate(org.apache.calcite.rel.RelNode logicalPlan,
org.apache.calcite.rel.type.RelDataType logicalType,
StreamQueryConfig queryConfig,
boolean withChangeFlag,
TypeInformation<A> tpe)
Translates a logical
RelNode into a DataStream. |
protected <A> DataSet<A> |
BatchTableEnvironment.translate(org.apache.calcite.rel.RelNode logicalPlan,
org.apache.calcite.rel.type.RelDataType logicalType,
TypeInformation<A> tpe)
Translates a logical
RelNode into a DataSet. |
protected <A> DataStream<A> |
StreamTableEnvironment.translate(Table table,
StreamQueryConfig queryConfig,
boolean updatesAsRetraction,
boolean withChangeFlag,
TypeInformation<A> tpe)
Translates a
Table into a DataStream. |
protected <A> DataSet<A> |
BatchTableEnvironment.translate(Table table,
TypeInformation<A> tpe)
Translates a
Table into a DataSet. |
void |
TableEnvironment$.validateType(TypeInformation<?> typeInfo)
Validates that the class represented by the typeInfo is static and globally accessible.
|
static void |
TableEnvironment.validateType(TypeInformation<?> typeInfo)
Validates that the class represented by the typeInfo is static and globally accessible.
|
Modifier and Type | Method and Description |
---|---|
static TypeInformation<Row> |
Types.ROW(scala.collection.Seq<TypeInformation<?>> types)
Generates row type information.
|
TypeInformation<Row> |
Types$.ROW(scala.collection.Seq<TypeInformation<?>> types)
Generates row type information.
|
Constructor and Description |
---|
TableSchema(String[] columnNames,
TypeInformation<?>[] columnTypes) |
Modifier and Type | Method and Description |
---|---|
<T> DataStream<T> |
StreamTableEnvironment.toAppendStream(Table table,
TypeInformation<T> typeInfo)
Converts the given
Table into an append DataStream of a specified type. |
<T> DataStream<T> |
StreamTableEnvironment.toAppendStream(Table table,
TypeInformation<T> typeInfo,
StreamQueryConfig queryConfig)
Converts the given
Table into an append DataStream of a specified type. |
<T> DataSet<T> |
BatchTableEnvironment.toDataSet(Table table,
TypeInformation<T> typeInfo)
Converts the given
Table into a DataSet of a specified type. |
<T> DataStream<T> |
StreamTableEnvironment.toDataStream(Table table,
TypeInformation<T> typeInfo)
Deprecated.
This method only supports conversion of append-only tables. In order to
make this more explicit in the future, please use toAppendStream() instead.
|
<T> DataStream<T> |
StreamTableEnvironment.toDataStream(Table table,
TypeInformation<T> typeInfo,
StreamQueryConfig queryConfig)
Deprecated.
This method only supports conversion of append-only tables. In order to
make this more explicit in the future, please use toAppendStream() instead.
|
<T> DataStream<Tuple2<Boolean,T>> |
StreamTableEnvironment.toRetractStream(Table table,
TypeInformation<T> typeInfo)
Converts the given
Table into a DataStream of add and retract messages. |
<T> DataStream<Tuple2<Boolean,T>> |
StreamTableEnvironment.toRetractStream(Table table,
TypeInformation<T> typeInfo,
StreamQueryConfig queryConfig)
Converts the given
Table into a DataStream of add and retract messages. |
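The append/retract distinction above can be sketched in Java, assuming the Flink 1.4-era `StreamTableEnvironment`; the table name, field names, and expressions are made up for illustration:

```java
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class TableToStreamExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = TableEnvironment.getTableEnvironment(env);

        DataStream<Tuple2<String, Long>> clicks =
            env.fromElements(Tuple2.of("alice", 1L), Tuple2.of("bob", 2L), Tuple2.of("alice", 3L));
        tEnv.registerDataStream("Clicks", clicks, "user, cnt");

        // A simple filter is append-only, so toAppendStream is safe.
        Table filtered = tEnv.scan("Clicks").filter("cnt > 1");
        DataStream<Row> appendStream = tEnv.toAppendStream(filtered, Row.class);

        // An aggregation updates earlier results, so use toRetractStream;
        // the Boolean flag is true for an add message, false for a retraction.
        Table counts = tEnv.scan("Clicks").groupBy("user").select("user, cnt.sum as total");
        DataStream<Tuple2<Boolean, Row>> retractStream = tEnv.toRetractStream(counts, Row.class);

        appendStream.print();
        retractStream.print();
        env.execute("table-to-stream");
    }
}
```

The `TypeInformation`-taking overloads in the table serve the same purpose when the target type cannot be expressed as a plain `Class`, e.g. for parameterized row types.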
Modifier and Type | Method and Description |
---|---|
Table |
TableFunctionConversions.apply(scala.collection.Seq<Expression> args,
TypeInformation<T> typeInfo)
Creates a
Table from a TableFunction in the Scala Table API. |
Cast |
ImplicitExpressionOperations.cast(TypeInformation<?> toType)
Converts a value to a given type.
|
<T,ACC> void |
BatchTableEnvironment.registerFunction(String name,
AggregateFunction<T,ACC> f,
TypeInformation<T> evidence$3)
Registers an
AggregateFunction under a unique name in the TableEnvironment's catalog. |
<T,ACC> void |
StreamTableEnvironment.registerFunction(String name,
AggregateFunction<T,ACC> f,
TypeInformation<T> evidence$8)
Registers an
AggregateFunction under a unique name in the TableEnvironment's catalog. |
<T> void |
BatchTableEnvironment.registerFunction(String name,
TableFunction<T> tf,
TypeInformation<T> evidence$2)
Registers a
TableFunction under a unique name in the TableEnvironment's catalog. |
<T> void |
StreamTableEnvironment.registerFunction(String name,
TableFunction<T> tf,
TypeInformation<T> evidence$7)
Registers a
TableFunction under a unique name in the TableEnvironment's catalog. |
<T> DataStream<T> |
TableConversions.toAppendStream(StreamQueryConfig queryConfig,
TypeInformation<T> evidence$5)
Converts the given
Table into an append DataStream of a specified type. |
<T> DataStream<T> |
StreamTableEnvironment.toAppendStream(Table table,
StreamQueryConfig queryConfig,
TypeInformation<T> evidence$4)
Converts the given
Table into an append DataStream of a specified type. |
<T> DataStream<T> |
StreamTableEnvironment.toAppendStream(Table table,
TypeInformation<T> evidence$3)
Converts the given
Table into an append DataStream of a specified type. |
<T> DataStream<T> |
TableConversions.toAppendStream(TypeInformation<T> evidence$4)
Converts the given
Table into an append DataStream of a specified type. |
<T> DataSet<T> |
BatchTableEnvironment.toDataSet(Table table,
TypeInformation<T> evidence$1)
Converts the given
Table into a DataSet of a specified type. |
<T> DataSet<T> |
TableConversions.toDataSet(TypeInformation<T> evidence$1)
Converts the given
Table into a DataSet of a specified type. |
<T> DataStream<T> |
TableConversions.toDataStream(StreamQueryConfig queryConfig,
TypeInformation<T> evidence$3)
Deprecated.
This method only supports conversion of append-only tables. In order to make this more explicit in the future, please use toAppendStream() instead.
|
<T> DataStream<T> |
StreamTableEnvironment.toDataStream(Table table,
StreamQueryConfig queryConfig,
TypeInformation<T> evidence$2)
Deprecated.
This method only supports conversion of append-only tables. In order to make this more explicit in the future, please use toAppendStream() instead.
|
<T> DataStream<T> |
StreamTableEnvironment.toDataStream(Table table,
TypeInformation<T> evidence$1)
Deprecated.
This method only supports conversion of append-only tables. In order to make this more explicit in the future, please use toAppendStream() instead.
|
<T> DataStream<T> |
TableConversions.toDataStream(TypeInformation<T> evidence$2)
Deprecated.
This method only supports conversion of append-only tables. In order to make this more explicit in the future, please use toAppendStream() instead.
|
<T> DataStream<scala.Tuple2<Object,T>> |
TableConversions.toRetractStream(StreamQueryConfig queryConfig,
TypeInformation<T> evidence$7)
Converts the
Table to a DataStream of add and retract messages. |
<T> DataStream<scala.Tuple2<Object,T>> |
StreamTableEnvironment.toRetractStream(Table table,
StreamQueryConfig queryConfig,
TypeInformation<T> evidence$6)
Converts the given
Table into a DataStream of add and retract messages. |
<T> DataStream<scala.Tuple2<Object,T>> |
StreamTableEnvironment.toRetractStream(Table table,
TypeInformation<T> evidence$5)
Converts the given
Table into a DataStream of add and retract messages. |
<T> DataStream<scala.Tuple2<Object,T>> |
TableConversions.toRetractStream(TypeInformation<T> evidence$6)
Converts the
Table to a DataStream of add and retract messages. |
<T,ACC> UDAGGExpression<T,ACC> |
ImplicitExpressionConversions.userDefinedAggFunctionConstructor(AggregateFunction<T,ACC> udagg,
TypeInformation<T> evidence$1) |
Constructor and Description |
---|
DataSetConversions(DataSet<T> dataSet,
TypeInformation<T> inputType) |
DataStreamConversions(DataStream<T> dataStream,
TypeInformation<T> inputType) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Row> |
FlinkTypeFactory$.toInternalRowTypeInfo(org.apache.calcite.rel.type.RelDataType logicalRowType)
Deprecated.
Use the RowSchema class instead because it handles both logical and physical rows.
|
static TypeInformation<Row> |
FlinkTypeFactory.toInternalRowTypeInfo(org.apache.calcite.rel.type.RelDataType logicalRowType)
Deprecated.
Use the RowSchema class instead because it handles both logical and physical rows.
|
TypeInformation<?> |
FlinkTypeFactory$.toTypeInfo(org.apache.calcite.rel.type.RelDataType relDataType) |
static TypeInformation<?> |
FlinkTypeFactory.toTypeInfo(org.apache.calcite.rel.type.RelDataType relDataType) |
Modifier and Type | Method and Description |
---|---|
org.apache.calcite.rel.type.RelDataType |
FlinkTypeFactory.createTypeFromTypeInfo(TypeInformation<?> typeInfo,
boolean isNullable) |
boolean |
FlinkTypeFactory$.isProctimeIndicatorType(TypeInformation<?> typeInfo) |
static boolean |
FlinkTypeFactory.isProctimeIndicatorType(TypeInformation<?> typeInfo) |
boolean |
FlinkTypeFactory$.isRowtimeIndicatorType(TypeInformation<?> typeInfo) |
static boolean |
FlinkTypeFactory.isRowtimeIndicatorType(TypeInformation<?> typeInfo) |
boolean |
FlinkTypeFactory$.isTimeIndicatorType(TypeInformation<?> typeInfo) |
static boolean |
FlinkTypeFactory.isTimeIndicatorType(TypeInformation<?> typeInfo) |
Modifier and Type | Method and Description |
---|---|
org.apache.calcite.rel.type.RelDataType |
FlinkTypeFactory.buildLogicalRowType(scala.collection.Seq<String> fieldNames,
scala.collection.Seq<TypeInformation<?>> fieldTypes,
scala.Option<scala.Tuple2<Object,String>> rowtime,
scala.Option<scala.Tuple2<Object,String>> proctime)
Creates a struct type with the given fieldNames and fieldTypes using the FlinkTypeFactory.
|
Modifier and Type | Method and Description |
---|---|
TypeInformation<T> |
GeneratedFunction.returnType() |
TypeInformation<T> |
GeneratedInput.returnType() |
Modifier and Type | Method and Description |
---|---|
void |
CodeGenerator.addReusableOutRecord(TypeInformation<?> ti)
Adds a reusable output record to the member area of the generated
Function. |
String |
CodeGenUtils$.boxedTypeTermForTypeInfo(TypeInformation<?> tpe) |
static String |
CodeGenUtils.boxedTypeTermForTypeInfo(TypeInformation<?> tpe) |
GeneratedExpression |
CodeGenerator.generateConverterResultExpression(TypeInformation<?> returnType,
scala.collection.Seq<String> resultFieldNames)
Generates an expression that converts the first input (and second input) into the given type.
|
<F extends Function,T> |
CodeGenerator.generateFunction(String name,
Class<F> clazz,
String bodyCode,
TypeInformation<T> returnType)
Generates a
Function that can be passed to the Java compiler. |
GeneratedExpression |
CodeGenerator.generateInputFieldUnboxing(TypeInformation<?> fieldType,
String fieldTerm)
Converts the external boxed format to an internal mostly primitive field representation.
|
GeneratedExpression |
CodeGenerator.generateNonNullLiteral(TypeInformation<?> literalType,
String literalCode) |
GeneratedExpression |
CodeGenerator.generateResultExpression(scala.collection.Seq<GeneratedExpression> fieldExprs,
TypeInformation<?> returnType,
scala.collection.Seq<String> resultFieldNames)
Generates an expression from a sequence of other expressions.
|
GeneratedExpression |
CodeGenerator.generateResultExpression(TypeInformation<?> returnType,
scala.collection.Seq<String> resultFieldNames,
scala.collection.Seq<org.apache.calcite.rex.RexNode> rexNodes)
Generates an expression from a sequence of RexNode.
|
GeneratedCollector |
CodeGenerator.generateTableFunctionCollector(String name,
String bodyCode,
TypeInformation<Object> collectedType)
Generates a
TableFunctionCollector that can be passed to a Java compiler. |
<T extends Row> |
CodeGenerator.generateValuesInputFormat(String name,
scala.collection.Seq<String> records,
TypeInformation<T> returnType)
Generates a values input format that can be passed to a Java compiler.
|
String |
CodeGenUtils$.internalToTimePointCode(TypeInformation<?> resultType,
String resultTerm) |
static String |
CodeGenUtils.internalToTimePointCode(TypeInformation<?> resultType,
String resultTerm) |
boolean |
CodeGenUtils$.isReference(TypeInformation<?> typeInfo) |
static boolean |
CodeGenUtils.isReference(TypeInformation<?> typeInfo) |
String |
CodeGenUtils$.primitiveDefaultValue(TypeInformation<?> tpe) |
static String |
CodeGenUtils.primitiveDefaultValue(TypeInformation<?> tpe) |
String |
CodeGenUtils$.primitiveTypeTermForTypeInfo(TypeInformation<?> tpe) |
static String |
CodeGenUtils.primitiveTypeTermForTypeInfo(TypeInformation<?> tpe) |
String |
CodeGenUtils$.superPrimitive(TypeInformation<?> typeInfo) |
static String |
CodeGenUtils.superPrimitive(TypeInformation<?> typeInfo) |
String |
CodeGenUtils$.timePointToInternalCode(TypeInformation<?> resultType,
String resultTerm) |
static String |
CodeGenUtils.timePointToInternalCode(TypeInformation<?> resultType,
String resultTerm) |
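The CodeGenUtils helpers above translate a field's TypeInformation into Java source-code terms — boxed and primitive type terms, and default-value literals — that the code generator splices into generated Function classes. As a rough, Flink-free illustration of the default-value idea, the sketch below maps plain Class tokens to initializer literals; the class name and the exact literals are illustrative assumptions, not Flink's actual mapping.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified, Flink-free sketch of what a helper like
// CodeGenUtils.primitiveDefaultValue conceptually does: map a field's type
// to the source-code literal used to initialize a generated member.
// Class name and literal choices here are illustrative, not Flink's code.
public class PrimitiveDefaults {
    private static final Map<Class<?>, String> DEFAULTS = new HashMap<>();
    static {
        DEFAULTS.put(boolean.class, "false");
        DEFAULTS.put(byte.class, "(byte) -1");
        DEFAULTS.put(short.class, "(short) -1");
        DEFAULTS.put(int.class, "-1");
        DEFAULTS.put(long.class, "-1L");
        DEFAULTS.put(float.class, "-1.0f");
        DEFAULTS.put(double.class, "-1.0d");
        DEFAULTS.put(char.class, "'\\0'");
    }

    /** Returns the initializer literal for a primitive type; reference types default to null. */
    public static String primitiveDefaultValue(Class<?> type) {
        return DEFAULTS.getOrDefault(type, "null");
    }

    public static void main(String[] args) {
        System.out.println(primitiveDefaultValue(long.class));
        System.out.println(primitiveDefaultValue(String.class));
    }
}
```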
Modifier and Type | Method and Description |
---|---|
GeneratedAggregationsFunction |
CodeGenerator.generateAggregations(String name,
CodeGenerator generator,
scala.collection.Seq<TypeInformation<?>> physicalInputTypes,
AggregateFunction<?,?>[] aggregates,
int[][] aggFields,
int[] aggMapping,
boolean partialResults,
int[] fwdMapping,
scala.Option<int[]> mergeMapping,
scala.Option<scala.Tuple2<Object,Object>[]> constantFlags,
int outputArity,
boolean needRetract,
boolean needMerge,
boolean needReset)
Generates a
GeneratedAggregations that can be
passed to a Java compiler. |
Constructor and Description |
---|
CodeGenerator(TableConfig config,
boolean nullableInput,
TypeInformation<?> input1,
scala.Option<TypeInformation<?>> input2,
scala.Option<int[]> input1FieldMapping,
scala.Option<int[]> input2FieldMapping) |
CodeGenerator(TableConfig config,
boolean nullableInput,
TypeInformation<Object> input,
int[] inputFieldMapping)
A code generator for generating unary Flink
Functions with one input. |
GeneratedExpression(String resultTerm,
String nullTerm,
String code,
TypeInformation<?> resultType) |
GeneratedFunction(String name,
TypeInformation<T> returnType,
String code) |
GeneratedInput(String name,
TypeInformation<T> returnType,
String code) |
Constructor and Description |
---|
CodeGenerator(TableConfig config,
boolean nullableInput,
TypeInformation<?> input1,
scala.Option<TypeInformation<?>> input2,
scala.Option<int[]> input1FieldMapping,
scala.Option<int[]> input2FieldMapping) |
Modifier and Type | Method and Description |
---|---|
GeneratedExpression |
ScalarOperators$.generateArithmeticOperator(String operator,
boolean nullCheck,
TypeInformation<?> resultType,
GeneratedExpression left,
GeneratedExpression right) |
static GeneratedExpression |
ScalarOperators.generateArithmeticOperator(String operator,
boolean nullCheck,
TypeInformation<?> resultType,
GeneratedExpression left,
GeneratedExpression right) |
GeneratedExpression |
ScalarOperators$.generateArray(CodeGenerator codeGenerator,
TypeInformation<?> resultType,
scala.collection.Seq<GeneratedExpression> elements) |
static GeneratedExpression |
ScalarOperators.generateArray(CodeGenerator codeGenerator,
TypeInformation<?> resultType,
scala.collection.Seq<GeneratedExpression> elements) |
GeneratedExpression |
CallGenerator$.generateCallIfArgsNotNull(boolean nullCheck,
TypeInformation<?> returnType,
scala.collection.Seq<GeneratedExpression> operands,
scala.Function1<scala.collection.Seq<String>,String> call) |
GeneratedExpression |
ScalarOperators$.generateCast(boolean nullCheck,
GeneratedExpression operand,
TypeInformation<?> targetType) |
static GeneratedExpression |
ScalarOperators.generateCast(boolean nullCheck,
GeneratedExpression operand,
TypeInformation<?> targetType) |
GeneratedExpression |
ScalarOperators$.generateIfElse(boolean nullCheck,
scala.collection.Seq<GeneratedExpression> operands,
TypeInformation<?> resultType,
int i) |
static GeneratedExpression |
ScalarOperators.generateIfElse(boolean nullCheck,
scala.collection.Seq<GeneratedExpression> operands,
TypeInformation<?> resultType,
int i) |
GeneratedExpression |
ScalarOperators$.generateUnaryArithmeticOperator(String operator,
boolean nullCheck,
TypeInformation<?> resultType,
GeneratedExpression operand) |
static GeneratedExpression |
ScalarOperators.generateUnaryArithmeticOperator(String operator,
boolean nullCheck,
TypeInformation<?> resultType,
GeneratedExpression operand) |
static scala.Option<CallGenerator> |
FunctionGenerator.getCallGenerator(org.apache.calcite.sql.SqlOperator sqlOperator,
scala.collection.Seq<TypeInformation<?>> operandTypes,
TypeInformation<?> resultType)
Returns a
CallGenerator that generates all required code for calling the given
SqlOperator . |
scala.Option<CallGenerator> |
FunctionGenerator$.getCallGenerator(org.apache.calcite.sql.SqlOperator sqlOperator,
scala.collection.Seq<TypeInformation<?>> operandTypes,
TypeInformation<?> resultType)
Returns a
CallGenerator that generates all required code for calling the given
SqlOperator . |
Modifier and Type | Method and Description |
---|---|
static scala.Option<CallGenerator> |
FunctionGenerator.getCallGenerator(org.apache.calcite.sql.SqlOperator sqlOperator,
scala.collection.Seq<TypeInformation<?>> operandTypes,
TypeInformation<?> resultType)
Returns a
CallGenerator that generates all required code for calling the given
SqlOperator . |
scala.Option<CallGenerator> |
FunctionGenerator$.getCallGenerator(org.apache.calcite.sql.SqlOperator sqlOperator,
scala.collection.Seq<TypeInformation<?>> operandTypes,
TypeInformation<?> resultType)
Returns a
CallGenerator that generates all required code for calling the given
SqlOperator . |
Constructor and Description |
---|
ConstantCallGen(TypeInformation<?> targetType,
String constantCode) |
CurrentTimePointCallGen(TypeInformation<?> targetType,
boolean local) |
MethodCallGen(TypeInformation<?> returnType,
Method method) |
ScalarFunctionCallGen(ScalarFunction scalarFunction,
scala.collection.Seq<TypeInformation<?>> signature,
TypeInformation<?> returnType) |
TableFunctionCallGen(TableFunction<?> tableFunction,
scala.collection.Seq<TypeInformation<?>> signature,
TypeInformation<?> returnType) |
Constructor and Description |
---|
ScalarFunctionCallGen(ScalarFunction scalarFunction,
scala.collection.Seq<TypeInformation<?>> signature,
TypeInformation<?> returnType) |
TableFunctionCallGen(TableFunction<?> tableFunction,
scala.collection.Seq<TypeInformation<?>> signature,
TypeInformation<?> returnType) |
Modifier and Type | Method and Description |
---|---|
static scala.util.parsing.combinator.PackratParsers.PackratParser<TypeInformation<?>> |
ExpressionParser.dataType() |
scala.util.parsing.combinator.PackratParsers.PackratParser<TypeInformation<?>> |
ExpressionParser$.dataType() |
scala.collection.Seq<TypeInformation<?>> |
Ln.expectedTypes() |
scala.collection.Seq<TypeInformation<?>> |
Position.expectedTypes() |
scala.collection.Seq<TypeInformation<?>> |
Overlay.expectedTypes() |
scala.collection.Seq<TypeInformation<?>> |
Upper.expectedTypes() |
scala.collection.Seq<TypeInformation<?>> |
Exp.expectedTypes() |
scala.collection.Seq<TypeInformation<?>> |
InputTypeSpec.expectedTypes()
Input type specification for each child.
|
scala.collection.Seq<TypeInformation<?>> |
Power.expectedTypes() |
scala.collection.Seq<TypeInformation<?>> |
Quarter.expectedTypes() |
scala.collection.Seq<TypeInformation<?>> |
Sqrt.expectedTypes() |
scala.collection.Seq<TypeInformation<?>> |
Log10.expectedTypes() |
scala.collection.Seq<TypeInformation<?>> |
Substring.expectedTypes() |
scala.Option<TypeInformation<?>> |
WindowReference.tpe() |
Constructor and Description |
---|
Cast(Expression child,
TypeInformation<?> resultType) |
CurrentTimePoint(TypeInformation<?> targetType,
boolean local) |
Literal(Object value,
TypeInformation<?> resultType) |
Null(TypeInformation<?> resultType) |
ResolvedFieldReference(String name,
TypeInformation<?> resultType) |
TableFunctionCall(String functionName,
TableFunction<?> tableFunction,
scala.collection.Seq<Expression> parameters,
TypeInformation<?> resultType) |
UDAGGExpression(AggregateFunction<T,ACC> aggregateFunction,
TypeInformation<T> evidence$1) |
Constructor and Description |
---|
WindowReference(String name,
scala.Option<TypeInformation<?>> tpe) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<?>[] |
ScalarFunction.getParameterTypes(Class<?>[] signature)
Returns
TypeInformation about the operands of the evaluation method with a given
signature. |
TypeInformation<T> |
TableFunction.getResultType()
Returns the result type of the evaluation method with a given signature.
|
TypeInformation<?> |
ScalarFunction.getResultType(Class<?>[] signature)
Returns the result type of the evaluation method with a given signature.
|
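The methods above describe the result-type contract of user-defined functions: a function may declare its result type explicitly, and the framework otherwise falls back to extracting it from the matching eval method (as getResultTypeOfScalarFunction does via the TypeExtractor). The sketch below models that contract without Flink, using plain Class tokens in place of TypeInformation; all class and method names here are illustrative assumptions, not Flink's actual API.

```java
import java.lang.reflect.Method;

// Flink-free sketch of the result-type contract: subclasses may declare a
// result type explicitly; a null return means "infer it from the eval method",
// which we do here by reflecting on the matching eval method's return type.
public class ResultTypeInference {

    /** Base class: subclasses may override to declare the result type explicitly. */
    public static abstract class SketchScalarFunction {
        public Class<?> getResultType(Class<?>[] signature) {
            return null; // null means "infer from the eval method"
        }
    }

    /** Resolves the result type, falling back to reflection on "eval". */
    public static Class<?> resolveResultType(SketchScalarFunction f, Class<?>[] signature)
            throws NoSuchMethodException {
        Class<?> declared = f.getResultType(signature);
        if (declared != null) {
            return declared;
        }
        Method eval = f.getClass().getMethod("eval", signature);
        return eval.getReturnType();
    }

    /** Example function: its result type is inferred from eval's return type. */
    public static class HashCode extends SketchScalarFunction {
        public int eval(String s) { return s.hashCode(); }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(resolveResultType(new HashCode(), new Class<?>[] {String.class}));
    }
}
```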
Modifier and Type | Method and Description |
---|---|
static TypeInformation<?>[] |
UserDefinedFunctionUtils.getParameterTypes(UserDefinedFunction function,
Class<?>[] signature) |
TypeInformation<?>[] |
UserDefinedFunctionUtils$.getParameterTypes(UserDefinedFunction function,
Class<?>[] signature) |
static TypeInformation<?> |
UserDefinedFunctionUtils.getResultTypeOfAggregateFunction(AggregateFunction<?,?> aggregateFunction,
TypeInformation<?> extractedType)
Internal method of AggregateFunction#getResultType() that does some pre-checking and uses
TypeExtractor as default return type inference. |
TypeInformation<?> |
UserDefinedFunctionUtils$.getResultTypeOfAggregateFunction(AggregateFunction<?,?> aggregateFunction,
TypeInformation<?> extractedType)
Internal method of AggregateFunction#getResultType() that does some pre-checking and uses
TypeExtractor as default return type inference. |
static TypeInformation<?> |
UserDefinedFunctionUtils.getResultTypeOfScalarFunction(ScalarFunction function,
Class<?>[] signature)
Internal method of
ScalarFunction#getResultType() that does some pre-checking and uses
TypeExtractor as default return type inference. |
TypeInformation<?> |
UserDefinedFunctionUtils$.getResultTypeOfScalarFunction(ScalarFunction function,
Class<?>[] signature)
Internal method of
ScalarFunction#getResultType() that does some pre-checking and uses
TypeExtractor as default return type inference. |
TypeInformation<?> |
TableSqlFunction.getRowTypeInfo()
Get the type information of the table returned by the table function.
|
Modifier and Type | Method and Description |
---|---|
static scala.collection.Seq<TypeInformation<?>> |
UserDefinedFunctionUtils.getOperandTypeInfo(org.apache.calcite.sql.SqlCallBinding callBinding) |
scala.collection.Seq<TypeInformation<?>> |
UserDefinedFunctionUtils$.getOperandTypeInfo(org.apache.calcite.sql.SqlCallBinding callBinding) |
Modifier and Type | Method and Description |
---|---|
AggSqlFunction |
AggSqlFunction$.apply(String name,
AggregateFunction<?,?> aggregateFunction,
TypeInformation<?> returnType,
FlinkTypeFactory typeFactory,
boolean requiresOver) |
static AggSqlFunction |
AggSqlFunction.apply(String name,
AggregateFunction<?,?> aggregateFunction,
TypeInformation<?> returnType,
FlinkTypeFactory typeFactory,
boolean requiresOver) |
static TableSqlFunction |
TableSqlFunction.apply(String name,
TableFunction<?> udtf,
TypeInformation<?> rowTypeInfo,
FlinkTypeFactory typeFactory,
FlinkTableFunctionImpl<?> functionImpl)
Utility function to create a
TableSqlFunction. |
TableSqlFunction |
TableSqlFunction$.apply(String name,
TableFunction<?> udtf,
TypeInformation<?> rowTypeInfo,
FlinkTypeFactory typeFactory,
FlinkTableFunctionImpl<?> functionImpl)
Utility function to create a
TableSqlFunction. |
static org.apache.calcite.sql.SqlFunction |
UserDefinedFunctionUtils.createAggregateSqlFunction(String name,
AggregateFunction<?,?> aggFunction,
TypeInformation<?> typeInfo,
FlinkTypeFactory typeFactory)
Create a
SqlFunction for an AggregateFunction. |
org.apache.calcite.sql.SqlFunction |
UserDefinedFunctionUtils$.createAggregateSqlFunction(String name,
AggregateFunction<?,?> aggFunction,
TypeInformation<?> typeInfo,
FlinkTypeFactory typeFactory)
Create a
SqlFunction for an AggregateFunction. |
org.apache.calcite.sql.type.SqlReturnTypeInference |
AggSqlFunction$.createReturnTypeInference(TypeInformation<?> resultType,
FlinkTypeFactory typeFactory) |
static org.apache.calcite.sql.type.SqlReturnTypeInference |
AggSqlFunction.createReturnTypeInference(TypeInformation<?> resultType,
FlinkTypeFactory typeFactory) |
static scala.collection.Seq<org.apache.calcite.sql.SqlFunction> |
UserDefinedFunctionUtils.createTableSqlFunctions(String name,
TableFunction<?> tableFunction,
TypeInformation<?> resultType,
FlinkTypeFactory typeFactory)
Create a
SqlFunction for every eval method of a TableFunction. |
scala.collection.Seq<org.apache.calcite.sql.SqlFunction> |
UserDefinedFunctionUtils$.createTableSqlFunctions(String name,
TableFunction<?> tableFunction,
TypeInformation<?> resultType,
FlinkTypeFactory typeFactory)
Create a
SqlFunction for every eval method of a TableFunction. |
static scala.Tuple3<String[],int[],TypeInformation<?>[]> |
UserDefinedFunctionUtils.getFieldInfo(TypeInformation<?> inputType)
Returns field names and field positions for a given
TypeInformation . |
scala.Tuple3<String[],int[],TypeInformation<?>[]> |
UserDefinedFunctionUtils$.getFieldInfo(TypeInformation<?> inputType)
Returns field names and field positions for a given
TypeInformation . |
static TypeInformation<?> |
UserDefinedFunctionUtils.getResultTypeOfAggregateFunction(AggregateFunction<?,?> aggregateFunction,
TypeInformation<?> extractedType)
Internal method of AggregateFunction#getResultType() that does some pre-checking and uses
TypeExtractor as default return type inference. |
TypeInformation<?> |
UserDefinedFunctionUtils$.getResultTypeOfAggregateFunction(AggregateFunction<?,?> aggregateFunction,
TypeInformation<?> extractedType)
Internal method of AggregateFunction#getResultType() that does some pre-checking and uses
TypeExtractor as default return type inference. |
Modifier and Type | Method and Description |
---|---|
static scala.Option<Class<?>[]> |
UserDefinedFunctionUtils.getAccumulateMethodSignature(AggregateFunction<?,?> function,
scala.collection.Seq<TypeInformation<?>> signature)
Returns signatures of accumulate methods matching the given signature of
TypeInformation . |
scala.Option<Class<?>[]> |
UserDefinedFunctionUtils$.getAccumulateMethodSignature(AggregateFunction<?,?> function,
scala.collection.Seq<TypeInformation<?>> signature)
Returns signatures of accumulate methods matching the given signature of
TypeInformation . |
static scala.Option<Class<?>[]> |
UserDefinedFunctionUtils.getEvalMethodSignature(UserDefinedFunction function,
scala.collection.Seq<TypeInformation<?>> signature)
Returns signatures of eval methods matching the given signature of
TypeInformation . |
scala.Option<Class<?>[]> |
UserDefinedFunctionUtils$.getEvalMethodSignature(UserDefinedFunction function,
scala.collection.Seq<TypeInformation<?>> signature)
Returns signatures of eval methods matching the given signature of
TypeInformation . |
static String |
UserDefinedFunctionUtils.signatureToString(scala.collection.Seq<TypeInformation<?>> signature)
Prints one signature consisting of TypeInformation.
|
String |
UserDefinedFunctionUtils$.signatureToString(scala.collection.Seq<TypeInformation<?>> signature)
Prints one signature consisting of TypeInformation.
|
static Class<?>[] |
UserDefinedFunctionUtils.typeInfoToClass(scala.collection.Seq<TypeInformation<?>> typeInfos)
Extracts type classes of
TypeInformation in a null-aware way. |
Class<?>[] |
UserDefinedFunctionUtils$.typeInfoToClass(scala.collection.Seq<TypeInformation<?>> typeInfos)
Extracts type classes of
TypeInformation in a null-aware way. |
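The getAccumulateMethodSignature / getEvalMethodSignature utilities above resolve which overload of a user-defined function matches a given argument signature. A simplified, Flink-free sketch of that lookup follows, matching by assignability over plain Class tokens rather than TypeInformation; the class names and the toy Split function are illustrative assumptions.

```java
import java.lang.reflect.Method;
import java.util.Optional;

// Flink-free sketch of the idea behind
// UserDefinedFunctionUtils.getEvalMethodSignature: given candidate argument
// types, find a public "eval" method whose parameters can accept them.
public class EvalMethodMatcher {

    /** Finds an "eval" method of {@code function} whose parameters accept {@code argTypes}. */
    public static Optional<Class<?>[]> getEvalMethodSignature(Object function, Class<?>[] argTypes) {
        for (Method m : function.getClass().getMethods()) {
            if (!m.getName().equals("eval")) continue;
            Class<?>[] params = m.getParameterTypes();
            if (params.length != argTypes.length) continue;
            boolean matches = true;
            for (int i = 0; i < params.length; i++) {
                if (!params[i].isAssignableFrom(argTypes[i])) { matches = false; break; }
            }
            if (matches) return Optional.of(params);
        }
        return Optional.empty();
    }

    // A toy "table function" with two overloaded eval methods.
    public static class Split {
        public void eval(String s) {}
        public void eval(String s, int limit) {}
    }

    public static void main(String[] args) {
        Optional<Class<?>[]> sig =
            getEvalMethodSignature(new Split(), new Class<?>[] {String.class, int.class});
        System.out.println(sig.isPresent());
    }
}
```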
Constructor and Description |
---|
AggSqlFunction(String name,
AggregateFunction<?,?> aggregateFunction,
TypeInformation<?> returnType,
FlinkTypeFactory typeFactory,
boolean requiresOver) |
TableSqlFunction(String name,
TableFunction<?> udtf,
TypeInformation<?> rowTypeInfo,
org.apache.calcite.sql.type.SqlReturnTypeInference returnTypeInference,
org.apache.calcite.sql.type.SqlOperandTypeInference operandTypeInference,
org.apache.calcite.sql.type.SqlOperandTypeChecker operandTypeChecker,
List<org.apache.calcite.rel.type.RelDataType> paramTypes,
FlinkTableFunctionImpl<?> functionImpl) |
Constructor and Description |
---|
LogicalTableFunctionCall(String functionName,
TableFunction<?> tableFunction,
scala.collection.Seq<Expression> parameters,
TypeInformation<?> resultType,
String[] fieldNames,
LogicalNode child) |
Modifier and Type | Method and Description |
---|---|
GeneratedCollector |
CommonCorrelate.generateCollector(TableConfig config,
RowSchema inputSchema,
TypeInformation<Object> udtfTypeInfo,
RowSchema returnSchema,
scala.Option<org.apache.calcite.rex.RexNode> condition,
scala.Option<int[]> pojoFieldMapping)
Generates a table function collector.
|
<F extends Function> |
CommonScan.generatedConversionFunction(TableConfig config,
Class<F> functionClass,
TypeInformation<Object> inputType,
TypeInformation<Row> expectedType,
String conversionOperatorName,
scala.collection.Seq<String> fieldNames,
scala.Option<int[]> inputFieldMapping) |
<F extends Function> |
CommonScan.generatedConversionFunction(TableConfig config,
Class<F> functionClass,
TypeInformation<Object> inputType,
TypeInformation<Row> expectedType,
String conversionOperatorName,
scala.collection.Seq<String> fieldNames,
scala.Option<int[]> inputFieldMapping) |
<T extends Function> |
CommonCorrelate.generateFunction(TableConfig config,
RowSchema inputSchema,
TypeInformation<Object> udtfTypeInfo,
RowSchema returnSchema,
org.apache.calcite.sql.SemiJoinType joinType,
org.apache.calcite.rex.RexCall rexCall,
scala.Option<int[]> pojoFieldMapping,
String ruleDescription,
Class<T> functionClass)
Generates the flat map function to run the user-defined table function.
|
boolean |
CommonScan.needsConversion(TypeInformation<Object> externalTypeInfo,
TypeInformation<T> internalTypeInfo)
Checks whether the input type is exactly the same as the internal row type.
|
boolean |
CommonScan.needsConversion(TypeInformation<Object> externalTypeInfo,
TypeInformation<T> internalTypeInfo)
Checks whether the input type is exactly the same as the internal row type.
|
Modifier and Type | Method and Description |
---|---|
static TypeInformation<?>[] |
DataSetTable.fieldTypes() |
static TypeInformation<?>[] |
StreamTableSourceTable.fieldTypes() |
TypeInformation<?>[] |
FlinkTableFunctionImpl.fieldTypes() |
TypeInformation<?>[] |
FlinkTable.fieldTypes() |
static TypeInformation<?>[] |
TableSourceTable.fieldTypes() |
static TypeInformation<?>[] |
DataStreamTable.fieldTypes() |
TypeInformation<Row> |
RowSchema.physicalTypeInfo()
Returns a physical
TypeInformation of row with no logical fields (i.e. |
static TypeInformation<T> |
DataSetTable.typeInfo() |
static TypeInformation<T> |
StreamTableSourceTable.typeInfo() |
TypeInformation<T> |
FlinkTableFunctionImpl.typeInfo() |
TypeInformation<T> |
FlinkTable.typeInfo() |
static TypeInformation<T> |
TableSourceTable.typeInfo() |
static TypeInformation<T> |
DataStreamTable.typeInfo() |
Modifier and Type | Method and Description |
---|---|
scala.collection.Seq<TypeInformation<?>> |
RowSchema.physicalFieldTypeInfo()
Returns
TypeInformation of the row's fields with no logical fields (i.e. |
Constructor and Description |
---|
ArrayRelDataType(TypeInformation<?> typeInfo,
org.apache.calcite.rel.type.RelDataType elementType,
boolean isNullable) |
FlinkTable(TypeInformation<T> typeInfo,
int[] fieldIndexes,
String[] fieldNames,
FlinkStatistic statistic) |
FlinkTableFunctionImpl(TypeInformation<T> typeInfo,
int[] fieldIndexes,
String[] fieldNames,
Method evalMethod) |
GenericRelDataType(TypeInformation<?> typeInfo,
boolean nullable,
org.apache.calcite.rel.type.RelDataTypeSystem typeSystem) |
MapRelDataType(TypeInformation<?> typeInfo,
org.apache.calcite.rel.type.RelDataType keyType,
org.apache.calcite.rel.type.RelDataType valueType,
boolean isNullable) |
Modifier and Type | Method and Description |
---|---|
TableFunction<?> |
ExplodeFunctionUtil$.explodeTableFuncFromType(TypeInformation<?> ti) |
static TableFunction<?> |
ExplodeFunctionUtil.explodeTableFuncFromType(TypeInformation<?> ti) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Row> |
DataSetWindowAggMapFunction.getProducedType() |
TypeInformation<Row> |
DataSetSessionWindowAggregatePreProcessor.getProducedType() |
TypeInformation<Row> |
DataSetSlideTimeWindowAggFlatMapFunction.getProducedType() |
TypeInformation<Row> |
DataSetSlideTimeWindowAggReduceGroupFunction.getProducedType() |
TypeInformation<Row> |
DataSetSessionWindowAggregatePreProcessor.intermediateRowType() |
Modifier and Type | Method and Description |
---|---|
static scala.Tuple3<scala.Option<DataSetPreAggFunction>,scala.Option<TypeInformation<Row>>,RichGroupReduceFunction<Row,Row>> |
AggregateUtil.createDataSetAggregateFunctions(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
org.apache.calcite.rel.type.RelDataType outputType,
int[] groupings,
boolean inGroupingSet)
Create functions to compute a
DataSetAggregate . |
scala.Tuple3<scala.Option<DataSetPreAggFunction>,scala.Option<TypeInformation<Row>>,RichGroupReduceFunction<Row,Row>> |
AggregateUtil$.createDataSetAggregateFunctions(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
org.apache.calcite.rel.type.RelDataType outputType,
int[] groupings,
boolean inGroupingSet)
Create functions to compute a
DataSetAggregate . |
Modifier and Type | Method and Description |
---|---|
static ProcessFunction<CRow,CRow> |
AggregateUtil.createBoundedOverProcessFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
TypeInformation<Row> inputTypeInfo,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
long precedingOffset,
StreamQueryConfig queryConfig,
boolean isRowsClause,
boolean isRowTimeType)
Create a
ProcessFunction for a ROWS-clause bounded OVER window
to evaluate the final aggregate value. |
ProcessFunction<CRow,CRow> |
AggregateUtil$.createBoundedOverProcessFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
TypeInformation<Row> inputTypeInfo,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
long precedingOffset,
StreamQueryConfig queryConfig,
boolean isRowsClause,
boolean isRowTimeType)
Create a
ProcessFunction for a ROWS-clause bounded OVER window
to evaluate the final aggregate value. |
static FlatMapFunction<Row,Row> |
AggregateUtil.createDataSetSlideWindowPrepareFlatMapFunction(LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
int[] groupings,
TypeInformation<Row> inputType,
boolean isParserCaseSensitive)
Create a
FlatMapFunction that prepares for
non-incremental aggregates of sliding windows (time-windows). |
FlatMapFunction<Row,Row> |
AggregateUtil$.createDataSetSlideWindowPrepareFlatMapFunction(LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
int[] groupings,
TypeInformation<Row> inputType,
boolean isParserCaseSensitive)
Create a
FlatMapFunction that prepares for
non-incremental aggregates of sliding windows (time-windows). |
static ProcessFunction<CRow,CRow> |
AggregateUtil.createUnboundedOverProcessFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
TypeInformation<Row> inputTypeInfo,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
StreamQueryConfig queryConfig,
boolean isRowTimeType,
boolean isPartitioned,
boolean isRowsClause)
Create a
ProcessFunction for an unbounded OVER window
to evaluate the final aggregate value. |
ProcessFunction<CRow,CRow> |
AggregateUtil$.createUnboundedOverProcessFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
TypeInformation<Row> inputTypeInfo,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
StreamQueryConfig queryConfig,
boolean isRowTimeType,
boolean isPartitioned,
boolean isRowsClause)
Create a
ProcessFunction for an unbounded OVER window
to evaluate the final aggregate value. |
Modifier and Type | Method and Description |
---|---|
static ProcessFunction<CRow,CRow> |
AggregateUtil.createBoundedOverProcessFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
TypeInformation<Row> inputTypeInfo,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
long precedingOffset,
StreamQueryConfig queryConfig,
boolean isRowsClause,
boolean isRowTimeType)
Create a
ProcessFunction for a ROWS-clause bounded OVER window
to evaluate the final aggregate value. |
ProcessFunction<CRow,CRow> |
AggregateUtil$.createBoundedOverProcessFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
TypeInformation<Row> inputTypeInfo,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
long precedingOffset,
StreamQueryConfig queryConfig,
boolean isRowsClause,
boolean isRowTimeType)
Create a
ProcessFunction for a ROWS-clause bounded OVER window
to evaluate the final aggregate value. |
static scala.Tuple3<scala.Option<DataSetPreAggFunction>,scala.Option<TypeInformation<Row>>,RichGroupReduceFunction<Row,Row>> |
AggregateUtil.createDataSetAggregateFunctions(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
org.apache.calcite.rel.type.RelDataType outputType,
int[] groupings,
boolean inGroupingSet)
Create functions to compute a
DataSetAggregate . |
scala.Tuple3<scala.Option<DataSetPreAggFunction>,scala.Option<TypeInformation<Row>>,RichGroupReduceFunction<Row,Row>> |
AggregateUtil$.createDataSetAggregateFunctions(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
org.apache.calcite.rel.type.RelDataType outputType,
int[] groupings,
boolean inGroupingSet)
Create functions to compute a
DataSetAggregate . |
static RichGroupReduceFunction<Row,Row> |
AggregateUtil.createDataSetSlideWindowPrepareGroupReduceFunction(CodeGenerator generator,
LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
int[] groupings,
org.apache.calcite.rel.type.RelDataType physicalInputRowType,
scala.collection.Seq<TypeInformation<?>> physicalInputTypes,
boolean isParserCaseSensitive)
Create a
GroupReduceFunction that prepares for
partial aggregates of sliding windows (time and count-windows). |
RichGroupReduceFunction<Row,Row> |
AggregateUtil$.createDataSetSlideWindowPrepareGroupReduceFunction(CodeGenerator generator,
LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
int[] groupings,
org.apache.calcite.rel.type.RelDataType physicalInputRowType,
scala.collection.Seq<TypeInformation<?>> physicalInputTypes,
boolean isParserCaseSensitive)
Create a
GroupReduceFunction that prepares for
partial aggregates of sliding windows (time and count-windows). |
static GroupCombineFunction<Row,Row> |
AggregateUtil.createDataSetWindowAggregationCombineFunction(CodeGenerator generator,
LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType physicalInputRowType,
scala.collection.Seq<TypeInformation<?>> physicalInputTypes,
int[] groupings)
Create a
GroupCombineFunction that performs pre-aggregation
for aggregates. |
GroupCombineFunction<Row,Row> |
AggregateUtil$.createDataSetWindowAggregationCombineFunction(CodeGenerator generator,
LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType physicalInputRowType,
scala.collection.Seq<TypeInformation<?>> physicalInputTypes,
int[] groupings)
Create a
GroupCombineFunction that performs pre-aggregation
for aggregates. |
static RichGroupReduceFunction<Row,Row> |
AggregateUtil.createDataSetWindowAggregationGroupReduceFunction(CodeGenerator generator,
LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType physicalInputRowType,
scala.collection.Seq<TypeInformation<?>> physicalInputTypes,
org.apache.calcite.rel.type.RelDataType outputType,
int[] groupings,
scala.collection.Seq<FlinkRelBuilder.NamedWindowProperty> properties,
boolean isInputCombined)
Create a
GroupReduceFunction to compute window
aggregates on batch tables. |
RichGroupReduceFunction<Row,Row> |
AggregateUtil$.createDataSetWindowAggregationGroupReduceFunction(CodeGenerator generator,
LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType physicalInputRowType,
scala.collection.Seq<TypeInformation<?>> physicalInputTypes,
org.apache.calcite.rel.type.RelDataType outputType,
int[] groupings,
scala.collection.Seq<FlinkRelBuilder.NamedWindowProperty> properties,
boolean isInputCombined)
Create a
GroupReduceFunction to compute window
aggregates on batch tables. |
static MapPartitionFunction<Row,Row> |
AggregateUtil.createDataSetWindowAggregationMapPartitionFunction(CodeGenerator generator,
LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType physicalInputRowType,
scala.collection.Seq<TypeInformation<?>> physicalInputTypes,
int[] groupings)
Create a
MapPartitionFunction that performs aggregation
for aggregates. |
MapPartitionFunction<Row,Row> |
AggregateUtil$.createDataSetWindowAggregationMapPartitionFunction(CodeGenerator generator,
LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType physicalInputRowType,
scala.collection.Seq<TypeInformation<?>> physicalInputTypes,
int[] groupings)
Create a
MapPartitionFunction that performs aggregation
for aggregates. |
static MapFunction<Row,Row> |
AggregateUtil.createDataSetWindowPrepareMapFunction(CodeGenerator generator,
LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
int[] groupings,
org.apache.calcite.rel.type.RelDataType inputType,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
boolean isParserCaseSensitive)
Create a
MapFunction that prepares for aggregates. |
MapFunction<Row,Row> |
AggregateUtil$.createDataSetWindowPrepareMapFunction(CodeGenerator generator,
LogicalWindow window,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
int[] groupings,
org.apache.calcite.rel.type.RelDataType inputType,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
boolean isParserCaseSensitive)
Create a
MapFunction that prepares for aggregates. |
static scala.Tuple3<AggregateFunction<CRow,Row,Row>,RowTypeInfo,RowTypeInfo> |
AggregateUtil.createDataStreamAggregateFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
org.apache.calcite.rel.type.RelDataType outputType,
int[] groupingKeys,
boolean needMerge) |
scala.Tuple3<AggregateFunction<CRow,Row,Row>,RowTypeInfo,RowTypeInfo> |
AggregateUtil$.createDataStreamAggregateFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
org.apache.calcite.rel.type.RelDataType outputType,
int[] groupingKeys,
boolean needMerge) |
static ProcessFunction<CRow,CRow> |
AggregateUtil.createGroupAggregateFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputRowType,
scala.collection.Seq<TypeInformation<?>> inputFieldTypes,
int[] groupings,
StreamQueryConfig queryConfig,
boolean generateRetraction,
boolean consumeRetraction)
Create a
ProcessFunction for group (without
window) aggregates to evaluate the final aggregate value. |
ProcessFunction<CRow,CRow> |
AggregateUtil$.createGroupAggregateFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputRowType,
scala.collection.Seq<TypeInformation<?>> inputFieldTypes,
int[] groupings,
StreamQueryConfig queryConfig,
boolean generateRetraction,
boolean consumeRetraction)
Create a
ProcessFunction for group (without
window) aggregates to evaluate the final aggregate value. |
static ProcessFunction<CRow,CRow> |
AggregateUtil.createUnboundedOverProcessFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
TypeInformation<Row> inputTypeInfo,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
StreamQueryConfig queryConfig,
boolean isRowTimeType,
boolean isPartitioned,
boolean isRowsClause)
Create a
ProcessFunction for an unbounded OVER
window to evaluate the final aggregate value. |
ProcessFunction<CRow,CRow> |
AggregateUtil$.createUnboundedOverProcessFunction(CodeGenerator generator,
scala.collection.Seq<org.apache.calcite.util.Pair<org.apache.calcite.rel.core.AggregateCall,String>> namedAggregates,
org.apache.calcite.rel.type.RelDataType inputType,
TypeInformation<Row> inputTypeInfo,
scala.collection.Seq<TypeInformation<?>> inputFieldTypeInfo,
StreamQueryConfig queryConfig,
boolean isRowTimeType,
boolean isPartitioned,
boolean isRowsClause)
Create a
ProcessFunction for an unbounded OVER
window to evaluate the final aggregate value. |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Row> |
ValuesInputFormat.getProducedType() |
TypeInformation<CRow> |
CRowValuesInputFormat.getProducedType() |
TypeInformation<Row> |
ValuesInputFormat.returnType() |
TypeInformation<CRow> |
CRowValuesInputFormat.returnType() |
Constructor and Description |
---|
CRowValuesInputFormat(String name,
String code,
TypeInformation<CRow> returnType) |
ValuesInputFormat(String name,
String code,
TypeInformation<Row> returnType) |
Modifier and Type | Class and Description |
---|---|
class |
CRowTypeInfo |
Modifier and Type | Method and Description |
---|---|
<X> TypeInformation<X> |
CRowTypeInfo.getTypeAt(int pos) |
<X> TypeInformation<X> |
CRowTypeInfo.getTypeAt(String fieldExpression) |
Modifier and Type | Method and Description |
---|---|
static Map<String,TypeInformation<?>> |
CRowTypeInfo.getGenericParameters() |
Modifier and Type | Method and Description |
---|---|
static CRowTypeInfo |
CRowTypeInfo.apply(TypeInformation<Row> rowType) |
CRowTypeInfo |
CRowTypeInfo$.apply(TypeInformation<Row> rowType) |
Modifier and Type | Method and Description |
---|---|
static TypeInformation<?>[] |
CsvTableSink.getFieldTypes() |
TypeInformation<?>[] |
TableSinkBase.getFieldTypes()
Returns the field types of the
Table to emit. |
TypeInformation<?>[] |
TableSink.getFieldTypes()
Returns the types of the table fields.
|
TypeInformation<Row> |
CsvTableSink.getOutputType() |
TypeInformation<T> |
TableSink.getOutputType()
Returns the type expected by this
TableSink . |
TypeInformation<T> |
UpsertStreamTableSink.getRecordType()
Returns the requested record type.
|
TypeInformation<T> |
RetractStreamTableSink.getRecordType()
Returns the requested record type.
|
Modifier and Type | Method and Description |
---|---|
static TableSink<T> |
CsvTableSink.configure(String[] fieldNames,
TypeInformation<?>[] fieldTypes) |
TableSink<T> |
TableSinkBase.configure(String[] fieldNames,
TypeInformation<?>[] fieldTypes)
Returns a copy of this
TableSink configured with the field names and types of the
Table to emit. |
TableSink<T> |
TableSink.configure(String[] fieldNames,
TypeInformation<?>[] fieldTypes)
Returns a copy of this
TableSink configured with the field names and types of the
Table to emit. |
Modifier and Type | Method and Description |
---|---|
TypeInformation<T> |
TableSource.getReturnType()
Returns the
TypeInformation for the return type of the TableSource . |
Modifier and Type | Method and Description |
---|---|
CsvTableSource.Builder |
CsvTableSource.Builder.field(String fieldName,
TypeInformation<?> fieldType)
Adds a field with the field name and the type information.
|
Constructor and Description |
---|
CsvTableSource(String path,
String[] fieldNames,
TypeInformation<?>[] fieldTypes)
A
BatchTableSource and StreamTableSource for simple CSV files with a
(logically) unlimited number of fields. |
CsvTableSource(String path,
String[] fieldNames,
TypeInformation<?>[] fieldTypes,
String fieldDelim,
String rowDelim,
Character quoteCharacter,
boolean ignoreFirstLine,
String ignoreComments,
boolean lenient) |
Modifier and Type | Class and Description |
---|---|
class |
InternalTypeInfo<T>
TypeInformation for internal types of the Table API that are for translation purposes only
and should not be contained in the final plan.
|
class |
RowIntervalTypeInfo
TypeInformation for row intervals.
|
class |
TimeIndicatorTypeInfo
Type information for indicating event or processing time.
|
class |
TimeIntervalTypeInfo<T>
TypeInformation for SQL INTERVAL types.
|
Modifier and Type | Method and Description |
---|---|
static Map<String,TypeInformation<?>> |
TimeIndicatorTypeInfo.getGenericParameters() |
static Map<String,TypeInformation<?>> |
RowIntervalTypeInfo.getGenericParameters() |
static Map<String,TypeInformation<?>> |
TimeIntervalTypeInfo.getGenericParameters() |
static scala.collection.IndexedSeq<TypeInformation<?>> |
TypeCoercion.numericWideningPrecedence() |
scala.collection.IndexedSeq<TypeInformation<?>> |
TypeCoercion$.numericWideningPrecedence() |
static scala.Option<TypeInformation<?>> |
TypeCoercion.widerTypeOf(TypeInformation<?> tp1,
TypeInformation<?> tp2) |
scala.Option<TypeInformation<?>> |
TypeCoercion$.widerTypeOf(TypeInformation<?> tp1,
TypeInformation<?> tp2) |
Modifier and Type | Method and Description |
---|---|
ValidationResult |
TypeCheckUtils$.assertNumericExpr(TypeInformation<?> dataType,
String caller) |
static ValidationResult |
TypeCheckUtils.assertNumericExpr(TypeInformation<?> dataType,
String caller) |
ValidationResult |
TypeCheckUtils$.assertOrderableExpr(TypeInformation<?> dataType,
String caller) |
static ValidationResult |
TypeCheckUtils.assertOrderableExpr(TypeInformation<?> dataType,
String caller) |
static boolean |
TypeCoercion.canCast(TypeInformation<?> from,
TypeInformation<?> to)
All the supported cast types in flink-table.
|
boolean |
TypeCoercion$.canCast(TypeInformation<?> from,
TypeInformation<?> to)
All the supported cast types in flink-table.
|
static boolean |
TypeCoercion.canSafelyCast(TypeInformation<?> from,
TypeInformation<?> to)
Tests if we can cast safely without loss of information.
|
boolean |
TypeCoercion$.canSafelyCast(TypeInformation<?> from,
TypeInformation<?> to)
Tests if we can cast safely without loss of information.
|
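The difference between `canCast` and `canSafelyCast` is that an explicit cast may narrow (and therefore lose information), while a safe cast must only widen. A minimal sketch of that distinction, again using strings as stand-ins for `TypeInformation` and an assumed numeric precedence list:

```java
import java.util.List;

// Hypothetical sketch contrasting TypeCoercion.canCast with
// canSafelyCast: explicit casts between numeric types are always
// allowed (even lossy narrowing like DOUBLE -> INT), but a safe cast
// must move toward a wider type. The real methods cover many more
// type families than numerics.
public class CastSketch {
    // Assumed widening order; not the real Flink precedence sequence.
    static final List<String> NUMERIC =
            List.of("BYTE", "SHORT", "INT", "LONG", "FLOAT", "DOUBLE");

    // Explicit casts between any two numeric types are supported.
    static boolean canCast(String from, String to) {
        return NUMERIC.contains(from) && NUMERIC.contains(to);
    }

    // A cast is safe only when it widens, i.e. never loses information.
    static boolean canSafelyCast(String from, String to) {
        int i = NUMERIC.indexOf(from);
        int j = NUMERIC.indexOf(to);
        return i >= 0 && j >= 0 && i <= j;
    }
}
```

So `DOUBLE -> INT` passes `canCast` but fails `canSafelyCast`, whereas `INT -> DOUBLE` passes both.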
boolean |
TypeCheckUtils$.isAdvanced(TypeInformation<?> dataType)
Checks if type information is an advanced type that can be converted to a
SQL type but NOT vice versa.
|
static boolean |
TypeCheckUtils.isAdvanced(TypeInformation<?> dataType)
Checks if type information is an advanced type that can be converted to a
SQL type but NOT vice versa.
|
boolean |
TypeCheckUtils$.isArray(TypeInformation<?> dataType) |
static boolean |
TypeCheckUtils.isArray(TypeInformation<?> dataType) |
boolean |
TypeCheckUtils$.isBoolean(TypeInformation<?> dataType) |
static boolean |
TypeCheckUtils.isBoolean(TypeInformation<?> dataType) |
boolean |
TypeCheckUtils$.isComparable(TypeInformation<?> dataType) |
static boolean |
TypeCheckUtils.isComparable(TypeInformation<?> dataType) |
boolean |
TypeCheckUtils$.isDecimal(TypeInformation<?> dataType) |
static boolean |
TypeCheckUtils.isDecimal(TypeInformation<?> dataType) |
boolean |
TypeCheckUtils$.isInteger(TypeInformation<?> dataType) |
static boolean |
TypeCheckUtils.isInteger(TypeInformation<?> dataType) |
boolean |
TypeCheckUtils$.isLong(TypeInformation<?> dataType) |
static boolean |
TypeCheckUtils.isLong(TypeInformation<?> dataType) |
boolean |
TypeCheckUtils$.isNumeric(TypeInformation<?> dataType) |
static boolean |
TypeCheckUtils.isNumeric(TypeInformation<?> dataType) |
boolean |
TypeCheckUtils$.isSimple(TypeInformation<?> dataType)
Checks if type information is a simple type that can be converted to a
SQL type and vice versa.
|
static boolean |
TypeCheckUtils.isSimple(TypeInformation<?> dataType)
Checks if type information is a simple type that can be converted to a
SQL type and vice versa.
|
boolean |
TypeCheckUtils$.isString(TypeInformation<?> dataType) |
static boolean |
TypeCheckUtils.isString(TypeInformation<?> dataType) |
boolean |
TypeCheckUtils$.isTemporal(TypeInformation<?> dataType) |
static boolean |
TypeCheckUtils.isTemporal(TypeInformation<?> dataType) |
boolean |
TypeCheckUtils$.isTimeInterval(TypeInformation<?> dataType) |
static boolean |
TypeCheckUtils.isTimeInterval(TypeInformation<?> dataType) |
boolean |
TypeCheckUtils$.isTimePoint(TypeInformation<?> dataType) |
static boolean |
TypeCheckUtils.isTimePoint(TypeInformation<?> dataType) |
static scala.Option<TypeInformation<?>> |
TypeCoercion.widerTypeOf(TypeInformation<?> tp1,
TypeInformation<?> tp2) |
scala.Option<TypeInformation<?>> |
TypeCoercion$.widerTypeOf(TypeInformation<?> tp1,
TypeInformation<?> tp2) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<T> |
OutputTag.getTypeInfo() |
Constructor and Description |
---|
OutputTag(String id,
TypeInformation<T> typeInfo)
Creates a new named
OutputTag with the given id and output TypeInformation . |
Copyright © 2014–2018 The Apache Software Foundation. All rights reserved.