Modifier and Type | Method and Description |
---|---|
<S> OperatorState<S> |
RuntimeContext.getKeyValueState(String name,
TypeInformation<S> stateType,
S defaultState)
Deprecated.
Use the more expressive
RuntimeContext.getState(ValueStateDescriptor) instead. |
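The deprecation note above can be illustrated with a short migration sketch. The function, the state name `"count"`, and the default value are hypothetical; the descriptor-based `getState(...)` call is the replacement the note points to.

```java
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.configuration.Configuration;

// Hypothetical keyed function migrating off the deprecated
// RuntimeContext.getKeyValueState(name, stateType, defaultState).
public class CountingMapper extends RichMapFunction<String, Long> {

    private transient ValueState<Long> count;

    @Override
    public void open(Configuration parameters) {
        // Old (deprecated):
        // OperatorState<Long> count =
        //     getRuntimeContext().getKeyValueState("count", BasicTypeInfo.LONG_TYPE_INFO, 0L);

        // New: describe the state once, then ask the runtime context for it.
        ValueStateDescriptor<Long> descriptor =
                new ValueStateDescriptor<>("count", BasicTypeInfo.LONG_TYPE_INFO, 0L);
        count = getRuntimeContext().getState(descriptor);
    }

    @Override
    public Long map(String value) throws Exception {
        long next = count.value() + 1;
        count.update(next);
        return next;
    }
}
```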
Modifier and Type | Method and Description |
---|---|
<S> OperatorState<S> |
AbstractRuntimeUDFContext.getKeyValueState(String name,
TypeInformation<S> stateType,
S defaultState)
Deprecated.
|
Modifier and Type | Field and Description |
---|---|
protected TypeInformation<IN> |
UnaryOperatorInformation.inputType
Input type of the operator
|
protected TypeInformation<IN1> |
BinaryOperatorInformation.inputType1
Input type of the first input
|
protected TypeInformation<IN2> |
BinaryOperatorInformation.inputType2
Input type of the second input
|
protected TypeInformation<OUT> |
OperatorInformation.outputType
Output type of the operator
|
Modifier and Type | Method and Description |
---|---|
TypeInformation<IN1> |
BinaryOperatorInformation.getFirstInputType() |
TypeInformation<IN> |
UnaryOperatorInformation.getInputType() |
TypeInformation<T> |
Keys.SelectorFunctionKeys.getInputType() |
abstract TypeInformation<?>[] |
Keys.getKeyFieldTypes() |
TypeInformation<?>[] |
Keys.SelectorFunctionKeys.getKeyFieldTypes() |
TypeInformation<?>[] |
Keys.ExpressionKeys.getKeyFieldTypes() |
TypeInformation<K> |
Keys.SelectorFunctionKeys.getKeyType() |
TypeInformation<OUT> |
OperatorInformation.getOutputType()
Gets the return type of the user code function.
|
TypeInformation<IN2> |
BinaryOperatorInformation.getSecondInputType() |
Modifier and Type | Method and Description |
---|---|
static boolean |
Keys.ExpressionKeys.isSortKey(int fieldPos,
TypeInformation<?> type) |
static boolean |
Keys.ExpressionKeys.isSortKey(String fieldExpr,
TypeInformation<?> type) |
abstract <E> void |
Keys.validateCustomPartitioner(Partitioner<E> partitioner,
TypeInformation<E> typeInfo) |
<E> void |
Keys.SelectorFunctionKeys.validateCustomPartitioner(Partitioner<E> partitioner,
TypeInformation<E> typeInfo) |
<E> void |
Keys.ExpressionKeys.validateCustomPartitioner(Partitioner<E> partitioner,
TypeInformation<E> typeInfo) |
Constructor and Description |
---|
BinaryOperatorInformation(TypeInformation<IN1> inputType1,
TypeInformation<IN2> inputType2,
TypeInformation<OUT> outputType) |
ExpressionKeys(int[] keyPositions,
TypeInformation<T> type)
Create int-based (non-nested) field position keys on a tuple type.
|
ExpressionKeys(int[] keyPositions,
TypeInformation<T> type,
boolean allowEmpty)
Create int-based (non-nested) field position keys on a tuple type.
|
ExpressionKeys(int keyPosition,
TypeInformation<T> type)
Create int-based (non-nested) field position keys on a tuple type.
|
ExpressionKeys(String[] keyExpressions,
TypeInformation<T> type)
Create String-based (nested) field expression keys on a composite type.
|
ExpressionKeys(String keyExpression,
TypeInformation<T> type)
Create String-based (nested) field expression keys on a composite type.
|
ExpressionKeys(TypeInformation<T> type)
ExpressionKeys that is defined by the full data type.
|
IncompatibleKeysException(TypeInformation<?> typeInformation,
TypeInformation<?> typeInformation2) |
OperatorInformation(TypeInformation<OUT> outputType) |
SelectorFunctionKeys(KeySelector<T,K> keyExtractor,
TypeInformation<T> inputType,
TypeInformation<K> keyType) |
UnaryOperatorInformation(TypeInformation<IN> inputType,
TypeInformation<OUT> outputType) |
Constructor and Description |
---|
FoldingStateDescriptor(String name,
ACC initialValue,
FoldFunction<T,ACC> foldFunction,
TypeInformation<ACC> typeInfo)
Creates a new
FoldingStateDescriptor with the given name and default value. |
ListStateDescriptor(String name,
TypeInformation<T> typeInfo)
Creates a new
ListStateDescriptor with the given name and list element type. |
ReducingStateDescriptor(String name,
ReduceFunction<T> reduceFunction,
TypeInformation<T> typeInfo)
Creates a new
ReducingStateDescriptor with the given name, reduce function, and type information. |
StateDescriptor(String name,
TypeInformation<T> typeInfo,
T defaultValue)
Create a new
StateDescriptor with the given name and the given type information. |
ValueStateDescriptor(String name,
TypeInformation<T> typeInfo,
T defaultValue)
Creates a new
ValueStateDescriptor with the given name and default value. |
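The descriptor constructors above share a pattern: a name, optionally a function, and the state's `TypeInformation`. A minimal sketch constructing two of them; the state names `"buffered"` and `"sum"` are hypothetical.

```java
import org.apache.flink.api.common.state.ListStateDescriptor;
import org.apache.flink.api.common.state.ReducingStateDescriptor;
import org.apache.flink.api.common.typeinfo.BasicTypeInfo;

// Sketch: building state descriptors from a name plus TypeInformation.
public class DescriptorExamples {
    public static void main(String[] args) {
        // List state holding Long elements:
        ListStateDescriptor<Long> buffered =
                new ListStateDescriptor<>("buffered", BasicTypeInfo.LONG_TYPE_INFO);

        // Reducing state that keeps a running sum:
        ReducingStateDescriptor<Long> sum = new ReducingStateDescriptor<>(
                "sum",
                (a, b) -> a + b,
                BasicTypeInfo.LONG_TYPE_INFO);

        System.out.println(buffered.getName() + " / " + sum.getName());
    }
}
```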
Modifier and Type | Class and Description |
---|---|
class |
BasicArrayTypeInfo<T,C> |
class |
BasicTypeInfo<T>
Type information for primitive types (int, long, double, byte, ...), String, Date, and Void.
|
class |
FractionalTypeInfo<T>
Type information for numeric fractional primitive types (double, float).
|
class |
IntegerTypeInfo<T>
Type information for numeric integer primitive types: int, long, byte, short, character.
|
class |
NothingTypeInfo
Placeholder type information for the
Nothing type. |
class |
NumericTypeInfo<T>
Type information for numeric primitive types: int, long, double, byte, short, float, char.
|
class |
PrimitiveArrayTypeInfo<T>
A
TypeInformation for arrays of primitive types (int, long, double, ...). |
Modifier and Type | Method and Description |
---|---|
TypeInformation<C> |
BasicArrayTypeInfo.getComponentInfo() |
TypeInformation<?> |
PrimitiveArrayTypeInfo.getComponentType()
Gets the type information of the component type.
|
TypeInformation<T> |
TypeHint.getTypeInfo()
Gets the type information described by this TypeHint.
|
static <T> TypeInformation<T> |
TypeInformation.of(Class<T> typeClass)
Creates a TypeInformation for the type described by the given class.
|
static <T> TypeInformation<T> |
TypeInformation.of(TypeHint<T> typeHint)
Creates a TypeInformation for a generic type via a utility "type hint".
|
Modifier and Type | Method and Description |
---|---|
List<TypeInformation<?>> |
TypeInformation.getGenericParameters()
Returns the generic parameters of this type.
|
Modifier and Type | Class and Description |
---|---|
class |
CompositeType<T>
Base type information class for Tuple and POJO types.
The class also takes care of serialization and comparators for Tuples.
|
Modifier and Type | Method and Description |
---|---|
TypeInformation<?> |
CompositeType.FlatFieldDescriptor.getType() |
abstract <X> TypeInformation<X> |
CompositeType.getTypeAt(int pos)
Returns the type of the (unnested) field at the given field position.
|
abstract <X> TypeInformation<X> |
CompositeType.getTypeAt(String fieldExpression)
Returns the type of the (nested) field at the given field expression position.
|
Constructor and Description |
---|
FlatFieldDescriptor(int keyPosition,
TypeInformation<?> type) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<T> |
DataSet.getType()
Returns the
TypeInformation for the type of this DataSet. |
Modifier and Type | Method and Description |
---|---|
<X> DataSource<X> |
ExecutionEnvironment.createInput(InputFormat<X,?> inputFormat,
TypeInformation<X> producedType)
Generic method to create an input DataSet with the given
InputFormat. |
protected void |
DataSet.fillInType(TypeInformation<T> typeInfo)
Tries to fill in the type information.
|
<X> DataSource<X> |
ExecutionEnvironment.fromCollection(Collection<X> data,
TypeInformation<X> type)
Creates a DataSet from the given non-empty collection.
|
<X> DataSource<X> |
ExecutionEnvironment.fromCollection(Iterator<X> data,
TypeInformation<X> type)
Creates a DataSet from the given iterator.
|
<X> DataSource<X> |
ExecutionEnvironment.fromParallelCollection(SplittableIterator<X> iterator,
TypeInformation<X> type)
Creates a new data set that contains elements in the iterator.
|
static <T> String |
Utils.getSerializerTree(TypeInformation<T> ti)
Debugging utility to understand the hierarchy of serializers created by the Java API.
|
Constructor and Description |
---|
DataSet(ExecutionEnvironment context,
TypeInformation<T> typeInfo) |
Modifier and Type | Method and Description |
---|---|
static DualInputSemanticProperties |
SemanticPropUtil.createProjectionPropertiesDual(int[] fields,
boolean[] isFromFirst,
TypeInformation<?> inType1,
TypeInformation<?> inType2) |
static DualInputSemanticProperties |
SemanticPropUtil.getSemanticPropsDual(Set<Annotation> set,
TypeInformation<?> inType1,
TypeInformation<?> inType2,
TypeInformation<?> outType) |
static void |
SemanticPropUtil.getSemanticPropsDualFromString(DualInputSemanticProperties result,
String[] forwardedFirst,
String[] forwardedSecond,
String[] nonForwardedFirst,
String[] nonForwardedSecond,
String[] readFieldsFirst,
String[] readFieldsSecond,
TypeInformation<?> inType1,
TypeInformation<?> inType2,
TypeInformation<?> outType) |
static void |
SemanticPropUtil.getSemanticPropsDualFromString(DualInputSemanticProperties result,
String[] forwardedFirst,
String[] forwardedSecond,
String[] nonForwardedFirst,
String[] nonForwardedSecond,
String[] readFieldsFirst,
String[] readFieldsSecond,
TypeInformation<?> inType1,
TypeInformation<?> inType2,
TypeInformation<?> outType,
boolean skipIncompatibleTypes) |
static SingleInputSemanticProperties |
SemanticPropUtil.getSemanticPropsSingle(Set<Annotation> set,
TypeInformation<?> inType,
TypeInformation<?> outType) |
static void |
SemanticPropUtil.getSemanticPropsSingleFromString(SingleInputSemanticProperties result,
String[] forwarded,
String[] nonForwarded,
String[] readSet,
TypeInformation<?> inType,
TypeInformation<?> outType) |
static void |
SemanticPropUtil.getSemanticPropsSingleFromString(SingleInputSemanticProperties result,
String[] forwarded,
String[] nonForwarded,
String[] readSet,
TypeInformation<?> inType,
TypeInformation<?> outType,
boolean skipIncompatibleTypes) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Tuple2<K,V>> |
HadoopInputFormat.getProducedType() |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Tuple2<K,V>> |
HadoopInputFormat.getProducedType() |
Modifier and Type | Method and Description |
---|---|
TypeInformation<T> |
TypeSerializerInputFormat.getProducedType() |
TypeInformation<E> |
AvroInputFormat.getProducedType() |
Modifier and Type | Method and Description |
---|---|
void |
TypeSerializerOutputFormat.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig) |
void |
LocalCollectionOutputFormat.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig) |
void |
CsvOutputFormat.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig)
The purpose of this method is solely to check whether the data type to be processed
is in fact a tuple type.
|
Constructor and Description |
---|
SplitDataProperties(TypeInformation<T> type)
Creates SplitDataProperties for the given data type.
|
TypeSerializerInputFormat(TypeInformation<T> resultType) |
Modifier and Type | Method and Description |
---|---|
static <T,K> TypeInformation<Tuple2<K,T>> |
KeyFunctions.createTypeWithKey(Keys.SelectorFunctionKeys<T,K> key) |
static <T,K1,K2> TypeInformation<Tuple3<K1,K2,T>> |
KeyFunctions.createTypeWithKey(Keys.SelectorFunctionKeys<T,K1> key1,
Keys.SelectorFunctionKeys<T,K2> key2) |
TypeInformation<IN1> |
TwoInputOperator.getInput1Type()
Gets the type information of the data type of the first input data set.
|
TypeInformation<IN2> |
TwoInputOperator.getInput2Type()
Gets the type information of the data type of the second input data set.
|
TypeInformation<IN> |
SingleInputOperator.getInputType()
Gets the type information of the data type of the input data set.
|
TypeInformation<OUT> |
Operator.getResultType()
Returns the type of the result of this operator.
|
TypeInformation<T> |
DataSink.getType() |
TypeInformation<WT> |
DeltaIterationResultSet.getWorksetType() |
Modifier and Type | Method and Description |
---|---|
O |
TwoInputUdfOperator.returns(TypeInformation<OUT> typeInfo)
Adds a type information hint about the return type of this operator.
|
O |
SingleInputUdfOperator.returns(TypeInformation<OUT> typeInfo)
Adds a type information hint about the return type of this operator.
|
Constructor and Description |
---|
CoGroupOperator(DataSet<I1> input1,
DataSet<I2> input2,
Keys<I1> keys1,
Keys<I2> keys2,
CoGroupFunction<I1,I2,OUT> function,
TypeInformation<OUT> returnType,
List<org.apache.commons.lang3.tuple.Pair<Integer,Order>> groupSortKeyOrderFirst,
List<org.apache.commons.lang3.tuple.Pair<Integer,Order>> groupSortKeyOrderSecond,
Partitioner<?> customPartitioner,
String defaultName) |
CoGroupOperator(DataSet<I1> input1,
DataSet<I2> input2,
Keys<I1> keys1,
Keys<I2> keys2,
CoGroupFunction<I1,I2,OUT> function,
TypeInformation<OUT> returnType,
Partitioner<?> customPartitioner,
String defaultName) |
CoGroupRawOperator(DataSet<I1> input1,
DataSet<I2> input2,
Keys<I1> keys1,
Keys<I2> keys2,
CoGroupFunction<I1,I2,OUT> function,
TypeInformation<OUT> returnType,
String defaultName) |
CrossOperator(DataSet<I1> input1,
DataSet<I2> input2,
CrossFunction<I1,I2,OUT> function,
TypeInformation<OUT> returnType,
CrossOperatorBase.CrossHint hint,
String defaultName) |
DataSink(DataSet<T> data,
OutputFormat<T> format,
TypeInformation<T> type) |
DataSource(ExecutionEnvironment context,
InputFormat<OUT,?> inputFormat,
TypeInformation<OUT> type,
String dataSourceLocationName)
Creates a new data source.
|
DeltaIteration(ExecutionEnvironment context,
TypeInformation<ST> type,
DataSet<ST> solutionSet,
DataSet<WT> workset,
Keys<ST> keys,
int maxIterations) |
EquiJoin(DataSet<I1> input1,
DataSet<I2> input2,
Keys<I1> keys1,
Keys<I2> keys2,
FlatJoinFunction<I1,I2,OUT> generatedFunction,
JoinFunction<I1,I2,OUT> function,
TypeInformation<OUT> returnType,
JoinOperatorBase.JoinHint hint,
String joinLocationName) |
EquiJoin(DataSet<I1> input1,
DataSet<I2> input2,
Keys<I1> keys1,
Keys<I2> keys2,
FlatJoinFunction<I1,I2,OUT> generatedFunction,
JoinFunction<I1,I2,OUT> function,
TypeInformation<OUT> returnType,
JoinOperatorBase.JoinHint hint,
String joinLocationName,
JoinType type) |
EquiJoin(DataSet<I1> input1,
DataSet<I2> input2,
Keys<I1> keys1,
Keys<I2> keys2,
FlatJoinFunction<I1,I2,OUT> function,
TypeInformation<OUT> returnType,
JoinOperatorBase.JoinHint hint,
String joinLocationName) |
EquiJoin(DataSet<I1> input1,
DataSet<I2> input2,
Keys<I1> keys1,
Keys<I2> keys2,
FlatJoinFunction<I1,I2,OUT> function,
TypeInformation<OUT> returnType,
JoinOperatorBase.JoinHint hint,
String joinLocationName,
JoinType type) |
FlatMapOperator(DataSet<IN> input,
TypeInformation<OUT> resultType,
FlatMapFunction<IN,OUT> function,
String defaultName) |
GroupCombineOperator(DataSet<IN> input,
TypeInformation<OUT> resultType,
GroupCombineFunction<IN,OUT> function,
String defaultName)
Constructor for a non-grouped reduce (all reduce).
|
GroupCombineOperator(Grouping<IN> input,
TypeInformation<OUT> resultType,
GroupCombineFunction<IN,OUT> function,
String defaultName)
Constructor for a grouped reduce.
|
GroupReduceOperator(DataSet<IN> input,
TypeInformation<OUT> resultType,
GroupReduceFunction<IN,OUT> function,
String defaultName)
Constructor for a non-grouped reduce (all reduce).
|
GroupReduceOperator(Grouping<IN> input,
TypeInformation<OUT> resultType,
GroupReduceFunction<IN,OUT> function,
String defaultName)
Constructor for a grouped reduce.
|
IterativeDataSet(ExecutionEnvironment context,
TypeInformation<T> type,
DataSet<T> input,
int maxIterations) |
JoinOperator(DataSet<I1> input1,
DataSet<I2> input2,
Keys<I1> keys1,
Keys<I2> keys2,
TypeInformation<OUT> returnType,
JoinOperatorBase.JoinHint hint,
JoinType type) |
MapOperator(DataSet<IN> input,
TypeInformation<OUT> resultType,
MapFunction<IN,OUT> function,
String defaultName) |
MapPartitionOperator(DataSet<IN> input,
TypeInformation<OUT> resultType,
MapPartitionFunction<IN,OUT> function,
String defaultName) |
Operator(ExecutionEnvironment context,
TypeInformation<OUT> resultType) |
PartitionOperator(DataSet<T> input,
Keys<T> pKeys,
Partitioner<P> customPartitioner,
TypeInformation<P> partitionerTypeInfo,
String partitionLocationName) |
SingleInputOperator(DataSet<IN> input,
TypeInformation<OUT> resultType) |
SingleInputUdfOperator(DataSet<IN> input,
TypeInformation<OUT> resultType)
Creates a new operator with the given data set as input.
|
TwoInputOperator(DataSet<IN1> input1,
DataSet<IN2> input2,
TypeInformation<OUT> resultType) |
TwoInputUdfOperator(DataSet<IN1> input1,
DataSet<IN2> input2,
TypeInformation<OUT> resultType)
Creates a new operator with the two given data sets as inputs.
|
Modifier and Type | Method and Description |
---|---|
static TaggedValue |
UdfAnalyzerUtils.convertTypeInfoToTaggedValue(TaggedValue.Input input,
TypeInformation<?> typeInfo,
String flatFieldExpr,
List<CompositeType.FlatFieldDescriptor> flatFieldDesc,
int[] groupedKeys) |
Constructor and Description |
---|
UdfAnalyzer(Class<?> baseClass,
Class<?> udfClass,
String externalUdfName,
TypeInformation<?> in1Type,
TypeInformation<?> in2Type,
TypeInformation<?> outType,
Keys<?> keys1,
Keys<?> keys2,
boolean throwErrorExceptions) |
Modifier and Type | Method and Description |
---|---|
<A> DataSet<A> |
JavaBatchTranslator.translate(PlanNode op,
TypeInformation<A> tpe) |
<A> DataStream<A> |
JavaStreamingTranslator.translate(PlanNode op,
TypeInformation<A> tpe) |
Modifier and Type | Method and Description |
---|---|
<A> Table |
JavaBatchTranslator.createTable(DataSet<A> repr,
CompositeType<A> inputType,
Expression[] expressions,
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> resultFields) |
<A> Table |
JavaStreamingTranslator.createTable(DataStream<A> repr,
CompositeType<A> inputType,
Expression[] expressions,
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> resultFields) |
Modifier and Type | Class and Description |
---|---|
class |
AvroTypeInfo<T extends org.apache.avro.specific.SpecificRecordBase>
Special type information for Avro POJOs (types implementing SpecificRecordBase, the typed Avro POJOs).
It performs a regular POJO type analysis and replaces every
GenericType<CharSequence>
with a GenericType<avro.Utf8>. |
class |
EitherTypeInfo<L,R>
A
TypeInformation for the Either type of the Java API. |
class |
EnumTypeInfo<T extends Enum<T>>
A
TypeInformation for java enumeration types. |
class |
GenericTypeInfo<T> |
class |
MissingTypeInfo
A special type information signifying that the type extraction failed.
|
class |
ObjectArrayTypeInfo<T,C> |
class |
PojoTypeInfo<T>
TypeInformation for "Java Beans"-style types.
|
class |
TupleTypeInfo<T extends Tuple>
A
TypeInformation for the tuple types of the Java API. |
class |
TupleTypeInfoBase<T> |
class |
ValueTypeInfo<T extends Value>
Type information for data types that extend the
Value interface. |
class |
WritableTypeInfo<T extends org.apache.hadoop.io.Writable>
Type information for data types that extend Hadoop's
Writable interface. |
Modifier and Type | Field and Description |
---|---|
protected TypeInformation<?>[] |
TupleTypeInfoBase.types |
Modifier and Type | Method and Description |
---|---|
protected <OUT,IN1,IN2> |
TypeExtractor.analyzePojo(Class<OUT> clazz,
ArrayList<Type> typeHierarchy,
ParameterizedType parameterizedType,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.createTypeInfo(Class<?> baseClass,
Class<?> clazz,
int returnParamPos,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <T> TypeInformation<T> |
TypeExtractor.createTypeInfo(Class<T> type) |
static <OUT> TypeInformation<OUT> |
TypeExtractor.createTypeInfo(Object instance,
Class<?> baseClass,
Class<?> clazz,
int returnParamPos)
Creates a
TypeInformation from the given parameters. |
static TypeInformation<?> |
TypeExtractor.createTypeInfo(Type t) |
static <IN1,IN2,OUT> |
TypeExtractor.getBinaryOperatorReturnType(Function function,
Class<?> baseClass,
boolean hasIterables,
boolean hasCollector,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing)
Returns the binary operator's return type.
|
static <IN1,IN2,OUT> |
TypeExtractor.getBinaryOperatorReturnType(Function function,
Class<?> baseClass,
int inputTypeArgumentIndex,
int outputTypeArgumentIndex,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing)
Returns the binary operator's return type.
|
static <IN1,IN2,OUT> |
TypeExtractor.getCoGroupReturnTypes(CoGroupFunction<IN1,IN2,OUT> coGroupInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.getCoGroupReturnTypes(CoGroupFunction<IN1,IN2,OUT> coGroupInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
TypeInformation<C> |
ObjectArrayTypeInfo.getComponentInfo() |
static <IN1,IN2,OUT> |
TypeExtractor.getCrossReturnTypes(CrossFunction<IN1,IN2,OUT> crossInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.getCrossReturnTypes(CrossFunction<IN1,IN2,OUT> crossInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN1,IN2,OUT> |
TypeExtractor.getFlatJoinReturnTypes(FlatJoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.getFlatJoinReturnTypes(FlatJoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatMapReturnTypes(FlatMapFunction<IN,OUT> flatMapInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatMapReturnTypes(FlatMapFunction<IN,OUT> flatMapInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFoldReturnTypes(FoldFunction<IN,OUT> foldInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFoldReturnTypes(FoldFunction<IN,OUT> foldInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <X> TypeInformation<X> |
TypeExtractor.getForClass(Class<X> clazz)
Creates type information from a given Class such as Integer, String[] or POJOs.
|
static <X> TypeInformation<X> |
TypeExtractor.getForObject(X value) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupCombineReturnTypes(GroupCombineFunction<IN,OUT> combineInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupCombineReturnTypes(GroupCombineFunction<IN,OUT> combineInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupReduceReturnTypes(GroupReduceFunction<IN,OUT> groupReduceInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupReduceReturnTypes(GroupReduceFunction<IN,OUT> groupReduceInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN> TypeInformation<IN> |
TypeExtractor.getInputFormatTypes(InputFormat<IN,?> inputFormatInterface) |
static <IN1,IN2,OUT> |
TypeExtractor.getJoinReturnTypes(JoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.getJoinReturnTypes(JoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getKeySelectorTypes(KeySelector<IN,OUT> selectorInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getKeySelectorTypes(KeySelector<IN,OUT> selectorInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
TypeInformation<L> |
EitherTypeInfo.getLeftType() |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapPartitionReturnTypes(MapPartitionFunction<IN,OUT> mapPartitionInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapPartitionReturnTypes(MapPartitionFunction<IN,OUT> mapPartitionInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapReturnTypes(MapFunction<IN,OUT> mapInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapReturnTypes(MapFunction<IN,OUT> mapInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <T> TypeInformation<T> |
TypeExtractor.getPartitionerTypes(Partitioner<T> partitioner) |
static <T> TypeInformation<T> |
TypeExtractor.getPartitionerTypes(Partitioner<T> partitioner,
String functionName,
boolean allowMissing) |
TypeInformation<T> |
ResultTypeQueryable.getProducedType()
Gets the data type (as a
TypeInformation ) produced by this function or input format. |
TypeInformation<R> |
EitherTypeInfo.getRightType() |
<X> TypeInformation<X> |
TupleTypeInfoBase.getTypeAt(int pos) |
<X> TypeInformation<X> |
PojoTypeInfo.getTypeAt(int pos) |
<X> TypeInformation<X> |
TupleTypeInfoBase.getTypeAt(String fieldExpression) |
<X> TypeInformation<X> |
PojoTypeInfo.getTypeAt(String fieldExpression) |
TypeInformation<?> |
PojoField.getTypeInformation() |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getUnaryOperatorReturnType(Function function,
Class<?> baseClass,
boolean hasIterable,
boolean hasCollector,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing)
Returns the unary operator's return type.
|
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getUnaryOperatorReturnType(Function function,
Class<?> baseClass,
int inputTypeArgumentIndex,
int outputTypeArgumentIndex,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing)
Returns the unary operator's return type.
|
static <X> TypeInformation<X> |
TypeInfoParser.parse(String infoString)
Generates an instance of
TypeInformation by parsing a type
information string. |
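The table above lists several ways to obtain `TypeInformation` without writing it by hand. A hedged sketch of three of them; the parse-string syntax is `TypeInfoParser`'s own and the exact string shown is an assumption.

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.TypeExtractor;
import org.apache.flink.api.java.typeutils.TypeInfoParser;

// Sketch: extracting TypeInformation from a class, an object,
// and a type-information string.
public class ExtractionExample {
    public static void main(String[] args) {
        // From a class (works for basic types, arrays, POJOs):
        TypeInformation<String> byClass = TypeExtractor.getForClass(String.class);

        // From a sample object:
        TypeInformation<Integer> byObject = TypeExtractor.getForObject(42);

        // From a type string:
        TypeInformation<?> parsed = TypeInfoParser.parse("Tuple2<String, Integer>");

        System.out.println(byClass + " / " + byObject + " / " + parsed);
    }
}
```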
Modifier and Type | Method and Description |
---|---|
protected <OUT,IN1,IN2> |
TypeExtractor.analyzePojo(Class<OUT> clazz,
ArrayList<Type> typeHierarchy,
ParameterizedType parameterizedType,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.createTypeInfo(Class<?> baseClass,
Class<?> clazz,
int returnParamPos,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.getBinaryOperatorReturnType(Function function,
Class<?> baseClass,
boolean hasIterables,
boolean hasCollector,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing)
Returns the binary operator's return type.
|
static <IN1,IN2,OUT> |
TypeExtractor.getBinaryOperatorReturnType(Function function,
Class<?> baseClass,
int inputTypeArgumentIndex,
int outputTypeArgumentIndex,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing)
Returns the binary operator's return type.
|
static <IN1,IN2,OUT> |
TypeExtractor.getCoGroupReturnTypes(CoGroupFunction<IN1,IN2,OUT> coGroupInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> |
TypeExtractor.getCoGroupReturnTypes(CoGroupFunction<IN1,IN2,OUT> coGroupInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getCrossReturnTypes(CrossFunction<IN1,IN2,OUT> crossInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getCrossReturnTypes(CrossFunction<IN1,IN2,OUT> crossInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatJoinReturnTypes(FlatJoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatJoinReturnTypes(FlatJoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatMapReturnTypes(FlatMapFunction<IN,OUT> flatMapInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFlatMapReturnTypes(FlatMapFunction<IN,OUT> flatMapInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFoldReturnTypes(FoldFunction<IN,OUT> foldInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getFoldReturnTypes(FoldFunction<IN,OUT> foldInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupCombineReturnTypes(GroupCombineFunction<IN,OUT> combineInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupCombineReturnTypes(GroupCombineFunction<IN,OUT> combineInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupReduceReturnTypes(GroupReduceFunction<IN,OUT> groupReduceInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getGroupReduceReturnTypes(GroupReduceFunction<IN,OUT> groupReduceInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <T,C> ObjectArrayTypeInfo<T,C> |
ObjectArrayTypeInfo.getInfoFor(Class<T> arrayClass,
TypeInformation<C> componentInfo) |
static <T,C> ObjectArrayTypeInfo<T,C> |
ObjectArrayTypeInfo.getInfoFor(TypeInformation<C> componentInfo)
Creates a new
ObjectArrayTypeInfo from a
TypeInformation for the component type. |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getJoinReturnTypes(JoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type) |
static <IN1,IN2,OUT> TypeInformation<OUT> |
TypeExtractor.getJoinReturnTypes(JoinFunction<IN1,IN2,OUT> joinInterface,
TypeInformation<IN1> in1Type,
TypeInformation<IN2> in2Type,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getKeySelectorTypes(KeySelector<IN,OUT> selectorInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getKeySelectorTypes(KeySelector<IN,OUT> selectorInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapPartitionReturnTypes(MapPartitionFunction<IN,OUT> mapPartitionInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapPartitionReturnTypes(MapPartitionFunction<IN,OUT> mapPartitionInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapReturnTypes(MapFunction<IN,OUT> mapInterface,
TypeInformation<IN> inType) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getMapReturnTypes(MapFunction<IN,OUT> mapInterface,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing) |
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getUnaryOperatorReturnType(Function function,
Class<?> baseClass,
boolean hasIterable,
boolean hasCollector,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing)
Returns the unary operator's return type.
|
static <IN,OUT> TypeInformation<OUT> |
TypeExtractor.getUnaryOperatorReturnType(Function function,
Class<?> baseClass,
int inputTypeArgumentIndex,
int outputTypeArgumentIndex,
TypeInformation<IN> inType,
String functionName,
boolean allowMissing)
Returns the unary operator's return type.
|
void |
InputTypeConfigurable.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig)
Method that is called on an
OutputFormat when it is passed to
the DataSet's output method. |
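The TypeExtractor methods above recover a function's OUT type by reflecting on the generic interface the user class implements. The mechanism can be sketched with plain JDK reflection; the `Mapper` interface, `LengthMapper` class, and `extractOutputType` helper below are hypothetical stand-ins, not Flink API:

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;

public class TypeExtractionSketch {
    // Hypothetical stand-in for a Flink-style unary function interface.
    interface Mapper<IN, OUT> { OUT map(IN value); }

    // A user function with concrete type arguments.
    static class LengthMapper implements Mapper<String, Integer> {
        public Integer map(String value) { return value.length(); }
    }

    // Reads the OUT type argument of the Mapper interface implemented by
    // the given class -- roughly the role of outputTypeArgumentIndex in
    // getUnaryOperatorReturnType.
    static Type extractOutputType(Class<?> udfClass) {
        for (Type iface : udfClass.getGenericInterfaces()) {
            if (iface instanceof ParameterizedType
                    && ((ParameterizedType) iface).getRawType() == Mapper.class) {
                return ((ParameterizedType) iface).getActualTypeArguments()[1];
            }
        }
        throw new IllegalArgumentException("no concrete Mapper implementation found");
    }

    public static void main(String[] args) {
        System.out.println(extractOutputType(LengthMapper.class)); // class java.lang.Integer
    }
}
```

Flink's real implementation also handles lambdas, class hierarchies, and unresolved type variables, which is why getUnaryOperatorReturnType additionally takes the type argument indices and an allowMissing flag.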
Constructor and Description |
---|
EitherTypeInfo(TypeInformation<L> leftType,
TypeInformation<R> rightType) |
NamedFlatFieldDescriptor(String name,
int keyPosition,
TypeInformation<?> type) |
PojoField(Field field,
TypeInformation<?> type) |
TupleTypeInfo(Class<T> tupleType,
TypeInformation<?>... types) |
TupleTypeInfo(TypeInformation<?>... types) |
TupleTypeInfoBase(Class<T> tupleType,
TypeInformation<?>... types) |
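Constructors such as TupleTypeInfo(TypeInformation<?>... types) and EitherTypeInfo(leftType, rightType) show that composite type information is built by nesting the TypeInformation of its components. A minimal, hypothetical sketch of that nesting (TypeDesc is not a Flink class):

```java
import java.util.Arrays;
import java.util.List;

// Minimal, hypothetical stand-in for composite type information: like
// TupleTypeInfo or EitherTypeInfo, a composite node simply aggregates
// the type information of its components.
public class CompositeTypeSketch {
    static class TypeDesc {
        final String name;
        final List<TypeDesc> components;

        TypeDesc(String name, TypeDesc... components) {
            this.name = name;
            this.components = Arrays.asList(components);
        }

        int getArity() { return components.size(); }
    }

    public static void main(String[] args) {
        // Analogous to constructing a TupleTypeInfo from two component infos.
        TypeDesc tuple2 = new TypeDesc("Tuple2",
                new TypeDesc("Integer"), new TypeDesc("String"));
        System.out.println(tuple2.getArity()); // 2
    }
}
```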
Modifier and Type | Method and Description |
---|---|
static void |
Serializers.recursivelyRegisterType(TypeInformation<?> typeInfo,
ExecutionConfig config,
Set<Class<?>> alreadySeen) |
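Serializers.recursivelyRegisterType threads a Set<Class<?>> alreadySeen through the recursion so that cyclic type graphs (e.g. a POJO with a field referring back to its own type) terminate. A hedged sketch of that traversal pattern, with hypothetical Node and register names:

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch of the recursion pattern behind
// recursivelyRegisterType: walk a type graph, using an "alreadySeen"
// set so that cyclic references terminate.
public class RecursiveRegisterSketch {
    static class Node {
        final String name;
        Node[] children = new Node[0];
        Node(String name) { this.name = name; }
    }

    static void register(Node type, Set<String> alreadySeen, Set<String> registry) {
        if (!alreadySeen.add(type.name)) {
            return; // already visited: stops infinite recursion on cycles
        }
        registry.add(type.name);
        for (Node child : type.children) {
            register(child, alreadySeen, registry);
        }
    }

    public static void main(String[] args) {
        Node a = new Node("A");
        Node b = new Node("B");
        a.children = new Node[] { b };
        b.children = new Node[] { a }; // cycle A -> B -> A
        Set<String> registry = new HashSet<>();
        register(a, new HashSet<>(), registry);
        System.out.println(registry.size()); // 2
    }
}
```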
Modifier and Type | Method and Description |
---|---|
TypeInformation<T> |
DataSet.getType()
Returns the TypeInformation for the elements of this DataSet.
|
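DataSet.getType() exists because generic parameters are erased on the JVM at runtime, so the element type of a DataSet<T> must be carried explicitly as TypeInformation. A small demonstration of the erasure it compensates for:

```java
import java.util.ArrayList;
import java.util.List;

// Generic type erasure: a List<String> and a List<Integer> have the
// same runtime class, so the element type cannot be recovered from the
// object itself -- hence the explicit TypeInformation in Flink.
public class ErasureSketch {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();
        System.out.println(strings.getClass() == ints.getClass()); // true
    }
}
```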
Modifier and Type | Method and Description |
---|---|
<O> DataSet<O> |
CoGroupDataSet.apply(CoGroupFunction<L,R,O> coGrouper,
TypeInformation<O> evidence$5,
scala.reflect.ClassTag<O> evidence$6)
Creates a new
DataSet by passing each pair of co-grouped element lists to the given
function. |
<O> DataSet<O> |
CrossDataSet.apply(CrossFunction<L,R,O> crosser,
TypeInformation<O> evidence$3,
scala.reflect.ClassTag<O> evidence$4)
Creates a new
DataSet by passing each pair of values to the given function. |
<O> DataSet<O> |
JoinDataSet.apply(FlatJoinFunction<L,R,O> joiner,
TypeInformation<O> evidence$5,
scala.reflect.ClassTag<O> evidence$6)
Creates a new
DataSet by passing each pair of joined values to the given function. |
<O> DataSet<O> |
JoinFunctionAssigner.apply(FlatJoinFunction<L,R,O> fun,
TypeInformation<O> evidence$24,
scala.reflect.ClassTag<O> evidence$25) |
<O> DataSet<O> |
CoGroupDataSet.apply(scala.Function2<scala.collection.Iterator<L>,scala.collection.Iterator<R>,O> fun,
TypeInformation<O> evidence$1,
scala.reflect.ClassTag<O> evidence$2)
Creates a new
DataSet where the result for each pair of co-grouped element lists is the
result of the given function. |
<O> DataSet<O> |
CrossDataSet.apply(scala.Function2<L,R,O> fun,
TypeInformation<O> evidence$1,
scala.reflect.ClassTag<O> evidence$2)
Creates a new
DataSet where the result for each pair of elements is the result
of the given function. |
<O> DataSet<O> |
JoinDataSet.apply(scala.Function2<L,R,O> fun,
TypeInformation<O> evidence$1,
scala.reflect.ClassTag<O> evidence$2)
Creates a new
DataSet where the result for each pair of joined elements is the result
of the given function. |
<O> DataSet<O> |
JoinFunctionAssigner.apply(scala.Function2<L,R,O> fun,
TypeInformation<O> evidence$20,
scala.reflect.ClassTag<O> evidence$21) |
<O> DataSet<O> |
CoGroupDataSet.apply(scala.Function3<scala.collection.Iterator<L>,scala.collection.Iterator<R>,Collector<O>,scala.runtime.BoxedUnit> fun,
TypeInformation<O> evidence$3,
scala.reflect.ClassTag<O> evidence$4)
Creates a new
DataSet where the result for each pair of co-grouped element lists is the
result of the given function. |
<O> DataSet<O> |
JoinDataSet.apply(scala.Function3<L,R,Collector<O>,scala.runtime.BoxedUnit> fun,
TypeInformation<O> evidence$3,
scala.reflect.ClassTag<O> evidence$4)
Creates a new
DataSet by passing each pair of joined values to the given function. |
<O> DataSet<O> |
JoinFunctionAssigner.apply(scala.Function3<L,R,Collector<O>,scala.runtime.BoxedUnit> fun,
TypeInformation<O> evidence$22,
scala.reflect.ClassTag<O> evidence$23) |
<O> DataSet<O> |
JoinDataSet.apply(JoinFunction<L,R,O> fun,
TypeInformation<O> evidence$7,
scala.reflect.ClassTag<O> evidence$8)
Creates a new
DataSet by passing each pair of joined values to the given function. |
<O> DataSet<O> |
JoinFunctionAssigner.apply(JoinFunction<L,R,O> fun,
TypeInformation<O> evidence$26,
scala.reflect.ClassTag<O> evidence$27) |
<R> DataSet<R> |
GroupedDataSet.combineGroup(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$10,
scala.reflect.ClassTag<R> evidence$11)
Applies a CombineFunction on a grouped
DataSet. |
<R> DataSet<R> |
DataSet.combineGroup(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$26,
scala.reflect.ClassTag<R> evidence$27)
Applies a GroupCombineFunction on a grouped
DataSet. |
<R> DataSet<R> |
GroupedDataSet.combineGroup(GroupCombineFunction<T,R> combiner,
TypeInformation<R> evidence$12,
scala.reflect.ClassTag<R> evidence$13)
Applies a CombineFunction on a grouped
DataSet. |
<R> DataSet<R> |
DataSet.combineGroup(GroupCombineFunction<T,R> combiner,
TypeInformation<R> evidence$24,
scala.reflect.ClassTag<R> evidence$25)
Applies a GroupCombineFunction on a grouped
DataSet. |
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.createHadoopInput(org.apache.hadoop.mapred.InputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
org.apache.hadoop.mapred.JobConf job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a
DataSet from the given InputFormat. |
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.createHadoopInput(org.apache.hadoop.mapreduce.InputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
org.apache.hadoop.mapreduce.Job job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a
DataSet from the given InputFormat. |
<T> DataSet<T> |
ExecutionEnvironment.createInput(InputFormat<T,?> inputFormat,
scala.reflect.ClassTag<T> evidence$7,
TypeInformation<T> evidence$8)
Generic method to create an input DataSet with an
InputFormat. |
<K> DataSet<T> |
DataSet.distinct(scala.Function1<T,K> fun,
TypeInformation<K> evidence$28)
Creates a new DataSet containing the distinct elements of this DataSet.
|
<K> O |
HalfUnfinishedKeyPairOperation.equalTo(scala.Function1<R,K> fun,
TypeInformation<K> evidence$2)
Specify the key selector function for the right side of the key-based operation.
|
<R> DataSet<R> |
DataSet.flatMap(FlatMapFunction<T,R> flatMapper,
TypeInformation<R> evidence$12,
scala.reflect.ClassTag<R> evidence$13)
Creates a new DataSet by applying the given function to every element and flattening
the results.
|
<R> DataSet<R> |
DataSet.flatMap(scala.Function1<T,scala.collection.TraversableOnce<R>> fun,
TypeInformation<R> evidence$16,
scala.reflect.ClassTag<R> evidence$17)
Creates a new DataSet by applying the given function to every element and flattening
the results.
|
<R> DataSet<R> |
DataSet.flatMap(scala.Function2<T,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$14,
scala.reflect.ClassTag<R> evidence$15)
Creates a new DataSet by applying the given function to every element and flattening
the results.
|
<T> DataSet<T> |
ExecutionEnvironment.fromCollection(scala.collection.Iterable<T> data,
scala.reflect.ClassTag<T> evidence$10,
TypeInformation<T> evidence$11)
Creates a DataSet from the given non-empty
Iterable. |
<T> DataSet<T> |
ExecutionEnvironment.fromCollection(scala.collection.Iterator<T> data,
scala.reflect.ClassTag<T> evidence$12,
TypeInformation<T> evidence$13)
Creates a DataSet from the given
Iterator. |
<T> DataSet<T> |
ExecutionEnvironment.fromElements(scala.collection.Seq<T> data,
scala.reflect.ClassTag<T> evidence$14,
TypeInformation<T> evidence$15)
Creates a new data set that contains the given elements.
|
<T> DataSet<T> |
ExecutionEnvironment.fromParallelCollection(SplittableIterator<T> iterator,
scala.reflect.ClassTag<T> evidence$16,
TypeInformation<T> evidence$17)
Creates a new data set that contains elements in the iterator.
|
<K> GroupedDataSet<T> |
DataSet.groupBy(scala.Function1<T,K> fun,
TypeInformation<K> evidence$29)
Creates a
GroupedDataSet which provides operations on groups of elements. |
<R> DataSet<R> |
DataSet.map(scala.Function1<T,R> fun,
TypeInformation<R> evidence$4,
scala.reflect.ClassTag<R> evidence$5)
Creates a new DataSet by applying the given function to every element of this DataSet.
|
<R> DataSet<R> |
DataSet.map(MapFunction<T,R> mapper,
TypeInformation<R> evidence$2,
scala.reflect.ClassTag<R> evidence$3)
Creates a new DataSet by applying the given function to every element of this DataSet.
|
<R> DataSet<R> |
DataSet.mapPartition(scala.Function1<scala.collection.Iterator<T>,scala.collection.TraversableOnce<R>> fun,
TypeInformation<R> evidence$10,
scala.reflect.ClassTag<R> evidence$11)
Creates a new DataSet by applying the given function to each parallel partition of the
DataSet.
|
<R> DataSet<R> |
DataSet.mapPartition(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$8,
scala.reflect.ClassTag<R> evidence$9)
Creates a new DataSet by applying the given function to each parallel partition of the
DataSet.
|
<R> DataSet<R> |
DataSet.mapPartition(MapPartitionFunction<T,R> partitionMapper,
TypeInformation<R> evidence$6,
scala.reflect.ClassTag<R> evidence$7)
Creates a new DataSet by applying the given function to each parallel partition of the
DataSet.
|
<K> DataSet<T> |
DataSet.partitionByHash(scala.Function1<T,K> fun,
TypeInformation<K> evidence$35)
Partitions a DataSet using the specified key selector function.
|
<K> DataSet<T> |
DataSet.partitionByRange(scala.Function1<T,K> fun,
TypeInformation<K> evidence$36)
Range-partitions a DataSet using the specified key selector function.
|
<K> DataSet<T> |
DataSet.partitionCustom(Partitioner<K> partitioner,
scala.Function1<T,K> fun,
TypeInformation<K> evidence$39)
Partitions a DataSet on the key returned by the selector, using a custom partitioner.
|
<K> DataSet<T> |
DataSet.partitionCustom(Partitioner<K> partitioner,
int field,
TypeInformation<K> evidence$37)
Partitions a tuple DataSet on the specified key fields using a custom partitioner.
|
<K> DataSet<T> |
DataSet.partitionCustom(Partitioner<K> partitioner,
String field,
TypeInformation<K> evidence$38)
Partitions a POJO DataSet on the specified key fields using a custom partitioner.
|
<T> DataSet<T> |
ExecutionEnvironment.readCsvFile(String filePath,
String lineDelimiter,
String fieldDelimiter,
Character quoteCharacter,
boolean ignoreFirstLine,
String ignoreComments,
boolean lenient,
int[] includedFields,
String[] pojoFields,
scala.reflect.ClassTag<T> evidence$1,
TypeInformation<T> evidence$2)
Creates a DataSet by reading the given CSV file.
|
<T> DataSet<T> |
ExecutionEnvironment.readFile(FileInputFormat<T> inputFormat,
String filePath,
scala.reflect.ClassTag<T> evidence$5,
TypeInformation<T> evidence$6)
Creates a new DataSource by reading the specified file using the custom
FileInputFormat. |
<T> DataSet<T> |
ExecutionEnvironment.readFileOfPrimitives(String filePath,
String delimiter,
scala.reflect.ClassTag<T> evidence$3,
TypeInformation<T> evidence$4)
Creates a DataSet of a primitive type by reading the given file in a delimited way.
This method is similar to
readCsvFile with a
single field, but it produces a DataSet of the primitive type directly rather than wrapped in a Tuple. |
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.readHadoopFile(org.apache.hadoop.mapred.FileInputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
org.apache.hadoop.mapred.JobConf job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a
DataSet from the given FileInputFormat. |
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.readHadoopFile(org.apache.hadoop.mapreduce.lib.input.FileInputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
org.apache.hadoop.mapreduce.Job job,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a
DataSet from the given FileInputFormat. |
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.readHadoopFile(org.apache.hadoop.mapred.FileInputFormat<K,V> mapredInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a
DataSet from the given FileInputFormat. |
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.readHadoopFile(org.apache.hadoop.mapreduce.lib.input.FileInputFormat<K,V> mapreduceInputFormat,
Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
Creates a
DataSet from the given
FileInputFormat. |
<K,V> DataSet<scala.Tuple2<K,V>> |
ExecutionEnvironment.readSequenceFile(Class<K> key,
Class<V> value,
String inputPath,
TypeInformation<scala.Tuple2<K,V>> tpe)
|
<R> DataSet<R> |
GroupedDataSet.reduceGroup(scala.Function1<scala.collection.Iterator<T>,R> fun,
TypeInformation<R> evidence$4,
scala.reflect.ClassTag<R> evidence$5)
Creates a new
DataSet by passing for each group (elements with the same key) the list
of elements to the group reduce function. |
<R> DataSet<R> |
DataSet.reduceGroup(scala.Function1<scala.collection.Iterator<T>,R> fun,
TypeInformation<R> evidence$22,
scala.reflect.ClassTag<R> evidence$23)
Creates a new
DataSet by passing all elements in this DataSet to the group reduce function. |
<R> DataSet<R> |
GroupedDataSet.reduceGroup(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$6,
scala.reflect.ClassTag<R> evidence$7)
Creates a new
DataSet by passing for each group (elements with the same key) the list
of elements to the group reduce function. |
<R> DataSet<R> |
DataSet.reduceGroup(scala.Function2<scala.collection.Iterator<T>,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$20,
scala.reflect.ClassTag<R> evidence$21)
Creates a new
DataSet by passing all elements in this DataSet to the group reduce function. |
<R> DataSet<R> |
GroupedDataSet.reduceGroup(GroupReduceFunction<T,R> reducer,
TypeInformation<R> evidence$8,
scala.reflect.ClassTag<R> evidence$9)
Creates a new
DataSet by passing for each group (elements with the same key) the list
of elements to the GroupReduceFunction. |
<R> DataSet<R> |
DataSet.reduceGroup(GroupReduceFunction<T,R> reducer,
TypeInformation<R> evidence$18,
scala.reflect.ClassTag<R> evidence$19)
Creates a new
DataSet by passing all elements in this DataSet to the group reduce function. |
<K> GroupedDataSet<T> |
GroupedDataSet.sortGroup(scala.Function1<T,K> fun,
Order order,
TypeInformation<K> evidence$2)
Adds a secondary sort key to this
GroupedDataSet. |
<K> DataSet<T> |
PartitionSortedDataSet.sortPartition(scala.Function1<T,K> fun,
Order order,
TypeInformation<K> evidence$2) |
<K> DataSet<T> |
DataSet.sortPartition(scala.Function1<T,K> fun,
Order order,
TypeInformation<K> evidence$40)
Locally sorts the partitions of the DataSet on the extracted key in the specified order.
|
<K> HalfUnfinishedKeyPairOperation<L,R,O> |
UnfinishedKeyPairOperation.where(scala.Function1<L,K> fun,
TypeInformation<K> evidence$1)
Specify the key selector function for the left side of the key-based operation.
|
<K> GroupedDataSet<T> |
GroupedDataSet.withPartitioner(Partitioner<K> partitioner,
TypeInformation<K> evidence$3)
Sets a custom partitioner for the grouping.
|
<K> JoinDataSet<L,R> |
JoinDataSet.withPartitioner(Partitioner<K> partitioner,
TypeInformation<K> evidence$9) |
<K> CoGroupDataSet<L,R> |
CoGroupDataSet.withPartitioner(Partitioner<K> partitioner,
TypeInformation<K> evidence$7) |
<K> JoinFunctionAssigner<L,R> |
JoinFunctionAssigner.withPartitioner(Partitioner<K> part,
TypeInformation<K> evidence$19) |
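The evidence$N parameters throughout the Scala API tables above are implicit TypeInformation and ClassTag arguments; the Scala compiler synthesizes and supplies them at the call site, so users never write them explicitly. The underlying idea, carrying a runtime type token alongside a transformation, can be sketched in plain Java with an explicit token; MiniDataSet is a hypothetical stand-in, not a Flink class:

```java
import java.util.function.Function;

// Hypothetical sketch: an explicit Class token below plays the role of
// the implicit evidence$N parameters in the Scala DataSet API.
public class EvidenceSketch {
    static class MiniDataSet<T> {
        final Class<T> elementType; // stand-in for TypeInformation<T>

        MiniDataSet(Class<T> elementType) { this.elementType = elementType; }

        // Mirrors DataSet.map(mapper, evidence$2, evidence$3): the result
        // type token travels alongside the transformation.
        <R> MiniDataSet<R> map(Function<T, R> fn, Class<R> resultType) {
            return new MiniDataSet<>(resultType);
        }
    }

    public static void main(String[] args) {
        MiniDataSet<String> ds = new MiniDataSet<>(String.class);
        MiniDataSet<Integer> lengths = ds.map(String::length, Integer.class);
        System.out.println(lengths.elementType); // class java.lang.Integer
    }
}
```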
Modifier and Type | Method and Description |
---|---|
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkArrayTypeInfo(TypeDescriptors.ArrayDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$9) |
<T extends scala.Product> |
TypeInformationGen.mkCaseClassTypeInfo(TypeDescriptors.CaseClassDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$3) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkEitherTypeInfo(TypeDescriptors.EitherDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$4) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkEnumValueTypeInfo(TypeDescriptors.EnumValueDescriptor d,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$5) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkGenericTypeInfo(TypeDescriptors.UDTDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$14) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkJavaTuple(TypeDescriptors.JavaTupleDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$12) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkOptionTypeInfo(TypeDescriptors.OptionDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$7) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkPojo(TypeDescriptors.PojoDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$13) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkPrimitiveTypeInfo(scala.reflect.macros.Context.universe tpe,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$16) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkTraversableTypeInfo(TypeDescriptors.TraversableDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$8) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkTryTypeInfo(TypeDescriptors.TryDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$6) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkTypeInfo(TypeDescriptors.UDTDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$2) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkTypeInfo(scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$1) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeInformationGen.mkTypeParameter(TypeDescriptors.TypeParameterDescriptor typeParameter,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$15) |
<T extends Value> |
TypeInformationGen.mkValueTypeInfo(TypeDescriptors.UDTDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$10) |
<T extends org.apache.hadoop.io.Writable> |
TypeInformationGen.mkWritableTypeInfo(TypeDescriptors.UDTDescriptor desc,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$11) |
Modifier and Type | Method and Description |
---|---|
void |
ScalaCsvOutputFormat.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig)
The purpose of this method is solely to check whether the data type to be processed
is in fact a tuple type.
|
Constructor and Description |
---|
AggregatingUdf(TypeInformation<T> typeInfo,
AggregationFunction<Object>[] aggFunctions,
int[] fieldPositions) |
Modifier and Type | Method and Description |
---|---|
Cast |
ImplicitExpressionOperations.cast(TypeInformation<?> toType) |
<T> DataSet<T> |
TableConversions.toDataSet(TypeInformation<T> evidence$1)
Converts the
Table to a DataSet. |
<T> DataStream<T> |
TableConversions.toDataStream(TypeInformation<T> evidence$2)
Converts the
Table to a DataStream. |
<O> DataSet<O> |
ScalaBatchTranslator.translate(PlanNode op,
TypeInformation<O> tpe) |
<O> DataStream<O> |
ScalaStreamingTranslator.translate(PlanNode op,
TypeInformation<O> tpe) |
Modifier and Type | Method and Description |
---|---|
<A> Table |
ScalaBatchTranslator.createTable(DataSet<A> repr,
CompositeType<A> inputType,
Expression[] expressions,
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> resultFields) |
<A> Table |
ScalaStreamingTranslator.createTable(DataStream<A> repr,
CompositeType<A> inputType,
Expression[] expressions,
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> resultFields) |
Modifier and Type | Class and Description |
---|---|
class |
CaseClassTypeInfo<T extends scala.Product>
TypeInformation for Case Classes.
|
class |
EnumValueTypeInfo<E extends scala.Enumeration>
TypeInformation for
Enumeration values. |
class |
OptionTypeInfo<A,T extends scala.Option<A>>
TypeInformation for
Option. |
class |
ScalaNothingTypeInfo |
class |
TraversableTypeInfo<T extends scala.collection.TraversableOnce<E>,E>
TypeInformation for Scala Collections.
|
class |
TryTypeInfo<A,T extends scala.util.Try<A>>
TypeInformation for
Try. |
class |
UnitTypeInfo |
Modifier and Type | Method and Description |
---|---|
TypeInformation<E> |
TraversableTypeInfo.elementTypeInfo() |
TypeInformation<A> |
TryTypeInfo.elemTypeInfo() |
<X> TypeInformation<X> |
CaseClassTypeInfo.getTypeAt(String fieldExpression) |
TypeInformation<A> |
EitherTypeInfo.leftTypeInfo() |
TypeInformation<B> |
EitherTypeInfo.rightTypeInfo() |
TypeInformation<?>[] |
CaseClassTypeInfo.typeParamTypeInfos() |
Modifier and Type | Method and Description |
---|---|
static <T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeUtils.createTypeInfo(scala.reflect.macros.Context c,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$1) |
<T> scala.reflect.api.Exprs.Expr<TypeInformation<T>> |
TypeUtils$.createTypeInfo(scala.reflect.macros.Context c,
scala.reflect.api.TypeTags.WeakTypeTag<T> evidence$1) |
List<TypeInformation<?>> |
TryTypeInfo.getGenericParameters() |
List<TypeInformation<?>> |
CaseClassTypeInfo.getGenericParameters() |
List<TypeInformation<?>> |
OptionTypeInfo.getGenericParameters() |
List<TypeInformation<?>> |
EnumValueTypeInfo.getGenericParameters() |
List<TypeInformation<?>> |
EitherTypeInfo.getGenericParameters() |
List<TypeInformation<?>> |
TraversableTypeInfo.getGenericParameters() |
Constructor and Description |
---|
CaseClassTypeInfo(Class<T> clazz,
TypeInformation<?>[] typeParamTypeInfos,
scala.collection.Seq<TypeInformation<?>> fieldTypes,
scala.collection.Seq<String> fieldNames) |
EitherTypeInfo(Class<T> clazz,
TypeInformation<A> leftTypeInfo,
TypeInformation<B> rightTypeInfo) |
OptionTypeInfo(TypeInformation<A> elemTypeInfo) |
TraversableTypeInfo(Class<T> clazz,
TypeInformation<E> elementTypeInfo) |
TryTypeInfo(TypeInformation<A> elemTypeInfo) |
Constructor and Description |
---|
CaseClassTypeInfo(Class<T> clazz,
TypeInformation<?>[] typeParamTypeInfos,
scala.collection.Seq<TypeInformation<?>> fieldTypes,
scala.collection.Seq<String> fieldNames) |
Modifier and Type | Method and Description |
---|---|
protected String |
ExpressionCodeGenerator.defaultPrimitive(TypeInformation<?> tpe) |
protected String |
ExpressionCodeGenerator.getField(scala.reflect.runtime.universe inputTerm,
CompositeType<?> inputType,
String fieldName,
TypeInformation<?> fieldType) |
protected String |
ExpressionCodeGenerator.typeTermForTypeInfo(TypeInformation<?> tpe) |
protected String |
ExpressionCodeGenerator.typeTermForTypeInfoForCast(TypeInformation<?> tpe) |
Modifier and Type | Method and Description |
---|---|
abstract TypeInformation<?> |
Expression.typeInfo() |
TypeInformation<?> |
BitwiseNot.typeInfo() |
TypeInformation<?> |
BinaryArithmetic.typeInfo() |
TypeInformation<?> |
ResolvedFieldReference.typeInfo() |
TypeInformation<?> |
Literal.typeInfo() |
TypeInformation<?> |
UnaryMinus.typeInfo() |
TypeInformation<?> |
Abs.typeInfo() |
TypeInformation<?> |
Plus.typeInfo() |
TypeInformation<?> |
BitwiseBinaryArithmetic.typeInfo() |
TypeInformation<?> |
Cast.typeInfo() |
TypeInformation<?> |
Aggregation.typeInfo() |
TypeInformation<?> |
Naming.typeInfo() |
Constructor and Description |
---|
Cast(Expression child,
TypeInformation<?> tpe) |
Literal(Object value,
TypeInformation<?> tpe) |
ResolvedFieldReference(String name,
TypeInformation<?> tpe) |
Constructor and Description |
---|
GroupByAnalyzer(scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> inputFields) |
PredicateAnalyzer(scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> inputFields) |
ResolveFieldReferences(scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> inputFields) |
SelectionAnalyzer(scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> inputFields) |
Modifier and Type | Method and Description |
---|---|
abstract scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> |
PlanNode.outputFields() |
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> |
Root.outputFields() |
scala.collection.Seq<scala.Tuple2<String,TypeInformation<Object>>> |
As.outputFields() |
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> |
GroupBy.outputFields() |
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> |
Filter.outputFields() |
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> |
Select.outputFields() |
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> |
Join.outputFields() |
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> |
Aggregate.outputFields() |
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> |
UnionAll.outputFields() |
Modifier and Type | Method and Description |
---|---|
abstract <A> Object |
PlanTranslator.translate(PlanNode op,
TypeInformation<A> tpe)
Translates the given Table API
PlanNode back to the underlying representation, i.e.,
a DataSet or a DataStream. |
Modifier and Type | Method and Description |
---|---|
abstract <A> Table |
PlanTranslator.createTable(Object repr,
CompositeType<A> inputType,
Expression[] expressions,
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> resultFields)
Creates a
Table from a DataSet or a DataStream (the underlying representation). |
Constructor and Description |
---|
Root(T input,
scala.collection.Seq<scala.Tuple2<String,TypeInformation<?>>> outputFields) |
Modifier and Type | Class and Description |
---|---|
class |
RenamingProxyTypeInfo<T>
A TypeInformation that is used to rename fields of an underlying CompositeType.
|
class |
RowTypeInfo
TypeInformation for
Row. |
Modifier and Type | Method and Description |
---|---|
<X> TypeInformation<X> |
RenamingProxyTypeInfo.getTypeAt(int pos) |
<X> TypeInformation<X> |
RenamingProxyTypeInfo.getTypeAt(String fieldExpression) |
Modifier and Type | Method and Description |
---|---|
List<TypeInformation<?>> |
RenamingProxyTypeInfo.getGenericParameters() |
Constructor and Description |
---|
RowTypeInfo(scala.collection.Seq<TypeInformation<?>> fieldTypes,
scala.collection.Seq<String> fieldNames) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Tweet> |
SimpleTweetInputFormat.getProducedType() |
Modifier and Type | Method and Description |
---|---|
<T> DataSet<T> |
Graph.groupReduceOnEdges(EdgesFunction<K,EV,T> edgesFunction,
EdgeDirection direction,
TypeInformation<T> typeInfo)
Groups by vertex and computes a GroupReduce transformation over the edge values of each vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnEdges(EdgesFunctionWithVertexValue<K,VV,EV,T> edgesFunction,
EdgeDirection direction,
TypeInformation<T> typeInfo)
Groups by vertex and computes a GroupReduce transformation over the edge values of each vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnNeighbors(NeighborsFunction<K,VV,EV,T> neighborsFunction,
EdgeDirection direction,
TypeInformation<T> typeInfo)
Groups by vertex and computes a GroupReduce transformation over the neighbors (both edges and vertices)
of each vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnNeighbors(NeighborsFunctionWithVertexValue<K,VV,EV,T> neighborsFunction,
EdgeDirection direction,
TypeInformation<T> typeInfo)
Groups by vertex and computes a GroupReduce transformation over the neighbors (both edges and vertices)
of each vertex.
|
<NV> Graph<K,VV,NV> |
Graph.mapEdges(MapFunction<Edge<K,EV>,NV> mapper,
TypeInformation<Edge<K,NV>> returnType)
Apply a function to the attribute of each edge in the graph.
|
<NV> Graph<K,NV,EV> |
Graph.mapVertices(MapFunction<Vertex<K,VV>,NV> mapper,
TypeInformation<Vertex<K,NV>> returnType)
Apply a function to the attribute of each vertex in the graph.
|
Modifier and Type | Method and Description |
---|---|
TypeInformation<VV> |
LabelPropagation.SendNewLabelToNeighbors.getProducedType() |
Constructor and Description |
---|
SendNewLabelToNeighbors(TypeInformation<VV> typeInformation) |

Modifier and Type | Method and Description |
---|---|
<K,EV> Graph<K,NullValue,EV> |
Graph$.fromCollection(scala.collection.Seq<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$23,
scala.reflect.ClassTag<K> evidence$24,
TypeInformation<EV> evidence$25,
scala.reflect.ClassTag<EV> evidence$26)
Creates a Graph from a Seq of edges.
|
static <K,EV> Graph<K,NullValue,EV> |
Graph.fromCollection(scala.collection.Seq<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$23,
scala.reflect.ClassTag<K> evidence$24,
TypeInformation<EV> evidence$25,
scala.reflect.ClassTag<EV> evidence$26)
Creates a Graph from a Seq of edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromCollection(scala.collection.Seq<Edge<K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$27,
scala.reflect.ClassTag<K> evidence$28,
TypeInformation<VV> evidence$29,
scala.reflect.ClassTag<VV> evidence$30,
TypeInformation<EV> evidence$31,
scala.reflect.ClassTag<EV> evidence$32)
Creates a graph from a Seq of edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromCollection(scala.collection.Seq<Edge<K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$27,
scala.reflect.ClassTag<K> evidence$28,
TypeInformation<VV> evidence$29,
scala.reflect.ClassTag<VV> evidence$30,
TypeInformation<EV> evidence$31,
scala.reflect.ClassTag<EV> evidence$32)
Creates a graph from a Seq of edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromCollection(scala.collection.Seq<Vertex<K,VV>> vertices,
scala.collection.Seq<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$17,
scala.reflect.ClassTag<K> evidence$18,
TypeInformation<VV> evidence$19,
scala.reflect.ClassTag<VV> evidence$20,
TypeInformation<EV> evidence$21,
scala.reflect.ClassTag<EV> evidence$22)
Creates a Graph from a Seq of vertices and a Seq of edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromCollection(scala.collection.Seq<Vertex<K,VV>> vertices,
scala.collection.Seq<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$17,
scala.reflect.ClassTag<K> evidence$18,
TypeInformation<VV> evidence$19,
scala.reflect.ClassTag<VV> evidence$20,
TypeInformation<EV> evidence$21,
scala.reflect.ClassTag<EV> evidence$22)
Creates a Graph from a Seq of vertices and a Seq of edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromCsvReader(ExecutionEnvironment env,
String pathEdges,
String pathVertices,
String lineDelimiterVertices,
String fieldDelimiterVertices,
Character quoteCharacterVertices,
boolean ignoreFirstLineVertices,
String ignoreCommentsVertices,
boolean lenientVertices,
int[] includedFieldsVertices,
String lineDelimiterEdges,
String fieldDelimiterEdges,
Character quoteCharacterEdges,
boolean ignoreFirstLineEdges,
String ignoreCommentsEdges,
boolean lenientEdges,
int[] includedFieldsEdges,
MapFunction<K,VV> vertexValueInitializer,
TypeInformation<K> evidence$55,
scala.reflect.ClassTag<K> evidence$56,
TypeInformation<VV> evidence$57,
scala.reflect.ClassTag<VV> evidence$58,
TypeInformation<EV> evidence$59,
scala.reflect.ClassTag<EV> evidence$60)
Creates a Graph from a CSV file of edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromCsvReader(ExecutionEnvironment env,
String pathEdges,
String pathVertices,
String lineDelimiterVertices,
String fieldDelimiterVertices,
Character quoteCharacterVertices,
boolean ignoreFirstLineVertices,
String ignoreCommentsVertices,
boolean lenientVertices,
int[] includedFieldsVertices,
String lineDelimiterEdges,
String fieldDelimiterEdges,
Character quoteCharacterEdges,
boolean ignoreFirstLineEdges,
String ignoreCommentsEdges,
boolean lenientEdges,
int[] includedFieldsEdges,
MapFunction<K,VV> vertexValueInitializer,
TypeInformation<K> evidence$55,
scala.reflect.ClassTag<K> evidence$56,
TypeInformation<VV> evidence$57,
scala.reflect.ClassTag<VV> evidence$58,
TypeInformation<EV> evidence$59,
scala.reflect.ClassTag<EV> evidence$60)
Creates a Graph from a CSV file of edges.
|
<K,EV> Graph<K,NullValue,EV> |
Graph$.fromDataSet(DataSet<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$7,
scala.reflect.ClassTag<K> evidence$8,
TypeInformation<EV> evidence$9,
scala.reflect.ClassTag<EV> evidence$10)
Creates a Graph from a DataSet of edges.
|
static <K,EV> Graph<K,NullValue,EV> |
Graph.fromDataSet(DataSet<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$7,
scala.reflect.ClassTag<K> evidence$8,
TypeInformation<EV> evidence$9,
scala.reflect.ClassTag<EV> evidence$10)
Creates a Graph from a DataSet of edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromDataSet(DataSet<Edge<K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$11,
scala.reflect.ClassTag<K> evidence$12,
TypeInformation<VV> evidence$13,
scala.reflect.ClassTag<VV> evidence$14,
TypeInformation<EV> evidence$15,
scala.reflect.ClassTag<EV> evidence$16)
Creates a graph from a DataSet of edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromDataSet(DataSet<Edge<K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$11,
scala.reflect.ClassTag<K> evidence$12,
TypeInformation<VV> evidence$13,
scala.reflect.ClassTag<VV> evidence$14,
TypeInformation<EV> evidence$15,
scala.reflect.ClassTag<EV> evidence$16)
Creates a graph from a DataSet of edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromDataSet(DataSet<Vertex<K,VV>> vertices,
DataSet<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$1,
scala.reflect.ClassTag<K> evidence$2,
TypeInformation<VV> evidence$3,
scala.reflect.ClassTag<VV> evidence$4,
TypeInformation<EV> evidence$5,
scala.reflect.ClassTag<EV> evidence$6)
Creates a Graph from a DataSet of vertices and a DataSet of edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromDataSet(DataSet<Vertex<K,VV>> vertices,
DataSet<Edge<K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$1,
scala.reflect.ClassTag<K> evidence$2,
TypeInformation<VV> evidence$3,
scala.reflect.ClassTag<VV> evidence$4,
TypeInformation<EV> evidence$5,
scala.reflect.ClassTag<EV> evidence$6)
Creates a Graph from a DataSet of vertices and a DataSet of edges.
|
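In the Scala Gelly API listed above, the trailing `TypeInformation`/`ClassTag` `evidence$N` parameters are implicit and are normally supplied by the compiler once the standard Scala API implicits are imported. A minimal sketch of `Graph.fromDataSet(vertices, edges, env)`, assuming Flink and flink-gelly-scala on the classpath:

```scala
// Hypothetical sketch: building a Gelly Graph from vertex and edge DataSets.
// The evidence$1..$6 parameters in the signature above are filled in
// implicitly by `import org.apache.flink.api.scala._`.
import org.apache.flink.api.scala._
import org.apache.flink.graph.scala.Graph
import org.apache.flink.graph.{Edge, Vertex}

val env = ExecutionEnvironment.getExecutionEnvironment

val vertices = env.fromElements(
  new Vertex[Long, String](1L, "a"),
  new Vertex[Long, String](2L, "b"))
val edges = env.fromElements(
  new Edge[Long, Double](1L, 2L, 0.5))

// No explicit TypeInformation arguments are needed at the call site.
val graph = Graph.fromDataSet(vertices, edges, env)
```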
<K> Graph<K,NullValue,NullValue> |
Graph$.fromTuple2DataSet(DataSet<scala.Tuple2<K,K>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$49,
scala.reflect.ClassTag<K> evidence$50)
Creates a Graph from a DataSet of Tuple2s representing the edges.
|
static <K> Graph<K,NullValue,NullValue> |
Graph.fromTuple2DataSet(DataSet<scala.Tuple2<K,K>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$49,
scala.reflect.ClassTag<K> evidence$50)
Creates a Graph from a DataSet of Tuple2s representing the edges.
|
<K,VV> Graph<K,VV,NullValue> |
Graph$.fromTuple2DataSet(DataSet<scala.Tuple2<K,K>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$51,
scala.reflect.ClassTag<K> evidence$52,
TypeInformation<VV> evidence$53,
scala.reflect.ClassTag<VV> evidence$54)
Creates a Graph from a DataSet of Tuple2s representing the edges.
|
static <K,VV> Graph<K,VV,NullValue> |
Graph.fromTuple2DataSet(DataSet<scala.Tuple2<K,K>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$51,
scala.reflect.ClassTag<K> evidence$52,
TypeInformation<VV> evidence$53,
scala.reflect.ClassTag<VV> evidence$54)
Creates a Graph from a DataSet of Tuple2s representing the edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromTupleDataSet(DataSet<scala.Tuple2<K,VV>> vertices,
DataSet<scala.Tuple3<K,K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$33,
scala.reflect.ClassTag<K> evidence$34,
TypeInformation<VV> evidence$35,
scala.reflect.ClassTag<VV> evidence$36,
TypeInformation<EV> evidence$37,
scala.reflect.ClassTag<EV> evidence$38)
Creates a graph from DataSets of tuples for vertices and for edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromTupleDataSet(DataSet<scala.Tuple2<K,VV>> vertices,
DataSet<scala.Tuple3<K,K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$33,
scala.reflect.ClassTag<K> evidence$34,
TypeInformation<VV> evidence$35,
scala.reflect.ClassTag<VV> evidence$36,
TypeInformation<EV> evidence$37,
scala.reflect.ClassTag<EV> evidence$38)
Creates a graph from DataSets of tuples for vertices and for edges.
|
<K,EV> Graph<K,NullValue,EV> |
Graph$.fromTupleDataSet(DataSet<scala.Tuple3<K,K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$39,
scala.reflect.ClassTag<K> evidence$40,
TypeInformation<EV> evidence$41,
scala.reflect.ClassTag<EV> evidence$42)
Creates a Graph from a DataSet of Tuples representing the edges.
|
static <K,EV> Graph<K,NullValue,EV> |
Graph.fromTupleDataSet(DataSet<scala.Tuple3<K,K,EV>> edges,
ExecutionEnvironment env,
TypeInformation<K> evidence$39,
scala.reflect.ClassTag<K> evidence$40,
TypeInformation<EV> evidence$41,
scala.reflect.ClassTag<EV> evidence$42)
Creates a Graph from a DataSet of Tuples representing the edges.
|
<K,VV,EV> Graph<K,VV,EV> |
Graph$.fromTupleDataSet(DataSet<scala.Tuple3<K,K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$43,
scala.reflect.ClassTag<K> evidence$44,
TypeInformation<VV> evidence$45,
scala.reflect.ClassTag<VV> evidence$46,
TypeInformation<EV> evidence$47,
scala.reflect.ClassTag<EV> evidence$48)
Creates a Graph from a DataSet of Tuples representing the edges.
|
static <K,VV,EV> Graph<K,VV,EV> |
Graph.fromTupleDataSet(DataSet<scala.Tuple3<K,K,EV>> edges,
MapFunction<K,VV> vertexValueInitializer,
ExecutionEnvironment env,
TypeInformation<K> evidence$43,
scala.reflect.ClassTag<K> evidence$44,
TypeInformation<VV> evidence$45,
scala.reflect.ClassTag<VV> evidence$46,
TypeInformation<EV> evidence$47,
scala.reflect.ClassTag<EV> evidence$48)
Creates a Graph from a DataSet of Tuples representing the edges.
|
<T> DataSet<T> |
Graph.groupReduceOnEdges(EdgesFunction<K,EV,T> edgesFunction,
EdgeDirection direction,
TypeInformation<T> evidence$85,
scala.reflect.ClassTag<T> evidence$86)
Compute an aggregate over the edges of each vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnEdges(EdgesFunctionWithVertexValue<K,VV,EV,T> edgesFunction,
EdgeDirection direction,
TypeInformation<T> evidence$83,
scala.reflect.ClassTag<T> evidence$84)
Compute an aggregate over the edges of each vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnNeighbors(NeighborsFunction<K,VV,EV,T> neighborsFunction,
EdgeDirection direction,
TypeInformation<T> evidence$89,
scala.reflect.ClassTag<T> evidence$90)
Compute an aggregate over the neighbors (edges and vertices) of each
vertex.
|
<T> DataSet<T> |
Graph.groupReduceOnNeighbors(NeighborsFunctionWithVertexValue<K,VV,EV,T> neighborsFunction,
EdgeDirection direction,
TypeInformation<T> evidence$87,
scala.reflect.ClassTag<T> evidence$88)
Compute an aggregate over the neighbors (edges and vertices) of each
vertex.
|
<T> Graph<K,VV,EV> |
Graph.joinWithEdges(DataSet<scala.Tuple3<K,K,T>> inputDataSet,
EdgeJoinFunction<EV,T> edgeJoinFunction,
TypeInformation<T> evidence$77)
Joins the edge DataSet with an input DataSet on the composite key of both
source and target IDs and applies a user-defined transformation on the values
of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithEdges(DataSet<scala.Tuple3<K,K,T>> inputDataSet,
scala.Function2<EV,T,EV> fun,
TypeInformation<T> evidence$78)
Joins the edge DataSet with an input DataSet on the composite key of both
source and target IDs and applies a user-defined transformation on the values
of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithEdgesOnSource(DataSet<scala.Tuple2<K,T>> inputDataSet,
EdgeJoinFunction<EV,T> edgeJoinFunction,
TypeInformation<T> evidence$79)
Joins the edge DataSet with an input Tuple2 DataSet and applies a user-defined transformation
on the values of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithEdgesOnSource(DataSet<scala.Tuple2<K,T>> inputDataSet,
scala.Function2<EV,T,EV> fun,
TypeInformation<T> evidence$80)
Joins the edge DataSet with an input Tuple2 DataSet and applies a user-defined transformation
on the values of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithEdgesOnTarget(DataSet<scala.Tuple2<K,T>> inputDataSet,
EdgeJoinFunction<EV,T> edgeJoinFunction,
TypeInformation<T> evidence$81)
Joins the edge DataSet with an input Tuple2 DataSet and applies a user-defined transformation
on the values of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithEdgesOnTarget(DataSet<scala.Tuple2<K,T>> inputDataSet,
scala.Function2<EV,T,EV> fun,
TypeInformation<T> evidence$82)
Joins the edge DataSet with an input Tuple2 DataSet and applies a user-defined transformation
on the values of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithVertices(DataSet<scala.Tuple2<K,T>> inputDataSet,
scala.Function2<VV,T,VV> fun,
TypeInformation<T> evidence$76)
Joins the vertex DataSet of this graph with an input Tuple2 DataSet and applies
a user-defined transformation on the values of the matched records.
|
<T> Graph<K,VV,EV> |
Graph.joinWithVertices(DataSet<scala.Tuple2<K,T>> inputDataSet,
VertexJoinFunction<VV,T> vertexJoinFunction,
TypeInformation<T> evidence$75)
Joins the vertex DataSet of this graph with an input Tuple2 DataSet and applies
a user-defined transformation on the values of the matched records.
|
<NV> Graph<K,VV,NV> |
Graph.mapEdges(scala.Function1<Edge<K,EV>,NV> fun,
TypeInformation<NV> evidence$73,
scala.reflect.ClassTag<NV> evidence$74)
Apply a function to the attribute of each edge in the graph.
|
<NV> Graph<K,VV,NV> |
Graph.mapEdges(MapFunction<Edge<K,EV>,NV> mapper,
TypeInformation<NV> evidence$71,
scala.reflect.ClassTag<NV> evidence$72)
Apply a function to the attribute of each edge in the graph.
|
<NV> Graph<K,NV,EV> |
Graph.mapVertices(scala.Function1<Vertex<K,VV>,NV> fun,
TypeInformation<NV> evidence$69,
scala.reflect.ClassTag<NV> evidence$70)
Apply a function to the attribute of each vertex in the graph.
|
<NV> Graph<K,NV,EV> |
Graph.mapVertices(MapFunction<Vertex<K,VV>,NV> mapper,
TypeInformation<NV> evidence$67,
scala.reflect.ClassTag<NV> evidence$68)
Apply a function to the attribute of each vertex in the graph.
|
<T> T |
Graph.run(GraphAlgorithm<K,VV,EV,T> algorithm,
TypeInformation<T> evidence$91,
scala.reflect.ClassTag<T> evidence$92) |
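The `mapVertices`/`mapEdges` methods above all follow one pattern: apply a function to every vertex (or edge) value and return a new graph, leaving the rest of the structure intact. A minimal sketch of that pattern, using hypothetical simplified types rather than Gelly's actual `Graph`/`Vertex` classes:

```java
import java.util.*;
import java.util.function.Function;

// Hypothetical minimal graph (not Gelly's API): mapVertices returns a new
// graph whose vertex values are transformed; the edge set is unchanged.
final class MiniGraph<K, VV, EV> {
    final Map<K, VV> vertices;
    final List<Object[]> edges; // each entry: [source, target, edgeValue]

    MiniGraph(Map<K, VV> vertices, List<Object[]> edges) {
        this.vertices = vertices;
        this.edges = edges;
    }

    <NV> MiniGraph<K, NV, EV> mapVertices(Function<VV, NV> fun) {
        Map<K, NV> mapped = new HashMap<>();
        for (Map.Entry<K, VV> e : vertices.entrySet()) {
            mapped.put(e.getKey(), fun.apply(e.getValue()));
        }
        return new MiniGraph<>(mapped, edges); // edges reused as-is
    }
}
```

In the real API the extra `TypeInformation`/`ClassTag` evidence parameters exist because the new vertex value type `NV` is erased at runtime and the framework must be told how to serialize it.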
Constructor and Description |
---|
Graph(Graph<K,VV,EV> jgraph,
TypeInformation<K> evidence$61,
scala.reflect.ClassTag<K> evidence$62,
TypeInformation<VV> evidence$63,
scala.reflect.ClassTag<VV> evidence$64,
TypeInformation<EV> evidence$65,
scala.reflect.ClassTag<EV> evidence$66) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Tuple2<KEYOUT,VALUEOUT>> |
HadoopReduceFunction.getProducedType() |
TypeInformation<Tuple2<KEYOUT,VALUEOUT>> |
HadoopMapFunction.getProducedType() |
TypeInformation<Tuple2<KEYOUT,VALUEOUT>> |
HadoopReduceCombineFunction.getProducedType() |
Modifier and Type | Method and Description |
---|---|
TypeInformation<T> |
HCatInputFormatBase.getProducedType() |
Modifier and Type | Method and Description |
---|---|
static <T> DataSet<Block<T>> |
FlinkMLTools.block(DataSet<T> input,
int numBlocks,
scala.Option<Partitioner<Object>> partitionerOption,
TypeInformation<T> evidence$31,
scala.reflect.ClassTag<T> evidence$32)
Groups the DataSet input into numBlocks blocks.
|
<T> DataSet<Block<T>> |
FlinkMLTools$.block(DataSet<T> input,
int numBlocks,
scala.Option<Partitioner<Object>> partitionerOption,
TypeInformation<T> evidence$31,
scala.reflect.ClassTag<T> evidence$32)
Groups the DataSet input into numBlocks blocks.
|
static <A,B,C,D,E> scala.Tuple5<DataSet<A>,DataSet<B>,DataSet<C>,DataSet<D>,DataSet<E>> |
FlinkMLTools.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
DataSet<D> ds4,
DataSet<E> ds5,
String path1,
String path2,
String path3,
String path4,
String path5,
scala.reflect.ClassTag<A> evidence$21,
TypeInformation<A> evidence$22,
scala.reflect.ClassTag<B> evidence$23,
TypeInformation<B> evidence$24,
scala.reflect.ClassTag<C> evidence$25,
TypeInformation<C> evidence$26,
scala.reflect.ClassTag<D> evidence$27,
TypeInformation<D> evidence$28,
scala.reflect.ClassTag<E> evidence$29,
TypeInformation<E> evidence$30)
Writes multiple DataSets to the specified paths and returns them as
DataSources for subsequent operations. |
<A,B,C,D,E> scala.Tuple5<DataSet<A>,DataSet<B>,DataSet<C>,DataSet<D>,DataSet<E>> |
FlinkMLTools$.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
DataSet<D> ds4,
DataSet<E> ds5,
String path1,
String path2,
String path3,
String path4,
String path5,
scala.reflect.ClassTag<A> evidence$21,
TypeInformation<A> evidence$22,
scala.reflect.ClassTag<B> evidence$23,
TypeInformation<B> evidence$24,
scala.reflect.ClassTag<C> evidence$25,
TypeInformation<C> evidence$26,
scala.reflect.ClassTag<D> evidence$27,
TypeInformation<D> evidence$28,
scala.reflect.ClassTag<E> evidence$29,
TypeInformation<E> evidence$30)
Writes multiple DataSets to the specified paths and returns them as
DataSources for subsequent operations. |
static <A,B,C,D> scala.Tuple4<DataSet<A>,DataSet<B>,DataSet<C>,DataSet<D>> |
FlinkMLTools.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
DataSet<D> ds4,
String path1,
String path2,
String path3,
String path4,
scala.reflect.ClassTag<A> evidence$13,
TypeInformation<A> evidence$14,
scala.reflect.ClassTag<B> evidence$15,
TypeInformation<B> evidence$16,
scala.reflect.ClassTag<C> evidence$17,
TypeInformation<C> evidence$18,
scala.reflect.ClassTag<D> evidence$19,
TypeInformation<D> evidence$20)
Writes multiple DataSets to the specified paths and returns them as
DataSources for subsequent operations. |
<A,B,C,D> scala.Tuple4<DataSet<A>,DataSet<B>,DataSet<C>,DataSet<D>> |
FlinkMLTools$.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
DataSet<D> ds4,
String path1,
String path2,
String path3,
String path4,
scala.reflect.ClassTag<A> evidence$13,
TypeInformation<A> evidence$14,
scala.reflect.ClassTag<B> evidence$15,
TypeInformation<B> evidence$16,
scala.reflect.ClassTag<C> evidence$17,
TypeInformation<C> evidence$18,
scala.reflect.ClassTag<D> evidence$19,
TypeInformation<D> evidence$20)
Writes multiple DataSets to the specified paths and returns them as
DataSources for subsequent operations. |
static <A,B,C> scala.Tuple3<DataSet<A>,DataSet<B>,DataSet<C>> |
FlinkMLTools.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
String path1,
String path2,
String path3,
scala.reflect.ClassTag<A> evidence$7,
TypeInformation<A> evidence$8,
scala.reflect.ClassTag<B> evidence$9,
TypeInformation<B> evidence$10,
scala.reflect.ClassTag<C> evidence$11,
TypeInformation<C> evidence$12)
Writes multiple DataSets to the specified paths and returns them as
DataSources for subsequent operations. |
<A,B,C> scala.Tuple3<DataSet<A>,DataSet<B>,DataSet<C>> |
FlinkMLTools$.persist(DataSet<A> ds1,
DataSet<B> ds2,
DataSet<C> ds3,
String path1,
String path2,
String path3,
scala.reflect.ClassTag<A> evidence$7,
TypeInformation<A> evidence$8,
scala.reflect.ClassTag<B> evidence$9,
TypeInformation<B> evidence$10,
scala.reflect.ClassTag<C> evidence$11,
TypeInformation<C> evidence$12)
Writes multiple DataSets to the specified paths and returns them as
DataSources for subsequent operations. |
static <A,B> scala.Tuple2<DataSet<A>,DataSet<B>> |
FlinkMLTools.persist(DataSet<A> ds1,
DataSet<B> ds2,
String path1,
String path2,
scala.reflect.ClassTag<A> evidence$3,
TypeInformation<A> evidence$4,
scala.reflect.ClassTag<B> evidence$5,
TypeInformation<B> evidence$6)
Writes multiple DataSets to the specified paths and returns them as
DataSources for subsequent operations. |
<A,B> scala.Tuple2<DataSet<A>,DataSet<B>> |
FlinkMLTools$.persist(DataSet<A> ds1,
DataSet<B> ds2,
String path1,
String path2,
scala.reflect.ClassTag<A> evidence$3,
TypeInformation<A> evidence$4,
scala.reflect.ClassTag<B> evidence$5,
TypeInformation<B> evidence$6)
Writes multiple DataSets to the specified paths and returns them as
DataSources for subsequent operations. |
static <T> DataSet<T> |
FlinkMLTools.persist(DataSet<T> dataset,
String path,
scala.reflect.ClassTag<T> evidence$1,
TypeInformation<T> evidence$2)
Writes a DataSet to the specified path and returns it as a DataSource for
subsequent operations. |
<T> DataSet<T> |
FlinkMLTools$.persist(DataSet<T> dataset,
String path,
scala.reflect.ClassTag<T> evidence$1,
TypeInformation<T> evidence$2)
Writes a DataSet to the specified path and returns it as a DataSource for
subsequent operations. |
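The idea behind every `persist` overload above is the same: materialize a dataset to a path once, then let subsequent operations read from the written copy instead of re-executing the producing plan. A plain-Java sketch of that write-then-read-back semantics (hypothetical helper, not FlinkMLTools itself):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;

// Sketch of the persist idea: write the data out, then hand back a "source"
// that is backed by the written file rather than by the original pipeline.
final class PersistSketch {
    static List<String> persist(List<String> dataset, String path) {
        try {
            Path p = Paths.get(path);
            Files.write(p, dataset);       // materialize once ...
            return Files.readAllLines(p);  // ... read back as the new source
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

The multi-argument overloads simply do this for several datasets at once so that one shared execution writes all of them before any is re-read.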
Modifier and Type | Method and Description |
---|---|
<T extends Transformer<T>,P extends Predictor<P>,Testing,Intermediate,PredictionValue> |
ChainedPredictor$.chainedEvaluationOperation(TransformDataSetOperation<T,Testing,Intermediate> transformOperation,
EvaluateDataSetOperation<P,Intermediate,PredictionValue> evaluateOperation,
TypeInformation<Testing> testingTypeInformation,
TypeInformation<PredictionValue> predictionValueTypeInformation) |
static <T extends Transformer<T>,P extends Predictor<P>,Testing,Intermediate,PredictionValue> |
ChainedPredictor.chainedEvaluationOperation(TransformDataSetOperation<T,Testing,Intermediate> transformOperation,
EvaluateDataSetOperation<P,Intermediate,PredictionValue> evaluateOperation,
TypeInformation<Testing> testingTypeInformation,
TypeInformation<PredictionValue> predictionValueTypeInformation) |
<Instance extends Estimator<Instance>,Model,Testing,PredictionValue> |
Predictor$.defaultEvaluateDataSetOperation(PredictOperation<Instance,Model,Testing,PredictionValue> predictOperation,
TypeInformation<Testing> testingTypeInformation,
TypeInformation<PredictionValue> predictionValueTypeInformation)
Default
EvaluateDataSetOperation which takes a PredictOperation to calculate a tuple
of true label value and predicted label value. |
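The described pairing — for each labeled test example, emit a tuple of true label and predicted label — can be sketched in a few lines of plain Java (hypothetical simplified shape, not the actual `EvaluateDataSetOperation` interface):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Sketch of the evaluation pairing: each (input, trueLabel) test example is
// mapped to a (trueLabel, predictedLabel) pair via a predict function.
final class EvaluateSketch {
    static <X, Y> List<Map.Entry<Y, Y>> evaluate(
            List<Map.Entry<X, Y>> testing, Function<X, Y> predict) {
        List<Map.Entry<Y, Y>> pairs = new ArrayList<>();
        for (Map.Entry<X, Y> ex : testing) {
            pairs.add(Map.entry(ex.getValue(), predict.apply(ex.getKey())));
        }
        return pairs;
    }
}
```

Downstream metrics (accuracy, squared error, and so on) then only need the list of (true, predicted) pairs, not the original inputs.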
<Instance extends Estimator<Instance>,Model,Testing,PredictionValue> |
Predictor$.defaultPredictDataSetOperation(PredictOperation<Instance,Model,Testing,PredictionValue> predictOperation,
TypeInformation<Testing> testingTypeInformation,
TypeInformation<PredictionValue> predictionValueTypeInformation)
Default
PredictDataSetOperation which takes a PredictOperation to calculate a tuple
of testing element and its prediction value. |
<Instance extends Estimator<Instance>,Model,Input,Output> |
Transformer$.defaultTransformDataSetOperation(TransformOperation<Instance,Model,Input,Output> transformOperation,
TypeInformation<Output> outputTypeInformation,
scala.reflect.ClassTag<Output> outputClassTag) |
Modifier and Type | Method and Description |
---|---|
<T extends Vector> |
StandardScaler$.fitLabelVectorTupleStandardScaler(BreezeVectorConverter<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
Trains the
StandardScaler by learning the mean and standard deviation of the training
data, which is of type (Vector, Double). |
static <T extends Vector> |
StandardScaler.fitLabelVectorTupleStandardScaler(BreezeVectorConverter<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
Trains the
StandardScaler by learning the mean and standard deviation of the training
data, which is of type (Vector, Double). |
<T extends Vector> |
StandardScaler$.transformTupleVectorDouble(BreezeVectorConverter<T> evidence$10,
TypeInformation<T> evidence$11,
scala.reflect.ClassTag<T> evidence$12)
|
static <T extends Vector> |
StandardScaler.transformTupleVectorDouble(BreezeVectorConverter<T> evidence$10,
TypeInformation<T> evidence$11,
scala.reflect.ClassTag<T> evidence$12)
|
static <T extends Vector> |
PolynomialFeatures.transformVectorIntoPolynomialBase(VectorBuilder<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
TransformDataSetOperation to map a Vector into the
polynomial feature space. |
<T extends Vector> |
PolynomialFeatures$.transformVectorIntoPolynomialBase(VectorBuilder<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
TransformDataSetOperation to map a Vector into the
polynomial feature space. |
<T extends Vector> |
StandardScaler$.transformVectors(BreezeVectorConverter<T> evidence$7,
TypeInformation<T> evidence$8,
scala.reflect.ClassTag<T> evidence$9)
TransformOperation to transform Vector types |
<T extends Vector> |
MinMaxScaler$.transformVectors(BreezeVectorConverter<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
TransformDataSetOperation which scales input data of a subtype of Vector with
respect to the calculated minimum and maximum of the training data. |
static <T extends Vector> |
MinMaxScaler.transformVectors(BreezeVectorConverter<T> evidence$1,
TypeInformation<T> evidence$2,
scala.reflect.ClassTag<T> evidence$3)
TransformDataSetOperation which scales input data of a subtype of Vector with
respect to the calculated minimum and maximum of the training data. |
static <T extends Vector> |
StandardScaler.transformVectors(BreezeVectorConverter<T> evidence$7,
TypeInformation<T> evidence$8,
scala.reflect.ClassTag<T> evidence$9)
TransformOperation to transform Vector types |
Constructor and Description |
---|
StandardScalerTransformOperation(TypeInformation<T> evidence$4,
scala.reflect.ClassTag<T> evidence$5) |
Constructor and Description |
---|
NoOpBinaryUdfOp(TypeInformation<OUT> type) |
Modifier and Type | Field and Description |
---|---|
TypeInformation<?> |
PythonOperationInfo.types |
Modifier and Type | Method and Description |
---|---|
TypeInformation<OUT> |
PythonMapPartition.getProducedType() |
TypeInformation<OUT> |
PythonCoGroup.getProducedType() |
Constructor and Description |
---|
PythonCoGroup(int id,
TypeInformation<OUT> typeInformation) |
PythonMapPartition(int id,
TypeInformation<OUT> typeInformation) |
Modifier and Type | Method and Description |
---|---|
static TypeInformation<?>[] |
StreamProjection.extractFieldTypes(int[] fields,
TypeInformation<?> inType) |
TypeInformation<T> |
WindowedStream.getInputType() |
TypeInformation<T> |
AllWindowedStream.getInputType() |
TypeInformation<KEY> |
KeyedStream.getKeyType()
Gets the type of the key by which the stream is partitioned.
|
TypeInformation<T> |
DataStream.getType()
Gets the type of the stream.
|
TypeInformation<IN1> |
ConnectedStreams.getType1()
Gets the type of the first input
|
TypeInformation<IN2> |
ConnectedStreams.getType2()
Gets the type of the second input
|
Modifier and Type | Method and Description |
---|---|
<R> SingleOutputStreamOperator<R> |
AllWindowedStream.apply(AllWindowFunction<T,R,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<T> DataStream<T> |
CoGroupedStreams.WithWindow.apply(CoGroupFunction<T1,T2,T> function,
TypeInformation<T> resultType)
Completes the co-group operation with the user function that is executed
for windowed groups.
|
<T> DataStream<T> |
JoinedStreams.WithWindow.apply(FlatJoinFunction<T1,T2,T> function,
TypeInformation<T> resultType)
Completes the join operation with the user function that is executed
for each combination of elements with the same key in a window.
|
<T> DataStream<T> |
JoinedStreams.WithWindow.apply(JoinFunction<T1,T2,T> function,
TypeInformation<T> resultType)
Completes the join operation with the user function that is executed
for each combination of elements with the same key in a window.
|
<R> SingleOutputStreamOperator<R> |
AllWindowedStream.apply(ReduceFunction<T> preAggregator,
AllWindowFunction<T,R,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<R> SingleOutputStreamOperator<R> |
WindowedStream.apply(ReduceFunction<T> reduceFunction,
WindowFunction<T,R,K,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<R> SingleOutputStreamOperator<R> |
AllWindowedStream.apply(R initialValue,
FoldFunction<T,R> foldFunction,
AllWindowFunction<R,R,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<R> SingleOutputStreamOperator<R> |
WindowedStream.apply(R initialValue,
FoldFunction<T,R> foldFunction,
WindowFunction<R,R,K,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
<R> SingleOutputStreamOperator<R> |
WindowedStream.apply(WindowFunction<T,R,K,W> function,
TypeInformation<R> resultType)
Applies the given window function to each window.
|
static TypeInformation<?>[] |
StreamProjection.extractFieldTypes(int[] fields,
TypeInformation<?> inType) |
<R> SingleOutputStreamOperator<R> |
WindowedStream.fold(R initialValue,
FoldFunction<T,R> function,
TypeInformation<R> resultType)
Applies the given fold function to each window.
|
<R> SingleOutputStreamOperator<R> |
AllWindowedStream.fold(R initialValue,
FoldFunction<T,R> function,
TypeInformation<R> resultType)
Applies the given fold function to each window.
|
SingleOutputStreamOperator<T> |
SingleOutputStreamOperator.returns(TypeInformation<T> typeInfo)
Adds a type information hint about the return type of this operator.
|
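The reason a `returns(TypeInformation)` hint exists at all is Java type erasure: generic type parameters are gone at runtime, so the framework cannot always recover an operator's element type from the user function alone. A self-contained demonstration of the underlying limitation:

```java
import java.util.ArrayList;
import java.util.List;

// Java erases generic type parameters at runtime: a List<String> and a
// List<Integer> share the same runtime class, so element types cannot be
// recovered reflectively from the value alone. Hence explicit type hints.
final class ErasureDemo {
    static boolean sameRuntimeClass() {
        List<String> a = new ArrayList<>();
        List<Integer> b = new ArrayList<>();
        return a.getClass() == b.getClass(); // true, due to erasure
    }
}
```

When Flink's type extraction runs into an erased generic (e.g. a lambda returning `Tuple2<String, T>`), the `returns` hint supplies what reflection cannot.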
<R> SingleOutputStreamOperator<R> |
KeyedStream.transform(String operatorName,
TypeInformation<R> outTypeInfo,
OneInputStreamOperator<T,R> operator) |
<R> SingleOutputStreamOperator<R> |
DataStream.transform(String operatorName,
TypeInformation<R> outTypeInfo,
OneInputStreamOperator<T,R> operator)
Method for passing user defined operators along with the type
information that will transform the DataStream.
|
<R> SingleOutputStreamOperator<R> |
ConnectedStreams.transform(String functionName,
TypeInformation<R> outTypeInfo,
TwoInputStreamOperator<IN1,IN2,R> operator) |
<F> IterativeStream.ConnectedIterativeStreams<T,F> |
IterativeStream.withFeedbackType(TypeInformation<F> feedbackType)
Changes the feedback type of the iteration and allows the user to apply
co-transformations on the input and feedback stream, as in a
ConnectedStreams . |
Modifier and Type | Method and Description |
---|---|
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.addSource(SourceFunction<OUT> function,
String sourceName,
TypeInformation<OUT> typeInfo)
Adds a data source with a custom type information, thus opening a
DataStream . |
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.addSource(SourceFunction<OUT> function,
TypeInformation<OUT> typeInfo)
Adds a data source with a custom type information, thus opening a
DataStream . |
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.createInput(InputFormat<OUT,?> inputFormat,
TypeInformation<OUT> typeInfo)
Generic method to create an input data stream with
InputFormat . |
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.fromCollection(Collection<OUT> data,
TypeInformation<OUT> typeInfo)
Creates a data stream from the given non-empty collection.
|
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.fromCollection(Iterator<OUT> data,
TypeInformation<OUT> typeInfo)
Creates a data stream from the given iterator.
|
<OUT> DataStreamSource<OUT> |
StreamExecutionEnvironment.fromParallelCollection(SplittableIterator<OUT> iterator,
TypeInformation<OUT> typeInfo)
Creates a new data stream that contains elements in the iterator.
|
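The `fromCollection(Iterator, TypeInformation)` variants above need an explicit type argument because, unlike a non-empty `Collection` whose first element can be inspected, an `Iterator` exposes no element type at runtime. A sketch of why the hint must come from the caller (hypothetical source type; a `Class<T>` token stands in for Flink's `TypeInformation<T>`):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Sketch: a source built from an Iterator cannot infer its element type,
// so the caller supplies an explicit token (here Class<T> stands in for
// TypeInformation<T>).
final class IteratorSourceSketch<T> {
    final Iterator<T> data;
    final Class<T> elementType; // explicit hint; not recoverable from `data`

    IteratorSourceSketch(Iterator<T> data, Class<T> elementType) {
        this.data = data;
        this.elementType = elementType;
    }

    List<T> collect() {
        List<T> out = new ArrayList<>();
        data.forEachRemaining(out::add);
        return out;
    }
}
```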
Constructor and Description |
---|
ComparableAggregator(int positionToAggregate,
TypeInformation<T> typeInfo,
AggregationFunction.AggregationType aggregationType,
boolean first,
ExecutionConfig config) |
ComparableAggregator(int positionToAggregate,
TypeInformation<T> typeInfo,
AggregationFunction.AggregationType aggregationType,
ExecutionConfig config) |
ComparableAggregator(String field,
TypeInformation<T> typeInfo,
AggregationFunction.AggregationType aggregationType,
boolean first,
ExecutionConfig config) |
SumAggregator(int pos,
TypeInformation<T> typeInfo,
ExecutionConfig config) |
SumAggregator(String field,
TypeInformation<T> typeInfo,
ExecutionConfig config) |
Modifier and Type | Method and Description |
---|---|
void |
OutputFormatSinkFunction.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<OUT> |
ConnectorSource.getProducedType() |
Constructor and Description |
---|
FileSourceFunction(InputFormat<OUT,?> format,
TypeInformation<OUT> typeInfo) |
MessageAcknowledgingSourceBase(TypeInformation<UId> idTypeInfo)
Creates a new MessageAcknowledgingSourceBase for IDs of the given type.
|
MultipleIdsMessageAcknowledgingSourceBase(TypeInformation<UId> idTypeInfo)
Creates a new MultipleIdsMessageAcknowledgingSourceBase for IDs of the given type.
|
Modifier and Type | Method and Description |
---|---|
void |
FoldApplyWindowFunction.setOutputType(TypeInformation<ACC> outTypeInfo,
ExecutionConfig executionConfig) |
void |
FoldApplyAllWindowFunction.setOutputType(TypeInformation<ACC> outTypeInfo,
ExecutionConfig executionConfig) |
Modifier and Type | Method and Description |
---|---|
<IN1,IN2,OUT> |
StreamGraph.addCoOperator(Integer vertexID,
String slotSharingGroup,
TwoInputStreamOperator<IN1,IN2,OUT> taskOperatorObject,
TypeInformation<IN1> in1TypeInfo,
TypeInformation<IN2> in2TypeInfo,
TypeInformation<OUT> outTypeInfo,
String operatorName) |
<IN,OUT> void |
StreamGraph.addOperator(Integer vertexID,
String slotSharingGroup,
StreamOperator<OUT> operatorObject,
TypeInformation<IN> inTypeInfo,
TypeInformation<OUT> outTypeInfo,
String operatorName) |
<IN,OUT> void |
StreamGraph.addSink(Integer vertexID,
String slotSharingGroup,
StreamOperator<OUT> operatorObject,
TypeInformation<IN> inTypeInfo,
TypeInformation<OUT> outTypeInfo,
String operatorName) |
<IN,OUT> void |
StreamGraph.addSource(Integer vertexID,
String slotSharingGroup,
StreamOperator<OUT> operatorObject,
TypeInformation<IN> inTypeInfo,
TypeInformation<OUT> outTypeInfo,
String operatorName) |
<OUT> void |
StreamGraph.setOutType(Integer vertexID,
TypeInformation<OUT> outType) |
Modifier and Type | Method and Description |
---|---|
<S> OperatorState<S> |
StreamingRuntimeContext.getKeyValueState(String name,
TypeInformation<S> stateType,
S defaultState)
Deprecated.
|
void |
StreamGroupedFold.setOutputType(TypeInformation<OUT> outTypeInfo,
ExecutionConfig executionConfig) |
void |
OutputTypeConfigurable.setOutputType(TypeInformation<OUT> outTypeInfo,
ExecutionConfig executionConfig)
Is called by the org.apache.flink.streaming.api.graph.StreamGraph#addOperator(Integer, StreamOperator, TypeInformation, TypeInformation, String) method when the StreamGraph is generated. |
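The callback above can be sketched without Flink: the graph builder checks whether an operator implements the interface and, if so, pushes the inferred output type in after construction but before execution. In this sketch `Class<OUT>` stands in for `TypeInformation<OUT>`, and `FoldOperator` is an illustrative name, not Flink's class:

```java
public class GraphDemo {

    // Stand-in for OutputTypeConfigurable: operators that cannot know their
    // output type at construction time receive it from the graph builder.
    interface OutputTypeConfigurable<OUT> {
        void setOutputType(Class<OUT> outType);
    }

    static class FoldOperator implements OutputTypeConfigurable<Long> {
        Class<Long> outputType;

        @Override
        public void setOutputType(Class<Long> outType) {
            // Invoked once, when the graph is generated.
            this.outputType = outType;
        }
    }

    public static void main(String[] args) {
        FoldOperator op = new FoldOperator();
        // What the builder would do while adding the operator to the graph:
        if (op instanceof OutputTypeConfigurable) {
            op.setOutputType(Long.class);
        }
        System.out.println(op.outputType); // class java.lang.Long
    }
}
```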
void |
AbstractUdfStreamOperator.setOutputType(TypeInformation<OUT> outTypeInfo,
ExecutionConfig executionConfig) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<T> |
DataStream.dataType()
Returns the TypeInformation for the elements of this DataStream.
|
TypeInformation<K> |
KeyedStream.getKeyType()
Gets the type of the key by which this stream is keyed.
|
TypeInformation<K> |
KeySelectorWithType.getProducedType() |
TypeInformation<T> |
DataStream.getType()
Deprecated.
Use
dataType instead. |
Modifier and Type | Method and Description |
---|---|
<T> DataStream<T> |
StreamExecutionEnvironment.addSource(scala.Function1<SourceFunction.SourceContext<T>,scala.runtime.BoxedUnit> function,
TypeInformation<T> evidence$8)
Create a DataStream using a user defined source function for arbitrary
source functionality.
|
<T> DataStream<T> |
StreamExecutionEnvironment.addSource(SourceFunction<T> function,
TypeInformation<T> evidence$7)
Create a DataStream using a user defined source function for arbitrary
source functionality.
|
<R> DataStream<R> |
AllWindowedStream.apply(AllWindowFunction<T,R,W> function,
TypeInformation<R> evidence$3)
Applies the given window function to each window.
|
<T> DataStream<T> |
CoGroupedStreams.Where.EqualTo.WithWindow.apply(CoGroupFunction<T1,T2,T> function,
TypeInformation<T> evidence$4)
Completes the co-group operation with the user function that is executed
for windowed groups.
|
<T> DataStream<T> |
JoinedStreams.Where.EqualTo.WithWindow.apply(FlatJoinFunction<T1,T2,T> function,
TypeInformation<T> evidence$5)
Completes the join operation with the user function that is executed
for windowed groups.
|
<O> DataStream<O> |
CoGroupedStreams.Where.EqualTo.WithWindow.apply(scala.Function2<scala.collection.Iterator<T1>,scala.collection.Iterator<T2>,O> fun,
TypeInformation<O> evidence$2)
Completes the co-group operation with the user function that is executed
for windowed groups.
|
<O> DataStream<O> |
JoinedStreams.Where.EqualTo.WithWindow.apply(scala.Function2<T1,T2,O> fun,
TypeInformation<O> evidence$2)
Completes the join operation with the user function that is executed
for windowed groups.
|
<R> DataStream<R> |
AllWindowedStream.apply(scala.Function2<T,T,T> preAggregator,
scala.Function3<W,scala.collection.Iterable<T>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<R> evidence$6)
Applies the given window function to each window.
|
<R> DataStream<R> |
WindowedStream.apply(scala.Function2<T,T,T> preAggregator,
scala.Function4<K,W,scala.collection.Iterable<T>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<R> evidence$6)
Applies the given window function to each window.
|
<O> DataStream<O> |
CoGroupedStreams.Where.EqualTo.WithWindow.apply(scala.Function3<scala.collection.Iterator<T1>,scala.collection.Iterator<T2>,Collector<O>,scala.runtime.BoxedUnit> fun,
TypeInformation<O> evidence$3)
Completes the co-group operation with the user function that is executed
for windowed groups.
|
<O> DataStream<O> |
JoinedStreams.Where.EqualTo.WithWindow.apply(scala.Function3<T1,T2,Collector<O>,scala.runtime.BoxedUnit> fun,
TypeInformation<O> evidence$3)
Completes the join operation with the user function that is executed
for windowed groups.
|
<R> DataStream<R> |
AllWindowedStream.apply(scala.Function3<W,scala.collection.Iterable<T>,Collector<R>,scala.runtime.BoxedUnit> function,
TypeInformation<R> evidence$4)
Applies the given window function to each window.
|
<R> DataStream<R> |
WindowedStream.apply(scala.Function4<K,W,scala.collection.Iterable<T>,Collector<R>,scala.runtime.BoxedUnit> function,
TypeInformation<R> evidence$4)
Applies the given window function to each window.
|
<T> DataStream<T> |
JoinedStreams.Where.EqualTo.WithWindow.apply(JoinFunction<T1,T2,T> function,
TypeInformation<T> evidence$4)
Completes the join operation with the user function that is executed
for windowed groups.
|
<R> DataStream<R> |
AllWindowedStream.apply(ReduceFunction<T> preAggregator,
AllWindowFunction<T,R,W> windowFunction,
TypeInformation<R> evidence$5)
Applies the given window function to each window.
|
<R> DataStream<R> |
WindowedStream.apply(ReduceFunction<T> preAggregator,
WindowFunction<T,R,K,W> function,
TypeInformation<R> evidence$5)
Applies the given window function to each window.
|
<R> DataStream<R> |
AllWindowedStream.apply(R initialValue,
FoldFunction<T,R> preAggregator,
AllWindowFunction<R,R,W> windowFunction,
TypeInformation<R> evidence$7)
Applies the given window function to each window.
|
<R> DataStream<R> |
WindowedStream.apply(R initialValue,
FoldFunction<T,R> foldFunction,
WindowFunction<R,R,K,W> function,
TypeInformation<R> evidence$7)
Applies the given window function to each window.
|
<R> DataStream<R> |
AllWindowedStream.apply(R initialValue,
scala.Function2<R,T,R> preAggregator,
scala.Function3<W,scala.collection.Iterable<R>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<R> evidence$8)
Applies the given window function to each window.
|
<R> DataStream<R> |
WindowedStream.apply(R initialValue,
scala.Function2<R,T,R> foldFunction,
scala.Function4<K,W,scala.collection.Iterable<R>,Collector<R>,scala.runtime.BoxedUnit> windowFunction,
TypeInformation<R> evidence$8)
Applies the given window function to each window.
|
<R> DataStream<R> |
WindowedStream.apply(WindowFunction<T,R,K,W> function,
TypeInformation<R> evidence$3)
Applies the given window function to each window.
|
<T> DataStream<T> |
StreamExecutionEnvironment.createInput(InputFormat<T,?> inputFormat,
TypeInformation<T> evidence$6)
Generic method to create an input data stream with a specific input format.
|
<S> DataStream<T> |
KeyedStream.filterWithState(scala.Function2<T,scala.Option<S>,scala.Tuple2<Object,scala.Option<S>>> fun,
TypeInformation<S> evidence$3)
Creates a new DataStream that contains only the elements satisfying the given stateful filter
predicate.
|
<R> DataStream<R> |
ConnectedStreams.flatMap(CoFlatMapFunction<IN1,IN2,R> coFlatMapper,
TypeInformation<R> evidence$3)
Applies a CoFlatMap transformation on these connected streams.
|
<R> DataStream<R> |
DataStream.flatMap(FlatMapFunction<T,R> flatMapper,
TypeInformation<R> evidence$8)
Creates a new DataStream by applying the given function to every element and flattening
the results.
|
<R> DataStream<R> |
ConnectedStreams.flatMap(scala.Function1<IN1,scala.collection.TraversableOnce<R>> fun1,
scala.Function1<IN2,scala.collection.TraversableOnce<R>> fun2,
TypeInformation<R> evidence$5)
Applies a CoFlatMap transformation on the connected streams.
|
<R> DataStream<R> |
DataStream.flatMap(scala.Function1<T,scala.collection.TraversableOnce<R>> fun,
TypeInformation<R> evidence$10)
Creates a new DataStream by applying the given function to every element and flattening
the results.
|
<R> DataStream<R> |
ConnectedStreams.flatMap(scala.Function2<IN1,Collector<R>,scala.runtime.BoxedUnit> fun1,
scala.Function2<IN2,Collector<R>,scala.runtime.BoxedUnit> fun2,
TypeInformation<R> evidence$4)
Applies a CoFlatMap transformation on the connected streams.
|
<R> DataStream<R> |
DataStream.flatMap(scala.Function2<T,Collector<R>,scala.runtime.BoxedUnit> fun,
TypeInformation<R> evidence$9)
Creates a new DataStream by applying the given function to every element and flattening
the results.
|
<R,S> DataStream<R> |
KeyedStream.flatMapWithState(scala.Function2<T,scala.Option<S>,scala.Tuple2<scala.collection.TraversableOnce<R>,scala.Option<S>>> fun,
TypeInformation<R> evidence$6,
TypeInformation<S> evidence$7)
Creates a new DataStream by applying the given stateful function to every element and
flattening the results.
|
<R> DataStream<R> |
AllWindowedStream.fold(R initialValue,
FoldFunction<T,R> function,
TypeInformation<R> evidence$1)
Applies the given fold function to each window.
|
<R> DataStream<R> |
WindowedStream.fold(R initialValue,
FoldFunction<T,R> function,
TypeInformation<R> evidence$1)
Applies the given fold function to each window.
|
<R> DataStream<R> |
KeyedStream.fold(R initialValue,
FoldFunction<T,R> folder,
TypeInformation<R> evidence$1)
Creates a new DataStream by folding the elements of this DataStream using an associative fold function and an initial value. |
<R> DataStream<R> |
AllWindowedStream.fold(R initialValue,
scala.Function2<R,T,R> function,
TypeInformation<R> evidence$2)
Applies the given fold function to each window.
|
<R> DataStream<R> |
WindowedStream.fold(R initialValue,
scala.Function2<R,T,R> function,
TypeInformation<R> evidence$2)
Applies the given fold function to each window.
|
<R> DataStream<R> |
KeyedStream.fold(R initialValue,
scala.Function2<R,T,R> fun,
TypeInformation<R> evidence$2)
Creates a new DataStream by folding the elements of this DataStream using an associative fold function and an initial value. |
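The fold these methods perform threads an accumulator through the elements, starting from the initial value; KeyedStream.fold does this per key, the windowed variants per window. A plain-Java sketch of that semantics over a list (no Flink dependency):

```java
import java.util.List;
import java.util.function.BiFunction;

public class FoldDemo {

    // fold(initial, f, [e1, e2, ...]) == f(f(f(initial, e1), e2), ...)
    static <T, R> R fold(R initial, BiFunction<R, T, R> folder, List<T> elements) {
        R acc = initial;
        for (T e : elements) {
            acc = folder.apply(acc, e);
        }
        return acc;
    }

    public static void main(String[] args) {
        // Running concatenation, analogous to fold("start:", (acc, x) -> acc + x)
        // on a stream of strings.
        String result = fold("start:", (acc, x) -> acc + x, List.of("a", "b", "c"));
        System.out.println(result); // start:abc
    }
}
```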
<T> DataStream<T> |
StreamExecutionEnvironment.fromCollection(scala.collection.Iterator<T> data,
TypeInformation<T> evidence$3)
Creates a DataStream from the given Iterator. |
<T> DataStream<T> |
StreamExecutionEnvironment.fromCollection(scala.collection.Seq<T> data,
TypeInformation<T> evidence$2)
Creates a DataStream from the given non-empty Seq. |
<T> DataStream<T> |
StreamExecutionEnvironment.fromElements(scala.collection.Seq<T> data,
TypeInformation<T> evidence$1)
Creates a DataStream that contains the given elements.
|
<T> DataStream<T> |
StreamExecutionEnvironment.fromParallelCollection(SplittableIterator<T> data,
TypeInformation<T> evidence$4)
Creates a DataStream from the given SplittableIterator. |
<R,F> DataStream<R> |
DataStream.iterate(scala.Function1<ConnectedStreams<T,F>,scala.Tuple2<DataStream<F>,DataStream<R>>> stepFunction,
long maxWaitTimeMillis,
TypeInformation<F> evidence$5)
Initiates an iterative part of the program that creates a loop by feeding
back data streams.
|
<K1,K2> ConnectedStreams<IN1,IN2> |
ConnectedStreams.keyBy(scala.Function1<IN1,K1> fun1,
scala.Function1<IN2,K2> fun2,
TypeInformation<K1> evidence$6,
TypeInformation<K2> evidence$7)
Keys the two connected streams together.
|
<K> KeyedStream<T,K> |
DataStream.keyBy(scala.Function1<T,K> fun,
TypeInformation<K> evidence$1)
Groups the elements of a DataStream by the given K key to
be used with grouped operators like grouped reduce or grouped aggregations.
|
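The grouping that keyBy establishes can be illustrated by its batch analogue: collect elements into per-key groups using the key the function extracts. A plain-Java sketch, with the first character as a hypothetical key:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class KeyByDemo {

    // keyBy(fun) partitions a stream by the extracted key so that grouped
    // operators see all elements with the same key together; groupingBy is
    // the batch counterpart of that partitioning.
    static Map<Character, List<String>> groupByFirstChar(List<String> words) {
        return words.stream().collect(Collectors.groupingBy(w -> w.charAt(0)));
    }

    public static void main(String[] args) {
        Map<Character, List<String>> groups =
                groupByFirstChar(List.of("flink", "spark", "storm", "samza"));
        System.out.println(groups.get('s')); // [spark, storm, samza]
    }
}
```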
<R> DataStream<R> |
ConnectedStreams.map(CoMapFunction<IN1,IN2,R> coMapper,
TypeInformation<R> evidence$2)
Applies a CoMap transformation on these connected streams.
|
<R> DataStream<R> |
ConnectedStreams.map(scala.Function1<IN1,R> fun1,
scala.Function1<IN2,R> fun2,
TypeInformation<R> evidence$1)
Applies a CoMap transformation on the connected streams.
|
<R> DataStream<R> |
DataStream.map(scala.Function1<T,R> fun,
TypeInformation<R> evidence$6)
Creates a new DataStream by applying the given function to every element of this DataStream.
|
<R> DataStream<R> |
DataStream.map(MapFunction<T,R> mapper,
TypeInformation<R> evidence$7)
Creates a new DataStream by applying the given function to every element of this DataStream.
|
<R,S> DataStream<R> |
KeyedStream.mapWithState(scala.Function2<T,scala.Option<S>,scala.Tuple2<R,scala.Option<S>>> fun,
TypeInformation<R> evidence$4,
TypeInformation<S> evidence$5)
Creates a new DataStream by applying the given stateful function to every element of this
DataStream.
|
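mapWithState passes each element together with an Option of the current state and expects the result plus the new state back. A self-contained sketch of that contract, run over a plain list as if for a single key (no Flink dependency; the `Out` holder and `COUNT_TAG` function are illustrative names):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.function.BiFunction;

public class MapWithStateDemo {

    // Holder for the (result, new state) pair the user function returns.
    static final class Out<R, S> {
        final R result;
        final Optional<S> state;
        Out(R result, Optional<S> state) { this.result = result; this.state = state; }
    }

    // Runs the (element, Option[state]) -> (result, Option[newState]) contract
    // sequentially, the way a keyed operator would for one key.
    static <T, R, S> List<R> mapWithState(
            List<T> input, BiFunction<T, Optional<S>, Out<R, S>> fun) {
        List<R> results = new ArrayList<>();
        Optional<S> state = Optional.empty();   // no state before the first element
        for (T element : input) {
            Out<R, S> out = fun.apply(element, state);
            results.add(out.result);
            state = out.state;                  // the returned Option replaces the state
        }
        return results;
    }

    // Example user function: tag each element with a running count.
    static final BiFunction<String, Optional<Long>, Out<String, Long>> COUNT_TAG =
            (element, count) -> {
                long n = count.orElse(0L) + 1;
                return new Out<>(element + n, Optional.of(n));
            };

    public static void main(String[] args) {
        List<String> seen = mapWithState(List.of("a", "b", "c"), COUNT_TAG);
        System.out.println(seen); // [a1, b2, c3]
    }
}
```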
<K> DataStream<T> |
DataStream.partitionCustom(Partitioner<K> partitioner,
scala.Function1<T,K> fun,
TypeInformation<K> evidence$4)
Partitions a DataStream on the key returned by the selector, using a custom partitioner.
|
<K> DataStream<T> |
DataStream.partitionCustom(Partitioner<K> partitioner,
int field,
TypeInformation<K> evidence$2)
Partitions a tuple DataStream on the specified key fields using a custom partitioner.
|
<K> DataStream<T> |
DataStream.partitionCustom(Partitioner<K> partitioner,
String field,
TypeInformation<K> evidence$3)
Partitions a POJO DataStream on the specified key fields using a custom partitioner.
|
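All three partitionCustom overloads route each element to the channel a user-supplied Partitioner picks for its key: a function from (key, numPartitions) to a channel index in [0, numPartitions). A plain-Java sketch of that contract (the by-length partitioner is illustrative, not a recommended strategy):

```java
public class PartitionDemo {

    // Mirrors the Partitioner<K> contract: map a key to a channel index.
    interface Partitioner<K> {
        int partition(K key, int numPartitions);
    }

    // Illustrative partitioner: route by string length instead of key hash.
    static final Partitioner<String> BY_LENGTH = (key, n) -> key.length() % n;

    public static void main(String[] args) {
        System.out.println(BY_LENGTH.partition("flink", 4)); // 5 % 4 = 1
        System.out.println(BY_LENGTH.partition("ab", 4));    // 2 % 4 = 2
    }
}
```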
<T> DataStream<T> |
StreamExecutionEnvironment.readFile(FileInputFormat<T> inputFormat,
String filePath,
TypeInformation<T> evidence$5)
Reads the given file with the given input format.
|
<R> DataStream<R> |
DataStream.transform(String operatorName,
OneInputStreamOperator<T,R> operator,
TypeInformation<R> evidence$11)
Transforms the DataStream by using a custom OneInputStreamOperator. |
<R> DataStream<R> |
ConnectedStreams.transform(String functionName,
TwoInputStreamOperator<IN1,IN2,R> operator,
TypeInformation<R> evidence$8) |
<KEY> CoGroupedStreams.Where<KEY> |
CoGroupedStreams.where(scala.Function1<T1,KEY> keySelector,
TypeInformation<KEY> evidence$1)
Specifies a KeySelector for elements from the first input. |
<KEY> JoinedStreams.Where<KEY> |
JoinedStreams.where(scala.Function1<T1,KEY> keySelector,
TypeInformation<KEY> evidence$1)
Specifies a KeySelector for elements from the first input. |
Constructor and Description |
---|
KeySelectorWithType(scala.Function1<IN,K> fun,
TypeInformation<K> info) |
Where(KeySelector<T1,KEY> keySelector1,
TypeInformation<KEY> keyType) |
Modifier and Type | Field and Description |
---|---|
protected TypeInformation<T> |
StreamTransformation.outputType |
Modifier and Type | Method and Description |
---|---|
TypeInformation<IN> |
OneInputTransformation.getInputType()
Returns the TypeInformation for the elements of the input. |
TypeInformation<IN1> |
TwoInputTransformation.getInputType1()
Returns the TypeInformation for the elements from the first input. |
TypeInformation<IN2> |
TwoInputTransformation.getInputType2()
Returns the TypeInformation for the elements from the second input. |
TypeInformation<T> |
StreamTransformation.getOutputType()
Returns the output type of this StreamTransformation as a TypeInformation. |
TypeInformation<?> |
TwoInputTransformation.getStateKeyType() |
TypeInformation<?> |
SinkTransformation.getStateKeyType() |
TypeInformation<?> |
OneInputTransformation.getStateKeyType() |
Modifier and Type | Method and Description |
---|---|
void |
StreamTransformation.setOutputType(TypeInformation<T> outputType)
Tries to fill in the type information.
|
void |
TwoInputTransformation.setStateKeyType(TypeInformation<?> stateKeyType) |
void |
SinkTransformation.setStateKeyType(TypeInformation<?> stateKeyType) |
void |
OneInputTransformation.setStateKeyType(TypeInformation<?> stateKeyType) |
Constructor and Description |
---|
CoFeedbackTransformation(int parallelism,
TypeInformation<F> feedbackType,
Long waitTime)
Creates a new CoFeedbackTransformation from the given input. |
OneInputTransformation(StreamTransformation<IN> input,
String name,
OneInputStreamOperator<IN,OUT> operator,
TypeInformation<OUT> outputType,
int parallelism)
Creates a new OneInputTransformation from the given input and operator. |
SourceTransformation(String name,
StreamSource<T,?> operator,
TypeInformation<T> outputType,
int parallelism)
Creates a new SourceTransformation from the given operator. |
StreamTransformation(String name,
TypeInformation<T> outputType,
int parallelism)
Creates a new StreamTransformation with the given name, output type, and parallelism. |
TwoInputTransformation(StreamTransformation<IN1> input1,
StreamTransformation<IN2> input2,
String name,
TwoInputStreamOperator<IN1,IN2,OUT> operator,
TypeInformation<OUT> outputType,
int parallelism)
Creates a new TwoInputTransformation from the given inputs and operator. |
Modifier and Type | Method and Description |
---|---|
<S extends Serializable> |
Trigger.TriggerContext.getKeyValueState(String name,
TypeInformation<S> stateType,
S defaultState)
Deprecated.
|
Modifier and Type | Method and Description |
---|---|
void |
SequenceFileWriter.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig) |
void |
RollingSink.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<T> |
FlinkKafkaConsumerBase.getProducedType() |
Modifier and Type | Method and Description |
---|---|
TypeInformation<OUT> |
RMQSource.getProducedType() |
Modifier and Type | Method and Description |
---|---|
<S extends Serializable> |
WindowOperator.Context.getKeyValueState(String name,
TypeInformation<S> stateType,
S defaultState) |
<S extends Serializable> |
NonKeyedWindowOperator.Context.getKeyValueState(String name,
TypeInformation<S> stateType,
S defaultState) |
void |
WindowOperator.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig) |
void |
NonKeyedWindowOperator.setInputType(TypeInformation<?> type,
ExecutionConfig executionConfig) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<F> |
FieldAccessor.getFieldType() |
Modifier and Type | Method and Description |
---|---|
static <R,F> FieldAccessor<R,F> |
FieldAccessor.create(int pos,
TypeInformation<R> typeInfo,
ExecutionConfig config) |
static <R,F> FieldAccessor<R,F> |
FieldAccessor.create(String field,
TypeInformation<R> typeInfo,
ExecutionConfig config) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Tuple> |
KeySelectorUtil.ComparableKeySelector.getProducedType() |
TypeInformation<Tuple> |
KeySelectorUtil.ArrayKeySelector.getProducedType() |
Modifier and Type | Method and Description |
---|---|
static <X> KeySelectorUtil.ArrayKeySelector<X> |
KeySelectorUtil.getSelectorForArray(int[] positions,
TypeInformation<X> typeInfo) |
static <X> KeySelector<X,Tuple> |
KeySelectorUtil.getSelectorForKeys(Keys<X> keys,
TypeInformation<X> typeInfo,
ExecutionConfig executionConfig) |
static <X,K> KeySelector<X,K> |
KeySelectorUtil.getSelectorForOneKey(Keys<X> keys,
Partitioner<K> partitioner,
TypeInformation<X> typeInfo,
ExecutionConfig executionConfig) |
Modifier and Type | Method and Description |
---|---|
TypeInformation<Tuple2<K,V>> |
TypeInformationKeyValueSerializationSchema.getProducedType() |
TypeInformation<T> |
KeyedDeserializationSchemaWrapper.getProducedType() |
TypeInformation<T> |
TypeInformationSerializationSchema.getProducedType() |
TypeInformation<String> |
SimpleStringSchema.getProducedType() |
Constructor and Description |
---|
TypeInformationKeyValueSerializationSchema(TypeInformation<K> keyTypeInfo,
TypeInformation<V> valueTypeInfo,
ExecutionConfig ec)
Creates a new de-/serialization schema for the given types.
|
TypeInformationSerializationSchema(TypeInformation<T> typeInfo,
ExecutionConfig ec)
Creates a new de-/serialization schema for the given type.
|
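A de-/serialization schema pairs serialize(T) -> byte[] with deserialize(byte[]) -> T for one fixed type; TypeInformationSerializationSchema derives the actual serializer from the given TypeInformation. The round-trip contract can be sketched for plain strings without Flink (the method names here are illustrative):

```java
import java.nio.charset.StandardCharsets;

public class SchemaDemo {

    // serialize and deserialize must be inverses for the schema's type.
    static byte[] serialize(String value) {
        return value.getBytes(StandardCharsets.UTF_8);
    }

    static String deserialize(byte[] bytes) {
        return new String(bytes, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] wire = serialize("payload");   // what goes onto Kafka, etc.
        System.out.println(deserialize(wire)); // payload
    }
}
```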
Copyright © 2014–2017 The Apache Software Foundation. All rights reserved.