Modifier and Type | Class and Description |
---|---|
class | AbstractDeserializationSchema<T>: The deserialization schema describes how to turn the byte messages delivered by certain data sources (for example Apache Kafka) into data types (Java/Scala objects) that are processed by Flink. |
class | SimpleStringSchema: Very simple serialization schema for strings. |
class | TypeInformationSerializationSchema<T>: A serialization and deserialization schema that uses Flink's serialization stack to transform typed objects to and from byte arrays. |
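The classes above all implement the same core contract: a byte[] to T mapping invoked once per incoming message. A minimal dependency-free sketch of what SimpleStringSchema does for strings (UTF-8 decoding; the class and method names here are illustrative, not the Flink API):

```java
import java.nio.charset.StandardCharsets;

// Illustrative stand-in for the byte[] -> T contract of a deserialization schema.
// Mirrors what SimpleStringSchema does for strings: decode/encode UTF-8 bytes.
public class StringSchemaSketch {
    // byte[] -> String, as a source would invoke per record
    public static String deserialize(byte[] message) {
        return new String(message, StandardCharsets.UTF_8);
    }

    // String -> byte[], the serialization direction of the same schema
    public static byte[] serialize(String element) {
        return element.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] bytes = serialize("hello flink");
        System.out.println(deserialize(bytes)); // prints "hello flink"
    }
}
```

AbstractDeserializationSchema adds type-information plumbing on top of this mapping, but the per-record work is the same.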
Constructor and Description |
---|
DeserializationSchemaAdapter(DeserializationSchema<RowData> deserializationSchema) |
Constructor and Description |
---|
FileSystemTableSource(ObjectIdentifier tableIdentifier, DataType physicalRowDataType, List<String> partitionKeys, ReadableConfig tableOptions, DecodingFormat<BulkFormat<RowData,FileSourceSplit>> bulkReaderFormat, DecodingFormat<DeserializationSchema<RowData>> deserializationFormat) |

Modifier and Type | Method and Description |
---|---|
KafkaSourceBuilder<OUT> | KafkaSourceBuilder.setValueOnlyDeserializer(DeserializationSchema<OUT> deserializationSchema): Sets the deserializer of the ConsumerRecord for KafkaSource. |

Modifier and Type | Method and Description |
---|---|
static <V> KafkaRecordDeserializationSchema<V> | KafkaRecordDeserializationSchema.valueOnly(DeserializationSchema<V> valueDeserializationSchema): Wraps a DeserializationSchema as the value deserialization schema of the ConsumerRecords. |
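The valueOnly wrapper is pure adaptation: a consumed record carries both key bytes and value bytes, and the wrapped schema is applied to the value while the key is ignored. A dependency-free sketch of that shape (the record and method names are illustrative, not the Flink or Kafka API):

```java
import java.nio.charset.StandardCharsets;
import java.util.function.Function;

// Sketch of the "value-only" adaptation: a key/value record goes in, but only
// the value bytes are run through the wrapped byte[] -> V deserializer.
public class ValueOnlySketch {
    // Stand-in for a consumed record with raw key and value bytes.
    record RawRecord(byte[] key, byte[] value) {}

    // Wraps a plain byte[] -> V function into a record -> V function,
    // discarding the key, analogous to what valueOnly(...) does.
    static <V> Function<RawRecord, V> valueOnly(Function<byte[], V> valueDeserializer) {
        return rec -> valueDeserializer.apply(rec.value());
    }

    public static void main(String[] args) {
        Function<RawRecord, String> schema =
                valueOnly(bytes -> new String(bytes, StandardCharsets.UTF_8));
        RawRecord rec = new RawRecord(
                "ignored-key".getBytes(StandardCharsets.UTF_8),
                "payload".getBytes(StandardCharsets.UTF_8));
        System.out.println(schema.apply(rec)); // prints "payload"
    }
}
```

The real KafkaRecordDeserializationSchema also carries type information and an output collector, but the key-discarding wrap is the essence of valueOnly.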
Modifier and Type | Method and Description |
---|---|
static <T> PulsarDeserializationSchema<T> | PulsarDeserializationSchema.flinkSchema(DeserializationSchema<T> deserializationSchema): Creates a PulsarDeserializationSchema using Flink's DeserializationSchema. |

Modifier and Type | Method and Description |
---|---|
static void | SchemaTestUtils.open(DeserializationSchema<?> schema): Opens the given schema with a mock initialization context. |
Modifier and Type | Method and Description |
---|---|
static <K,V> Map<K,V> | UpsertTestFileUtil.readRecords(BufferedInputStream bis, DeserializationSchema<K> keyDeserializationSchema, DeserializationSchema<V> valueDeserializationSchema): Reads records that were written using the UpsertTestSinkWriter from the given InputStream and converts them using the provided DeserializationSchemas. |
static <K,V> Map<K,V> | UpsertTestFileUtil.readRecords(File file, DeserializationSchema<K> keyDeserializationSchema, DeserializationSchema<V> valueDeserializationSchema): Reads records that were written using the UpsertTestSinkWriter from the given File and converts them using the provided DeserializationSchemas. |
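These readRecords helpers pair one schema for keys with another for values while draining a stream into a Map, with later records for a key overwriting earlier ones (upsert semantics). The actual on-disk layout used by UpsertTestSinkWriter is not documented here; the sketch below assumes a hypothetical length-prefixed layout (4-byte length, key bytes, 4-byte length, value bytes) purely to illustrate the key/value-schema pairing:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.*;
import java.util.function.Function;

// Sketch of a readRecords-style helper: drain a stream of key/value pairs,
// decoding each side with its own byte[] -> T deserializer.
// The length-prefixed layout is an assumption for illustration only.
public class KeyValueReaderSketch {
    static <K, V> Map<K, V> readRecords(DataInputStream in,
                                        Function<byte[], K> keySchema,
                                        Function<byte[], V> valueSchema) throws IOException {
        Map<K, V> result = new HashMap<>();
        while (in.available() > 0) {
            byte[] key = new byte[in.readInt()];
            in.readFully(key);
            byte[] value = new byte[in.readInt()];
            in.readFully(value);
            // Upsert semantics: a later record for the same key replaces the earlier one.
            result.put(keySchema.apply(key), valueSchema.apply(value));
        }
        return result;
    }

    public static void main(String[] args) throws IOException {
        // Write two records for key "a" and one for "b"; the last "a" wins.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        for (String[] kv : new String[][] {{"a", "1"}, {"b", "2"}, {"a", "3"}}) {
            for (String s : kv) {
                byte[] b = s.getBytes(StandardCharsets.UTF_8);
                out.writeInt(b.length);
                out.write(b);
            }
        }
        Map<String, String> records = readRecords(
                new DataInputStream(new ByteArrayInputStream(buf.toByteArray())),
                b -> new String(b, StandardCharsets.UTF_8),
                b -> new String(b, StandardCharsets.UTF_8));
        System.out.println(records); // two entries: a=3 and b=2
    }
}
```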
Modifier and Type | Class and Description |
---|---|
class | AvroDeserializationSchema<T>: Deserialization schema that deserializes from Avro binary format. |
class | AvroRowDataDeserializationSchema: Deserialization schema from Avro bytes to RowData. |
class | AvroRowDeserializationSchema: Deprecated. The format was developed for Table API users and will no longer be maintained for DataStream API users. Either use the Table API or switch to the DataStream API, defining your own DeserializationSchema. |
class | RegistryAvroDeserializationSchema<T>: Deserialization schema that deserializes from Avro binary format using SchemaCoder. |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | AvroFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |

Constructor and Description |
---|
AvroRowDataDeserializationSchema(DeserializationSchema<org.apache.avro.generic.GenericRecord> nestedSchema, AvroToRowDataConverters.AvroToRowDataConverter runtimeConverter, TypeInformation<RowData> typeInfo): Creates an Avro deserialization schema for the given logical type. |
Modifier and Type | Class and Description |
---|---|
class | GlueSchemaRegistryAvroDeserializationSchema<T>: AWS Glue Schema Registry deserialization schema that deserializes the Avro binary format for Flink consumers. |

Modifier and Type | Class and Description |
---|---|
class | ConfluentRegistryAvroDeserializationSchema<T>: Deserialization schema that deserializes from Avro binary format using a SchemaCoder backed by the Confluent Schema Registry. |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | RegistryAvroFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |

Modifier and Type | Class and Description |
---|---|
class | DebeziumAvroDeserializationSchema: Deserialization schema from Debezium Avro to Flink Table/SQL internal data structure RowData. |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | DebeziumAvroFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |
Modifier and Type | Class and Description |
---|---|
class | CsvRowDataDeserializationSchema: Deserialization schema from CSV to Flink Table & SQL internal data structures. |
class | CsvRowDeserializationSchema: Deprecated. The format was developed for Table API users and will no longer be maintained for DataStream API users. Either use the Table API or switch to the DataStream API, defining your own DeserializationSchema. |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | CsvFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |
Modifier and Type | Class and Description |
---|---|
class | JsonDeserializationSchema<T>: DeserializationSchema that deserializes a JSON String. |
class | JsonNodeDeserializationSchema: Deprecated. Use new JsonDeserializationSchema(ObjectNode.class) instead. |
class | JsonRowDataDeserializationSchema: Deserialization schema from JSON to Flink Table/SQL internal data structure RowData. |
class | JsonRowDeserializationSchema: Deprecated. The format was developed for Table API users and will no longer be maintained for DataStream API users. Either use the Table API or switch to the DataStream API, defining your own DeserializationSchema. |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | JsonFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |
Modifier and Type | Class and Description |
---|---|
class | CanalJsonDeserializationSchema: Deserialization schema from Canal JSON to Flink Table/SQL internal data structure RowData. |

Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> | CanalJsonDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context, DataType physicalDataType, int[][] projections) |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | CanalJsonFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |
Modifier and Type | Class and Description |
---|---|
class | DebeziumJsonDeserializationSchema: Deserialization schema from Debezium JSON to Flink Table/SQL internal data structure RowData. |

Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> | DebeziumJsonDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context, DataType physicalDataType, int[][] projections) |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | DebeziumJsonFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |
Modifier and Type | Class and Description |
---|---|
class | GlueSchemaRegistryJsonDeserializationSchema<T>: AWS Glue Schema Registry deserialization schema that deserializes the JSON Schema binary format for Flink consumers. |
Modifier and Type | Class and Description |
---|---|
class | MaxwellJsonDeserializationSchema: Deserialization schema from Maxwell JSON to Flink Table/SQL internal data structure RowData. |

Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> | MaxwellJsonDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context, DataType physicalDataType, int[][] projections) |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | MaxwellJsonFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |

Modifier and Type | Class and Description |
---|---|
class | OggJsonDeserializationSchema: Deserialization schema from Ogg JSON to Flink Table/SQL internal data structure RowData. |

Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> | OggJsonDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context, DataType physicalDataType) |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | OggJsonFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |
Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> | PbDecodingFormat.createRuntimeDecoder(DynamicTableSource.Context context, DataType producedDataType) |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | PbFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |

Modifier and Type | Class and Description |
---|---|
class | PbRowDataDeserializationSchema: Deserialization schema from Protobuf to Flink types. |

Modifier and Type | Class and Description |
---|---|
class | RawFormatDeserializationSchema: Deserialization schema from a raw (byte-based) value to Flink Table/SQL internal data structure RowData. |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | RawFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |
Modifier and Type | Method and Description |
---|---|
<OUT> PubSubSource.ProjectNameBuilder<OUT> | PubSubSource.DeserializationSchemaBuilder.withDeserializationSchema(DeserializationSchema<OUT> deserializationSchema): Sets the DeserializationSchema used to deserialize incoming PubSubMessages. |
Constructor and Description |
---|
FlinkKafkaConsumer(List<String> topics, DeserializationSchema<T> deserializer, Properties props): Deprecated. Creates a new Kafka streaming source consumer. |
FlinkKafkaConsumer(Pattern subscriptionPattern, DeserializationSchema<T> valueDeserializer, Properties props): Deprecated. Creates a new Kafka streaming source consumer. |
FlinkKafkaConsumer(String topic, DeserializationSchema<T> valueDeserializer, Properties props): Deprecated. Creates a new Kafka streaming source consumer. |
Constructor and Description |
---|
KafkaDeserializationSchemaWrapper(DeserializationSchema<T> deserializationSchema) |
Modifier and Type | Field and Description |
---|---|
protected DecodingFormat<DeserializationSchema<RowData>> | KafkaDynamicSource.keyDecodingFormat: Optional format for decoding keys from Kafka. |
protected DecodingFormat<DeserializationSchema<RowData>> | KafkaDynamicSource.valueDecodingFormat: Format for decoding values from Kafka. |

Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> | UpsertKafkaDynamicTableFactory.DecodingFormatWrapper.createRuntimeDecoder(DynamicTableSource.Context context, DataType producedDataType) |

Modifier and Type | Method and Description |
---|---|
protected KafkaSource<RowData> | KafkaDynamicSource.createKafkaSource(DeserializationSchema<RowData> keyDeserialization, DeserializationSchema<RowData> valueDeserialization, TypeInformation<RowData> producedTypeInfo) |
Modifier and Type | Method and Description |
---|---|
protected KafkaDynamicSource | KafkaDynamicTableFactory.createKafkaTableSource(DataType physicalDataType, DecodingFormat<DeserializationSchema<RowData>> keyDecodingFormat, DecodingFormat<DeserializationSchema<RowData>> valueDecodingFormat, int[] keyProjection, int[] valueProjection, String keyPrefix, List<String> topics, Pattern topicPattern, Properties properties, StartupMode startupMode, Map<KafkaTopicPartition,Long> specificStartupOffsets, long startupTimestampMillis, String tableIdentifier) |

Constructor and Description |
---|
DecodingFormatWrapper(DecodingFormat<DeserializationSchema<RowData>> innerDecodingFormat) |
KafkaDynamicSource(DataType physicalDataType, DecodingFormat<DeserializationSchema<RowData>> keyDecodingFormat, DecodingFormat<DeserializationSchema<RowData>> valueDecodingFormat, int[] keyProjection, int[] valueProjection, String keyPrefix, List<String> topics, Pattern topicPattern, Properties properties, StartupMode startupMode, Map<KafkaTopicPartition,Long> specificStartupOffsets, long startupTimestampMillis, boolean upsertMode, String tableIdentifier) |
Constructor and Description |
---|
FlinkDynamoDBStreamsConsumer(String stream, DeserializationSchema<T> deserializer, Properties config): Constructor of FlinkDynamoDBStreamsConsumer. |
FlinkKinesisConsumer(String stream, DeserializationSchema<T> deserializer, Properties configProps): Creates a new Flink Kinesis Consumer. |

Constructor and Description |
---|
KinesisDeserializationSchemaWrapper(DeserializationSchema<T> deserializationSchema) |

Constructor and Description |
---|
RowDataKinesisDeserializationSchema(DeserializationSchema<RowData> physicalDeserializer, TypeInformation<RowData> producedTypeInfo, List<RowDataKinesisDeserializationSchema.Metadata> requestedMetadataFields) |

Constructor and Description |
---|
KinesisDynamicSource(DataType physicalDataType, String stream, Properties consumerProperties, DecodingFormat<DeserializationSchema<RowData>> decodingFormat) |
KinesisDynamicSource(DataType physicalDataType, String stream, Properties consumerProperties, DecodingFormat<DeserializationSchema<RowData>> decodingFormat, DataType producedDataType, List<RowDataKinesisDeserializationSchema.Metadata> requestedMetadataFields) |

Constructor and Description |
---|
RMQSource(RMQConnectionConfig rmqConnectionConfig, String queueName, boolean usesCorrelationId, DeserializationSchema<OUT> deserializationSchema): Creates a new RabbitMQ source. |
RMQSource(RMQConnectionConfig rmqConnectionConfig, String queueName, DeserializationSchema<OUT> deserializationSchema): Creates a new RabbitMQ source with an at-least-once message processing guarantee when checkpointing is enabled. |
Modifier and Type | Class and Description |
---|---|
class | EventDeSerializationSchema: A serializer and deserializer for the Event type. |
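A combined serializer/deserializer like EventDeSerializationSchema typically round-trips a small fixed-width record. The real Event fields are not reproduced here; the sketch below uses a hypothetical event (an int id plus a long timestamp packed into 12 bytes via ByteBuffer) to show the pattern:

```java
import java.nio.ByteBuffer;

// Sketch of a fixed-width serializer/deserializer pair for a small event type.
// The Event layout (int id + long timestamp, 12 bytes) is hypothetical.
public class EventSchemaSketch {
    record Event(int id, long timestamp) {}

    static byte[] serialize(Event e) {
        return ByteBuffer.allocate(12).putInt(e.id()).putLong(e.timestamp()).array();
    }

    static Event deserialize(byte[] bytes) {
        ByteBuffer buf = ByteBuffer.wrap(bytes);
        return new Event(buf.getInt(), buf.getLong());
    }

    public static void main(String[] args) {
        Event e = new Event(7, 1_700_000_000_000L);
        // Round-trips to an equal Event (records define equals by field values).
        System.out.println(deserialize(serialize(e)));
    }
}
```

A fixed-width layout keeps both directions symmetric and makes the serialized size predictable, which is why small event types in examples often use it.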
Modifier and Type | Class and Description |
---|---|
class | KafkaEventSchema: The serialization schema for the KafkaEvent type. |
Modifier and Type | Interface and Description |
---|---|
interface | DeserializationSchema<T>: Deprecated. Use org.apache.flink.api.common.serialization.DeserializationSchema instead. |
Modifier and Type | Class and Description |
---|---|
class | ChangelogCsvDeserializer: The ChangelogCsvDeserializer contains simple parsing logic for converting bytes into Row of Integer and String with a RowKind. |

Modifier and Type | Method and Description |
---|---|
DeserializationSchema<RowData> | ChangelogCsvFormat.createRuntimeDecoder(DynamicTableSource.Context context, DataType producedDataType) |

Modifier and Type | Method and Description |
---|---|
DecodingFormat<DeserializationSchema<RowData>> | ChangelogCsvFormatFactory.createDecodingFormat(DynamicTableFactory.Context context, ReadableConfig formatOptions) |
Constructor and Description |
---|
SocketSourceFunction(String hostname, int port, byte byteDelimiter, DeserializationSchema<RowData> deserializer) |

Constructor and Description |
---|
SocketDynamicTableSource(String hostname, int port, byte byteDelimiter, DecodingFormat<DeserializationSchema<RowData>> decodingFormat, DataType producedDataType) |
Copyright © 2014–2022 The Apache Software Foundation. All rights reserved.