This documentation is for an out-of-date version of Apache Flink. We recommend you use the latest stable version.

Linking with modules not contained in the binary distribution

The binary distribution contains jar packages in the lib folder that are automatically provided to the classpath of your distributed programs. Almost all Flink classes are located there, with a few exceptions such as the streaming connectors and some freshly added modules. To run code that depends on these modules, you need to make them accessible at runtime, for which we suggest two options:

  1. Either copy the required jar files into the lib folder on all of your TaskManagers. Note that you have to restart the TaskManagers afterwards.
  2. Or package them with your code.
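For the first option, the copy step might look like the following sketch. The FLINK_HOME location and the connector jar name are assumptions for illustration; adjust them to your installation, and repeat the copy on every TaskManager host:

```shell
# Sketch of option 1: place a connector jar into Flink's lib folder.
# FLINK_HOME and the jar name/version are assumptions, not fixed values.
FLINK_HOME=${FLINK_HOME:-/tmp/flink}
mkdir -p "$FLINK_HOME/lib"

# Stand-in for the connector jar you obtained from the Maven repository:
CONNECTOR_JAR=/tmp/flink-connector-kafka-0.10.2.jar
touch "$CONNECTOR_JAR"

# Copy it into the lib folder so it lands on the TaskManager classpath.
cp "$CONNECTOR_JAR" "$FLINK_HOME/lib/"
ls "$FLINK_HOME/lib"

# Afterwards, restart each TaskManager so the new jar is picked up, e.g.:
# "$FLINK_HOME"/bin/taskmanager.sh stop && "$FLINK_HOME"/bin/taskmanager.sh start
```

Remember that this has to be done on all TaskManager machines, not just the one submitting the job.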

The latter approach is recommended as it respects the classloader management of Flink.

Packaging dependencies with your usercode with Maven

To provide these dependencies, which are not included by Flink, we suggest two options with Maven.

  1. The maven-assembly-plugin builds a so-called uber-jar (executable jar) containing all your dependencies. The assembly configuration is straightforward, but the resulting jar might become bulky. See the maven-assembly-plugin documentation for further information.
  2. The unpack goal of the maven-dependency-plugin unpacks the relevant parts of the dependencies and then packages them with your code.
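For the first option, a minimal maven-assembly-plugin configuration might look like the following sketch. The main class name is a placeholder for your own entry point:

```xml
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- predefined descriptor that bundles all dependencies into one jar -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <!-- placeholder: replace with your program's entry point -->
        <mainClass>com.example.flink.WordCount</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Running mvn clean package then additionally produces a *-jar-with-dependencies.jar in the target directory.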

Using the latter approach to bundle the Kafka connector, flink-connector-kafka, you would need to add the classes from both the connector and the Kafka API itself. Add the following to your plugins section:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <version>2.9</version>
        <executions>
            <execution>
                <id>unpack</id>
                <!-- executed just before the package phase -->
                <phase>prepare-package</phase>
                <goals>
                    <goal>unpack</goal>
                </goals>
                <configuration>
                    <artifactItems>
                        <!-- For Flink connector classes -->
                        <artifactItem>
                            <groupId>org.apache.flink</groupId>
                            <artifactId>flink-connector-kafka</artifactId>
                            <version>${flink.version}</version> <!-- your Flink version -->
                            <type>jar</type>
                            <overWrite>false</overWrite>
                            <outputDirectory>${project.build.directory}/classes</outputDirectory>
                            <includes>org/apache/flink/**</includes>
                        </artifactItem>
                        <!-- For Kafka API classes -->
                        <artifactItem>
                            <groupId>org.apache.kafka</groupId>
                            <artifactId>kafka_2.10</artifactId> <!-- adjust to your Scala version -->
                            <version>${kafka.version}</version> <!-- your Kafka version -->
                            <type>jar</type>
                            <overWrite>false</overWrite>
                            <outputDirectory>${project.build.directory}/classes</outputDirectory>
                            <includes>kafka/**</includes>
                        </artifactItem>
                    </artifactItems>
                </configuration>
            </execution>
        </executions>
    </plugin>

Now when running mvn clean package, the produced jar includes the required dependencies.