diff --git a/docs/content/connectors/db2-cdc.md b/docs/content/connectors/db2-cdc.md
new file mode 100644
index 00000000000..c3a6abc6720
--- /dev/null
+++ b/docs/content/connectors/db2-cdc.md
@@ -0,0 +1,369 @@
# Db2 CDC Connector

The Db2 CDC connector allows for reading snapshot data and incremental data from Db2 databases. This document
describes how to set up the Db2 CDC connector to run SQL queries against Db2 databases.


## Supported Databases

| Connector             | Database                                 | Driver              |
|-----------------------|------------------------------------------|---------------------|
| [Db2-cdc](db2-cdc.md) | [Db2](https://www.ibm.com/products/db2)  | JDBC Driver: 8.0.21 |

Dependencies
------------

In order to set up the Db2 CDC connector, the following provides dependency information for both projects
using a build automation tool (such as Maven or SBT) and SQL Client with SQL JAR bundles.

### Maven dependency

```xml
<dependency>
  <groupId>com.ververica</groupId>
  <artifactId>flink-connector-db2-cdc</artifactId>
  <!-- The dependency is available only for stable releases, SNAPSHOT dependencies need to be built by yourself. -->
  <version>2.3-SNAPSHOT</version>
</dependency>
```

### SQL Client JAR

```Download link is available only for stable releases.```

Download [flink-sql-connector-db2-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-db2-cdc/2.3-SNAPSHOT/flink-sql-connector-db2-cdc-2.3-SNAPSHOT.jar) and
put it under `<FLINK_HOME>/lib/`.

**Note:** flink-sql-connector-db2-cdc-XXX-SNAPSHOT versions correspond to the development branch; to use them, users
need to download the source code and compile the jar themselves. Users are encouraged to use released versions instead, such as
[flink-sql-connector-db2-cdc-2.3.0.jar](https://mvnrepository.com/artifact/com.ververica/flink-connector-db2-cdc),
which are available in the Maven central repository.

Setup Db2 server
----------------

Follow the steps in the [Debezium Db2 Connector](https://debezium.io/documentation/reference/1.6/connectors/db2.html#setting-up-db2) to enable SQL Replication and put the tables you want to capture into capture mode.
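At a high level, the Debezium setup amounts to starting the ASN capture agent and putting each table to be captured into capture mode. A minimal sketch, assuming the `ASNCDC` user-defined functions shipped with Debezium's Db2 setup scripts are installed; the schema, table, and capture-agent names below are placeholders and may differ in your environment:

```sql
-- Sketch only: assumes Debezium's ASNCDC helper UDFs are installed; names are placeholders.

-- Start the ASN capture agent.
VALUES ASNCDC.ASNCDCSERVICES('start', 'asncdc');

-- Put the table into capture mode so SQL Replication copies its changes into a change-data table.
CALL ASNCDC.ADDTABLE('MYSCHEMA', 'PRODUCTS');

-- Let the running capture agent pick up the newly added table.
VALUES ASNCDC.ASNCDCSERVICES('reinit', 'asncdc');
```

Refer to the Debezium documentation linked above for the authoritative setup commands for your Db2 edition.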
Notes
----------------

### BOOLEAN type is not supported by SQL Replication on Db2

Only snapshots can be taken from tables with BOOLEAN type columns. SQL Replication on Db2 currently does not support
the BOOLEAN type, so Debezium cannot perform CDC on those tables. Consider using a different type.


How to create a Db2 CDC table
----------------

The Db2 CDC table can be defined as follows:

```sql
-- checkpoint every 3 seconds
Flink SQL> SET 'execution.checkpointing.interval' = '3s';

-- register a Db2 table 'products' in Flink SQL
Flink SQL> CREATE TABLE products (
     ID INT NOT NULL,
     NAME STRING,
     DESCRIPTION STRING,
     WEIGHT DECIMAL(10,3)
     ) WITH (
     'connector' = 'db2-cdc',
     'hostname' = 'localhost',
     'port' = '50000',
     'username' = 'root',
     'password' = '123456',
     'database-name' = 'mydb',
     'schema-name' = 'myschema',
     'table-name' = 'products');

-- read snapshot and change events from the products table
Flink SQL> SELECT * FROM products;
```
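The registered table can also drive a continuous pipeline instead of an ad-hoc query. A minimal sketch, assuming a hypothetical `products_sink` table backed by Flink's built-in `print` connector:

```sql
-- Hypothetical sink table, used only for illustration.
Flink SQL> CREATE TABLE products_sink (
     ID INT NOT NULL,
     NAME STRING,
     DESCRIPTION STRING,
     WEIGHT DECIMAL(10,3)
     ) WITH (
     'connector' = 'print');

-- Continuously replicate the snapshot plus subsequent changes into the sink.
Flink SQL> INSERT INTO products_sink SELECT * FROM products;
```

Any other Flink sink connector that accepts an updating changelog can take the place of the `print` connector here.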
    + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
Connector Options
----------------

| Option            | Required | Default | Type    | Description                                                                                                                                                                                                                                                                                                                     |
|-------------------|----------|---------|---------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| connector         | required | (none)  | String  | Specify what connector to use, here should be `'db2-cdc'`.                                                                                                                                                                                                                                                                       |
| hostname          | required | (none)  | String  | IP address or hostname of the Db2 database server.                                                                                                                                                                                                                                                                               |
| username          | required | (none)  | String  | Username to use when connecting to the Db2 database server.                                                                                                                                                                                                                                                                      |
| password          | required | (none)  | String  | Password to use when connecting to the Db2 database server.                                                                                                                                                                                                                                                                      |
| database-name     | required | (none)  | String  | Database name of the Db2 server to monitor.                                                                                                                                                                                                                                                                                      |
| schema-name       | required | (none)  | String  | Schema name of the Db2 database to monitor.                                                                                                                                                                                                                                                                                      |
| table-name        | required | (none)  | String  | Table name of the Db2 database to monitor.                                                                                                                                                                                                                                                                                       |
| port              | optional | 50000   | Integer | Integer port number of the Db2 database server.                                                                                                                                                                                                                                                                                  |
| scan.startup.mode | optional | initial | String  | Startup mode for the Db2 CDC consumer, valid enumerations are "initial" and "latest-offset". Please see the [Startup Reading Position](#startup-reading-position) section for more detailed information.                                                                                                                        |
| server-time-zone  | optional | (none)  | String  | The session time zone of the database server, e.g. "Asia/Shanghai". It controls how the TIMESTAMP type in Db2 is converted to STRING. If not set, ZoneId.systemDefault() is used to determine the server time zone.                                                                                                             |
| debezium.*        | optional | (none)  | String  | Pass-through Debezium properties to the Debezium Embedded Engine which is used to capture data changes from the Db2 server. For example: `'debezium.snapshot.mode' = 'never'`. See more about [Debezium's Db2 connector properties](https://debezium.io/documentation/reference/1.6/connectors/db2.html#db2-connector-properties). |
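The optional settings above can be combined freely in the `WITH` clause. The following sketch re-registers the `products` table from the earlier example under a different name with the optional options spelled out; the connection values are placeholders, and `debezium.snapshot.fetch.size` is just one example of a pass-through Debezium property:

```sql
-- Variant of the earlier example with the optional settings spelled out;
-- hostname, credentials and names are placeholders.
CREATE TABLE products_latest (
    ID INT NOT NULL,
    NAME STRING,
    DESCRIPTION STRING,
    WEIGHT DECIMAL(10,3)
    ) WITH (
    'connector' = 'db2-cdc',
    'hostname' = 'localhost',
    'port' = '50000',                           -- default Db2 port
    'username' = 'root',
    'password' = '123456',
    'database-name' = 'mydb',
    'schema-name' = 'myschema',
    'table-name' = 'products',
    'scan.startup.mode' = 'latest-offset',      -- skip the snapshot, read only new changes
    'server-time-zone' = 'Asia/Shanghai',       -- time zone used when converting TIMESTAMP values
    'debezium.snapshot.fetch.size' = '1024');   -- pass-through Debezium property
```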
Features
--------

### Startup Reading Position

The config option `scan.startup.mode` specifies the startup mode for the Db2 CDC consumer. The valid enumerations are:

- `initial` (default): Performs an initial snapshot of the monitored database tables upon first startup, and then continues to read the latest change events.
- `latest-offset`: Never performs a snapshot of the monitored database tables upon first startup; it only reads changes made since the connector was started.

_Note: the mechanism of the `scan.startup.mode` option relies on Debezium's `snapshot.mode` configuration, so please do not use them together. If you specify both `scan.startup.mode` and `debezium.snapshot.mode` in the table DDL, `scan.startup.mode` may not take effect._

### DataStream Source

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;

import com.ververica.cdc.connectors.db2.Db2Source;
import com.ververica.cdc.debezium.JsonDebeziumDeserializationSchema;

public class Db2SourceExample {
    public static void main(String[] args) throws Exception {
        SourceFunction<String> db2Source =
                Db2Source.<String>builder()
                        .hostname("yourHostname")
                        .port(yourPort)
                        .database("yourDatabaseName") // set captured database
                        .tableList("yourSchemaName.yourTableName") // set captured table
                        .username("yourUsername")
                        .password("yourPassword")
                        .deserializer(new JsonDebeziumDeserializationSchema()) // converts SourceRecord to JSON String
                        .build();

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // enable checkpointing
        env.enableCheckpointing(3000);

        env.addSource(db2Source)
                // use parallelism 1 for the source to keep message ordering
                .setParallelism(1)
                .print()
                .setParallelism(1); // use parallelism 1 for sink to keep message ordering

        env.execute("Print Db2 Snapshot + Changes");
    }
}
```

**Note:** Please refer to [Deserialization](../about.html#deserialization) for more details about the JSON deserialization.

Data Type Mapping
----------------
    + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
| Db2 type                       | Flink SQL type  | NOTE |
|--------------------------------|-----------------|------|
| SMALLINT                       | SMALLINT        |      |
| INTEGER                        | INT             |      |
| BIGINT                         | BIGINT          |      |
| REAL                           | FLOAT           |      |
| DOUBLE                         | DOUBLE          |      |
| NUMERIC(p, s)<br>DECIMAL(p, s) | DECIMAL(p, s)   |      |
| DATE                           | DATE            |      |
| TIME                           | TIME            |      |
| TIMESTAMP [(p)]                | TIMESTAMP [(p)] |      |
| CHAR(n)                        | CHAR(n)         |      |
| VARCHAR(n)                     | VARCHAR(n)      |      |
| BINARY(n)                      | BINARY(n)       |      |
| VARBINARY(N)                   | VARBINARY(N)    |      |
| BLOB<br>CLOB<br>DBCLOB         | BYTES           |      |
| VARGRAPHIC<br>XML              | STRING          |      |
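To make the mapping concrete, here is a sketch of how a hypothetical Db2 table could be declared on the Flink side; the table, column, and connection values are made-up placeholders:

```sql
-- Hypothetical Db2 source table:
--   CREATE TABLE MYSCHEMA.ORDERS (
--     ORDER_ID   INTEGER NOT NULL,
--     TOTAL      DECIMAL(10, 2),
--     CREATED_AT TIMESTAMP,
--     NOTE       CLOB
--   );
-- Matching Flink declaration, following the mapping table above:
CREATE TABLE orders (
    ORDER_ID INT NOT NULL,          -- INTEGER -> INT
    TOTAL DECIMAL(10, 2),           -- DECIMAL(p, s) -> DECIMAL(p, s)
    CREATED_AT TIMESTAMP(6),        -- Db2 TIMESTAMP defaults to 6 fractional digits
    NOTE BYTES                      -- CLOB -> BYTES
    ) WITH (
    'connector' = 'db2-cdc',
    'hostname' = 'localhost',
    'port' = '50000',
    'username' = 'root',
    'password' = '123456',
    'database-name' = 'mydb',
    'schema-name' = 'MYSCHEMA',
    'table-name' = 'ORDERS');
```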
FAQ
--------
* [FAQ (English)](https://github.com/ververica/flink-cdc-connectors/wiki/FAQ)
* [FAQ (Chinese)](https://github.com/ververica/flink-cdc-connectors/wiki/FAQ(ZH))
\ No newline at end of file
diff --git a/pom.xml b/pom.xml
index 629b9c2f117..4023c42f848 100644
--- a/pom.xml
+++ b/pom.xml
@@ -41,6 +41,7 @@ under the License.
         <module>flink-connector-oceanbase-cdc</module>
         <module>flink-connector-sqlserver-cdc</module>
         <module>flink-connector-tidb-cdc</module>
+        <module>flink-connector-db2-cdc</module>
         <module>flink-sql-connector-mysql-cdc</module>
         <module>flink-sql-connector-postgres-cdc</module>
         <module>flink-sql-connector-mongodb-cdc</module>
@@ -48,6 +49,7 @@ under the License.
         <module>flink-sql-connector-oceanbase-cdc</module>
         <module>flink-sql-connector-sqlserver-cdc</module>
         <module>flink-sql-connector-tidb-cdc</module>
+        <module>flink-sql-connector-db2-cdc</module>
         <module>flink-cdc-e2e-tests</module>
@@ -261,6 +263,7 @@ under the License.
                            <exclude>**/*.txt</exclude>
                            <exclude>flink-connector-mysql-cdc/src/test/resources/file/*.json</exclude>
+                           <exclude>flink-connector-db2-cdc/src/test/resources/db2_server/Dockerfile</exclude>