
Releases: apache/seatunnel

2.2.0-beta Release

03 Oct 09:06

[Feature & Improve]

  • Connector V2 API: decoupling connectors from compute engines
    • [Translation] Support Flink 1.13.x
    • [Translation] Support Spark 2.4
    • [Connector-V2] [Fake] Support FakeSource (#1864)
    • [Connector-V2] [Console] Support ConsoleSink (#1864)
    • [Connector-V2] [ElasticSearch] Support ElasticSearchSink (#2330)
    • [Connector-V2] [ClickHouse] Support ClickHouse Source & Sink (#2051)
    • [Connector-V2] [JDBC] Support JDBC Source & Sink (#2048)
    • [Connector-V2] [JDBC] [Greenplum] Support Greenplum Source & Sink (#2429)
    • [Connector-V2] [JDBC] [DM] Support DaMengDB Source & Sink (#2377)
    • [Connector-V2] [File] Support Source & Sink for Local, HDFS & OSS File
    • [Connector-V2] [File] [FTP] Support FTP File Source & Sink (#2774)
    • [Connector-V2] [Hudi] Support Hudi Source (#2147)
    • [Connector-V2] [Iceberg] Support Iceberg Source (#2615)
    • [Connector-V2] [Kafka] Support Kafka Source (#1940)
    • [Connector-V2] [Kafka] Support Kafka Sink (#1952)
    • [Connector-V2] [Kudu] Support Kudu Source & Sink (#2254)
    • [Connector-V2] [MongoDB] Support MongoDB Source (#2596)
    • [Connector-V2] [MongoDB] Support MongoDB Sink (#2649)
    • [Connector-V2] [Neo4j] Support Neo4j Sink (#2434)
    • [Connector-V2] [Phoenix] Support Phoenix Source & Sink (#2499)
    • [Connector-V2] [Redis] Support Redis Source (#2569)
    • [Connector-V2] [Redis] Support Redis Sink (#2647)
    • [Connector-V2] [Socket] Support Socket Source (#1999)
    • [Connector-V2] [Socket] Support Socket Sink (#2549)
    • [Connector-V2] [HTTP] Support HTTP Source (#2012)
    • [Connector-V2] [HTTP] Support HTTP Sink (#2348)
    • [Connector-V2] [HTTP] [Wechat] Support Wechat Sink (#2412)
    • [Connector-V2] [Pulsar] Support Pulsar Source (#1984)
    • [Connector-V2] [Email] Support Email Sink (#2304)
    • [Connector-V2] [Sentry] Support Sentry Sink (#2244)
    • [Connector-V2] [DingTalk] Support DingTalk Sink (#2257)
    • [Connector-V2] [IoTDB] Support IoTDB Source (#2431)
    • [Connector-V2] [IoTDB] Support IoTDB Sink (#2407)
    • [Connector-V2] [Hive] Support Hive Source & Sink (#2708)
    • [Connector-V2] [DataHub] Support DataHub Sink (#2558)
  • [Catalog] MySQL Catalog (#2042)
  • [Format] JSON Format (#2014)
  • [Spark] [ClickHouse] Support ClickHouse without authentication (#2393)
  • [Binary-Package] Add script to automatically download plugins (#2831)
  • [License] Update binary license (#2798)
  • [e2e] Improved e2e start sleep (#2677)
  • [e2e] Containers copy only the required connector jars (#2675)
  • [build] Delete the connectors*-dist modules (#2709)
  • [build] Split dependency management (#2606)
  • [build] Make the e2e module not depend on the connector*-dist module (#2702)
  • [build] Improve the scope of maven-shade-plugin (#2665)
  • [build] Make sure flatten-maven-plugin runs after maven-shade-plugin (#2603)
  • [Starter] Use the public CommandLine util class to parse the args (#2470)
  • [Spark] [Redis] Support self-built Redis proxies that do not support the Redis "info replication" command (#2389)
  • [Flink] [Transform] Support multiple splits and add a custom split function name (#2268)
  • [Test] Upgrade JUnit to 5.x (#2305)
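
The Connector V2 entries above all plug into a single engine-agnostic job definition, so the same config can run on either the Flink 1.13.x or the Spark 2.4 translation layer. A minimal sketch wiring FakeSource to the Console sink (option names follow common SeaTunnel examples and may differ between versions; consult the 2.2.0-beta docs for the authoritative keys):

```hocon
env {
  execution.parallelism = 1
}

source {
  FakeSource {
    result_table_name = "fake"
  }
}

sink {
  Console {
    source_table_name = "fake"
  }
}
```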

[Bugfix]

  • [Starter] Ensure that output paths constructed from zip archive entries are validated to prevent writing files to
    unexpected locations (#2843)
  • [Starter] Stop SparkCommandArgs from splitting variable values on commas (#2523)
  • [Spark] Fix getData() being called twice (#2764)
  • [e2e] Fix path split exception on Windows 10 and skip the file-existence check (#2715)
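
The archive-extraction fix above addresses the classic "zip slip" vulnerability: an entry named something like `../../x` escapes the output directory. A minimal, language-agnostic sketch of the validation idea, shown here in Python with a hypothetical helper name (SeaTunnel's actual fix is in its Java starter):

```python
import os

def is_safe_entry(dest_dir: str, entry_name: str) -> bool:
    """Return True if extracting entry_name under dest_dir stays inside it."""
    dest_root = os.path.realpath(dest_dir)
    # Resolve "..", symlinks, and absolute entry names before comparing.
    target = os.path.realpath(os.path.join(dest_root, entry_name))
    return os.path.commonpath([dest_root, target]) == dest_root
```

An extractor would call this for every archive entry and abort on the first unsafe name.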

[Docs]

  • [Kafka] Update Kafka.md (#2863)
  • [JDBC] Fix documentation inconsistency (#2776)
  • [Flink-SQL] [ElasticSearch] Updated prepare section (#2634)
  • [Contribution] add CheckStyle-IDEA Plugin introduction (#2535)
  • [Contribution] Update new-license.md (#2494)

2.1.3 Release

10 Aug 02:16
Compare
Choose a tag to compare

[Feature & Improvement]

  • [Connector][Flink][Fake] Support BigInteger type (#2118)
  • [Connector][Spark][TiDB] Refactor config parameters (#1983)
  • [Connector][Flink] Add AssertSink connector (#2022)
  • [Connector][Spark][ClickHouse] Support Rsync to transfer ClickHouse data files (#2074)
  • [Connector & e2e][Flink] Add IT for AssertSink in the e2e module (#2036)
  • [Transform][Spark] Add data-quality check for the null-data rate (#1978)
  • [Transform][Spark] Add a module to set default values for null fields (#1958)
  • [Chore] Make code more understandable and remove code warnings (#2005)
  • [Spark] Use a higher version of the libthrift dependency (#1994)
  • [Core][Starter] Change the jar connector load logic (#2193)
  • [Core] Add plugin discovery module (#1881)

[Bugfix]

  • [Connector][Hudi] Fix source loading the data twice
  • [Connector][Doris] Fix the Unrecognized field "TwoPhaseCommit" bug after Doris 0.15 (#2054)
  • [Connector][Jdbc] Fix the data output exception when accessing Hive using Spark JDBC (#2085)
  • [Connector][Jdbc] Fix JDBC data loss that occurs when partition_column (partition mode) is set (#2033)
  • [Connector][Kafka] Fix KafkaTableStream schema JSON parsing (#2168)
  • [seatunnel-core] Fix failure to get the APP_DIR path (#2165)
  • [seatunnel-api-flink] Fix connector dependencies being added repeatedly (#2207)
  • [seatunnel-core-flink] Update the FlinkRunMode enum to get the proper help message for run modes (#2008)
  • [seatunnel-core-flink] Fix registerPlugin libraryCache error when source and sink are the same (#2015)
  • [Command] Fix commandArgs -t (--check) conflicting with the Flink deployment target (#2174)
  • [Core][Jackson] Fix Jackson type conversion error (#2031)
  • [Core][Starter] In cluster mode, make the starter app root dir the same as in client mode (#2141)
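
The partition-mode data-loss fix (#2033) concerns the boundary arithmetic used to split a numeric partition_column into per-reader ranges. A hedged sketch of the general technique, not SeaTunnel's actual implementation, showing gap-free splitting:

```python
def partition_ranges(lower: int, upper: int, num_partitions: int):
    """Split the inclusive key range [lower, upper] into contiguous,
    non-overlapping sub-ranges that together cover every key exactly once."""
    total = upper - lower + 1
    n = min(num_partitions, total)  # never create empty partitions
    base, extra = divmod(total, n)
    ranges, start = [], lower
    for i in range(n):
        size = base + (1 if i < extra else 0)  # spread the remainder
        ranges.append((start, start + size - 1))
        start += size
    return ranges
```

Any off-by-one in the size or start arithmetic silently drops or duplicates boundary keys, which is exactly the class of bug such a fix targets.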

[Docs]

  • Update source Socket connector docs (#1995)
  • Add uuid, udf, and replace transforms to the docs (#2016)
  • Update Flink engine version requirements (#2220)
  • Add Flink SQL module to the website (#2021)
  • [Kubernetes] Update the SeaTunnel-on-Kubernetes doc (#2035)

[Dependency upgrade]

  • Upgrade commons-collections4 to 4.4
  • Upgrade commons-codec to 1.13

2.1.2 Release

18 Jun 12:50

[Feature]

  • Add Spark webhook source
  • Support Flink application mode
  • Split connector jars from the core jar
  • Add Replace transform for Spark
  • Add Uuid transform for Spark
  • Support Flink dynamic configurations
  • Support Oracle database in the Flink JDBC source
  • Add Flink HTTP connector
  • Add Flink transform for registering user-defined functions
  • Add Flink SQL Kafka and Elasticsearch connectors

[Bugfix]

  • Fixed ClickHouse sink data type conversion error
  • Fixed the Spark start shell failing on first execution
  • Fixed the config file not being found in Spark on YARN cluster mode
  • Fixed Spark extraJavaOptions not being allowed to be empty
  • Fixed "plugins.tar.gz" decompression failure in Spark standalone cluster mode
  • Fixed the ClickHouse sink not working correctly with multiple hosts
  • Fixed Flink SQL conf parse exception
  • Fixed incomplete Flink JDBC MySQL data type mapping
  • Fixed variables not being settable in Flink mode
  • Fixed the SeaTunnel Flink engine not checking the source config

[Improvement]

  • Update Jackson version to 2.12.6
  • Add a guide on how to set up SeaTunnel with Kubernetes
  • Optimize some generic type code
  • Add Flink SQL e2e module
  • Add pre-SQL and post-SQL support to the Flink JDBC connector
  • Use @AutoService to generate SPI files
  • Support Flink FakeSourceStream to mock data
  • Support reading Hive via the Flink JDBC source
  • Support ReplicatedMergeTree in ClickhouseFile
  • Support saving tables in ORC file format with the Hive sink
  • Support custom expire time in the Spark Redis sink
  • Add Spark JDBC isolationLevel config
  • Replace Fastjson with Jackson
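
The pre-SQL/post-SQL improvement lets the Flink JDBC connector run setup and cleanup statements around the main write. A hypothetical config sketch (the keys pre_sql and post_sql, and the sink block shape, are assumed names for illustration, not the connector's documented options):

```hocon
sink {
  JdbcSink {
    driver = "com.mysql.cj.jdbc.Driver"
    url = "jdbc:mysql://localhost:3306/test"
    query = "INSERT INTO orders_copy VALUES (?, ?, ?)"
    # Assumed keys: executed once before / after the job's writes
    pre_sql = "TRUNCATE TABLE orders_copy"
    post_sql = "ANALYZE TABLE orders_copy"
  }
}
```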

2.1.1 Release

27 Apr 04:10

[Feature]

  • Support JSON-format config files
  • Support partitioning in the JDBC connector
  • Add ClickhouseFile sink on the Spark engine
  • Support compiling with JDK 11
  • Add Elasticsearch 7.x plugin on the Flink engine
  • Add Feishu plugin on the Spark engine
  • Add Spark HTTP source plugin
  • Add ClickHouse sink plugin on the Flink engine

[Bugfix]

  • Fix Flink ConsoleSink not printing results
  • Fix JDBC dialect type compatibility between JdbcSource and JdbcSink
  • Fix transforms not executing when the data source is empty
  • Fix datetime/date strings failing to convert to timestamp/date
  • Fix tableExists not covering temporary tables
  • Fix FileSink not working in Flink stream mode
  • Fix config param issues of the Spark Redis sink
  • Fix SQL table name parse error
  • Fix not being able to send data to Kafka
  • Fix a file resource leak
  • Fix ClassCastException when outputting data to Doris
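
The datetime/date conversion fix above concerns strings that arrive in either "yyyy-MM-dd HH:mm:ss" or "yyyy-MM-dd" form. An illustrative sketch of the conversion (the helper name is hypothetical, and the real fix lives in SeaTunnel's Java code), trying the more specific pattern first:

```python
from datetime import datetime, date

def parse_temporal(value: str):
    """Parse 'yyyy-MM-dd HH:mm:ss' into a datetime, or 'yyyy-MM-dd' into a date.

    The timestamp pattern is tried first so a full timestamp is never
    misread as a bare date.
    """
    try:
        return datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
    except ValueError:
        pass
    try:
        return datetime.strptime(value, "%Y-%m-%d").date()
    except ValueError:
        raise ValueError(f"unrecognized temporal string: {value!r}")
```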

[Improvement]

  • Change JDBC-related dependency scope to default
  • Use different commands to execute tasks
  • Automatically identify the Spark Hive plugin and add enableHiveSupport
  • Print config in original order
  • Remove useless job name from JobInfo
  • Add console limit and batch Flink fake source
  • Add Flink e2e module
  • Add Spark e2e module
  • Optimize plugin loading and rename the plugin package
  • Rewrite the Spark and Flink start scripts in code
  • Quickly locate the wrong SQL statement in the Flink SQL transform
  • Upgrade log4j version to 2.17.1
  • Unify version management of third-party dependencies
  • Use revision to manage the project version
  • Add Sonar check
  • Add SSL/TLS parameters to the Spark email connector
  • Remove the return result of the sink plugin
  • Add flink-runtime-web to the Flink example

Please go to the official channel to download: https://seatunnel.apache.org/download

2.1.0 Release (first Apache version)

20 Mar 04:03
  • Use JCommander for command-line parameter parsing, letting developers focus on the logic itself.
  • Upgrade Flink from 1.9 to 1.13.5, keeping compatibility with older versions and preparing for subsequent CDC work.
  • Support Doris, Hudi, Phoenix, Druid, and other connector plugins; the complete list is available at plugins-supported-by-seatunnel.
  • Support extremely fast environment startup for local development: the example module can be run without modifying any code, which is convenient for local debugging.
  • Support installing and trying out Apache SeaTunnel (Incubating) via Docker containers.
  • Support SET statements and configuration variables in the SQL component.
  • Refactor the config module to make it easier for contributors to understand while ensuring license compliance of the project.
  • Realign the project structure to fit the new roadmap.
  • Add CI/CD support and automated code-quality control (more plans will be carried out to support CI/CD development).

Please go to the official channel to download: https://seatunnel.apache.org/download

[Stable] v1.5.7

28 Dec 15:38

[Stable] v1.5.6

23 Dec 00:52

[Stable] v1.5.3

11 Aug 01:23

Note: Waterdrop ships ready-to-run packages, so there is no need to build from source; please download waterdrop-1.5.3.zip below.
Spark version requirement: >= 2.3, < 3.0

[Stable] v1.5.2

09 Aug 01:23
ab7f201

[Feature] Add Redis input plugin; see the documentation


Note: Waterdrop ships ready-to-run packages, so there is no need to build from source; please download waterdrop-1.5.2.zip below.
Spark version requirement: >= 2.3, < 3.0

[Stable] v2.0.4

13 Oct 06:47
c2abd39
  • Published API modules to the Maven Central repo.
  • Added waterdrop-config source code.
  • Added a build.md guide.
  • Refined project code and pom.xml structure.