Commit 7866d64

Version bump to 6.6.0 and doc change
1 parent 299ac67 · commit 7866d64

File tree: CHANGELOG · README.md · version.sbt

3 files changed: 29 additions, 7 deletions

CHANGELOG

Lines changed: 22 additions & 0 deletions
@@ -16,6 +16,28 @@
 
 # spark-redshift Changelog
 
+## 6.6.0 (2026-01-15)
+- Support for Apache Spark 4.0.0, which includes Scala 2.13 as its bundled Scala version [Ruei-Yang Huang]
+- Updated deprecated syntax and APIs for compatibility with Scala 2.13 [Ruei-Yang Huang]
+- Added retry logic for Data API throttling exceptions (e.g., ThrottlingException), automatically retrying affected operations with configurable backoff strategies to improve resilience in rate-limited scenarios [Ruei-Yang Huang]
+
+## 6.5.1 (2025-11-17)
+- Excluded the unused jackson-mapper-asl dependency due to CVEs [Luis Garza]
+- Upgraded Apache Spark dependency to version 3.5.7 [Luis Garza]
+- Upgraded to JDBC version 2.2.0 [Luis Garza]
+
+## 6.5.0 (2025-10-08)
+- Upgraded to AWS SDK v2 [Luis Garza]
+- Improved AWS credentials handling by deferring credential resolution until actually needed [Luis Garza]
+- Fixed string comparison in integration tests to handle cross-platform line ending differences [Luis Garza]
+
+## 6.4.3 (2025-07-01)
+- Upgraded Apache Spark dependency to version 3.5.6 [Luis Garza]
+- Upgraded to JDBC version 2.1.0.33 [Luis Garza]
+- Upgraded sbt build tool to version 1.11.2 [Ruei-Yang Huang]
+- Updated Sonatype publishing from OSSRH to Central Publisher Portal [Ruei-Yang Huang]
+- Fixed getDefaultRegion handling to return null on exceptions, aligning with AWS SDK behavior [Ruei-Yang Huang]
+
 ## 6.4.2 (2025-04-09)
 - Add spark configurations for enforcing secure JDBC connections and usage of aws_iam_role for authorizing Redshift COPY/UNLOAD operations.
 - Verified the connector is compatible with the latest Spark patch release 3.5.5.
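The 6.6.0 entry above describes retry logic with configurable backoff for Data API throttling. Below is a minimal sketch of that pattern only, not the connector's actual implementation: `retryWithBackoff`, `maxRetries`, and `baseDelayMs` are hypothetical names, and real code would match the SDK's ThrottlingException type rather than inspect a message string.

```scala
import scala.annotation.tailrec
import scala.util.{Failure, Success, Try}

// Sketch of retry with exponential backoff for throttled calls.
// All names are illustrative, not the connector's API.
object RetrySketch {
  // Stand-in predicate: the real connector would match the Redshift
  // Data API's ThrottlingException type; here we key off the message.
  private def isThrottling(e: Throwable): Boolean =
    e.getMessage != null && e.getMessage.contains("ThrottlingException")

  @tailrec
  def retryWithBackoff[T](maxRetries: Int, baseDelayMs: Long)(op: => T): T =
    Try(op) match {
      case Success(result) => result
      case Failure(e) if isThrottling(e) && maxRetries > 0 =>
        Thread.sleep(baseDelayMs)                             // back off before retrying
        retryWithBackoff(maxRetries - 1, baseDelayMs * 2)(op) // double the delay each attempt
      case Failure(e) => throw e
    }
}

// Usage sketch: RetrySketch.retryWithBackoff(maxRetries = 3, baseDelayMs = 100) { runStatement() }
```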

README.md

Lines changed: 6 additions & 6 deletions
@@ -105,7 +105,7 @@ You may use this library in your applications with the following dependency info
 spark-submit \
 --deploy-mode cluster \
 --master yarn \
---packages com.amazon.redshift:redshift-jdbc42:2.2.0,org.apache.spark:spark-avro_2.12:3.5.7,io.github.spark-redshift-community:spark-redshift_2.12:6.5.1-spark_3.5 \
+--packages com.amazon.redshift:redshift-jdbc42:2.2.0,org.apache.spark:spark-avro_2.12:3.5.7,io.github.spark-redshift-community:spark-redshift_2.12:6.6.0-spark_4.0 \
 my_script.py
 ```
 
@@ -116,14 +116,14 @@ You may use this library in your applications with the following dependency info
 <dependency>
   <groupId>io.github.spark-redshift-community</groupId>
   <artifactId>spark-redshift_2.12</artifactId>
-  <version>6.5.1-spark_3.5</version>
+  <version>6.6.0-spark_4.0</version>
 </dependency>
 ```
 
 - **In SBT**:
 
 ```SBT
-libraryDependencies += "io.github.spark-redshift-community" %% "spark-redshift_2.12" % "6.5.1-spark_3.5"
+libraryDependencies += "io.github.spark-redshift-community" %% "spark-redshift_2.12" % "6.6.0-spark_4.0"
 ```
 
 ### Local builds
@@ -132,7 +132,7 @@ You may also build the connector locally by following the below steps.
 2. Install Java 1.8
 3. Install scala (https://www.scala-lang.org/download/)
 4. Install sbt (https://www.scala-sbt.org/download/)
-5. Modify the value `sparkVersion` within `build.sbt` to the target version of Spark. The connector supports Spark 3.3.x, 3.4.x, and 3.5.x
+5. Modify the value `sparkVersion` within `build.sbt` to the target version of Spark. The connector supports Spark 3.3.x, 3.4.x, 3.5.x, and 4.0.x
 6. Build the connector `sbt clean package`
 7. The jar file can be found in `target\scala-2.12\`
 
@@ -901,7 +901,7 @@ for more information.</p>
 <td>""</td>
 <td>
 An identifier to include in the query group set when running queries with the connector. Should be 100 or fewer characters and all characters must be valid unicodeIdentifierParts. Characters in excess of 100 will be trimmed.
-When running a query with the connector a json formatted string will be set as the query group (for example `{"spark-redshift-connector":{"svc":"","ver":"6.5.1-spark_3.5","op":"Read","lbl":"","tid":""}}`).
+When running a query with the connector a json formatted string will be set as the query group (for example `{"spark-redshift-connector":{"svc":"","ver":"6.6.0-spark_4.0","op":"Read","lbl":"","tid":""}}`).
 This option will be substituted for the value of the `lbl` key.
 </td>
 </tr>
@@ -1074,7 +1074,7 @@ var sparkConf = new SparkConf().set("spark.datasource.redshift.community.reject_
 ### trace_id
 A new tracing identifier field that is added to the existing `label` parameter. When set, the provided string value will be used as part of label. Otherwise, it will default to the Spark application identifier. For example:
 
-`{"spark-redshift-connector":{"svc":"","ver":"6.5.1-spark_3.5","op":"Read","lbl":"","tid":"..."}}`)
+`{"spark-redshift-connector":{"svc":"","ver":"6.6.0-spark_4.0","op":"Read","lbl":"","tid":"..."}}`)
 
 To set the value, run the following command:
 ```sparksql
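The trace_id hunk above ends at the opening ```sparksql fence, so the command itself falls outside this diff and is left as-is. Purely to illustrate the configuration pattern, and assuming the option follows the `spark.datasource.redshift.community.*` naming visible in the `reject_filepath` hunk header (the exact key name is an assumption, not confirmed by this commit), setting it from Scala might look like:

```scala
import org.apache.spark.SparkConf

// Hypothetical key name, inferred from the reject_filepath pattern above;
// check the project README for the connector's actual trace_id option.
val conf = new SparkConf()
  .set("spark.datasource.redshift.community.trace_id", "my-trace-id")
```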

version.sbt

Lines changed: 1 addition & 1 deletion
@@ -14,4 +14,4 @@
  * limitations under the License.
  */
 
-ThisBuild / version := "6.5.1"
+ThisBuild / version := "6.6.0"
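`version.sbt` carries only the base version; the published artifacts above add a Spark suffix (for example `6.6.0-spark_4.0`). As a minimal sketch of how such a suffix can be derived from a Spark version string (illustrative Scala, not the project's actual `build.sbt` logic):

```scala
// Illustrative only: derive an artifact version like "6.6.0-spark_4.0"
// from the base version in version.sbt and a Spark version string.
val baseVersion  = "6.6.0"
val sparkVersion = "4.0.0" // the value the README says to set in build.sbt
val sparkMajorMinor = sparkVersion.split('.').take(2).mkString(".")
val artifactVersion = s"$baseVersion-spark_$sparkMajorMinor" // "6.6.0-spark_4.0"
```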

0 commit comments
