Releases: snowflakedb/spark-snowflake
v3.1.8
Improvements
- Fixed UnsupportedOperationException: Unexpected type: NullType when writing DataFrames with structured columns (StructType) containing all-null values via the Parquet write path.
Full Changelog: v3.1.7...v3.1.8
v3.1.7
v3.1.6
Bug Fixes
- Avro Generation: Fixed handling of nested data structures in Avro data generation.
Improvements
- Custom Stages: Added support for user-specified stages during data loading and unloading.
- Iceberg FGAC Support: Enabled read/write access for tables protected by Fine-Grained Access Control.
Full Changelog: v3.1.5...v3.1.6
v3.1.5
Bug Fixes
- S3 Temporary Credentials: Resolved an issue where temporary AWS credentials could not be used to access the S3 bucket and directory designated for data exchange between Spark and Snowflake.
Full Changelog: v3.1.4...v3.1.5
v3.1.4
Bug Fixes
- The S3 client now explicitly uses BASIC authentication for proxy connections when a user and password are provided, preventing potential connection failures with unsupported methods.
Improvements
- Upgraded the suggested commons-lang version.
v3.1.3
New Features
- A force option can now be specified when writing DataFrames to tables.
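In PySpark, such a per-write option would be passed through the DataFrameWriter's options alongside the usual connection map. A minimal sketch, assuming the option key is literally `force` and using placeholder connection values (neither is confirmed by the release notes):

```python
# Sketch: passing the v3.1.3 "force" option when writing a DataFrame to a
# Snowflake table. The option key "force" and the connection values are
# illustrative assumptions, not confirmed by the release notes.

SNOWFLAKE_SOURCE = "net.snowflake.spark.snowflake"

def snowflake_write_options(sf_options, table, force=True):
    """Merge connection options with per-write options for a forced write."""
    opts = dict(sf_options)
    opts["dbtable"] = table
    opts["force"] = str(force).lower()  # Spark options are passed as strings
    return opts

# In a real job the merged options would be applied like this:
#   df.write.format(SNOWFLAKE_SOURCE).options(**opts).mode("overwrite").save()
opts = snowflake_write_options(
    {"sfURL": "account.snowflakecomputing.com", "sfDatabase": "DB"},
    table="MY_TABLE",
)
```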
v3.1.2
Enhancements
- Updated JDBC to version 3.24.2 to incorporate a bug fix for the Java TrustManager.
- Upgraded the parquet-avro library to mitigate security vulnerabilities.
v3.1.1
Bug Fixes
- Fixed a URL resolution issue with China deployments.
v3.1.0
Improvements
- Upgraded JDBC to 3.19.0.
- Changed the internal file format from JSON to Parquet when loading structured data.
- Introduced a new parameter, use_json_in_structured_data, which defaults to false. When enabled, the connector reverts to the JSON format for structured data.
New Features
- Supported the Parquet file format when loading data from Spark to Snowflake.
- Introduced a new parameter, use_parquet_in_write, which defaults to false. When enabled, the Spark connector uses only the Parquet file format when loading data from Spark to Snowflake.
- Introduced a new dependency, parquet-avro, with default version 1.13.1. Because its dependency parquet-column is a Spark built-in library, an incompatibility can occur at runtime; in that case, manually adjust the version of parquet-avro.
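The two v3.1.0 parameters above can be toggled through the connector options map. A minimal sketch, where the parameter names come from the release notes but the connection values are placeholders:

```python
# Sketch: enabling the v3.1.0 Parquet write path. Parameter names
# (use_parquet_in_write, use_json_in_structured_data) come from the
# release notes; connection values are illustrative placeholders.

SNOWFLAKE_SOURCE = "net.snowflake.spark.snowflake"

def with_parquet_write(sf_options, use_parquet=True, use_json_structured=False):
    """Return connector options with the new file-format parameters set."""
    opts = dict(sf_options)
    # When "true", the connector uses only the Parquet file format when
    # loading data from Spark to Snowflake (defaults to "false").
    opts["use_parquet_in_write"] = str(use_parquet).lower()
    # When "true", reverts structured-data loading to the old JSON format
    # (defaults to "false").
    opts["use_json_in_structured_data"] = str(use_json_structured).lower()
    return opts

# Applied in a real job via:
#   df.write.format(SNOWFLAKE_SOURCE).options(**opts).save()
opts = with_parquet_write({"sfURL": "account.snowflakecomputing.com"})
```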
v3.0.0
Improvements
- Upgraded JDBC to 3.17.0 to support LOB.
- Supports Spark 3.5.0.
- Removed the Advanced Query Pushdown feature.
  - Since version 3.0.0, the Spark connector has a single artifact per release, which is compatible with most Spark versions.
  - The old version of the Spark connector (2.x.x) will continue to be supported for up to 2 years.
  - A conversion tool that converts DataFrames between Spark and Snowpark will be introduced in a future Spark connector release, as an alternative to the Advanced Query Pushdown feature.
Bug Fixes
- Removed the requirement of the SFUSER parameter when using OAUTH.
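With the SFUSER requirement removed, an OAuth session can be configured without a user entry in the options map. A minimal sketch; the sfAuthenticator/sfToken option names follow the connector's conventions but should be verified against your connector version, and all values here are placeholders:

```python
# Sketch: connecting with OAuth after v3.0.0, where the sfUser parameter
# is no longer required. Option names sfAuthenticator/sfToken are assumed
# from the connector's naming convention; values are placeholders.

def oauth_options(url, database, schema, token):
    """Build connector options for an OAuth session without sfUser."""
    return {
        "sfURL": url,
        "sfDatabase": database,
        "sfSchema": schema,
        "sfAuthenticator": "oauth",
        "sfToken": token,
        # Note: no "sfUser" entry; since v3.0.0 it is not required with OAuth.
    }

opts = oauth_options("account.snowflakecomputing.com", "DB", "PUBLIC", "<token>")
```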