@@ -74,13 +74,13 @@ You can link against this library in your program at the following coordinates:
 </tr>
 <tr>
 <td>
-<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.11<br>version: 2.8.1</pre>
+<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.11<br>version: 2.8.2</pre>
 </td>
 <td>
-<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.12<br>version: 2.8.1</pre>
+<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.12<br>version: 2.8.2</pre>
 </td>
 <td>
-<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.13<br>version: 2.8.1</pre>
+<pre>groupId: za.co.absa.cobrix<br>artifactId: spark-cobol_2.13<br>version: 2.8.2</pre>
 </td>
 </tr>
 </table>
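For reference, these coordinates map onto a single build dependency. The following is a minimal sketch assuming an sbt build against Spark compiled with Scala 2.12; pick the artifact suffix (`_2.11`, `_2.12`, `_2.13`) that matches your Spark distribution, or let sbt's `%%` operator append it for you.

```scala
// build.sbt (sketch) -- resolve spark-cobol from Maven Central.
// The Scala suffix must match the Scala version of your Spark build.
libraryDependencies += "za.co.absa.cobrix" % "spark-cobol_2.12" % "2.8.2"

// Equivalent form that appends the suffix of the project's scalaVersion automatically:
// libraryDependencies += "za.co.absa.cobrix" %% "spark-cobol" % "2.8.2"
```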
@@ -91,17 +91,17 @@ This package can be added to Spark using the `--packages` command line option. F
 
 ### Spark compiled with Scala 2.11
 ```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.11:2.8.1
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.11:2.8.2
 ```
 
 ### Spark compiled with Scala 2.12
 ```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.8.1
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.8.2
 ```
 
 ### Spark compiled with Scala 2.13
 ```
-$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.13:2.8.1
+$SPARK_HOME/bin/spark-shell --packages za.co.absa.cobrix:spark-cobol_2.13:2.8.2
 ```
 
 ## Usage
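Once the package is on the classpath (for instance via `--packages` as shown above), a read from `spark-shell` boils down to pointing the `cobol` data source at a copybook and a data file. Below is a minimal sketch; the file paths are placeholders.

```scala
// Minimal read from spark-shell (sketch) -- the `spark` session already exists there.
// Replace the paths with your own copybook and mainframe data file.
val df = spark.read
  .format("cobol")                                    // Cobrix data source name
  .option("copybook", "data/companies_copybook.cpy")  // COBOL copybook describing the record layout
  .load("data/companies_data")                        // binary (e.g. EBCDIC) data file

df.printSchema()
df.show(10, truncate = false)
```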
@@ -237,18 +237,18 @@ Cobrix's `spark-cobol` data source depends on the COBOL parser that is a part of
 
 The jars that you need to get are:
 
-* spark-cobol_2.12-2.8.1.jar
-* cobol-parser_2.12-2.8.1.jar
+* spark-cobol_2.12-2.8.2.jar
+* cobol-parser_2.12-2.8.2.jar
 
 > Versions older than 2.8.0 also need `scodec-core_2.12-1.10.3.jar` and `scodec-bits_2.12-1.1.4.jar`.
 
 > Versions older than 2.7.1 also need `antlr4-runtime-4.8.jar`.
 
 After that you can specify these jars in the `spark-shell` command line. Here is an example:
 ```
-$ spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.8.1
+$ spark-shell --packages za.co.absa.cobrix:spark-cobol_2.12:2.8.2
 or
-$ spark-shell --master yarn --deploy-mode client --driver-cores 4 --driver-memory 4G --jars spark-cobol_2.12-2.8.1.jar,cobol-parser_2.12-2.8.1.jar
+$ spark-shell --master yarn --deploy-mode client --driver-cores 4 --driver-memory 4G --jars spark-cobol_2.12-2.8.2.jar,cobol-parser_2.12-2.8.2.jar
 
 Setting default log level to "WARN".
 To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
@@ -319,7 +319,7 @@ The fat jar will have '-bundle' suffix. You can also download pre-built bundles
 
 Then, run `spark-shell` or `spark-submit`, adding the fat jar via the `--jars` option.
 ```sh
-$ spark-shell --jars spark-cobol_2.12_3.3-2.8.2-SNAPSHOT-bundle.jar
+$ spark-shell --jars spark-cobol_2.12_3.3-2.8.3-SNAPSHOT-bundle.jar
 ```
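The same read works from a compiled application submitted with the bundle jar on the classpath. The sketch below is illustrative only: the object name, application jar name, and paths are not part of Cobrix.

```scala
// Sketch of a standalone job. Build it into your own application jar and submit it, e.g.:
//   spark-submit --jars spark-cobol_2.12_3.3-2.8.3-SNAPSHOT-bundle.jar \
//     --class example.CobolReadJob my-app.jar
package example

import org.apache.spark.sql.SparkSession

object CobolReadJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("cobrix-read-example")
      .getOrCreate()

    spark.read
      .format("cobol")
      .option("copybook", "data/companies_copybook.cpy") // placeholder path
      .load("data/companies_data")                       // placeholder path
      .show(10, truncate = false)

    spark.stop()
  }
}
```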
 
 > <b>A note for building and running tests on Windows</b>
@@ -1791,6 +1791,15 @@ at org.apache.hadoop.io.nativeio.NativeIO$POSIX.getStat(NativeIO.java:608)
 A: Update hadoop dll to version 3.2.2 or newer.
 
 ## Changelog
+- #### 2.8.2 released 25 February 2025.
+  - [#744](https://github.com/AbsaOSS/cobrix/issues/744) Added the ability to specify a default record length for the record length field mapping:
+    The default record length can be specified by assigning a value to the underscore key `"_"`. For example:
+    ```scala
+    .option("record_format", "F")
+    .option("record_length_field", "RECORD_TYPE")
+    .option("record_length_map", """{"A":100,"B":200,"_":500}""")
+    ```
+
 - #### 2.8.1 released 27 January 2025.
   - [#730](https://github.com/AbsaOSS/cobrix/issues/730) Added more code pages with the euro character in https://github.com/AbsaOSS/cobrix/pull/741.
   - [#740](https://github.com/AbsaOSS/cobrix/issues/740) Extended binary type support to make sure unsigned binary fields can fit Spark data types in https://github.com/AbsaOSS/cobrix/pull/742.
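Putting the 2.8.2 addition in context: the `"_"` key acts as a fallback length for record types that are not listed in the mapping. The sketch below combines the options from the changelog example above with a typical read; the copybook and data paths are placeholders.

```scala
// Sketch: records whose length is driven by the value of the RECORD_TYPE field.
// "A" records are 100 bytes, "B" records are 200 bytes, and any other value
// falls back to the default length of 500 via the "_" key (new in 2.8.2).
val df = spark.read
  .format("cobol")
  .option("copybook", "data/records_copybook.cpy")   // placeholder path
  .option("record_format", "F")
  .option("record_length_field", "RECORD_TYPE")
  .option("record_length_map", """{"A":100,"B":200,"_":500}""")
  .load("data/records_data")                         // placeholder path
```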