
Commit ae8495c

version bump to 0.4.1
1 parent 1af02d8 commit ae8495c

File tree

4 files changed: +9 −9 lines changed


README.md

Lines changed: 4 additions & 4 deletions
@@ -58,7 +58,7 @@ See [Our Features](https://dataflint.gitbook.io/dataflint-for-spark/overview/our
 
 Install DataFlint via sbt:
 ```sbt
-libraryDependencies += "io.dataflint" %% "spark" % "0.4.0"
+libraryDependencies += "io.dataflint" %% "spark" % "0.4.1"
 ```
 
 Then instruct spark to load the DataFlint plugin:

@@ -76,7 +76,7 @@ Add these 2 configs to your pyspark session builder:
 ```python
 builder = pyspark.sql.SparkSession.builder
   ...
-  .config("spark.jars.packages", "io.dataflint:spark_2.12:0.4.0") \
+  .config("spark.jars.packages", "io.dataflint:spark_2.12:0.4.1") \
   .config("spark.plugins", "io.dataflint.spark.SparkDataflintPlugin") \
   ...
 ```

@@ -87,14 +87,14 @@ Alternatively, install DataFlint with **no code change** as a spark ivy package
 
 ```bash
 spark-submit
-  --packages io.dataflint:spark_2.12:0.4.0 \
+  --packages io.dataflint:spark_2.12:0.4.1 \
   --conf spark.plugins=io.dataflint.spark.SparkDataflintPlugin \
   ...
 ```
 
 ### Additional installation options
 
-* There is also support for scala 2.13, if your spark cluster is using scala 2.13 change package name to io.dataflint:spark_**2.13**:0.4.0
+* There is also support for scala 2.13, if your spark cluster is using scala 2.13 change package name to io.dataflint:spark_**2.13**:0.4.1
 * For more installation options, including for **python** and **k8s spark-operator**, see [Install on Spark docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-spark)
 * For installing DataFlint in **spark history server** for observability on completed runs see [install on spark history server docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-spark-history-server)
 * For installing DataFlint on **DataBricks** see [install on databricks docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-databricks)
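The bump touches the same artifact coordinate in three install paths (sbt, pyspark session config, spark-submit), differing only in the Scala binary suffix. As a minimal sketch, a hypothetical helper (not part of DataFlint) can assemble the coordinate the way sbt's `%%` operator derives the `_2.12`/`_2.13` suffix from the full Scala version:

```python
def dataflint_package(scala_version: str, dataflint_version: str = "0.4.1") -> str:
    """Build the ivy coordinate for --packages / spark.jars.packages.

    Hypothetical helper: mirrors sbt's %% behavior of appending the Scala
    binary version (major.minor) to the artifact name, as seen in the README.
    """
    binary_version = ".".join(scala_version.split(".")[:2])
    return f"io.dataflint:spark_{binary_version}:{dataflint_version}"

print(dataflint_package("2.12.18"))  # io.dataflint:spark_2.12:0.4.1
print(dataflint_package("2.13.12"))  # io.dataflint:spark_2.13:0.4.1
```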

spark-plugin/build.sbt

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
 import xerial.sbt.Sonatype._
 
-lazy val versionNum: String = "0.4.1-SNAPSHOT"
+lazy val versionNum: String = "0.4.1"
 lazy val scala212 = "2.12.18"
 lazy val scala213 = "2.13.12"
 lazy val supportedScalaVersions = List(scala212, scala213)

@@ -22,7 +22,7 @@ lazy val dataflint = project
 
 lazy val plugin = (project in file("plugin"))
   .settings(
-    name := "dataflint-spark-dbr-14-plus",
+    name := "spark",
     organization := "io.dataflint",
     crossScalaVersions := supportedScalaVersions,
     version := (if (git.gitCurrentTags.value.exists(_.startsWith("v"))) {
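The second build.sbt hunk shows the published version being gated on git tags: the visible condition checks for a tag starting with `"v"`. The diff truncates the if/else bodies, so the following Python sketch of that logic rests on assumptions; in particular, the `-SNAPSHOT` fallback branch is guessed from the old `versionNum` value, not taken from the source:

```python
def resolve_version(git_tags: list[str], version_num: str = "0.4.1") -> str:
    # Mirrors the visible condition in build.sbt: use versionNum as-is when
    # a "v"-prefixed release tag is present. The -SNAPSHOT fallback below is
    # an assumption; the diff cuts off before the conditional's branches.
    if any(tag.startswith("v") for tag in git_tags):
        return version_num
    return version_num + "-SNAPSHOT"

print(resolve_version(["v0.4.1"]))  # 0.4.1
print(resolve_version([]))          # 0.4.1-SNAPSHOT
```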

spark-ui/package-lock.json

Lines changed: 2 additions & 2 deletions
Some generated files are not rendered by default.

spark-ui/package.json

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 {
   "name": "dataflint-ui",
-  "version": "0.4.0",
+  "version": "0.4.1",
   "homepage": "./",
   "private": true,
   "dependencies": {
