
Commit

version bump to 0.2.1
menishmueli committed Jun 3, 2024
1 parent 1922969 commit 142a96d
Showing 5 changed files with 9 additions and 9 deletions.
2 changes: 1 addition & 1 deletion .vscode/launch.json
@@ -2,7 +2,7 @@
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"version": "0.2.1",
"configurations": [
{
"type": "chrome",
8 changes: 4 additions & 4 deletions README.md
@@ -47,7 +47,7 @@ See [Our Features](https://dataflint.gitbook.io/dataflint-for-spark/overview/our

Install DataFlint via sbt:
```sbt
libraryDependencies += "io.dataflint" %% "spark" % "0.2.0"
libraryDependencies += "io.dataflint" %% "spark" % "0.2.1"
```

Then instruct Spark to load the DataFlint plugin:
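The snippet that configures Spark to load the plugin is collapsed in this diff view. As a rough sketch only (not the README's exact collapsed code), using the standard `SparkSession` builder API and the plugin class name that appears in the spark-submit example later in this README, it might look like:

```scala
// Hypothetical sketch of loading the DataFlint plugin from Scala.
// The class name io.dataflint.spark.SparkDataflintPlugin is taken from
// the spark-submit example elsewhere in this README; the app name is invented.
import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("dataflint-example") // hypothetical application name
  .config("spark.plugins", "io.dataflint.spark.SparkDataflintPlugin")
  .getOrCreate()
```

This is a configuration fragment; it assumes the `io.dataflint` artifact is already on the classpath via the sbt dependency above.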
@@ -65,7 +65,7 @@ Add these two configs to your PySpark session builder:
```python
builder = pyspark.sql.SparkSession.builder
...
.config("spark.jars.packages", "io.dataflint:spark_2.12:0.2.0") \
.config("spark.jars.packages", "io.dataflint:spark_2.12:0.2.1") \
.config("spark.plugins", "io.dataflint.spark.SparkDataflintPlugin") \
...
```
@@ -76,7 +76,7 @@ Alternatively, install DataFlint with **no code change** as a Spark Ivy package

```bash
spark-submit
--packages io.dataflint:spark_2.12:0.2.0 \
--packages io.dataflint:spark_2.12:0.2.1 \
--conf spark.plugins=io.dataflint.spark.SparkDataflintPlugin \
...
```
@@ -89,7 +89,7 @@ After installation you will see a "DataFlint" button in the Spark UI; click on

### Additional installation options

* Scala 2.13 is also supported; if your Spark cluster uses Scala 2.13, change the package name to io.dataflint:spark_**2.13**:0.2.0
* Scala 2.13 is also supported; if your Spark cluster uses Scala 2.13, change the package name to io.dataflint:spark_**2.13**:0.2.1
* For more installation options, including for **python** and **k8s spark-operator**, see [Install on Spark docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-spark)
* For installing DataFlint in **spark history server** for observability on completed runs see [install on spark history server docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-spark-history-server)
* For installing DataFlint on **DataBricks** see [install on databricks docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-databricks)
2 changes: 1 addition & 1 deletion spark-plugin/build.sbt
@@ -1,6 +1,6 @@
import xerial.sbt.Sonatype._

lazy val versionNum: String = "0.2.0"
lazy val versionNum: String = "0.2.1"
lazy val scala212 = "2.12.18"
lazy val scala213 = "2.13.12"
lazy val supportedScalaVersions = List(scala212, scala213)
4 changes: 2 additions & 2 deletions spark-ui/package-lock.json

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion spark-ui/package.json
@@ -1,6 +1,6 @@
{
"name": "dataflint-ui",
"version": "0.2.0",
"version": "0.2.1",
"homepage": "./",
"private": true,
"dependencies": {

0 comments on commit 142a96d
