From 1d515281cee6f9f7c1518c43350f7f8f3127327b Mon Sep 17 00:00:00 2001
From: Jaroslaw Grabowski
Date: Mon, 21 Aug 2023 10:31:29 +0200
Subject: [PATCH] Prepare 3.4.1 release

---
 CHANGES.txt           |  3 +++
 README.md             | 14 +++++++-------
 doc/0_quick_start.md  |  8 ++++----
 doc/13_spark_shell.md |  2 +-
 doc/15_python.md      |  2 +-
 5 files changed, 16 insertions(+), 13 deletions(-)

diff --git a/CHANGES.txt b/CHANGES.txt
index f32c1702c..653db5e35 100644
--- a/CHANGES.txt
+++ b/CHANGES.txt
@@ -1,3 +1,6 @@
+3.4.1
+ * Scala 2.13 support (SPARKC-686)
+
 3.4.0
  * Spark 3.4.x support (SPARKC-702)
  * Fix complex field extractor after join on CassandraDirectJoinStrategy (SPARKC-700)
diff --git a/README.md b/README.md
index 209dbb8e6..460e28639 100644
--- a/README.md
+++ b/README.md
@@ -9,8 +9,8 @@
 | What | Where |
 | ---------- |---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
 | Community | Chat with us at [Apache Cassandra](https://cassandra.apache.org/_/community.html#discussions) |
-| Scala Docs | Most Recent Release (3.4.0): [Connector API docs](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.0/connector/com/datastax/spark/connector/index.html), [Connector Driver docs](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.0/driver/com/datastax/spark/connector/index.html) |
-| Latest Production Release | [3.4.0](https://search.maven.org/artifact/com.datastax.spark/spark-cassandra-connector_2.12/3.4.0/jar) |
+| Scala Docs | Most Recent Release (3.4.1): [Connector API docs](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.1/connector/com/datastax/spark/connector/index.html), [Connector Driver docs](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.1/driver/com/datastax/spark/connector/index.html) |
+| Latest Production Release | [3.4.1](https://search.maven.org/artifact/com.datastax.spark/spark-cassandra-connector_2.12/3.4.1/jar) |
 
 ## Features
 
@@ -53,8 +53,8 @@ Currently, the following branches are actively supported:
 2.5.x ([b2.5](https://github.com/datastax/spark-cassandra-connector/tree/b2.5)).
 
 | Connector | Spark | Cassandra | Cassandra Java Driver | Minimum Java Version | Supported Scala Versions |
-|-----------|---------------|-----------------------| --------------------- | -------------------- | ----------------------- |
-| 3.4 | 3.4 | 2.1.5*, 2.2, 3.x, 4.x | 4.13 | 8 | 2.12 |
+|-----------|---------------|-----------------------| --------------------- | -------------------- |--------------------------|
+| 3.4 | 3.4 | 2.1.5*, 2.2, 3.x, 4.x | 4.13 | 8 | 2.12, 2.13 |
 | 3.3 | 3.3 | 2.1.5*, 2.2, 3.x, 4.x | 4.13 | 8 | 2.12 |
 | 3.2 | 3.2 | 2.1.5*, 2.2, 3.x, 4.0 | 4.13 | 8 | 2.12 |
 | 3.1 | 3.1 | 2.1.5*, 2.2, 3.x, 4.0 | 4.12 | 8 | 2.12 |
@@ -77,8 +77,8 @@
 ## Hosted API Docs
 API documentation for the Scala and Java interfaces are available online:
 
-### 3.4.0
-* [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.0/connector/com/datastax/spark/connector/index.html)
+### 3.4.1
+* [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.4.1/connector/com/datastax/spark/connector/index.html)
 
 ### 3.3.0
 * [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.3.0/connector/com/datastax/spark/connector/index.html)
@@ -105,7 +105,7 @@ This project is available on the Maven Central Repository.
 For SBT to download the connector binaries, sources and javadoc, put this in your project
 SBT config:
 
-    libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "3.4.0"
+    libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "3.4.1"
 
 * The default Scala version for Spark 3.0+ is 2.12 please choose the appropriate build. See the
   [FAQ](doc/FAQ.md) for more information.
diff --git a/doc/0_quick_start.md b/doc/0_quick_start.md
index 94a5bb892..f046bc73b 100644
--- a/doc/0_quick_start.md
+++ b/doc/0_quick_start.md
@@ -15,14 +15,14 @@
 Configure a new Scala project with the Apache Spark and dependency.
 The dependencies are easily retrieved via Maven Central
 
-    libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.12" % "3.4.0"
+    libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.12" % "3.4.1"
 
 The spark-packages libraries can also be used with spark-submit and spark shell, these
 commands will place the connector and all of its dependencies on the path of the
 Spark Driver and all Spark Executors.
 
-    $SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.0
-    $SPARK_HOME/bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.0
+    $SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.1
+    $SPARK_HOME/bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.1
 
 For the list of available versions, see:
 - https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_2.12/
@@ -42,7 +42,7 @@ and *all* of its dependencies on the Spark Class Path
 To configure the default Spark Configuration pass key value pairs with `--conf`
 
     $SPARK_HOME/bin/spark-shell --conf spark.cassandra.connection.host=127.0.0.1 \
-                                --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.0
+                                --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.1
                                 --conf spark.sql.extensions=com.datastax.spark.connector.CassandraSparkExtensions
 
 This command would set the Spark Cassandra Connector parameter
diff --git a/doc/13_spark_shell.md b/doc/13_spark_shell.md
index cd72e035c..49303d24c 100644
--- a/doc/13_spark_shell.md
+++ b/doc/13_spark_shell.md
@@ -18,7 +18,7 @@ Find additional versions at [Spark Packages](https://repo1.maven.org/maven2/com/
 ```bash
 cd spark/install/dir
 #Include the --master if you want to run against a spark cluster and not local mode
-./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.0 --conf spark.cassandra.connection.host=yourCassandraClusterIp
+./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.1 --conf spark.cassandra.connection.host=yourCassandraClusterIp
 ```
 By default spark will log everything to the console and this may be a bit of an overload.
 To change this copy and modify the `log4j.properties` template file
diff --git a/doc/15_python.md b/doc/15_python.md
index 4a76f8df9..e1016c080 100644
--- a/doc/15_python.md
+++ b/doc/15_python.md
@@ -14,7 +14,7 @@ shell similarly to how the spark shell is started. The preferred method is now t
 
 ```bash
 ./bin/pyspark \
-  --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.0 \
+  --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.1 \
   --conf spark.sql.extensions=com.datastax.spark.connector.CassandraSparkExtensions
 ```
 
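Editor's note: the doc changes in this patch are a mechanical version-string substitution (3.4.0 to 3.4.1) across the four Markdown files in the diffstat. A minimal sketch of scripting such a bump is below; the script itself is hypothetical (this patch was evidently prepared by hand), the file list comes from the diffstat, and CHANGES.txt is excluded because its change adds a new release section rather than substituting a version string.

```python
from pathlib import Path

OLD, NEW = "3.4.0", "3.4.1"

# Doc files touched by this patch, per its diffstat. CHANGES.txt is left out:
# its edit is a hand-written new section, not a version substitution.
DOCS = [
    "README.md",
    "doc/0_quick_start.md",
    "doc/13_spark_shell.md",
    "doc/15_python.md",
]

def bump(text: str, old: str = OLD, new: str = NEW) -> str:
    """Replace every occurrence of the old release string with the new one."""
    return text.replace(old, new)

def bump_files(files: list[str] = DOCS) -> None:
    """Rewrite each doc in place, mirroring what this patch does by hand."""
    for name in files:
        path = Path(name)
        path.write_text(bump(path.read_text()))
```

A blind replace is only safe when, as here, every remaining occurrence of the old version in these files is meant to change; historical mentions that should survive (like the 3.4.0 section kept in CHANGES.txt) must stay out of the file list.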