Merged
3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -12,6 +12,7 @@ All notable changes to this project will be documented in this file.
- Add `SparkApplication.spec.job.retryOnFailureCount` field with a default of `0`.
This has the effect that applications where the `spark-submit` Pod fails are not resubmitted.
Previously, Jobs were retried at most 6 times by default ([#647]).
- Support for Spark `3.5.8` ([#650]).
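
The `retryOnFailureCount` field added above is set per application. A minimal sketch, assuming the operator's usual `SparkApplication` shape — the `apiVersion`, resource name, and main application file here are illustrative, not taken from this changelog:

```yaml
apiVersion: spark.stackable.tech/v1alpha1
kind: SparkApplication
metadata:
  name: spark-pi
spec:
  sparkImage:
    productVersion: 3.5.8
  mode: cluster
  mainApplicationFile: local:///stackable/spark/examples/src/main/python/pi.py
  job:
    # Default is 0: if the spark-submit Pod fails, the application is not resubmitted.
    retryOnFailureCount: 0
```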

### Changed

@@ -24,12 +25,14 @@ All notable changes to this project will be documented in this file.
### Removed

- Support for Spark `3.5.6` ([#642]).
- Deprecated support for Spark `3.5.7` ([#650]).

[#640]: https://github.com/stackabletech/spark-k8s-operator/pull/640
[#642]: https://github.com/stackabletech/spark-k8s-operator/pull/642
[#647]: https://github.com/stackabletech/spark-k8s-operator/pull/647
[#648]: https://github.com/stackabletech/spark-k8s-operator/pull/648
[#649]: https://github.com/stackabletech/spark-k8s-operator/pull/649
[#650]: https://github.com/stackabletech/spark-k8s-operator/pull/650
[#651]: https://github.com/stackabletech/spark-k8s-operator/pull/651

## [25.11.0] - 2025-11-07
2 changes: 1 addition & 1 deletion docs/modules/spark-k8s/examples/example-history-app.yaml
@@ -5,7 +5,7 @@ metadata:
name: spark-pi-s3-1
spec:
sparkImage:
productVersion: 3.5.7
productVersion: 3.5.8
pullPolicy: IfNotPresent
mode: cluster
mainClass: org.apache.spark.examples.SparkPi
@@ -5,7 +5,7 @@ metadata:
name: spark-history
spec:
image:
productVersion: 3.5.7
productVersion: 3.5.8
logFileDirectory: # <1>
s3:
prefix: eventlogs/ # <2>
2 changes: 1 addition & 1 deletion docs/modules/spark-k8s/examples/example-spark-connect.yaml
@@ -5,7 +5,7 @@ metadata:
name: spark-connect # <1>
spec:
image:
productVersion: "3.5.7" # <2>
productVersion: "3.5.8" # <2>
pullPolicy: IfNotPresent
args:
- "--package org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.8.1" # <3>
@@ -6,7 +6,7 @@ metadata:
namespace: default
spec:
sparkImage:
productVersion: 3.5.7
productVersion: 3.5.8
mode: cluster
mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/ny-tlc-report-1.1.0.jar # <3>
mainClass: tech.stackable.demo.spark.NYTLCReport
@@ -7,7 +7,7 @@ metadata:
spec:
image: oci.stackable.tech/stackable/ny-tlc-report:0.2.0 # <1>
sparkImage:
productVersion: 3.5.7
productVersion: 3.5.8
mode: cluster
mainApplicationFile: local:///stackable/spark/jobs/ny_tlc_report.py # <2>
args:
2 changes: 1 addition & 1 deletion docs/modules/spark-k8s/examples/example-sparkapp-pvc.yaml
@@ -6,7 +6,7 @@ metadata:
namespace: default
spec:
sparkImage:
productVersion: 3.5.7
productVersion: 3.5.8
mode: cluster
mainApplicationFile: s3a://my-bucket/app.jar # <1>
mainClass: org.example.App # <2>
@@ -5,7 +5,7 @@ metadata:
name: example-sparkapp-s3-private
spec:
sparkImage:
productVersion: 3.5.7
productVersion: 3.5.8
mode: cluster
mainApplicationFile: s3a://my-bucket/spark-examples.jar # <1>
mainClass: org.apache.spark.examples.SparkPi # <2>
@@ -6,7 +6,7 @@ metadata:
namespace: default
spec:
sparkImage:
productVersion: 3.5.7
productVersion: 3.5.8
mode: cluster
mainApplicationFile: local:///stackable/spark/examples/src/main/python/streaming/hdfs_wordcount.py
args:
@@ -6,7 +6,7 @@ metadata:
namespace: default
spec:
sparkImage: # <2>
productVersion: 3.5.7
productVersion: 3.5.8
mode: cluster # <3>
mainApplicationFile: local:///stackable/spark/examples/src/main/python/pi.py # <4>
job: # <5>
10 changes: 5 additions & 5 deletions docs/modules/spark-k8s/pages/usage-guide/job-dependencies.adoc
@@ -29,7 +29,7 @@ Below is an example of a custom image that includes a JDBC driver:

[source, Dockerfile]
----
FROM oci.stackable.tech/sdp/spark-k8s:3.5.7-stackable0.0.0-dev # <1>
FROM oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev # <1>

RUN curl --fail -o /stackable/spark/jars/postgresql-42.6.0.jar "https://jdbc.postgresql.org/download/postgresql-42.6.0.jar" # <2>
----
@@ -41,8 +41,8 @@ Build your custom image and push it to your container registry.

[source, bash]
----
docker build -t my-registry/spark-k8s:3.5.7-psql .
docker push my-registry/spark-k8s:3.5.7-psql
docker build -t my-registry/spark-k8s:3.5.8-psql .
docker push my-registry/spark-k8s:3.5.8-psql
----

And the following snippet showcases an application that uses the custom image:
@@ -55,8 +55,8 @@ metadata:
name: spark-jdbc
spec:
sparkImage:
custom: "my-registry/spark-k8s:3.5.7-psql" # <1>
productVersion: "3.5.7" # <2>
custom: "my-registry/spark-k8s:3.5.8-psql" # <1>
productVersion: "3.5.8" # <2>
...
----

9 changes: 5 additions & 4 deletions docs/modules/spark-k8s/partials/supported-versions.adoc
@@ -3,11 +3,12 @@
// Stackable Platform documentation.
// Please sort the versions in descending order (newest first)

- 4.1.1 (Hadoop 3.4.2, Scala 2.13, Python 3.12, Java 21) (Experimental)
- 4.0.1 (Hadoop 3.4.2, Scala 2.13, Python 3.12, Java 21)
- 3.5.7 (Hadoop 3.4.2, Scala 2.12, Python 3.11, Java 17) (LTS)
- 4.1.1 (Hadoop 3.4.2, Scala 2.13, Python 3.12, Java 21)
- 4.0.1 (Hadoop 3.4.2, Scala 2.13, Python 3.12, Java 21) (Deprecated)
- 3.5.8 (Hadoop 3.4.2, Scala 2.12, Python 3.11, Java 17) (LTS)
- 3.5.7 (Hadoop 3.4.2, Scala 2.12, Python 3.11, Java 17) (Deprecated)

Some reasons why Spark 4.1.1 is considered experimental (as of January 2026):
Apache Spark 4.1.1 has the following known issues (as of February 2026):

- Missing HBase compatibility (See: https://github.com/apache/hbase-connectors/pull/130)
- No Iceberg Spark runtime release with support for Spark 4.1 available yet.
4 changes: 2 additions & 2 deletions examples/README-examples.md
@@ -50,10 +50,10 @@ Several resources are needed in this store. These can be loaded like this:

```text
kubectl exec minio-mc-0 -- sh -c 'mc alias set test-minio http://test-minio:9000/'
kubectl cp tests/templates/kuttl/spark-ny-public-s3/ny-tlc-report-1.1.0-3.5.7.jar minio-mc-0:/tmp
kubectl cp tests/templates/kuttl/spark-ny-public-s3/ny-tlc-report-1.1.0-3.5.8.jar minio-mc-0:/tmp
kubectl cp apps/ny_tlc_report.py minio-mc-0:/tmp
kubectl cp examples/yellow_tripdata_2021-07.csv minio-mc-0:/tmp
kubectl exec minio-mc-0 -- mc cp /tmp/ny-tlc-report-1.1.0-3.5.7.jar test-minio/my-bucket
kubectl exec minio-mc-0 -- mc cp /tmp/ny-tlc-report-1.1.0-3.5.8.jar test-minio/my-bucket
kubectl exec minio-mc-0 -- mc cp /tmp/ny_tlc_report.py test-minio/my-bucket
kubectl exec minio-mc-0 -- mc cp /tmp/yellow_tripdata_2021-07.csv test-minio/my-bucket
```
2 changes: 1 addition & 1 deletion examples/ny-tlc-report-external-dependencies.yaml
@@ -6,7 +6,7 @@ metadata:
namespace: default
spec:
sparkImage:
productVersion: 3.5.7
productVersion: 3.5.8
pullPolicy: IfNotPresent
mode: cluster
mainApplicationFile: s3a://my-bucket/ny_tlc_report.py
2 changes: 1 addition & 1 deletion examples/ny-tlc-report-image.yaml
@@ -7,7 +7,7 @@ metadata:
spec:
# everything under /jobs will be copied to /stackable/spark/jobs
image: oci.stackable.tech/stackable/ny-tlc-report:0.2.0
sparkImage: oci.stackable.tech/sdp/spark-k8s:3.5.7-stackable0.0.0-dev
sparkImage: oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev
sparkImagePullPolicy: IfNotPresent
mode: cluster
mainApplicationFile: local:///stackable/spark/jobs/ny_tlc_report.py
4 changes: 2 additions & 2 deletions examples/ny-tlc-report.yaml
@@ -13,9 +13,9 @@ metadata:
name: spark-ny-cm
spec:
sparkImage:
productVersion: 3.5.7
productVersion: 3.5.8
mode: cluster
mainApplicationFile: s3a://my-bucket/ny-tlc-report-1.1.0-3.5.7.jar
mainApplicationFile: s3a://my-bucket/ny-tlc-report-1.1.0-3.5.8.jar
mainClass: tech.stackable.demo.spark.NYTLCReport
volumes:
- name: cm-job-arguments
2 changes: 1 addition & 1 deletion rust/operator-binary/src/crd/affinity.rs
@@ -47,7 +47,7 @@ mod test {
name: spark-history
spec:
image:
productVersion: 3.5.7
productVersion: 3.5.8
logFileDirectory:
s3:
prefix: eventlogs/
2 changes: 1 addition & 1 deletion rust/operator-binary/src/crd/history.rs
@@ -452,7 +452,7 @@ mod test {
name: spark-history
spec:
image:
productVersion: 3.5.7
productVersion: 3.5.8
logFileDirectory:
s3:
prefix: eventlogs/
4 changes: 2 additions & 2 deletions rust/operator-binary/src/history/config/jvm.rs
@@ -65,7 +65,7 @@ mod tests {
name: spark-history
spec:
image:
productVersion: 3.5.7
productVersion: 3.5.8
logFileDirectory:
s3:
prefix: eventlogs/
@@ -98,7 +98,7 @@ mod tests {
name: spark-history
spec:
image:
productVersion: 3.5.7
productVersion: 3.5.8
logFileDirectory:
s3:
prefix: eventlogs/
Binary file not shown.
Binary file not shown.
6 changes: 6 additions & 0 deletions tests/test-definition.yaml
@@ -6,6 +6,7 @@ dimensions:
- name: spark
values:
- 3.5.7
- 3.5.8
- 4.0.1
- 4.1.1
# Alternatively, if you want to use a custom image, append a comma and the full image name to the product version
@@ -14,30 +15,35 @@
- name: spark-logging
values:
- 3.5.7
- 3.5.8
- 4.0.1
- 4.1.1
- name: spark-hbase-connector
values:
- 3.5.7
- 3.5.8
# No hbase-connector release with support for Spark 4 yet.
# - 4.0.1
# - 4.1.1
- name: spark-delta-lake
values:
- 3.5.7
- 3.5.8
- 4.0.1
# No delta-lake release with support for Spark 4.1 yet
# - 4.1.1
# - 3.5.6,oci.stackable.tech/sandbox/spark-k8s:3.5.6-stackable0.0.0-dev
- name: spark-iceberg
values:
- 3.5.7
- 3.5.8
- 4.0.1
# No iceberg release with support for Spark 4.1 yet
# - 4.1.1
- name: spark-connect
values:
- 3.5.7
- 3.5.8
- 4.0.1
- 4.1.1
# - 3.5.6,oci.stackable.tech/sandbox/spark-k8s:3.5.6-stackable0.0.0-dev