Apache Spark is a fast and general-purpose cluster computing system.
This chart deploys a standalone Spark master (with an attached Jupyter notebook service) and the Spark web UI on a Kubernetes cluster.

To install the chart with the release name `my-release`:

```bash
$ helm install --name my-release stable/spark
```
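After installation, you can verify the release. The commands below assume the `my-release` name used above and `kubectl` access to the same cluster:

```bash
# Show the status of the Helm release (Helm 2 syntax, matching the
# --name flag used above)
$ helm status my-release

# List the pods created by the chart
$ kubectl get pods
```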
The following table lists the configurable parameters of the Spark chart and their default values.
| Parameter | Description | Default |
|---|---|---|
| `master.name` | Spark master name | `master` |
| `master.image` | Container image name | `cloudpg/spark-py` |
| `master.imageTag` | Container image tag | `dodas-2.4.3-bigdl` |
| `master.replicas` | k8s deployment replicas | `1` |
| `master.component` | k8s selector key | `spark-master` |
| `master.cpu` | Container requested CPU | `100m` |
| `master.memory` | Container requested memory | `1024Mi` |
| `master.servicePort` | k8s service port | `7077` |
| `master.containerPort` | Container listening port | `7077` |
| `master.daemonMemory` | Master JVM Xms and Xmx option | `1g` |
| `master.serviceType` | Kubernetes Service type | `nodePort` |
| `master.jupyter.nodePort` | k8s node port | `30888` |
| `master.jupyter.token` | Token for the Jupyter notebook | `testme` |
| `hostpath.path` | Path where to store volumes | `/tmp` |
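As a sketch, a `values.yaml` fragment overriding a few of the master parameters above might look like the following (the specific values are illustrative, not recommendations):

```yaml
# Illustrative overrides for the master parameters listed above
master:
  replicas: 1
  cpu: 500m
  memory: 2048Mi
  daemonMemory: 2g
  jupyter:
    token: changeme   # replace the default "testme" token
```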
| Parameter | Description | Default |
|---|---|---|
| `webUi.name` | Spark web UI name | `webui` |
| `webUi.servicePort` | k8s service port | `8080` |
| `webUi.containerPort` | Container listening port | `8080` |
| `webUi.serviceType` | Kubernetes Service type | `nodePort` |
| `webUi.nodePort` | k8s node port | `30808` |
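Similarly, a `values.yaml` fragment for the web UI parameters could look like this (the node port value is illustrative):

```yaml
# Illustrative overrides for the web UI parameters listed above
webUi:
  servicePort: 8080
  serviceType: nodePort
  nodePort: 30900   # any free port in the cluster's NodePort range
```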
Specify each parameter using the `--set key=value[,key=value]` argument to `helm install`.
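For example, to override the Jupyter token and the master memory from the command line (parameter names are taken from the tables above; the release name and values are illustrative):

```bash
$ helm install --name my-release \
    --set master.jupyter.token=mysecret,master.memory=2048Mi \
    stable/spark
```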
Alternatively, a YAML file that specifies the values for the parameters can be provided while installing the chart. For example:

```bash
$ helm repo add dodas https://dodas-ts.github.io/helm_charts
$ helm repo update
$ helm install --name my-release -f values.yaml dodas/spark
```
Tip: You can use the default `values.yaml` as a starting point.
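One way to obtain the chart's default `values.yaml` for editing (Helm 2 syntax, matching the commands above) is:

```bash
# Write the chart's default values to a local file for customization
$ helm inspect values dodas/spark > values.yaml
```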