I am able to run SparkPi on Kubernetes (deployed on GKE). However, when the job tries to send the computed Pi value to my microservice at toys-broadcast-svc.toys.svc.cluster.local, the hostname fails to resolve (I get an UnknownHostException). Can anyone help? Am I missing something here?
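One quick way to check whether cross-namespace service DNS works at all is to resolve the FQDN from a throwaway pod in the same namespace as the Spark job. This is a diagnostic sketch, not part of the setup above; the busybox image is just a commonly used choice for DNS debugging:

```shell
# Launch a temporary pod in the Spark namespace and try to resolve the service FQDN
kubectl run dns-test --rm -it --restart=Never \
  --namespace=toys-spark \
  --image=busybox:1.28 \
  -- nslookup toys-broadcast-svc.toys.svc.cluster.local
```

If this resolves, cluster DNS itself is healthy and the problem is specific to the driver/executor pods.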
For reference, I installed the operator with Helm:
helm install sparkoperator incubator/sparkoperator --namespace toys-spark-operator --set sparkJobNamespace=toys-spark,enableWebhook=true
I am using spark-operator (my microservices run in the namespace toys, and Spark runs in the namespace toys-spark).
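To rule out the operator itself, it can help to confirm it is running and has picked up the job before digging into DNS. These are standard checks, sketched under the namespaces used above:

```shell
# The operator pod should be in Running state
kubectl get pods -n toys-spark-operator

# The SparkApplication status shows the driver state and recent events
kubectl describe sparkapplication spark-pi -n toys-spark
```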
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: toys-spark # apps namespace
spec:
  type: Java
  mode: cluster
  image: toysindia/spark:3.0.1
  imagePullPolicy: Always
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.0.1.jar
  sparkVersion: 3.0.1
  restartPolicy:
    type: Never
  volumes:
    - name: "toys-spark-test-volume-driver"
      hostPath:
        path: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/driver"
        type: Directory
    - name: "toys-spark-test-volume-executor"
      hostPath:
        path: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/executor"
        type: Directory
  driver:
    cores: 1
    coreLimit: "1200m"
    memory: "512m"
    labels:
      version: 3.0.1
    serviceAccount: spark
    volumeMounts:
      - name: "toys-spark-test-volume-driver"
        mountPath: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/driver"
  executor:
    cores: 1
    instances: 1
    memory: "512m"
    labels:
      version: 3.0.1
    volumeMounts:
      - name: "toys-spark-test-volume-executor"
        mountPath: "/host_mnt/usr/local/storage/k8s/dock-storage/spark/executor"
  sparkConf:
    spark.eventLog.dir:
    spark.eventLog.enabled: "true"
---
apiVersion: v1
kind: Namespace
metadata:
  name: toys-spark-operator
---
apiVersion: v1
kind: Namespace
metadata:
  name: toys-spark # apps namespace
---
apiVersion: v1
kind: ServiceAccount
metadata:
  name: spark
  namespace: toys-spark # apps namespace
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: spark-operator-role
  # note: ClusterRoleBinding is cluster-scoped, so it takes no namespace
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: edit
subjects:
  - kind: ServiceAccount
    name: spark
    namespace: toys-spark # apps namespace
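When the driver does start but still throws UnknownHostException, inspecting its resolver configuration and logs can narrow things down. This is a diagnostic sketch; the pod name assumes the usual &lt;app-name&gt;-driver naming convention used by the operator:

```shell
# Check which DNS server and search domains the driver pod actually uses
kubectl exec -n toys-spark spark-pi-driver -- cat /etc/resolv.conf

# Look at the driver log for the full UnknownHostException stack trace
kubectl logs -n toys-spark spark-pi-driver
```

If /etc/resolv.conf does not point at the cluster DNS service, the pod's dnsPolicy (or hostNetwork setting) would be the next thing to examine.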