
How do you kill a Spark job by its job name on a standalone cluster? And how do you list Spark job IDs on the sandbox? Is there a command similar to yarn application -list?


1 Answer

# Usage: ./kill_spark_job.sh <application-name> <jar-name>
app_name=$1
jar=$2

# Find the spark-submit JVM that matches both the application name and the jar.
ps -ef | grep -w "${app_name}" | grep -w 'org.apache.spark.deploy.SparkSubmit' | grep -w "${jar}" > t.txt

# Keep only the first match and extract its PID (second column of ps -ef output).
sed -n 1p t.txt > t1.txt
awk '{print $2}' t1.txt > kill.txt

while read -r pid; do
  kill -9 "$pid"
  echo "Process ID $pid for application \"$app_name\" killed"
done < kill.txt
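
Since the question is specifically about a standalone cluster, it may also help to know that the standalone Master exposes a JSON view of running applications, and drivers submitted in cluster mode can be killed through spark-submit itself. A minimal sketch, assuming a hypothetical master host spark-master-host with the default web UI port 8080 and REST submission port 6066, and a made-up driver submission ID:

# List running applications and drivers via the standalone Master's JSON endpoint
# (the same information shown in the web UI on port 8080):
curl http://spark-master-host:8080/json/

# Kill a driver that was submitted in cluster mode, using its submission ID:
spark-submit --master spark://spark-master-host:6066 --kill driver-20190207102030-0001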
answered 2019-02-07T21:23:54.247