I am trying to launch Almond Jupyter either via docker or online from https://almond.sh. In the spark.ipynb notebook shipped with the image, the line that builds the NotebookSparkSession fails with an error:
// Fetch Spark and the almond-spark bridge through Ammonite's $ivy imports
import $ivy.`org.apache.spark::spark-sql:2.4.0`
import $ivy.`sh.almond::almond-spark:0.3.0`

// Silence Spark's verbose logging
import org.apache.log4j.{Level, Logger}
Logger.getLogger("org").setLevel(Level.OFF)

import org.apache.spark.sql._

// Build a local Spark session wired into the notebook
val spark = {
  NotebookSparkSession.builder()
    .master("local[*]")
    .getOrCreate()
}
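For context, once getOrCreate() returns, the session is supposed to behave like any local SparkSession; a minimal smoke test along these lines (hypothetical, not part of the original spark.ipynb) is all the notebook would need next:

// Hypothetical sanity check, not in the original notebook:
// build a tiny DataFrame to confirm the session is usable.
val df = spark.range(5).toDF("n")
df.show()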
With docker, the exception is:
java.lang.NoSuchMethodError: coursier.package$Resolution$.apply$default$13()Lscala/collection/immutable/Map;
org.apache.spark.sql.ammonitesparkinternals.SparkDependencies$.sparkJars(SparkDependencies.scala:134)
org.apache.spark.sql.ammonitesparkinternals.AmmoniteSparkSessionBuilder.getOrCreate(AmmoniteSparkSessionBuilder.scala:234)
org.apache.spark.sql.almondinternals.NotebookSparkSessionBuilder.getOrCreate(NotebookSparkSessionBuilder.scala:62)
I also tried the online version with the same spark.ipynb, and there the exception is:
java.lang.AssertionError: assertion failed:
  NotebookSparkSession.builder()
     while compiling: cmd3.sc
        during phase: superaccessors
     library version: version 2.12.8
    compiler version: version 2.12.8
  reconstructed args: -nowarn -Yresolve-term-conflict:object

  last tree to typer: This(class cmd3)
       tree position: line 19 of cmd3.sc
            tree tpe: cmd3.this.type
              symbol: final class cmd3 in package $sess
   symbol definition: final class cmd3 extends Serializable (a ClassSymbol)
      symbol package: ammonite.$sess
       symbol owners: class cmd3
           call site: class Helper in class cmd3 in package $sess