I'm having trouble importing magellan-1.0.4-s_2.11 in a Spark notebook. I downloaded the jar from https://spark-packages.org/package/harsha2010/magellan and tried putting SPARK_HOME/bin/spark-shell --packages harsha2010:magellan:1.0.4-s_2.11 into the "Start of Customized Settings" section of the spark-notebook file in the bin folder.
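If I've understood the package page correctly, that coordinate follows the usual group:artifact:version form, so run from a terminal the whole command would simply be:

# plain spark-shell session, resolving magellan from spark-packages.org
$SPARK_HOME/bin/spark-shell --packages harsha2010:magellan:1.0.4-s_2.11

What I can't work out is how to express that same --packages option in the notebook launcher's settings, which may be where I'm going wrong.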
Here are my imports:
import magellan.{Point, Polygon, PolyLine}
import magellan.coord.NAD83
import org.apache.spark.sql.magellan.MagellanContext
import org.apache.spark.sql.magellan.dsl.expressions._
import org.apache.spark.sql.Row
import org.apache.spark.sql.types._
And my errors...
<console>:71: error: object Point is not a member of package org.apache.spark.sql.magellan
import magellan.{Point, Polygon, PolyLine}
^
<console>:72: error: object coord is not a member of package org.apache.spark.sql.magellan
import magellan.coord.NAD83
^
<console>:73: error: object MagellanContext is not a member of package org.apache.spark.sql.magellan
import org.apache.spark.sql.magellan.MagellanContext
Then I tried to import the new library like any other library, by putting it into the main script like so:
$lib_dir/magellan-1.0.4-s_2.11.jar"
This didn't work, and I'm left scratching my head wondering what I did wrong. How do I import libraries such as magellan into Spark notebook?
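For what it's worth, the only way I can think of to check from inside the notebook whether the jar is even visible on the classpath is a reflection lookup in a cell, roughly like this (just a sanity-check sketch using the Point class from the imports above):

// throws ClassNotFoundException if the magellan jar is not on the notebook's classpath
Class.forName("magellan.Point")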