I am trying to run the Sedona Spark Visualization tutorial code. Example link: https://sedona.apache.org/tutorial/viz/
The following dataset is my baseDF:
latitude | longitude |
---|---|
-88.331492 | 32.324142 |
-88.175933 | 32.360763 |
-88.388954 | 32.357073 |
-88.221102 | 32.35078 |
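For reference, a minimal sketch of how a baseDF with these sample rows could be built in spark-shell (this is an assumption for reproducibility, not my actual loading code; it presumes an active SparkSession named `spark`):

```scala
// Hypothetical construction of baseDF from the sample rows above
import spark.implicits._

val baseDF = Seq(
  (-88.331492, 32.324142),
  (-88.175933, 32.360763),
  (-88.388954, 32.357073),
  (-88.221102, 32.35078)
).toDF("latitude", "longitude")

baseDF.show()
```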
I am following the example code in the documentation, and my code is as follows:
var sparkSession = SparkSession.builder()
.master("local[*]") // Delete this if run in cluster mode
.appName("Sedona Viz") // Change this to a proper name
.config("spark.serializer", classOf[KryoSerializer].getName) // org.apache.spark.serializer.KryoSerializer
.config("spark.kryo.registrator", classOf[SedonaVizKryoRegistrator].getName) // org.apache.sedona.viz.core.Serde.SedonaVizKryoRegistrator
.getOrCreate()
SedonaSQLRegistrator.registerAll(sparkSession)
SedonaVizRegistrator.registerAll(sparkSession)
baseDF.createOrReplaceTempView("baseDF")
var myNewDF = sparkSession.sql(
"""
SELECT ST_Point(cast(baseDF.latitude as Decimal(24,20)),cast(baseDF.longitude as Decimal(24,20))) as shape
FROM baseDF
""".stripMargin)
myNewDF.createOrReplaceTempView("pointtable")
var pixelizer = sparkSession.sql(
"""
SELECT ST_Envelope_Aggr(shape) as bound FROM pointtable
""".stripMargin)
pixelizer.createOrReplaceTempView("boundtable")
var stPixelize = sparkSession.sql(
"""
SELECT pixel, shape FROM pointtable
LATERAL VIEW explode(ST_Pixelize(ST_Transform(shape, 'epsg:4326','epsg:3857'), 256, 256, (SELECT ST_Transform(bound, 'epsg:4326','epsg:3857') FROM boundtable))) AS pixel
"""
)
stPixelize.createOrReplaceTempView("pixels")
//Aggregate
var aggregate = sparkSession.sql(
"""
SELECT pixel, count(*) as weight
FROM pixels
GROUP BY pixel
"""
)
aggregate.createOrReplaceTempView("pixelaggregates")
//COLORIZE
var colorize = sparkSession.sql(
"""
SELECT pixel, ST_Colorize(weight, (SELECT max(weight) FROM pixelaggregates)) as color
FROM pixelaggregates
"""
)
colorize.createOrReplaceTempView("pixelaggregates")
var render = sparkSession.sql(
"""
SELECT ST_Render(pixel,color) AS image, (SELECT ST_AsText(bound) FROM boundtable) AS boundary
FROM pixelaggregates
"""
)
render.createOrReplaceTempView("images")
var image = sparkSession.table("images").take(1)(0)(0).asInstanceOf[ImageSerializableWrapper].getImage
var imageGenerator = new ImageGenerator
imageGenerator.SaveRasterImageAsLocalFile(image, System.getProperty("user.dir")+"/target/points", ImageType.PNG)
Then I get the following error:
21/07/15 16:03:52 ERROR Executor: Exception in task 0.0 in stage 15.0 (TID 433)
java.lang.AssertionError: assertion failed
at scala.Predef$.assert(Predef.scala:208)
at org.apache.spark.sql.sedona_viz.expressions.ST_Render.merge(Render.scala:98)
at org.apache.spark.sql.execution.aggregate.ScalaUDAF.merge(udaf.scala:444)
at org.apache.spark.sql.execution.aggregate.AggregationIterator$$anonfun$1.$anonfun$applyOrElse$3(AggregationIterator.scala:199)
I looked at the source, and apparently it fails an assertion that checks whether the resolution is less than or equal to 0. However, I added some prints, tested a few values, and set the resolution to 256, and it still gives the same error. Am I doing something wrong, or is this a bug?