I am trying to read data from Cassandra using Spark:
DataFrame rdf = sqlContext.read()
        .format("org.apache.spark.sql.cassandra")
        .option("keyspace", "readypulse")
        .option("table", "ig_posts")
        .load();
rdf.registerTempTable("cassandra_table");
System.out.println(sqlContext.sql(
        "select count(external_id) from cassandra_table")
        .collect()[0].getLong(0));
The job fails with the error below. I can't understand why a ShuffleMapTask is involved here, or why casting it to Task would be a problem.
16/03/30 02:27:15 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, ip-10-165-180-22.ec2.internal):
java.lang.ClassCastException:
org.apache.spark.scheduler.ShuffleMapTask
cannot be cast to org.apache.spark.scheduler.Task
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
16/03/30 02:27:15 INFO TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) on executor ip-10-165-180-22.ec2.internal:
java.lang.ClassCastException (org.apache.spark.scheduler.ShuffleMapTask
cannot be cast to org.apache.spark.scheduler.Task) [duplicate 1]
16/03/30 02:27:15 ERROR TaskSetManager: Task 0 in stage 0.0 failed 4 times; aborting job
I am using EMR 4.4, Spark 1.6, Cassandra 2.2 (DataStax Community), and spark-cassandra-connector-java_2.10 1.6.0-M1 (I also tried 1.5.0).
I also tried the following code and got the same error:
CassandraJavaRDD<CassandraRow> cjrdd = functions.cassandraTable(
        KEYSPACE, tableName).select(columns);
logger.info("Got rows from cassandra " + cjrdd.count());

JavaRDD<Double> jrdd2 = cjrdd.map(new Function<CassandraRow, Double>() {
    @Override
    public Double call(CassandraRow trainingRow) throws Exception {
        Object fCount = trainingRow.getRaw("follower_count");
        double count = 0;
        if (fCount != null) {
            count = (Long) fCount;
        }
        return count;
    }
});
logger.info("Mapper done : " + jrdd2.count());
logger.info("Mapper done values : " + jrdd2.collect());
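One thing I'm wondering: a ClassCastException between two of Spark's own scheduler classes usually suggests two copies of Spark on the classpath (the executor loading ShuffleMapTask from one jar and Task from another), e.g. if my application jar bundles its own Spark that conflicts with EMR's. A minimal sketch of how I understand the Maven dependencies should look, with Spark marked `provided` so only EMR's Spark is present at runtime (the exact coordinates/versions here are my assumption):

```xml
<!-- Sketch: let EMR supply Spark at runtime; only the connector is bundled. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.6.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.6.0-M1</version>
</dependency>
```

Would that be the right way to rule out a Spark version mismatch, or is something else going on?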