
I am working through the MLlib examples on this page (Spark with Scala): MLlib Page

All the examples throw the same error. Here is the one I get for linear regression:

    scala> val model = LinearRegressionWithSGD.train(parsedData, numIterations)
    java.lang.RuntimeException: Error in configuring object
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.spark.rdd.HadoopRDD.getInputFormat(HadoopRDD.scala:123)
    at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:136)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:207)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)

Could someone please advise on what is causing this error? Thank you.
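For context, the `parsedData` and `numIterations` in the failing call come from the setup in the MLlib docs. A minimal sketch of that setup, assuming the standard `lpsa.data` sample file from the Spark distribution (the path and master string are assumptions, not from the question):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.{LabeledPoint, LinearRegressionWithSGD}

object LinearRegressionSketch {
  def main(args: Array[String]): Unit = {
    // Local SparkContext; in the spark-shell, `sc` is already provided.
    val sc = new SparkContext("local", "LinearRegressionExample")

    // Sample data shipped with Spark: "label,feature1 feature2 ..." per line.
    // The path is an assumption based on the Spark distribution layout.
    val data = sc.textFile("data/mllib/ridge-data/lpsa.data")
    val parsedData = data.map { line =>
      val parts = line.split(',')
      LabeledPoint(parts(0).toDouble,
        Vectors.dense(parts(1).split(' ').map(_.toDouble)))
    }.cache()

    // The line from the question that triggers the stack trace. Note the
    // failure happens inside HadoopRDD.getPartitions, i.e. when the text
    // file is first read -- not in the regression code itself.
    val numIterations = 100
    val model = LinearRegressionWithSGD.train(parsedData, numIterations)

    sc.stop()
  }
}
```

Note that the stack trace bottoms out in `org.apache.hadoop.util.ReflectionUtils.setJobConf`, so the exception is raised when Spark configures the Hadoop `InputFormat` for `textFile`, before any MLlib code runs. That is why every example on the page fails the same way.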


1 Answer


Just found the answer... Apparently, some settings in my bashrc were conflicting with Spark. Deleting the bashrc file resolved the problem.
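The answer does not say which setting conflicted; in similar reports it is usually a Hadoop-related variable (e.g. `HADOOP_HOME`, `CLASSPATH`, or `SPARK_CLASSPATH`) exported in `~/.bashrc` that makes Spark pick up an incompatible Hadoop configuration. A minimal sketch for spotting such overrides before resorting to deleting the whole file (the variable names are common suspects, not ones confirmed by the answer):

```shell
#!/bin/sh
# Sketch: scan a shell profile for Hadoop/Spark-related exports that can
# shadow Spark's own classpath and cause ReflectionUtils
# "Error in configuring object" failures at textFile() time.
# The variable patterns below are assumptions, not from the original answer.
profile="${1:-$HOME/.bashrc}"
grep -nE 'HADOOP_|SPARK_|CLASSPATH' "$profile" \
  || echo "no Hadoop/Spark overrides found in $profile"
```

Commenting out (or unsetting) only the matching lines and re-opening the shell is a gentler fix than deleting the file outright.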

Answered 2014-07-08T08:13:20.057