
I'm writing a simple Apache Spark utility that automatically creates an AccumulatorV2 from a provided initial value:

  import java.lang

  import org.apache.spark.SparkContext
  import org.apache.spark.util.{AccumulatorV2, DoubleAccumulator, LongAccumulator}

  type Acc[T] = AccumulatorV2[T, T]

  implicit val long1: Long => Acc[lang.Long] = _ => new LongAccumulator()

  implicit def double1: Double => Acc[lang.Double] = _ => new DoubleAccumulator()

  def accumulator[T1, T2](
                           initialValue: T1,
                           name: String
                         )(implicit
                           canBuildFrom: T1 => Acc[T2],
                           converter: T1 => T2,
                           sc: SparkContext = SparkContext.getOrCreate()
                         ): Acc[T2] = {

    val acc: Acc[T2] = canBuildFrom(initialValue)
    acc.reset()
    acc.add(initialValue)

    sc.register(acc, name)

    acc
  }

The two implicit parameters, canBuildFrom and converter, should be supplied by the implicit functions defined above and by the compiler's predefined type conversions, respectively. However, when I try to call it:

Metrics.accumulator(0L, "webDriverDispatched")

I get the following error:

Error:(102, 84) No implicit view available from Long => com.tribbloids.spookystuff.Metrics.Acc[T2].
                          webDriverDispatched: Acc[lang.Long] = Metrics.accumulator(0L, "webDriverDispatched"),
Error:(102, 84) not enough arguments for method accumulator: (implicit canBuildFrom: Long => com.tribbloids.spookystuff.Metrics.Acc[T2], implicit converter: Long => T2, implicit sc: org.apache.spark.SparkContext)com.tribbloids.spookystuff.Metrics.Acc[T2].
Unspecified value parameters canBuildFrom, converter.
                          webDriverDispatched: Acc[lang.Long] = Metrics.accumulator(0L, "webDriverDispatched"),

Why can't the Scala compiler find the right implicit values, and what should I do to fix it?
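One thing I noticed while experimenting: `T2` appears only inside the implicit parameter types, so the compiler has to solve for it during implicit search, and a bare `T1 => Acc[T2]` apparently leaves `T2` unconstrained. Below is a minimal, Spark-free sketch of a possible workaround using a dedicated typeclass that determines `T2` from `T1`. The `Acc`, `CanBuild`, and `Demo` names here are hypothetical stand-ins for illustration, not Spark API:

```scala
object Demo {
  // Hypothetical stand-in for AccumulatorV2[T, T], just enough to compile.
  class Acc[T] {
    private var values: List[T] = Nil
    def add(x: T): Unit = { values = x :: values }
    def value: List[T] = values
  }

  // Unlike a bare `T1 => Acc[T2]`, this typeclass fixes T2 as soon as
  // T1 is known, so implicit search can succeed.
  trait CanBuild[T1, T2] {
    def build(): Acc[T2]
    def convert(x: T1): T2
  }

  implicit val longBuild: CanBuild[Long, java.lang.Long] =
    new CanBuild[Long, java.lang.Long] {
      def build(): Acc[java.lang.Long] = new Acc[java.lang.Long]
      def convert(x: Long): java.lang.Long = x
    }

  def accumulator[T1, T2](init: T1)(implicit cb: CanBuild[T1, T2]): Acc[T2] = {
    val acc = cb.build()
    acc.add(cb.convert(init))
    acc
  }

  // T2 = java.lang.Long is inferred from the single CanBuild instance.
  val a: Acc[java.lang.Long] = accumulator(0L)
}
```

With this shape, `accumulator(0L)` compiles without explicit type arguments, whereas the function-typed implicits do not. I'd still like to understand whether the original `T1 => Acc[T2]` version can be made to work.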

