I'm new to Scala (2.10) and currently working on a POC to store some data in HBase. To store the data I'm trying to serialise my case classes to a binary format using the scala-pickling library:
"org.scala-lang.modules" %% "scala-pickling" % "0.10.1"
I have these two simple classes:
import org.apache.spark.sql.Row   // Row, used in Source.apply below
import org.joda.time.DateTime     // joda-time, for DateTime.now

case class Location(source: Source,
                    country: String,
                    region: String,
                    metro: String,
                    postalcode: String)
and
case class Source(name: String,
                  trust: Float,
                  created: String) {

  /** Compares this Source with the other source and returns the difference in their trust levels. */
  def compare(other: Source): Float = {
    trust - other.trust
  }

  /** Returns whether you should prefer this source (true) or the other source (false). */
  def prefer(other: Source): Boolean = {
    trust >= other.trust
  }
}
object Source {

  def apply(name: String, trust: Float) = new Source(name, trust, DateTime.now.toString)

  def apply(row: Row) = {
    val name = row.getAs[String](0)
    val trust = row.getAs[Float](1)
    val created = row.getAs[String](2)
    new Source(name, trust, created)
  }
}
I'm testing the serialisation with a ScalaTest class:
import scala.pickling._
import binary._

class DebuggingSpec extends UnitSpec {

  "debugging" should "allow the serialisation and deserialisation of a Location class" in {
    val loc = new Location(Source("Source1", 1), "UK", "Wales", "Cardiff", "CF99 1PP")
    val bytes = loc.pickle
    bytes.value.length should not be(0)
  }

  it should "allow the serialisation and deserialisation of a Link class" in {
    val link = Link(Source("Source1", 1), "MyId1", 3)
    val bytes = link.pickle
    bytes.value.length should not be(0)
  }
}
But when I compile this, either in IntelliJ or on the command line via sbt package, I get the following error message:
Error:(12, 9) macro implementation not found: pickle (the most common reason for that is that you cannot use macro implementations in the same compilation run that defines them)
    val bytes = loc.pickle
                ^
EDIT: I have successfully run this code in the spark-shell (1.3.1), which happily pickles and unpickles these classes... but the same code and imports produce the error above at compile time.