I am trying to do this in a unit test:
val sConf = new SparkConf()
.setAppName("RandomAppName")
.setMaster("local")
val sc = new SparkContext(sConf)
val sqlContext = new TestHiveContext(sc) // tried new HiveContext(sc) as well
But I get:
[scalatest] Exception encountered when invoking run on a nested suite - java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient *** ABORTED ***
[scalatest] java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
[scalatest] at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
[scalatest] at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:120)
[scalatest] at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
[scalatest] at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
[scalatest] at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
[scalatest] at org.apache.spark.sql.hive.test.TestHiveContext.<init>(TestHive.scala:72)
[scalatest] at mypackage.NewHiveTest.beforeAll(NewHiveTest.scala:48)
[scalatest] at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
[scalatest] at mypackage.NewHiveTest.beforeAll(NewHiveTest.scala:35)
[scalatest] at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
[scalatest] at mypackage.NewHiveTest.run(NewHiveTest.scala:35)
[scalatest] at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1491)
The code works fine when I run it with spark-submit, but fails in unit tests. How can I fix this for unit tests?
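A minimal sketch of one commonly suggested workaround, under the assumption that the failure comes from the embedded Derby metastore (e.g. a stale `db.lck` left by a previous run, or tests sharing one Derby home): point Derby at a fresh temporary directory before the first HiveContext/TestHiveContext is created. The temp-directory setup is an assumption, not a confirmed fix for this exact stack trace; the rest mirrors the question's code.

```scala
import java.nio.file.Files

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.test.TestHiveContext

// Give this test run its own Derby home, so the embedded metastore
// database (and its lock file) is isolated from other runs. This must
// be set before the first HiveContext/TestHiveContext is instantiated.
System.setProperty("derby.system.home",
  Files.createTempDirectory("derby-metastore").toString)

val sConf = new SparkConf()
  .setAppName("RandomAppName")
  .setMaster("local")
val sc = new SparkContext(sConf)
val sqlContext = new TestHiveContext(sc)
```

If the error persists, it is worth checking that only one SparkContext/HiveContext exists per JVM across the suite (e.g. create it once in `beforeAll` and stop it in `afterAll`), since a second context contending for the same metastore is another common cause of this exception.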