I have deployed a Cloudera CDH 5.13.1 cluster with SAP Vora 1.4 Patch 4.
When I start the Vora Thrift server everything looks fine, but as soon as I start the SAP Vora tools and log in, the following error appears:
17/12/20 11:26:52 ERROR thriftserver.SparkExecuteStatementOperation: Error executing query, currentState RUNNING,
org.apache.spark.sql.catalyst.errors.package$DialectException: Instantiating dialect 'sapsql' failed.
Reverting to default dialect 'sapsql'
at org.apache.spark.sql.SQLContext.getSQLDialect(SQLContext.scala:225)
at org.apache.spark.sql.hive.HiveContext.getSQLDialect(HiveContext.scala:577)
at org.apache.spark.sql.hive.SapHiveContext$$anonfun$1.apply(SapHiveContext.scala:54)
at org.apache.spark.sql.hive.SapHiveContext$$anonfun$1.apply(SapHiveContext.scala:54)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:34)
at org.apache.spark.sql.hive.SapHiveContext$$anonfun$2.apply(SapHiveContext.scala:58)
at org.apache.spark.sql.hive.SapHiveContext$$anonfun$2.apply(SapHiveContext.scala:58)
at org.apache.spark.sql.execution.datasources.DDLParser.parse(DDLParser.scala:43)
at org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:231)
at org.apache.spark.sql.hive.HiveContext.parseSql(HiveContext.scala:334)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:829)
at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:211)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.extension.SapSQLDialect
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:177)
at org.apache.spark.sql.SQLContext.getSQLDialect(SQLContext.scala:215)
... 54 more
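The root cause line is the ClassNotFoundException for org.apache.spark.sql.extension.SapSQLDialect, which as far as I can tell means the SAP Spark extension jar is not on the Thrift server's classpath. To double-check, I pasted roughly the following into a spark-shell started with the same launch options as the Thrift server (a minimal sketch; that the shell and the Thrift server share the same classpath setup is my assumption):

    // Paste into spark-shell launched with the same --jars / extraClassPath
    // settings as the Vora Thrift server (assumed to match my setup).
    try {
      // Class name copied from the "Caused by" line of the stack trace above.
      Class.forName("org.apache.spark.sql.extension.SapSQLDialect")
      println("SapSQLDialect is on the classpath")
    } catch {
      case e: ClassNotFoundException =>
        println("SapSQLDialect is NOT on the classpath: " + e.getMessage)
    }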
The installation guide says I have to grant the vora user authorization on the Hive Metastore. Since this is only a test setup with authorization disabled in Hive, the vora user can already create and drop tables in the default database and has write access to the Hive warehouse location.
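For reference, this is roughly how I verified that the vora user has those permissions (a sketch against the Spark 1.6 HiveContext API, submitted via spark-submit as the vora user; the table name is made up for the test):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object VoraPermCheck {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("vora-perm-check"))
        val hc = new HiveContext(sc)
        // Create and drop a throwaway table in the default database to confirm
        // the vora user has the access described in the installation guide.
        hc.sql("CREATE TABLE IF NOT EXISTS default.vora_perm_check (id INT)")
        hc.sql("DROP TABLE IF EXISTS default.vora_perm_check")
        println("vora user can create and drop tables in the default database")
        sc.stop()
      }
    }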
How can I fix this?