I am using Apache Spark 2.1.1 and I want to set it up with an external Hive metastore (specifically for the Spark Thrift Server). I have added the following hive-site.xml to the $SPARK_HOME/conf folder:
<?xml version="1.0"?>
<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://home.cu:3306/hive_metastore?createDatabaseIfNotExist=true&amp;useLegacyDatetimeCode=false&amp;serverTimezone=Europe/Berlin&amp;nullNamePatternMatchesAll=true</value>
        <description>JDBC connect string for a JDBC metastore</description>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
        <description>Driver class name for a JDBC metastore</description>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
        <description>username to use against metastore database</description>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive</value>
        <description>password to use against metastore database</description>
    </property>
    <property>
        <name>hive.metastore.schema.verification</name>
        <value>false</value>
        <description>Enforce metastore schema version consistency</description>
    </property>
    <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>hdfs://spark-master.cu:9000/value_iq/hive_warehouse/</value>
        <description>Warehouse Location</description>
    </property>
</configuration>
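
For reference, this is roughly how I launch both of them so that the MySQL JDBC driver is on the driver classpath (the connector jar path below is only a placeholder, not the exact jar or path I use):

# connector jar path is a placeholder
$SPARK_HOME/sbin/start-thriftserver.sh \
    --driver-class-path /path/to/mysql-connector-java.jar \
    --jars /path/to/mysql-connector-java.jar

$SPARK_HOME/bin/spark-shell \
    --driver-class-path /path/to/mysql-connector-java.jar \
    --jars /path/to/mysql-connector-java.jar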
Whenever I try to run spark-shell or the Spark Thrift Server, they try to create the Hive metastore on MySQL (since it does not exist yet), and they fail with the following error:
17/07/13 19:57:55 ERROR Datastore: Error thrown executing ALTER TABLE `PARTITIONS` ADD COLUMN `TBL_ID` BIGINT NULL : Table 'hive_metastore.partitions' doesn't exist
java.sql.SQLSyntaxErrorException: Table 'hive_metastore.partitions' doesn't exist
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:536)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:513)
at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:115)
at com.mysql.cj.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:1983)
at com.mysql.cj.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:1936)
at com.mysql.cj.jdbc.StatementImpl.executeInternal(StatementImpl.java:891)
at com.mysql.cj.jdbc.StatementImpl.execute(StatementImpl.java:795)
at com.jolbox.bonecp.StatementHandle.execute(StatementHandle.java:254)
at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:760)
at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatementList(AbstractTable.java:711)
at org.datanucleus.store.rdbms.table.TableImpl.validateColumns(TableImpl.java:259)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3393)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:3190)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2841)
at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:122)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:1605)
at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:954)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:679)
at org.datanucleus.store.rdbms.query.RDBMSQueryUtils.getStatementForCandidates(RDBMSQueryUtils.java:408)
at org.datanucleus.store.rdbms.query.JDOQLQuery.compileQueryFull(JDOQLQuery.java:947)
at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:370)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1744)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
at org.datanucleus.store.query.Query.execute(Query.java:1654)
at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221)
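
For what it's worth, even a trivial catalog query from spark-shell hits the same failure, presumably because any catalog access triggers the metastore initialization (the query itself is arbitrary):

$SPARK_HOME/bin/spark-shell
scala> spark.sql("SHOW DATABASES").show()
// fails with the same DataNucleus/MySQL error as above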