
I am running into problems running Hive through Oozie. In the Oozie console I get the following error:

2013-05-03 04:48:24,248  WARN HiveActionExecutor:542 - USER[ambari_qa] GROUP[-] TOKEN[] APP[hive-wf] JOB[0000013-130502155316029-oozie-oozi-W] ACTION[0000013-130502155316029-oozie-oozi-W@hive-node] Launcher exception: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassNotFoundException: org.apache.hcatalog.security.HdfsAuthorizationProvider
java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassNotFoundException: org.apache.hcatalog.security.HdfsAuthorizationProvider
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:293)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:669)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:303)
at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:280)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:37)
at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:55)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:467)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1178)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassNotFoundException: org.apache.hcatalog.security.HdfsAuthorizationProvider
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:342)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:287)
... 19 more
Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.security.HdfsAuthorizationProvider
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:335)
... 20 more

I have set the system libpath in the job.properties file. Here is my job.properties file:

nameNode=hdfs://ip-10-0-0-92:8020
jobTracker=ip-10-0-0-93:50300
queueName=default
wfeRoot=wfe

oozie.use.system.libpath=true
oozie.libpath=/user/oozie/share/lib/hive

oozie.wf.application.path=${nameNode}/user/${user.name}/${wfeRoot}/hive-oozie

Here is my workflow.xml file:

<workflow-app xmlns="uri:oozie:workflow:0.2" name="hive-wf">
    <start to="hive-node"/>

    <action name="hive-node">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${nameNode}/user/${wf:user()}/${wfeRoot}/output-data/hive"/>
                <mkdir path="${nameNode}/user/${wf:user()}/${wfeRoot}/output-data"/>
            </prepare>
            <job-xml>hive-site.xml</job-xml>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
                <property>
                    <name>oozie.log.hive.level</name>
                    <value>DEBUG</value>
                </property>
                <property>
                    <name>oozie.hive.defaults</name>
                    <value>hive-default.xml</value>
                </property>
            </configuration>
            <script>script.q</script>
        </hive>
        <ok to="end"/>
        <error to="fail"/>
    </action>

    <kill name="fail">
        <message>Hive failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>

I have already copied the hive-site.xml and hive-default.xml files into HDFS.
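For reference, the copy can be done with hadoop fs -put. A minimal sketch, assuming the files sit in the local working directory and belong in the workflow application directory from job.properties above (user ambari_qa taken from the log; adjust the user and paths to your setup):

# Assumed application directory: ${nameNode}/user/ambari_qa/wfe/hive-oozie (from oozie.wf.application.path)
hadoop fs -put hive-site.xml /user/ambari_qa/wfe/hive-oozie/
hadoop fs -put hive-default.xml /user/ambari_qa/wfe/hive-oozie/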

Any idea what is going on here?


1 Answer


I got this same error when the HCatalog jars were missing from my Oozie ShareLib folder. This line in the error output is what gives it away:

java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException:  java.lang.ClassNotFoundException: org.apache.hcatalog.security.HdfsAuthorizationProvider

To verify, run

hadoop fs -ls /user/oozie/share/lib/hive | grep hcatalog

You should see both hcatalog-core-0.5.0 and hcatalog-server-extensions-0.5.0 (the versions may differ depending on how your distribution packages them).
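For reference, a listing with the jars present might look roughly like this (owner, replication, sizes, and dates are illustrative, not actual output):

# Illustrative output only; names and versions vary by distribution
-rw-r--r--   3 oozie hdfs   ...   /user/oozie/share/lib/hive/hcatalog-core-0.5.0.jar
-rw-r--r--   3 oozie hdfs   ...   /user/oozie/share/lib/hive/hcatalog-server-extensions-0.5.0.jar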

The fix is to run the commands below. If dfs.permissions = true on your cluster, you will need to run them as hdfs (or whichever user is the superuser of your cluster), for example via sudo su hdfs.

cd /usr/lib/hcatalog/share/hcatalog
ls
hadoop fs -put hcatalog-core-0.5.0.jar /user/oozie/share/lib/hive
hadoop fs -put hcatalog-server-extensions-0.5.0.jar /user/oozie/share/lib/hive
hadoop fs -chmod 777 /user/oozie/share/lib/hive/hcatalog-core-0.5.0.jar
hadoop fs -chmod 777 /user/oozie/share/lib/hive/hcatalog-server-extensions-0.5.0.jar

Again, the exact file names and paths will vary with your Hadoop distribution. I suggest 777 permissions (read, write, and execute for the owner, the group, and all other users) because it is the most permissive and is guaranteed to get you past this error; if you have security concerns you may want different permissions. You can look at the other files in the folder for a reasonable default.
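If 777 is too permissive for your environment, a tighter sketch (assuming the same file names as above) that still lets the launcher tasks read the jars would be:

# Owner gets read/write/execute; group and others get read/execute, which is enough to load the jars
hadoop fs -chmod 755 /user/oozie/share/lib/hive/hcatalog-core-0.5.0.jar
hadoop fs -chmod 755 /user/oozie/share/lib/hive/hcatalog-server-extensions-0.5.0.jar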

This issue can come up when setting up a new Hadoop cluster, or when upgrading an existing one, if your distribution's upgrade tools or documentation missed this step.

Answered 2013-05-03T15:01:34.323