
I followed the Flink setup documentation for YARN, but when I run ./bin/yarn-session.sh -n 2 -jm 1024 -tm 2048 (authenticating via Kerberos), I get the following error:

2016-06-16 17:46:47,760 WARN  org.apache.hadoop.util.NativeCodeLoader                       - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-06-16 17:46:48,518 INFO  org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl     - Timeline service address: https://**host**:8190/ws/v1/timeline/
2016-06-16 17:46:48,814 INFO  org.apache.flink.yarn.FlinkYarnClient                         - Using values:
2016-06-16 17:46:48,815 INFO  org.apache.flink.yarn.FlinkYarnClient                         -   TaskManager count = 2
2016-06-16 17:46:48,815 INFO  org.apache.flink.yarn.FlinkYarnClient                         -   JobManager memory = 1024
2016-06-16 17:46:48,815 INFO  org.apache.flink.yarn.FlinkYarnClient                         -   TaskManager memory = 2048
Exception in thread "main" java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.s3a.S3AFileSystem could not be instantiated
    at java.util.ServiceLoader.fail(ServiceLoader.java:224)
    at java.util.ServiceLoader.access$100(ServiceLoader.java:181)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:377)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:445)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2623)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2634)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2651)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:170)
    at org.apache.flink.yarn.FlinkYarnClientBase.deployInternal(FlinkYarnClientBase.java:531)
    at org.apache.flink.yarn.FlinkYarnClientBase$1.run(FlinkYarnClientBase.java:342)
    at org.apache.flink.yarn.FlinkYarnClientBase$1.run(FlinkYarnClientBase.java:339)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.flink.yarn.FlinkYarnClientBase.deploy(FlinkYarnClientBase.java:339)
    at org.apache.flink.client.FlinkYarnSessionCli.run(FlinkYarnSessionCli.java:419)
    at org.apache.flink.client.FlinkYarnSessionCli.main(FlinkYarnSessionCli.java:362)
Caused by: java.lang.NoClassDefFoundError: com/amazonaws/AmazonServiceException
    at java.lang.Class.getDeclaredConstructors0(Native Method)
    at java.lang.Class.privateGetDeclaredConstructors(Class.java:2532)
    at java.lang.Class.getConstructor0(Class.java:2842)
    at java.lang.Class.newInstance(Class.java:345)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:373)
    ... 18 more
Caused by: java.lang.ClassNotFoundException: com.amazonaws.AmazonServiceException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 23 more

I have set the following properties in ./flink-1.0.3/conf/flink-conf.yaml:

fs.hdfs.hadoopconf: /etc/hadoop/conf/
fs.hdfs.hdfssite: /etc/hadoop/conf/hdfs-site.xml

How can I make Flink use HDFS instead of Amazon S3?

Thanks.


2 Answers


My guess is that the problem is Flink not picking up your configuration files.

Could you remove the line starting with fs.hdfs.hdfssite from your configuration? It is not needed when fs.hdfs.hadoopconf is set.

Also, can you check whether the fs.defaultFS setting in core-site.xml is set to something starting with hdfs://?
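For reference, a minimal sketch of how those settings would typically look; the config file path and the namenode host/port below are assumptions and need to be adjusted to your cluster:

# flink-conf.yaml: only the Hadoop config directory is needed
fs.hdfs.hadoopconf: /etc/hadoop/conf/

<!-- /etc/hadoop/conf/core-site.xml (assumed path): the default filesystem should point at HDFS -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://your-namenode-host:8020</value>
</property>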

Answered 2016-06-20T12:01:32.740

Actually, I had to set the environment variable HADOOP_CLASSPATH, as suggested in a now-deleted answer.

@rmezger: fs.defaultFS is set.

The resulting command:

HADOOP_CLASSPATH=... ./bin/yarn-session.sh -n 2 -jm 1024 -tm 2048
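In case it helps others: one common way to populate HADOOP_CLASSPATH (an assumption here, not taken from the deleted answer) is to let the hadoop CLI print it, provided the hadoop command is on the PATH:

# Build the classpath from the local Hadoop installation, then start the YARN session
export HADOOP_CLASSPATH=$(hadoop classpath)
./bin/yarn-session.sh -n 2 -jm 1024 -tm 2048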
Answered 2016-06-20T14:08:58.723