
The problem is that the jar file uses Spring ORM to load the persistence configuration, and based on this configuration, files are moved to suitable folders in HDFS. If I invoke the jar with 'java -cp' instead of 'hadoop jar', the copy to HDFS fails with a FileSystem error.
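For context, the HDFS copy involved boils down to something like the following sketch (the class name and paths are hypothetical, not from the original jar). Note that Configuration picks up core-site.xml from the classpath, which 'hadoop jar' provides and a bare 'java -cp' usually does not, and that is a likely source of the FileSystem error:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsCopySketch {
        public static void main(String[] args) throws Exception {
            // Configuration reads core-site.xml etc. from the classpath.
            // 'hadoop jar' puts the cluster's conf directory there; with a
            // bare 'java -cp' fs.defaultFS typically falls back to the
            // local filesystem, so the HDFS copy fails.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            // Hypothetical source and target paths, for illustration only.
            fs.copyFromLocalFile(new Path("/local/incoming/file.dat"),
                                 new Path("/data/incoming/file.dat"));
            fs.close();
        }
    }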

While invoking the jar with the 'hadoop jar' command (with Spring ORM injected), the exception is:

Exception in thread "main" org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.dao.annotation.PersistenceExceptionTranslationPostProcessor#0' defined in class path resource [applicationContext.xml]

Error creating bean with name 'entityManagerFactory' defined in class path resource [applicationContext.xml]: Invocation of init method failed; nested exception is java.lang.IllegalStateException: Conflicting persistence unit definitions for name 'Persistance': file:/home/user/Desktop/ABC/apnJar.jar, file:/tmp/hadoop-ABC/hadoop-unjar2841422106164401019/

Caused by: java.lang.IllegalStateException: Conflicting persistence unit definitions for name 'Persistance'

It seems that Hadoop is unpacking the jar file into some tmp folder. Is this really required? Can we skip this step with any configuration change?

Any thoughts on this are welcome.


2 Answers


As a workaround, I extracted the configuration XML files from the jar and placed them in the working directory.

This works, but I am still looking for a proper solution.

So if you face a similar issue, remove all configuration XML files from the jar and ship the jar with only the compiled class files.
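A minimal sketch of how the context could then be loaded from the working directory instead of the classpath, so only one copy of the configuration is visible to Spring (class name hypothetical; assumes applicationContext.xml sits in the directory the job is launched from):

    import org.springframework.context.support.FileSystemXmlApplicationContext;

    public class WorkingDirContextLoader {
        public static void main(String[] args) {
            // Resolve applicationContext.xml from the working directory,
            // not the classpath, avoiding the duplicate-definition problem.
            FileSystemXmlApplicationContext ctx =
                    new FileSystemXmlApplicationContext("file:applicationContext.xml");
            try {
                // ... obtain beans and move files to HDFS as before ...
            } finally {
                ctx.close();
            }
        }
    }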

Answered 2013-06-04T13:49:54.583

If you use 'hadoop jar', Hadoop runs org.apache.hadoop.util.RunJar. RunJar unpacks your jar into a temporary folder (in your case /tmp/hadoop-ABC/hadoop-unjar2841422106164401019/) and loads it into the current class loader. Finally, it invokes your main class to run your MapReduce application.

Did you also add your jar to the CLASSPATH? If so, the class loader sees both your jar and the unpacked folder. I guess that is why Spring complains about it.
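To confirm this, a small diagnostic like the following (class name hypothetical) can print every copy of persistence.xml the class loader sees; run under 'hadoop jar' with the jar also on the CLASSPATH, it should print two entries, one inside apnJar.jar and one under the hadoop-unjar temp folder, which matches the conflict in the stack trace:

    import java.net.URL;
    import java.util.Enumeration;

    public class PersistenceXmlLocator {
        public static void main(String[] args) throws Exception {
            // List every META-INF/persistence.xml visible to the
            // context class loader; duplicates explain the
            // "Conflicting persistence unit definitions" error.
            Enumeration<URL> urls = Thread.currentThread().getContextClassLoader()
                    .getResources("META-INF/persistence.xml");
            while (urls.hasMoreElements()) {
                System.out.println(urls.nextElement());
            }
        }
    }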

Answered 2013-06-03T08:23:07.723