
Following the official guide at http://sqoop.apache.org/docs/1.99.2/Sqoop5MinutesDemo.html , I successfully created a job.

However, when I run the command submission start --jid 1, I get the following error message:

Exception has occurred during processing command 
Server has returned exception: Exception: java.lang.Throwable Message: GENERIC_JDBC_CONNECTOR_0002:Unable to execute the SQL statement

Here is the information for my job.

Database configuration

Schema name: invoice
Table name: ds_msg_log
Table SQL statement: 
Table column names: *
Partition column name: 
Boundary query: 

Output configuration

Storage type: HDFS
Output format: TEXT_FILE
Output directory: /user/root/ds_msg_log

Throttling resources

Extractors: 
Loaders: 

Since the official guide does not explain how these values should be set, does anyone know what is wrong with my job configuration?

Here is the log:

Stack trace:
     at  org.apache.sqoop.connector.jdbc.GenericJdbcExecutor (GenericJdbcExecutor.java:59)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:155)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:48)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:37)  
     at  org.apache.sqoop.framework.FrameworkManager (FrameworkManager.java:447)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:112)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:98)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:68)  
     at  org.apache.sqoop.server.v1.SubmissionServlet (SubmissionServlet.java:44)  
     at  org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:63)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:637)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:717)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)  
     at  org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)  
     at  org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)  
     at  org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)  
     at  org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:102)  
     at  org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)  
     at  org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)  
     at  org.apache.coyote.http11.Http11Processor (Http11Processor.java:859)  
     at  org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:602)  
     at  org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)  
     at  java.lang.Thread (Thread.java:724)  
Caused by: Exception: java.lang.Throwable Message: ERROR: schema "invoice" does not exist
  Position: 46
Stack trace:
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:2102)  
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:1835)  
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:257)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:500)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:374)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:254)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcExecutor (GenericJdbcExecutor.java:56)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:155)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:48)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:37)  
     at  org.apache.sqoop.framework.FrameworkManager (FrameworkManager.java:447)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:112)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:98)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:68)  
     at  org.apache.sqoop.server.v1.SubmissionServlet (SubmissionServlet.java:44)  
     at  org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:63)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:637)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:717)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)  
     at  org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)  
     at  org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)  
     at  org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)  
     at  org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:102)  
     at  org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)  
     at  org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)  
     at  org.apache.coyote.http11.Http11Processor (Http11Processor.java:859)  
     at  org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:602)  
     at  org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)  
     at  java.lang.Thread (Thread.java:724)  

2 Answers


The value "*" in "Table column names" is not required, since the default is all columns. It would also be helpful if you could share the server log so we can see where the problem is.

You can get additional information, such as the entire stack trace of the exception, by switching the shell into verbose mode:

set option --name verbose --value true
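
That said, the stack trace you pasted already contains the root cause: PostgreSQL is rejecting the generated query with ERROR: schema "invoice" does not exist. You can check which schemas and tables the connection's database actually contains with standard catalog queries; a minimal sketch to run in psql (connecting with the same database and login that your Sqoop connection uses is assumed):

SELECT schema_name FROM information_schema.schemata;

SELECT table_schema, table_name
FROM information_schema.tables
WHERE table_name = 'ds_msg_log';

Note that in PostgreSQL a database and a schema are not the same thing: if invoice is the name of your database rather than a schema inside it, set Schema name to the schema that really holds ds_msg_log (often public), or leave it empty if the table is on the connection's default search_path.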
answered 2013-09-03T08:51:32.390

Table column names: *

You cannot use *; use a comma-separated list of column names instead. You should also give a column name as the partition column; any column can be used for partitioning (it is used to split the import job into multiple tasks for parallel processing). The parameters that are not documented can be left empty. Enter the integer choices to select HDFS (the storage) and the file format (sequence file / text file).

Here is a sample job that was created (show job --jid yourjob-id):

sqoop:000> show job --jid 146

1 job(s) to show:

Job with id 146 and name ImportJob (created 10/10/2013 3:46 PM, updated 10/10/2013 3:46 PM)

Using Connection id 149 and Connector id 1

Database configuration

Schema name:  xx

Table name:  xxx

Table SQL statement: 

Table column names: one, two, thre

Partition column name: one

Boundary query: 

Output configuration

Storage type: HDFS

Output format: TEXT_FILE

Output directory: /devanms/

Throttling resources

Extractors: 

Loaders: 
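
To fix the job from the question, the same fields can be edited in place instead of recreating the job. A minimal sketch, assuming the Sqoop 1.99.2 shell's update command and that the failing job has id 1; the schema and column names below are placeholders only, so replace them with a schema that actually exists in your PostgreSQL database and with real columns of ds_msg_log:

sqoop:000> update job --jid 1
...
Schema name: public
Table name: ds_msg_log
Table column names: id, msg
Partition column name: id
...

The shell walks through the same prompts shown in the sample above, so the remaining fields (Table SQL statement, Boundary query, Extractors, Loaders) can simply be left empty.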

Here is my blog post on the Sqoop Java client:

http://devslogics.blogspot.in/2013/09/sqoop-java-client.html

answered 2013-11-04T11:36:48.573