I run the standard HBase class RowCounter to count the rows of a Bigtable table, submitting it through the Dataproc GUI in the Google Cloud Console. It worked fine, but a few weeks later I tried to run a similar jar and the job failed for reasons I cannot explain. It does not look like a connection-parameter problem, because if I pass a non-existent HBase table name the job recognizes that.
The result is the same with HBase client versions 1.1.2 and 1.0.1.1. Version 1.0.1.1 comes from the examples; I found that a cluster set up with bdutil uses version 1.1.2.
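For reference, what I do in the GUI is roughly equivalent to the following command-line submission (only a sketch; the cluster name, jar name, and table name below are placeholders, not my actual values):

gcloud dataproc jobs submit hadoop \
    --cluster=my-cluster \
    --class=org.apache.hadoop.hbase.mapreduce.RowCounter \
    --jars=my-rowcounter-job.jar \
    -- my-bigtable-table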
16/02/08 14:35:34 INFO mapreduce.Job: map 100% reduce 0%
16/02/08 14:35:34 INFO mapreduce.Job: Task Id : attempt_1454940934781_0001_m_000000_0, Status : FAILED
Error: java.io.IOException: Cannot create a record reader because of a previous error. Please look at the previous logs lines from the task's full log for more details.
at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRecordReader(TableInputFormatBase.java:174)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:515)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:758)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.IllegalStateException: The input format instance has not been properly initialized. Ensure you call initializeTable either in your constructor or initialize method
at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getTable(TableInputFormatBase.java:585)
at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRecordReader(TableInputFormatBase.java:169)
... 8 more
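As far as I understand, the IllegalStateException means getTable() is reached before initializeTable() has been called. For context, this is roughly what TableInputFormatBase expects from a subclass (a minimal sketch with a hypothetical class name and hard-coded table name, not my actual code; RowCounter itself uses the stock TableInputFormat):

import java.io.IOException;

import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.mapreduce.TableInputFormatBase;
import org.apache.hadoop.mapreduce.JobContext;

// Hypothetical subclass, only to illustrate what the error message asks for.
public class MyTableInputFormat extends TableInputFormatBase {
    @Override
    protected void initialize(JobContext context) throws IOException {
        // initializeTable() must run before the record reader is created,
        // otherwise getTable() throws the IllegalStateException shown above.
        Connection connection =
            ConnectionFactory.createConnection(context.getConfiguration());
        initializeTable(connection, TableName.valueOf("my-table"));
    }
}

Since I am not using a custom input format, I don't see where this initialization could be going wrong on my side.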