I am using the HBase client 2.1.7 to connect to my server (same version, 2.1.7):
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>2.1.7</version>
</dependency>
There is a user that has permission to read/write the tables on the server.

User = LTzm@yA$U

My code for this looks like the following:
String hadoop_user_key = "HADOOP_USER_NAME";
String user = "LTzm@yA$U";
System.setProperty(hadoop_user_key, user);
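For reference, with Hadoop's SIMPLE (non-Kerberos) authentication the client derives the effective user from the `HADOOP_USER_NAME` system property (or environment variable). Here is a minimal sketch of that pattern in plain Java, with no HBase dependency; `HadoopUserDemo` and `setAndReadUser` are illustrative names, not part of any library:

```java
public class HadoopUserDemo {
    static final String HADOOP_USER_KEY = "HADOOP_USER_NAME";

    // Sets the effective user the same way the question's code does,
    // and returns what a client library would later read back.
    static String setAndReadUser(String user) {
        System.setProperty(HADOOP_USER_KEY, user);
        return System.getProperty(HADOOP_USER_KEY);
    }

    public static void main(String[] args) {
        // The full username, '@' and '$' included, survives the property round-trip.
        System.out.println(setAndReadUser("LTzm@yA$U"));
    }
}
```

So the special characters are not lost at the system-property level; whatever truncation happens must occur later, inside the client or server.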
Now, when I try to read a key from the table, I get the following error:

Error log: ! Caused by: org.apache.hadoop.hbase.security.AccessDeniedException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'LTzm' (table=table_name, action=READ)
The strange part is that writes work fine. To verify that the correct user was being passed for writes, I deleted the user and re-ran the code; the write then failed with this error:

Error log: ! org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=LTzm@yA$U, scope=table_name, family=d:visitId, params=[table=table_name,family=d:visitId], action=WRITE)
Reads failed again as well:

Error log: ! org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'LTzm' (table=table_name, action=READ)
Somehow 'LTzm' is being passed for the read call, while 'LTzm@yA$U' is being passed for the write call. Can anyone help me figure out what is wrong here? Is '@' (or special symbols in general) not allowed in HBase usernames, and if so, how does it still work for the write call?
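One observation worth noting: 'LTzm' is exactly the part of 'LTzm@yA$U' before the '@', which is the same truncation you get when a Kerberos-style principal ("shortname@REALM") is reduced to its short name. The sketch below only illustrates that pattern in plain Java; it is not Hadoop's actual KerberosName/auth_to_local implementation, just a hypothetical helper showing the split the read-side logs seem to exhibit:

```java
public class ShortNameDemo {
    // Illustrative helper: a Kerberos-style principal "shortname@REALM"
    // keeps only the portion before the first '@' when shortened.
    static String shortName(String principal) {
        int at = principal.indexOf('@');
        return at < 0 ? principal : principal.substring(0, at);
    }

    public static void main(String[] args) {
        // Shortening the question's username yields exactly the user
        // reported in the READ error log.
        System.out.println(shortName("LTzm@yA$U"));
    }
}
```

If something on the read path treats the username as a principal and shortens it this way, that would explain why the READ errors report 'LTzm' while the WRITE errors report the full 'LTzm@yA$U'.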
Edit 1: Here is the function that creates the connection:
public static Connection createConnection() throws IOException {
    String hadoop_user_key = "HADOOP_USER_NAME";
    String user = "LTzm@yA$U";

    Map<String, String> configMap = new HashMap<>();
    configMap.put("hbase.rootdir", "hdfs://session/apps/hbase/data");
    configMap.put("hbase.zookeeper.quorum", "ip1,ip2");
    configMap.put("zookeeper.znode.parent", "/hbase");
    configMap.put("hbase.rpc.timeout", "400");
    configMap.put("hbase.rpc.shortoperation.timeout", "400");
    configMap.put("hbase.client.meta.operation.timeout", "5000");
    configMap.put("hbase.rpc.engine", "org.apache.hadoop.hbase.ipc.SecureRpcEngine");
    configMap.put("hbase.client.retries.number", "3");
    configMap.put("hbase.client.operation.timeout", "3000");
    configMap.put(HConstants.HBASE_CLIENT_IPC_POOL_SIZE, "30");
    configMap.put("hbase.client.pause", "50");
    configMap.put("hbase.client.pause.cqtbe", "1000");
    configMap.put("hbase.client.max.total.tasks", "500");
    configMap.put("hbase.client.max.perserver.tasks", "50");
    configMap.put("hbase.client.max.perregion.tasks", "10");
    configMap.put("hbase.client.ipc.pool.type", "RoundRobinPool");
    configMap.put("hbase.rpc.read.timeout", "200");
    configMap.put("hbase.rpc.write.timeout", "200");
    configMap.put("hbase.client.write.buffer", "20971520");

    System.setProperty(hadoop_user_key, user);

    Configuration hConfig = HBaseConfiguration.create();
    for (Map.Entry<String, String> entry : configMap.entrySet()) {
        hConfig.set(entry.getKey(), entry.getValue());
    }
    UserGroupInformation.setConfiguration(hConfig);

    Connection hbaseConnection = ConnectionFactory.createConnection(hConfig);
    return hbaseConnection;
}
Here are the read and write calls:
protected Result read(String tableName, String rowKey) throws IOException {
    Get get = new Get(Bytes.toBytes(rowKey));
    get.addFamily(COLUMN_FAMILY_BYTES);
    Result res;
    Table hTable = null;
    try {
        hTable = getHbaseTable(tableName);
        res = hTable.get(get);
    } finally {
        if (hTable != null) {
            releaseHbaseTable(hTable);
        }
    }
    return res;
}

protected void writeRow(String tableName, String rowKey, Map<String, byte[]> columnData) throws IOException {
    Put cellPut = new Put(Bytes.toBytes(rowKey));
    for (Map.Entry<String, byte[]> column : columnData.entrySet()) {
        cellPut.addColumn(COLUMN_FAMILY_BYTES, Bytes.toBytes(column.getKey()), column.getValue());
    }
    Table hTable = null;
    try {
        hTable = getHbaseTable(tableName);
        if (hTable != null) {
            hTable.put(cellPut);
        }
    } finally {
        if (hTable != null) {
            releaseHbaseTable(hTable);
        }
    }
}

private Table getHbaseTable(String tableName) {
    try {
        return hbaseConnection.getTable(TableName.valueOf(tableName));
    } catch (IOException e) {
        LOGGER.error("Exception while adding table in factory.", e);
        return null;
    }
}