I created a table (say table1) in Big SQL and loaded data from HDFS into table1. Now I need to load data from table1 into another table, table2, based on some conditions, and more data is added to table2 every day. New data is loaded into table1 daily, and the corresponding new data should also go into table2.

I tried the following approaches.
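For reference, what I am after is an incremental insert-select. In standard ANSI SQL it would look like the sketch below (using the column names from my table2 definition; whether Big SQL version 1 accepts this exact form is exactly my question):

```sql
-- Standard SQL form of an append-style insert-select:
-- add aggregated rows from table1 into table2 without
-- touching the rows already there. Note there is no
-- "table" keyword and no "as" before the SELECT.
INSERT INTO table2 (URL_NAME, TODAY_DATE, COUNT)
SELECT uri, localtimestamp, COUNT(*)
FROM table1
GROUP BY uri;
```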
First:
insert append into table table2 as select uri,localtimestamp,count(*) from table1 group by uri order by uri LIMIT 100;
I got an SQL exception: [State: 42601][Code: -104]: Parse error:
insert append into table table2 as select uri,localtimestamp,count(*) from table1 group by uri order by uri LIMIT 100;
Second:
insert into table table2 as select uri,localtimestamp,count(*) from table1 group by uri order by uri LIMIT 100;
I got an SQL exception: [State: 58004][Code: 15]: BIGSQL-GEN-0010 Found an internal error:
"Unable to execute query 'insert into table table2 as select uri,localtimestamp,count(*) from table1 group by uri order by uri LIMIT 100;': expected keyword values".
Third:
create table if not exists table2(URL_NAME,TODAY_DATE,COUNT) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' as select uri,localtimestamp,count(*) from table1 where request_timestamp=localtimestamp group by uri order by uri LIMIT 100;
In this case a new table is created every day, whereas I want to keep the old data and append the new data.
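Conceptually, what I want instead of a daily CTAS is to create the table once and then append to it every day. In standard SQL (not necessarily valid Big SQL v1 syntax, and the request_timestamp filter is only illustrative) that pattern would be:

```sql
-- Run once: create the target table if it does not already exist.
CREATE TABLE IF NOT EXISTS table2 (
  URL_NAME   VARCHAR(500),
  TODAY_DATE VARCHAR(50),
  COUNT      INT
);

-- Run daily: append only the current day's aggregates from table1,
-- leaving previously loaded rows in place.
INSERT INTO table2
SELECT uri, localtimestamp, COUNT(*)
FROM table1
WHERE request_timestamp = localtimestamp
GROUP BY uri;
```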
Fourth: I created table2:
CREATE EXTERNAL TABLE table2 (URL_NAME VARCHAR(500),DATE varchar(50),COUNT INT) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';
0 rows affected (total: 0.22s)
insert overwrite table table2 select uri,localtimestamp,count(*) from table1 group by uri order by uri LIMIT 100;
I got an SQL exception:
[State: 42601][Code: -104]: Parse error:
<query>insert overwrite table table2 select uri,localtimestamp,count(*) from table1 group by uri order by uri LIMIT 100;</query> Expecting token <into> after token <insert>
Fifth:
Load from sql query 'select uri, request_timestamp,1 from table1 where $conditions' split column uri into table table2;
I got an SQL exception:
[State: 58004][Code: 15]: BIGSQL-GEN-0010 Found an internal error: 'Unable to execute query
'Load from sql query 'select uri, request_timestamp,1 from table1 where $conditions' split column uri into table table2'
: Parse error: keyword hbase or hive expected'.
If I use the keyword hive:
I got an SQL exception:
[State: 58004][Code: 15]: BIGSQL-GEN-0010 Found an internal error: 'Unable to execute query
'Load hive from sql query 'select uri, request_timestamp,1 from table1 where $conditions' split column uri into table table2'
': raw expression ended at (line: 1, column: 143): The statement failed due to an error in the Hive MetaStore. Hadoop log entry identifier: "[4d4e59269]": com.ibm.biginsights.catalog.translator.hive.HiveExceptionTranslator$HiveNestedException: FAILED: ParseException line 1:5 mismatched input 'from' expecting data near 'load' in load statement
Any idea how to use an INSERT INTO statement, or how to load data from one table into another, using IBM Big SQL (version 1)?
UPDATE
I also tried LOAD, but got exceptions:
LOAD FROM SQL QUERY 'select t1.uri, t1.request_timestamp,t1.cell_lac from sample.web3 t1 where $conditions' split column t1.uri into table sample.u2_table;
I got an SQL exception: [State: 58004][Code: 15]: BIGSQL-GEN-0010 Found an internal error: 'Unable to execute query
'LOAD FROM SQL QUERY 'select t1.uri, t1.request_timestamp,t1.cell_lac from sample.web3 t1 where $conditions' split column t1.uri into table sample.u2_table'
: Parse error: keyword hbase or hive expected'.
LOAD FROM SQL QUERY 'select t1.uri, t1.request_timestamp,t1.cell_lac from sample.web3 t1 where $conditions' split column t1.uri into hive table sample.u2_table;
I got an SQL exception: [State: 58004][Code: 15]: BIGSQL-GEN-0010 Found an internal error: 'Unable to execute query
'LOAD FROM SQL QUERY 'select t1.uri, t1.request_timestamp,t1.cell_lac from sample.web3 t1 where $conditions' split column t1.uri into hive table sample.u2_table'
: Parse error: keyword hbase or hive expected'.
LOAD FROM TABLE sample.web3 COLUMNS (uri,request_timestamp, cell_lac) INTO hive TABLE sample.u2_table APPEND WITH LOAD PROPERTIES (num.map.tasks = 1);
I got an SQL exception: [State: 58004][Code: 15]: BIGSQL-GEN-0010 Found an internal error: 'Unable to execute query
'LOAD FROM TABLE sample.web3 COLUMNS (uri,request_timestamp, cell_lac) INTO hive TABLE sample.u2_table APPEND WITH LOAD PROPERTIES (num.map.tasks = 1)'
: Parse error: keyword hbase or hive expected'.