How can I load a text file from my local machine into a remote HBase? I found the following command, but I am really confused by it:
hadoop jar <path to hbase jar> importtsv -Dimporttsv.columns=a,b,c '-Dimporttsv.separator=,' <tablename> <inputdir>
I have the path to the text file, and the table name and columns are given inside the file itself. The text file contains create and put statements; how do I load and execute that file in the HBase shell? If anyone knows, please clear up my confusion.
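On the importtsv command: the placeholders stand for the HBase JAR on your cluster, the target table, and an HDFS input path, since the MapReduce job reads from HDFS rather than the local filesystem. A minimal sketch, in which the file names, HDFS paths, and JAR location are all assumptions for illustration:

```shell
# Hypothetical local CSV; importtsv reads from HDFS, so copy it there first.
hadoop fs -mkdir -p /user/hduser/input
hadoop fs -put /home/hduser/blogposts.csv /user/hduser/input/

# -Dimporttsv.columns maps CSV fields, in order, to HBase columns:
#   HBASE_ROW_KEY marks the field used as the row key; the rest are
#   family:qualifier pairs matching the table's column families.
# '-Dimporttsv.separator=,' switches the delimiter from the default tab to a comma.
# The JAR path varies by distribution; adjust it to your install.
hadoop jar /usr/lib/hbase/lib/hbase-server.jar importtsv \
    -Dimporttsv.columns=HBASE_ROW_KEY,post:title,post:author \
    '-Dimporttsv.separator=,' \
    blogpostss /user/hduser/input/blogposts.csv
```

With that mapping, a CSV line such as `post3,Third Post,Someone` would become row `post3` with `post:title` and `post:author` cells.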
Script file:
create 'blogpostss', 'post', 'image'
Run the following in the HBase shell to add some data:
put 'blogpostss', 'post1', 'post:title', 'Hello World'
put 'blogpostss', 'post1', 'post:author', 'The Author'
put 'blogpostss', 'post1', 'post:body', 'This is a blog post'
put 'blogpostss', 'post1', 'image:header', 'image1.jpg'
put 'blogpostss', 'post1', 'image:bodyimage', 'image2.jpg'
put 'blogpostss', 'post2', 'post:title', 'Another Post'
put 'blogpostss', 'post2', 'post:title', 'My Second Post'
put 'blogpostss', 'post1', 'post:body', 'This is an updated blog postss'
The following commands retrieve data:
get 'blogpostss', 'post1'
get 'blogpostss', 'post1', { COLUMN => 'post:title' }
get 'blogpostss', 'post1', { COLUMN => 'post:title', VERSIONS => 4 }
get 'blogpostss', 'post1', { COLUMNS => 'post:body', VERSIONS => 3 }
get 'blogpostss', 'post2'
get 'blogpostss', 'post2', { COLUMN => 'post:title' }
get 'blogpostss', 'post2', { COLUMN => 'post:title', VERSIONS => 4 }
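As for executing a file of create/put statements: the HBase shell can run a script non-interactively, either by passing the file as an argument or by piping it in. A sketch, assuming the statements above are saved at the hypothetical path /home/hduser/blogposts.txt:

```shell
# Pass the script path directly to the shell:
hbase shell /home/hduser/blogposts.txt

# Or pipe it in; this form also composes with ssh to a remote cluster:
cat /home/hduser/blogposts.txt | hbase shell
```

One caveat on the VERSIONS => 4 gets: a get can only return as many versions as the column family keeps, and depending on your HBase version the default maximum may be as low as 1. To actually see several versions of post:title, create the family with versions enabled, e.g. `create 'blogpostss', {NAME => 'post', VERSIONS => 4}, 'image'`.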