Newbie here, trying to run the Pail code from the DFS datastores part of Nathan Marz's Big Data book. What am I doing wrong? I am trying to connect to HDFS running on a VM. I also tried replacing hdfs with file. Any help is appreciated.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Before;
import org.junit.Test;

import com.backtype.hadoop.pail.Pail;
import com.backtype.hadoop.pail.Pail.TypedRecordOutputStream;
// (older dfs-datastores releases use the backtype.hadoop.pail package instead)

public class AppTest
{
    private App app = new App();
    private String path = "hdfs:////192.168.0.101:8080/mypail";

    @Before
    public void init() throws IllegalArgumentException, IOException {
        FileSystem fs = FileSystem.get(new Configuration());
        fs.delete(new Path(path), true);
    }

    @Test
    public void testAppAccess() throws IOException {
        Pail pail = Pail.create(path);
        TypedRecordOutputStream os = pail.openWrite();
        os.writeObject(new byte[] {1, 2, 3});
        os.writeObject(new byte[] {1, 2, 3, 4});
        os.writeObject(new byte[] {1, 2, 3, 4, 5});
        os.close();
    }
}
I get this error -
java.lang.IllegalArgumentException: Wrong FS: hdfs:/192.168.0.101:8080/mypail, expected: file:///
at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:80)
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:529)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:747)
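From the "expected: file:///" part, my guess is that FileSystem.get(new Configuration()) hands me the local filesystem because nothing tells the Configuration about the namenode. Would something like this be the right way to point the @Before at HDFS? Just a minimal sketch, assuming the namenode really listens on 192.168.0.101:8080 (I know the usual defaults are 8020/9000) and writing the URI with two slashes:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

@Before
public void init() throws IOException {
    Configuration conf = new Configuration();
    // Assumption: the namenode RPC port on my VM really is 8080.
    conf.set("fs.defaultFS", "hdfs://192.168.0.101:8080");
    // Resolving the FileSystem from the path's own URI should return a DistributedFileSystem,
    // so checkPath() no longer expects file:///.
    FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.0.101:8080/mypail"), conf);
    fs.delete(new Path("/mypail"), true);
}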
When I replace hdfs with file:/// I get -
java.io.IOException: Mkdirs failed to create file:/192.168.0.101:8080/mypail (exists=false, cwd=file:/Users/joshi/git/projectcsr/projectcsr)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:442)
at
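So with file:/// the local filesystem just treats "192.168.0.101:8080" as a literal directory name under / and fails to create it. To see which filesystem my test actually resolves to, I added a quick sanity check (a sketch, same Hadoop imports as above). If it prints file:///, then both the @Before and Pail.create("hdfs://...") hit the same problem, and I suppose fs.defaultFS (fs.default.name on older Hadoop) has to be set in a core-site.xml on the test classpath, or passed in explicitly, so that every new Configuration() resolves hdfs:// paths against the VM instead of the local filesystem:

// Sanity check: which filesystem does a fresh Configuration resolve to?
Configuration conf = new Configuration();
System.out.println(conf.get("fs.defaultFS"));        // file:/// unless a core-site.xml on the test classpath overrides it
System.out.println(FileSystem.get(conf).getUri());   // should be hdfs://192.168.0.101:8080, not file:///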