
I am writing an Apache Beam pipeline in Scala using Spotify's Scio library. I want to recursively search for files under a directory on a filesystem that may be HDFS, Alluxio, or GCS. A pattern like *.jar should find all matching files under the provided directory and all of its subdirectories.

The Apache Beam SDK provides the class org.apache.beam.sdk.io.FileIO for this purpose, and I can apply it with pipeline.apply(FileIO.match().filepattern(filesPattern)).
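
For context, a minimal sketch of how that match is applied from a Scio pipeline; the object name and the HDFS path are placeholders, not the actual job:

    import com.spotify.scio.ContextAndArgs
    import org.apache.beam.sdk.io.FileIO

    object MatchJars {
      def main(cmdlineArgs: Array[String]): Unit = {
        val (sc, args) = ContextAndArgs(cmdlineArgs)

        // Placeholder pattern: "*" only matches files directly under /data,
        // not files in its subdirectories.
        val filesPattern = "hdfs:///data/*.jar"

        // FileIO.match() yields a PCollection[MatchResult.Metadata]; it is applied
        // on the underlying Beam Pipeline that Scio wraps. `match` is a Scala
        // keyword, hence the backticks.
        sc.pipeline.apply(FileIO.`match`().filepattern(filesPattern))

        sc.run() // sc.close() on older Scio versions
      }
    }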

How do I recursively search for all files matching the provided pattern?

For now, I am trying an alternative approach: I create a ResourceId from the provided pattern, get the pattern's current directory, and then try to resolve all subdirectories under that directory with the resourceId.resolve() method. However, this throws an exception.

    val currentDir = FileSystems.matchNewResource(filesPattern, false).getCurrentDirectory
    val childDir = currentDir.resolve("{@literal *}", StandardResolveOptions.RESOLVE_DIRECTORY)

For the currentDir.resolve call, I get the following exception:

------------------------------------------------------------
 The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:546)
        at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
        at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:427)
        at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:813)
        at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:287)
        at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:213)
        at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1050)
        at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1126)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
        at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
        at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1126)
Caused by: java.lang.IllegalArgumentException: Illegal character in path at index 0: {@literal *}/
        at java.net.URI.create(URI.java:852)
        at java.net.URI.resolve(URI.java:1036)
        at org.apache.beam.sdk.io.hdfs.HadoopResourceId.resolve(HadoopResourceId.java:46)
        at com.sparkcognition.foundation.ingest.jobs.copyjob.FileOperations$.findFiles(BinaryFilesSink.scala:110)
        at com.sparkcognition.foundation.ingest.jobs.copyjob.BinaryFilesSink$.main(BinaryFilesSink.scala:39)
        at com.sparkcognition.foundation.ingest.jobs.copyjob.BinaryFilesSink.main(BinaryFilesSink.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
        ... 12 more
Caused by: java.net.URISyntaxException: Illegal character in path at index 0: {@literal *}/
        at java.net.URI$Parser.fail(URI.java:2848)
        at java.net.URI$Parser.checkChars(URI.java:3021)
        at java.net.URI$Parser.parseHierarchical(URI.java:3105)
        at java.net.URI$Parser.parse(URI.java:3063)
        at java.net.URI.<init>(URI.java:588)
        at java.net.URI.create(URI.java:850)
        ... 22 more

What is the correct way to recursively search for files with Apache Beam?

Reference: https://beam.apache.org/releases/javadoc/2.11.0/index.html?org/apache/beam/sdk/io/fs/ResourceId.html


1 Answer


It looks like you copied the code from some broken javadoc. Some older versions of the example code were published with errors around the asterisks, so the literal {@literal *} markup leaked into the sample instead of a plain wildcard.

To find all files under currentDir:

    val childDir = currentDir.resolve("**", StandardResolveOptions.RESOLVE_FILE)
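
As a follow-up, a hedged sketch of how the resolved spec could then be matched; the path is a placeholder, and whether "**" actually descends into subdirectories depends on the glob support of the underlying filesystem:

    import org.apache.beam.sdk.io.FileSystems
    import org.apache.beam.sdk.io.fs.ResolveOptions.StandardResolveOptions
    import scala.collection.JavaConverters._

    // Placeholder pattern; substitute the real HDFS/Alluxio/GCS path.
    // Assumes the relevant filesystem is registered (e.g. the
    // beam-sdks-java-io-hadoop-file-system module on the classpath for hdfs://).
    val filesPattern = "hdfs:///data/*.jar"

    // Directory containing the pattern, then a recursive file glob under it.
    val currentDir = FileSystems
      .matchNewResource(filesPattern, false /* isDirectory */)
      .getCurrentDirectory
    val allFiles = currentDir.resolve("**", StandardResolveOptions.RESOLVE_FILE)

    // The resolved spec can be matched directly, or passed to
    // FileIO.match().filepattern(...) inside the pipeline.
    val result = FileSystems.match(allFiles.toString)
    result.metadata().asScala.foreach(m => println(m.resourceId()))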