
I'm trying to call org.apache.hadoop.fs.FileUtil.unTar directly from the pyspark shell.

I know I can access the underlying JVM (via py4j) with sc._jvm to do this, but I'm having trouble actually connecting to HDFS (even though my pyspark session otherwise works perfectly fine and is able to run jobs against the cluster).

For example:

hdpUntar = sc._jvm.org.apache.hadoop.fs.FileUtil.unTar
hdpFile = sc._jvm.java.io.File

root    = hdpFile("hdfs://<url>/user/<file>")
target  = hdpFile("hdfs://<url>/user/myuser/untar")

hdpUntar(root, target)

Unfortunately, this doesn't work:

Py4JJavaError: An error occurred while calling z:org.apache.hadoop.fs.FileUtil.unTar.

: ExitCodeException exitCode=128: tar: Cannot connect to hdfs: resolve failed
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
    at org.apache.hadoop.util.Shell.run(Shell.java:455)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
    at org.apache.hadoop.fs.FileUtil.unTarUsingTar(FileUtil.java:675)
    at org.apache.hadoop.fs.FileUtil.unTar(FileUtil.java:651)
    at sun.reflect.GeneratedMethodAccessor12.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:259)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:745)

1 Answer


This should work on Databricks:

# Grab the Hadoop filesystem classes through the py4j gateway
URI           = sc._gateway.jvm.java.net.URI
Path          = sc._gateway.jvm.org.apache.hadoop.fs.Path
FileSystem    = sc._gateway.jvm.org.apache.hadoop.fs.FileSystem
Configuration = sc._gateway.jvm.org.apache.hadoop.conf.Configuration

# Get a FileSystem handle for the DBFS root
fs = FileSystem.get(URI("dbfs:///"), Configuration())

# List a directory through the Hadoop FileSystem API
status = fs.listStatus(Path('dbfs:/mnt/gobidatagen/tpch/sf100_parquet/'))

for fileStatus in status:
    print(fileStatus.getPath())
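For the original unTar use case, note that the exitCode=128 error in the question comes from FileUtil.unTar delegating to the system tar command, which cannot resolve hdfs:// URIs. One workaround (a sketch, not the only approach) is to copy the archive to the local filesystem first, e.g. with fs.copyToLocalFile, and extract it there. The local-extraction step can be done entirely in Python with the standard tarfile module; all paths below are hypothetical stand-ins:

```python
import os
import tarfile
import tempfile

# Create a sample archive to stand in for one fetched from HDFS.
# In practice you would fetch it first, e.g.:
#   fs.copyToLocalFile(Path("hdfs://<url>/user/<file>"), Path(local_tar))
workdir = tempfile.mkdtemp()
src_dir = os.path.join(workdir, "src")
os.makedirs(src_dir)
with open(os.path.join(src_dir, "data.txt"), "w") as f:
    f.write("hello")

local_tar = os.path.join(workdir, "archive.tar")
with tarfile.open(local_tar, "w") as tar:
    tar.add(os.path.join(src_dir, "data.txt"), arcname="data.txt")

# Extract locally -- the step that FileUtil.unTar would otherwise
# hand off to the system `tar` command.
target = os.path.join(workdir, "untar")
os.makedirs(target)
with tarfile.open(local_tar) as tar:
    tar.extractall(target)

print(sorted(os.listdir(target)))  # ['data.txt']
```

The extracted files could then be pushed back to HDFS with fs.copyFromLocalFile if needed.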

Answered 2021-11-30T16:41:01.270