I am using Spark 1.1. I have a Spark job that looks only for folders matching a particular pattern under a bucket (i.e., folders whose names start with...) and should process only those. I do this as follows:
// Expand the glob into the matching folders, then parallelize their paths
FileSystem fs = FileSystem.get(new Configuration(true));
FileStatus[] statusArr = fs.globStatus(new Path(inputPath));
List<FileStatus> statusList = Arrays.asList(statusArr);
List<String> pathsStr = convertFileStatusToPath(statusList);
JavaRDD<String> paths = sc.parallelize(pathsStr);
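Here convertFileStatusToPath is a small helper that just maps each FileStatus to its path string, along these lines:

private static List<String> convertFileStatusToPath(List<FileStatus> statusList) {
    List<String> paths = new ArrayList<String>(statusList.size());
    for (FileStatus status : statusList) {
        // Convert each matched FileStatus to its fully qualified path string
        paths.add(status.getPath().toString());
    }
    return paths;
}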
However, when running this job on a Google Cloud Storage path, gs://rsync-1/2014_07_31* (using the latest Google Cloud Storage connector, 1.2.9), I get the following error:
14/10/13 10:28:38 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/10/13 10:28:38 INFO util.Utils: Successfully started service 'Driver' on port 60379.
14/10/13 10:28:38 INFO worker.WorkerWatcher: Connecting to worker akka.tcp://sparkWorker@hadoop-w-9.c.taboola-qa-01.internal:45212/user/Worker
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:40)
at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: java.lang.IllegalArgumentException: Wrong bucket: rsync-1, in path: gs://rsync-1/2014_07_31*, expected bucket: hadoop-config
at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem.checkPath(GoogleHadoopFileSystem.java:100)
at org.apache.hadoop.fs.FileSystem.makeQualified(FileSystem.java:294)
at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.makeQualified(GoogleHadoopFileSystemBase.java:457)
at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem.getGcsPath(GoogleHadoopFileSystem.java:163)
at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.globStatus(GoogleHadoopFileSystemBase.java:1052)
at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.globStatus(GoogleHadoopFileSystemBase.java:1027)
at com.doit.customer.dataconverter.Phase0.main(Phase0.java:578)
... 6 more
When I run this job on a local folder, everything works fine.
hadoop-config is the bucket I use for deploying the Spark cluster on Google Compute Engine (using the bdutil 0.35.2 tool).
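From the stack trace, my guess is that FileSystem.get(new Configuration(true)) returns the cluster's default filesystem, which bdutil has rooted at gs://hadoop-config, so globbing a path in a different bucket fails the connector's checkPath validation. Would resolving the filesystem from the input path itself be the right approach? A sketch of what I mean, using the same classes as above (untested):

// Resolve the FileSystem from the path's own URI (bucket rsync-1)
// rather than from the cluster-wide default rooted at gs://hadoop-config
Path globPath = new Path(inputPath);
FileSystem fs = globPath.getFileSystem(new Configuration(true));
FileStatus[] statusArr = fs.globStatus(globPath);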