
I'm using Spark on Google Cloud Platform. Apparently I'm reading a file from the filesystem at gs://<bucket>/dir/file, but the log output shows:

FileNotFoundException: gs:/bucket/dir/file (No such file or dir exist)

The missing / is obviously the problem. How can I solve this?


This is my code:

val files = Array(("call 1","gs://<bucket>/google-cloud-dataproc-metainfo/test/123.wav"))
val splitAudioFiles = sc.parallelize(files.map(x => splitAudio(x, 5, sc)))

def splitAudio(path: (String, String), interval: Int, sc: SparkContext): (String, Seq[(String,Int)]) = {
   val stopWords = sc.broadcast(loadTxtAsSet("gs://<bucket>/google-cloud-dataproc-metainfo/test/stopword.txt")).value
   val keyWords = sc.broadcast(loadTxtAsSet("gs://<bucket>/google-cloud-dataproc-metainfo/test/KeywordList.txt")).value

   val file = new File((path._2))
   val audioTitle = path._1
   val fileFormat: AudioFileFormat = AudioSystem.getAudioFileFormat(file)
   val format = fileFormat.getFormat

1 Answer


It looks like you are using AudioSystem.getAudioFileFormat(URL), which does not support gs:// URIs. Instead, you need to use the Hadoop FileSystem interface to obtain an InputStream for the file and call AudioSystem.getAudioFileFormat(InputStream). I think something like this should work:

import java.io.BufferedInputStream
import java.net.URL
import javax.sound.sampled.{AudioFileFormat, AudioSystem}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

val sc: SparkContext = ...
val urls: RDD[URL] = ...
val formats: RDD[AudioFileFormat] = urls.map(url => {
    val asUri = url.toURI
    val conf = new Configuration()
    // Resolve the gs:// URI through the Hadoop FileSystem API rather than java.io.File
    val hadoopPath = new Path(asUri)
    val hadoopFs = hadoopPath.getFileSystem(conf)
    // AudioSystem needs a stream that supports mark/reset, so wrap the Hadoop stream
    val inputStream = new BufferedInputStream(hadoopFs.open(hadoopPath))
    AudioSystem.getAudioFileFormat(inputStream)
})
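
Applied to the splitAudio function from the question, a minimal sketch of the same idea (reusing the imports above; the rest of the function body is left elided as in the question) could look like this:

def splitAudio(path: (String, String), interval: Int, sc: SparkContext): (String, Seq[(String, Int)]) = {
   val audioTitle = path._1
   // Open the gs:// object through the Hadoop FileSystem instead of java.io.File,
   // which treats the URI as a local path (hence the gs:/ in the error message)
   val hadoopPath = new Path(path._2)
   val hadoopFs = hadoopPath.getFileSystem(new Configuration())
   val inputStream = new BufferedInputStream(hadoopFs.open(hadoopPath))
   val fileFormat: AudioFileFormat = AudioSystem.getAudioFileFormat(inputStream)
   val format = fileFormat.getFormat
   // ... rest of the splitting logic unchanged
   ???
}

On a Dataproc cluster the GCS connector is preinstalled, so getFileSystem should resolve the gs:// scheme without any extra configuration.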
answered 2016-03-04T17:18:22.530