I've just started working with EMR, Hadoop/Spark, etc. I'm trying to run Scala code in spark-shell to upload a file to an EMRFS S3 location, but I'm getting the errors below.
Without any imports, if I run:
val bucketName = "bucket"
val outputPath = "test.txt"
scala> val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
<console>:27: error: not found: value PutObjectRequest
val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
^
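For context, the full upload I'm aiming for looks roughly like the sketch below. I believe the builder style in my snippet comes from the AWS SDK for Java v2, so the software.amazon.awssdk imports, the S3Client.create() call, and the RequestBody.fromString payload here are assumptions about what would be needed on the classpath, not code I have working:

import software.amazon.awssdk.core.sync.RequestBody
import software.amazon.awssdk.services.s3.S3Client
import software.amazon.awssdk.services.s3.model.PutObjectRequest

val bucketName = "bucket"
val outputPath = "test.txt"

// v2 style: build the request with the builder API and upload a small string payload
val s3 = S3Client.create() // region/credentials picked up from the EMR environment (assumption)
val putRequest = PutObjectRequest.builder().bucket(bucketName).key(outputPath).build()
s3.putObject(putRequest, RequestBody.fromString("hello from spark-shell"))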
Once I add the import for PutObjectRequest, I still get a different error:
scala> import com.amazonaws.services.s3.model.PutObjectRequest
import com.amazonaws.services.s3.model.PutObjectRequest
scala> val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
<console>:28: error: value builder is not a member of object com.amazonaws.services.s3.model.PutObjectRequest
val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
^
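For comparison, the class I imported above (com.amazonaws.services.s3.model.PutObjectRequest) is from the v1 SDK, which as far as I can tell constructs requests directly instead of using a builder. Something like this sketch is what I'd expect there (the /tmp/test.txt file and the AmazonS3ClientBuilder.defaultClient() call are just placeholders for illustration):

import java.io.File
import com.amazonaws.services.s3.AmazonS3ClientBuilder
import com.amazonaws.services.s3.model.PutObjectRequest

val bucketName = "bucket"
val outputPath = "test.txt"

// v1 style: construct the request with a constructor and use the default client
val s3 = AmazonS3ClientBuilder.defaultClient()
val putRequest = new PutObjectRequest(bucketName, outputPath, new File("/tmp/test.txt"))
s3.putObject(putRequest)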
I'm not sure what I'm missing. Any help would be appreciated!
Note: Spark version is 2.4.5