I am trying to run the following Scala code on Spark, but for some reason the selective function is never called:
var lines = sc.textFile(inputPath + fileName, 1)
val lines2 = lines.map(l => selective(func, List(2, 3), List(1, 26), l, ";", 0))
lines2.toArray().foreach(l => out.write(l))
.......
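For context, here is a minimal sketch of the driver this snippet lives in. The SparkConf settings, the placeholder func, the output path, and the object layout are simplified stand-ins for my real job, not the exact code:

import java.io.PrintWriter
import org.apache.spark.{SparkConf, SparkContext}

object SelectiveJob {
  // Simplified stand-ins for the dictionary strings that selective appends to
  var CodeDictString: String = ""
  var PhoneDictString: String = ""

  // Placeholder for the real enc/dec function passed in as func
  val func: (String, Boolean) => String = (s, _) => s.reverse

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("selective-job").setMaster("local[1]")
    val sc = new SparkContext(conf)
    val out = new PrintWriter("/tmp/selective-output.txt") // placeholder path

    val inputPath = "/tmp/"     // placeholder
    val fileName = "input.csv"  // placeholder

    val lines = sc.textFile(inputPath + fileName, 1)
    val lines2 = lines.map(l => selective(func, List(2, 3), List(1, 26), l, ";", 0))

    // collect() (toArray() in my snippet above) is the action that should
    // force the lazy map and therefore the calls to selective
    lines2.collect().foreach(l => out.write(l))

    out.close()
    sc.stop()
  }

  // the selective definition shown below goes here
}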
The selective function is defined as follows:
def selective(f: (String, Boolean) => String,
              phoneFields: Seq[Int],
              codeFields: Seq[Int],
              in: String,
              delimiter: String,
              iMode: Int /* 0 for enc, 1 for dec */): String =
  in.split(delimiter, -1).zipWithIndex
    .map {
      case (str, ix) if phoneFields.contains(ix) || codeFields.contains(ix) =>
        val output = f(str, codeFields.contains(ix))
        val sTemp = str + ":" + output + "\n"
        // append the original:transformed pair to the dictionary strings
        // (CodeDictString / PhoneDictString are vars defined outside this function)
        if (iMode == 0 && codeFields.contains(ix) && str.nonEmpty)
          CodeDictString += sTemp
        else if (str.nonEmpty)
          PhoneDictString += sTemp
        output
      case other => other._1  // leave all other fields untouched
    }
    .mkString(";") + "\n"
The println statement is never executed, and the function does not return anything. sc is the SparkContext object.
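As a first diagnostic (not in the code above, just a sketch of what I can add on the driver):

// count() is an action, so it forces the lazy map and therefore the calls
// to selective; a non-zero input count at least confirms the file is read
println("input lines:  " + lines.count())
println("mapped lines: " + lines2.count())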