I'm trying to learn encryption, and I came up with the following code:
import java.util.Base64

import javax.crypto.Cipher
import javax.crypto.spec.{IvParameterSpec, SecretKeySpec}

object JavaCryptoEncryption {
  val Algorithm = "AES/CBC/PKCS5Padding"
  // Fixed all-zero IV, shared by encrypt and decrypt
  val IvSpec = new IvParameterSpec(new Array[Byte](16))

  def encrypt(text: String, b64secret: String): String = {
    val cipher = Cipher.getInstance(Algorithm)
    val key = new SecretKeySpec(Base64.getDecoder.decode(b64secret), "AES")
    cipher.init(Cipher.ENCRYPT_MODE, key, IvSpec)
    new String(Base64.getEncoder.encode(cipher.doFinal(text.getBytes("utf-8"))), "utf-8")
  }

  def decrypt(text: String, b64secret: String): String = {
    val cipher = Cipher.getInstance(Algorithm)
    val key = new SecretKeySpec(Base64.getDecoder.decode(b64secret), "AES")
    cipher.init(Cipher.DECRYPT_MODE, key, IvSpec)
    new String(cipher.doFinal(Base64.getDecoder.decode(text.getBytes("utf-8"))), "utf-8")
  }
}
Elsewhere in my system I define and store the key. When I apply JavaCryptoEncryption.encrypt and JavaCryptoEncryption.decrypt directly to a String value, everything works fine. However, when I wrap them in UDFs and apply them to a DataFrame column, I get org.apache.spark.SparkException: Task not serializable. Similar code (without the IV) works fine for AES/ECB/PKCS5Padding. Do some cipher modes simply not support parallelism? Is there a workaround, or is there a different cause?
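For context, this is roughly how I call the functions; the DataFrame, column name, and key below are placeholders rather than my real setup:

import java.util.Base64

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, udf}

object EncryptColumnExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("encrypt-example").getOrCreate()
    import spark.implicits._

    // Placeholder key: 16 zero bytes, Base64-encoded (in reality it is loaded from elsewhere)
    val b64secret = Base64.getEncoder.encodeToString(new Array[Byte](16))

    // Calling the functions directly on a String works fine
    val roundTrip = JavaCryptoEncryption.decrypt(JavaCryptoEncryption.encrypt("hello", b64secret), b64secret)
    println(roundTrip)

    // Wrapping the same call in a UDF and applying it to a column
    // is what fails with "Task not serializable"
    val encryptUdf = udf((s: String) => JavaCryptoEncryption.encrypt(s, b64secret))

    val df = Seq("alice", "bob").toDF("name")
    df.withColumn("name_enc", encryptUdf(col("name"))).show(false)
  }
}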