I am trying to build a Spark fat jar with Gradle. The build succeeds, but the resulting file is subtly corrupted: trying to run it produces:
Error: Could not find or load main class shadow_test.Main
Caused by: java.lang.ClassNotFoundException: shadow_test.Main
The JAR itself looks fine: the supposedly missing class is in there, and when I unzip the archive and run from the exploded directory, the project works normally.
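To double-check that, a quick sketch like the following (the jar path is an assumption; adjust it to whatever the allJar task actually writes) reads the archive through java.util.jar.JarFile and confirms the entry exists:

import java.util.jar.JarFile

object CheckJar {
  def main(args: Array[String]): Unit = {
    // Hypothetical jar path; point it at the output of the allJar task
    val jar = new JarFile("build/libs/shadow-jar-repro-all.jar")
    // The entry the launcher claims it cannot find
    println(jar.getEntry("shadow_test/Main.class") != null) // prints true
    // Total number of entries in the archive
    println(jar.stream().count())
    jar.close()
  }
}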
Here is the build.gradle file:
plugins {
    id "scala"
    id 'com.github.johnrengelman.shadow' version '7.1.2'
}

ext {
    ver = [
        scala   : '2.11.12',
        scala_rt: '2.11',
        spark   : '2.4.4'
    ]
}

configurations {
    // Dependencies that will be provided at runtime in the cloud execution
    provided
    compileOnly.extendsFrom(provided)
    testImplementation.extendsFrom provided
}

repositories {
    mavenCentral()
}

dependencies {
    implementation "org.scala-lang:scala-library:$ver.scala"
    provided "org.apache.xbean:xbean-asm6-shaded:4.10"
    provided "org.apache.spark:spark-sql_$ver.scala_rt:$ver.spark"
    provided "org.apache.spark:spark-hive_$ver.scala_rt:$ver.spark"
    testImplementation "org.testng:testng:6.14.3"
}

tasks.register("allJar", com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar) {
    manifest {
        attributes "Main-Class": "shadow_test.Main"
    }
    from sourceSets.main.output
    configurations = [project.configurations.runtimeClasspath, project.configurations.provided]
    zip64 true
    mergeServiceFiles()
    with jar
}

test {
    useTestNG()
}
The Gradle version is 7.3.3.
The full code of a minimal project that reproduces the issue can be found at https://github.com/SashaOv/shadow-jar-repro
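For reference, the entry point is an ordinary Scala object; the actual class in the repro may differ, but the kind of Main the jar is expected to launch is roughly:

package shadow_test

import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    // Spark is declared in the custom "provided" configuration, so it is
    // expected to be supplied by the execution environment at runtime
    val spark = SparkSession.builder().appName("shadow-test").getOrCreate()
    println(spark.range(10).count())
    spark.stop()
  }
}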
Any clues would be appreciated.