I am trying to resolve conflicts between libraries in my Maven project. I added the following plugin to the plugins section:
<plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-enforcer-plugin</artifactId>
        <version>1.4.1</version>
        <configuration>
            <rules>
                <dependencyConvergence/>
            </rules>
        </configuration>
    </plugin>
</plugins>
When I run mvn enforcer:enforce, I get several dependency convergence errors, such as this one:
Dependency convergence error for org.codehaus.jackson:jackson-mapper-asl:1.9.13 paths to dependency are:
+-org.test:service:1.0-SNAPSHOT
+-org.apache.spark:spark-sql_2.11:2.2.0
+-org.apache.spark:spark-core_2.11:2.2.0
+-org.apache.avro:avro:1.7.7
+-org.codehaus.jackson:jackson-mapper-asl:1.9.13
and
+-org.test:service:1.0-SNAPSHOT
+-org.apache.spark:spark-sql_2.11:2.2.0
+-org.apache.spark:spark-core_2.11:2.2.0
+-org.apache.avro:avro-mapred:1.7.7
+-org.apache.avro:avro-ipc:1.7.7
+-org.codehaus.jackson:jackson-mapper-asl:1.9.13
and
+-org.test:service:1.0-SNAPSHOT
+-org.apache.spark:spark-sql_2.11:2.2.0
+-org.apache.spark:spark-core_2.11:2.2.0
+-org.apache.avro:avro-mapred:1.7.7
+-org.apache.avro:avro-ipc:1.7.7
+-org.codehaus.jackson:jackson-mapper-asl:1.9.13
and
+-org.test:service:1.0-SNAPSHOT
+-org.apache.spark:spark-sql_2.11:2.2.0
+-org.apache.spark:spark-core_2.11:2.2.0
+-org.apache.avro:avro-mapred:1.7.7
+-org.codehaus.jackson:jackson-mapper-asl:1.9.13
and
+-org.test:service:1.0-SNAPSHOT
+-org.apache.spark:spark-sql_2.11:2.2.0
+-org.apache.parquet:parquet-hadoop:1.8.2
+-org.codehaus.jackson:jackson-mapper-asl:1.9.11
So how can I resolve these errors when packaging the JAR? In SBT this is easier, but with Maven I am stuck.
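From what I have read in the Maven documentation, one option seems to be forcing a single version in a dependencyManagement section so that every transitive path converges on it. A sketch of what I have in mind (picking 1.9.13 here is my own assumption, since it is the newer of the two versions in the error output):

```xml
<dependencyManagement>
    <dependencies>
        <!-- Pin one version of the conflicting artifact; dependencyManagement
             overrides the versions pulled in transitively by spark-sql/avro/parquet -->
        <dependency>
            <groupId>org.codehaus.jackson</groupId>
            <artifactId>jackson-mapper-asl</artifactId>
            <version>1.9.13</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```

Would this be the right approach here, or should I instead be adding exclusions on the Spark dependencies?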