
I am using Scala IDE 4.6.0 and created a Maven project with the archetype provided by the book Spark in Action.

I have to use Scala 2.10.4 and Spark 1.6.2.

I created a basic project with this archetype and added the spark-hive dependency to the POM. The resulting POM is as follows:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
 <modelVersion>4.0.0</modelVersion>
 <groupId>com.toto</groupId>
 <artifactId>hive_test</artifactId>
 <version>0.0.1-SNAPSHOT</version>
 <name>hive_test</name>
 <description></description>
 <inceptionYear>2017</inceptionYear>

 <properties>
   <maven.compiler.source>1.8</maven.compiler.source>
   <maven.compiler.target>1.8</maven.compiler.target>
   <encoding>UTF-8</encoding>
   <scala.tools.version>2.10</scala.tools.version>
   <scala.version>2.10.4</scala.version>
   <spark.version>1.6.2</spark.version>
 </properties>

 <dependencies>
   <dependency>
     <groupId>org.scala-lang</groupId>
     <artifactId>scala-library</artifactId>
     <version>${scala.version}</version>
   </dependency>
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_${scala.tools.version}</artifactId>
     <version>${spark.version}</version>
     <scope>provided</scope>
   </dependency>
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-sql_${scala.tools.version}</artifactId>
     <version>${spark.version}</version>
     <scope>provided</scope>
   </dependency>
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-hive_${scala.tools.version}</artifactId>
     <version>${spark.version}</version>
     <scope>provided</scope>
   </dependency>

   <!-- Test -->
   <dependency>
     <groupId>junit</groupId>
     <artifactId>junit</artifactId>
     <version>4.11</version>
     <scope>test</scope>
   </dependency>
   <dependency>
     <groupId>org.specs2</groupId>
     <artifactId>specs2-core_${scala.tools.version}</artifactId>
     <version>3.7.2</version>
     <scope>test</scope>
   </dependency>
   <dependency>
     <groupId>org.specs2</groupId>
     <artifactId>specs2-junit_${scala.tools.version}</artifactId>
     <version>3.7.2</version>
     <scope>test</scope>
   </dependency>
   <dependency>
     <groupId>org.scalatest</groupId>
     <artifactId>scalatest_${scala.tools.version}</artifactId>
     <version>3.0.0-M15</version>
     <scope>test</scope>
   </dependency>

 </dependencies>

 <build>
   <sourceDirectory>src/main/scala</sourceDirectory>
   <testSourceDirectory>src/test/scala</testSourceDirectory>
   <plugins>

     <plugin>
       <!-- see http://davidb.github.com/scala-maven-plugin -->
       <groupId>net.alchim31.maven</groupId>
       <artifactId>scala-maven-plugin</artifactId>
       <version>3.2.0</version>
       <executions>
         <execution>
           <goals>
             <goal>compile</goal>
             <goal>testCompile</goal>
           </goals>
           <configuration>
             <args>
               <arg>-dependencyfile</arg>
               <arg>${project.build.directory}/.scala_dependencies</arg>
             </args>
           </configuration>
         </execution>
       </executions>
     </plugin>

     <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-shade-plugin</artifactId>
       <version>2.4.2</version>
       <configuration>
       </configuration>
       <executions>
         <execution>
           <phase>package</phase>
           <goals>
             <goal>shade</goal>
           </goals>
         </execution>
       </executions>
     </plugin>

     <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-surefire-plugin</artifactId>
       <version>2.18.1</version>
       <configuration>
         <useFile>false</useFile>
         <disableXmlReport>true</disableXmlReport>
         <!-- If you have classpath issue like NoDefClassError,... -->
         <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
         <includes>
           <include>**/*Test.*</include>
           <include>**/*Suite.*</include>
         </includes>
         <filters>
           <filter>
             <artifact>*:*</artifact>
             <excludes>
               <exclude>META-INF/*.SF</exclude>
               <exclude>META-INF/*.DSA</exclude>
               <exclude>META-INF/*.RSA</exclude>
             </excludes>
           </filter>
         </filters>
       </configuration>
     </plugin>

   </plugins>
 </build>
</project>

I also have an App.scala source file, used only for testing, which does not use Hive:

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object App {

  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Temp")/*.setMaster("local[*]")*/
    val sc = new SparkContext(conf)

    val col = sc.parallelize(0 to 100 by 5)
    val smp = col.sample(true, 4)
    val colCount = col.count
    val smpCount = smp.count

    println("orig count = " + colCount)
    println("sampled count = " + smpCount)
  }

}

When I run maven install, I get the following error:

[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building hive_test 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hive_test ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive_test ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- scala-maven-plugin:3.2.0:compile (default) @ hive_test ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.toto:hive_test:0.0.1-SNAPSHOT requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.6.2 requires scala version: 2.10.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hive_test ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory C:\Workspace\toto\hive_test\src\test\resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive_test ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- scala-maven-plugin:3.2.0:testCompile (default) @ hive_test ---
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.toto:hive_test:0.0.1-SNAPSHOT requires scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-remote_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-actor_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  com.typesafe.akka:akka-slf4j_2.10:2.3.11 requires scala version: 2.10.4
[WARNING]  org.apache.spark:spark-core_2.10:1.6.2 requires scala version: 2.10.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] C:\Workspace\toto\hive_test\src\test\scala:-1: info: compiling
[INFO] Compiling 3 source files to C:\Workspace\toto\hive_test\target\test-classes at 1497018999140
[ERROR] error: error while loading <root>, invalid CEN header (bad signature)
[ERROR] error: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
[ERROR]     at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[ERROR]     at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
[ERROR]     at scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
[ERROR]     at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
[ERROR]     at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
[ERROR]     at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
[ERROR]     at scala.tools.nsc.Main$.doCompile(Main.scala:79)
[ERROR]     at scala.tools.nsc.Driver.process(Driver.scala:54)
[ERROR]     at scala.tools.nsc.Driver.main(Driver.scala:67)
[ERROR]     at scala.tools.nsc.Main.main(Main.scala)
[ERROR]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR]     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
[ERROR]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
[ERROR]     at java.lang.reflect.Method.invoke(Unknown Source)
[ERROR]     at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
[ERROR]     at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
[ERROR] 
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 9.773 s
[INFO] Finished at: 2017-06-09T16:36:40+02:00
[INFO] Final Memory: 25M/488M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:testCompile (default) on project hive_test: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

With the same POM but without the spark-hive_2.10 dependency, everything works fine. But in my case I want to read from and create a table with Spark, so I need the spark-hive dependency.
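For context, this is the kind of code spark-hive is needed for. A minimal sketch using the Spark 1.6 HiveContext API (the object name `HiveApp` and the table `src` are illustrative, not from the original project):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveApp {

  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("HiveTest")
    val sc = new SparkContext(conf)

    // In Spark 1.6, HiveContext is the entry point for Hive tables;
    // it requires the spark-hive_2.10 artifact on the classpath.
    val hiveContext = new HiveContext(sc)

    // Create a table and query it with HiveQL
    hiveContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
    val df = hiveContext.sql("SELECT key, value FROM src")
    df.show()
  }

}
```

This cannot compile without the spark-hive dependency, which is why the build failure blocks the whole use case.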

Can you help me solve this problem?


1 Answer


The warning messages

[WARNING]  org.apache.spark:spark-core_2.10:1.6.2 requires scala version: 2.10.5
[WARNING] Multiple versions of scala libraries detected!

suggest that you need to define the Scala version as 2.10.5, but you have defined it as 2.10.4:

<scala.version>2.10.4</scala.version>

should be

<scala.version>2.10.5</scala.version>
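As an optional safeguard (not part of the original answer), the Maven Enforcer Plugin's dependencyConvergence rule can turn such version mismatches into a build failure instead of a warning, so they are caught early. A sketch of the plugin configuration:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.4.1</version>
  <executions>
    <execution>
      <id>enforce-convergence</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- Fails the build when two versions of the same artifact
               (e.g. scala-library 2.10.4 vs 2.10.5) are on the classpath -->
          <dependencyConvergence/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```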
Answered on 2017-06-09T16:34:17.720