
I am using Apache Spark 1.2.1 and Scala 2.10.4. I am trying to get the MovieLensALS example working, but I am running into errors with the scopt library, which the code requires. Any help would be appreciated. My build.sbt is as follows:

name := "Movie Recommender System"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.2.1"

libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.2.1"

libraryDependencies += "com.github.scopt" %% "scopt" % "3.2.0"

resolvers += Resolver.sonatypeRepo("public")

The error I get is as follows:

   Exception in thread "main" java.lang.NoClassDefFoundError: scopt/OptionParser
    at MovieLensALS.main(MovieLensALS.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

    Caused by: java.lang.ClassNotFoundException: scopt.OptionParser
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
    ... 8 more
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

When running sbt assembly to build the jar, I get the following error:

[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: assembly
[error] assembly
[error]         ^

EDIT: Per Justin Piphony's suggestion, the solution listed on sbt-assembly's GitHub page helped fix this error. Basically, create a file assembly.sbt in the project/ directory and add the line addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

Note that the plugin version should match the one you are using.
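The fix described above can be sketched as a single plugin file (a sketch, assuming a standard sbt project layout; adjust the plugin version to match your sbt version):

```scala
// project/assembly.sbt — registers the sbt-assembly plugin,
// which adds the `assembly` task used to build a fat jar
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
```

After adding this file, re-run `sbt assembly` from the project root; the `assembly` command is now recognized and produces a jar under target/.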


2 Answers


You need to package scopt in your jar. sbt does not do this by default. To create this fat jar, you need to use sbt-assembly.
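A common companion change is to scope the Spark artifacts as "provided" so only scopt (and other true runtime dependencies) end up in the fat jar (a sketch; the `provided` scoping assumes the jar is launched via spark-submit, which supplies the Spark classes at runtime):

```scala
// build.sbt (dependency section) — Spark is "provided" so sbt-assembly
// does not bundle it; scopt is left unscoped so it lands in the fat jar.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.2.1" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.2.1" % "provided",
  "com.github.scopt" %% "scopt"       % "3.2.0"
)
```

This keeps the assembled jar small and avoids merge conflicts between Spark's own transitive dependencies during assembly.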

answered 2015-03-29T02:26:50.187

If you use Maven to package your Spark project, you need to add the maven-assembly-plugin, which packages the dependencies into the jar:

<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>2.5</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id> <!-- this is used for inheritance merges -->
      <phase>package</phase> <!-- bind to the packaging phase -->
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
answered 2018-12-07T02:59:46.590