I am trying to deploy the simplest possible application on CloudBees using the Spark Java framework. The build produces a jar file, which I tried to deploy through Jenkins push->deploy, but it warned me that the deployment plugin cannot deploy jar files...
So instead I deployed my jar through the CloudBees SDK and its CLI:
bees app:deploy -t java -R java_version=1.7 target\myapp-with-dependencies.jar
It then reported that the application was deployed to my URL. But when I try to access that URL, I get a 502 Bad Gateway error...
However, when I run my main class, either from IntelliJ or from the jar file generated by Maven, the URL 127.0.0.1:8080 returns the expected Hello Spark.
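For reference, this is how I run the Maven-built jar locally, and it does serve Hello Spark on port 8080:

java -jar target\myapp-with-dependencies.jar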
Here is my main class:
import spark.Request;
import spark.Response;
import spark.Route;
import spark.Spark;

public class HelloSpark {
    public static void main(String[] args) {
        // Default to 8080 locally; on CloudBees the port is supplied via the app.port system property
        String port = System.getProperty("app.port", "8080");
        Spark.setPort(Integer.parseInt(port));
        Spark.get(new Route("/") {
            @Override
            public Object handle(Request request, Response response) {
                return "Hello Spark";
            }
        });
    }
}
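To see which port the app actually tries to bind to on CloudBees, I am also considering a variant that logs the resolved port before starting Spark. The fallback to a PORT environment variable is only a guess on my part (I don't know whether CloudBees sets one); the rest is the same class as above:

import spark.Request;
import spark.Response;
import spark.Route;
import spark.Spark;

public class HelloSparkPortLogging {
    public static void main(String[] args) {
        // app.port is what CloudBees should pass in; PORT is only a guessed fallback
        String envPort = System.getenv("PORT");
        String port = System.getProperty("app.port", envPort != null ? envPort : "8080");
        System.out.println("Binding Spark to port " + port);
        Spark.setPort(Integer.parseInt(port));
        Spark.get(new Route("/") {
            @Override
            public Object handle(Request request, Response response) {
                return "Hello Spark";
            }
        });
    }
}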
Here is my pom file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>spark-from-scratch</groupId>
    <artifactId>spark-from-scratch</artifactId>
    <version>1.0-SNAPSHOT</version>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <dependencies>
        <dependency>
            <groupId>spark</groupId>
            <artifactId>spark</artifactId>
            <version>0.9.9.4-SNAPSHOT</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <configuration>
                    <archive>
                        <manifest>
                            <mainClass>HelloSpark</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>