I'm new to developing with Spark. My program consists of just a single main method:
import java.io.StringWriter;
import java.net.UnknownHostException;
import java.util.HashMap;
import java.util.Map;

import freemarker.template.Configuration;
import freemarker.template.Template;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import spark.Request;
import spark.Response;
import spark.Route;
import spark.Spark;

public class SparkHomework {

    // The logger declaration was not in the original snippet; an SLF4J logger is assumed here.
    private static final Logger logger = LoggerFactory.getLogger(SparkHomework.class);

    public static void main(String[] args) throws UnknownHostException {
        final Configuration configuration = new Configuration();
        configuration.setClassForTemplateLoading(SparkHomework.class, "/");

        // Serve GET / by rendering answer.ftl with FreeMarker.
        Spark.get(new Route("/") {
            @Override
            public Object handle(final Request request, final Response response) {
                StringWriter writer = new StringWriter();
                try {
                    Template helloTemplate = configuration.getTemplate("answer.ftl");
                    Map<String, String> answerMap = new HashMap<String, String>();
                    // createAnswer() is omitted here; it just returns the String to render.
                    answerMap.put("answer", createAnswer());
                    helloTemplate.process(answerMap, writer);
                } catch (Exception e) {
                    logger.error("Failed", e);
                    halt(500);
                }
                return writer;
            }
        });
    }
}
When I build the project with Maven, I get this warning:

[WARNING] The POM for com.sparkjava:spark-core:jar:1.1.1 is missing, no dependency information available

And when I then run the program, it fails with the following error:
Connected to the target VM, address: '127.0.0.1:51871', transport: 'socket'
Exception in thread "Thread-1" java.lang.NoClassDefFoundError: javax/servlet/Filter
at spark.Spark$1.run(Spark.java:303)
at java.lang.Thread.run(Thread.java:724)
Caused by: java.lang.ClassNotFoundException: javax.servlet.Filter
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 2 more
Disconnected from the target VM, address: '127.0.0.1:51871', transport: 'socket'
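
For reference, spark-core is declared in the pom.xml roughly like this (the enclosing <dependencies> section and the rest of the build file are omitted):

<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>1.1.1</version>
</dependency>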