I am using the Spark Framework in my application, and I call
staticFileLocation("/META-INF/resources/");
so that I can serve webjars, which contain CSS and JS files. I also keep my own static resources in my project's src/main/resources/META-INF/resources
folder, because my Gradle build picks them up from there.
My build uses a fat-jar approach: everything ends up in a single JAR, and all files are served perfectly by Spark.
My problem is that when I run some unit tests standalone from Eclipse, the webjars are not served by Spark, even though I verified they are on the classpath; only my own project's static resources are served.
import static spark.Spark.staticFileLocation;
import java.io.InputStream;
import org.junit.Test;

@Test
public void testStartup() throws InterruptedException {
    // The bootstrap file IS on the classpath: this prints "false".
    InputStream schemaIS = this.getClass().getClassLoader()
            .getResourceAsStream("META-INF/resources/webjars/bootstrap/3.2.0/js/bootstrap.min.js");
    System.out.println(schemaIS == null);
    staticFileLocation("/META-INF/resources/");
    // depending on the trailing / the bootstrap js is found, but Spark never serves it
}
I think this has something to do with class loaders, but I cannot find a way to make it work. Looking at the Spark code, its documentation states: "The thread context class loader will be used for loading the resource."
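If that loader really is the one used for lookups, then I would expect something like the following sketch to help (this is just my guess, not a confirmed fix; MyTest is a placeholder class name): point the thread context class loader at the loader that can actually see the webjars before Spark initializes static file serving.

import static spark.Spark.staticFileLocation;

public class MyTest { // placeholder name for illustration
    public static void main(String[] args) {
        // Sketch only: make the thread context class loader the one that
        // can see the webjar resources before Spark starts serving.
        Thread.currentThread().setContextClassLoader(MyTest.class.getClassLoader());
        staticFileLocation("/META-INF/resources/");
    }
}

In my tests this did not obviously change anything, which is why I suspect the path handling rather than the loader itself.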
I also see that the code itself removes the trailing slash, which makes a big difference to a plain getResourceAsStream call.
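To illustrate what I mean by the slash making a difference, here is a small standalone sketch (my assumption being that Spark ends up concatenating the stripped location with the request path): ClassLoader.getResourceAsStream expects a path without a leading slash, so if the concatenated lookup path starts with "/", the lookup fails even though the resource is on the classpath.

import java.io.InputStream;

public class SlashCheck { // hypothetical demo class
    public static void main(String[] args) {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        String path = "META-INF/resources/webjars/bootstrap/3.2.0/js/bootstrap.min.js";
        // ClassLoader.getResourceAsStream takes paths WITHOUT a leading slash...
        InputStream ok = cl.getResourceAsStream(path);
        // ...so the same lookup with "/" prepended returns null.
        InputStream broken = cl.getResourceAsStream("/" + path);
        System.out.println(ok == null);     // false: resource found
        System.out.println(broken == null); // true: lookup fails
    }
}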
Is this a bug in Spark, or is there a way to make it work properly?