
Spark Java and the classpath

I'm trying to get started with http://www.sparkjava.com/, a small Java web framework. The instructions tell you to add it as a Maven dependency (done), but when I mvn package and run the result, I get a NoClassDefFoundError for spark/Route.

I assume this is from Spark not being in my classpath. How can I add it? Would it go in pom.xml?

EDIT: Sorry, here is my pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.bernsteinbear.myapp</groupId>
  <artifactId>myapp</artifactId>
  <packaging>jar</packaging>
  <version>1.0-SNAPSHOT</version>
  <name>myapp</name>
  <url>http://maven.apache.org</url>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>com.sparkjava</groupId>
      <artifactId>spark-core</artifactId>
      <version>1.1</version>
    </dependency>
  </dependencies>
</project>

EDIT: Trace

λ chaos myapp → java -cp target/myapp-1.0-SNAPSHOT.jar com.bernsteinbear.myapp.App
Exception in thread "main" java.lang.NoClassDefFoundError: spark/Route
Caused by: java.lang.ClassNotFoundException: spark.Route
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

aaaand the source (the example from the homepage):

λ chaos myapp → cat src/main/java/com/bernsteinbear/myapp/App.java
/**
 * Hello world!
 *
 */

package com.bernsteinbear.myapp;
import spark.*;
import static spark.Spark.*;

public class App {

    public static void main(String[] args) {

        get(new Route("/hello") {
            @Override
            public Object handle(Request request, Response response) {
                return "Hello World!";
            }
        });

    }

}
asked Sep 27 '13 by tekknolagi

3 Answers

What works for me to run it:

mvn package
mvn exec:java -Dexec.mainClass="com.your.class.with.main.method"
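To avoid passing the main class on the command line every time, the exec plugin can also be configured in pom.xml. This is just a sketch: the plugin version shown and the main class (taken from the question) are assumptions, not part of the original answer.

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
      <version>1.2.1</version>
      <configuration>
        <!-- Main class from the question's App.java -->
        <mainClass>com.bernsteinbear.myapp.App</mainClass>
      </configuration>
    </plugin>
  </plugins>
</build>
```

With this in place, a bare mvn exec:java should be enough, since exec:java runs the main class with the project's compile-time dependencies on the classpath.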
answered Sep 22 '22 by Bijan


I was facing the same problem when trying to deploy the application to Heroku. I added the following to my pom.xml. This plugin copies the project's Maven dependencies into target/dependency during the package phase.

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-dependency-plugin</artifactId>
            <version>2.4</version>
            <executions>
                <execution>
                    <id>copy-dependencies</id>
                    <phase>package</phase>
                    <goals><goal>copy-dependencies</goal></goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

After this you can run

java -cp target/classes:"target/dependency/*" com.bernsteinbear.myapp.App

to start your application.
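An alternative approach (not from this answer, just a common option) is to bundle everything into a single runnable "fat" jar with the Maven Shade plugin, so no separate dependency folder is needed at runtime. A sketch, with the plugin version and main class as assumptions:

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
          <configuration>
            <transformers>
              <!-- Sets Main-Class in the jar manifest so java -jar works -->
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>com.bernsteinbear.myapp.App</mainClass>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

After mvn package you would then run java -jar target/myapp-1.0-SNAPSHOT.jar directly.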

answered Sep 22 '22 by Shekhar


Ok, so mvn package itself did not throw the exception; it happened at execution. The jar produced by Maven must not contain everything required to run the app. (You can unzip the jar if you're curious exactly what it contains.) So now it's a matter of either bundling the Maven dependencies into your packaged jar (I wouldn't necessarily recommend bothering with that yet) or simply including the additional jars on your runtime classpath (which on Unix looks like -cp a.jar:b.jar:...). I suspect the spark-dependencies module has all the missing dependencies. (Unfortunately the readme is not very clear on this.)

Assuming the spark-dependencies module is sufficient, you'd just do:

java -cp target/myapp-1.0-SNAPSHOT.jar:lib/jetty-webapp-7.3.0.v20110203.jar:lib/log4j-1.2.14.jar:lib/slf4j-api-1.6.1.jar:lib/servlet-api-3.0.pre4.jar:lib/slf4j-log4j12-1.6.1.jar com.bernsteinbear.myapp.App

Note you have to get the paths right. This is assuming the spark-dependencies zip file is unzipped to a lib folder.
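Since Java 6, the launcher also accepts a classpath wildcard, which saves listing every jar by hand. A sketch assuming the same lib folder layout (quote the argument so your shell doesn't expand the * itself):

```shell
java -cp "target/myapp-1.0-SNAPSHOT.jar:lib/*" com.bernsteinbear.myapp.App
```

The lib/* entry expands to every .jar file in lib at JVM startup, so newly added dependencies are picked up without editing the command.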

If that still doesn't do it, or for additional information or to give feedback, you might also ping the author directly.

answered Sep 23 '22 by Will