Multiple Spark servers in a single JVM

Is there any way to run multiple instances of a Sparkjava server in the same JVM? I am using it in "plugin" software, and depending on external circumstances multiple instances of my plugin might be started up, which then causes:

java.lang.IllegalStateException: This must be done before route mapping has begun
at spark.SparkBase.throwBeforeRouteMappingException(SparkBase.java:256)
at spark.SparkBase.port(SparkBase.java:101)
at com.foo.bar.a(SourceFile:59)

Looking at the code, it seems to be built heavily around static fields, so I am thinking about a classloader trick, or working with SparkServerFactory while somehow eliminating SparkBase.

asked Jan 03 '17 by jabal

3 Answers

From Spark 2.5 you can use ignite():

http://sparkjava.com/news.html#spark25released

Example:

import spark.Service;

import static spark.Service.ignite;

public static void main(String[] args) {
    igniteFirstSpark();
    igniteSecondSpark();
}

static void igniteSecondSpark() {
    // No port configured, so this instance listens on Spark's default port 4567.
    Service http = ignite();

    http.get("/basicHello", (q, a) -> "Hello from port 4567!");
}

static void igniteFirstSpark() {
    Service http = ignite()
                      .port(8080)
                      .threadPool(20);

    http.get("/configuredHello", (q, a) -> "Hello from port 8080!");
}

I personally initialize them something like this:

import spark.Service;

public static void main(String[] args) {
    Service service1 = Service.ignite().port(8080).threadPool(20);
    Service service2 = Service.ignite().port(8081).threadPool(10);
}

I recommend reading about how to use those services outside your main method, which I think would be a great fit here; see the sketch below.
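
For example, a minimal sketch of keeping each Service in its own object so that every plugin instance owns an independent embedded server (the PluginServer class name and its port parameter are hypothetical, just for illustration):

import spark.Service;

public class PluginServer {

    private final Service http;

    public PluginServer(int port) {
        // Each Service.ignite() call creates a completely independent Spark instance.
        http = Service.ignite().port(port).threadPool(10);
        http.get("/hello", (req, res) -> "Hello from port " + port);
    }

    public void stop() {
        // Stops only this instance's embedded Jetty server.
        http.stop();
    }
}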

answered by lepe

The trick is to ignore the external static shell around Spark implemented in spark.Spark and to work directly with the internal spark.webserver.SparkServer. There are some obstacles in the code that require workarounds, e.g. spark.webserver.JettyHandler is not public, so you can't instantiate it from your code, but you can extend it with your own class placed into that package and make it public.
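
A minimal sketch of such a subclass (assuming JettyHandler exposes a constructor taking the servlet Filter, which is how the MatcherFilter is passed in below) could look like this:

package spark.webserver;

import javax.servlet.Filter;

// Exists only to widen JettyHandler's visibility; it must live in the spark.webserver package.
public class PublicJettyHandler extends JettyHandler {

    public PublicJettyHandler(Filter filter) {
        super(filter);
    }
}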

So the solution is along these lines:

SimpleRouteMatcher routeMatcher1 = new SimpleRouteMatcher();
routeMatcher1.parseValidateAddRoute("get '/foo'", "*/*", wrap("/foo", "*/*", (req, res) -> "Hello World 1"));

MatcherFilter matcherFilter1 = new MatcherFilter(routeMatcher1, false, false);
matcherFilter1.init(null);
PublicJettyHandler handler1 = new PublicJettyHandler(matcherFilter1);
SparkServer server1 = new SparkServer(handler1);

new Thread(() -> {
    server1.ignite("0.0.0.0", 4567, null, null, null, null, "/META-INF/resources/", null,
            new CountDownLatch(1), -1, -1, -1);
}).start();

You also need to duplicate the wrap method in your codebase:

protected RouteImpl wrap(final String path, String acceptType, final Route route) {
    if (acceptType == null) {
        acceptType = "*/*";
    }
    RouteImpl impl = new RouteImpl(path, acceptType) {
        @Override
        public Object handle(Request request, Response response) throws Exception {
            return route.handle(request, response);
        }
    };
    return impl;
}

This seems to be a viable workaround if you need multiple Spark servers in your app.

answered by jabal

I had this problem when running unit tests with Spark. To fix it, I modified the pom.xml so that Surefire forks a fresh JVM for each test class and does not reuse forks (forkCount=1, reuseForks=false):

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>${surefire.version}</version>
            <dependencies>
                <dependency>
                    <groupId>org.junit.platform</groupId>
                    <artifactId>junit-platform-surefire-provider</artifactId>
                    <version>${junit.platform.version}</version>
                </dependency>
            </dependencies>
            <configuration>
                <forkCount>1</forkCount>
                <reuseForks>false</reuseForks>
            </configuration>
        </plugin>

answered by Agu-GC