Unit testing Java Spark microframework

I am trying to add a RESTful API to a Java microservice. For this, I am using Spark:

http://sparkjava.com/documentation.html

I've created a very simple class which stands up an API. That class is here:

import static spark.Spark.after;
import static spark.Spark.exception;
import static spark.Spark.get;

public class Routes {
    public void establishRoutes() {
        // GET /test returns a plain "Hello World" body
        get("/test", (req, res) -> "Hello World");

        // Set the content type on every response
        after((req, res) -> {
            res.type("application/json");
        });

        // Map IllegalArgumentException to a 400 Bad Request
        exception(IllegalArgumentException.class, (e, req, res) -> {
            res.status(400);
        });
    }
}

Now, running Routes.establishRoutes() should stand up an API which shows "Hello World" when someone visits http://localhost:4567/test. This does actually work. Hurray!
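
For reference, I stand the service up from a plain main method along these lines (the Application class name is just illustrative, not something from the Spark docs):

public class Application {
    public static void main(String[] args) {
        // Declaring the first route lazily starts Spark's embedded Jetty
        // server on the default port 4567.
        new Routes().establishRoutes();
    }
}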

The next step is unit testing the code. My unit test, unfortunately, does not succeed. The Spark documentation does not detail a sound way to do testing, so what I have is pieced together from examples I found around the net. Here is my JUnit test:

import java.util.Map;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import static org.junit.Assert.assertEquals;
import static spark.Spark.stop;

public class TestRoutes {
    @Before
    public void setUp() throws Exception {
        Routes newRoutes = new Routes();
        newRoutes.establishRoutes();
    }

    @After
    public void tearDown() throws Exception {
        stop();
    }

    @Test
    public void testModelObjectsPOST() {
        String testUrl = "/test";

        ApiTestUtils.TestResponse res = ApiTestUtils.request("GET", testUrl, null);
        Map<String, String> json = res.json();
        assertEquals(201, res.status);
    }
}

Here is the code behind ApiTestUtils.request():

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.HashMap;
import java.util.Map;

import com.google.gson.Gson;
import org.apache.commons.io.IOUtils;

import static org.junit.Assert.fail;

public class ApiTestUtils {
    public static TestResponse request(String method, String path, String requestBody) {


        try {
            URL url = new URL("http://localhost:4567" + path);
            HttpURLConnection connection = (HttpURLConnection) url.openConnection();
            connection.setRequestMethod(method);
            connection.setDoOutput(true);
            connection.connect();
            String body = IOUtils.toString(connection.getInputStream());
            return new TestResponse(connection.getResponseCode(), body);
        } catch (IOException e) {
            e.printStackTrace();
            fail("Sending request failed: " + e.getMessage());
            return null;
        }
    }

    public static class TestResponse {

        public final String body;
        public final int status;

        public TestResponse(int status, String body) {
            this.status = status;
            this.body = body;
        }

        public Map<String,String> json() {
            return new Gson().fromJson(body, HashMap.class);
        }
    }
}
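
A side note on this helper: HttpURLConnection.getInputStream() throws an IOException when the server replies with a 4xx or 5xx status, so as written it can only read successful responses. If I later want to assert on the 400 set by my IllegalArgumentException handler, I think the body read would need to fall back to the error stream, roughly like this (plus an import of java.io.InputStream):

            // sketch: tolerate HTTP error statuses instead of throwing on getInputStream()
            int status = connection.getResponseCode();
            InputStream stream = status >= 400
                    ? connection.getErrorStream()   // non-null for HTTP error responses
                    : connection.getInputStream();
            String body = stream != null ? IOUtils.toString(stream) : "";
            return new TestResponse(status, body);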

I am failing on connection.connect() inside ApiTestUtils.request(). Specifically, I get the error: java.lang.AssertionError: Sending request failed: Connection refused

I believe this is happening because the application isn't listening when my test tries to make the request. However, I don't understand why that would be the case. I borrowed the test code from the demo project found here: https://github.com/mscharhag/blog-examples/blob/master/sparkdemo/src/test/java/com/mscharhag/sparkdemo/UserControllerIntegrationTest.java

UPDATE: I tried running the example linked above. It turns out it doesn't work either. It looks like spinning up a Spark instance in this context is more difficult than I thought. I'm now trying to figure out how to do it.

asked Oct 14 '16 by melchoir55



1 Answer

Your test case is missing the code that waits for the embedded server to initialize. I tried your code and stumbled on the same issue you did, but after debugging it I noticed that the embedded Spark server is initialized in a newly created thread (see the method spark.Service#init()). All you need to do in your test is wait for the initialization by calling spark.Spark#awaitInitialization():

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import static junit.framework.TestCase.assertEquals;
import static spark.Spark.awaitInitialization;
import static spark.Spark.stop;

public class TestRoutes {
    @Before
    public void setUp() throws Exception {
        Routes newRoutes = new Routes();
        newRoutes.establishRoutes();

        // Blocks until the embedded Jetty server is up and listening,
        // preventing the "Connection refused" error from the test request.
        awaitInitialization();

    }

    @After
    public void tearDown() throws Exception {
        stop();
    }

    @Test
    public void testModelObjectsPOST() {

        String testUrl = "/test";


        ApiTestUtils.TestResponse res = ApiTestUtils.request("GET", testUrl, null);
        assertEquals(200, res.status);
    }

}
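
One more thing worth mentioning: stop() returns before the server has fully shut down, so if several test classes in the same JVM start and stop Spark, the next setUp() can race against the previous teardown. Depending on your Spark version there is also an awaitStop() counterpart (added in the later 2.x releases, so it may not exist in older versions) that you can call right after stop():

import static spark.Spark.awaitStop;
import static spark.Spark.stop;

// ...

    @After
    public void tearDown() throws Exception {
        stop();
        // Blocks until the embedded server has actually shut down, so the next
        // test run can bind to port 4567 again.
        awaitStop();
    }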
answered Oct 17 '22 by marius_neo