Connection pooling in Spark Java framework

The Java Spark framework uses embedded Jetty as its web server. Jetty supports connection pooling with tools such as HikariCP and provides configuration options in XML files. However, according to these posts, Spark does not allow configuring Jetty. There are plenty of examples using Spark, but they either do not use a database at all or use DriverManager to connect to it.

Is it possible to configure connection pooling via a DataSource and JNDI in Spark? If so, how?

asked Mar 23 '17 by Jan Bodnar

People also ask

What is Java connection pooling?

Connection pooling is a technique of creating and managing a pool of connections that are ready for use by any thread that needs them. Connection pooling can greatly increase the performance of your Java application, while reducing overall resource usage.
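To make the borrow-and-return cycle concrete, here is a toy sketch of the pooling idea using placeholder objects instead of real connections. The `ToyPool` class and its `String` "connections" are hypothetical illustrations, not part of any library; real pools such as HikariCP additionally validate connections, enforce timeouts, and manage a `java.sql.Connection` lifecycle behind the standard `javax.sql.DataSource` interface.

```java
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Toy sketch of the pooling idea: a fixed set of pre-created resources
// that callers borrow and return instead of creating and destroying
// one per use.
class ToyPool<T> {
    private final BlockingQueue<T> idle;

    ToyPool(List<T> resources) {
        // Resources are created once, up front; the queue holds the idle ones.
        this.idle = new ArrayBlockingQueue<>(resources.size(), false, resources);
    }

    // Borrow an idle resource, blocking until one is returned.
    T borrow() {
        try {
            return idle.take();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new IllegalStateException("Interrupted while waiting for a resource", e);
        }
    }

    // Hand the resource back so another caller can reuse it.
    void release(T resource) {
        idle.offer(resource);
    }
}
```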

Which is the best connection pooling in Java?

Although there are many frameworks to choose from, such as C3P0, Apache DBCP, BoneCP, and Vibur, the most popular choices are Tomcat JDBC and HikariCP.

What is the advantage of connection pooling in Java?

Using connection pools helps to both alleviate connection management overhead and decrease development tasks for data access. Each time an application attempts to access a backend store (such as a database), it requires resources to create, maintain, and release a connection to that datastore.


1 Answer

I configured pooling in Spark Java with HikariCP for MariaDB. I did not use Jetty and used Apache Tomcat instead. Here are some code snippets:

src/main/resources/mysql-connection.properties

dataSourceClassName=com.mysql.jdbc.jdbc2.optional.MysqlDataSource
dataSource.url=<url>
dataSource.user=<user>
dataSource.password=<password>
dataSource.cachePrepStmts=true
dataSource.prepStmtCacheSize=100
dataSource.prepStmtCacheSqlLimit=2048
dataSource.useServerPrepStmts=true
maximumPoolSize=10

com/example/app/DataSourceFactory.java

import java.io.IOException;
import java.util.Properties;

import javax.sql.DataSource;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public final class DataSourceFactory {

    static final Logger LOG = LoggerFactory.getLogger(DataSourceFactory.class);

    // volatile is required for the double-checked locking below to be safe
    private static volatile DataSource mySQLDataSource;

    private DataSourceFactory() {
    }

    //returns javax.sql.DataSource
    public static DataSource getMySQLDataSource() {
        if (mySQLDataSource == null) {
            synchronized (DataSourceFactory.class) {
                if (mySQLDataSource == null) {
                    mySQLDataSource = getDataSource("mysql-connection.properties");
                }
            }
        }
        return mySQLDataSource;
    }

    // method to create the DataSource based on configuration
    private static DataSource getDataSource(String configurationProperties) {
        Properties conf = new Properties();
        try {
            conf.load(DataSourceFactory.class.getClassLoader().getResourceAsStream(configurationProperties));
        } catch (IOException e) {
            LOG.error("Can't locate database configuration", e);
        }
        HikariConfig config = new HikariConfig(conf);
        HikariDataSource dataSource = new HikariDataSource(config);
        LOG.info("DataSource[" + configurationProperties + "] created " + dataSource);
        return dataSource;
    }

}

WebContent/WEB-INF/web.xml

<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns="http://xmlns.jcp.org/xml/ns/javaee" 
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
    xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee                              
    http://xmlns.jcp.org/xml/ns/javaee/web-app_3_1.xsd" version="3.1">

    <display-name>My Spark App</display-name>
    <filter>
        <filter-name>SparkFilter</filter-name>
        <filter-class>spark.servlet.SparkFilter</filter-class>
        <init-param>
            <param-name>applicationClass</param-name>
            <param-value>com.example.app.MySparkApp</param-value>
            <!-- MySparkApp implements spark.servlet.SparkApplication -->
        </init-param>
    </filter>
    <filter-mapping>
        <filter-name>SparkFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
    <listener>
        <listener-class>com.example.app.AppServletContextListener</listener-class>
    </listener>
</web-app>
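For completeness, the `applicationClass` named in the `init-param` above must implement `spark.servlet.SparkApplication` and register its routes in `init()`. A minimal sketch follows; the `/ping` route is a hypothetical example, not part of the original answer:

```java
package com.example.app;

import spark.Spark;
import spark.servlet.SparkApplication;

// Entry point named in web.xml; the SparkFilter instantiates this class
// and calls init() so the routes get registered.
public class MySparkApp implements SparkApplication {

    @Override
    public void init() {
        // Hypothetical route, for illustration only.
        Spark.get("/ping", (request, response) -> "pong");
    }
}
```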

com/example/app/AppServletContextListener.java

import java.sql.SQLException;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.zaxxer.hikari.HikariDataSource;

public class AppServletContextListener implements ServletContextListener {

    static final Logger LOG = LoggerFactory.getLogger(AppServletContextListener.class);

    @Override
    public void contextInitialized(ServletContextEvent arg0) {
        LOG.info("contextInitialized...");
    }

    @Override
    public void contextDestroyed(ServletContextEvent arg0) {
        LOG.info("contextDestroyed...");
        try {
            if (DataSourceFactory.getMySQLDataSource() != null) {
                DataSourceFactory.getMySQLDataSource().unwrap(HikariDataSource.class).close();
            }

        } catch (SQLException e) {
            LOG.error("Problem closing HikariCP pool", e);
        }

    }

}

And finally, you can obtain a pooled java.sql.Connection by calling DataSourceFactory.getMySQLDataSource().getConnection().
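In practice you would borrow the connection with try-with-resources, so that `close()` returns it to the pool instead of terminating it. This fragment is a sketch only; the `users` table and query are hypothetical:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import javax.sql.DataSource;

// Inside a Spark route handler (or any other code path): borrow, use, auto-return.
DataSource ds = DataSourceFactory.getMySQLDataSource();
try (Connection con = ds.getConnection();            // borrowed from the pool
     PreparedStatement ps = con.prepareStatement(
             "SELECT name FROM users WHERE id = ?")) { // hypothetical table
    ps.setLong(1, 42L);
    try (ResultSet rs = ps.executeQuery()) {
        if (rs.next()) {
            String name = rs.getString("name");
            // ... use the result
        }
    }
} // close() here hands the connection back to the pool
```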

answered Sep 22 '22 by user1357768