I'm using Slick with Play Framework 2.1 and I'm having some trouble.
Given the following entity...
package models
import scala.slick.driver.PostgresDriver.simple._
case class Account(id: Option[Long], email: String, password: String)
object Accounts extends Table[Account]("account") {
  def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
  def email = column[String]("email")
  def password = column[String]("password")
  def * = id.? ~ email ~ password <> (Account, Account.unapply _)
}
...I have to import a package for a specific database driver, but I want to use H2 for testing and PostgreSQL in production. How should I proceed?
I was able to workaround this by overriding the driver settings in my unit test:
package test
import org.specs2.mutable._
import play.api.test._
import play.api.test.Helpers._
import scala.slick.driver.H2Driver.simple._
import Database.threadLocalSession
import models.{Accounts, Account}
class AccountSpec extends Specification {
  "An Account" should {
    "be creatable" in {
      Database.forURL("jdbc:h2:mem:test1", driver = "org.h2.Driver") withSession {
        Accounts.ddl.create
        Accounts.insert(Account(None, "[email protected]", "Password"))
        val account = for (account <- Accounts) yield account
        account.first.id.get mustEqual 1
      }
    }
  }
}
I don't like this solution, and I'm wondering whether there is an elegant way to write DB-agnostic code so that two different database engines can be used: one for testing and one for production.
I don't want to use evolutions, either, and prefer to let Slick create the database tables for me:
import play.api.Application
import play.api.GlobalSettings
import play.api.Play.current
import play.api.db.DB
import scala.slick.driver.PostgresDriver.simple._
import Database.threadLocalSession
import models.Accounts
object Global extends GlobalSettings {
  override def onStart(app: Application) {
    lazy val database = Database.forDataSource(DB.getDataSource())
    database withSession {
      Accounts.ddl.create
    }
  }
}
The first time I start the application, everything works fine... then, of course, the second time I start the application it crashes because the tables already exist in the PostgreSQL database.
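One way to avoid the second-startup crash is to run the DDL only when the schema is missing (in Slick 1.x the check itself could be done with a metadata query such as scala.slick.jdbc.meta.MTable.getTables). The sketch below factors the check and the creation out as injected functions so the guard logic is plain Scala; SafeDdl and its parameter names are hypothetical, not part of Slick or Play:

```scala
// Sketch: only run the DDL when the table is absent. `tableExists` stands in
// for a real metadata query (e.g. Slick's MTable.getTables) and `create`
// stands in for Accounts.ddl.create. Returns true when the DDL was run.
object SafeDdl {
  def createIfMissing(tableExists: String => Boolean,
                      create: () => Unit,
                      table: String): Boolean =
    if (tableExists(table)) {
      false // table already there: a second startup becomes a no-op
    } else {
      create()
      true
    }
}
```

In onStart this would wrap the Accounts.ddl.create call, making repeated application starts idempotent.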
That said, my last two questions are: how can I make the onStart method above DB-agnostic, so that I can test my application with FakeApplication? And how can I keep Slick from trying to recreate tables that already exist?

You can find an example of how to use the cake pattern / dependency injection to decouple the Slick driver from the database access layer here: https://github.com/slick/slick-examples.
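The cake pattern used in those examples can be sketched in miniature without any Slick types: the data-access trait sees only an abstract profile via a self-type, and the concrete driver is mixed in at the edge. All names below (Profile, H2Profile, AccountComponent, the DAL objects) are illustrative, not Slick's real API:

```scala
// Minimal, self-contained sketch of the cake pattern: the component never
// names a concrete driver; it only requires some Profile to be mixed in.
trait Profile {
  def driverName: String
}

trait H2Profile extends Profile {
  val driverName = "H2"
}

trait PostgresProfile extends Profile {
  val driverName = "PostgreSQL"
}

// Data-access code written against the abstract Profile via a self-type.
trait AccountComponent { self: Profile =>
  def describeAccounts: String = s"accounts stored via $driverName"
}

// Two "cakes": mix in the test driver for tests, the real one in production.
object TestDAL extends AccountComponent with H2Profile
object ProdDAL extends AccountComponent with PostgresProfile
```

The same component compiles once and runs against either mixed-in profile, which is exactly the property needed to swap H2 for PostgreSQL.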
A few days ago I wrote a Slick integration library for Play, which moves the driver dependency to the application.conf of the Play project: https://github.com/danieldietrich/slick-integration.
With the help of this library your example would be implemented as follows:
1) Add the dependency to project/Build.scala:
"net.danieldietrich" %% "slick-integration" % "1.0-SNAPSHOT"
Add the snapshot repository:
resolvers += "Daniel's Repository" at "http://danieldietrich.net/repository/snapshots"
Or the local repository, if slick-integration is published locally:
resolvers += Resolver.mavenLocal
2) Add the Slick driver to conf/application.conf
slick.default.driver=scala.slick.driver.H2Driver
3) Implement app/models/Account.scala
slick-integration assumes primary keys of type Long that are auto-incremented, with a pk column named 'id'. The Table/Mapper implementation provides default methods (delete, findAll, findById, insert, update). Your entities have to implement 'withId', which is needed by the 'insert' method.
package models

import scala.slick.integration._

case class Account(id: Option[Long], email: String, password: String)
    extends Entity[Account] {
  // currently needed by Mapper.create to set the auto generated id
  def withId(id: Long): Account = copy(id = Some(id))
}

// use cake pattern to 'inject' the Slick driver
trait AccountComponent extends _Component { self: Profile =>

  import profile.simple._

  object Accounts extends Mapper[Account]("account") {
    // def id is defined in Mapper
    def email = column[String]("email")
    def password = column[String]("password")
    def * = id.? ~ email ~ password <> (Account, Account.unapply _)
  }
}
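The withId contract can be seen in isolation with a toy in-memory mapper. This is a sketch only; the real Mapper in slick-integration talks to the database, and InMemoryMapper below is a hypothetical stand-in:

```scala
// Toy model of the Entity/withId contract: insert hands back a copy of the
// entity carrying an auto-incremented id, obtained by calling withId.
trait Entity[T] {
  def id: Option[Long]
  def withId(id: Long): T
}

case class Account(id: Option[Long], email: String, password: String)
    extends Entity[Account] {
  def withId(id: Long): Account = copy(id = Some(id))
}

class InMemoryMapper[T <: Entity[T]] {
  private var nextId = 0L
  def insert(e: T): T = {
    nextId += 1
    e.withId(nextId) // caller receives the entity with its generated id
  }
}
```

This mirrors why entities must implement withId: the library cannot otherwise produce a copy of an immutable case class carrying the database-generated key.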
4) Implement app/models/DAL.scala
This is the Data Access Layer (DAL), which is used by the controllers to access the database. Transactions are handled by the Table/Mapper implementation within the corresponding component.
package models

import scala.slick.integration.PlayProfile
import scala.slick.integration._DAL
import scala.slick.lifted.DDL
import play.api.Play.current

class DAL(dbName: String) extends _DAL
    with AccountComponent
    /* with FooBarBazComponent */
    with PlayProfile {

  // trait Profile implementation
  val profile = loadProfile(dbName)
  def db = dbProvider(dbName)

  // _DAL.ddl implementation
  lazy val ddl: DDL = Accounts.ddl // ++ FooBarBazs.ddl
}

object DAL extends DAL("default")
5) Implement test/test/AccountSpec.scala
package test

import models._
import models.DAL._
import org.specs2.mutable.Specification
import play.api.test.FakeApplication
import play.api.test.Helpers._
import scala.slick.session.Session

class AccountSpec extends Specification {

  def fakeApp[T](block: => T): T =
    running(FakeApplication(additionalConfiguration = inMemoryDatabase() ++
        Map("slick.default.driver" -> "scala.slick.driver.H2Driver",
            "evolutionplugin" -> "disabled"))) {
      db.withSession { implicit s: Session =>
        try {
          create
          block
        } finally {
          drop
        }
      }
    }

  "An Account" should {
    "be creatable" in fakeApp {
      val account = Accounts.insert(Account(None, "[email protected]", "Password"))
      val id = account.id
      id mustNotEqual None
      Accounts.findById(id.get) mustEqual Some(account)
    }
  }
}
I cannot give you a sufficient answer to this question...
... but perhaps this is not really something you want to do. What if you add an attribute to a table, say Account.active? If you want to keep the data currently stored in your tables, then an alter script would do the job. Currently, such an alter script has to be written by hand. DAL.ddl.createStatements could be used to retrieve the create statements. They should be sorted to be better comparable with previous versions. Then a diff with the previous version is used to manually create the alter script. Here, evolutions are used to alter the db schema.
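The diff step can be sketched as a naive comparison of two sorted statement lists. DdlDiff is a hypothetical helper, not part of slick-integration, and a real migration would also need statement ordering and data-preservation logic:

```scala
// Sketch: compare the create statements of two schema versions to see what
// an alter script must add and remove. A plain set difference, nothing more.
object DdlDiff {
  def diff(previous: Seq[String],
           current: Seq[String]): (Seq[String], Seq[String]) = {
    val added   = current.filterNot(previous.toSet) // new in this version
    val removed = previous.filterNot(current.toSet) // gone in this version
    (added, removed)
  }
}
```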
Here's an example of how to generate (the first) evolution:
object EvolutionGenerator extends App {

  import models.DAL
  import play.api.test._
  import play.api.test.Helpers._

  running(FakeApplication(additionalConfiguration = inMemoryDatabase() ++
      Map("slick.default.driver" -> "scala.slick.driver.PostgresDriver",
          "evolutionplugin" -> "disabled"))) {

    val evolution = (
      """|# --- !Ups
         |""" + DAL.ddl.createStatements.mkString("\n", ";\n\n", ";\n") +
      """|
         |# --- !Downs
         |""" + DAL.ddl.dropStatements.mkString("\n", ";\n\n", ";\n")
    ).stripMargin

    println(evolution)
  }
}
I was also trying to address this problem: the ability to switch databases between test and production. The idea of wrapping each table object in a trait was unappealing.
I am not trying to discuss the pros and cons of the cake pattern here, but I found another solution, for those who are interested.
Basically, make an object like this:
package mypackage
import scala.slick.driver.H2Driver
import scala.slick.driver.ExtendedProfile
import scala.slick.driver.PostgresDriver
object MovableDriver {
  val simple = profile.simple

  lazy val profile: ExtendedProfile = {
    sys.env.get("database") match {
      case Some("postgres") => PostgresDriver
      case _ => H2Driver
    }
  }
}
Obviously, you can put any decision logic you like here. It does not have to be based on environment variables.
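To make that decision logic unit-testable, the environment lookup can be passed in as a function instead of read directly from sys.env. This is a sketch: DriverSelector is a hypothetical name, and the returned strings stand in for the actual driver objects:

```scala
// Sketch: the same selection logic with the environment lookup injected,
// so the branching can be exercised without real environment variables.
object DriverSelector {
  def select(env: String => Option[String]): String =
    env("database") match {
      case Some("postgres") => "PostgresDriver"
      case _                => "H2Driver"
    }
}
```

Inside MovableDriver one would call it as DriverSelector.select(sys.env.get).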
Now, instead of:
import scala.slick.driver.H2Driver.simple._
You can say
import mypackage.MovableDriver.simple._
UPDATE: A Slick 3.0 Version, courtesy of trent-ahrens:
package mypackage
import com.typesafe.config.ConfigFactory
import scala.slick.driver.{H2Driver, JdbcDriver, MySQLDriver}
object AgnosticDriver {
  val simple = profile.simple

  lazy val profile: JdbcDriver = {
    sys.env.get("DB_ENVIRONMENT") match {
      case Some(e) => ConfigFactory.load().getString(s"$e.slickDriver") match {
        case "scala.slick.driver.H2Driver" => H2Driver
        case "scala.slick.driver.MySQLDriver" => MySQLDriver
      }
      case _ => H2Driver
    }
  }
}