How would you write this query in Slick?
DB.withSession { implicit session =>
  Tokens.where(_.expirationTime < DateTime.now).delete
}
DateTime.now is of type org.joda.time.DateTime, and _.expirationTime is a type-mapped column of the same type.
With the query in this form, I get this error:
[error] UserService.scala:80: value < is not a member of scala.slick.lifted.Column[org.joda.time.DateTime]
[error] Tokens.where(_.expirationTime < DateTime.now ).delete
[error]                                ^
[error] one error found
My guess is that the Joda-Time types are not supported out of the box by Slick. If you changed that column to a java.sql.Timestamp and used a Timestamp as your comparison value, things would work. If you want to use Joda types in your Slick column mappings, you might want to look into this:
https://github.com/tototoshi/slick-joda-mapper
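If you prefer not to pull in the library, here is a minimal sketch of what it does under the hood: a hand-written implicit mapper between org.joda.time.DateTime and java.sql.Timestamp, assuming Slick 2.x (the driver import and object name are illustrative):

```scala
import java.sql.Timestamp
import org.joda.time.DateTime
import scala.slick.driver.MySQLDriver.simple._

object JodaMapping {
  // Tells Slick how to store a DateTime as a Timestamp and read it back.
  implicit val jodaDateTimeMapper =
    MappedColumnType.base[DateTime, Timestamp](
      dt => new Timestamp(dt.getMillis), // DateTime -> Timestamp when writing
      ts => new DateTime(ts.getTime)     // Timestamp -> DateTime when reading
    )
}
```

With this implicit in scope, column[DateTime] columns and comparisons such as _.expirationTime < DateTime.now should compile, since Slick now knows how to lift DateTime to a column type.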
Importing com.github.tototoshi.slick.JodaSupport._
fixed the problem for me.
What I would do:
Let's assume that your Token class looks like this and that you use MySQL:
import org.joda.time.DateTime
import com.github.tototoshi.slick.JdbcJodaSupport._
import play.api.db.slick.Config.driver.simple._
case class Token(
  id: Option[Long],
  expirationTime: Option[DateTime]
)
class Tokens(tag: Tag) extends Table[Token](tag, "Token") {
  def id = column[Long]("id", O.PrimaryKey, O.AutoInc, O.Nullable)
  def expirationTime = column[DateTime]("expirationTime", O.Nullable)
  def * = (id.?, expirationTime.?) <> (Token.tupled, Token.unapply)
}
E.g. dependencies (in build.sbt):
libraryDependencies ++= Seq(
  "com.typesafe.slick" %% "slick" % "2.1.0-M2",
  "com.typesafe.play" %% "play-slick" % "0.8.0-M1",
  "com.github.tototoshi" %% "slick-joda-mapper" % "1.2.0",
  "mysql" % "mysql-connector-java" % "5.1.26"
)
Then all you need to do is:
import com.github.tototoshi.slick.JdbcJodaSupport._
DB.withSession { implicit session =>
  Tokens.where(_.expirationTime < DateTime.now).delete
}
If you use a database other than MySQL, you need to adjust your imports and dependencies accordingly.
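For example, a hypothetical PostgreSQL variant would swap the JDBC connector dependency (the version shown is illustrative, not prescribed by the answer):

```scala
libraryDependencies ++= Seq(
  "com.typesafe.slick" %% "slick" % "2.1.0-M2",
  "com.typesafe.play" %% "play-slick" % "0.8.0-M1",
  "com.github.tototoshi" %% "slick-joda-mapper" % "1.2.0",
  "org.postgresql" % "postgresql" % "9.3-1102-jdbc41" // instead of mysql-connector-java
)
```

You would also import the matching support object from slick-joda-mapper (it ships one per driver) rather than the MySQL one.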