In our business logic we have to handle positive and negative Double.Infinity values as well as Double.NaN values.
We have to store these values in a Microsoft SQL Server 2008 database. The problem is that Microsoft SQL Server doesn't support infinity or NaN values. (The linked problem description is for SQL Server 2005, but it also applies to SQL Server 2008.)
We are using Hibernate 3.6.7 as our JPA implementation and Microsoft's sqljdbc4 driver version 4.0.2206.100.
We have tried to solve this issue by setting the JPA entity's column definition to VARCHAR, like this:
@Column(columnDefinition = "VARCHAR(40)")
private Double foo;
This does not seem to have any effect, even though the column definition is properly changed to VARCHAR in the database. It seems that the infinity value is caught by the JDBC driver's validation, since we get the following stack trace when trying to INSERT a Double.Infinity value:
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 6 (""): The supplied value is not a valid instance of data type float. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:216)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1515)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:404)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:350)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:5696)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1715)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:180)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:155)
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeUpdate(SQLServerPreparedStatement.java:314)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
at org.hibernate.id.IdentityGenerator$GetGeneratedKeysDelegate.executeAndExtract(IdentityGenerator.java:94)
at org.hibernate.id.insert.AbstractReturningDelegate.performInsert(AbstractReturningDelegate.java:57)
... 79 more
Any ideas on how to work around this issue are welcome.
Here's a simple workaround for this issue: use String as the field type and do the conversion in the getters and setters. This way you don't have to change your business logic, and the actual conversion logic is nicely encapsulated.
// Column definition is not a necessity here
@Column(columnDefinition = "VARCHAR(40)")
private String foo;

public Double getFoo() {
    return this.foo != null ? Double.valueOf(this.foo) : null;
}

public void setFoo(Double d) {
    this.foo = d != null ? d.toString() : null;
}
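This workaround is safe because Double.toString and Double.parseDouble round-trip the special values: positive infinity is written as "Infinity", negative infinity as "-Infinity", and NaN as "NaN", and all three literals parse back to the original value. A minimal standalone sketch (the class name is just for illustration):

```java
// Sketch verifying that Double.toString / Double.parseDouble round-trip
// the special floating-point values, which is what the String-backed
// getter/setter workaround relies on.
public class SpecialDoubleRoundTrip {
    public static void main(String[] args) {
        double[] specials = {
            Double.POSITIVE_INFINITY,   // stored as "Infinity"
            Double.NEGATIVE_INFINITY,   // stored as "-Infinity"
            Double.NaN                  // stored as "NaN"
        };
        for (double d : specials) {
            String stored = Double.toString(d);
            double restored = Double.parseDouble(stored);
            // NaN != NaN, so compare bit patterns instead of values
            if (Double.doubleToLongBits(restored) != Double.doubleToLongBits(d)) {
                throw new AssertionError("Round-trip failed for " + stored);
            }
            System.out.println(stored + " round-trips OK");
        }
    }
}
```

Ordinary finite doubles round-trip the same way, so the conversion is lossless for every value the field can hold.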