I know I can set the DbContext's CommandTimeout for all queries with something like this:
public class YourContext : DbContext
{
    public YourContext() : base("YourConnectionString")
    {
        // Get the ObjectContext related to this DbContext
        var objectContext = (this as IObjectContextAdapter).ObjectContext;
        // Set the command timeout for all commands
        // to 2 min instead of the default 30 sec
        objectContext.CommandTimeout = 120;
    }
}
However, I want to keep the default 30 sec, except for one single method that takes a bit longer.
How should I change this for this single query?
I did try to use:
public void doSomething()
{
    // The using had another reason, but in this case it also
    // automatically disposes of the DbContext
    using (IMyDbContext usingDb = delegateDbContext())
    {
        ((IObjectContextAdapter)usingDb).ObjectContext.CommandTimeout = 120;
        ... // myQuery
    }
}
Everything works perfectly, until I run my unit test with a mock DbContext (and yes, I did set my delegate to this mock DbContext). It gives me an InvalidCastException:

System.InvalidCastException: Unable to cast object of type
'Castle.Proxies.FakeMyDbContextProxy' to type
'System.Data.Entity.Infrastructure.IObjectContextAdapter'.
Popular Answer

You can use DbContext.Database.CommandTimeout = 180;. It's pretty simple and no cast is required.
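In EF 6 this property lives on DbContext.Database, so the one long-running query can get its own timeout without touching the constructor. A minimal sketch, assuming EF 6; MyContext and the Orders query are hypothetical placeholders:

```csharp
// Sketch: Database.CommandTimeout (an int?) applies only to commands issued
// through this particular context instance.
using (var db = new MyContext())
{
    db.Database.CommandTimeout = 120; // this instance only; other contexts keep the 30 sec default
    var bigOrders = db.Orders.Where(o => o.Total > 1000).ToList(); // hypothetical slow query
}
```

Because the override dies with the context instance, there is nothing to restore afterwards.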
Set database timeout in Entity Framework

Try this on your context:

public class MyDatabase : DbContext
{
    public MyDatabase() : base(ContextHelper.CreateConnection("Connection string"), true)
    {
        ((IObjectContextAdapter)this).ObjectContext.CommandTimeout = 180;
    }
}

The default value is 30 seconds.
That's because you're relying on an implementation detail (the fact that your IMyDbContext also implements IObjectContextAdapter) that you're not supposed to know about. In your unit test, the IMyDbContext instance is actually a proxy generated by the mocking framework, and it doesn't implement IObjectContextAdapter.
Since the CommandTimeout wouldn't make sense for this fake DbContext, I suggest you cast and set the CommandTimeout only if the cast succeeds:
var objectContextAdapter = usingDb as IObjectContextAdapter;
if (objectContextAdapter != null)
    objectContextAdapter.ObjectContext.CommandTimeout = 120;
This way, the CommandTimeout will be set in the real execution environment, but not in the unit test (which doesn't matter, since the mock doesn't actually query the DB).
EDIT: actually, a better and cleaner option would be to modify IMyDbContext to expose a way to set the CommandTimeout:
interface IMyDbContext
{
    ...
    int CommandTimeout { get; set; }
}

class MyDbContext : IMyDbContext
{
    ...
    public int CommandTimeout
    {
        get { return ((IObjectContextAdapter)this).ObjectContext.CommandTimeout; }
        set { ((IObjectContextAdapter)this).ObjectContext.CommandTimeout = value; }
    }
}
And now you can just do:
usingDb.CommandTimeout = 120;
without worrying about the actual type of the context. The mocking framework would just generate a dummy implementation for this property.
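For instance, with Moq (an assumption; the question doesn't say which mocking framework generated the Castle proxy), the property needs no special wiring at all:

```csharp
// Sketch, assuming Moq; IMyDbContext is the interface from the answer above.
var mock = new Mock<IMyDbContext>();
mock.SetupProperty(c => c.CommandTimeout); // let the proxy store get/set values
IMyDbContext db = mock.Object;
db.CommandTimeout = 120; // no InvalidCastException, no real ObjectContext involved
```

The test never touches IObjectContextAdapter, which is exactly the point: the timeout becomes part of the abstraction instead of an implementation detail.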
And to touch on the original question of setting the timeout for a single statement: the straightforward approach is sometimes the best. Assuming you've exposed the CommandTimeout as suggested above (great idea), then in your function:
var originalTimeout = _dbContext.CommandTimeout;
_dbContext.CommandTimeout = 120;
try
{
    // do something relevant
}
finally
{
    _dbContext.CommandTimeout = originalTimeout; // restore even if the query throws
}