What kind of paging support is offered through ADO.NET EF and LINQ?
What does a "first 10" Select look like?
A "next 10" Select?
As the others have explained here, Take() and Skip() are what you need.
They will chop the result set to get you the page you want.
You have to maintain the PageIndex and PageSize information somehow so that you can pass them in when running your query. If your data access goes through a web service, for instance, you would pass the index/size in alongside your filtering criteria, and keep those values in your client (the application, or the page if it is a website).
There is no "stateful iterator for paging" out of the box, if that is what you are looking for...
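For example, a data-access method that carries the paging values alongside the filter might look like the sketch below. This is only an illustration, assuming an EF model with a Customer entity (CustomerID, Region properties) and a MyDbContext with a Customers set; none of those names come from the question.

using System.Collections.Generic;
using System.Linq;

public IList<Customer> GetCustomerPage(string region, int pageIndex, int pageSize)
{
    using (var context = new MyDbContext())
    {
        return context.Customers
                      .Where(c => c.Region == region)
                      .OrderBy(c => c.CustomerID)   // LINQ to Entities needs an explicit ordering before Skip
                      .Skip(pageIndex * pageSize)   // zero-based page index
                      .Take(pageSize)
                      .ToList();
    }
}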
Moreover, if you are implementing a "standard paging" construct, you will need the total record count before limiting your query. You can get it like this, assuming your function receives PageSize and PageIndex as parameters somehow:
var query = ...your normal query here...
int totalRecordCount = query.Count();
var pagedQuery = query.Skip(PageIndex*PageSize).Take(PageSize);
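In practice the count usually feeds a page count as well, and with LINQ to Entities the query must have an explicit OrderBy before Skip or it will fail at run time. A small sketch, where the Customers set and the Region filter are assumptions standing in for "your normal query":

var query = context.Customers
                   .Where(c => c.Region == "Kol")
                   .OrderBy(c => c.CustomerID);          // required before Skip in LINQ to Entities

int totalRecordCount = query.Count();                    // one COUNT query, for rendering the pager
int totalPageCount = (int)Math.Ceiling(totalRecordCount / (double)PageSize);

var pagedQuery = query.Skip(PageIndex * PageSize)
                      .Take(PageSize);                   // one page of rows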
The Take operator decides how many records are fetched. A simple example of Take is provided below.
List<Customer> customers = GetCustomerList();
var first4Customers = ( from c in customers select new {c.CustomerID, c.CustomerName} ).Take(4);
Here we take the first 4 customers from the list provided.
We can also use a where clause to narrow down the list first and then take 4 of them:
var first4KolCustomers = ( from c in customers where c.Region == "Kol" select new {c.CustomerID, c.CustomerName} ).Take(4);
But what if we want the 4th through 7th records? In that case we use the Skip operator to skip the records (from the top) that we don't want. Here is an example using Skip:
var fourthToSeventhCustomers = ( from c in customers where c.Region == "Kol" select new {c.CustomerID, c.CustomerName} ).Skip(3).Take(4);
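If you find yourself repeating the Skip/Take pair, you can wrap it once. The GetPage name below is my own invention, not a LINQ method; it simply composes Skip and Take over an already-ordered query.

using System.Linq;

public static class PagingExtensions
{
    // Returns the zero-based page 'pageIndex' of 'pageSize' rows from an ordered query.
    public static IQueryable<T> GetPage<T>(this IQueryable<T> source, int pageIndex, int pageSize)
    {
        return source.Skip(pageIndex * pageSize).Take(pageSize);
    }
}

// Usage: the second page of 10 customers, ordered so the paging is deterministic.
var page2 = customers.AsQueryable()
                     .OrderBy(c => c.CustomerID)
                     .GetPage(1, 10);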