Using Delphi Seattle. I have an application that makes various REST calls. Some of these calls might return 10-20 rows via JSON, while others might return 30-40 thousand rows. I have set my REST server up to return rows in batches of 1,000. When the data comes back to my client, I use a TRESTResponseDataSetAdapter, TDataSource, and TClientDataSet to expose the data as if it were a local table. This part appears to work fine. When we reach the end of the 1,000 rows, I change the URL and request the next batch of 1,000 rows.
My challenge: I would like to abstract this so that one routine can handle all scenarios (at least for GET calls). The tricky part is: how do I handle the DataSource/ClientDataSet 1,000-row issue? An example might help clarify. I would like to be able to execute something like this:
...
genericREST_Get(baseURL, resource, suffix); // Actually executes the REST call; the components live in data module DM1.
while not dm1.ds_Generic.DataSet.Eof do
begin
  // ... some kind of processing
  dm1.ds_Generic.DataSet.Next;
end;
How do I handle crossing the 1,000-row threshold? When my calling program (shown above) moves from row 1000 to row 1001, the client needs to request the next set of 1,000 rows from the server. While I know HOW to do that, I don't know WHERE to do it. I want the "get next 1,000 rows" logic to live in the generic routine (i.e., in genericREST_Get); I don't want each calling routine to have to deal with it.
Assume that all routines will ONLY move forward, never backwards.
Here are a few options for you to consider:
1) Just get all the data
30-40 thousand rows is not that much to hold in memory for most applications. Even if you need to make multiple REST calls to get the data, you can do that up front. If you are always going to loop over all the data, the total time will be the same whether you fetch it up front or inside the loop:
repeat
  PartialData := genericREST_Get(baseURL, resource, suffix);
  // CopyDataSet is actually a FireDAC method that I don't see on TClientDataSet.
  // Basically just .Append and copy all fields with matching names.
  FullDataMemTable.CopyDataSet(PartialData);
until PartialData.IsEmpty;
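For this loop to terminate, genericREST_Get has to advance some paging state and eventually return an empty dataset. The question doesn't show that routine, so here is a hypothetical sketch; the FOffset variable, the offset/limit query-string scheme, and the DM1 component names are all assumptions, not part of the original code:

```pascal
// Hypothetical sketch of the generic GET routine. The paging scheme
// (offset/limit) and component names are assumptions.
function genericREST_Get(const baseURL, resource, suffix: string): TDataSet;
begin
  DM1.RESTClient1.BaseURL := baseURL;
  DM1.RESTRequest1.Resource := resource + suffix +
    Format('?offset=%d&limit=1000', [FOffset]); // assumed paging parameters
  DM1.RESTRequest1.Execute;
  Inc(FOffset, 1000); // advance so the next call fetches the next batch
  // The adapter/ClientDataSet chain exposes the returned JSON rows.
  Result := DM1.ds_Generic.DataSet;
end;
```

The key point is that the paging state (FOffset here) is owned by the generic routine, so callers never see it.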
2) If you want to keep only one batch of data in memory at a time, you can wrap the DataSet in another object that duplicates some of its calls (Eof, FieldByName, Next, etc.). When Next hits EOF, you try to fetch more data. The example here is a standalone class, but you could also make these public methods on your DataModule; then instead of something like dm1.ds_Generic.DataSet.Next you would just call dm1.Next.
type
  TDataFetcher = class
  private
    FBaseUrl: string;
    FResource: string;
    FSuffix: string;
    FData: TDataSet;
    procedure GetNextData;
  public
    constructor Create(const BaseUrl, Resource, Suffix: string);
    procedure Open;
    function Eof: Boolean;
    function FieldByName(const FieldName: string): TField;
    procedure Next;
  end;

constructor TDataFetcher.Create(const BaseUrl, Resource, Suffix: string);
begin
  inherited Create;
  FBaseUrl := BaseUrl;
  FResource := Resource;
  FSuffix := Suffix;
end;

procedure TDataFetcher.Open;
begin
  FData := genericREST_Get(FBaseUrl, FResource, FSuffix);
end;

procedure TDataFetcher.GetNextData;
begin
  // Advance whatever paging state lives behind genericREST_Get (e.g. an
  // offset in FSuffix) before this call; otherwise it would re-fetch the
  // same 1,000 rows.
  FData := genericREST_Get(FBaseUrl, FResource, FSuffix);
end;

function TDataFetcher.Eof: Boolean;
begin
  Result := FData.Eof;
end;

function TDataFetcher.FieldByName(const FieldName: string): TField;
begin
  Result := FData.FieldByName(FieldName);
end;

procedure TDataFetcher.Next;
begin
  FData.Next;
  if FData.Eof then
    GetNextData;
end;
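A caller would then work against the wrapper instead of touching the ClientDataSet directly. A sketch of the usage (ProcessValue and the 'Name' field are made-up placeholders):

```pascal
var
  Fetcher: TDataFetcher;
begin
  Fetcher := TDataFetcher.Create(baseURL, resource, suffix);
  try
    Fetcher.Open;
    while not Fetcher.Eof do
    begin
      // ... some kind of processing, e.g.:
      ProcessValue(Fetcher.FieldByName('Name').AsString);
      Fetcher.Next; // transparently fetches the next 1,000 rows at EOF
    end;
  finally
    Fetcher.Free;
  end;
end;
```

Note the loop is identical in shape to the one in the question; the batch boundary is invisible to the caller.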
Other Options:
a) Inherit from TClientDataSet
You can also accomplish this by deriving a new class from TClientDataSet and overriding MoveBy:
function MoveBy(Distance: Integer): Integer; virtual;
If the inherited MoveBy sets EOF, then you can load the next set of data. However, if you try this, make sure you consider all the use cases. For example, what do you want to happen if the caller uses .Last? That is one advantage the wrapper class has: the caller can't do anything other than what you expose.
function TMyDataSet.MoveBy(Distance: Integer): Integer;
begin
  Result := inherited MoveBy(Distance);
  if Self.Eof then
    FetchMoreData;
end;
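For completeness, a matching declaration might look like the following sketch. FetchMoreData is a placeholder for your own batch-loading code, and the `override` assumes MoveBy is virtual in your Data.DB version, as noted above; check your RTL source before relying on that:

```pascal
type
  TMyDataSet = class(TClientDataSet)
  private
    procedure FetchMoreData; // placeholder: request the next 1,000 rows here
  public
    // Assumes TDataSet.MoveBy is declared virtual in your RTL.
    function MoveBy(Distance: Integer): Integer; override;
  end;
```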
b) FetchOnDemand
ClientDataSet has built-in support for FetchOnDemand. I don't know how that would interact with the REST dataset adapter. I'm sure, given enough work, you could build a provider that returns a total record count and then lets the ClientDataSet request more records as needed.
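As a rough sketch of that approach, the relevant TClientDataSet properties are PacketRecords and FetchOnDemand; this assumes a TDataSetProvider on the server side, and the component names here are made up:

```pascal
// Sketch only: assumes ClientDataSet1 is wired to a provider that can
// deliver records incrementally. Component names are assumptions.
ClientDataSet1.ProviderName := 'DataSetProvider1';
ClientDataSet1.PacketRecords := 1000; // deliver 1,000 records per packet
ClientDataSet1.FetchOnDemand := True; // fetch the next packet automatically at EOF
ClientDataSet1.Open;
```

With FetchOnDemand set, scrolling past the last fetched record triggers the next packet request without any code in the loop.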