
TClientDataSet: How to keep local data safe and available when a database structure has been changed

Hello code enthusiasts!

I have a question which is no doubt due to my inexperience with Delphi XE2. I'll try to explain it here.

Introduction:

I have an Interbase database with data. This database is located on a remote machine, and the client application I am developing uses it. Because the application must also work when no network connection is available, I have to use the briefcase model. That is why I use a ClientDataSet which retrieves the data and stores it locally in XML format. In my opinion it would be easier to use a local database instead of XML files, but I am not allowed to change that yet, so I am still bound to XML :(
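
To make the briefcase part concrete, this is roughly the pattern (a minimal sketch; the component and file names are only illustrative):

    uses DBClient;

    (* Minimal briefcase sketch: fetch via the provider when online,
       otherwise work on the locally saved XML copy. *)
    procedure SyncBriefcase( cds : TClientDataSet; online : Boolean );
    begin
      if online then
      begin
        cds.Open;                                  (* fetch from the provider *)
        cds.SaveToFile('orders.xml', dfXMLUTF8);   (* keep a local XML copy *)
      end
      else
        cds.LoadFromFile('orders.xml');            (* continue offline *)
    end;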

Due to the importance of the data, I want to keep it as safe as possible. Even when other developers change the internal structure of the database (e.g. fields in tables are added, renamed or removed), the data stored locally must remain available.

What I do now is use a ClientDataSet to retrieve the metadata from the database, which is stored separately on disk. The next thing I am planning to do is compare the metadata from the database with the metadata stored in the local dataset. When I find differences in the fields, I create a new dataset in code, build up the field definitions and add the data afterwards. In other words, I just create a new local dataset that conforms to the structure of a table from the remote database.

This is easy when I find column (field) removals or additions; however, it becomes a bit more difficult when a field's name or data type has changed.
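
A rough sketch of the kind of comparison I have in mind (untested; the helper is hypothetical):

    uses Classes, DB, DBClient;

    (* Report fields that were added, removed or re-typed remotely.
       A rename is indistinguishable from a remove plus an add here,
       which is exactly the hard part. *)
    procedure CompareFieldDefs( remote, local : TClientDataSet; diffs : TStrings );
    var i, idx : Integer;
    begin
      remote.FieldDefs.Update;
      local.FieldDefs.Update;
      for i := 0 to Pred(remote.FieldDefs.Count) do
      begin
        idx := local.FieldDefs.IndexOf(remote.FieldDefs[i].Name);
        if idx = -1 then
          diffs.Add('Added remotely: ' + remote.FieldDefs[i].Name)
        else if local.FieldDefs[idx].DataType <> remote.FieldDefs[i].DataType then
          diffs.Add('Type changed: ' + remote.FieldDefs[i].Name);
      end;
      for i := 0 to Pred(local.FieldDefs.Count) do
        if remote.FieldDefs.IndexOf(local.FieldDefs[i].Name) = -1 then
          diffs.Add('Removed remotely: ' + local.FieldDefs[i].Name);
    end;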

I do not yet take primary, foreign and unique keys into consideration, but I can feel this will have to be done as well.

Question:

My question primarily is whether this is the correct way. It is quite some work to achieve, and before I start implementing it all, I would like to know if there are other (more convenient and easier) ways to achieve what I described above.

The way I see it, the data available locally has a higher priority than the data stored in the remote database, simply because the user works with the local data and not directly on the remote data.

Any thoughts on this? I hope I have clarified my question enough; if not, please ask and I'll provide more details. I am working with Interbase XE (SP5) and Delphi XE2.

asked Nov 15 '13 by RvdV79


1 Answer

Well, it took me quite some time, but I have it working now. Although I am still a bit sceptical about my solution so far (I have been testing it for two days now, with no problems yet), I am glad to have it working.

I have to apologize for the length of this answer; it does not benefit the readability of my post, but I see no other way to provide enough detail about this subject.

In case other people are working on similar stuff, I decided to post my solution as an answer. I hope it helps, and of course I am eager to know if I missed something.

I wrote a function that tries to update the metadata when differences have been found. Since the local datasets are stored in XML format (thus everything stored locally can be considered a string), I can treat the values as Variants, which is a huge benefit when it comes to copying data across:

procedure TdmDatabase.UpdateMetaDataFor( cds : TCustomClientDataSet; folder : String);

Now the nested procedures and functions follow. This might change later, since I am still not too sure about this approach...

   procedure AddInLocalData( local, newCds : TCustomClientDataSet );
   var i : Integer;
   begin
     try
       (* Assume that the new dataset is still closed... *)
       newCds.CreateDataSet;
       (* Copy every local record; the field values travel as Variants *)
       local.First;
       while not local.Eof do
       begin
         newCds.Insert;
         for i := 0 to Pred(local.Fields.Count) do
           if ( i < newCds.FieldCount ) then
             newCds.Fields[i].AsVariant := local.Fields[i].AsVariant;
         newCds.Post;
         local.Next;
       end;

       newCds.SaveToFile( folder + newCds.TableName + '_updated.xml', dfXMLUTF8 );
     except on E: Exception do
       raise Exception.Create( _Translate(RS_ERROR_UNABLE_TO_SYNC_LOCAL_AND_REMOTE));
     end;
   end;

Adding fields is the easy part, especially when there are no constraints. The extra fields are found in the remote dataset (actually coming from the database itself), and no local data exists for them yet. Thus, we can add a new field without having to worry about data that has to be inserted into it. If that were necessary (for our project it is not), this function would certainly need updating:

   function AddFieldsLocally( remote, local, newCds : TCustomClientDataSet ) : TCustomClientDataSet;
   var i        : Integer;
       fieldDef : TFieldDef;
   begin
     try
       (* Local provider is leading... *)
       newCds.SetProvider(local);
       newCds.FieldDefs.Update;
       (* Find the already existing fields and add them *)
       for i := 0 to Pred(newCds.FieldDefs.Count) do
       begin
         with newCds.FieldDefs[i].CreateField(newCds) do
         begin
           FieldName := local.Fields[i].FieldName;
           Calculated := local.Fields[i].Calculated;
           Required := local.Fields[i].Required;
           Size := local.Fields[i].Size; //TODO: Add checking here!
         end;
       end;
       (* Check for additional fields that exist remotely and add them *)
       for i := newCds.FieldDefs.Count to Pred(remote.FieldDefs.Count) do
       begin
         fieldDef := remote.FieldDefs.Items[i];
         if (fieldDef <> nil) then
         begin
           newCds.FieldDefs.Add(fieldDef.Name, fieldDef.DataType, fieldDef.Size, fieldDef.Required);
           newCds.FieldDefs[ Pred( newCds.FieldDefs.Count )].CreateField(newCds);
         end;
       end;

       (* Finally, add the existing local data to the newly created dataset *)
       AddInLocalData(local, newCds);
       result := newCds;
     except on E:Exception do
       raise;
     end;
   end;

Removing fields is a bit more specific. For one thing, it still has to be verified whether a field that needs to be removed has constraints (a sketch of such a check follows after this function). If it does, the method should not continue; instead the entire local dataset with all tables should be removed and rebuilt from the database, just to ensure proper functionality. Currently such changes are considered major changes. I check whether major changes have been applied; if so, one will most likely need a new version of the application as well.

  function RemoveFieldsLocally( remote, local, newCds : TCustomClientDataSet ) : TCustomClientDataSet;
  var i     : Integer;
      field : TField;
  begin
    try
      (* Remote provider has the lead here! *)
      newCds.SetProvider(remote);
      newCds.FieldDefs.Update;

      (* Find the already existing fields and add them. Note: this matches
         by index, so it assumes removed fields were at the end. *)
      for i := 0 to Pred(newCds.FieldDefs.Count) do
      begin
        field := newCds.FieldDefs[i].CreateField(newCds);

        if assigned(field) then
        begin
          field.FieldName := local.Fields[i].FieldName;
          field.Calculated := local.Fields[i].Calculated;
          field.Required := local.Fields[i].Required;
          (* Necessary for compatibility with e.g. StringFields, BlobFields *)
          if ( HasProperty( field, 'Size') ) then
            field.Size := local.Fields[i].Size;
        end;
      end;

      (* Now add in the existing data from the local dataset.
         Warning: since fields have been removed in the remote dataset, the
         corresponding data will not be copied either. If constraints were
         set up on them, these are lost. *)
      AddInLocalData(local, newCds);
      result := newCds;
    except on E:Exception do
      raise;
    end;
  end;
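
For the constraint check mentioned above, one possibility is to ask the server directly. Below is an untested sketch against the InterBase system tables RDB$RELATION_CONSTRAINTS and RDB$INDEX_SEGMENTS (this only covers primary, foreign and unique keys, which are backed by an index); the CreateDataset and SQLConnection plumbing is the same as in the main procedure further down:

    (* Untested sketch: True if a PK/FK/UNIQUE constraint on ATable
       involves AField. The system table names are InterBase specific. *)
    function TdmDatabase.FieldHasConstraint( const ATable, AField : String ) : Boolean;
    var qry : TCustomClientDataSet;
    begin
      qry := CreateDataset;
      try
        qry.SQLConnection := SQLConnection;
        qry.SQL.Text := Format(
          'SELECT COUNT(*) FROM RDB$RELATION_CONSTRAINTS rc ' +
          'JOIN RDB$INDEX_SEGMENTS seg ON seg.RDB$INDEX_NAME = rc.RDB$INDEX_NAME ' +
          'WHERE rc.RDB$RELATION_NAME = %s AND seg.RDB$FIELD_NAME = %s',
          [QuotedStr(UpperCase(ATable)), QuotedStr(UpperCase(AField))]);
        qry.Open;
        result := qry.Fields[0].AsInteger > 0;
      finally
        qry.Free;
      end;
    end;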

The function below checks for equality between fields. If differences are detected in the DataType (FieldType), FieldName, etc., it will try to update the field according to the remote dataset's metadata, which is leading.

function VerifyInternalStructuresAndFields( remote, local, newCds : TCustomClientDataSet ) : boolean;
var i, equalityCounter : Integer;
    equal : boolean;
    field : TField;
begin
  try
    (* We know that both datasets (local and remote) have the same field
       count. In this case, the structure of the remote dataset is leading. *)
    newCds.SetProvider(remote);
    newCds.FieldDefs.Update;

    equal := false;
    equalityCounter := 0;
    for i := 0 to Pred(newCds.FieldDefs.Count) do
    begin
      (* 1. Field definitions which are exactly equal can be copied *)
      equal := (remote.Fields[i].FieldName = local.Fields[i].FieldName ) and
               (remote.Fields[i].Required = local.Fields[i].Required ) and
               (remote.Fields[i].Calculated = local.Fields[i].Calculated ) and
               (remote.Fields[i].DataType = local.Fields[i].DataType) and
               (remote.Fields[i].Size = local.Fields[i].Size );

      if ( equal ) then
      begin
        inc(equalityCounter);
        field := newCds.FieldDefs[i].CreateField(newCds);
        field.FieldName := local.Fields[i].FieldName;
        field.Calculated := local.Fields[i].Calculated;
        field.Required := local.Fields[i].Required;
        field.Size := local.Fields[i].Size;
      end
      else (* fields differ; try to update, here the remote fields are leading! *)
      begin
        if ( MessageDlg( _Translate( RS_WARNING_DIFFERENCES_IN_FIELDS), mtWarning, mbYesNo, 0) = mrYes ) then
        begin
          field := newCds.FieldDefs[i].CreateField(newCds);
          field.FieldName := remote.Fields[i].FieldName;
          field.Calculated := remote.Fields[i].Calculated;
          field.Required := remote.Fields[i].Required;
          (* The DataType already comes from the remote FieldDef, so there
             is no need to set it again here. *)
          if ( HasProperty( field, 'Size') ) then
            field.Size := remote.Fields[i].Size;
        end
        else
        begin
          result := false;
          exit;
        end;
      end;
    end;

    (* When every field matched, the local file is already up to date *)
    if ( equalityCounter = local.FieldCount ) then
    begin
      result := false;
    end
    else
    begin
      AddInLocalData(local, newCds);
      result := true;
    end;
  except on E:Exception do
    raise;
  end;
end;
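
The HasProperty helper used above is not shown in full; a minimal sketch of such a check, based on RTTI, could look like this (note that it only sees published properties, which is what makes it work for e.g. TStringField.Size):

    uses TypInfo;

    (* True if the object publishes a property with the given name. *)
    function HasProperty( AObject : TObject; const APropName : String ) : Boolean;
    begin
      result := GetPropInfo(AObject, APropName) <> nil;
    end;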

This is the main function, which tries to detect the differences between the fields and field definitions of the remote and local datasets.

function FindDifferencesInFields( remote, local: TCustomClientDataSet ) : TCustomClientDataSet;
var newCds : TCustomClientDataSet;
begin
  try
    newCds := TCustomClientDataSet.Create(nil);
    newCds.FileName := local.FileName;
    newCds.Name := local.Name;
    newCds.TableName := local.TableName;

    (* First check if the remote dataset has added fields. *)
    if ( remote.FieldDefs.Count > local.FieldDefs.Count ) then
    begin
      result := AddFieldsLocally(remote, local, newCds);
    end
    (* If no added fields could be found, check for removed fields *)
    else if ( remote.FieldDefs.Count < local.FieldDefs.Count ) then
    begin
      result := RemoveFieldsLocally(remote, local, newCds);
    end
    (* Finally, the field counts are equal; check if renames have taken place *)
    else
    begin
      if ( VerifyInternalStructuresAndFields(remote, local, newCds) ) then
        result := newCds
      else
      begin
        (* No changes were needed; keep the local dataset *)
        newCds.Free;
        result := local;
      end;
    end;
  except on E:Exception do
    raise Exception.Create('Could not verify remote and local dataset: ' + E.Message);
  end;
end;

Since all the functions and procedures above are quite critical, I decided to nest them within the main procedure, UpdateMetaDataFor. I might change this later, but for now it is good enough.

var remotecds : TCustomClientDataSet;
    fileName  : String;
begin
  remotecds := nil;  (* so the check in the finally block is safe *)
  try
    try
      ConnectDB(false);
      fileName := folder + cds.TableName + '_metadata_update.xml';

      (* Retrieve the latest metadata if applicable *)
      remotecds := CreateDataset;
      remotecds.SQLConnection := SQLConnection;
      remotecds.TableName := cds.TableName;
      remotecds.SQL.Text := Format('SELECT * from %s where id=-1', [cds.TableName]);
      remotecds.Open;

      remotecds.SaveToFile( fileName, dfXMLUTF8 );

      (* Load the local dataset with data for comparison *)
      cds.LoadFromFile( folder + cds.FileName );

      SyncProgress( _Translate(RS_SYNC_INTEGRITY_CHECK) + ' ' + cds.TableName);

      cds := FindDifferencesInFields( remotecds, cds );
      cds.SaveToFile( folder + cds.FileName, dfXMLUTF8 );
    except on E: Exception do
      ShowMessage( E.Message );
    end;
  finally
    if assigned(remotecds) then
      remotecds.Free;
    if FileExists( fileName ) then
      SysUtils.DeleteFile( fileName );
  end;
end;
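
For completeness, this is roughly how I invoke the procedure during synchronisation, once per briefcase table (the list type and names here are only illustrative):

    (* Illustrative call site: run the metadata check for every local table *)
    procedure SyncAllTables( localTables : TList; const localFolder : String );
    var i : Integer;
    begin
      for i := 0 to Pred(localTables.Count) do
        dmDatabase.UpdateMetaDataFor( TCustomClientDataSet(localTables[i]), localFolder );
    end;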

This concludes my very comprehensive answer, which still leaves some things open for consideration. For example, what has to be done with constraints (these are not directly visible on a local dataset)?

Another approach could be to add a table to the database itself, containing all changes together with a version number. If the changes are considered minor (e.g. a name change for a field that has no constraints whatsoever), this semi-automatic approach can still be used.
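
As a sketch of that idea (the table and column names are invented for illustration):

    (* Hypothetical version check: compare a SCHEMA_VERSION table on the
       server with a version number stored next to the local XML files. *)
    function TdmDatabase.RemoteSchemaVersion : Integer;
    var qry : TCustomClientDataSet;
    begin
      qry := CreateDataset;
      try
        qry.SQLConnection := SQLConnection;
        qry.SQL.Text := 'SELECT MAX(VERSION_NR) FROM SCHEMA_VERSION';
        qry.Open;
        result := qry.Fields[0].AsInteger;  (* 0 when the table is empty *)
      finally
        qry.Free;
      end;
    end;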

As always, I am still very curious about other approaches to ensure the integrity of a database when applying briefcase models.

answered by RvdV79