Modifying delta server-side on the fly

Got an interesting situation and wondering what the best way to deal with it is.

Basically I have a situation where the server-side processing of a delta could give rise to other changes within the same table (or multiple tables in a master/detail hierarchy). Depending on the changes to a given record, other records may need to be added, updated or deleted.

Now I could override the AfterProcessChange procedure and, for any update which will require such changes to other records, create another instance of my data module on the server to implement the changes. This seems like a nice way to do things structurally but I foresee problems with transactions. I could make the changes to other records based on the original delta change, only for the original change to be rolled back due to a later problem with the overall delta, which would leave my database inconsistent.

I then had the idea of modifying the original delta itself before processing began. I could iterate through the changes and add additional changes to the delta for the other records, then allow the server to process all the resulting changes in one go as part of the same transaction. Is this a good idea or am I going to hit some weird problems doing this? One thing I wondered was whether it would cause issues with the automatic mechanism which returns information to the client so it can update its dataset, for things like autoincrement field values and so forth. Not sure whether mucking around with the delta on the server would upset things like this?

The only other alternative I can think of is to defer this additional processing until after the original update commits, basically recording what needs doing in some kind of queue which is only processed after the original commit. This seems a bit tidier but again I could have issues if the original transaction commits but the additional processing then fails.

Just wondering if anyone has hit anything like this before or has any suggestions.


So anyone have any thoughts on this?

I’m kind of shying away from the idea of modifying the delta as this feels a bit “hacky” to me.

Ideally I’d like to go with the first option, i.e. making the other changes after the delta has been processed, probably via AfterProcessChange, but I’m wondering how I can/should handle transactions.

Is it possible to make these other required changes within the same transaction as the “main” changes, so they all commit or rollback together? To keep my code clean and encapsulated, what I’d ideally do is create another instance of my data module, with all the relevant TDAMemDataTables in it and so forth, then make the changes through this and call ApplyUpdates. As this would be running server-side, it would apply the update via an LDA but this would create an additional instance of my service in order to apply the changes which would presumably create another transaction. What I’d want to do is somehow instruct the framework to process these updates within the context of the existing transaction but not sure if I can do that.

by default, all changes are processed in one transaction. see TDataAbstractService.InternalUpdateData for details.

with the DAService.ReturnUpdateFailureDelta property you can control how changes will be processed. if this is set to False, a failure in any change will cause all changes to be rolled back. it can be useful in your scenario.

another possibility is to start/stop transactions manually.
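
for example, a minimal sketch, assuming you set the property from the service's OnCreate handler (the TMyService class and handler name here are just placeholders for your own service):

procedure TMyService.DataAbstractServiceCreate(Sender: TObject);
begin
  // roll back the whole delta if any single change fails,
  // rather than returning a partial failure delta to the client
  ReturnUpdateFailureDelta := False;
end;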

When you say “all changes are processed…” I presume you mean all changes within a delta as these create a single service instance? If I create a second service instance, won’t this have its own transaction?

I’ll try to be a bit more specific and a little less vague about what I’m trying to do.

In my architecture, I have a data module for each “logical group” of tables which I instantiate when I need to make any changes to these tables. When instantiated by the server itself, these create an LDA instance which in turn creates a service instance to apply the updates. When instantiated on a client, they use an existing RDA instance instead. This works very well indeed.

The situation I have is that a client will create such a data module to apply some updates. When this is processed on the server, some of these changes could require that other records within the same table(s) need to be created/updated/deleted. My idea was that, during processing of the original client delta within the service instance, I would instantiate another instance of the same data module (although this time it would be using an LDA as within the server) and perform the additional changes via this instance, then return to my original delta to complete processing.

Are you saying that doing this would execute the additional updates within the same transaction as the main “parent” one initiated by the client? I can’t see how this would be the case as surely creating a fresh LDA instance within my second data module instance would create another service instance with its own transaction control.

Sorry if I’m getting confused here, just having a bit of trouble wrapping my head around this.

you can pass the IDAConnection from one Service instance to another one:

  SecondService.Connection := Self.Connection;

also you can assign your own service instance to the lda, like

  lda.ServiceInstance := Self;

from DAService events

Ah ok, so if I set the ServiceInstance property of my LDA in the “nested” data module to the same instance from which I’m creating it (i.e. the one in which I’m currently processing the original delta) then it will use the same connection and transaction?

I’ll give this a try and see how I get on.

One thing - my LDA is present in my data module at design-time, with the ServiceName set accordingly. Can I safely modify the ServiceInstance property as you suggest after creation or do I need to create the LDA at run-time with the relevant constructor, passing the service instance?

the same service instance and all its properties will be used, including the connection and its transaction.

LDA.ServiceName is only required for creating the proper ServiceInstance, but you can clear this property before assigning ServiceInstance.

Note: Don’t forget to assign nil to LDA.ServiceInstance when you finish the update operation, otherwise it will keep the service interface and the service won’t be released properly.
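
rough sketch of the whole sequence from inside a service event (Self is the current service; LDA stands for your local data adapter, and the elided part is wherever you apply the additional changes):

  LDA.ServiceName := '';         // optional, not needed once ServiceInstance is set
  LDA.ServiceInstance := Self;   // same service -> same connection and transaction
  try
    // ... apply the additional changes through the tables attached to this LDA ...
  finally
    LDA.ServiceInstance := nil;  // release the service interface again
  end;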

Hmm, something’s not right here. I’m creating my second data module instance inside the AfterProcessChange handler and setting the ServiceInstance of the newly created LDA therein to the existing service (Sender.Service within the AfterProcessChange handler).

I then use this second data module to make the necessary additional changes (adding a couple of extra records in my test), then call ApplyUpdates.

Whilst this works, and the additional records are added, when the ApplyUpdates returns, the Sender.Service is now NIL.

I suspect that, when the secondary data module and the LDA within are destroyed, the service instance to which it’s attached is being destroyed. That would make sense under normal conditions but, as the service instance is attached to two different LDAs, the main one and the nested one, I’d have thought it wouldn’t be freed until both references were no longer valid.

Is there something extra I need to do before destruction of the secondary LDA to prevent the service instance from being freed? I’ve tried setting the ServiceInstance property back to NIL in the module’s destructor but this doesn’t have any effect.

Think our replies overlapped there :smile:
As you say, I am setting the ServiceInstance property back to NIL but am having the opposite problem whereby it frees it when I don’t want it to.

Seems the service instance isn’t actually being freed at this point; it’s just the Sender.Service property being set to NIL. The service frees later, as expected.

Is this something to do with my business processors? I’m using my own TDABusinessProcessor component rather than allowing the framework to create one on demand and it’s obviously the Service property of this which is being set to NIL.

this is as expected. this property is set/unset in

procedure TDataAbstractService.BP_ProcessChanges(aStruct: TDADeltaStruct;
  aChangeTypes: TDAChangeTypes; aSynchronizeAutoIncs: Boolean);
..
    aStruct.BusinessProcessor.Service := Self;
    try
      aStruct.BusinessProcessor.ProcessDelta(...);
    finally
      aStruct.BusinessProcessor.Service := nil;
    end;
..

if the same BP is used when you apply the LDA change, you can store it in a local variable and restore it later, like

var
  lService: TRORemoteDataModule;
begin
  lService := Sender.Service;
  try
     ...
  finally
    Sender.Service := lService;
  end;
end;
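
roughly, the whole AfterProcessChange handler could then look like this (TMySecondaryDataModule, its LDA and its ApplyUpdates call are placeholders for your own data module and however it applies its changes):

var
  lService: TRORemoteDataModule;
  dm: TMySecondaryDataModule;
begin
  lService := Sender.Service;    // remember the BP's service before the nested update clears it
  try
    dm := TMySecondaryDataModule.Create(nil);
    try
      dm.LDA.ServiceInstance := lService;  // reuse the same service, connection and transaction
      // ... add/update/delete the extra records via dm's tables ...
      dm.ApplyUpdates;
    finally
      dm.LDA.ServiceInstance := nil;
      dm.Free;
    end;
  finally
    Sender.Service := lService;  // restore it, the nested processing set BP.Service back to nil
  end;
end;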

Yeah that makes sense as I’m re-using the same service so using the same BP for the secondary data module and LDA - I’ll have a play with that and see if I can sort it out :slight_smile:

Success! That works superbly, thanks for the help.

Of course that system only works if I’m re-using the same service. If I want to make use of another service but within the same transaction as the first, presumably I need to use the alternative method you mentioned, where I pass the connection from the first service to the second?

yes, you need to pass the connection.

Ok I’ll have a play with that next. Thanks for all the help :slight_smile:

Hmm, how would I best go about assigning the connection to the second service? The service is obviously created on-demand by the framework when I call ApplyUpdates on my data table, so not sure how I’d intercept this to manually set the connection. Do I need to create the service instance itself manually, then assign this to the secondary LDA much like above?

you can store your service instance (or connection) in some variable and assign the proper connection via an event like OnAfterAcquireConnection or similar.

another possibility - create service instance manually.

Ok thanks.

If I choose the first option then I’m currently using OnBeforeAcquireConnection to set the connection name (I use multiple connections for different databases). I could set the connection there instead - if I use the OnAfterAcquireConnection then surely it will first acquire a new connection which I’d then immediately replace - wouldn’t that cause problems?

The second option seems a little cleaner to me. Can I literally just instantiate a service instance and then assign it to my LDA’s ServiceInstance property as before?

re 1st: you can also assign Connection in the OnCreate/OnActivate event; in this case, OnBeforeAcquireConnection/OnAfterAcquireConnection won’t be fired at all.
as a result, the LDA will create the service instance on demand and the connection will be assigned in this event.

re 2nd: yes.
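
e.g. a rough sketch of the 2nd option (TMyOtherService and TMySecondaryDataModule are placeholders for your own classes):

var
  lOtherService: TMyOtherService;
  dm: TMySecondaryDataModule;
begin
  lOtherService := TMyOtherService.Create(nil);
  dm := TMySecondaryDataModule.Create(nil);
  try
    lOtherService.Connection := Self.Connection;  // share the current connection and its transaction
    dm.LDA.ServiceInstance := lOtherService;
    // ... make the additional changes via dm's tables ...
    dm.ApplyUpdates;
  finally
    dm.LDA.ServiceInstance := nil;
    dm.Free;
    lOtherService.Free;
  end;
end;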

Using the first system strikes me as being fraught with problems as there could be other threads creating instances of the service at any time which I’d want to operate “normally”, so trying to use some kind of global variable to pass a connection would be problematic.

I’ve been playing with the second method, which seems straightforward, but I’m not sure how to “attach” the session to the new service. Can I literally just assign the current session to the new service’s Session property?