Detail tables failing to update in 1517

Hi,

I have recently upgraded to 10.0.0.1517 and am having a problem calling ApplyUpdates on a table: detail tables with changes are not having those changes applied.

The detail tables do have Deltas, I’ve checked that, yet nothing gets applied server-side after the client calls ApplyUpdates on the parent table.

Literally nothing has changed on my side except the upgrade to 1517, so I’m concerned something has been broken.

I need to do more testing before I can be certain what the issue is, but I thought I’d check here in case anyone else is seeing similar issues or something has changed in this area that could be causing it.

Hi,

I assume this is Delphi.
I don’t recall us changing anything in the master/detail (m/d) code in the past months.

Can you create a simple testcase that reproduces this issue, please?

Ok, I’ll try to track it down some more tomorrow.

Ok, I’ve now proven to my satisfaction that this is being caused by a change in RO.

I tried reverting to a build of my code with 1463 and it works fine. Simply rebuilding with 1517 breaks it.

Now this appears to be related to the way I’m retrieving a large hierarchy, as per this old thread:

As described in that topic, with your help I managed to devise a means of retrieving a large hierarchy of data tables (potentially 13 tables across 4 levels) in the most efficient manner possible.

This has worked absolutely fine until now but has broken with 1517. If I stop using my method and simply do it the “usual” way by just opening the top level parent and allowing it to retrieve detail records on the fly as I navigate them, then it works fine.

So something has changed in 1517 which is causing my loading method to fail. The most obvious suspect is the RemoteFetchEnabled property. As per that topic, this is left set to False on the detail tables in order to prevent the framework from attempting to retrieve data that’s already present. Is it possible something has changed that prevents updates being sent/applied when this property is False?
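For context, before the bulk retrieve each detail table in my hierarchy is configured roughly like this (the helper and the table names are placeholders from my own app, not from any sample):

procedure TClientDataModule.DisableDetailFetching;
begin
  // Remote fetching is switched off on every detail table so that navigating
  // the master afterwards doesn't trigger new server calls for data the bulk
  // load has already populated.
  tblDetail.RemoteFetchEnabled := False;
  tblSubDetail.RemoteFetchEnabled := False;
  // ...and so on for the rest of the hierarchy
end;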

I’ll try to trace through the code and work out exactly where it’s failing and also see if I can produce a small test app.

Hi,
I cannot reproduce any failure with the m/d feature in .1517.
I’ve tested with the Multi Level Details sample from DA7: 25319.zip (5.6 MB)

I’ll see if I can reproduce it in a test program

Looking at the code and tracing through as it executes, I’ve noticed two things.

Firstly, the actual issue appears to be occurring inside TDABaseDataAdapter.InternalApplyUpdates_DetectTablesForUpdate.

The CheckTable nested function is failing to add any of the detail tables because the MasterOptions set doesn’t contain moIncludeDeltaIntoMasterCall and RemoteFetchEnabled is False.

What is strange is that this code hasn’t changed since 1463. I can’t currently check what it does in that version without reinstalling it on my machine, which I may try at some point.

Secondly, there’s a code change in TDADataTable.WriteDeltaToStream which has added an aRecursive parameter. I’m not sure if this is relevant but it does seem to have something to do with the processing of detail deltas. Could this be connected?

Following on from that WriteDeltaToStream change, it looks like it is being called from TDARemoteDataAdapter.InternalApplyUpdate_PrepareParams with the aRecursive parameter set to False.

This looks like a change from 1463 and may explain the issue I’m having. Going to try to produce a test case now.

Hi,

As I see it, the “Recursive” fix was related to this topic.

I need a simple testcase that reproduces your issue.

Ok, I’ll try to create a simple test case, but basically what’s happening is this.

In a ‘normal’ master/detail situation, the detail tables have RemoteFetchEnabled set to True.
In that case, it looks like TDABaseDataAdapter.InternalApplyUpdates_DetectTablesForUpdate adds the detail tables because the nested CheckTable function returns True, so it recurses through the hierarchy, adding all the details.

With my method, as per the linked thread, RemoteFetchEnabled has to be False on all detail tables.
In this case the above function will not add the details because CheckTable returns False.
However, in 1463 it looks like the subsequent call to TDARemoteDataAdapter.InternalApplyUpdate_PrepareParams was still adding the detail tables via the WriteDeltaToStream call.

Now that WriteDeltaToStream is passed a parameter that prevents recursion through the details, they never get added, which produces my issue.

Hi,

From this point of view, you have a non-standard case.
Logically, if a detail table has a RemoteFetchEnabled value different from its master’s, then that table shouldn’t be updated when master.ApplyUpdates is called, because it has different settings.

Imagine this case:

  • master uses RDA1 (say MSSQL server)
    • detail1 uses RDA2 (say SQLite server)
      • detail11 uses RDA3 (say MySQL server)
        • detail111 uses briefcase as storage with RemoteFetchEnabled=false

In this case the detail tables cannot be updated when master.ApplyUpdates is called.
Your case is somewhat similar.

I understand where you’re coming from but the fact remains that this has worked fine until this version.

The ‘mass loading’ method I devised in the linked thread three years ago allows me to bulk load a huge hierarchy in ‘one hit’, which is much faster and has provided a big performance boost. This is now broken, so I’m forced either to revert and never upgrade again, which isn’t really practical, or to fall back to the slower ‘standard’ method.

I kind of understand what you’re saying about RemoteFetchEnabled but there are scenarios, including mine, where you want to populate the tables by ‘other means’ yet still have deltas sent to the server via the regular method.

The reason I need to disable RemoteFetchEnabled on the detail tables in the first place is that, if I don’t, the framework tries to retrieve data as I navigate the parent tables, despite the data already having been populated.

What I really need is a ‘proper’ way of flagging that data has already been retrieved/populated. Once I’ve done my bulk fetching of all data throughout the hierarchy, if I could somehow indicate this to the framework then I could leave RemoteFetchEnabled set to True without it continually trying to re-fetch data.

The framework sort of does this anyway. In a ‘normal’ scenario, if you have a three level master-detail-subdetail hierarchy then, as you navigate the detail table, the relevant subdetail records are retrieved on-the-fly but only once for each detail record. Once you’ve navigated all detail records, the subdetail table contains all records and doesn’t retrieve them again as you move round the detail table.

That’s what I’m trying to achieve, to somehow tell the framework that all the subdetail records are there already so it doesn’t need to fetch them the first time I navigate to a detail record.

The performance benefits of my method are far from trivial. The hierarchy in question represents parts in a stock/pricing system and there are cases where I have a single top-level parent with thousands of detail records and subdetail records beneath them. With standard loading, if I need to iterate all of these records then it will generate thousands of round-trips to the server to retrieve the subdetail for each detail as I go. Being able to retrieve everything in a single trip saves masses of time.

Hi,

Try one of these workarounds (there’s a rough sketch after the list):

  • set RemoteFetchEnabled to True for all detail tables before calling master.ApplyUpdates
  • call master.ApplyUpdates
  • set RemoteFetchEnabled to False for all detail tables after calling master.ApplyUpdates.

Or pass the detail tables to the RDA too:

  • call RDA.ApplyUpdates([master, all details]).
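A rough sketch of both workarounds, with placeholder table names:

// Workaround 1: temporarily behave like a standard m/d setup so the detail
// deltas are included, then restore the bulk-load configuration.
tblDetail.RemoteFetchEnabled := True;
tblSubDetail.RemoteFetchEnabled := True;
try
  tblMaster.ApplyUpdates;
finally
  tblDetail.RemoteFetchEnabled := False;
  tblSubDetail.RemoteFetchEnabled := False;
end;

// Workaround 2: pass the master and all detail tables to the RDA explicitly.
RDA.ApplyUpdates([tblMaster, tblDetail, tblSubDetail]);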

Ok thanks I’ll give that a try.

As a developer myself, I do appreciate your position and that I’ve been using the framework in a ‘non-standard’ way which happened to work but of course isn’t guaranteed to forever. We often have a similar situation when customers use our software in a non-standard way that happens to work but we then fix or change something which stops that use case from working.

The ideal would be some way of “telling” the data table that detail records are already present and don’t need to be fetched. Obviously there’s some kind of “flag” against each record which indicates that detail records have been retrieved in order to prevent re-retrieval when that record is visited again. If I could somehow “set” that flag against a record or even just the whole table it would solve the issue completely.

Hi,

Are you using MasterOptions.moAllInOneFetch?

moAllInOneFetch seems to go to the other extreme.

If I set that option in all tables then, when I just open the top-level table in the ‘normal’ way, it retrieves ALL records for the detail and subdetail tables down the tree with no WHERE clause on the SQL at all. I don’t want to retrieve the whole database, just all the detail and subdetail records for the selected top-level master.

This is what my bulk loading function does. I apply a Dynamic Where to each detail table throughout the hierarchy to retrieve ALL records for the selected top-level master, then open them all in a single Fill call. This works perfectly, the framework just doesn’t realise I’ve done it and insists on re-retrieving the records again as I navigate.
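In outline, the bulk-load routine does something like this (the names, the MasterId field and the exact Fill overload are from my own code, so treat it as a sketch rather than the literal implementation):

procedure TClientDataModule.BulkLoad(aMasterId: Integer);
begin
  // Constrain every detail/subdetail table to the selected top-level master...
  tblDetail.DynamicWhere.Expression :=
    tblDetail.DynamicWhere.NewBinaryExpression('Detail', 'MasterId', dboEqual, aMasterId);
  tblSubDetail.DynamicWhere.Expression :=
    tblSubDetail.DynamicWhere.NewBinaryExpression('SubDetail', 'MasterId', dboEqual, aMasterId);

  // ...then retrieve the whole hierarchy in a single round-trip.
  // RemoteFetchEnabled stays False on the details so navigating the master
  // afterwards doesn't re-fetch anything.
  RDA.Fill([tblMaster, tblDetail, tblSubDetail]);
end;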

Regards the possible workarounds…

I tried enabling RemoteFetchEnabled before the ApplyUpdates call, which did cause the updates to be applied, but it then immediately re-retrieves the records before I can disable RemoteFetchEnabled again.

Calling RDA.ApplyUpdates and specifying all tables doesn’t work either as they still get filtered out by the CheckTable function mentioned earlier.

One thing I have just tried is adding the moIncludeDeltaInMasterCall option to the detail table, since the CheckTable function is testing for this. This actually appears to work! The detail updates are applied and no re-retrieval of records takes place. The only question is what else adding this option might affect, as I’ve never used it before?

EDIT: Looking at it a bit more, the only places this option appears to be used are the aforementioned CheckTable function and the aforementioned WriteDeltaToStream function, and in the latter only if aRecursive is True. From the looks of things, adding this option would seem to do what I need with no adverse effects; would you concur with that?
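For clarity, the only change I’ve made is along these lines (using the option name as I’ve written it above; the table names are placeholders):

// Include each detail table's delta in the master's ApplyUpdates call so the
// CheckTable test no longer filters the details out when RemoteFetchEnabled
// is False.
tblDetail.MasterOptions := tblDetail.MasterOptions + [moIncludeDeltaInMasterCall];
tblSubDetail.MasterOptions := tblSubDetail.MasterOptions + [moIncludeDeltaInMasterCall];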

Hi,

I need a simple testcase to be sure that this isn’t a testcase failure.

You can add DynamicWhere expressions to the detail tables for the initial loading; as a result, you can reduce the number of records retrieved.

Aha! I was unaware of this!

I’ve just tried that: using moAllInOneFetch on all the tables but also configuring initial Dynamic Where clauses on each detail to restrict it to only the records for the top-level master.

It seems to work! All detail records are loaded according to my where clauses when I open the top-level master and subsequent updates also appear to work. I’ll test this some more.
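In case it helps anyone else, the setup that now appears to work looks roughly like this (names are illustrative and SelectedMasterId is just whatever identifies the chosen top-level record):

// moAllInOneFetch on each detail/subdetail table, plus an initial DynamicWhere
// restricting it to the selected top-level master...
tblDetail.MasterOptions := tblDetail.MasterOptions + [moAllInOneFetch];
tblSubDetail.MasterOptions := tblSubDetail.MasterOptions + [moAllInOneFetch];

tblDetail.DynamicWhere.Expression :=
  tblDetail.DynamicWhere.NewBinaryExpression('Detail', 'MasterId', dboEqual, SelectedMasterId);
tblSubDetail.DynamicWhere.Expression :=
  tblSubDetail.DynamicWhere.NewBinaryExpression('SubDetail', 'MasterId', dboEqual, SelectedMasterId);

// ...then a plain Open on the master pulls the whole hierarchy in one request.
tblMaster.Open;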

I’ve attached a small test-case app I’ve been playing with. I’m using FireDAC against a SQL Server database, and there’s a SQL script in the ZIP to create the database. There are buttons for regular opening (top-level table only) and for my existing bulk-load logic. As it stands, this test-case app has the moAllInOneFetch option enabled and the Dynamic Where clauses against the details, so the regular Open functionality appears to be working.

BulkLoadTest.zip (114.5 KB)

Hi,

As for me, you shouldn’t apply this expression:

with tblMaster.DynamicWhere do Expression := NewBinaryExpression('Master','MasterId',dboEqual,1);

Also, it works correctly for me with:

procedure TClientForm.btnOpenClick(Sender: TObject);
begin
  with ClientDataModule do
  begin
    tblMaster.Open;
  end;
  UpdateButtons;
end;