Unexpected number of table rows returned on client side

I would like to know what could cause the DA4 dataset on the client side to be smaller than on the server side.

On the client side I have a DA4 table which I want to open fully:
Table:=TBerichtenDATable.Create(nil,True); // from cache
Table.Open();
This returns Table.RecordCount = 3 (or some other number).

On the server side I return the Binary dataset from a cache I create:
FCachedData[tnl]:=Binary.Create();
DataSet.Dataset.First();
DABINDataStreamer.WriteDataset(FCachedData[tnl],DataSet,[woSchema,woRows]);
Here DataSet.RecordCount = 53, which is correct.

My RO get-data function has this interface:
function TTaakService.OffGetCachedData(const aTableNameArray:StringArray;const TableName:string; const UserId: Integer; const Context: String; const MaxRecords: Integer): Binary;

and sets its result with
Result:=FCachedData[tnl].Clone();

What could be the reason that an incorrect RecordCount is returned?

FYI: I am debugging both server and client simultaneously in two Delphi IDEs.

Some extra info:
The DataSet: IDaDataSet;
is opened with
DataSet := DATaakSchema.NewDataSet(Connection, tablename);
dw:=DataSet.DynamicWhere;
minDate:=IncDay(Date,-62);
CombineDynamicWhere(dw,dw.NewBinaryExpression(nme_Berichten,fld…
CombineDynamicWhere(dw,dw.NewBinaryExpression(nme_Berichten,fld_…
DataSet.Open();
and then has a RecordCount of 53.
However, when I do
DataSet.Dataset.First();
the RecNo is set to 51, which seems to be why I get RecNo 51 to 53 but nothing more. When I do
DataSet.Dataset.Last();
the RecNo is set to 0.

Why can I not trust the First and Last statements?

An extra piece of information:
I process the dataset after the Open and before I serialize it.
I have a ds.First(); followed by a
while not ds.EOF do begin … ds.Next(); end;
loop over the DataSet.DataSet which successfully runs over all 53 records.
It seems I can only loop over it once. My dataset is of type TDAESDACQuery.
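
Roughly, the server-side flow is as follows (VerwerkRecord is just a placeholder name for my own per-record processing; the DynamicWhere setup is omitted):

  DataSet := DATaakSchema.NewDataSet(Connection, tablename);
  DataSet.Open();

  // first pass: my own processing over the underlying dataset
  ds := DataSet.DataSet;
  ds.First();
  while not ds.EOF do begin
    // VerwerkRecord(ds);   // placeholder for the actual per-record processing
    ds.Next();
  end;

  // second pass: serialize into the cache
  FCachedData[tnl] := Binary.Create();
  DABINDataStreamer.WriteDataset(FCachedData[tnl], DataSet, [woSchema, woRows]);

and it is this second pass that only picks up the last few records.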

Hi,

such DB datasets are optimized for performance and may give incorrect information when you ask for RecordCount.

For example, the native SDAC dataset is created as

  result := TMSQuery.Create(nil);

  TMSQuery(result).FetchAll := True;  // prevents creating an additional session when you call StartTransaction (a known issue of OLEDB)
  TMSQuery(result).Unidirectional := True;
  TMSQuery(result).ReadOnly := TRUE;
  TMSQuery(result).Connection := TDAESDACConnection(aConnection).fMSConnection;

Hello Evgeny,

What is the solution then?
Maybe converting the Binary back to a dataset on the server side? Or is there something more efficient?
FYI: The RecordCount is correct (except during the first pass, where it increases gradually), but this is not an issue for me.

Hi,

you can just fetch all records on the server side with

dataset.First;
while not dataset.Eof do dataset.Next;

It should solve the issue with the incorrect RecordCount.

Regarding SDAC, from the documentation:
TCustomDADataSet.UniDirectional Property

Class

TCustomDADataSet

Syntax

property UniDirectional: boolean default False;

Remarks

Traditionally SQL cursors are unidirectional. They can travel only forward through a dataset. TCustomDADataset, however, permits bidirectional travelling by caching records. If an application does not need bidirectional access to the records in the result set, set UniDirectional to True. When UniDirectional is True, an application requires less memory and performance is improved. However, UniDirectional datasets cannot be modified.

In FetchAll=False mode data is fetched on demand. When UniDirectional is set to True, data is fetched on demand as well, but obtained rows are not cached except for the current row. In case if the Unidirectional property is True, the FetchAll property will be automatically set to False. And if the FetchAll property is True, the Unidirectional property will be automatically set to False.

The default value of UniDirectional is False, enabling forward and backward navigation.

Note: Pay attention to the specificity of using the FetchAll property=False

So FetchAll does not apply, if the doc is correct…
I don’t know which is better performance-wise…
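
In other words, the driver code shown earlier presumably ends up in unidirectional, non-caching mode (this is just my reading of the doc above, not verified against the SDAC source):

  TMSQuery(result).FetchAll := True;        // asks for all rows to be cached...
  TMSQuery(result).Unidirectional := True;  // ...but per the doc this forces FetchAll back to False,
                                            // so only the current row is kept in memory

which would explain why the dataset can only be traversed once.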

That’s what I do, but then later I cannot call
DABINDataStreamer.WriteDataset(FCachedData[tnl],DataSet,[woSchema,woRows]);
anymore.
IMO, the second time you do a dataset.First and while loop, the data is no longer fully present in the dataset.

My question is: how do I loop over the data twice?

The streamer itself writes an IDADataset as

while not dataset.Eof do begin
  //write current row
  dataset.Next;
end;

so it should store all table records.


Another workaround would be to create a TDAMemDataTable on the server side and return its data instead of using FCachedData.
If you register this table in ExportedDataTables, it will work in the same way as a usual table in the schema.

Hello Evgeny,

My caching (FCachedData) is actually irrelevant. The issue occurs when I load from the SQL DB.

I need a solution where I can do a
while not dataset.Eof do begin
  // my own processing
  dataset.Next;
end;
followed by
DABINDataStreamer.WriteDataset(FCachedData[tnl],DataSet,[woSchema,woRows]);

What is your proposed solution to do this?

So the question is: if one manipulates an IDADataset by iterating over it, will the WriteDataset call still stream all records or not…

The streamer has this code:

  while (k<>max) and not Source.EOF do begin
    ....
    Source.Next;
    if Source.EOF then Break;
  end;

so it writes data from the current position of the dataset up to the end.


Probably I missed something in this issue, so can you create a simple testcase that reproduces the original issue, please?
You can attach the testcase here or send it directly to support@ to keep it private.

It’s difficult to create a testcase with a DB backend etc.

My findings are that a dataset of type TDAESDACQuery can only be iterated once.
The dataset has 53 records and will return them once, via a while loop or via WriteDataset.
The second iteration will not return all records (even when calling DataSet.First()), probably because, for performance reasons, the DataSet is not persisted in memory.
This means the First call does not jump to record 1/53 but to record 51/53.
IMO it should not do this; it should throw an exception (or ideally simply work).

When you open the dataset, it already has the 1st row active, so you shouldn’t call First.

Yes, and the second time I loop …

FYI: As already mentioned, I think a solution would be to do DABINDataStreamer.WriteDataset first and then read that Binary back into a dataset on the server side. Can you tell me which function call can be used to convert the Binary into a dataset from the schema?

DataStreamer.ReadDataset

Can you tell me how I should use this?
I do not see a Binary object in the parameters.
I also do not see documentation via https://docs.remotingsdk.com/#q=ReadDataset

You are looking in the RO docs…
Use https://docs.dataabstract.com for the DA docs.

The correct usage of ReadDataset is

      Streamer.Initialize(stream, aiReadFromBeginning);
      try
        Streamer.ReadDataset('table', ltable, True);
      finally
        Streamer.Finalize;
      end;

How do I create the ltable in your example?

I have implemented the code

function TTaakService.VerwerkDataFromBinary(const tablename: string; bindata: Binary; var bExpanded: boolean): boolean;
var
  VerwerkDataSet: IDaDataSet;
begin
  VerwerkDataSet := DATaakSchema.NewDataSet(Connection, tablename);
  DABINDataStreamer.Initialize(bindata, aiReadFromBeginning);
  try
    DABINDataStreamer.ReadDataset(tablename, VerwerkDataSet, True);
  finally
    DABINDataStreamer.Finalize();
  end;
  VerwerkData(VerwerkDataSet.DataSet, bExpanded);
  Result := True;
end;

but I get an ‘interface not supported’ error in ReadDataset, on the line Destination as IDAEditableDataset,
so the DATaakSchema.NewDataSet I use does not give the correct IDADataset implementation (DATaakSchema is of type TDASchema).

How should I create a compatible IDADataSet?

try TDAMemDataTable.Create
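
For example, a minimal sketch based on the ReadDataset example above (not tested; bindata and tablename are the parameters of your VerwerkDataFromBinary function):

  var
    ltable: TDAMemDataTable;
  begin
    ltable := TDAMemDataTable.Create(nil);
    try
      DABINDataStreamer.Initialize(bindata, aiReadFromBeginning);
      try
        // same call pattern as in the example above, but with a mem table as destination
        DABINDataStreamer.ReadDataset(tablename, ltable, True);
      finally
        DABINDataStreamer.Finalize();
      end;
      // ... run your server-side processing over ltable here ...
    finally
      ltable.Free;
    end;
  end;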