Hello
That’s the infamous "N+1 selects" problem that affects all existing ORMs (in this case Data Abstract works as an ORM and is affected by this fundamental problem as well).
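For illustration, here is a minimal sketch of the access pattern that causes it, using the same GetTable API as the sample discussed below (table and property names are taken from that sample):

// One query fetches all master rows...
var orders = this._dataModule.DataAdapter.GetTable&lt;Orders&gt;().ToList();
foreach (var order in orders)
{
    // ...and then one additional query runs for EVERY order,
    // i.e. N extra server round-trips for N orders
    var details = this._dataModule.DataAdapter.GetTable&lt;OrderDetails&gt;()
        .Where(od =&gt; od.OrderId == order.Id)
        .ToList();
}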
In general, Data Abstract doesn’t support navigational properties or eager loading. Still, it IS possible to fetch all data in several related tables using a single query.
Take a look at this sample: GroupLoading.zip (120.1 KB)
It contains two DA LINQ queries that fetch data from several related tables at once.
This sample uses the PCTrade.sqlite.db database shipped with Data Abstract, so you can run it without any changes in the connection definitions file.
The server part doesn’t contain any interesting code. Actually, the only change added to it after it was generated by the New Project Wizard is the BeforeExecutingGetDataReader event handler in the DataService implementation. It is used for logging purposes and doesn’t affect anything else.
In the client-side code, the first thing that needs to be explained is the TableDefinitions_Extension.cs file. This file contains additional code for the Orders class generated as part of the LINQ table definitions. Thanks to the partial modifier applied to this class, we can add code to it in a way that ensures the additions won’t be lost when the TableDefinitions file is regenerated.
The code contains an additional property definition (note the attribute applied to it) and an Equals/GetHashCode method pair. The property will be used to store the retrieved data, while the methods are necessary to allow the Orders class to be used as a key in the LINQ to Objects GroupBy method.
Note: In general, Table Definition classes do not need these methods, because GroupBy is effectively translated into a GROUP BY clause and executed by the database server itself. LINQ to Objects operations, on the other hand, are performed over objects in local memory, so the .NET runtime needs some help to be able to compare objects correctly.
public partial class Orders
{
    [RemObjects.DataAbstract.Linq.IgnoreProperty]
    public IList&lt;OrderDetails&gt; Details { get; set; }

    public override bool Equals(object obj)
    {
        ...
    }

    public override int GetHashCode()
    {
        ...
    }
}
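The method bodies are elided above; the exact code is in the sample archive. For orientation, a typical implementation compares rows by their primary key. A minimal sketch, assuming the Id property uniquely identifies an Orders row (an assumption, not the sample’s exact code):

public override bool Equals(object obj)
{
    // Assumed: two Orders instances represent the same row
    // when their primary keys match
    var other = obj as Orders;
    return other != null &amp;&amp; this.Id.Equals(other.Id);
}

public override int GetHashCode()
{
    // Must stay consistent with Equals: derive the hash from the same key
    return this.Id.GetHashCode();
}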
The MainForm code-behind contains two sample queries.
The first one is a simple join over two tables:
// Aliases to make the query cleaner
// These assignments do not cause any data request operations
var orders = this._dataModule.DataAdapter.GetTable&lt;Orders&gt;();
var orderDetails = this._dataModule.DataAdapter.GetTable&lt;OrderDetails&gt;();
var products = this._dataModule.DataAdapter.GetTable&lt;Products&gt;();

var query = orders
    .Where(o =&gt; o.OrderStatus != 128)
    .Join(orderDetails,
        o =&gt; o.Id,
        od =&gt; od.OrderId,
        (order, details) =&gt; new { Order = order, Details = details })
    .ToList()
    .GroupBy(g =&gt; g.Order)
    .Select(g =&gt;
    {
        g.Key.Details = g.Select(r =&gt; r.Details).ToList();
        return g.Key;
    });
What happens here:

- A WHERE condition is defined. It is better to apply conditions as early as possible, as this reduces the server → client traffic and the database workload.
- A JOIN between the Orders and OrderDetails tables is defined.
- The .ToList() call is issued. At this point the query is sent to the DA server and executed. The result is a denormalized join between the Orders and OrderDetails tables. All LINQ operators below this point are performed over local in-memory objects, not over database rows.
- The .GroupBy operation is performed. This is the point where the earlier defined Equals/GetHashCode methods are used.
- The .Select operation is performed. It takes each LINQ group object and converts it into a plain Orders object with an initialized Details property.
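To make the result concrete, here is a hypothetical way to consume the query (no further data requests are issued while iterating, since everything was fetched by the single .ToList() call above):

foreach (var order in query)
{
    // Each Orders instance now carries its related OrderDetails rows
    Console.WriteLine("Order {0}: {1} detail row(s)", order.Id, order.Details.Count);
}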
The second query is a bit more complex because it involves more than one Join:
var query = orderDetails
    .Join(products,
        od =&gt; od.ProductId,
        p =&gt; p.Id,
        (od, p) =&gt; new { Detail = od, Product = p })
    .Join(orders,
        od =&gt; od.Detail.OrderId,
        o =&gt; o.Id,
        (detail, order) =&gt; new { Order = order, Detail = detail })
    .Where(o =&gt; o.Order.OrderStatus != 128)
    .ToList()
    .GroupBy(g =&gt; g.Order)
    .Select(g =&gt; new { Order = g.Key, Details = g.Select(r =&gt; r.Detail).ToList() });
Still, it uses the same approach:
- Fetch denormalized data
- Normalize it back client-side
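Note that here the result is an anonymous type rather than an Orders instance. A hypothetical way to consume it (the Order and Details property names come from the query above; this loop runs purely over local in-memory objects):

foreach (var group in query)
{
    // group.Order is the Orders row; group.Details holds the joined
    // { Detail, Product } pairs belonging to that order
    Console.WriteLine("Order {0}: {1} detail/product pair(s)",
        group.Order.Id, group.Details.Count);
}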