Binary field is hanging the system

We’ve encountered a very strange issue, reported by a customer.
One of our tables contains a binary column into which images are loaded. The customer is experiencing problems with some specific images whereby the software just hangs.

I have spent a while trying to create a small test program which reproduces the issue, and have succeeded. Could you advise how I can send this to you privately? I don’t wish to attach it publicly here, as it contains one of the customer’s images.

Fundamentally, when we load the errant image into the client data table and then attempt to post the update to the server, the server appears to receive the new record and correctly insert it into the database, but the client just hangs. I am not sure whether the problem occurs on the server after it applies the update, or on the client.

If you then restart the client and try to just open the table, now that it has the errant image stored in a record, the client again just hangs indefinitely.

Whilst the image in question is quite large (circa 15MB TIFF file), this shouldn’t be a problem per se and indeed testing with other random files of this size or larger causes no problems.

There appears to be something about this particular file that is causing the problem and we’re at a loss…

We are using Delphi 10.2.3 with DA, and MSSQL with FireDAC.


You can:

  • drop the testcase to support@ by email, or

  • PM the support group (or me) here and attach it to a post.

Thanks, I have emailed support@


If you launch the server side in debug mode, you can see:

Debugger Exception Notification
Project ImageTestServer.exe raised exception class EROException with message 'Package too large'.

This means that the MaxPackageSize property isn’t equal on the client and server sides, so the server can’t send the big response back to the client.

Your testcase has:

  • Server.MaxPackageSize = 100MB
  • ClientChannel.MaxPackageSize = 10MB

If you set ClientChannel.MaxPackageSize = 100MB, everything will work as expected.
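For reference, a minimal Delphi sketch of keeping the two limits in sync (Server and ClientChannel are the component names from the testcase; the shared constant is my own addition for illustration):

```delphi
const
  // 100 MB; the value itself is up to you, but it must be
  // the same on both sides of the connection.
  MAX_PACKAGE_SIZE = 100 * 1024 * 1024;

// server project:
Server.MaxPackageSize := MAX_PACKAGE_SIZE;

// client project:
ClientChannel.MaxPackageSize := MAX_PACKAGE_SIZE;
```

Defining the limit once as a constant (or reading it from shared configuration) avoids exactly the mismatch that caused the hang here.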


I did originally get the package-size error, which is why I increased the server setting, but I neglected to increase the client one to match.

Three questions occur, however.

Firstly, why did other files I tested of a similar size work without issue?
Secondly, why was I even able to submit the initial update and have the server create the record? Surely that should have failed too?

Lastly, is there any way to optimise performance when transmitting larger blob fields like this?
Even 15MB of data would take a small fraction of a second to transmit on a gigabit Ethernet network, yet retrieving a record containing this blob field through the framework is taking over a second.
Is the overhead in serialisation/deserialisation causing this?



It looks like that, after compression, their size is less than 10 MB.
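To illustrate the point: whether a payload fits under the limit depends on its compressed size, which is determined by the content, not the raw size. A small Delphi console sketch using the standard System.ZLib unit (not the framework’s own compression machinery, but the principle is the same):

```delphi
program CompressedSizeDemo;
{$APPTYPE CONSOLE}

uses
  System.SysUtils, System.ZLib;

procedure Report(const Name: string; const Data: TBytes);
var
  Zipped: TBytes;
begin
  ZCompress(Data, Zipped);
  Writeln(Format('%s: %d bytes -> %d bytes compressed',
    [Name, Length(Data), Length(Zipped)]));
end;

var
  Buf: TBytes;
  I: Integer;
begin
  SetLength(Buf, 15 * 1024 * 1024); // ~15 MB, like the problem TIFF

  // Highly repetitive content deflates to a tiny fraction of its size,
  // easily fitting under a 10 MB MaxPackageSize.
  FillChar(Buf[0], Length(Buf), Ord('A'));
  Report('repetitive', Buf);

  // Noisy or already-compressed content barely shrinks at all,
  // so it can still exceed the 10 MB limit after compression.
  for I := 0 to High(Buf) do
    Buf[I] := Byte(Random(256));
  Report('noisy', Buf);
end.
```

A TIFF whose image data is already internally compressed behaves like the noisy case and can stay above a 10 MB limit, while other files of the same raw size may deflate to well below it.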

Not quite. The client side checked the package-size limit on the server side and detected that it was 100 MB, so the request’s size passed this check and the request was sent to the server.

You have a gigabit Ethernet network, so you can try disabling compression for the bin message on both sides. Performance may increase because the initial stream won’t need to be compressed.

Also, you store the data in a DB. The first request will always have a delay: it has to establish a connection to the DB server. The second request should be a bit faster, since the connection to the DB server can be taken from the cache.

Ok thanks.

Rather than starting another thread, can I quickly pick your brain on the compression setting?

Specifically, how do the server and client side compression settings interact? Is it the case that compression will only be used if both server and client messages have it enabled?

My testing would seem to suggest this is the case, so I could leave it enabled on the client and control whether it’s used by changing the server setting. Is that right?

My logic here is that I’d like to have a server-side setting to control whether compression is enabled.
This is fine for the server but then the client would need to retrieve this setting from the server which leads to a slight catch-22 situation where it needs to communicate with the server before it knows whether compression should be enabled or not.

If I could just leave it enabled on the client and control it from the server, that would be ideal, but I don’t know if this is safe or even advisable.


More testing suggests that the compression setting controls whether compression is used on outbound messages. So if the client has it enabled but the server has it disabled, compression will be applied to messages from client to server but not the other way around. Is this the case?


The compression settings on the client and server sides don’t interact, so they can have any combination of values on the two sides: both enabled, both disabled, or different values.

Each setting just controls how that side’s output stream will be sent: as-is or compressed.

The compression property has no influence on the incoming stream.

Cool, thanks.