We’ve encountered a very strange issue, reported by a customer.
One of our tables contains a binary column into which images are loaded. The customer is experiencing problems with some specific images whereby the software just hangs.
I have spent a while trying to create a small test program which reproduces the issue, and have succeeded. Could you advise how I can send this to you privately? I don’t wish to attach it publicly here, since it contains one of the customer’s images.
Fundamentally, when we load the errant image into the client data table and then attempt to post the update to the server, the server appears to receive the new record and correctly insert it into the database, but the client just hangs. I am not sure whether the problem occurs in the server after applying the update, or in the client.
If you then restart the client and try to just open the table, now that it has the errant image stored in a record, the client again just hangs indefinitely.
Whilst the image in question is quite large (a circa 15MB TIFF file), this shouldn’t be a problem per se; indeed, testing with other random files of this size or larger causes no problems.
There appears to be something about this particular file that is causing the problem and we’re at a loss…
We are using Delphi 10.2.3 with DA 10.0.0.1521. We are using MSSQL with FireDAC.
I did originally get the package size error, which is why I increased the server setting, but I neglected to increase the client one to match.
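In case it helps to picture what I mean by the mismatch, here is a toy model (plain Python, not Data Abstract code; all names and the receiver-side check are my own illustrative assumptions, not the framework’s actual implementation) of how a receiver-side size limit would produce exactly the asymmetry we saw:

```python
# Toy model of a receiver-side package-size check. Hypothetical names
# throughout; this is NOT how Data Abstract is actually implemented.

SERVER_MAX_PACKAGE = 50 * 1024 * 1024  # raised server-side limit
CLIENT_MAX_PACKAGE = 10 * 1024 * 1024  # client limit left at the old value

def receive(payload: bytes, receiver_limit: int) -> str:
    """The receiver rejects any incoming package larger than its own limit."""
    if len(payload) > receiver_limit:
        return "rejected: package size exceeds limit"
    return "accepted"

image = b"\x00" * (15 * 1024 * 1024)  # ~15MB blob

# Upload (client -> server) is checked against the raised server limit.
print(receive(image, SERVER_MAX_PACKAGE))  # accepted, so the insert succeeds

# Download (server -> client) is checked against the unchanged client limit.
print(receive(image, CLIENT_MAX_PACKAGE))  # rejected, so the read-back fails
```

Under that assumption, the insert succeeding while the subsequent read hangs would be exactly what you’d expect from only raising the server-side limit.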
Three questions occur, however.
Firstly, why did other files I tested of a similar size work without issue?
Secondly, why was I even able to submit the initial update and have the server create the record? Surely that should have failed too?
Lastly, is there any way to optimise performance when transmitting larger blob fields like this?
Even 15MB of data would take a small fraction of a second to transmit on a gigabit Ethernet network, yet retrieving a record containing this blob field through the framework is taking over a second.
Is the overhead in serialisation/deserialisation causing this?
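For reference, here is the back-of-envelope arithmetic behind my “small fraction of a second” claim, ignoring protocol overhead, latency and framing:

```python
# Raw transmission time for a ~15MB blob on gigabit Ethernet,
# ignoring protocol overhead, round trips and framing.

payload_bytes = 15 * 1024 * 1024          # ~15MB image
link_bits_per_second = 1_000_000_000      # gigabit Ethernet

wire_time = payload_bytes * 8 / link_bits_per_second
print(f"{wire_time:.3f}s")                # prints "0.126s"
```

So the wire itself accounts for roughly an eighth of a second; the remainder of the one-second-plus retrieval time must be going somewhere else.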
Rather than starting another thread, can I quickly pick your brain on the compression setting?
Specifically, how do the server and client side compression settings interact? Is it the case that compression will only be used if both server and client messages have it enabled?
My testing would seem to suggest so, meaning I could leave it enabled on the client and control whether it’s actually used purely by changing the server setting. Is that right?
My logic here is that I’d like to have a server-side setting to control whether compression is enabled.
This is fine for the server, but the client would then need to retrieve the setting from the server, which leads to a slight catch-22: the client must communicate with the server before it knows whether compression should be enabled.
If I could just leave it enabled on the client and control it from the server, that would be ideal, but I don’t know if this is safe or even advisable.
More testing suggests that the compression setting controls whether compression is used on outbound messages: if the client has it enabled but the server has it disabled, compression is applied to messages from client to server but not the other way around. Is that correct?
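To make sure we’re describing the same thing, here is the behaviour I believe I’m observing, expressed as a toy model (plain Python with hypothetical names, not the actual DA implementation): each endpoint compresses its own outbound messages according to its own local setting, and inbound messages are decompressed whenever they arrive compressed.

```python
import zlib

# Toy model of per-direction compression. Assumption being tested: each
# endpoint compresses its OWN outbound traffic iff its local setting is on,
# while decompression of inbound traffic is always available.

def send(payload: bytes, sender_compression_enabled: bool) -> tuple[bytes, bool]:
    """Return the wire bytes plus a flag saying whether they were compressed."""
    if sender_compression_enabled:
        return zlib.compress(payload), True
    return payload, False

def receive(wire: bytes, was_compressed: bool) -> bytes:
    return zlib.decompress(wire) if was_compressed else wire

client_enabled, server_enabled = True, False

request = b"request body " * 1000
wire, flag = send(request, client_enabled)   # client -> server: compressed
assert flag and receive(wire, flag) == request

response = b"response body " * 1000
wire, flag = send(response, server_enabled)  # server -> client: uncompressed
assert not flag and wire == response
```

If that model matches reality, then leaving the client setting enabled and toggling only the server setting would control compression of server-to-client traffic only.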