Session creation and deletion: should there be symmetry between these two calls?

Hello,

I am trying to track down what appears to be a memory leak somewhere in my server, which is especially apparent when our unit test suite runs the “concurrent users” tests. These tests create lots of different threads, and each thread makes several calls to the server.

So far I haven’t found anything obvious, but there is one thing that I find odd. I have extensive logging on the server and I log each call of MemSessionManagerSessionCreated. At the end of the concurrent users tests I can see 18,007 calls to that event. I also log the SessionDeleted event, and of those I can only see 602 calls. So there is a clear difference between the two, and maybe that explains why the memory usage of the server doesn’t go back to baseline when running those tests.

Now, there is another strange thing, and I think it could be the reason behind the asymmetry: most of the calls to SessionCreated have a SessionId of 00000000-0000-0000-0000-000000000000, but all of the calls to SessionDeleted have a non-empty SessionId.

On the client side I set the Message.ClientId property to the SessionId I got from the Login function, which is called at the beginning of each threaded test. All the methods called on the server require a session and they don’t fail; in the methods where I log that information I can see the expected SessionId… so I am not sure where those sessions with an empty Id are created, and I definitely can’t see them being destroyed…

On the other hand, I have a counter for the number of active sessions and it shows an expected number (a maximum of around 115 sessions), nowhere near the number of calls I see in the log…

Am I looking at this wrong? Is this expected?

The client and the unit tests are on .NET and the server is built with C++Builder. We are using the latest version, 1623.

Some more info:

The baseline as shown by Process Explorer before any client connects is around 250 MB.

Originally I had the client connect using the SuperTCP transport. In that case, after finishing the tests, the server stays at around 600 MB of Private Bytes.

I changed the transport to TCP and now it’s higher: around 960 MB after running the same tests.

In both cases the memory stays like that no matter what. The client has already disconnected and its process has closed, and the session expiry time has already been reached (although the session count is 0 after the client finishes anyway, so there don’t appear to be lingering sessions). It seems to be something related to the transport channels… and the number of connections? Any ideas?

Hi,

So the server side uses Remoting SDK for Delphi and the client side uses Remoting SDK for .NET. Ok.

This is as expected. A session is usually deleted after SessionDuration (15 min by default) expires, after an unsuccessful login, or when you manually call service.DestroySession.

You can also log service.OnActivate and service.OnDeactivate. On the other hand, service.OnGetDispatchInfo can provide access to client info such as the IP address, etc.
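As a sketch of that kind of logging, a pair of matched counters can be hung on the activation events to compare against the SessionCreated/SessionDeleted numbers. The handler names and signatures below are hypothetical placeholders (use whatever your IDE generates for these events), not copied from the SDK, and LogMessage stands in for your own logger:

var
  ActivateCount, DeactivateCount: Integer; // global counters, updated atomically

procedure TNewService.NewServiceActivate(const aClientID: TGUID); // hypothetical signature
begin
  // Count every activation and record the ClientID the server actually sees,
  // so empty-Guid calls show up explicitly in the log.
  AtomicIncrement(ActivateCount);
  LogMessage('Activate: ' + GUIDToString(aClientID));
end;

procedure TNewService.NewServiceDeactivate(const aClientID: TGUID); // hypothetical signature
begin
  AtomicIncrement(DeactivateCount);
end;

After a test run the two counters should match; a persistent gap would point at calls that never complete normally.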

That is weird: Session.SessionID is read-only and cannot be changed.

Again, a session is deleted only after it expires; an active session can be reused in subsequent calls.

You can use DestroySession like this:

procedure TNewService.NewMethod;
begin
   // do something
   DestroySession; // session will be compulsorily deleted after method call is finished
end;

In this case, you would see 18,007 calls to the SessionDeleted event.

If you run your tests several times, does the memory stay the same or does it constantly increase?
I can suggest using FastMM5 (or FastMM4) on the server side; it can detect and log memory leaks.
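For a first pass without wiring in the full FastMM sources, the RTL’s built-in leak report (based on FastMM4 in recent Delphi/C++Builder versions) can be switched on in the server project file; a minimal sketch for a Delphi-style .dpr:

begin
  // Show a report of any leaked blocks when the server process shuts down.
  // For detailed logs (call stacks, a leak log file), use the full FastMM4
  // sources with FullDebugMode, or FastMM5, as the first unit in the uses clause.
  ReportMemoryLeaksOnShutdown := True;
  Application.Initialize;
  Application.Run;
end.

Note that leaks reported this way only cover blocks still allocated at shutdown; memory held by caches or pools that is freed on exit will not show up.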

Thanks for your reply, Evgeny.

I am not quite sure I follow regarding the discrepancy between SessionCreated and SessionDeleted. Shouldn’t all created sessions be deleted, either because they timed out or because they were manually deleted somewhere?

It would appear that at first a session is created by the SessionManager when it receives a request, which for some reason has the SessionId/ClientId from the client set to an empty Guid; but then, when the methods are called, the SessionId/ClientId is correctly populated, and those are the sessions that eventually get deleted.

From the client I am setting the Message.ClientId property to the SessionId returned by the Login method, so the only method that gets called without a ClientId set is Login itself. The server logs confirm this as far as I can see: to begin with, all the methods except Login and a few informational ones need a session to be present, and they don’t fail; and I log the ClientId on some of those methods and I can see that it is not an empty Guid.

If you run your tests several times, does the memory stay the same or does it constantly increase?
I can suggest using FastMM5 (or FastMM4) on the server side; it can detect and log memory leaks.

If I don’t run the concurrent tests (which, as I said, produce up to ~120 concurrent connections) the memory stays constant at ~250 MB at the end, which is the starting baseline. If I run the concurrent tests several times in a row, the memory stays constant at around 600 MB (when using SuperTCP). So it doesn’t appear to consume more memory if I run the tests several times, but it doesn’t go back to the starting baseline of 250 MB. I have waited out the timeout period with the server running and it doesn’t go back either.

Now, all the tests are run from the same computer the server is running on; the concurrent ones just fire up threads. Right now I don’t need to identify the client or anything like that.

Also, I am not sure manually deleting the session is the point: the method calls are made within a “valid” session, which gets destroyed on Logout. I think “the problem” is the sessions that get created with an empty Guid, which don’t appear to be destroyed… but I don’t know what else to log or try in order to pinpoint the issue.

Thanks!

Hi,

Each session has a certain lifetime during which it can receive events.
By default, the session duration is set to 15 min, and the session is reused when the client accesses the server within this period.

What message type and client channel were used?
Can you show your code where you set Message.ClientId but the server side receives 00000000-0000-0000-0000-000000000000, please?
AFAIK, for super channels you should assign channel.ClientID instead of message.ClientID.

It depends on the memory manager. The standard one isn’t very good in some cases; third-party ones like FastMM, Nexus, etc. can do memory management and optimization a bit better.

I am using SuperTCP. I have always changed only the Message.ClientID (I remember asking about this several years ago, and my understanding was that this was enough). I have just changed the code to try to set the ClientID on the Channel as well, but it produces an exception at runtime that says: `ClientID cannot be changed while the server connection is open`. I don’t know if internally the ClientID of the channel is updated with the “current” session one… I guess I can log that and see if that’s the case… and yes, from what I can see, the Channel.ClientID corresponds to the SessionID I get from the Login calls.

Here, each thread gets an instance of a “connection” class from a pool. The channel is connected to the server in the constructor of that class, and the class has a property that sets the ClientID of the Message object. When the Login is made I change the ClientID on that instance. All of the thread’s execution is done with the same instance of the connection class, and when it finishes, the instance is released back to the pool (and there the ClientID is set to Guid.Empty). The next time this instance is used, the caller code calls the Login method, sets the ClientID, and so on.

As expected: you can change ClientID for an inactive channel only.

For super channels, channel.ClientID is needed for establishing the connection, i.e. when the hello package is generated.

Ok. So, as far as I can see, there shouldn’t be any messages apart from the hello package with the ClientID set to Guid.Empty before the Login: afterwards I am using the same channel, and the service instances are created afterwards (except the one that handles the Login) with both Message.ClientID (manually) and Channel.ClientID (internally by RO SDK) set to the session generated by the Login method.

I am not really sure what else to try here. I feel the “problem” (if there is a problem at all) is in the server channel’s connection handling. It certainly doesn’t appear to be a memory leak, as I can run these tests consecutively without any extra memory being consumed. But it is clear that this test, with its hundreds of connections, uses extra memory on the server that is not released even after the session timeout window has passed… several times over. And it appears that the non-super channels (TCP is the only one I tried) use even more memory.

For me it’s not a big deal; I only noticed this because one of the other tests uses a lot of memory and, since this is a 32-bit server, that test fails if the concurrent one was run first. In real life neither of those two scenarios is likely to happen, at least for the time being.

If there is anything we could try here please let me know; for now I am putting a pin in this.

Thanks

Hi,

This logic is used in the super TCP channel:

// Borrow Client ID from the message properties
if (this.ClientID == Guid.Empty)
{
    this.ClientID = message.ClientID;
}
...
message.ClientID = this.ClientID;

You can try using SetProcessWorkingSetSize as described at https://stackoverflow.com/questions/2031577/can-memory-be-cleaned-up ; it may clean up memory in your case.
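On the Delphi/C++Builder side, a minimal sketch of that Win32 call, assuming Winapi.Windows is available in the uses clause:

uses
  Winapi.Windows;

procedure TrimWorkingSet;
begin
  // Passing (SIZE_T)-1 for both the minimum and maximum sizes asks Windows
  // to page out as much of the process working set as possible.
  SetProcessWorkingSetSize(GetCurrentProcess, SIZE_T(-1), SIZE_T(-1));
end;

Keep in mind that this trims the Working Set counter, not Private Bytes, so the figure you are watching in Process Explorer may not move.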