As the title says, I have a WCF server with this service behavior defined:
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single, ConcurrencyMode = ConcurrencyMode.Multiple)]
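For context, the service is hosted as a singleton over a named pipe endpoint; a simplified sketch of such a host (DatabaseService is just a placeholder name, not my actual implementation) would look roughly like this:
ServiceHost host = new ServiceHost(new DatabaseService()); // a singleton instance is required for InstanceContextMode.Single
host.AddServiceEndpoint(typeof(IDatabaseSession), new NetNamedPipeBinding(), "net.pipe://localhost/DatabaseService");
host.Open(); // start listening on the named pipe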
I'm using a named pipe binding, and my clients connect like this:
NetNamedPipeBinding binding = new NetNamedPipeBinding();
const int maxValue = 0x40000000; // 1GB
binding.MaxBufferSize = maxValue;
binding.MaxReceivedMessageSize = maxValue;
binding.ReaderQuotas.MaxArrayLength = maxValue;
binding.ReaderQuotas.MaxBytesPerRead = maxValue;
binding.ReaderQuotas.MaxStringContentLength = maxValue;
// receive timeout acts like a general timeout
binding.ReceiveTimeout = TimeSpan.MaxValue;
binding.SendTimeout = TimeSpan.MaxValue;
ChannelFactory<IDatabaseSession> pipeFactory = new ChannelFactory<IDatabaseSession>(binding, new EndpointAddress("net.pipe://localhost/DatabaseService"));
IDatabaseSession dbSession = pipeFactory.CreateChannel();
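For completeness, each client later tears the channel down with the usual WCF cleanup pattern (a sketch; the actual service calls are omitted):
IClientChannel channel = (IClientChannel)dbSession; // every generated proxy also implements IClientChannel
try
{
    // ... calls on dbSession happen here ...
    channel.Close(); // graceful shutdown, releases the pipe connection
}
catch (CommunicationException)
{
    channel.Abort(); // force-release the channel resources on failure
}
catch (TimeoutException)
{
    channel.Abort();
}
pipeFactory.Close(); // the factory holds resources of its own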
Every client I start executes the code above, and with each client the CPU usage rises by about 25% (except for the fifth client, but by that point the service executable is already consuming nearly 100% of the total CPU capacity).
What I'm looking for is some kind of resource (a website, a list, or just YOUR powerful knowledge) telling me what CreateChannel actually does in terms of resource allocation.
Hint: the CPU usage increases even if no communication takes place at all; merely creating the channel is enough.
Pause the process in the debugger and see where the threads have stopped; that is most likely the hot part of your code. Examine the call stacks.
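If attaching a debugger is awkward, a rough alternative (just a sketch; "DatabaseService" is a placeholder for the actual process name, and it needs System.Diagnostics) is to sample per-thread CPU time of the service process and watch how many threads start spinning after each CreateChannel:
Process service = Process.GetProcessesByName("DatabaseService")[0]; // assumes the service process is running
foreach (ProcessThread t in service.Threads)
{
    // threads whose TotalProcessorTime keeps growing are the ones burning CPU
    Console.WriteLine("Thread {0}: {1} CPU time, state {2}", t.Id, t.TotalProcessorTime, t.ThreadState);
}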