 

Seeking tutorials and information on load-balancing between threads

I know the term "Load Balancing" can be very broad, but the subject I'm trying to explain is more specific, and I don't know the proper terminology. What I'm building is a set of Server/Client applications. The server needs to be able to handle a massive amount of data transfer, as well as client connections, so I started looking into multi-threading.

There are essentially three ways I can see to implement threading for the server:

  • One thread handling all requests (defeats the purpose of threading if 500 clients are logged in)
  • One thread per user (risky, since 500 clients means 500 threads)
  • A pool of threads that divides the work evenly among any number of clients (what I'm seeking)

The third is the one I'd like to know more about. The setup would look like this:

  • Maximum 250 threads running at once
  • 500 clients will not create 500 threads, but share the 250
  • A queue holds pending requests until a thread is free to take one
  • A thread is not tied down to a client, and vice-versa
  • Server decides which thread to send a request to based on activity (load balance)
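
The setup described above is usually called a thread pool fed by a work queue (a producer-consumer arrangement). Below is a minimal sketch of the pattern, in Python only for compactness; the same structure maps onto Delphi threads plus a guarded list, and the worker count and payloads are illustrative:

```python
import queue
import threading

NUM_WORKERS = 4          # stands in for the 250-thread cap
tasks = queue.Queue()    # pending requests, shared by all workers
results = []
results_lock = threading.Lock()

def worker():
    # Each worker pulls the next pending request; no thread is tied to a client.
    while True:
        client_id, payload = tasks.get()
        if client_id is None:              # sentinel: shut this worker down
            tasks.task_done()
            return
        with results_lock:
            results.append((client_id, payload.upper()))  # stand-in for real work
        tasks.task_done()

workers = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for w in workers:
    w.start()

# 20 "clients" share the 4 worker threads
for cid in range(20):
    tasks.put((cid, "request"))

tasks.join()                               # block until the queue drains
for _ in workers:
    tasks.put((None, None))                # one sentinel per worker
for w in workers:
    w.join()

print(len(results))                        # → 20
```

Note that the "load balancing" falls out for free: idle threads pull from the shared queue, so work goes to whichever thread is free rather than the server picking a thread.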

I'm not seeking code just yet, but rather information on how a setup like this works, and preferably a tutorial on accomplishing it in Delphi (XE2). Even the proper term for this subject would be enough for me to do the searching myself.

EDIT

I found it necessary to explain a little about what this will be used for. I will be streaming both commands and images in a double-socket setup: one "Main Command Socket" and a second "Add-on Image Streaming Socket". So really, one connection is two socket connections.

Each connection to the server's main socket creates (or re-uses) an object representing all the data needed for that connection, including threads, images, settings, etc. For every connection to the main socket, a streaming socket is also connected. It's not always streaming images, but the command socket is always ready.

The point is that I already have a threading mechanism in my current setup (one thread per session object) and I'd like to shift that over to a pool-based multithreading environment. The two connections together require higher-level control over these threads, and I can't rely on something like Indy to keep them synchronized; I'd rather understand how things work than learn to trust something else to do the work for me.
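
One way to bridge the two models is to keep the per-connection session object as the owner of all state (sockets, images, settings) while its work is dispatched to a shared pool. A hypothetical sketch of that split, in Python for brevity (the `Session` fields and `handle_command` helper are illustrative, not taken from the original setup):

```python
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass, field

@dataclass
class Session:
    # One object per main-socket connection; it owns state, not a thread.
    client_id: int
    settings: dict = field(default_factory=dict)
    handled: list = field(default_factory=list)

pool = ThreadPoolExecutor(max_workers=4)   # one pool shared by every session

def handle_command(session, command):
    # Runs on whichever pool thread is free; the session carries per-client state.
    session.handled.append(command)
    return (session.client_id, command)

sessions = [Session(client_id=i) for i in range(8)]
futures = [pool.submit(handle_command, s, "PING") for s in sessions]
done = [f.result() for f in futures]
pool.shutdown()

print(len(done))   # → 8
```

The design point is that threads become interchangeable workers, and everything a request needs travels with the session object it belongs to.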

Jerry Dodge asked Feb 27 '12




2 Answers

IOCP server. It's the only high-performance solution. It's essentially asynchronous in user mode ('overlapped I/O' in M$-speak): a pool of threads issues WSARecv, WSASend, and AcceptEx calls and then all wait on an IOCP queue for completion records. When something useful happens, a kernel thread pool performs the actual I/O and then queues up the completion records.
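
The dispatch shape of that design can be sketched without any real Winsock calls. In the sketch below, a `queue.Queue` stands in for the IOCP and tuples stand in for completion records; this shows only the threading pattern, not overlapped I/O itself:

```python
import queue
import threading

completion_port = queue.Queue()   # stands in for the IOCP; real code would
                                  # block in GetQueuedCompletionStatus here

def worker(results):
    # All pool threads wait on the same completion queue; each dequeued
    # record says which operation finished and how many bytes moved.
    while True:
        record = completion_port.get()
        if record is None:                 # sentinel: shut down
            return
        op, nbytes = record
        results.append((op, nbytes))       # stand-in for resuming the protocol

results = []
threads = [threading.Thread(target=worker, args=(results,)) for _ in range(3)]
for t in threads:
    t.start()

# Pretend the kernel completed three overlapped operations:
for rec in [("recv", 128), ("send", 64), ("accept", 0)]:
    completion_port.put(rec)

for _ in threads:
    completion_port.put(None)
for t in threads:
    t.join()
```

The key property is that no thread is dedicated to a socket: whichever thread dequeues a completion record carries on with that connection's work.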

You need at least a buffer class and a socket class (and, for high performance, probably objectPool and pooledObject classes as well, so you can make socket and buffer pools).
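
The objectPool idea reduces to a small acquire/release class. A minimal sketch (class and method names here are illustrative, in Python for brevity); the same shape works for a socket pool by swapping the factory:

```python
import threading

class ObjectPool:
    """Hands out a free object, creating up to max_size; release() returns one."""
    def __init__(self, factory, max_size):
        self._factory = factory
        self._max = max_size
        self._free = []
        self._created = 0
        self._cond = threading.Condition()

    def acquire(self):
        with self._cond:
            while not self._free and self._created >= self._max:
                self._cond.wait()      # everything in use: block until a release
            if self._free:
                return self._free.pop()
            self._created += 1
            return self._factory()

    def release(self, obj):
        with self._cond:
            self._free.append(obj)
            self._cond.notify()        # wake one blocked acquire()

buffers = ObjectPool(lambda: bytearray(4096), max_size=2)
a = buffers.acquire()
b = buffers.acquire()
buffers.release(a)
c = buffers.acquire()    # reuses a's buffer instead of allocating a third
print(c is a)            # → True
```

Pooling like this matters in a high-throughput server because it avoids constant allocation and freeing of buffers on the hot path.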

Martin James answered Oct 18 '22


500 threads may not be an issue on a server-class computer. A thread blocked on TCP I/O costs very little while it waits for data to arrive.

There's nothing stopping you from creating some type of work queue on the server side, served by a limited-size pool of threads. A simple thread-safe TList works great as a queue, and you can easily put a message handler on each server thread for notifications.
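
The thread-safe-list-as-queue idea boils down to a plain list guarded by a lock, plus a condition variable for the "work arrived" notification. A sketch in Python (names illustrative); a thread-safe TList plus a per-thread message handler plays the same role in Delphi:

```python
import threading

class LockedListQueue:
    # A plain list guarded by a lock, with a condition variable used to
    # notify a waiting worker that an item has arrived.
    def __init__(self):
        self._items = []
        self._cond = threading.Condition()

    def push(self, item):
        with self._cond:
            self._items.append(item)
            self._cond.notify()            # wake one waiting worker

    def pop(self):
        with self._cond:
            while not self._items:
                self._cond.wait()          # sleep until push() notifies
            return self._items.pop(0)      # FIFO order

q = LockedListQueue()
seen = []

def worker():
    while True:
        item = q.pop()
        if item is None:                   # sentinel: stop
            return
        seen.append(item)

t = threading.Thread(target=worker)
t.start()
for i in range(5):
    q.push(i)
q.push(None)
t.join()

print(seen)    # → [0, 1, 2, 3, 4]
```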

Still, at some point you may have too much work, or too many threads, for the server to handle. This is usually handled by adding another application server.

To ensure scalability, code for the idea of multiple servers, and you can keep scaling by adding hardware.

There may be some reason to limit the number of actual worker threads, such as limiting lock contention on a database. In general, though, you distribute work by adding threads and let the hardware (CPU, redirector, switch, NAS, etc.) schedule the load.

Marcus Adams answered Oct 18 '22