
How can multithreading speed up an application (when threads can't run concurrently)?

I'm learning about multithreading, but after reading some tutorials I'm sort of confused. I don't understand how multithreading can speed up an application.

By intuition, I would say multithreading slows down an application, because you constantly have to wait for those semaphores.

How and when can multithreading speed up an application, when threads can't run concurrently?

xcrypt asked Mar 07 '12

People also ask

How can multithreading speed up certain computing tasks?

Multithreading allows the app to keep working properly while executing a long task. It is a way to achieve concurrent computing on your device: it works as if there were multiple processors within a single processor core.

How does multithreading make faster?

In many cases, multithreading gives excellent results for I/O-bound applications, because you can do multiple things in parallel rather than blocking your entire app while waiting for a single I/O operation. This is also the most common case where using more threads than CPU cores is beneficial.

How multithreading improves performance over a single threaded solution?

In a multiprocessor architecture, multithreading lets each thread run on a different processor in parallel. This increases the concurrency of the system, in direct contrast to a single-processor system, where only one process or thread can run on a processor at a time.

Why use multithreading in your application?

The main reason for incorporating threads into an application is to improve its performance, and performance can be expressed in multiple ways: a web server, for example, will use multiple threads to process requests for data simultaneously.


3 Answers

because you constantly have to wait for those semaphores.

Only in a poorly designed program, or in one designed for parallel work but run on a single-processor machine. In a well-designed program, the threads do useful work in parallel in between the synchronization points, and enough of it to outweigh the overhead of synchronization.

Even without parallel (multicore/multiprocessor) processing, multithreading can be beneficial when the threads do blocking I/O. E.g., the good old CVSup programs used multithreading in the single-core era to make full use of network connections' duplex capabilities. While one thread was waiting for data to arrive over the link, another would be pushing data the other way. Due to network latency, both threads necessarily had to spend a lot of time waiting, during which the other thread could do useful work.
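
A minimal Python sketch of that pattern (hypothetical, not the CVSup code; the sleeps stand in for blocking network sends and receives): because both threads spend their time blocked rather than computing, the two waits overlap and the total wall time is roughly one wait, not two, even on a single core.

    import threading
    import time

    def push_data():
        # Stand-in for a blocking network send: the thread mostly waits.
        time.sleep(1.0)

    def pull_data():
        # Stand-in for a blocking network receive on the same link.
        time.sleep(1.0)

    start = time.time()
    threads = [threading.Thread(target=push_data),
               threading.Thread(target=pull_data)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Prints roughly 1.0, not 2.0: the waits overlap even on one core.
    print(f"elapsed: {time.time() - start:.2f}s")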

Fred Foo answered Dec 31 '22


There are two ways I can think of; the first is probably what you mean by "parallel threading":

  • If you have multiple CPUs or cores, they can work simultaneously if you're running multiple threads.
  • In the single-core case, if your thread ends up waiting on (synchronous) I/O, say you call read() to read 100 MB from tape, another thread can get scheduled and get work done while you wait; see the sketch below.
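
A small sketch of that second case, assuming CPython, where a thread blocked in I/O (here simulated with a sleep) releases the GIL so the main thread can keep computing in the meantime:

    import threading
    import time

    buffer = {}

    def slow_read():
        # Stand-in for a blocking read() from a slow device such as tape.
        time.sleep(1.0)
        buffer["data"] = b"\0" * 1024  # placeholder for the data read() would return

    reader = threading.Thread(target=slow_read)
    reader.start()

    # The main thread gets CPU-bound work done while the reader is blocked.
    checksum = sum(i * i for i in range(2_000_000))

    reader.join()
    print(checksum, len(buffer["data"]))
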
Joachim Isaksson answered Dec 31 '22


The idea behind multithreading is to have as few blocking points as possible. In other words, if a thread has to constantly wait on another thread to finish something, then the benefit of threads is likely lost in that situation.

Obligatory link: http://en.wikipedia.org/wiki/Amdahl's_law
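
A quick sketch of what that law says (the numbers are only illustrative): if a fraction p of the work can be parallelized across n threads, the best possible speedup is 1 / ((1 - p) + p / n), so the serial fraction caps the benefit no matter how many threads you add.

    def amdahl_speedup(p, n):
        # p: parallelizable fraction of the work, n: number of threads/cores.
        return 1.0 / ((1.0 - p) + p / n)

    print(amdahl_speedup(0.9, 4))     # ~3.08x on 4 cores
    print(amdahl_speedup(0.9, 1000))  # ~9.91x, approaching the 10x ceiling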

Also, as Mark Ransom said, if your hardware can't actually do more than one thing at once, then threads are really just logically running at the same time (the OS swaps between them) rather than actually running at the same time. That can still be useful in situations with blocking I/O, though.

Corbin answered Dec 31 '22