dispatch_apply takes a dispatch queue as a parameter, which allows you to choose which queue to execute the block on. My understanding is that DispatchQueue.concurrentPerform in Swift is meant to replace dispatch_apply. But this function does not take a dispatch queue as a parameter. After googling around, I found this GCD tutorial which has this code:
let _ = DispatchQueue.global(qos: .userInitiated)
DispatchQueue.concurrentPerform(iterations: addresses.count) { index in
// do work here
}
And explains:
This implementation includes a curious line of code: let _ = DispatchQueue.global(qos: .userInitiated). Calling this causes GCD to use a queue with a .userInitiated quality of service for the concurrent calls.
My question is, does this actually work to specify the QoS? If so, how?
It would kind of make sense to me that there is no way to specify the queue for this, because a serial queue makes no sense in this context and only the highest QoS really makes sense given that this is a synchronous blocking function. But I can't find any documentation as to why it is possible to specify a queue with dispatch_apply but impossible(?) with DispatchQueue.concurrentPerform.
The author’s attempt to specify the queue quality of service (QoS) is incorrect. concurrentPerform uses the current queue’s QoS if it can. You can confirm this by tracing through the source code:
concurrentPerform calls _swift_dispatch_apply_current.
_swift_dispatch_apply_current calls dispatch_apply with 0, i.e., DISPATCH_APPLY_AUTO, which is defined as a ...
... Constant to pass to dispatch_apply() or dispatch_apply_f() to request that the system automatically use worker threads that match the configuration of the current thread as closely as possible. When submitting a block for parallel invocation, passing this constant as the queue argument will automatically use the global concurrent queue that matches the Quality of Service of the caller most closely.
This can also be confirmed by following dispatch_apply’s call to dispatch_apply_f, in which using DISPATCH_APPLY_AUTO results in a call to _dispatch_apply_root_queue. If you keep tumbling down the rabbit hole of swift-corelibs-libdispatch, you’ll see that this actually does use a global queue with the same QoS as your current thread.
Bottom line, the correct way to specify the QoS is to dispatch the call to concurrentPerform to the desired queue, e.g.:
DispatchQueue.global(qos: .userInitiated).async {
DispatchQueue.concurrentPerform(iterations: 3) { (i) in
...
}
}
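To flesh that out, here is a minimal, self-contained sketch of the same pattern. The per-iteration work, the lock-based accumulation, and the semaphore used to wait for completion are illustrative additions of mine, not part of the pattern itself:

```swift
import Dispatch
import Foundation

// Sketch: choose the QoS for concurrentPerform by dispatching the call
// itself to the desired global queue. The squaring work and the
// NSLock-based aggregation are placeholders for real per-iteration work.
let lock = NSLock()
var total = 0
let done = DispatchSemaphore(value: 0)

DispatchQueue.global(qos: .userInitiated).async {
    // concurrentPerform blocks this queue's thread until every
    // iteration has finished, and inherits this queue's QoS.
    DispatchQueue.concurrentPerform(iterations: 100) { i in
        let partial = i * i        // placeholder per-iteration work
        lock.lock()
        total += partial           // serialize access to shared state
        lock.unlock()
    }
    done.signal()
}

done.wait()                        // wait for completion (demo only)
print(total)                       // prints 328350
```

Note that the semaphore is only there so a command-line demo doesn't exit early; in an app you would typically hand the result back via another async dispatch rather than blocking.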
This is easily verified empirically by adding a breakpoint and looking at the queue in the Xcode debugger.
Needless to say, the suggestion of adding the let _ = ... is incorrect. Consider the following:
DispatchQueue.global(qos: .utility).async {
let _ = DispatchQueue.global(qos: .userInitiated)
DispatchQueue.concurrentPerform(iterations: 3) { (i) in
...
}
}
This will run with “utility” QoS, not “user initiated”.
Again, this is easily verified empirically.
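If you'd rather not rely on the debugger, you can also observe the effective QoS programmatically. The following is a sketch of my own (not from the original answer): it records Thread.current.qualityOfService inside the iterations, which on Apple platforms reports .utility here, not .userInitiated, despite the let _ = ... line:

```swift
import Dispatch
import Foundation

// Sketch: observe the effective QoS inside the iterations. The
// `let _ = ...` line below has no effect on the QoS actually used.
let done = DispatchSemaphore(value: 0)
let lock = NSLock()
var observed: [QualityOfService] = []

DispatchQueue.global(qos: .utility).async {
    let _ = DispatchQueue.global(qos: .userInitiated)  // has no effect
    DispatchQueue.concurrentPerform(iterations: 3) { _ in
        lock.lock()
        observed.append(Thread.current.qualityOfService)  // .utility on Apple platforms
        lock.unlock()
    }
    done.signal()
}
done.wait()
print(observed.count)  // all 3 iterations ran
```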
See WWDC 2017 video Modernizing Grand Central Dispatch for a discussion about DISPATCH_APPLY_AUTO
and concurrentPerform
.