
GCD - does a serial queue require an `NSLock` or a memory barrier to synchronize work?

I read the Apple documentation on GCD queues and started to wonder what happens if I modify, say, an instance member of type NSMutableArray (which is not thread safe) from within a serial queue. The serial queue guarantees that the operations execute one at a time, but I still feel I need an @synchronized block or some other technique to force a memory barrier, since as far as I understand the tasks on my serial queue can be invoked on different threads. Is that correct? Here is a simple example:

@interface Foo : NSObject

-(void)addNumber:(NSNumber*)number;
-(void)printNumbers;
-(void)clearNumbers;

@end

@implementation Foo
{
   dispatch_queue_t _queue;
   NSMutableArray<NSNumber*>* _numbers;
}

-(instancetype)init
{
   if (self = [super init])
   {
       _queue = dispatch_queue_create(NULL, NULL);
       _numbers = [NSMutableArray array];
   }
   return self;
}

-(void)addNumber:(NSNumber*)number
{
   dispatch_async(_queue,
   ^{
       [_numbers addObject:number];
   });
}

-(void)printNumbers
{
   dispatch_async(_queue,
   ^{
       for (NSNumber* number in _numbers)
       {
           NSLog(@"%@", number);
       }
   });
}

-(void)clearNumbers
{
   dispatch_async(_queue,
   ^{
       _numbers = [NSMutableArray array];
   });
}
@end

As far as I understand, I could run into memory issues here if I call these methods from arbitrary threads. Or does GCD give some guarantees under the hood, so that I do not need to force memory barriers? Looking at the examples I did not find such constructs anywhere, but coming from C++ it would seem natural to touch the member variable under a lock.

Rudolfs Bundulis asked Mar 06 '23

1 Answer

If your queue is a serial queue, it will only allow one operation at a time, no matter which thread each block happens to run on. GCD also takes care of the necessary memory barriers when it dequeues blocks, so writes made by one block are visible to later blocks on the same queue even if they execute on different threads. Therefore, if every access to a resource goes through the queue, there's no need to further protect that resource with a lock or a semaphore. In fact, dispatch queues can be used as a locking mechanism in their own right, and for some applications that works quite well.
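For example, a synchronous read accessor could be added to the Foo class from the question. This is a minimal sketch, assuming a hypothetical countOfNumbers method that is not part of the original code:

-(NSUInteger)countOfNumbers
{
   // Runs on the same serial _queue as addNumber: and clearNumbers, so it can
   // never interleave with them; no additional lock is required.
   __block NSUInteger count = 0;
   dispatch_sync(_queue,
   ^{
       count = _numbers.count;
   });
   return count;
}

The one thing to avoid is calling such a synchronous accessor from a block that is already running on _queue: dispatch_sync targeting the serial queue you are currently executing on will deadlock.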

Now if your queue is a concurrent queue, that's a different story, since multiple operations can run at the same time on a concurrent queue. However, GCD provides the dispatch_barrier_sync and dispatch_barrier_async APIs. A block submitted through either of these functions makes the queue wait until all previously submitted operations finish before it executes, and prevents any further operations from running until it is done. In this way it temporarily makes the queue behave like a serial queue, allowing even a concurrent queue to be used as a sort of locking mechanism; for example, you can perform reads of a resource via a normal dispatch_sync call but do writes via dispatch_barrier_async. If reads occur very frequently and writes very infrequently, this can perform quite well.
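A minimal sketch of that read/write pattern, using a hypothetical SyncedStore class whose names are not part of the answer:

@interface SyncedStore : NSObject
-(id)objectForKey:(NSString*)key;
-(void)setObject:(id)object forKey:(NSString*)key;
@end

@implementation SyncedStore
{
   dispatch_queue_t _queue;     // concurrent queue used as a read/write lock
   NSMutableDictionary* _storage;
}

-(instancetype)init
{
   if (self = [super init])
   {
       _queue = dispatch_queue_create("SyncedStore", DISPATCH_QUEUE_CONCURRENT);
       _storage = [NSMutableDictionary dictionary];
   }
   return self;
}

-(id)objectForKey:(NSString*)key
{
   // Reads may run concurrently with each other on the concurrent queue.
   __block id object = nil;
   dispatch_sync(_queue,
   ^{
       object = _storage[key];
   });
   return object;
}

-(void)setObject:(id)object forKey:(NSString*)key
{
   // Barrier: waits for in-flight reads to drain, then runs exclusively.
   dispatch_barrier_async(_queue,
   ^{
       _storage[key] = object;
   });
}
@end

Reads can overlap with one another, while each barrier write briefly serializes the queue, which is why this pattern performs well for read-heavy workloads.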

Charles Srstka answered May 07 '23