
Avoid Data Race condition in swift

I am getting data races in my code when I run the Thread Sanitizer (TSan) tool. The same code is accessed from different queues and threads at the same time, so I cannot use a serial queue or a barrier: a queue will only serialize accesses made through that one queue, not accesses to the shared resource coming from other queues.

I tried objc_sync_enter(object) / objc_sync_exit(object) and locks (NSLock() and NSRecursiveLock()) to protect the shared resource, but these are also not working.
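
For context, this is roughly the lock-based pattern I tried (a simplified sketch; the type and property names are placeholders):

import Foundation

class SharedResource {
    private let lock = NSLock()
    private var _value: String = ""

    var value: String {
        get {
            // Hold the lock for the duration of the read.
            lock.lock()
            defer { lock.unlock() }
            return _value
        }
        set {
            // Take the same lock for the write so reads and writes never overlap.
            lock.lock()
            defer { lock.unlock() }
            _value = newValue
        }
    }
}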

However, when I use the @synchronized() directive in Objective-C to protect the shared resource, it works as expected and I no longer get race conditions in that particular block of code.

So, what is the alternative for protecting shared data in Swift, given that the @synchronized() keyword is not available in the Swift language?

PFA screenshot for reference - Data Race

Nishant Tyagi asked Aug 31 '18 05:08


People also ask

How can we avoid a data race?

The only way to avoid data races is to synchronize access to all mutable data that is shared between threads. There are several ways to achieve this. In Go, you would normally use a channel or a lock. (Lower-level mechanisms are available in the sync and sync/atomic packages.)

What is a solution to avoid a race condition?

The usual solution to avoid race condition is to serialize access to the shared resource. If one process gains access first, the resource is "locked" so that other processes have to wait for the resource to become available.

What is race condition in Swift?

A race condition occurs when the timing or order of events affects the correctness of a piece of code. A data race occurs when one thread accesses a mutable object while another thread is writing to it.

How can Singleton avoid race condition?

A data race can be avoided by making sure that at least one of the conditions required for a race does not hold, i.e. by making shared variables immutable or by properly synchronizing access to them.
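
As a concrete illustration of those two points in Swift (a sketch; the Counter name and queue label are just examples): the singleton instance itself is created thread-safely, because static let initialization is guaranteed to run only once, and its mutable state can then be protected by serializing access through a private queue.

import Foundation

final class Counter {
    // `static let` initialization is performed lazily and atomically by Swift,
    // so creating the singleton itself cannot race.
    static let shared = Counter()
    private init() {}

    // All access to the mutable state goes through this serial queue.
    private let queue = DispatchQueue(label: "Counter")
    private var _count = 0

    var count: Int {
        queue.sync { _count }
    }

    func increment() {
        queue.sync { _count += 1 }
    }
}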


1 Answer

I don't understand "I cannot use a serial queue or a barrier: a queue will only serialize accesses made through that one queue, not accesses to the shared resource coming from other queues." Using a queue is the standard solution to this problem.

import Foundation

class MultiAccess {
    private var _property: String = ""
    // A single serial queue guards all access to the backing storage.
    private let queue = DispatchQueue(label: "MultiAccess")

    var property: String {
        get {
            // Reads run synchronously on the queue, after any pending writes.
            var result: String!
            queue.sync {
                result = self._property
            }
            return result
        }
        set {
            // Writes are submitted asynchronously, so callers don't wait.
            queue.async {
                self._property = newValue
            }
        }
    }
}

With this construction, access to property is atomic and thread-safe without the caller having to do anything special. Note that this intentionally uses a single queue for the class, not a queue per-property. As a rule, you want a relatively small number of queues doing a lot of work, not a large number of queues doing a little work. If you find that you're accessing a mutable object from lots of different threads, you need to rethink your system (probably reducing the number of threads). There's no simple pattern that will make that work efficiently and safely without you having to think about your specific use case carefully.
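
For example, callers can hit the same instance from several queues without any extra synchronization on their side (a small usage sketch; the QoS classes and loop counts are arbitrary):

import Foundation

let shared = MultiAccess()
let group = DispatchGroup()

// Write from one background queue while reading from another.
DispatchQueue.global(qos: .userInitiated).async(group: group) {
    for i in 0..<1_000 {
        shared.property = "value \(i)"
    }
}
DispatchQueue.global(qos: .utility).async(group: group) {
    for _ in 0..<1_000 {
        _ = shared.property
    }
}

group.wait()
print("final value: \(shared.property)")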

But this construction is useful for problems where system frameworks may call you back on random threads with minimal contention. It is simple to implement and fairly easy to use correctly. If you have a more complex problem, you will have to design a solution for that problem.


Edit: I haven't thought about this answer in a long time, but Brennan's comments brought it back to my attention. Because of a bug in the original code, my original answer was OK, but with the bug fixed it was bad. (If you want to see my original code that used a barrier, look in the edit history; I don't want to put it here because people will copy it.) I've changed it to use a standard serial queue rather than a concurrent queue.

Don't create concurrent queues without thinking carefully about how threads will be created. If you are going to have many simultaneous accesses, you will create a lot of threads, which is bad. If you are not going to have many simultaneous accesses, then you don't need a concurrent queue. Apple's GCD talks make promises about thread management that GCD doesn't actually live up to; you definitely can get thread explosion (as Brennan mentions).
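
To make that concrete (a contrived sketch of the anti-pattern, not something to copy): if many tasks submitted to a concurrent queue block, GCD keeps spinning up worker threads to service the queue, and the thread count balloons.

import Foundation

// Anti-pattern: each blocked task can cause GCD to create another worker
// thread, so this can grow to dozens of threads on a real device.
let concurrent = DispatchQueue(label: "explosion-demo", attributes: .concurrent)
for _ in 0..<100 {
    concurrent.async {
        Thread.sleep(forTimeInterval: 2) // simulate blocking work
    }
}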

Rob Napier answered Sep 28 '22 01:09