
Value-based thread lock

Forgive me if this has been asked before. I have looked around a lot, but I get the feeling that I don't have the proper vocabulary to find this by searching the web.

I have a multithreaded application in Python. I want to be able to lock a certain block of code, but only against other threads that meet a certain condition. Let me give an example: there are three threads, thread_a, thread_b and thread_c, and each may run through the function foo at any time. I don't want any two threads with equal bar values to be in Code block ALPHA at the same time, but I don't want to block threads whose bar value is different. Say thread_a, with bar == "cat", hits line (3) first. If thread_b, also with bar == "cat", hits line (3) before thread_a reaches line (5), I would like thread_b to wait. But if thread_c comes along with bar == "dog", I would like it to keep going.

(1) def foo(bar):
(2)    
(3)     lock(bar)
(4)     # Code block ALPHA (two threads with equivalent bar should not be in here)
(5)     unlock(bar)

As another note, the possible values for bar are completely unpredictable but with a very high chance of collision.

Thank you for any help. The library I am looking at is Python's threading library.

asked Jun 03 '16 by Phillip Martin


People also ask

What does threading.Lock() do?

Once a thread has acquired the lock, all subsequent attempts to acquire the lock are blocked until it is released. The lock can be released using the release() method.
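
To make that concrete, here is a minimal sketch (not from the question, purely illustrative) of acquiring and releasing a threading.Lock, using a non-blocking second acquire so the demonstration doesn't hang:

import threading

lock = threading.Lock()

lock.acquire()                        # first acquisition succeeds immediately
print(lock.locked())                  # True: the lock is now held
print(lock.acquire(blocking=False))   # False: further attempts are refused while held
lock.release()                        # released; the next acquire() will succeed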

What is the difference between threading.Lock and threading.RLock?

A threading.Lock can only be acquired once, and once acquired it cannot be acquired again by the same thread or any other thread until it has been released. A threading.RLock can be acquired more than once by the same thread, although once acquired by a thread it cannot be acquired by a different thread until it has been released.
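
A small sketch of that difference (purely illustrative; the second acquire is non-blocking so the Lock case does not simply hang):

import threading

plain = threading.Lock()
plain.acquire()
print(plain.acquire(blocking=False))      # False: a Lock cannot be taken twice
plain.release()

reentrant = threading.RLock()
reentrant.acquire()
print(reentrant.acquire(blocking=False))  # True: the owning thread may re-enter
reentrant.release()
reentrant.release()                       # one release per acquire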

Can two threads acquire the same lock?

Typically, threads cannot acquire locks twice in a row: a thread must release an acquired lock before attempting to acquire it again. However, reentrant locks can be acquired multiple times by the same thread. Reentrant locks allow code to acquire a lock before calling other functions that acquire the same lock.
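
For example, a hedged sketch (the Counter class and its methods are invented for illustration) of a method that already holds an RLock calling a helper that takes the same lock again:

import threading

class Counter:
    def __init__(self):
        self._lock = threading.RLock()
        self.value = 0

    def increment(self):
        with self._lock:        # re-acquired by the thread that already holds it
            self.value += 1

    def add_two(self):
        with self._lock:        # acquired first here
            self.increment()    # would deadlock if _lock were a plain Lock
            self.increment()

c = Counter()
c.add_two()
print(c.value)  # 2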

Do locks block waiting threads?

With locking, deadlock happens when threads acquire multiple locks at the same time, and two threads end up blocked while holding locks that they are each waiting for the other to release. The monitor pattern unfortunately makes this fairly easy to do.
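
The classic two-lock version of that scenario, sketched with invented names (the workers are defined but not started, since actually running them in two threads can hang):

import threading

lock_a = threading.Lock()
lock_b = threading.Lock()

def worker_1():
    with lock_a:
        with lock_b:   # waits forever if worker_2 already holds lock_b
            pass

def worker_2():
    with lock_b:
        with lock_a:   # waits forever if worker_1 already holds lock_a
            pass

# Started in separate threads, each worker can end up holding one lock
# while waiting for the other, and neither ever proceeds.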


2 Answers

Updated

Good news: I was able to reproduce the release_lock problem you encountered using my original answer via a somewhat crude testbed I cobbled together, and to fix the issue using a counting mechanism (as you suggested), at least as far as I can tell with my testing apparatus.

Now two separate shared dictionaries are used, one to keep track of the "names" or values associated with each lock as before, and another to keep track of how many threads are using each one at a given time.

As before, lock names must be hashable values so they can be used as keys in dictionaries.

import threading

namespace_lock = threading.Lock()
namespace = {}
counters = {}

def acquire_lock(value):
    # Register this value under the shared namespace_lock, then block on the
    # per-value lock outside it so threads with other values are not held up.
    with namespace_lock:
        if value in namespace:
            counters[value] += 1
        else:
            namespace[value] = threading.Lock()
            counters[value] = 1

    namespace[value].acquire()

def release_lock(value):
    # The last thread out for this value removes its lock and counter from the
    # shared dictionaries; otherwise just decrement the usage count.
    with namespace_lock:
        if counters[value] == 1:
            del counters[value]
            lock = namespace.pop(value)
        else:
            counters[value] -= 1
            lock = namespace[value]

    lock.release()

# sample usage    
def foo(bar):
    acquire_lock(bar)
    # Code block ALPHA (two threads with equivalent bar should not be in here)
    release_lock(bar)
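
For what it's worth, a crude check along the lines of the testbed mentioned above might look like this (the worker function, the bar values, and the thread count are invented for illustration; it assumes the definitions above are in the same module):

import random
import time

active = {}  # bar value -> number of threads currently inside block ALPHA

def worker(bar):
    acquire_lock(bar)
    try:
        # Code block ALPHA: at most one thread per bar value should be here.
        active[bar] = active.get(bar, 0) + 1
        assert active[bar] == 1, "two threads with the same bar got in!"
        time.sleep(random.random() / 100)
        active[bar] -= 1
    finally:
        release_lock(bar)

threads = [threading.Thread(target=worker, args=(random.choice(["cat", "dog"]),))
           for _ in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()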

answered Oct 16 '22 by martineau


Have one lock, acquired whenever a thread tries to enter or exit the critical section, and use separate condition variables for each value of bar. The following could probably be optimized to create fewer condition variables, but doing so for this post felt like premature optimization:

import collections
import contextlib
import threading

lock = threading.Lock()

wait_tracker = collections.defaultdict(lambda: (False, 0, threading.Condition(lock)))

@contextlib.contextmanager
def critical(bar):
    with lock:
        busy, waiters, condition = wait_tracker[bar]
        if busy:
            # Someone with the same bar value is in the critical section.

            # Record that we're waiting.
            waiters += 1
            wait_tracker[bar] = busy, waiters, condition

            # Wait for our turn.
            while wait_tracker[bar][0]:
                condition.wait()

            # Record that we're not waiting any more.
            busy, waiters, condition = wait_tracker[bar]
            waiters -= 1

        # Record that we're entering the critical section.
        busy = True
        wait_tracker[bar] = busy, waiters, condition
    try:
        # Critical section runs here.
        yield
    finally:
        with lock:
            # Record that we're out of the critical section.
            busy, waiters, condition = wait_tracker[bar]
            busy = False
            if waiters:
                # Someone was waiting for us. Tell them it's their turn now.
                wait_tracker[bar] = busy, waiters, condition
                condition.notify()
            else:
                # No one was waiting for us. Clean up a bit so the wait_tracker
                # doesn't grow forever.
                del wait_tracker[bar]

Then each thread that wants to enter the critical section does the following:

with critical(bar):
    ...  # Critical section.

This code is untested, and parallelism is hard, especially locks-and-shared-memory parallelism. I make no guarantees that it will work.
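
In that spirit, a quick way to exercise it is a small driver like the one below (the worker function, bar values, and thread count are invented for illustration, and it assumes the definitions above are in the same module):

import random
import time

def worker(bar):
    with critical(bar):
        # Only one thread per bar value should be in here at a time.
        time.sleep(random.random() / 100)

threads = [threading.Thread(target=worker, args=(random.choice(["cat", "dog"]),))
           for _ in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()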

answered Oct 16 '22 by user2357112 supports Monica