Occurrence and avoidance of deadlock and livelock

Posted by phorman on Wed, 16 Feb 2022 00:57:16 +0100

When multithreaded code reads and writes shared data, you need to be especially careful. If the shared data is an exclusive resource, the simplest way to guarantee exclusive access to it is to lock it. But locks cannot be used casually: misusing them can cause deadlock and livelock. This article explains in detail, through examples, how deadlocks and livelocks arise and how to avoid them.
Avoid multiple threads reading and writing shared data at the same time
In real development, the need for multiple threads to read and write shared data is unavoidable. A typical piece of business logic first reads the shared data (say, a count), then performs some computation and business processing with it, and finally writes a new value back. Because several threads run at the same time, another thread may modify the shared data after this thread has read it, so the value this thread is working with becomes stale and wrong. Consider a concrete code example:

using System;
using System.Threading;

static int count { get; set; }

static void Main(string[] args)
{
    for (int i = 1; i <= 2; i++)
    {
        var thread = new Thread(ThreadMethod);
        thread.Start(i);
        Thread.Sleep(500);
    }
}

static void ThreadMethod(object threadNo)
{
    while (true)
    {
        var temp = count;
        Console.WriteLine("thread  " + threadNo + " Read count");
        Thread.Sleep(1000); // Simulate time-consuming work
        count = temp + 1;
        Console.WriteLine("thread  " + threadNo + " Count increased to: " + count);
        Thread.Sleep(1000);
    }
}

In the example, two independent threads are started to increment the count. By the fourth execution of ThreadMethod the count should be 4, that is, the value should have progressed 1, 2, 3, 4; instead the actual run produces 1, 1, 2, 2. In other words, apart from the first read, each thread reads a stale, incorrect count, which we call dirty data.

Therefore, shared data should be treated as an exclusive resource and accessed exclusively, so that it is never read and written at the same time: while one thread reads or writes it, other threads must wait. The simplest way to achieve this is to lock the shared data.

Modify the example above to lock count:

static int count { get; set; }
static readonly object key = new object();

static void Main(string[] args)
{
    ...
}

static void ThreadMethod(object threadNumber)
{
    while (true)
    {
        lock(key)
        {
            var temp = count;
            ...
            count = temp + 1;
            ...
        }
        Thread.Sleep(1000);
    }
}

This guarantees that only one thread at a time can read and write the shared data, eliminating the dirty data.
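For a simple counter like this, .NET also offers lock-free atomic operations. A minimal sketch using Interlocked.Increment, as an alternative to the lock shown above (the RunCounter helper is mine, for illustration):

```csharp
using System;
using System.Threading;

class InterlockedCounter
{
    static int count; // shared counter, updated atomically without a lock

    // Illustrative helper: run N threads that each increment the
    // counter 'perThread' times, then return the final value.
    public static int RunCounter(int threadCount, int perThread)
    {
        count = 0;
        var threads = new Thread[threadCount];
        for (int i = 0; i < threadCount; i++)
        {
            threads[i] = new Thread(() =>
            {
                for (int j = 0; j < perThread; j++)
                    Interlocked.Increment(ref count); // atomic read-modify-write
            });
            threads[i].Start();
        }
        foreach (var t in threads) t.Join();
        return count;
    }

    static void Main()
    {
        Console.WriteLine(RunCounter(2, 1000)); // always 2000, never dirty
    }
}
```

Note that Interlocked only helps when the whole update is a single atomic operation; if real work happens between the read and the write, as in the article's example, a lock is still the right tool.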

Occurrence of deadlock

Locks were introduced to stop multiple threads from reading and writing shared data at the same time.
**However, if a thread needs to hold several exclusive resources within one task, a new problem appears: deadlock.** In short, a deadlock occurs when a thread whose request for an exclusive resource cannot be satisfied simply waits, without releasing the resources it already holds. Multiple threads end up waiting on each other in a cycle, each waiting for the other to release the resource it needs: you wait for me, I wait for you, and both sides remain waiting forever. Here is an example of how a deadlock arises:

class Program
{
    static void Main(string[] args)
    {
        var workers = new Workers();
        workers.StartThreads();
        var output = workers.GetResult();
        Console.WriteLine(output);
    }
}

class Workers
{
    Thread thread1, thread2;

    object resourceA = new object();
    object resourceB = new object();

    string output;

    public void StartThreads()
    {
        thread1 = new Thread(Thread1DoWork);
        thread2 = new Thread(Thread2DoWork);
        thread1.Start();
        thread2.Start();
    }

    public string GetResult()
    {
        thread1.Join();
        thread2.Join();
        return output;
    }

    public void Thread1DoWork()
    {
        lock (resourceA) // thread 1 takes A first...
        {
            Thread.Sleep(100);
            lock (resourceB) // ...then B
            {
                output += "T1#";
            }
        }
    }

    public void Thread2DoWork()
    {
        lock (resourceB) // thread 2 takes B first...
        {
            Thread.Sleep(100);
            lock (resourceA) // ...then A: opposite order, so a circular wait can form
            {
                output += "T2#";
            }
        }
    }
}

After running, the example never produces output: a deadlock has occurred. Thread 1 locks resource A and, while holding it, needs to lock resource B; but resource B is held by thread 2, which is itself waiting for resource A to be released; and resource A is held by thread 1... Both sides fall into an eternal wait.

Deadlock avoidance

To avoid deadlock in the situation above, you can use the Monitor.TryEnter(obj, timeout) method to test whether an object is already held. This method attempts to acquire an exclusive lock on the specified object; if the lock cannot be acquired within the timeout, it returns false instead of blocking. The thread can then actively "yield": release the resources it already holds and call Thread.Yield() to give the rest of its CPU time slice back, so that other threads waiting for those resources can continue to execute. In other words, when its request for an exclusive resource cannot be satisfied, the thread makes a concession instead of waiting, and deadlock is avoided.

Modify the Thread1DoWork method of the Workers class from the example above to use Monitor.TryEnter:

//... (omit the same code)

public void Thread1DoWork()
{
    bool mustDoWork = true;
    while (mustDoWork)
    {
        lock (resourceA)
        {
            Thread.Sleep(100);
            if (Monitor.TryEnter(resourceB, 0))
            {
                output += "T1#";
                mustDoWork = false;
                Monitor.Exit(resourceB);
            }
        }
        if (mustDoWork) Thread.Yield(); // concede: resourceA is already released; give up the rest of the time slice
    }
}

public void Thread2DoWork()
{
    lock (resourceB)
    {
        Thread.Sleep(100);
        lock (resourceA)
        {
            output += "T2#";
        }
    }
}

Run the example again: the program outputs T2#T1# and exits normally; the deadlock is solved.

Note that this solution relies on thread 2 stubbornly holding on to the exclusive resources it needs while thread 1 is willing to "give in", so that thread 2 always executes first. Also note that when thread 1 fails to acquire resourceB, it concedes by releasing the resourceA it holds; it must then wait until thread 2 has finished with resourceA before it can lock resourceA again and redo its work.

Because thread 2 always takes precedence, if thread 2 claims resourceA or resourceB very frequently (for example, if its work is wrapped in a while(true)-style loop), thread 1 may never obtain the resources it needs. This phenomenon is called thread starvation. Here it is caused by a de facto high-priority thread swallowing the CPU time of a low-priority one; starvation can also occur when a thread is waiting on a task that will itself never complete.
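Incidentally, both the deadlock and this starvation risk disappear if every thread simply acquires the two resources in the same fixed order. A minimal sketch of this common alternative (not the article's approach; the Run helper is mine):

```csharp
using System;
using System.Threading;

class OrderedLocking
{
    static readonly object resourceA = new object();
    static readonly object resourceB = new object();
    static string output = "";

    // Every thread takes resourceA first, then resourceB,
    // so a circular wait can never form.
    static void DoWork(string tag)
    {
        lock (resourceA)
        {
            Thread.Sleep(100);
            lock (resourceB)
            {
                output += tag;
            }
        }
    }

    public static string Run()
    {
        output = "";
        var t1 = new Thread(() => DoWork("T1#"));
        var t2 = new Thread(() => DoWork("T2#"));
        t1.Start(); t2.Start();
        t1.Join(); t2.Join();
        return output; // "T1#T2#" or "T2#T1#", never a deadlock
    }

    static void Main() => Console.WriteLine(Run());
}
```

Whichever thread locks resourceA first simply finishes first; the other waits its turn instead of deadlocking or starving.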

Let us take this one step further: in the example above, what happens if thread 2 is also willing to give in?

Occurrence and avoidance of livelock
Let's modify the above example to make thread 2 willing to give in:

public void Thread1DoWork()
{
    bool mustDoWork = true;
    Thread.Sleep(100);
    while (mustDoWork)
    {
        lock (resourceA)
        {
            Console.WriteLine("T1 redo");
            Thread.Sleep(1000);
            if (Monitor.TryEnter(resourceB, 0))
            {
                output += "T1#";
                mustDoWork = false;
                Monitor.Exit(resourceB);
            }
        }
        if (mustDoWork) Thread.Yield();
    }
}
public void Thread2DoWork()
{
    bool mustDoWork = true;
    Thread.Sleep(100);
    while (mustDoWork)
    {
        lock (resourceB)
        {
            Console.WriteLine("T2 redo");
            Thread.Sleep(1100);
            if (Monitor.TryEnter(resourceA, 0))
            {
                output += "T2#";
                mustDoWork = false;
                Monitor.Exit(resourceA);
            }
        }
        if (mustDoWork) Thread.Yield();
    }
}

Note that, to make the effect I want to demonstrate more obvious, I made the two threads' Thread.Sleep durations slightly different (1000 ms vs. 1100 ms). After running, both threads keep printing "T1 redo" and "T2 redo" over and over.

Observing the run, we find that thread 1 and thread 2 keep giving way to each other and then starting over. Neither thread ever gets into its Monitor.TryEnter block: both are constantly running, yet neither accomplishes any real work.

We call this phenomenon, where a thread keeps running but its task makes no progress, a livelock. The difference between livelock and deadlock is that livelocked threads are running while deadlocked threads are waiting; a livelock may resolve itself by chance, but a deadlock never will.
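One common way to break the symmetry behind a livelock is to back off for a random interval before retrying, so the two threads stop retrying in lockstep. A sketch of the idea (my own restructuring, not the article's code):

```csharp
using System;
using System.Threading;

class RandomBackoff
{
    static readonly object resourceA = new object();
    static readonly object resourceB = new object();
    static readonly Random rng = new Random();
    static string output = "";

    // Try to take 'first' then 'second'; on failure, release everything
    // and sleep a random few milliseconds so retries fall out of lockstep.
    static void AcquireBoth(object first, object second, string tag)
    {
        while (true)
        {
            lock (first)
            {
                if (Monitor.TryEnter(second, 0))
                {
                    try { output += tag; return; } // both locks held here
                    finally { Monitor.Exit(second); }
                }
            } // 'first' is released here, before backing off
            int pause;
            lock (rng) pause = rng.Next(1, 20); // randomized backoff
            Thread.Sleep(pause);
        }
    }

    public static string Run()
    {
        output = "";
        var t1 = new Thread(() => AcquireBoth(resourceA, resourceB, "T1#"));
        var t2 = new Thread(() => AcquireBoth(resourceB, resourceA, "T2#"));
        t1.Start(); t2.Start();
        t1.Join(); t2.Join();
        return output;
    }

    static void Main() => Console.WriteLine(Run());
}
```

Because the two threads no longer retry on identical schedules, one of them soon acquires both locks and finishes, letting the other succeed on its next attempt.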

To avoid livelock you must reasonably estimate how long each thread holds each exclusive resource and carefully arrange the intervals between task attempts, and be extra careful throughout. In practice this kind of business scenario is rare, and the convoluted resource-holding logic in the example is confusing and extremely hard to maintain. The recommended approach is to use a semaphore instead of locks. That is another topic, to be discussed in a separate article later.
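As a small preview of that approach: a SemaphoreSlim with an initial and maximum count of 1 can guard the shared counter much like a lock, while also supporting timeouts and async waiting. A minimal sketch (my own, with details left to the follow-up article):

```csharp
using System;
using System.Threading;

class SemaphoreCounter
{
    // A SemaphoreSlim with initial and maximum count 1 behaves as a mutex.
    static readonly SemaphoreSlim gate = new SemaphoreSlim(1, 1);
    static int count;

    static void Increment()
    {
        gate.Wait(); // block until the single slot is free
        try
        {
            var temp = count;
            Thread.Sleep(1); // simulate work between read and write
            count = temp + 1;
        }
        finally
        {
            gate.Release(); // always give the slot back
        }
    }

    public static int Run()
    {
        count = 0;
        var threads = new Thread[2];
        for (int i = 0; i < threads.Length; i++)
        {
            threads[i] = new Thread(() => { for (int j = 0; j < 5; j++) Increment(); });
            threads[i].Start();
        }
        foreach (var t in threads) t.Join();
        return count; // 10: every increment saw a fresh value
    }

    static void Main() => Console.WriteLine(Run());
}
```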

Summary

  1. Avoid having multiple threads read and write shared data at the same time. The simplest way is to lock the shared data and treat it as an exclusive resource with exclusive access.
  2. When a task requires a thread to lock several exclusive resources, threads can deadlock by waiting on each other in a cycle. To avoid deadlock, at least one thread must make a concession: when it finds that a resource it needs is unavailable, it should actively release the resources it already holds so that other threads can execute and complete.
  3. In most cases, arranging for one thread to concede avoids deadlock, but in complex business logic several threads may keep conceding to each other, producing a livelock. Avoiding livelock requires carefully tuned timing between task attempts, which makes the business code very complex; it is better to give up on locks and use a semaphore to achieve exclusive access to resources.

Topics: C#