Questions about LimitedConcurrencyLevelTaskScheduler

Posted by kumar_ldh on Wed, 31 Jul 2019 01:08:52 +0200

Original Link: http://www.cnblogs.com/DiscoverPalace/p/4567222.html

1. Introduction to LimitedConcurrencyLevelTaskScheduler

As far as I know, this TaskScheduler is an open-source sample from Microsoft, and its code is quite simple.
Most of it is easy to understand, but what I don't understand is how it limits concurrency.
First, let me paste its code so you can get familiar with it.

public class LimitedConcurrencyLevelTaskScheduler : TaskScheduler
{
    /// <summary>Whether the current thread is processing work items.</summary> 
    [ThreadStatic]
    private static bool _currentThreadIsProcessingItems;
    /// <summary>The list of tasks to be executed.</summary> 
    private readonly LinkedList<Task> _tasks = new LinkedList<Task>(); // protected by lock(_tasks) 
    /// <summary>The maximum concurrency level allowed by this scheduler.</summary> 
    private readonly int _maxDegreeOfParallelism;
    /// <summary>Whether the scheduler is currently processing work items.</summary> 
    private int _delegatesQueuedOrRunning = 0; // protected by lock(_tasks) 

    /// <summary> 
    /// Initializes an instance of the LimitedConcurrencyLevelTaskScheduler class with the 
    /// specified degree of parallelism. 
    /// </summary> 
    /// <param name="maxDegreeOfParallelism">The maximum degree of parallelism provided by this scheduler.</param> 
    public LimitedConcurrencyLevelTaskScheduler(int maxDegreeOfParallelism)
    {
        if (maxDegreeOfParallelism < 1) throw new ArgumentOutOfRangeException("maxDegreeOfParallelism");
        _maxDegreeOfParallelism = maxDegreeOfParallelism;
    }

    /// <summary>
    /// The number of currently executing tasks.
    /// </summary>
    public int CurrentCount { get; set; }

    /// <summary>Queues a task to the scheduler.</summary> 
    /// <param name="task">The task to be queued.</param> 
    protected sealed override void QueueTask(Task task)
    {
        // Add the task to the list of tasks to be processed. If there aren't enough 
        // delegates currently queued or running to process tasks, schedule another. 
        lock (_tasks)
        {
            Console.WriteLine("Task Count : {0} ", _tasks.Count);
            _tasks.AddLast(task);
            if (_delegatesQueuedOrRunning < _maxDegreeOfParallelism)
            {
                ++_delegatesQueuedOrRunning;
                NotifyThreadPoolOfPendingWork();
            }
        }
    }
    int executingCount = 0;
    private static object executeLock = new object();
    /// <summary> 
    /// Informs the ThreadPool that there's work to be executed for this scheduler. 
    /// </summary> 
    private void NotifyThreadPoolOfPendingWork()
    {
        ThreadPool.UnsafeQueueUserWorkItem(_ =>
        {
            // Note that the current thread is now processing work items. 
            // This is necessary to enable inlining of tasks into this thread. 
            _currentThreadIsProcessingItems = true;
            try
            {
                // Process all available items in the queue. 
                while (true)
                {
                    Task item;
                    lock (_tasks)
                    {
                        // When there are no more items to be processed, 
                        // note that we're done processing, and get out. 
                        if (_tasks.Count == 0)
                        {
                            --_delegatesQueuedOrRunning;
                            break;
                        }

                        // Get the next item from the queue 
                        item = _tasks.First.Value;
                        _tasks.RemoveFirst();
                    }

                    // Execute the task we pulled out of the queue 
                    base.TryExecuteTask(item);
                }
            }
            // We're done processing items on the current thread 
            finally { _currentThreadIsProcessingItems = false; }
        }, null);
    }

    /// <summary>Attempts to execute the specified task on the current thread.</summary> 
    /// <param name="task">The task to be executed.</param> 
    /// <param name="taskWasPreviouslyQueued"></param> 
    /// <returns>Whether the task could be executed on the current thread.</returns> 
    protected sealed override bool TryExecuteTaskInline(Task task, bool taskWasPreviouslyQueued)
    {

        // If this thread isn't already processing a task, we don't support inlining 
        if (!_currentThreadIsProcessingItems) return false;

        // If the task was previously queued, remove it from the queue 
        if (taskWasPreviouslyQueued) TryDequeue(task);

        // Try to run the task. 
        return base.TryExecuteTask(task);
    }

    /// <summary>Attempts to remove a previously scheduled task from the scheduler.</summary> 
    /// <param name="task">The task to be removed.</param> 
    /// <returns>Whether the task could be found and removed.</returns> 
    protected sealed override bool TryDequeue(Task task)
    {
        lock (_tasks) return _tasks.Remove(task);
    }

    /// <summary>Gets the maximum concurrency level supported by this scheduler.</summary> 
    public sealed override int MaximumConcurrencyLevel { get { return _maxDegreeOfParallelism; } }

    /// <summary>Gets an enumerable of the tasks currently scheduled on this scheduler.</summary> 
    /// <returns>An enumerable of the tasks currently scheduled.</returns> 
    protected sealed override IEnumerable<Task> GetScheduledTasks()
    {
        bool lockTaken = false;
        try
        {
            Monitor.TryEnter(_tasks, ref lockTaken);
            if (lockTaken) return _tasks.ToArray();
            else throw new NotSupportedException();
        }
        finally
        {
            if (lockTaken) Monitor.Exit(_tasks);
        }
    }
}

2. Simple use

Here is the call code.

static void Main(string[] args)
{
    TaskFactory fac = new TaskFactory(new LimitedConcurrencyLevelTaskScheduler(5));

    //TaskFactory fac = new TaskFactory();
    for (int i = 0; i < 1000; i++)
    {
        fac.StartNew(s => {
            Thread.Sleep(1000);
            Console.WriteLine("Current Index {0}, ThreadId {1}", s, Thread.CurrentThread.ManagedThreadId);
        }, i);
    }

    Console.ReadKey();
}

The call is simple. If you step through it in the debugger, you can see the order of calls:
once the TaskFactory has been created with a LimitedConcurrencyLevelTaskScheduler,
calling TaskFactory.StartNew enters LimitedConcurrencyLevelTaskScheduler's QueueTask method.

/// <summary>Queues a task to the scheduler.</summary> 
/// <param name="task">The task to be queued.</param> 
protected sealed override void QueueTask(Task task)
{
    // Add the task to the list of tasks to be processed. If there aren't enough 
    // delegates currently queued or running to process tasks, schedule another. 
    lock (_tasks)
    {
        Console.WriteLine("Task Count : {0} ", _tasks.Count);
        _tasks.AddLast(task);
        if (_delegatesQueuedOrRunning < _maxDegreeOfParallelism)
        {
            ++_delegatesQueuedOrRunning;
            NotifyThreadPoolOfPendingWork();
        }
    }
}

The code is simple: it adds the newly created Task to the task queue, then compares the number of delegates currently queued or running against the maximum degree of parallelism. If the count is below the limit, it increments the counter and notifies the thread pool that there is pending work to execute.
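To make sure I'm reading that counting logic correctly, here is a stripped-down sketch I wrote myself (it is not part of the original scheduler, and the names are my own) of the same pattern: a counter guarded by the queue's lock caps how many worker delegates are ever handed to the thread pool, and each worker keeps draining the shared queue until it is empty.

using System;
using System.Collections.Generic;
using System.Threading;

class CountingPatternDemo
{
    private static readonly Queue<int> _work = new Queue<int>();
    private static int _workersQueuedOrRunning = 0; // protected by lock(_work)
    private const int MaxWorkers = 2;               // the "degree of parallelism"

    static void Enqueue(int item)
    {
        lock (_work)
        {
            _work.Enqueue(item);
            // Only hand a new worker to the thread pool while fewer
            // than MaxWorkers workers are queued or running.
            if (_workersQueuedOrRunning < MaxWorkers)
            {
                ++_workersQueuedOrRunning;
                ThreadPool.QueueUserWorkItem(_ => Drain());
            }
        }
    }

    static void Drain()
    {
        while (true)
        {
            int item;
            lock (_work)
            {
                if (_work.Count == 0)
                {
                    // No more work: this worker retires and frees a slot.
                    --_workersQueuedOrRunning;
                    return;
                }
                item = _work.Dequeue();
            }
            Console.WriteLine("Worker {0} processing item {1}",
                Thread.CurrentThread.ManagedThreadId, item);
            Thread.Sleep(200); // simulate work
        }
    }

    static void Main()
    {
        for (int i = 0; i < 10; i++) Enqueue(i);
        Console.ReadKey();
    }
}

In this toy version at most two workers are ever handed to the thread pool at the same time, even though ten items are queued.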
My main question is about the NotifyThreadPoolOfPendingWork method.

private void NotifyThreadPoolOfPendingWork()
{
    ThreadPool.UnsafeQueueUserWorkItem(_ =>
    {
        // Note that the current thread is now processing work items. 
        // This is necessary to enable inlining of tasks into this thread. 
        _currentThreadIsProcessingItems = true;
        try
        {
            // Process all available items in the queue. 
            while (true)
            {
                Task item;
                lock (_tasks)
                {
                    // When there are no more items to be processed, 
                    // note that we're done processing, and get out. 
                    if (_tasks.Count == 0)
                    {
                        --_delegatesQueuedOrRunning;
                        break;
                    }

                    // Get the next item from the queue 
                    item = _tasks.First.Value;
                    _tasks.RemoveFirst();
                }
                // Execute the task we pulled out of the queue 
                base.TryExecuteTask(item);
            }
        }
        // We're done processing items on the current thread 
        finally { _currentThreadIsProcessingItems = false; }
    }, null);
}

What you can see from the code is that it runs an endless loop, continuously taking Tasks out of _tasks and executing them, and it only exits the loop once _tasks is empty.
There is no concurrency limit inside this loop; the only check is the simple one in QueueTask, but that check doesn't seem to be of any use,
because once the delegate queued by NotifyThreadPoolOfPendingWork starts, it keeps running until every Task has been executed. So how does it actually limit concurrency?

I've been puzzled by this for a long time. Is there something wrong with my understanding? If anyone knows the answer, please enlighten me.
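One rough way I could think of to check the behaviour empirically (my own sketch, assuming the LimitedConcurrencyLevelTaskScheduler class above is in scope; the class and field names here are made up) is to count, from inside the task bodies themselves, how many of them are running at the same moment and record the peak:

using System;
using System.Threading;
using System.Threading.Tasks;

class ConcurrencyProbe
{
    private static int _active; // tasks currently inside their body
    private static int _peak;   // highest value of _active observed

    static void Main(string[] args)
    {
        var fac = new TaskFactory(new LimitedConcurrencyLevelTaskScheduler(5));
        for (int i = 0; i < 100; i++)
        {
            fac.StartNew(() =>
            {
                int now = Interlocked.Increment(ref _active);

                // Remember the highest concurrency seen so far.
                int seen;
                do
                {
                    seen = _peak;
                } while (now > seen &&
                         Interlocked.CompareExchange(ref _peak, now, seen) != seen);

                Thread.Sleep(200); // simulate work
                Interlocked.Decrement(ref _active);
            });
        }

        Thread.Sleep(6000); // long enough for 100 x 200 ms tasks at the limit
        Console.WriteLine("Peak observed concurrency: {0}", _peak);
        Console.ReadKey();
    }
}

If the printed peak never exceeds the value passed to the constructor, then the limit is being enforced somewhere, even if I can't see where inside NotifyThreadPoolOfPendingWork.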

PS: The markdown here doesn't render very well and shows some formatting problems.
If it is hard to read, you can view the same note at https://www.zybuluo.com/kevinsforever/note/115066,
which is where I originally wrote it with that site's editor; copying it over here doesn't display properly.
Thanks.

