What is a blocking queue?
- A linear FIFO (first in, first out) structure: elements are added at one end (the tail) and removed at the other (the head)
- Supports blocking insertion and blocking removal (for example, the producer-consumer pattern demonstrated in the previous chapter)
- A queue is either bounded or unbounded; in Java an "unbounded" queue simply has a capacity of Integer.MAX_VALUE (see the snippet just below)
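To make the bounded/unbounded distinction concrete, here is a minimal sketch (the capacity of 3 and the class name are chosen only for this demo) that constructs one of each and prints the remaining capacity:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class BoundedVsUnbounded {
    public static void main(String[] args) {
        // Bounded: capacity fixed at 3, put() blocks once 3 elements are stored
        ArrayBlockingQueue<String> bounded = new ArrayBlockingQueue<>(3);

        // "Unbounded": no capacity given, so it defaults to Integer.MAX_VALUE
        LinkedBlockingQueue<String> unbounded = new LinkedBlockingQueue<>();

        System.out.println(bounded.remainingCapacity());   // 3
        System.out.println(unbounded.remainingCapacity()); // 2147483647
    }
}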
This article focuses on ArrayBlockingQueue; the other blocking queues are similar and you can study them on your own.
1. Blocking queues in j.u.c
- ArrayBlockingQueue: the underlying structure is an array
final Object[] items;
- LinkedBlockingQueue: the underlying structure is a linked list
static class Node<E> {
    E item;
    Node<E> next;
    Node(E x) { item = x; }
}
- PriorityBlockingQueue: the underlying structure is a priority queue (elements are ordered by comparison), similar to a leaderboard
private transient Object[] queue;
private transient Comparator<? super E> comparator;
- DelayQueue is a delay queue; the underlying structure is a priority queue, similar to the delayed messages in RocketMQ
private final PriorityQueue<E> q = new PriorityQueue<E>();
- SynchronousQueue has no storage structure: elements are handed off directly, and one side always blocks until the other side arrives. It is used by the thread pool returned by Executors.newCachedThreadPool(). Depending on the fairness mode, the transfer is backed by either a queue or a stack (see the sketch after this list)
public SynchronousQueue(boolean fair) {
    //In fair mode data is transferred through a queue, otherwise through a stack
    transferer = fair ? new TransferQueue<E>() : new TransferStack<E>();
}
A queue is FIFO (first in, first out); a stack is LIFO (last in, first out).
LinkedTransferQueue is an unbounded blocking queue, roughly a combination of LinkedBlockingQueue and SynchronousQueue (a linked queue with transfer semantics).
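To illustrate the hand-off behavior of SynchronousQueue mentioned above, here is a minimal sketch (the class name and values are made up for the demo): put() blocks until another thread calls take().

import java.util.concurrent.SynchronousQueue;

public class SynchronousQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        SynchronousQueue<String> queue = new SynchronousQueue<>();

        new Thread(() -> {
            try {
                // Blocks here: there is no storage, so put() waits for a taker
                queue.put("hello");
                System.out.println("handed off");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }).start();

        Thread.sleep(1000);               // the producer is still blocked during this second
        System.out.println(queue.take()); // the hand-off happens here, prints "hello"
    }
}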
2. Common methods of a blocking queue
- Adding elements
add(e) throws an exception (IllegalStateException) if the queue is full
offer(e) returns true if the element was added successfully, otherwise false
put(e) blocks if the queue is full
offer(e, timeout, unit) blocks when the queue is full, but gives up and returns false after the timeout
- Removing elements
remove() throws an exception (NoSuchElementException) if the queue is empty
poll() returns the head element, or null if the queue is empty
take() blocks until the queue has an element
poll(timeout, unit) blocks until an element is available, but gives up and returns null after the timeout
For examining without removing: element() throws an exception when the queue is empty, while peek() returns null. A short sketch of these behaviors follows below.
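A minimal sketch of the insertion and removal behaviors, using a queue whose capacity of 1 is chosen just for the demonstration:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.TimeUnit;

public class QueueMethodDemo {
    public static void main(String[] args) throws InterruptedException {
        ArrayBlockingQueue<String> queue = new ArrayBlockingQueue<>(1);

        System.out.println(queue.offer("a"));                      // true, queue is now full
        System.out.println(queue.offer("b"));                      // false, no exception
        System.out.println(queue.offer("c", 1, TimeUnit.SECONDS)); // waits 1s, then false
        try {
            queue.add("d");                                        // queue full, throws
        } catch (IllegalStateException e) {
            System.out.println("add() threw: " + e);
        }

        System.out.println(queue.poll());                          // "a", queue is now empty
        System.out.println(queue.poll());                          // null, no exception
        System.out.println(queue.poll(1, TimeUnit.SECONDS));       // waits 1s, then null
        // queue.take() would block here until another thread puts an element
    }
}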
3. Practical use of blocking queues
- Chain of responsibility
- Producers and consumers
//Initialize a blocking queue with size 3
static ArrayBlockingQueue<String> queue = new ArrayBlockingQueue<>(3);

public static void main(String[] args) throws InterruptedException {
    new Thread(() -> {
        int i = 0;
        while (true) {
            try {
                //If the queue is full, put() blocks
                queue.put("element" + i);
                System.out.println("put: element" + i);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            i++;
        }
    }).start();

    Thread.sleep(100);

    new Thread(() -> {
        while (true) {
            try {
                //Blocks until the queue has an element
                String take = queue.take();
                System.out.println("take: " + take);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }).start();
}
4. Blocking queue source code analysis
1. Member variables
//The queue's storage is an array
final Object[] items;

//Index of the element to be taken next; only accessed inside the lock, so it is thread safe
/** items index for next take, poll, peek or remove */
int takeIndex;

//Index where the next element will be added
/** items index for next put, offer, or add */
int putIndex;

//Current number of elements in the queue
/** Number of elements in the queue */
int count;

//Reentrant lock
/** Main lock guarding all access */
final ReentrantLock lock;

//Consumers wait on this Condition until the queue is not empty
private final Condition notEmpty;

//Producers wait on this Condition until the queue is not full
/** Condition for waiting puts */
private final Condition notFull;
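takeIndex and putIndex treat items as a circular array: when an index reaches the end it wraps back to 0, so slots are reused without shifting elements. A standalone sketch of just that index arithmetic (the capacity of 3 is arbitrary, and in the real queue count would gate further puts):

public class RingIndexDemo {
    public static void main(String[] args) {
        int capacity = 3;
        int putIndex = 0;

        // Advance the put index five times to show the wrap-around
        for (int i = 0; i < 5; i++) {
            System.out.println("store at index " + putIndex);
            if (++putIndex == capacity)
                putIndex = 0; // same wrap-around as in enqueue()/dequeue()
        }
        // Prints indexes: 0, 1, 2, 0, 1
    }
}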
2. Constructor
public ArrayBlockingQueue(int capacity, boolean fair) {
    if (capacity <= 0)
        throw new IllegalArgumentException();
    //Allocate the array that backs the queue
    this.items = new Object[capacity];
    //You can choose a fair or unfair lock yourself
    lock = new ReentrantLock(fair);
    //The familiar Conditions for "queue not empty" and "queue not full"
    notEmpty = lock.newCondition();
    notFull  = lock.newCondition();
}
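For reference, a brief usage sketch of the two constructors (the capacity of 10 and the class name are arbitrary): the single-argument constructor defaults to an unfair lock, while passing true requests fair ordering for blocked threads.

import java.util.concurrent.ArrayBlockingQueue;

public class ConstructorDemo {
    public static void main(String[] args) {
        // Unfair (default): waiting threads may acquire the lock out of arrival order
        ArrayBlockingQueue<String> unfair = new ArrayBlockingQueue<>(10);

        // Fair: the underlying ReentrantLock grants access in roughly FIFO order,
        // which reduces starvation but usually lowers throughput
        ArrayBlockingQueue<String> fair = new ArrayBlockingQueue<>(10, true);

        System.out.println(unfair.remainingCapacity() + " / " + fair.remainingCapacity());
    }
}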
3. put(): add an element
public void put(E e) throws InterruptedException {
    checkNotNull(e);
    final ReentrantLock lock = this.lock;
    //Calls AbstractQueuedSynchronizer#acquireInterruptibly.
    //This is the lock acquisition in AQS: threads that fail to grab the lock are added to
    //the AQS queue and wait for unlock() to wake them up
    lock.lockInterruptibly();
    try {
        //The current number of elements equals the array length: the queue is full
        while (count == items.length)
            //The Condition source code was analyzed earlier: await() first adds the thread to
            //the Condition wait queue and then fully releases the lock.
            //The thread stays parked while its node status is CONDITION; once signal() moves the
            //node into the AQS sync queue and the thread is woken up, the await loop sees the node
            //is in the sync queue, exits, re-acquires the lock, and execution continues from here
            notFull.await();
        enqueue(e);
    } finally {
        lock.unlock();
    }
}
- enqueue() stores the element and wakes up a waiting consumer thread
private void enqueue(E x) {
    final Object[] items = this.items;
    //Store the element at putIndex; this is thread safe because we hold the lock
    items[putIndex] = x;
    //Wrap around when the end of the array is reached (circular array)
    if (++putIndex == items.length)
        putIndex = 0;
    count++;
    //Wake up one waiting consumer thread.
    //signal() (via the do-while in doSignal) moves a node from the Condition queue to the
    //AQS sync queue and unparks its thread with LockSupport.unpark(node.thread);
    //that thread then competes for the lock again
    notEmpty.signal();
}
4. take(): take out an element
public E take() throws InterruptedException {
    final ReentrantLock lock = this.lock;
    //Acquire the lock first (interruptibly)
    lock.lockInterruptibly();
    try {
        //Block while the queue is empty
        while (count == 0)
            notEmpty.await();
        return dequeue();
    } finally {
        lock.unlock();
    }
}
- dequeue() removes the element and wakes up a waiting producer thread
private E dequeue() {
    final Object[] items = this.items;
    @SuppressWarnings("unchecked")
    //Read the element at takeIndex and clear the slot
    E x = (E) items[takeIndex];
    items[takeIndex] = null;
    //Wrap around when the end of the array is reached (circular array)
    if (++takeIndex == items.length)
        takeIndex = 0;
    count--;
    if (itrs != null)
        itrs.elementDequeued();
    //Wake up one waiting producer thread
    notFull.signal();
    return x;
}
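To pull the put()/take() logic together, here is a minimal sketch of the same lock-plus-two-Conditions pattern. The class name MiniArrayBlockingQueue is made up for this illustration, and it omits the interruptible locking, null checks, and iterator bookkeeping of the real ArrayBlockingQueue:

import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;

public class MiniArrayBlockingQueue<E> {
    private final Object[] items;
    private int putIndex, takeIndex, count;
    private final ReentrantLock lock = new ReentrantLock();
    private final Condition notEmpty = lock.newCondition(); // consumers wait here
    private final Condition notFull  = lock.newCondition(); // producers wait here

    public MiniArrayBlockingQueue(int capacity) {
        items = new Object[capacity];
    }

    public void put(E e) throws InterruptedException {
        lock.lock();
        try {
            while (count == items.length)   // full: wait on notFull, releasing the lock
                notFull.await();
            items[putIndex] = e;
            if (++putIndex == items.length) putIndex = 0;
            count++;
            notEmpty.signal();              // wake one waiting consumer
        } finally {
            lock.unlock();
        }
    }

    @SuppressWarnings("unchecked")
    public E take() throws InterruptedException {
        lock.lock();
        try {
            while (count == 0)              // empty: wait on notEmpty, releasing the lock
                notEmpty.await();
            E x = (E) items[takeIndex];
            items[takeIndex] = null;
            if (++takeIndex == items.length) takeIndex = 0;
            count--;
            notFull.signal();               // wake one waiting producer
            return x;
        } finally {
            lock.unlock();
        }
    }
}

The shape mirrors the JDK code above: the while loop re-checks the predicate after await() returns, because the thread only continues once it has re-acquired the lock and the queue state may have changed in the meantime.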
Summary of blocking queues
Overall, once you understand the lock principles from the previous chapters, this code should look quite simple.
A blocking queue is a linear FIFO (first in, first out) structure operated at both ends: elements are added at the tail and removed at the head. It supports blocking insertion (blocking when the queue is full) and blocking removal (blocking when the queue is empty). The principle relies on Condition in Java to implement a thread wait queue: when put() is called on a full queue, the current thread is added to the Condition wait queue, the lock is fully released, and the thread is parked inside the while loop as long as its node's status is CONDITION. To wake it up, signal() (through the do-while loop in doSignal) transfers the waiting node from the Condition queue to the AQS synchronization queue; the thread is then unparked and competes for the lock again.
That is the whole content of this chapter.
Previous: Thread communication with wait/notify under synchronized and with J.U.C Condition, usage and source code analysis
Next: Tool classes in J.U.C and how they work (CountDownLatch, Semaphore, CyclicBarrier)
Read a hundred times, its meaning is self-evident