Implementation principle of Kotlin coroutines

Posted by smclay on Mon, 31 Jan 2022 13:18:06 +0100

Implementation of the traditional Runnable interface

In Java, time-consuming work is often done by implementing the Runnable interface and running the task on a thread, for example:

public class Task1 implements Runnable{
    @Override
    public void run() {
        try {
            Thread.sleep(2000);
            System.out.println("this is task1");
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}
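
For completeness, a quick Kotlin sketch of how such a task is usually handed to a thread (assuming the Task1 class above is on the classpath; the snippet itself is my own illustration, not from the original code):

fun main() {
    Thread(Task1()).start()   // Task1.run() executes on its own thread
    println("the calling thread keeps going while task1 sleeps")
}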

Here, the thread sleeps for two seconds to simulate a blocking operation. This raises a problem: what if there are multiple time-consuming tasks implementing Runnable, and their states depend on one another? In other words, the next Runnable depends on a variable produced by the previous one, which is a very common situation. Take a simple example:

public class Task1 implements Runnable{
    @Override
    public void run() {
        try {
            Thread.sleep(2000);
            String ret = "ok";   // the result produced by task1
            System.out.println("this is task1");
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}


public class Task2 implements Runnable{
    @Override
    public void run() {
        try {
            Thread.sleep(2000);
            System.out.println("this is task2 ret:");
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}

Task1's run() produces a local variable ret. If Task2 needs to reference that value, how can it be passed across?

Traditional solutions

There are many ways to solve this, such as synchronization between threads or the producer-consumer pattern, but I personally think callbacks and RxJava are the better options, because multithreaded scheduling on the Java side inevitably costs performance, especially when hundreds of tasks have to schedule one another.

  1. Callback
    The traditional callback approach is to define an interface and have one Runnable invoke it as a callback, as follows:
class Task1 implements Runnable{
    @Override
    public void run() {
        try {
            Thread.sleep(2000);
            String ret = "ok";
            System.out.println("this is task1");
            new Task2().doTest(ret);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}


class Task2 implements Runnable,Task{
    String ret;
    @Override
    public void run() {
        try {
            Thread.sleep(2000);
            System.out.println("this is task2");
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
    @Override
    public void doTest(String ret) {
        this.ret = ret;
        run();
    }
}

public interface Task {
    void doTest(String ret);
}

This scheme does pass the variable along, but it introduces a new problem: with many interdependent tasks, the callbacks nest into callback hell.
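
To make that concrete, here is a purely illustrative Kotlin sketch (step1/step2/step3 are hypothetical stand-ins, not the classes above) of how the nesting deepens with every dependent task:

// Illustrative only: three dependent steps chained with callbacks.
fun step1(onDone: (String) -> Unit) { Thread.sleep(2000); onDone("ok") }
fun step2(input: String, onDone: (String) -> Unit) { Thread.sleep(2000); onDone("$input -> step2") }
fun step3(input: String, onDone: (String) -> Unit) { Thread.sleep(2000); onDone("$input -> step3") }

fun main() {
    step1 { r1 ->
        step2(r1) { r2 ->
            step3(r2) { r3 ->
                println(r3)   // each additional task adds another level of nesting
            }
        }
    }
}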

  2. RxJava
    This is the reactive programming approach: the task in each Runnable is treated as an event, and when one event completes it triggers the next one, roughly like this:
        Single.fromCallable(() -> {
            new Task1().run();            // first event: run task1
            return "ok";                  // task1's result
        }).map(ret -> {
            new Task2().doTest(ret);      // next event: hand task1's result to task2
            return ret;
        }).subscribe(onSuccess -> {
            // do something with the final result
        }, error -> {
            // handle the error
        });

RxJava decouples the tasks completely and is reasonably concise, but Kotlin coroutines are arguably an even better solution to this kind of problem.

Kotlin coroutines

As described above, Kotlin coroutines are meant to solve exactly this kind of task problem. Handling the example above with a coroutine looks like this:

GlobalScope.launch {
    val ret = Task1().doTask()
    Task2().doTask(ret)
}
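
For this snippet to compile, Task1 and Task2 need suspending doTask functions. The classes below are my own minimal sketch of what they could look like (mirroring the earlier Java examples), not code from the article's project:

import kotlinx.coroutines.delay

class Task1 {
    suspend fun doTask(): String {
        delay(2000)        // suspends the coroutine instead of blocking the thread
        println("this is task1")
        return "ok"        // the value Task2 depends on
    }
}

class Task2 {
    suspend fun doTask(ret: String) {
        delay(2000)
        println("this is task2 ret:$ret")
    }
}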

This code is clear, and the idea of writing asynchronous logic as if it were synchronous is right there. Seeing the problem solved so concisely, I want to explore how the underlying layer achieves it. According to the official documentation, a coroutine can be described as a lightweight thread. From a certain angle that is true: it can suspend a function without blocking the thread, and resume the function later (a small sketch of this follows below). But the actual implementation is completely different from threads. To find out what a coroutine really is, let's look at that implementation.
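
Here is a minimal sketch of that "suspend without blocking" behaviour (my own illustration, not from the article): two coroutines share runBlocking's single thread, and delay() suspends each of them so the other can use the thread, which is why the output interleaves. Replacing delay() with Thread.sleep() would hold the thread and force them to run one after the other.

import kotlinx.coroutines.delay
import kotlinx.coroutines.launch
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    launch {
        repeat(3) {
            println("coroutine A, step $it on ${Thread.currentThread().name}")
            delay(100)   // suspends A and frees the thread for B
        }
    }
    launch {
        repeat(3) {
            println("coroutine B, step $it on ${Thread.currentThread().name}")
            delay(100)
        }
    }
}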

Implementation of Kotlin coroutines

We know that Java threads map one-to-one onto operating system threads, but coroutines do not work that way. To analyze the coroutine source code, we generally start from GlobalScope.launch, which creates and starts a coroutine with the following code:
public fun CoroutineScope.launch(
    context: CoroutineContext = EmptyCoroutineContext,
    start: CoroutineStart = CoroutineStart.DEFAULT,
    block: suspend CoroutineScope.() -> Unit
): Job {
    val newContext = newCoroutineContext(context)
    val coroutine = if (start.isLazy)
        LazyStandaloneCoroutine(newContext, block) else
        StandaloneCoroutine(newContext, active = true)
    coroutine.start(start, coroutine, block)
    return coroutine
}	

The three parameters are the coroutine context (which carries the dispatcher, i.e. the thread scheduler), the start mode, and the suspending block we wrote. Click into the start function and debug its call, and by default execution arrives at:

internal fun <R, T> (suspend (R) -> T).startCoroutineCancellable(receiver: R, completion: Continuation<T>) =
    runSafely(completion) {
        createCoroutineUnintercepted(receiver, completion).intercepted().resumeCancellableWith(Result.success(Unit))
    }

Notice this line of code

createCoroutineUnintercepted(receiver,completion).intercepted().resumeCancellableWith(Result.success(Unit))

This chain shows that the coroutine is first created by createCoroutineUnintercepted, then wrapped with the interceptor (normally the dispatcher) by intercepted(), and finally started by resumeCancellableWith. The core implementation of resumeCancellableWith is as follows:

    inline fun resumeCancellableWith(result: Result<T>) {
        val state = result.toState()
        if (dispatcher.isDispatchNeeded(context)) {
            _state = state
            resumeMode = MODE_CANCELLABLE
            dispatcher.dispatch(context, this)
        } else {
            executeUnconfined(state, MODE_CANCELLABLE) {
                if (!resumeCancelled()) {
                    resumeUndispatchedWith(result)
                }
            }
        }
    }

This method shows that dispatcher.dispatch(context, this) is what ultimately runs: execution of the coroutine is handed over to a thread chosen by the dispatcher. That is why the current thread is not blocked while the coroutine runs; put simply, a thread is taken from the pool to execute the blocking work. Now we can say what a coroutine actually is. In teacher He's words, a coroutine is a routine that can cooperate with other routines. In Kotlin, any function marked with the suspend modifier can be suspended and resumed as part of a coroutine. So what does the suspend keyword actually do? Take this Kotlin source as an example:

import kotlinx.coroutines.GlobalScope
import kotlinx.coroutines.launch

object Task {
    suspend fun deal() {
        Thread.sleep(500L)      // simulate time-consuming work
        println("sleep 500ms")
    }
}

fun main() {
    GlobalScope.launch {
        Task.deal()
    }
    Thread.sleep(2000L)         // keep the JVM alive so the coroutine can run
}

Decompiling the compiled Task class back to Java with JD-GUI, the suspend function becomes:

  @Nullable
  public final Object deal(@NotNull Continuation $completion) {
    Thread.sleep(500L);
    String str = "sleep 500ms";
    boolean bool = false;
    System.out.println(str);
    return Unit.INSTANCE;
  }

That's right: a function marked with suspend has an extra parameter of type Continuation injected during compilation. Opening the definition of Continuation:

public interface Continuation<in T> {
    /**
     * The context of the coroutine that corresponds to this continuation.
     */
    public val context: CoroutineContext

    /**
     * Resumes the execution of the corresponding coroutine passing a successful or failed [result] as the
     * return value of the last suspension point.
     */
    public fun resumeWith(result: Result<T>)
}

This is an interface: context is the coroutine context the continuation runs in, and resumeWith resumes execution of the corresponding coroutine, passing in the result (success or failure) of the last suspension point.
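
To see resumeWith in action outside of compiler-generated code, here is a minimal hand-written sketch using suspendCoroutine (the worker thread and the 500 ms sleep are just placeholders for real asynchronous work):

import kotlin.concurrent.thread
import kotlin.coroutines.resume
import kotlin.coroutines.suspendCoroutine
import kotlinx.coroutines.runBlocking

// suspendCoroutine hands us the caller's Continuation; calling resume()
// (which is resumeWith(Result.success(...)) under the hood) lets the
// suspended coroutine carry on with the value we pass in.
suspend fun waitForWorker(): String = suspendCoroutine { continuation ->
    thread {
        Thread.sleep(500)                        // simulate work on another thread
        continuation.resume("worker finished")   // resume the suspended coroutine
    }
}

fun main() = runBlocking {
    println(waitForWorker())   // suspends here until resume() is called
}
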
Back in the decompiled output, the lambda we passed to launch in main becomes:

 @Nullable
  public final Object invokeSuspend(@NotNull Object $result) {
    Object object = IntrinsicsKt.getCOROUTINE_SUSPENDED();
    switch (this.label) {
      case 0:
        ResultKt.throwOnFailure(SYNTHETIC_LOCAL_VARIABLE_1);
        this.label = 1;
        if (Task.INSTANCE.deal((Continuation<? super Unit>)this) == object)
          return object; 
        Task.INSTANCE.deal((Continuation<? super Unit>)this);
        return Unit.INSTANCE;
      case 1:
        ResultKt.throwOnFailure(SYNTHETIC_LOCAL_VARIABLE_1);
        return Unit.INSTANCE;
    } 
    throw new IllegalStateException("call to 'resume' before 'invoke' with coroutine");
  }
  @NotNull
  public final Continuation<Unit> create(@Nullable Object value, @NotNull Continuation<? super MainKt$main$1> $completion) {
    return (Continuation<Unit>)new MainKt$main$1($completion);
  }
  
  @Nullable
  public final Object invoke(@NotNull CoroutineScope p1, @Nullable Continuation<?> p2) {
    return ((MainKt$main$1)create(p1, p2)).invokeSuspend(Unit.INSTANCE);
  }

The interesting part is that the label field records how far the coroutine has progressed. Each time the state machine is resumed, it updates label and the switch re-enters the function at the matching suspension point.
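
To make the state-machine idea easier to see, here is a hand-written Kotlin sketch of the same label-driven pattern (purely illustrative; the real compiler output is the invokeSuspend shown above):

// Each call to step() re-enters the "body" at the point recorded in label,
// just like invokeSuspend() re-enters after every suspension point.
class TwoStepStateMachine {
    private var label = 0

    // returns true once the whole body has run to completion
    fun step(): Boolean = when (label) {
        0 -> {
            println("running up to the first suspension point")
            label = 1       // remember where to resume
            false           // "suspend" here
        }
        1 -> {
            println("resumed, running to the end")
            label = 2
            true
        }
        else -> throw IllegalStateException("already completed")
    }
}

fun main() {
    val machine = TwoStepStateMachine()
    while (!machine.step()) {
        // in a real coroutine, resumeWith() is what drives the next step
    }
}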

Summary

From the analysis above, coroutines mainly exist to solve the problem of interdependent time-consuming tasks. If several time-consuming tasks are at the same level and unrelated, plain multithreaded concurrency is still a good fit. And although coroutines resemble threads in some ways, at the bottom they are still scheduled onto threads. So you can start many GlobalScope.launch blocks: underneath, this is ordinary thread scheduling, and the launching thread is not blocked. At the same time, the code inside a single coroutine runs sequentially on whatever thread the dispatcher assigns, which is why coroutines can express the dependency between tasks so naturally.
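
As a closing sketch (my own illustration of the points above, not code from the article), launching several coroutines and printing the thread names shows that the work lands on the dispatcher's thread pool while the launching thread stays free:

import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.GlobalScope
import kotlinx.coroutines.launch

fun main() {
    repeat(5) { i ->
        GlobalScope.launch(Dispatchers.Default) {
            // each block is scheduled onto a pool thread by the dispatcher
            println("coroutine $i on ${Thread.currentThread().name}")
        }
    }
    println("main thread ${Thread.currentThread().name} is not blocked")
    Thread.sleep(1000)   // keep the JVM alive long enough for the coroutines to print
}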

Topics: kotlin