6. Volley Source Code Analysis: Workflow Overview

Posted by abitlikehomer on Sun, 09 Jun 2019 23:28:52 +0200

Summary of the article
1. Work threads in volley
2. volley working steps
3. RequestQueue initialization and initialization logic

Attachment: Get Volley source code
Demos case source code: https://github.com/HailouWang/DemosForApi


Volley's source code is well worth reading: it contains a lot of good engineering, such as its workflow architecture, its thread and cache handling, its parsing of network response data, and the design patterns used throughout.

Once we have identified the development pain points Volley is meant to solve, we can see that Volley provides a loosely coupled architecture, so we can easily extend its functionality within the framework. To quote a maxim from design patterns: everything works toward loosely coupled design.

1. There are three types of threads in Volley:

  • 1. Main thread. Sends network requests and delivers the parsed results back to the UI.
  • 2. Cache thread. Manages cached data: requests handed off by the main thread are first looked up in the cache, and only on a miss are they forwarded to the network queue.
  • 3. Network threads. The number of network threads is configurable; the default is 4. They run the fetch-from-network loop, but the request-specific parsing logic lives elsewhere; this design decouples the threads from the response-parsing logic.
for (int i = 0; i < mDispatchers.length; i++) {
    NetworkDispatcher networkDispatcher = new NetworkDispatcher(mNetworkQueue, mNetwork,
            mCache, mDelivery);
    mDispatchers[i] = networkDispatcher;
    networkDispatcher.start();
}

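The thread layout above can be modeled in plain Java. The sketch below is illustrative, not Volley's actual classes: a shared `BlockingQueue` plays the role of `mNetworkQueue`, and a pool of worker threads plays the `NetworkDispatcher`s, blocking on the queue until work arrives.

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.atomic.AtomicInteger;

// Simplified sketch of Volley's dispatcher layout (names are illustrative):
// N worker threads block on a shared queue, so the producer (the main
// thread) never performs network work itself.
public class DispatcherSketch {
    static final int POOL_SIZE = 4; // mirrors Volley's default of 4 network threads

    // Feed the requests to the pool and return how many were handled.
    static int process(List<String> requests) throws InterruptedException {
        BlockingQueue<String> networkQueue = new LinkedBlockingQueue<>();
        AtomicInteger handled = new AtomicInteger();
        CountDownLatch done = new CountDownLatch(requests.size());

        for (int i = 0; i < POOL_SIZE; i++) {
            Thread dispatcher = new Thread(() -> {
                try {
                    while (true) {
                        networkQueue.take(); // blocks until work arrives
                        handled.incrementAndGet();
                        done.countDown();
                    }
                } catch (InterruptedException e) {
                    // exit the loop, analogous to NetworkDispatcher.quit()
                }
            });
            dispatcher.setDaemon(true);
            dispatcher.start();
        }

        networkQueue.addAll(requests);
        done.await(); // wait until every request has been processed
        return handled.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(process(List.of("req-1", "req-2", "req-3"))); // prints 3
    }
}
```

The point of the pattern is that producers and consumers only share the queue, which is exactly the decoupling the section above describes.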
2. Volley's working steps

In the earlier demo (Send a simple request), whether the request goes through RequestQueue or through ImageLoader, the general usage is the same and consists of the following steps:

  • 2.1. Initialize RequestQueue or Image Loader.
// Instantiate the RequestQueue.
RequestQueue queue = Volley.newRequestQueue(this);
String url ="http://www.google.com";
  • 2.2. Initialize the Request
// Request a string response from the provided URL.
StringRequest stringRequest = new StringRequest(Request.Method.GET, url,
        new Response.Listener<String>() {
            @Override
            public void onResponse(String response) {
                // Display the first 500 characters of the response string.
                mTextView.setText("Response is: " + response.substring(0, 500));
            }
        }, new Response.ErrorListener() {
            @Override
            public void onErrorResponse(VolleyError error) {
                mTextView.setText("That didn't work!");
            }
        });
  • 2.3. Add the Request to the RequestQueue.
// Add the request to the RequestQueue.
queue.add(stringRequest);
The latter two steps are simply uses of RequestQueue, so we will focus on the first part: the creation of the RequestQueue.

3. RequestQueue initialization

RequestQueue manages Requests; beyond that, it mostly acts as a dispatcher, handing Requests off to the cache thread and the network threads through its add method.
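The routing decision inside add can be sketched on its own, in plain Java. This is a simplified, illustrative model (not Volley's real classes), showing only the shouldCache fork; the staging of duplicate in-flight requests is covered in section 4.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Illustrative sketch of RequestQueue.add()'s routing decision:
// uncacheable requests skip the cache thread and go straight to the network.
public class RoutingSketch {
    static class Request {
        final String url;
        final boolean shouldCache;
        Request(String url, boolean shouldCache) {
            this.url = url;
            this.shouldCache = shouldCache;
        }
    }

    final Queue<Request> cacheQueue = new ArrayDeque<>();
    final Queue<Request> networkQueue = new ArrayDeque<>();

    // Returns which worker queue the request was routed to.
    String add(Request r) {
        if (!r.shouldCache) {
            networkQueue.add(r);
            return "network";
        }
        cacheQueue.add(r);
        return "cache";
    }

    public static void main(String[] args) {
        RoutingSketch q = new RoutingSketch();
        System.out.println(q.add(new Request("http://example.com/a", true)));  // prints cache
        System.out.println(q.add(new Request("http://example.com/b", false))); // prints network
    }
}
```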

  • 3.1. First, a RequestQueue object is obtained through the newRequestQueue method.
    public static RequestQueue newRequestQueue(Context context, HttpStack stack) {
        File cacheDir = new File(context.getCacheDir(), DEFAULT_CACHE_DIR);
        String userAgent = "volley/0"; // the real source derives this from the package info
        //1. Initialize the Network object. Network parses a Request via its performRequest method and produces a Response object.
        if (stack == null) {
            if (Build.VERSION.SDK_INT >= 9) {
                stack = new HurlStack();
            } else {
                // Prior to Gingerbread, HttpUrlConnection was unreliable.
                // See: http://android-developers.blogspot.com/2011/09/androids-http-clients.html
                stack = new HttpClientStack(AndroidHttpClient.newInstance(userAgent));
            }
        }
        Network network = new BasicNetwork(stack);
        //2. Initialize the RequestQueue with two parameters: the first is the Cache (optional, i.e. it may be absent),
        //the second is the Network object, which does the real work of fetching and parsing network data.
        RequestQueue queue = new RequestQueue(new DiskBasedCache(cacheDir), network);
        queue.start();

        return queue;
    }

[Figure: Cache class diagram]
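To get a feel for what DiskBasedCache does, here is a deliberately minimal sketch of a disk-backed cache: persist each entry to a file derived from its cache key. The class name and structure are illustrative only; Volley's real DiskBasedCache also stores headers, TTLs, and enforces a size limit.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

// Minimal sketch of a disk-based cache: one file per sanitized cache key.
// Illustrative only; Volley's DiskBasedCache is far more elaborate.
public class DiskCacheSketch {
    private final Path dir;

    DiskCacheSketch(Path dir) throws IOException {
        this.dir = Files.createDirectories(dir);
    }

    private Path fileFor(String key) {
        // Cache keys can contain characters illegal in filenames, so hash them.
        return dir.resolve(Integer.toHexString(key.hashCode()));
    }

    void put(String key, String value) throws IOException {
        Files.write(fileFor(key), value.getBytes(StandardCharsets.UTF_8));
    }

    String get(String key) throws IOException {
        Path f = fileFor(key);
        return Files.exists(f)
                ? new String(Files.readAllBytes(f), StandardCharsets.UTF_8)
                : null; // cache miss
    }

    public static void main(String[] args) throws IOException {
        DiskCacheSketch cache = new DiskCacheSketch(Files.createTempDirectory("volley-sketch"));
        cache.put("GET:http://example.com", "{\"ok\":true}");
        System.out.println(cache.get("GET:http://example.com")); // prints {"ok":true}
    }
}
```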
  • 3.2. After the RequestQueue instance is initialized, call start() and its worker threads begin running.
    /**
     * Creates the worker pool. Processing will not begin until {@link #start()} is called.
     *
     * @param cache A Cache to use for persisting responses to disk
     *              (1. the cacher: persists response data to disk)
     * @param network A Network interface for performing HTTP requests
     *              (2. the network processor: performs the HTTP requests)
     * @param threadPoolSize Number of network dispatcher threads to create
     *              (3. the network dispatchers: four dispatch threads by default)
     * @param delivery A ResponseDelivery interface for posting responses and errors
     *              (4. the response deliverer: runs on the main thread, handing back
     *               response data and errors; implemented on top of a Handler)
     */
    public RequestQueue(Cache cache, Network network, int threadPoolSize,
            ResponseDelivery delivery) {
        mCache = cache;
        mNetwork = network;
        mDispatchers = new NetworkDispatcher[threadPoolSize];
        mDelivery = delivery;
    }

4. RequestQueue hands the Request off to the worker threads

RequestQueue hands the Request to the worker threads through its add method. The worker threads are the cache thread and the network threads mentioned above.

public <T> Request<T> add(Request<T> request) {
    //1. Add the Request to mCurrentRequests, because this class also has to manage Requests, e.g. for cancel().
    // Tag the request as belonging to this queue and add it to the set of current requests.
    request.setRequestQueue(this);
    synchronized (mCurrentRequests) {
        mCurrentRequests.add(request);
    }

    // Process requests in the order they are added.
    //2. Stamp the Request with a sequence number.
    request.setSequence(getSequenceNumber());

    // If the request is uncacheable, skip the cache queue and go straight to the network.
    //3. If the Request does not require caching, it is added directly to the network queue.
    if (!request.shouldCache()) {
        mNetworkQueue.add(request);
        return request;
    }

    // Insert request into stage if there's already a request with the same cache key in flight.
    //4. If no request with the same cache key is already in flight (i.e. in mWaitingRequests),
    //the Request is handed to the cache thread (mCacheQueue); otherwise it is staged until the in-flight one finishes.
    synchronized (mWaitingRequests) {
        String cacheKey = request.getCacheKey();
        if (mWaitingRequests.containsKey(cacheKey)) {
            // There is already a request in flight. Queue up.
            Queue<Request<?>> stagedRequests = mWaitingRequests.get(cacheKey);
            if (stagedRequests == null) {
                stagedRequests = new LinkedList<>();
            }
            stagedRequests.add(request);
            mWaitingRequests.put(cacheKey, stagedRequests);
            if (VolleyLog.DEBUG) {
                VolleyLog.v("Request for cacheKey=%s is in flight, putting on hold.", cacheKey);
            }
        } else {
            // Insert 'null' queue for this cacheKey, indicating there is now a request in
            // flight.
            mWaitingRequests.put(cacheKey, null);
            mCacheQueue.add(request);
        }
        return request;
    }
}

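The mWaitingRequests staging trick is worth isolating: the first request for a cache key is dispatched, and later requests for the same key are parked until it completes, so identical network calls collapse into one. The sketch below is a simplified, illustrative model of that logic (plain Java, not Volley's classes).

```java
import java.util.HashMap;
import java.util.LinkedList;
import java.util.Map;
import java.util.Queue;

// Illustrative model of Volley's mWaitingRequests staging: requests are
// represented as plain cache-key strings for simplicity.
public class StagingSketch {
    final Map<String, Queue<String>> waiting = new HashMap<>();
    final Queue<String> cacheQueue = new LinkedList<>();

    // Returns "dispatched" if the request went to the cache queue,
    // or "staged" if an identical request is already in flight.
    String add(String cacheKey) {
        if (waiting.containsKey(cacheKey)) {
            // A request with this key is in flight: hold this one back.
            Queue<String> staged = waiting.get(cacheKey);
            if (staged == null) {
                staged = new LinkedList<>();
            }
            staged.add(cacheKey);
            waiting.put(cacheKey, staged);
            return "staged";
        }
        waiting.put(cacheKey, null); // null value marks "one request in flight"
        cacheQueue.add(cacheKey);
        return "dispatched";
    }

    public static void main(String[] args) {
        StagingSketch q = new StagingSketch();
        System.out.println(q.add("GET:/user/1")); // prints dispatched
        System.out.println(q.add("GET:/user/1")); // prints staged
        System.out.println(q.add("GET:/user/2")); // prints dispatched
    }
}
```

When the in-flight request finishes, Volley drains the staged queue for that key and serves the others from the now-populated cache; that part is omitted here.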
Once the worker threads have received the Request, how does processing proceed? See the next article: Cache Thread Workflow of Volley Source Parsing.

Topics: network github Google Android