Making jQuery's .when() wait for all ajax requests


Hey devs. Today I'll show you how I handled multiple ajax requests of unknown count. That means I don't know in advance how many ajax requests I will send, and the jQuery script has to wait for all of them.

Preface

It all started with the optimization of an old piece of code that was horribly slow. I was searching for a way to squeeze down the processing time of some rules. The old code took 5 minutes to process 200 rules, and 200 rules was the maximum we allowed to be sent at once. I was able to refactor the code and squeeze it down to 30 seconds. One would say that's perfect and stop there. But not me.

What's the problem

The problem I saw was how the processing time grew with 200 rules vs 50 rules. When I sent only 50 rules, the time was acceptable, around 10s, and that was the time I wanted to reach for the full 200 as well.

Ok, so in my mind a plan started to grow. I focused on splitting the one big, time-consuming ajax request into smaller parts and sending them at once. Everything is sent to one endpoint, in this case a validation.php script. I know that a browser can keep at most ~6 connections open to the same server at once, a limitation of HTTP/1.1, and my team is not ready for HTTP/2 (yet). So in my case splitting 200 rules into chunks of 50 means 4 parallel requests.

The plan was clear. Split the rules into smaller chunks, send them to the server in parallel, wait for all the responses and process them.

The only problem was jQuery and its not-so-well-explained documentation. jQuery does have a solution for waiting for all ajax requests: the .when() function. The catch is that it seems to take only a fixed (known) number of ajax requests as arguments. That's not actually true, but the documentation is, let's say, misleading about it. So you have to start googling.

The first solution you find is something like $(document).ajaxStop(), but that waits for any running ajax request, which is definitely not a good fit for me. Another solution is $.when.apply(). That actually works, but there is a better one: spread syntax (...) to the rescue.
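
For comparison, here is a rough sketch of what the .apply() variant looks like (assuming promises is an array of the jqXHR objects returned by $.ajax()):

// the $.when.apply() variant: the array is spread into individual arguments,
// with $ passed as the `this` context by convention
$.when.apply($, promises)
    .done(function () {
        // `arguments` holds one entry per finished request
        console.log('all ' + arguments.length + ' requests finished');
    });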

I like the spread syntax a lot because, to me, it's much more readable, easier to understand and closer to the documentation than the extra .apply() call, which will most probably raise questions about what it is for and how it works.

// .when() example with an arbitrary number of ajax requests
$.when(...promises)
    .always(() => { /* runs whether the requests succeed or fail */ })
    .done((...responses) => { /* one argument per request */ })
    .fail((xhr) => { /* jqXHR of the first request that failed */ });

Now back to the beginning. I have a good foundation, but now I need to split the rules. I need an array of arrays, split by some number of items per chunk, with the last chunk holding the rest. This looked like a good candidate for .filter or .map, but it ended up being .reduce.

let rules = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17];
let perChunk = 5; // items per chunk
let result = rules.reduce((resultArray, item, index) => {
  const chunkIndex = Math.floor(index/perChunk);

  if(!resultArray[chunkIndex])
    resultArray[chunkIndex] = []; // start a new chunk

  resultArray[chunkIndex].push(item);

  return resultArray;
}, []);
console.log(result);
//[[1, 2, 3, 4, 5], [6, 7, 8, 9, 10], [11, 12, 13, 14, 15], [16, 17]]

Now that I have the chunks, I can create the ajax calls and store them in an array. Each $.ajax() call returns a jqXHR object, which implements the Promise interface, so it can be handed straight to $.when() later.

let promises = [];
for (let batch of result) {
    let request = $.ajax({
        url: 'some/path',
        method: 'POST',
        data: batch
    });
    promises.push(request);
}

The ajax requests are now on their way to the server and I need to wait until they finish. The only complication in this phase is when there is only one ajax request. With a single Deferred, .when() passes that request's resolved arguments (data, textStatus, jqXHR) straight to the callbacks, while with multiple Deferreds each callback argument is an array of those three values. So in one case the rest parameter collects [data, textStatus, jqXHR] and in the other an array of such arrays. I need to detect this and normalize it, so the function keeps working with a single ajax request too, without extra if-else branches. And at the end I merge all the results coming back from the server.

$.when(...promises)
    .always(() => {
        //...
    })
    .done((...response) => {
        //...
        let result = { data: [] };
        // multiple requests: every collected item is a [data, textStatus, jqXHR] array
        let is_array = response.every(item => Array.isArray(item));

        // single request: wrap it so it has the same shape as multiple requests
        if (!is_array)
            response = [response];

        for (let chunk of response) {
            // chunk[0] is the parsed response body, here expected as { data: [...] }
            $.merge(result.data, chunk[0]['data']);
        }
        //...
    });

The only missing part now is handling the case where the server returns an error. I don't want to wait for all the ajax requests to finish; instead I want to abort the ones still pending.

$.when(...promises)
    .always(() => { /* ... */ })
    .done((...response) => { /* ... */ })
    .fail(() => {
        // abort every pending request
        for (let promise of promises) {
            promise.abort();
        }
        //...show the error
    });

At this point everything was ready and I just needed to take care of the PHP script. The problem with PHP is that when you hit the same script multiple times in parallel, there must not be a session lock; otherwise each request waits until the previous one has finished. A queue of waiting requests... definitely not something I want, because it would end up just like the one big ajax request. So right after the session validation I closed the session with session_write_close(), which lets the other waiting requests proceed.

As a result, I now have a solution which takes only ~10 seconds for 200 rules.

Yes, you get it: when I have, for example, 80 rules, I don't send them as 2 chunks (50, 30), because a chunk of 50 rules always takes ~10s. Instead I split them further: 20 rules take around 4s, so sending 4 parallel requests of 20 rules each brings the total down to ~4s. In my case splitting pays off as long as each chunk holds at least 5 rules (that's ~800ms); below that it isn't worth splitting further.
Small note: a full batch of 200 rules is sent only rarely.
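
To make that heuristic concrete, here is a small sketch (not from the original code; pickChunkSize, MAX_PARALLEL and MIN_PER_CHUNK are hypothetical names) of how a chunk size could be derived from this reasoning:

// Hypothetical helper illustrating the heuristic above:
// use up to MAX_PARALLEL requests, but never go below MIN_PER_CHUNK rules per chunk.
const MAX_PARALLEL = 4;   // stay safely under the ~6-connection browser limit
const MIN_PER_CHUNK = 5;  // below ~5 rules (~800ms) splitting is not worth it

function pickChunkSize(ruleCount) {
    let perChunk = Math.ceil(ruleCount / MAX_PARALLEL);
    return Math.max(perChunk, MIN_PER_CHUNK);
}

console.log(pickChunkSize(200)); // 50 -> 4 requests of 50 rules
console.log(pickChunkSize(80));  // 20 -> 4 requests of 20 rules
console.log(pickChunkSize(12));  // 5  -> 3 requests (5, 5, 2)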

One last piece of advice: always measure and test your code before you make a final decision.