I'm having trouble with some dumb architecture because I'm dumb. I'm trying to loop over YouTube videos posted to Reddit, extract URLs and process them into an .m3u playlist.
Full code at "Subreddit to YouTube Source Bookmarklet - Play YouTube music from subreddits in Foobar with foo_youtube".
At some point I get the idea that I could check each URL to see whether the video is dead, and offer an alternative if it has been removed.
So I do AJAX requests to YouTube API and if there's an error I'm supposed to react to it and change that item's URL.
But the problem is that it only works if the AJAX is NOT async; the synchronous requests take many seconds, during which the page is jammed.
I'd like to let the AJAX be async but I don't know how I should structure my code.
Here is PSEUDOCODE of how it is now:
var listing = // data from reddit API

$.each(listing, function(key, value) {
    var url = // post URL from reddit posts listing
              // ( "http://youtu.be/XmQR4_hDtpY&hd=1" )
    var aRegex = // to parse YouTube URLs
                 // ( (?:youtube(?:-nocookie)?.com/....bla...bla )
    var videoID = // YouTube video ID, extracted with regex
                  // ( "XmQR4_hDtpY" )
    var itemArtist = // parsed from reddit posts listing
                     // ( "Awesome Artist" )
    var itemTitle = // parsed from reddit posts listing
                    // ( "Cool Song (Original Mix)" )
    var itemURL = // url, further processed
                  // ( "3dydfy://www.youtube.com/watch?v=XmQR4_hDtpY&hd=1" )

    $.ajax({
        type: "HEAD",
        url: "https://gdata.youtube.com/feeds/api/videos/" + videoID,
        error: function() {
            // If it's no longer available
            // (removed, deleted account, made private)
            deadVid++; // chalk up another dead one, for showing progress
            itemURL = // dead videos should get a different URL
                      // ( "3dydfy-search://q=Awesome%20Artist%20Cool%20Song....." )
        }
    });

    // Further process itemURL!
    // At this point I want itemURL changed by the .ajax()'s error callback,
    // but I'm trying to keep the requests async
    // so as not to jam the page while a hundred HTTP requests happen!
    if (condition) {
        itemURL += // append various strings
    }

    // Write the item to the .m3u8 playlist
    playlist += itemURL + '\n';
}); // end .each()
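To make the timing problem concrete: with an async request, the error callback fires long after the loop body has already written itemURL to the playlist. A minimal plain-JavaScript sketch of that ordering (no jQuery; a resolved Promise stands in for the AJAX callback):

```javascript
// Sketch: the async callback runs as a later microtask, so the
// "playlist write" below still sees the original URL.
let itemURL = "3dydfy://www.youtube.com/watch?v=XmQR4_hDtpY";

// Stand-in for the $.ajax error callback:
Promise.resolve().then(function () {
  itemURL = "3dydfy-search://q=Awesome%20Artist%20Cool%20Song";
});

// This line runs first, before the callback has touched itemURL:
const written = itemURL; // still the original URL

console.log(written);
```

This is exactly why the synchronous version "works": it forces the callback to finish before the write, at the cost of freezing the page.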
Basically you want to know when all the requests have finished, and which ones failed.

If you push the errors into a list, the results will be ready at the end (order not guaranteed, of course). For the second part, if you keep an array of the promises returned from each $.ajax call, you can use $.when and wait for them all to complete using always().
As a basic example (other details removed):
var listing = {}; // data from reddit API
var promises = [];
var results = [];

$.each(listing, function(key, value) {
    // [snip]
    promises.push($.ajax({
        type: "HEAD",
        url: "https://gdata.youtube.com/feeds/api/videos/" + videoID,
        error: function() {
            // [snip]
            results.push({
                itemArtist: itemArtist,
                videoID: videoID,
                url: itemURL
            });
        }
    }));
});

// Wait for all promises to complete (pass or fail)
$.when.apply($, promises).always(function() {
    // process list of failed URLs
});
Apologies for any typos; this was coded straight into the answer, but you get the idea.
I note you mention hundreds of requests, but the browser will only allow a handful of concurrent connections at a time, so there is no need for additional throttling.
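For comparison, the same pattern translates directly to native Promises: collect one promise per request, push failures into a results array, and use Promise.all over the already-caught promises to know when everything is done. This is a sketch, not the original code; checkVideo is a hypothetical stand-in for the HEAD request:

```javascript
// Native-Promise sketch of the same pattern. checkVideo is a hypothetical
// stand-in for the HEAD request: here it rejects for IDs marked "dead".
function checkVideo(videoID) {
  return videoID.indexOf("dead") !== -1
    ? Promise.reject(new Error("video removed: " + videoID))
    : Promise.resolve(videoID);
}

var videoIDs = ["XmQR4_hDtpY", "deadVid1", "okVid2"];
var results = []; // failed items, as in the answer above

// One promise per request; failures are caught and recorded, so the
// resulting promises always fulfil and Promise.all never short-circuits.
var promises = videoIDs.map(function (videoID) {
  return checkVideo(videoID).catch(function () {
    results.push({ videoID: videoID });
  });
});

// Runs once every check has completed, pass or fail.
Promise.all(promises).then(function () {
  console.log("dead videos:", results.length);
});
```

The .catch on each promise plays the same role as always() in the jQuery version: it stops one failed request from aborting the whole batch.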
If always is not working, you can add your own deferred objects that resolve on success or fail:
var listing = {}; // data from reddit API
var promises = [];
var results = [];

$.each(listing, function(key, value) {
    var deferred = $.Deferred();
    promises.push(deferred.promise());
    // [snip]
    $.ajax({
        type: "HEAD",
        url: "https://gdata.youtube.com/feeds/api/videos/" + videoID,
        complete: function() {
            // resolve on success or fail
            deferred.resolve();
        },
        error: function() {
            // [snip]
            results.push({
                itemArtist: itemArtist,
                videoID: videoID,
                url: itemURL
            });
        }
    });
});

// Wait for all promises to complete (pass or fail)
$.when.apply($, promises).always(function() {
    // process list of failed URLs
});
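If $.Deferred() looks like magic, the same idea is only a few lines in plain JavaScript: a promise whose resolve function is exposed, so a complete-style callback can settle it whether the request passed or failed. A minimal sketch (makeDeferred is a hypothetical helper name, not part of jQuery):

```javascript
// A minimal "deferred": a promise plus its externally exposed resolve.
function makeDeferred() {
  var resolve;
  var promise = new Promise(function (res) {
    resolve = res;
  });
  return { promise: promise, resolve: resolve };
}

// Usage: resolve from a "complete"-style callback, pass or fail.
var deferred = makeDeferred();
deferred.promise.then(function (value) {
  console.log("settled with:", value);
});
deferred.resolve("done"); // e.g. called from the complete handler
```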
Here is another cool way to chain parallel promises together, without using an array (so long as you do not need the data values passed through to the callbacks):
Simplified code:
var promise; // undefined is also treated as a resolved promise by $.when()

$.each(listing, function(key, value) {
    // Each new Ajax promise is chained, in parallel, with the previous promise
    promise = $.when(promise, $.ajax({...}));
});

// When they are all finished, fire a final callback
$.when(promise).always(function() {
    // All done!
});
This has had some criticism, mainly from people who feel it is "unclean", but the trade-off is minimal for how much it simplifies parallel promise code.
I figured out this was possible when I saw someone use promise = promise.then(newpromise) to chain events in sequence. After some experimenting, I found I could do the same in parallel using promise = $.when(promise, newpromise).