My idea is to build some form of "live cURL results" system that produces results as each request completes. For example, I have a table with a list of websites that need to be accessed via cURL, and after each cURL response I need the data sent back to my page through the AJAX call that originally invoked the file performing this loop of requests:
<?php
foreach ($database['table'] as $row) {
    $ch = curl_init($row['url']);
    // ...the rest of the cURL request etc...
    // SEND cURL RESPONSE BACK TO AJAX AFTER EACH ROW!!!
}
?>
I then want it to return each cURL result as it happens, instead of waiting for the full script to complete and returning them all at once.
Is this possible? If so, would I still use a normal AJAX request?
How about this?
You break the entire process down into three steps.
Process 1
You send an AJAX request to the PHP page that reads the database and sends the list of URLs back to the browser page (simply echo the URLs, using '|' as a separator between them).
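A minimal sketch of that page, assuming a PDO connection in $pdo and a websites table with a url column (all three are placeholder names, not from the question):
<?php
// get_urls.php - echoes the url list as one '|'-separated string.
// $pdo, 'websites' and 'url' are assumed names; adapt them to your schema.
$stmt = $pdo->query('SELECT url FROM websites');
$urls = $stmt->fetchAll(PDO::FETCH_COLUMN);
echo implode('|', $urls);
?>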
Process 2
The AJAX success handler of the above request will then call a new JavaScript function that splits the URL list into an array.
Process 3
Send the URLs to the server one by one using another AJAX function.
This way only one URL is processed at a time, and each result is sent back as soon as it is ready.
Below is a prototype of the function calls in JavaScript. The code is not real, but it gives you an overall idea of how it works.
function ajaxCall_1()
{
    // get the url list from the server
    .onSuccess{
        process_url(data);
    }
}
function process_url(data){
    var url_array = data.split('|');
    fetch_urls(url_array, 0); // sends the first url for processing
}
function fetch_urls(url_array, position){
    if(position < url_array.length){ // not exceeding the array count
        ajaxCall_2(url_array[position], url_array, position);
    }
}
function ajaxCall_2(url, url_array, position)
{
    // ask the server to fetch the url with cURL and send back the result
    .onSuccess{
        // do whatever you want with the data
        fetch_urls(url_array, position + 1); // position + 1, not position++, so the next url is fetched
    }
}
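On the server side, the page that ajaxCall_2 requests would run the cURL call for that single URL and echo the result. A minimal sketch, assuming the script is named fetch_one.php and receives the URL in a 'url' GET parameter (both names are assumptions, and in real code you should validate the incoming URL first):
<?php
// fetch_one.php - fetches one URL with cURL and echoes the result.
// The script name and the 'url' parameter are assumed, not from the question.
$url = isset($_GET['url']) ? $_GET['url'] : '';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang on slow sites
$result = curl_exec($ch);
$error  = curl_error($ch);
curl_close($ch);
echo ($result === false) ? 'ERROR: ' . $error : $result;
?>
Because each AJAX call handles exactly one URL, the browser receives a result per site as soon as that request finishes, which gives the "live" behaviour asked for.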