I'd like to set a cookie in PHP for my website visitors after they have been on my site for at least 2 minutes.
I guess the sleep() function could do just that, but I read that it might delay loading of the entire page.
Is there any other way to do this?
You can create an AJAX request from JavaScript that loads a PHP file after 2 minutes.
In JS:
<script>
setTimeout(function() {
    // Send the AJAX request to set_the_cookie.php so the cookie gets set server-side
    var xhr = new XMLHttpRequest();
    xhr.open('GET', 'set_the_cookie.php', true);
    xhr.send();
}, 120000); // 120000 ms = 2 minutes
</script>
Information about AJAX requests.
In PHP (set_the_cookie.php):
<?php
// set_the_cookie.php - called by the AJAX request above
$value = 'your_value';
setcookie('cookie_name', $value);
?>
I have a form which submits an AJAX request to one of my controllers, which uploads a file using PHP's cURL. I want to show the user the status of that (PHP) upload, so I store the PHP upload progress in a session variable. Meanwhile, the submission of the form also starts a setInterval() which makes a different AJAX request to a controller that checks the session variable. My problem is that the second AJAX call only seems to fire once (instead of repeatedly throughout the upload), so instead of progressively updating the progress it just returns 100 at the end. What am I doing wrong?
Here's my code:
(note: I'm using the jQuery form plugin to assist with the file upload. It also adds some additional callbacks)
<script>
var t;

// Polls the server for the current upload progress.
var check_progress = function() {
    $.ajax({
        url : '/media/upload_progress',
        success : function(data) {
            console.log(data);
        },
        async : false
    });
};

var options = {
    beforeSend : function() {
        $("#MediaSubmitForm").hide();
        $("#MediaSubmitForm").after('<img class="hula-hippo" src="/img/hippo-hula.gif" />');
        t = setInterval( check_progress, 500 );
    },
    success : function(data){
        $(".hula-hippo").hide();
        $("#MediaSubmitForm").after("<h3>Upload complete!</h3><p>Do <strong>you</strong> want to <a href='#'>create a project</a> of your own?</p>");
        window.clearInterval(t);
        console.log(data);
    }
};

$("#MediaSubmitForm").ajaxForm(options);
</script>
Use setTimeout(). setTimeout() executes the code once after the specified delay, and setInterval() executes the code repeatedly, every time the specified interval elapses.
These questions explain their difference well :)
setTimeout or setInterval?
'setInterval' vs 'setTimeout'
setInterval & setTimeout?
setInterval and setTimeout
JavaScript setInterval and setTimeout
And a search for this on SO will solve your problem :)
It sounds like this is a PHP session locking issue: PHP's default session handler locks the session file for the duration of the upload request, so the progress-checking request blocks until the upload has finished. See the first comment in the answer to this question:
jQuery: Making simultaneous ajax requests, is it possible?
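If that is indeed the cause, the usual workaround is to hold the session lock only for the brief moment of each read or write by calling session_write_close(). A minimal sketch, assuming the progress lives in a hypothetical $_SESSION['upload_progress'] key:
<?php
// Sketch 1 - inside the long-running upload action, each time the progress changes:
function store_progress($percent)
{
    session_start();                          // briefly re-acquire the session lock
    $_SESSION['upload_progress'] = $percent;  // hypothetical key, adjust to your code
    session_write_close();                    // release it so the polling request can read it
}

// Sketch 2 - the controller polled by the second AJAX call (a separate request):
session_start();
$progress = isset($_SESSION['upload_progress']) ? $_SESSION['upload_progress'] : 0;
session_write_close();                        // don't hold the lock while responding
header('Content-Type: application/json');
echo json_encode(array('progress' => $progress));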
I have a webpage that embeds a script which could take up to 10 minutes to run on the backend.
I have tried many different versions of an AJAX script loader with a timer.
What I need:
On page load, I need to trigger the main working script to run.
The very last line of this script will create a unique text file in a folder, named after the user, so that the file appears once the script has completed.
Then, also triggered on page load, an AJAX function would load a second script every 10 seconds.
This second script is very minimal and checks whether the user's file (from the script above) is in the specified dir.
If the file is not there yet (script still working), it echoes
<img src="../loading.gif">
and if the file is now there (script has finished), it echoes a link (or maybe a header to another page, I haven't decided about that yet)...
This means that on page load the main processing script starts and the AJAX script is also triggered (which instantly displays the loading image); once the main script has finished executing, the loading image changes to a link (or maybe just redirects you to another page).
Sorry for rambling, just trying to give as much info as possible...
PS: I presume I will need a simple load-once AJAX function to call the main processing script, so that it runs in the background, or the main page will take ages to load.
My latest attempt:
// getXMLHttp() is assumed to be defined elsewhere; a minimal version is shown here.
function getXMLHttp()
{
    return window.XMLHttpRequest ? new XMLHttpRequest() : new ActiveXObject("Microsoft.XMLHTTP");
}

function MakeRequest()
{
    var xmlHttp = getXMLHttp();
    xmlHttp.onreadystatechange = function()
    {
        if (xmlHttp.readyState == 4)
        {
            HandleResponse(xmlHttp.responseText);
        }
    };
    xmlHttp.open("GET", "processing_script.php?user=<?php echo $_SESSION['user']; ?>", true);
    xmlHttp.send(null);
}

function HandleResponse(response)
{
    document.getElementById('ResponseDiv').innerHTML = response;
}
To load a file every few seconds, you can call your function with setInterval(), passing the desired interval in milliseconds.
Have you considered using a JavaScript framework such as jQuery? It provides some very easy-to-use AJAX methods that simplify the whole process. Handling an AJAX request "manually" is always a "pain in the ass" to me.
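For the minimal second script the question describes, something along these lines could work on the PHP side. This is only a sketch; check_done.php, the flags directory and the filename convention are all assumptions:
<?php
// check_done.php - hypothetical checker polled by the AJAX call every 10 seconds.
session_start();
$user = basename($_SESSION['user']);           // sanitise the session value before using it as a filename
$flag = __DIR__ . '/flags/' . $user . '.txt';  // the directory/naming scheme is an assumption

if (file_exists($flag)) {
    // The main script has finished: show a link (or redirect instead, if preferred).
    echo '<a href="results.php">Processing finished - view the results</a>';
} else {
    // Still running: keep showing the loading image.
    echo '<img src="../loading.gif" alt="loading">';
}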
I have developed an application for analyzing data, i.e. domain names. When the user provides 10 domains, the following JavaScript code works fine, but when the user starts an analysis of 100 domains, the code below does not work. I used JavaScript to redirect to another page 3 seconds after the form submit, because processing the assigned task takes at least 1 minute.
function submitForm(){
    document.form1.button2.click();
    var t = setTimeout(redir, 3000); // pass the function itself instead of an eval'd string
}

function redir(){
    window.location.href = '<?php echo base_url();?>menu/showmsg';
}
When it is a small task it works fine, but if there is a big file to process the JavaScript does not work; it waits until the task is completed on the PHP side.
Is there any option in AJAX or jQuery, or any better way to do this in JavaScript?
A simple way is to use an iframe in the target page that runs the PHP. The page will load, and the iframe will wait for the PHP page. You can use some JavaScript to check for a change or flag in the iframe to move on with the process.
You can send all requests asynchronously using, for example, jQuery.ajax( url [, settings] ).
Here is the documentation: http://api.jquery.com/jQuery.ajax/
$.ajax({
    url: '<?php echo base_url();?>menu/showmsg',
    error: function(){
        return true;
    },
    success: function(msg){
        // do whatever you want with the response here
    }
});
I have constructed a PHP file which scrapes a web page (using cURL) to obtain some data, and outputs it to the screen in JSON format.
The target website involves some redirects which temporarily output data to my PHP file. Once the redirects have completed successfully, the JSON is presented as expected. The problem I am encountering is that when I try to access the JSON using jQuery's $.ajax() method, it sometimes returns incorrect data, because it isn't waiting for the redirects to complete.
My question is if it's possible to tell the AJAX request to wait a certain number of seconds before returning the data, thus allowing time for the redirects in the PHP script to execute successfully?
Please note that there is no cleaner solution for the page scrape; the redirects are essential and have to be output to the screen for the scraping to complete.
There's always timeout in the settings.
jQuery docs:
timeout (Number)
Set a timeout (in milliseconds) for the request. This will override any global timeout set with $.ajaxSetup(). The timeout period starts at the point the $.ajax call is made; if several other requests are in progress and the browser has no connections available, it is possible for a request to time out before it can be sent. In jQuery 1.4.x and below, the XMLHttpRequest object will be in an invalid state if the request times out; accessing any object members may throw an exception. In Firefox 3.0+ only, script and JSONP requests cannot be cancelled by a timeout; the script will run even if it arrives after the timeout period.
You should use promise() in jQuery, so your code runs only once the request has completed.
You could always store the result of your AJAX call and then wait for the redirects to finish, i.e.:
$.ajax({
    success: function(e)
    {
        var wait = setTimeout(function(){ doSomethingWithData(e.data); }, 5000); // 5 sec
    }
});
Alternatively, you could set up an interval to check whether something has happened (redirect finished) every x ms. I'm assuming your redirects are letting you know they completed?
http://examples.hmp.is.it/ajaxProgressUpdater/
$i = 0;
while (true)
{
    // Stop as soon as jQuery reports no active AJAX requests.
    if (self::$driver->executeScript("return $.active == 0")) {
        break;
    }
    // Give up after 20 attempts.
    if ($i == 20) {
        break;
    }
    $i++;
    echo $i;
    usleep(10000);
}
I've got a web page that allows the user to start a certain process and then redirects to another page that displays the log file of that process. Since execution takes up to 10 minutes, I want the log page to auto-update itself or load data from the file periodically.
Right now I added
<meta http-equiv="refresh" content="5;url=log.php#bottom" />
to html/head, but I'm wondering if there may be a better solution. Can someone give any advice on approaching this problem?
I do it this way:
var current_length = 0;

function update() {
    setTimeout(update, 3000);
    $.post("/update_url", { 'current_length': current_length }, function(data) {
        if (data.current_length != current_length) return; // stale response from an earlier poll
        $("#log").html($("#log").html() + data.text);
        current_length += data.text.length;
    }, "json");
}
update();
The server must skip current_length bytes at the beginning of the file and send back JSON containing that current_length and the rest of the file.
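A rough sketch of that server side (the log file path is an assumption; the response shape matches the JavaScript above):
<?php
// update_url endpoint - returns the part of the log the client has not seen yet.
$logfile = '/path/to/process.log';   // assumption: adjust to the real log location

$offset = isset($_POST['current_length']) ? (int)$_POST['current_length'] : 0;

clearstatcache();                    // make sure filesize() sees the latest size
$text = '';
if (filesize($logfile) > $offset) {
    // Read from the offset to the end of the file.
    $text = file_get_contents($logfile, false, null, $offset);
}

header('Content-Type: application/json');
// Echo the offset back unchanged so the client can discard stale responses.
echo json_encode(array('current_length' => $offset, 'text' => $text));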
I prefer using memcached to store process output.
You could:
Periodically poll the server to see if there are more messages. Basically, you would call a PHP script from JavaScript, pass it the length of the log file you received in the last poll, and insert the new data into the document; the server would return all the data after that offset plus the new length.
(simpler) Make a long-lived PHP script that keeps reading the file and echoes and flushes new data as soon as it appears (a sketch follows below). See PHP: How to read a file live that is constantly being written to.
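A minimal sketch of that long-lived script; the log path, the ten-minute cap and the polling delay are all assumptions to adjust:
<?php
set_time_limit(0);
$logfile  = '/path/to/process.log';   // assumption
$handle   = fopen($logfile, 'r');
$deadline = time() + 600;             // stop after roughly 10 minutes

while (time() < $deadline && !connection_aborted()) {
    $line = fgets($handle);
    if ($line !== false) {
        echo $line;
        flush();                        // push the new line to the browser immediately
    } else {
        fseek($handle, ftell($handle)); // clear the EOF flag so newly appended data is seen
        usleep(500000);                 // nothing new yet; wait half a second
    }
}
fclose($handle);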
Use AJAX to do this. Easy in jQuery:
<script type="text/javascript">
$(function(){
    // Poll the log every 5 seconds.
    window.setInterval(updateLog, 5000);
});

function updateLog() {
    $.get('log.php', function(data) {
        $('#log').html(data); // assumes an element with id="log" on the page
    });
}
</script>
Why not use JavaScript?
Use setInterval and run an AJAX call to log.php periodically.
You could also use an iframe, but the periodic AJAX call is a better way of doing it, in my opinion.