I have a hosted web solution and a script that updates information on the backend of the site.
My web host allows cron jobs, and I have set one up that wgets the script every hour.
The script checks the database table of posts, finds the one that was updated longest ago, and updates it. At the end of the script I use a JavaScript refresh to run the same procedure again, this time with the next-oldest post, and so on until all posts have been processed. If I open the script in my browser it works fine, but when my host runs it from cron, the script does not continue after the JavaScript refresh.
How can I replace the refresh with something that keeps looping until my stop condition ends it, with cron only having to start the script once?
(I switched to this approach from one where all posts were updated in a single page load, because that version started to time out.)
script.php
$limit = 3600;
// Select the post that was updated longest ago
if ($last_update < $update_to_limit) { // check whether this post still needs updating in this run
    // ... code that updates the post, followed by the JavaScript refresh that restarts the script:
?>
<script type="text/javascript">
<!--
window.location.href = "http://www.site.se/script.php";
//-->
</script>
<?php
} else {
    echo 'OK : All posts updated within the last '.$limit.' s';
}
Wget will not run JavaScript; if you open the page in your browser, the browser will of course run it. That is the reason for the difference.
I don't really recommend the method you're trying to use. If you really want, you can call wget again from PHP, use cURL, or I suppose even header('Location: ...');. But it would be much nicer to solve this in a single run.
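For illustration, a minimal sketch of the cURL variant, reusing the URL and variables from the question; treat it as a sketch rather than a drop-in replacement:

ignore_user_abort(true); // keep running even if the caller disconnects early
if ($last_update < $update_to_limit) {
    // ... update the oldest post here ...
    // Re-trigger the script server-side instead of via a JavaScript redirect.
    $ch = curl_init('http://www.site.se/script.php');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);           // don't wait for the whole next run
    curl_exec($ch);
    curl_close($ch);
} else {
    echo 'OK : All posts updated within the last '.$limit.' s';
}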
If your code times out, I'd revisit how the PHP code is written and try to find a less time-consuming approach. After that, look at the database: is it in good shape, are the indexes in place, and so on. Or, if you cannot optimize it (or do not want to), you can use set_time_limit().
(One more thing: Java != JavaScript. If you want to shorten it, write JS instead of Java.)
I need to run a PHP script from another site, without using cron, so that it is called whenever anyone visits or refreshes the page.
The script performs an update of my database and may take several tens of seconds, so it needs to run in a way that does not slow things down for the visitor of the page from which it is called.
However, I do not want the script to actually start every time someone visits or refreshes the page; I would like to limit it to once per minute. Before calling the script, I want to store the current time in the MySQL database. When someone loads or refreshes the page, I compare the current time with the time of the last script call stored in the database: if the difference is less than one minute the script is not called, but if it is more than one minute the script runs and the database is updated again with the time of this latest execution.
I do not need any response from running the script.
Importantly, it must not affect page loading for the user whose visit triggers it.
Thanks for any help.
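For reference, the gating described above could look roughly like this; the script_runs table and its last_run column are invented for illustration, and the table is assumed to hold a single row:

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
// Time of the last run, stored in a single-row table
$lastRun = $pdo->query('SELECT last_run FROM script_runs LIMIT 1')->fetchColumn();
if ($lastRun === false || time() - strtotime($lastRun) > 60) {
    // Record the new run time first, so near-simultaneous requests are unlikely to both trigger it
    $pdo->exec('UPDATE script_runs SET last_run = NOW()');
    // ... trigger the long-running update script here (in the background, see the answers below) ...
}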
You could make a jQuery AJAX call in the background after the page has loaded, so the user doesn't have to wait for the script to finish before the page loads.
http://api.jquery.com/jQuery.ajax/
However, I do not think the way you want to do it is the right approach. It's possible, but I'm not sure it's useful.
Can you split your script into smaller tasks? Then you could run them before loading the page, and users won't notice any difference.
OK, if JavaScript is not an option, you can do a little more research on PHP forking. It's basically PHP's version of a thread, but much more limited. You can fork a child PHP process to run in the background while the main process keeps doing the usual thing, so it won't affect your day-to-day processing.
http://php.net/manual/en/function.pcntl-fork.php
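A minimal sketch of how pcntl_fork() is typically used; run_long_update() is a placeholder for your own code, and note that the pcntl extension is normally only available from the command line, not under a web server:

$pid = pcntl_fork();
if ($pid === -1) {
    die('Could not fork');
} elseif ($pid === 0) {
    // Child process: do the slow database work here, then exit.
    run_long_update();
    exit(0);
} else {
    // Parent process: continue immediately without waiting for the child.
    echo 'Update started in the background';
}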
Situation:
My PHP/HTML page retrieves the contents of another page on a different domain every 5-10 minutes or so. I use a JavaScript setInterval() and a jQuery .load() to request content from the other domain into an element on my page. Each time it retrieves content, JavaScript compares the new content with the previous content, and then I make an AJAX call to a PHP script that emails me the changes.
Problem:
It's all working fine and dandy except for the fact that I need a browser constantly open, requesting the updates.
Question:
Is there a way to accomplish this with some sort of 'self-executing' script on the server? Something I would only have to start once, which then keeps running on its own, without a browser needing to be open, for as long as I want it to run?
Thanks in advance!
P.S. I'm not a PHP/JavaScript expert by any means, but I can find my way around.
I believe the thing you are looking for is a cron job.
If your script relies on Javascript for proper execution, you will need to use a browser to accomplish your goals.
However, if you can alter your script to perform all of the functionality via PHP, perhaps using cURL to request the necessary data, you can use a cron job to execute the script at regular intervals.
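A rough sketch of what such a cron-driven script could look like; the URL, snapshot path, and email address are placeholders, and the comparison is deliberately naive:

// check_remote.php - run from cron, e.g.: */5 * * * * php /path/to/check_remote.php
$url      = 'http://other-domain.example/page.html';
$snapshot = '/tmp/last_snapshot.html'; // previous copy of the page

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$current = curl_exec($ch);
curl_close($ch);

$previous = file_exists($snapshot) ? file_get_contents($snapshot) : '';
if ($current !== false && $current !== $previous) {
    mail('you@example.com', 'Watched page changed', 'The watched page has changed since the last check.');
    file_put_contents($snapshot, $current);
}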
If you're running a script at an interval, I would recommend using a bash script that runs in the background instead.
#!/bin/bash
while [ 1 ]
do
php "script.php"
sleep 300
done
Then you can run it with something like nohup bash bash.sh & so it keeps running after you log out. 300 seconds = 5 minutes.
I am struggling with something.
I have a PHP page that makes an AJAX call to another page using jQuery $.ajax. It sends the request asynchronously to the processing page, which then returns a response.
This works fine now, but we are making some changes to the backend, and the processing (an SQL stored procedure) now takes a lot longer, well over 5 minutes. The wait itself is fine because we are dealing with close to 200 million records in SQL.
The thing is, I need to be able to send the request to the processing page and not have to wait for a response. The processing page fires off the stored procedure in PHP like this:
$query = $dbh2->prepare('exec sp_name :countID');
$query->bindParam(':countID', $countID);
$query->execute();
Again, that stored procedure takes a while to run, and we do not need its results presented back to the user. There is, though, some additional PHP code that needs to run after the stored procedure, but again nothing needs to be sent back to the browser.
I am trying to figure out a way to call the processing page so that it runs the stored procedure and the other code without the user's browser having to wait for the response. Right now, if they try to click off the page too soon, it basically locks up the browser for a while and the processing does not finish.
Any insight into this would be great.
Thanks in advance for any help.
Sequenzia, if I understand correctly, I've been here myself and found a way through this quagmire after a lot of research.
I provided an answer to a similar question a few months ago. Unfortunately, neither the OP nor anyone else ever accepted/commented/upvoted/downvoted it - nada.
Run a batch file from my website
And here are some useful references :
Running a background script (unix command)
Ref: http://nsaunders.wordpress.com/2007/01/12/running-a-background-process-in-php/
Ref: http://www.mathinfo.u-picardie.fr/asch/f/MeCS/courseware/users/help/general/unix/redirection.html
How to compose PHP $shortopts and $longopts
This is the way to interpret parameters passed to a PHP script when it is run from the command line, or from another PHP script via shell_exec().
Ref: http://www.php.net/manual/en/function.getopt.php
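To tie those references together, here is a rough sketch of the pattern with made-up file names; the output redirection and trailing & are what let the caller return immediately:

// Caller (the page the AJAX request hits): start the worker and return at once.
$countID = (int) $_POST['countID'];
shell_exec('php /path/to/worker.php --countID=' . escapeshellarg((string) $countID) . ' > /dev/null 2>&1 &');
echo 'Processing started';

// worker.php: read the option and run the slow stored procedure plus the follow-up code.
$options = getopt('', array('countID:'));
$countID = (int) $options['countID'];
// ... PDO prepare/execute of the stored procedure goes here ...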
You might look at setting the timeout option for the $.ajax() method. By setting a timeout of maybe half a second or so, the AJAX call will simply time out and go into the error handler (if any).
Simple question, but I can't seem to find the answer.
My PHP code takes a really long time to run because I'm generating a report from a large database. I coded an HTML table to display the results in a web page, but the page loads (and gets sent to clients) before my PHP code finishes, so all the table values are empty. If I run the query in phpMyAdmin it works, it just takes a long time. Any ideas? Are there other ways I can display the report in a table format besides a web page? Can I make the web page wait until the code finishes?
There are several approaches.
One is using output buffering and flushing the output as it is generated:
ob_start();
// processing
ob_flush();
flush();
The next is adding pagination, i.e. limiting the result size:
SELECT * FROM table LIMIT 0,10
SELECT * FROM table LIMIT 10,10
SELECT * FROM table LIMIT 20,10
Of course, it all depends on your code; without seeing it, one can only guess at the reason.
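Combining the two ideas, a rough sketch might look like this; the table and column names are invented, and the chunk size is arbitrary:

$pdo = new PDO('mysql:host=localhost;dbname=reports', 'user', 'pass');
$offset = 0;
echo '<table>';
while (true) {
    $rows = $pdo->query("SELECT id, amount FROM report_rows LIMIT $offset, 100")->fetchAll(PDO::FETCH_ASSOC);
    if (!$rows) {
        break;
    }
    foreach ($rows as $row) {
        echo '<tr><td>' . $row['id'] . '</td><td>' . $row['amount'] . '</td></tr>';
    }
    flush(); // push this chunk to the browser before fetching the next one
    $offset += 100;
}
echo '</table>';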
Can I make the webpage wait until the code finishes?
It's really, really difficult to write PHP code that implements asynchronous database calls - which is what it would have to be doing if the PHP script completed before the MySQL query. Just strip out all the asynchronous handlers in the PHP code and make the MySQL calls blocking, and it will not exit before the queries complete - but I very much doubt that is what your code is really doing.
but the page loads (and gets sent to clients)
This is confused too - if you're generating HTML, then the page is loaded after the HTML is sent to the client - not before.
Simple question
No it's not - it's very confused!
The prudent way
One of the correct approaches, at least one I'd recommend, is this: when the user makes the request, add the job to a queue that is handled by a background process, for example a PHP command-line script run from a cron job. While that is going on, you can periodically request the job status from the server via an AJAX call from your web page, display progress if you can, and present the user with the result once the job is finished. Since command-line PHP scripts have no time limit by default, you don't have to worry about timeouts.
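A bare-bones sketch of that queue idea, with an invented jobs table; the worker would be run from cron every minute and the status script polled via AJAX:

// enqueue.php - called when the user requests the report
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->exec("INSERT INTO jobs (status, created_at) VALUES ('pending', NOW())");

// worker.php - cron entry: * * * * * php /path/to/worker.php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$job = $pdo->query("SELECT id FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")->fetch();
if ($job) {
    $pdo->exec("UPDATE jobs SET status = 'running' WHERE id = " . (int) $job['id']);
    // ... generate the report here ...
    $pdo->exec("UPDATE jobs SET status = 'done' WHERE id = " . (int) $job['id']);
}

// status.php - polled via AJAX; returns the status of the latest job as JSON
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
echo json_encode(array('status' => $pdo->query('SELECT status FROM jobs ORDER BY id DESC LIMIT 1')->fetchColumn()));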
Another way is what is implemented, for example, in 37signals' Highrise: they take the request, add a job, and display a page saying "It will be ready when it's ready"; when it is ready, they send the user an email saying "Here's your file, come and download it."
The quick fix
To answer the question "Can I make the webpage wait until the code finishes?" – there is the set_time_limit() function that does exactly what you want.
I have a PHP script that takes about 10 minutes to complete.
I want to give the user some feedback as to the completion percent and need some ideas on how to do so.
My idea is to call the php page with jquery and the $.post command.
Is there a way to return information from the PHP script without ending the script?
For example, as I understand it now, if I return a value, the PHP script stops running.
My idea is to split the script into multiple PHP files and have $.post run each one after the previous one returns.
But this still will not give an accurate assessment of time left because each script will be a different size.
Any ideas on a way to do this?
Thanks!
You can echo and flush() output, but that's a suboptimal and rather fragile solution.
For long operations it might be a good idea to launch the script in the background and store/update its status in a shared location.
E.g. you could launch the script using an fopen('http://… call, proc_open() a PHP CLI process, or even just open the long-running script in an <iframe>.
You could store the status in the database or in shared memory (using apc_store()).
This lets the user check the status of the script at any time (by refreshing the page or using AJAX), and they won't lose track of the script if the browser's connection times out.
It also lets you avoid starting the same long script twice.
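As a rough sketch of the shared-memory variant, assuming the APC extension is loaded; the job key and step counters are made up:

// Long-running script: record progress as it goes.
$jobKey = 'report_progress_' . $jobId; // $jobId identifies this particular run
for ($step = 1; $step <= $totalSteps; $step++) {
    // ... do one chunk of the work ...
    apc_store($jobKey, (int) (100 * $step / $totalSteps)); // percent complete
}

// progress.php - polled via AJAX by the page that shows the progress bar.
$percent = apc_fetch('report_progress_' . $_GET['jobId']);
echo json_encode(array('percent' => $percent === false ? 0 : $percent));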