I have developed an e-blast application. The program sends an email to a list of recipients whose addresses are read from an XLS file. It is set to send 10 emails at a time and then sleep for 30 seconds, and it uses ob_flush() and flush() to stream progress output to the frontend.
Yesterday my client tested it with 9,000 recipients (which should take around 10 hours), and he told me the program had stopped. The log file showed that the program stopped at 65XX emails, meaning it had already sent 6XXX emails (around 7 hours in).
This problem never happens when the script runs as a cron job; it only happens when it is executed through the web browser. A friend told me it is probably because of the long sleep times, and he suggested using a cron job. However, my application already has a cron job set up; the client just wants a feature to send the emails immediately.
Is there any other solution? For example, having PHP call a Linux command that executes a PHP email-sending script?
Long-running processes in Apache or IIS are tricky. The problem is that if anything happens, like a webserver restart or a timeout, you lose your work. You are better off keeping this simple and making it a cron job, but if you are up for the challenge it is possible to work around.
I've gotten around occasional webserver restarts by saving the state of my process into a database, plus a script that continually hits the page to check whether the process is up and working. When the long-running process first loads, it checks whether it should be running and whether it should resume an existing job. In your case the saved state might be the current line number of the Excel file.
It ends up being a lot of extra work and you need to be very careful. From the sounds of your project I would keep it simple and go the cron-job route you mentioned.
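As a rough sketch of that checkpointing idea (the jobs table, $jobId, $recipients, and sendEmail() here are hypothetical, not from the original post):
<?php
// Minimal resume-from-checkpoint sketch. Assumes a PDO connection $pdo
// and a hypothetical table: jobs(id, last_row).
function getCheckpoint(PDO $pdo, int $jobId): int
{
    $stmt = $pdo->prepare('SELECT last_row FROM jobs WHERE id = ?');
    $stmt->execute([$jobId]);
    return (int) $stmt->fetchColumn(); // 0 if the job has not started yet
}

function saveCheckpoint(PDO $pdo, int $jobId, int $row): void
{
    $stmt = $pdo->prepare('UPDATE jobs SET last_row = ? WHERE id = ?');
    $stmt->execute([$row, $jobId]);
}

// Skip rows that were already processed, and persist progress after
// every send so a restart can resume where it left off.
$startRow = getCheckpoint($pdo, $jobId);
foreach (array_slice($recipients, $startRow) as $i => $recipient) {
    sendEmail($recipient); // your existing send logic
    saveCheckpoint($pdo, $jobId, $startRow + $i + 1);
}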
My solution: set your cron job to run every minute.
However, you should save the state of your cron job so that it doesn't run twice in parallel.
I usually do it this way (note that this cron is intended to run every minute):
if (stat_check_file('cron.stat')) {
    die("Found CRON.STAT, exiting!");
}

stat_create_stat_file('cron.stat');

// do your long process here...

stat_delete_stat_file('cron.stat');

function stat_check_file($filename)
{
    global $rootdir;
    return file_exists($rootdir . '/' . $filename);
}

function stat_create_stat_file($filename)
{
    global $rootdir;
    touch($rootdir . '/' . $filename);
}

function stat_delete_stat_file($filename)
{
    global $rootdir;
    if (stat_check_file($filename)) {
        unlink($rootdir . '/' . $filename); // remove the stat file so the next run can start
    }
}
Now, in your cron job, simply load the XLS file, process it, and write a log to either a database or a file.
Then, in your panel, read that log and display it so that your client can see, at any moment, that XXX emails have been sent and XXX are left to go.
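A minimal sketch of that progress log, assuming a plain text file and hypothetical $sent/$total counters in the cron job:
// In the cron job, after each batch of 10 emails:
file_put_contents($rootdir . '/progress.log', "$sent/$total", LOCK_EX);

// In the panel:
list($sent, $total) = explode('/', file_get_contents($rootdir . '/progress.log'));
echo $sent . ' emails sent, ' . ($total - $sent) . ' to go.';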
I have an 80MB zip file that I am uploading to 'ShareFile' through their REST API using PHP cURL. The script works fine when run via the browser or on my local Windows machine using the php7 CLI. However, when I attempt to run the script via a cron job on our client's shared hosting using /usr/local/bin/ea-php71, the script suddenly stops running mid-upload.
1. I have set my email address to be notified of any and all cron output.
2. I have a set_error_handler error handler that echoes any received errors and attempts to log them to a file (this never gets called).
3. I have a register_shutdown_function callback that also never gets called.
4. Finally, I have a curl_setopt CURLOPT_PROGRESSFUNCTION callback to see what is happening with the upload: it writes the total upload bytes $uploadTotal and the current upload bytes $uploadCurrent to a log along with a timestamp (see the sketch after this list).
5. (Edit) I forgot to mention that I've also enabled the verbose setting CURLOPT_VERBOSE and am outputting stderr to a log file, but nothing relevant is displayed.
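For reference, the progress logging described in item 4 can be wired up roughly like this (a sketch; the paths are placeholders and the rest of the upload setup is simplified):
$fp = fopen('/path/to/upload.zip', 'rb');
$stderrLog = fopen('/path/to/curl_stderr.log', 'ab');

$ch = curl_init($uploadUrl);
curl_setopt($ch, CURLOPT_UPLOAD, true);
curl_setopt($ch, CURLOPT_INFILE, $fp);
curl_setopt($ch, CURLOPT_INFILESIZE, filesize('/path/to/upload.zip'));
curl_setopt($ch, CURLOPT_NOPROGRESS, false); // required, or the callback never fires
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION,
    function ($ch, $dlTotal, $dlNow, $uploadTotal, $uploadCurrent) {
        file_put_contents('/path/to/progress.log',
            date('c') . " upload progress: [UploadTotal: $uploadTotal] [UploadCurrent: $uploadCurrent]\n",
            FILE_APPEND);
        return 0; // returning non-zero aborts the transfer
    });
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_STDERR, $stderrLog); // item 5: verbose output to a log
curl_exec($ch);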
Number 4 has given me the most insight into the issue: when the script is run as a cron job, the upload stops at
upload progress: [UploadTotal: 87455948] [UploadCurrent: 34913772]
every time (roughly 67 seconds in).
Some final details ...
When uploading the 80MB file through cron, my error handler and shutdown function never get called, nor does the cron email contain any echoed data. The script just seems to PAUSE forever (it's almost as if it is waiting for user input or something).
However:
If I run the script locally, or change the upload to a 5MB file, everything works as normal: all functions get called and I receive an email notifying me that the script has finished running.
It looks to me like I'm hitting some kind of timeout through cron, which, per everything I've read, should not be the case. I'm looking for ideas about what I could be running into, or how to troubleshoot this one.
Thank you!
I'm currently running an Apache server (2.2) on my local machine (Windows), which I'm using to run some PHP scripts that take care of some tedious work. One of the scripts involves a ton of moving, resizing, and downloading/uploading files to another server. I would very much like the script to run continuously so that I don't have to babysit it by starting it up again every time it times out.
set_time_limit(0);
ignore_user_abort(1);
Both are set in my script, but after about 30 minutes to an hour the script stops and I get a 504 Gateway Time-out message in my browser. Is there something I'm missing in Apache or PHP to prevent the timeout? Or should I be running the script a different way?
Or should I be running the script a different way?
Definitely. You should run your script from the command line (CLI).
If I had to implement something like this, I would use two different scripts:
A. process_controller.php
B. process.php
The workflow would be:
1. The user calls script A from a browser.
2. Script A starts script B using system() or exec() and passes it a "process token" via the command line (see the sketch after this list).
3. Script B writes its execution status into a shared space: a file named after the token, a database table, or in general anything that script A can also read using the token as a reference.
4. Script A contains an AJAX call, polling, that asks script A for the status of the process for a given token.
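A sketch of step 2, assuming Linux (the paths are placeholders; the output redirection and trailing & are what let the web request return while script B keeps running):
// process_controller.php (fragment): start process.php detached from the request
session_start();
$token = session_id();
exec('php /path/to/process.php ' . escapeshellarg($token) . ' > /dev/null 2>&1 &');

// process.php (fragment): report status under the given token
$token = $argv[1];
file_put_contents('/tmp/' . basename($token) . '.status', 'started');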
Ajax polling:
<script>
// Requires jQuery for $.get
var myToken; // set this to the token handed back when the process was started
function ajaxPolling()
{
    $.get('process_controller.php?action=getStatus&token=' + myToken, function (data) {
        $('.result').html(data);
    });
}
setInterval(ajaxPolling, 60 * 1000); // every minute
</script>
There are some considerations about the communication between the two processes, depending on how many instances of script B you need to be able to run in parallel:
Just one: you don't need a random/unique token.
One per user: session_start(); $token = session_id();
More than one per user: session_start(); $token = session_id().microtime();
If you need to run it from your browser, you should make sure that there is no PHP execution time limit set in php.ini, and also that there is no limit set in mod_php (or whatever handler you are using) under Apache.
Use php's system() to call a shell script which starts a service/background task.
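For example (a sketch; the script path is a placeholder):
// Fire-and-forget: hand the work to a shell script that starts the background task.
system('/path/to/start-mailer.sh > /dev/null 2>&1 &');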
I have a PHP script that runs on a shared hosting server. This PHP script takes a long time to run: 20 to 30 minutes to finish a run. It is a recurring background process; however, I do not have control over when it starts (it could be triggered every five minutes, or every three hours, no one knows).
Anyway, at the beginning of this script I would like to detect whether the previous run is still going. If the earlier run is still running and has not finished, I should not run the script again. If it is not running, I run the new process.
In other words, here is some pseudo code. Let's call the script abc.php:
1. Start script abc.php.
2. Check whether an older instance of abc.php is still running. If it is, terminate.
3. If it is not running, continue with abc.php and do the work, which might take 30 minutes or more.
How can I do that? Please keep in mind this is shared hosting.
UPDATE: I was thinking of using a DB detection mechanism: when the script starts, it sets a value in the DB to STARTED=TRUE, and when done, it sets STARTED=FALSE. However, this solution is not reliable, because there is no guarantee that the script will terminate properly. It might get interrupted and therefore never update STARTED to FALSE. So the DB solution is out of the question. It has to be process detection of some sort, or maybe a different solution that I did not think of. Thanks.
If this is a CGI process, I would try using exec + ps, if the latter is available in your environment. A quick SO search turns up this answer: https://stackoverflow.com/a/7182595/177920
You'll need a script that is responsible for (and separate from) checking whether your target script is running, of course; otherwise you'll always see that your target script is running, based on the order of operations in your pseudo code.
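A sketch of the exec + ps approach using a PID file (the paths are placeholders; this assumes a Linux-like host where ps is available):
$pidFile = '/tmp/abc.pid';

// If a previous run recorded its PID and that process still exists, bail out.
if (file_exists($pidFile)) {
    $oldPid = (int) file_get_contents($pidFile);
    exec('ps -p ' . $oldPid, $output, $exitCode);
    if ($exitCode === 0) {
        die('Previous run (PID ' . $oldPid . ') is still active.');
    }
}

// Record our own PID so the next run can find us.
file_put_contents($pidFile, getmypid());

// ... the 30-minute work here ...

unlink($pidFile);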
You can implement a simple locking mechanism: create a tmp lock file when the script starts, and check beforehand whether the lock file already exists. If it does, don't run the script; if it doesn't, create the lock file and run the script. At the end of a successful run, delete the lock file so that the script will run properly next time.
if(!locked()) {
lock();
// your code here
unlock();
} else {
echo "script already running";
}
function lock() { file_put_contents("write.lock", 'running'); }
function locked() { return file_exists("write.lock"); }
function unlock() { return unlink("write.lock"); }
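One caveat: if the script crashes between lock() and unlock(), the lock file is left behind and the script never runs again, which is exactly the failure mode the question's UPDATE worries about. A variant using flock() avoids this, because the OS releases the lock automatically when the process dies (a sketch, with a placeholder lock path):
$fp = fopen('/tmp/abc.lock', 'c'); // 'c' creates the file if missing without truncating
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    die('script already running');
}

// ... your long-running work here ...

flock($fp, LOCK_UN); // also released automatically if the process dies
fclose($fp);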
We are writing a PHP script that creates virtual machines via a RESTful API call. That part is quite easy. Once the request to create the VM is sent to the server, the API call returns with, essentially, "Machine queued to be created...". When we create a virtual machine, we insert a record into a MySQL database, basically with the VM label and DATE-CREATED-STARTED. That record also has a field DATE-CREATED-FINISHED, which is NULL.
LABEL DATE-CREATED-STARTED DATE-CREATED-FINISHED
test-vm-1 2011-05-14 12:00:00 NULL
So here is our problem: how do we spin/spawn off a PHP worker, on the initial request, that checks the status of the queued virtual machine every 10 seconds and, when the virtual machine is up and running, updates DATE-CREATED-FINISHED? Keep in mind the initial API request immediately returns "Machine queued to be created." and then exits. The PHP worker needs to do the 10-second check in the background.
Can your server not fire a request once the VM has been created?
E.g.:
1. The PHP script asks the server, via your API, to create a new VM.
2. The PHP script records the start time and exits. The VM waits in the queue on the server to be created.
3. The server finally creates the VM and calls an update-tables PHP script.
That way you have no polling, no cron scripts, no background threads, etc. But only if your system can work this way. Otherwise I'd look at setting up a cron script as mentioned by @dqhendricks or, if possible, a background script as @Savas Alp mentioned.
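If the VM host can make such a callback, the update-tables script might look roughly like this (a sketch; the table name, connection details, and POST parameter are my own assumptions, only the column names come from the question):
<?php
// vm_created_callback.php: called by the VM host once the machine is up.
$pdo = new PDO('mysql:host=localhost;dbname=vms', 'user', 'pass');
$stmt = $pdo->prepare(
    'UPDATE vm_table SET `DATE-CREATED-FINISHED` = NOW() WHERE LABEL = ?'
);
$stmt->execute([$_POST['label']]); // e.g. label=test-vm-1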
If your hosting allows it, create a PHP CLI program and execute it in the background, like the following:
<?php
while (true)
{
sleep(10);
// Do the checks etc.
}
?>
And run it with the following command:
php background.php &   # assuming you're using Linux
If your hosting does not allow running background jobs, you must utilize every opportunity to do this check, such as doing it at the beginning of every PHP page request. To help facilitate this, after creating a virtual machine, the resulting page could refresh itself every 10 seconds!
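Filling in the loop body, the check might look something like this (a sketch; CheckVMStatus() and the DB details are hypothetical, only the column names come from the question):
<?php
// background.php: poll every 10 seconds, then stamp the finish time.
$pdo = new PDO('mysql:host=localhost;dbname=vms', 'user', 'pass');
$label = $argv[1]; // VM label passed on the command line

while (true) {
    sleep(10);
    if (CheckVMStatus($label)) { // hypothetical wrapper around the status API
        $stmt = $pdo->prepare(
            'UPDATE vm_table SET `DATE-CREATED-FINISHED` = NOW() WHERE LABEL = ?'
        );
        $stmt->execute([$label]);
        break;
    }
}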
As a variant, you can use a Tasks module; here is a sample task:
class VMCheck extends \Tasks\Task
{
    protected $vm_name;

    public function add($vm_name)
    {
        $this->getStorage()->store(__CLASS__, $vm_name, true);
    }

    public function execute()
    {
        do
        {
            $check = CheckAPI_call($this->vm_name); // your checking code here
            sleep(10);
        }
        while (empty($check));
    }

    public function restore($data)
    {
        $this->vm_name = $data;
    }
}