In PHP, how can I get the last execution time of a specific file?
I am working on a plugin that runs as a cron job, and I want its last execution time.
How can I get it?
The actual problem: suppose the job fails to run for some reason; how can I find out when it last executed?
You can log it somewhere; otherwise, that information isn't available.
Use microtime() if you need an accurate time measurement.
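A minimal sketch of the logging approach: the cron script writes a timestamp to a log file each time it runs, and anything else can read that file back later. The log path here is an assumption; adjust it to your setup.

```php
<?php
// Hypothetical log path -- adjust for your environment.
$logFile = sys_get_temp_dir() . '/cron_last_run.log';

// At the end of the cron script, record the current time.
file_put_contents($logFile, time());

// Later, anywhere else, read back the last execution time.
$lastRun = file_exists($logFile) ? (int) file_get_contents($logFile) : null;

if ($lastRun !== null) {
    echo "Last run: " . date('Y-m-d H:i:s', $lastRun) . "\n";
} else {
    echo "The job has never run.\n";
}
```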
The best option would be to use a database (a flat-file store such as a simple text file would be fine) and store the time so you can read it later.
But if that's not an option, try using fileatime(). It should work fine as long as your cron job is the only thing accessing the file in question:
http://www.php.net/manual/en/function.fileatime.php
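For illustration, a sketch of the fileatime() approach, demonstrated on the running script itself (point $script at your cron script instead). Note that many filesystems are mounted with noatime or relatime, which makes access times unreliable, so the logging approach is generally safer.

```php
<?php
// Demonstrated on this file; in practice, point this at the cron script.
$script = __FILE__;

$atime = fileatime($script);   // last access time as a Unix timestamp
echo "Last accessed: " . date('Y-m-d H:i:s', $atime) . "\n";
```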
I have been researching how to approach this. What I am trying to prevent is overlapping executions of a cron job. I would like to run my script every minute, because the application needs constant monitoring. The problem is that if one run takes a long time to finish, the next scheduled run will start before it is done.
I have searched, and some posts mention using a PID file, but I could not work out how to do it. I cannot use lock files because they can be unreliable; I have tried that already.
Is there any other approach on this?
Thank you.
Get each job to write to a database on completion. Then put an if statement at the start of each script to ensure that the previous run has completed (by checking your database).
Alternatively...
You could have your first script run your second script at the end?
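A minimal sketch of the database-flag idea. SQLite is used purely for illustration here, and the table and function names are invented; in production you would point this at your application's real database.

```php
<?php
// In-memory SQLite stands in for your real database in this sketch.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE IF NOT EXISTS job_status (name TEXT PRIMARY KEY, running INTEGER)');
$db->exec("INSERT OR IGNORE INTO job_status VALUES ('my_cron', 0)");

function tryStartJob(PDO $db): bool {
    // Atomically claim the job: the UPDATE only succeeds when no run is in progress.
    $stmt = $db->prepare("UPDATE job_status SET running = 1 WHERE name = 'my_cron' AND running = 0");
    $stmt->execute();
    return $stmt->rowCount() === 1;
}

function finishJob(PDO $db): void {
    $db->exec("UPDATE job_status SET running = 0 WHERE name = 'my_cron'");
}

if (tryStartJob($db)) {
    // ... do the actual work here ...
    finishJob($db);
} else {
    // A previous run is still going; exit quietly.
}
```

One caveat worth noting: if the script crashes between claiming and finishing, the flag stays set, so a real implementation would also store a start timestamp and clear stale claims.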
I have a simple question. I'd like to write a PHP function that checks the database rows and, if the number of rows was affected by the last query, executes an internal PHP file. The catch is that I want it to check the row count and the timestamp at the same time, so that it executes the PHP file only if both the timestamp and the row count have changed.
The file in question is a SQL database backup, so I need it to execute only if there was a change in the database and the timestamp is older than 43200 seconds (half a day). This would back up the database if there was activity on the site (one activity would back up once, two activities would back up twice, and anything more than that would be ignored); if not, it would do nothing. I hope I'm explaining it right.
A cron job is out of the question, since this depends on database changes, not just the time.
The code I'm using looks like this (without checking the database rows) and is only reached when a customer accesses the shopping-cart checkout or account page:
<?php
$dbbackuplog = '/path/to/backuptime.log';
if (file_exists($dbbackuplog)) {
    $lastRun = file_get_contents($dbbackuplog);
    if (time() - $lastRun >= 43200) {
        // It's been more than 12 hours, so run the backup
        $cron = file_get_contents('/file.php');
        // Update backuptime.log with the current time
        file_put_contents($dbbackuplog, time());
    }
}
?>
I appreciate any input or suggestions.
First of all, you cannot run anything with file_get_contents. That function simply reads the raw contents of the file you ask for, and under no circumstances will it run any code. If you want to run the code, you want include or require instead.
Second, your idea of not just triggering but fully executing backups while a customer is performing an action is, well, I'm not going to pull any punches, terrible. There's a reason people use cron for backups (more than one, actually), and you should follow that example. That's not to say dynamic factors can't affect the behavior of the cron script, but the act of taking a backup should always happen behind the scenes.
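For illustration, here is the asker's timing logic with the file_get_contents bug fixed: include actually executes the backup script. The function name and parameters are invented for this sketch; the original hard-coded paths would be passed in as arguments.

```php
<?php
/**
 * Run the backup script if more than $interval seconds have passed
 * since the timestamp recorded in $logFile.  Returns true if it ran.
 */
function maybeRunBackup(string $logFile, string $backupScript, int $interval = 43200): bool
{
    $lastRun = file_exists($logFile) ? (int) file_get_contents($logFile) : 0;
    if (time() - $lastRun < $interval) {
        return false;                      // too soon, nothing to do
    }
    include $backupScript;                 // include executes the code; file_get_contents only reads it
    file_put_contents($logFile, time());   // remember when we ran
    return true;
}
```

In the asker's setup this would be called as `maybeRunBackup('/path/to/backuptime.log', '/file.php');`, though per the advice above, the backup work itself belongs in a background job rather than a customer request.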
I am developing a PHP/MySQL application that entails processing CSV files, but the script always stops before the entire process is completed.
How can I make the system handle this reliably?
Note: I won't be doing the web hosting for this system, so I can't extend the PHP maximum execution time.
Thanks
A couple of ideas.
Break the file down into a row set that you know you can process in one shot. Launch multiple processes.
Break down the work so that it can be handled in several passes.
Check out LOAD DATA INFILE. It's a pure MySQL solution.
You could begin executing this SQL from a PHP script, and it can continue running after the script stops or times out. Or, better yet, schedule a cron job.
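For illustration, a typical LOAD DATA INFILE statement; the file path, table, and column names are assumptions and depend entirely on your schema:

```sql
LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(col_a, col_b, col_c);
```

Because MySQL does the parsing and inserting itself, the PHP maximum execution time matters far less than it does when looping over rows in PHP.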
You don't need control over config files to extend the maximum execution time. You can still call set_time_limit(0) in your code to make it run to the end. The only catch is if you are calling this from the browser: the browser may time out and leave the page orphaned. I have a site that generates CSV files that take a long time; I put the process in the background by ending the session with the browser (flushing the output buffer) and send an email notification when the job is finished.
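A hedged sketch of that pattern: lift the time limit, answer the browser immediately, then keep working. Whether set_time_limit actually works is host-dependent (some shared hosts disable it), and longCsvExport is a hypothetical placeholder for the real work.

```php
<?php
// Remove PHP's execution-time cap for this request.
// Some shared hosts disable this call, so treat it as an assumption.
set_time_limit(0);
ignore_user_abort(true);       // keep running even if the browser disconnects

// Send the response to the browser now ...
ob_start();
echo "Your export has started; you will receive an email when it is done.\n";
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush();

// ... then continue the long-running work in the background.
// longCsvExport();  // hypothetical worker function
</code-omitted-for-brevity-marker-not-needed>
```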
Suggestion one: after you insert one of the rows, remove it from the CSV file.
Suggestion two: record the last inserted CSV row in a file or in MySQL, and on the next run skip all entries up to and including that row.
Also, you can add a limit of 30 seconds per execution, or 100/1000/X rows per execution (whichever works best before the script terminates). That will work for either suggestion.
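A minimal sketch of suggestion two, resuming from a stored row offset. The function and file names are invented for illustration; the database insert is left as a comment.

```php
<?php
/**
 * Process up to $batchSize rows of $csvFile, resuming from the row number
 * stored in $offsetFile.  Returns the number of rows handled this run.
 */
function processCsvBatch(string $csvFile, string $offsetFile, int $batchSize): int
{
    $offset = file_exists($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

    $handle = fopen($csvFile, 'r');
    // Skip the rows that earlier runs already handled.
    for ($i = 0; $i < $offset && fgetcsv($handle) !== false; $i++) {
    }

    $done = 0;
    while ($done < $batchSize && ($row = fgetcsv($handle)) !== false) {
        // ... insert $row into the database here ...
        $done++;
    }
    fclose($handle);

    // Remember where to pick up next time.
    file_put_contents($offsetFile, $offset + $done);
    return $done;
}
```

Each cron invocation then calls this once, and the job makes steady progress even though no single run finishes the whole file.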
I am looking to make a script that runs and uses the time stamp of the last time it ran as a parameter to retrieve results that have been updated since that time. We were thinking of creating a database table and having it update that and then retrieve the date from there, but I was looking for any other approach that someone might suggest.
Using a database table to store the last run time is probably the easiest approach, especially if you already have that infrastructure in place. A nice thing about this method is that you can write the run time right before the script terminates, in case it runs for a long time and you do not want it to start up again too soon.
Alternatively, you could either write a timestamp to a file (which has its own set of issues) or attempt to fish it out of a log file (for example, the web access log if the script is run that way), but both of those seem harder.
This might work: http://us.php.net/manual/en/function.fileatime.php (pass it $_SERVER['SCRIPT_FILENAME'])
Your best bet is to store the last run time yourself. You could do this in a database if you need historical information, or just keep it in a file.
Depending on how you run the script, you may be able to see it in your logs, but storing it yourself will be easier.
I'm pretty sure I've seen this done in a PHP script once, although I can't find the script. It was a script that would automatically check for updates to itself and replace itself if there was an update.
I don't actually need all that; I just want my PHP script to run automatically every 30 minutes to an hour, but I'd like to do it without cron jobs, if it's possible.
Any suggestions? Or is it even possible?
EDIT: After reading through a possible duplicate that RC linked to, I'd like to clarify.
I'd like to do this completely without using resources outside of the PHP script itself, i.e. no outside cron jobs that send a GET request. I'd also like to do it without keeping the script running constantly and sleeping for 30 minutes.
If you get enough hits, this will work:
Store a last-update time somewhere (file, DB, etc.). In a page that gets enough hits, add code that checks whether the last update was more than xx minutes ago. If it was, run the script.
You may want to use PHP's sleep function with a specified interval to run your code repeatedly, or you could try one of the online cron-job services.
Without keeping the script running constantly, you'll either have to use something hackish that's not guaranteed to actually run (using regular user pages accesses to run a side routine to see if X amount of time has passed since last run of the script and if so, run it again), or use an external service like cron. There's no way for a regular PHP script to just magically invoke itself.
You can either use AJAX calls from your real visitors to run scheduled jobs in the background (google for "poor man's cron", there are a number of implementations out there) or use some external cron-like service (for example a cronjob on some other machine). In theory you could just run a PHP script with no timeout and make it loop forever and fire off requests at the appropriate time, but the only thing that would achieve is reinventing cron in a very ineffective and fragile way (if the script dies for some reason, it will never start again on its own, while cron would just call it again).
Either way, you will need to set proper execution time so the script does not exceed it.
I found this:
<?php
// name of your file
$myFile = "time.db";
// guard against the first run, when the file does not exist yet
$lastRun = file_exists($myFile) ? (int) trim(file_get_contents($myFile)) : 0;
if (time() - 3600 > $lastRun) {
    // an hour has elapsed
    // do your thing.

    // write the new timestamp to file
    $fh = fopen($myFile, 'w') or die("can't open file");
    fwrite($fh, time());
    fclose($fh);
} else {
    // it hasn't been an hour yet, so do nothing
}
?>
If the host includes a MySQL 5.1+ database, then perhaps scheduled events are available to call the script? I like these mission-impossible-type questions, but I need more information on the playground and the rules to give the best answer.