I have a cron job on my hosting server that is supposed to execute a PHP script every 30 minutes. This PHP script scrapes a CSV file and updates that data into a database (I'm using MySQL for that). When executed manually the script works: it takes about 45 seconds to finish the process.
Now when the cron job runs, I'm getting this message:
Could not open input file.
At first I thought this might be because the script takes over 30 seconds to execute, so I decided to set the max execution time at the beginning of the PHP script:
ini_set('max_execution_time', 300);
But again the same message came up.
This is my cron job:
/usr/local/bin/php /home/emmaein/domain.com/folder/script.php?token=d8cn3j
P.S.: that token GET variable is a sort of password so the script can't be executed by just anybody. Basically my PHP script looks like this:
if ($_GET['token'] === 'd8cn3j') {
    // open csv
    // get data
    // update db
} else {
    exit('I see you >:)');
}
When PHP is not running inside a web server, you can't access $_GET variables (there's no such thing). Instead you should use command line arguments:
<?php
if ($argc > 1 && $argv[1] === 'd8cn3j') {
    // Do stuff
}
And then your crontab becomes:
/usr/local/bin/php /home/emmaein/domain.com/folder/script.php d8cn3j
The concept of $argc (the number of arguments) and $argv (an array of the arguments) is fairly standard among CLI programs, and is documented on PHP's website.
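If you also need the script to remain callable from the web, a minimal sketch (my own illustration, not part of the original answer) is to branch on PHP_SAPI so the token comes from $argv on the command line and from $_GET in the browser:
<?php
// Read the token from the CLI argument under cron,
// or from the query string under the web server.
if (PHP_SAPI === 'cli') {
    $token = isset($argv[1]) ? $argv[1] : null;
} else {
    $token = isset($_GET['token']) ? $_GET['token'] : null;
}

if ($token !== 'd8cn3j') {
    exit('I see you >:)');
}
// open csv, get data, update db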
I made a PHP CLI script which runs in a loop. I run it from the server's terminal (Windows). This server also functions as a PHP web server (XAMPP).
The PHP CLI script deals with hardware I/O (responding and giving logic to a microcontroller board through a serial port) and is always running.
What I'm trying to accomplish is to make a web-based app (PHP CGI) to control that CLI script, like sending some command to make it do something.
What I've tried
I have tried using a kind of temporary JSON file, whose contents are generated by the CGI script.
The CLI script then reads that file on every loop iteration. If there is a change in the JSON (a timestamp), the script uses the data inside the JSON to act accordingly, and then stores that timestamp to compare against the JSON on the next loop.
But this causes a huge load on the server and the CLI script becomes much slower, which affects the responsiveness of the microcontroller.
The PHP CLI loop is something like this:
<?php
$lastTimestamp = 0;
while (true) {
    // read json file
    $json = json_decode(file_get_contents("temp.json"));
    if ($lastTimestamp < $json->stamp) {
        // do something with $json->data
        // ...

        // update $lastTimestamp
        $lastTimestamp = $json->stamp;
    }
    // rest of microcontroller logic here
    // ...
}
And the temp.json file is something like:
{
    "stamp": 1557653475,
    "data": {
        "nd_a": 1,
        "nd_b": 1,
        "nd_c": 0
    }
}
So the question is: how do I interact with the already-running PHP CLI script from the CGI script without using the methods above? I'm hoping for a better way that doesn't affect the server's load and performance.
Edit: I also tried using a database in place of the JSON file, but the performance was still not good.
That approach is inefficient. The simplest fix is to make a cron job that executes the CLI script.
If you're using a Linux system, here is how to set up a cron job (on Ubuntu):
https://help.ubuntu.com/community/CronHowto
Then in your CLI script, remove the loop; the job will read temp.json (which now happens each time the job runs) and compare the two timestamps.
To keep $lastTimestamp between runs, I would suggest creating a text file right next to your CLI script and storing the value of $lastTimestamp in it.
Then, only if the timestamp retrieved from the JSON differs from the timestamp in the text file, overwrite the text file with the new timestamp.
You can use file_put_contents or fopen/fwrite to write into the file.
So your script would be like the following
<?php
$lastTimeStampFilePath = "lastTimeStampFile.txt";

// read the stored timestamp (0 if the file doesn't exist yet)
$lastTimestamp = file_exists($lastTimeStampFilePath)
    ? (int) file_get_contents($lastTimeStampFilePath)
    : 0;

// read json file
$json = json_decode(file_get_contents("temp.json"));

if ($lastTimestamp < $json->stamp) {
    // do something with $json->data
    // ...

    // update the stored timestamp
    file_put_contents($lastTimeStampFilePath, $json->stamp);
}

// rest of microcontroller logic here
Maybe configure the job to run every 2 mins or 5 mins, depending on your requirements.
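For illustration, a crontab entry along these lines (the script path is a placeholder) would run the job every 2 minutes:
*/2 * * * * /usr/local/bin/php /path/to/script.php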
What I have to do is a bit complicated.
I have a Python script and I want to run it from PHP, but in the background. I read somewhere that to run a Python script in the background I can use the PHP exec() command without waiting for a return value, and thus far, no problem.
First question:
I have to stop this looping script with another PHP command. How do I do this?
Second question:
I have to implement a server-side timer that stops the script when the time is up.
I found this code:
<?php
$timer = 60 * 5; // seconds
$timestamp_file = 'end_timestamp.txt';

if (!file_exists($timestamp_file)) {
    file_put_contents($timestamp_file, time() + $timer);
}

$end_timestamp = file_get_contents($timestamp_file);
$current_timestamp = time();
$difference = $end_timestamp - $current_timestamp;

if ($difference <= 0) {
    echo 'time is up, BOOOOOOM';
    // execute your function here
    // reset timer by writing new timestamp into file
    file_put_contents($timestamp_file, time() + $timer);
} else {
    echo $difference . 's left...';
}
?>
From this answer.
Also, is there a way to implement it with a MySQL database? (The integration with the script stop is not a problem.)
That's actually pretty simple. You can use a memory object caching system. I would recommend memcached. Memory objects from memcached can be accessed literally from anywhere in your system. The only requirement is that a connection to the memcached backend server is supported. (PHP does, Python does, etc.)
Answer to your first question:
Create a variable called stopme with the value 0 in the memcached database.
Connect from your Python script to the memcached database and keep reading the variable stopme inside the loop. Let's say the Python script keeps running as long as stopme has the value 0.
In order to stop your script from PHP, make a connection from your PHP script to the memcached server and set stopme to 1.
The python script receives the updated value instantly and exits.
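On the PHP side, that could look roughly like this (a sketch assuming the memcached server listens on localhost:11211 and the PHP Memcached extension is installed; the key name stopme follows the steps above):
<?php
// Connect to memcached and flip the shared flag;
// the Python loop sees the new value and exits.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);
$mc->set('stopme', 1);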
Answer to your second question:
It could be done as explained in my previous answer through reading shared variables, but additionally I would like to mention that you could also use a cron job to kill the running script.
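For example (illustrative only; script.py stands in for your actual file), a crontab entry like this would kill any running instance every 5 minutes:
*/5 * * * * pkill -f script.py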
I'm trying to create a browser-started, self-calling/repeating PHP script on Windows with PHP (currently 5.3.24, but soon to be the latest). It will act as a daemon to monitor changes in a database (every few seconds, so cron/schedule is out) and then call other PHP scripts to perform work when changes are found. For the purposes of this question please ignore the fact that I'd be better off doing this in C# or some other language :)
To keep things simple I started out by trying to use popen to run a second PHP script in the background...
// BatchMonitor.php
SaveToMonitorTable(1); // save 1st test entry to see if the script reached this point
$Command = '"" "C:\Program Files (x86)\PHP\v5.3\php.exe" C:\inetpub\wwwroot\Test.php --Instance=' . $Data->Instance;
pclose(popen("start /B $Command", "r"));
SaveToMonitorTable(2); // save 2nd test entry to see if the script reached this point
exit();
// Test.php
SaveToTestTable(1);
Sleep(10);
SaveToTestTable(2);
exit();
If I run BatchMonitor.php in the browser it works fine. As expected it will save 1 to the monitor table, call Test.php which saves 1 to the test table, the original BatchMonitor.php will continue without waiting for a response and save 2 to the monitor table before exiting, then 10 seconds later the test page saves 2 to the test table before exiting. The second script starts fine, the first script does not wait for a reply and all parameters are correctly passed between scripts. With everything working as intended I then changed the system to work as a repeating loop by calling itself (with delay) instead of another script...
// BatchMonitor.php
SaveToMonitorTable(1); // save 1st test entry to see if the script reached this point
$Command = '"" "C:\Program Files (x86)\PHP\v5.3\php.exe" C:\inetpub\wwwroot\BatchMonitor.php --Instance=' . $Data->Instance;
pclose(popen("start /B $Command", "r"));
SaveToMonitorTable(2); // save 2nd test entry to see if the script reached this point
exit();
If I run BatchMonitor.php in the browser it runs once and that is it. It will save 1 to the database, wait 10 seconds and then save 2 to the database before exiting. The page returns successfully with no script or PHP errors but it doesn't repeat as it should.
Both BatchMonitor.php and Test.php use line-for-line identical functions to get the parameters, and both files run correctly and identically on the first iteration. If I use exec instead of popen then the page loops correctly with all logic working as expected (with the one obvious flaw of creating a never-ending chain of scripts waiting for response values that will never come).
Am I missing something obvious? Does popen have some sort of secret rule that prevents a page/process from opening duplicates of itself? Are there any alternatives to using popen or exec? I read about WScript.Shell but it might be a while before I can schedule that to get enabled so for now it's not an option and I'm hoping there is something more standard that I can use.
I don't feel like this should be your actual answer, but why do you abandon scheduled tasks/cron jobs just because you want something done every X seconds? Having a script minute.php call 5seconds.php with, of course, 5-second intervals in between would create a repeated task every 5 seconds, right? See the sketch below.
Strangely enough, you are already kind of using the same sort of mechanism from your browser.
My only concern would be to take the processing time into account and create a safe script which ensures no more than one 5seconds.php can run at any given time.
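As a sketch of that idea (minute.php and 5seconds.php are hypothetical names, and the popen pattern mirrors the question's code):
<?php
// minute.php - started once per minute by the Windows Task Scheduler.
// Launches 5seconds.php in the background roughly every 5 seconds.
for ($i = 0; $i < 12; $i++) {
    $Command = '"" "C:\Program Files (x86)\PHP\v5.3\php.exe" C:\inetpub\wwwroot\5seconds.php';
    pclose(popen("start /B $Command", "r"));
    sleep(5);
}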
I've been completely unsuccessful finding an answer to this question. Hopefully someone here can help.
I have a PHP script (a WordPress template, to be specific) that automatically imports and processes images when a user hits it. The problem is that the image processing takes up a lot of memory, particularly if multiple users are accessing the template at the same time and initiating the image processing. My server crashed multiple times because of this.
My solution to this was to not execute the image-processing function if it was already running. Before the function started running, I would check a database entry named image_import_running to see if it was set to false. If it was, the function then ran. The very first thing the function did was set image_import_running to true. Then, after it was all finished, I set it back to false.
It worked great -- in theory. The site hasn't crashed since, I can tell you that. But there are two major problems with it:
1. If the user closes the page while it's loading, the script never finishes processing the images and therefore never sets image_import_running back to false. The template will never process images again until the flag is manually set to false.
2. If the script times out while it's processing images -- and that's a strong possibility if there are many images in the queue -- you have essentially the same problem as No. 1: the script never gets to the point where it sets image_import_running back to false.
To handle No. 1 (the first one of the two problems I realized), I added ignore_user_abort(true) to the script. Did it work? I don't know, because No. 2 is still an issue. That's where I'm stumped.
If I could ask the server whether the script was running or not, I could do something like this:
if ($import_running && $script_not_running) {
    $import_running = false;
}
But how do I set that $script_not_running variable? Beats me.
I've shared this entire story with you just in case you have some other brilliant solution.
Try using ignore_user_abort(true); the script will continue to run even if the person leaves and closes the browser.
You might also want to store a number instead of true/false in the DB record and set a maximum number of processes that can run together.
As others have suggested, it would be best to move the image processing out of the request itself.
As an interim "fix", store a timestamp alongside image_import_running when a processing job begins (e.g., image_import_commenced). This is a very crude mechanism, but if you know the maximum time that a job can run before timing out, the script can check whether that period of time has elapsed.
e.g., if image_import_running is still true but the current time is more than 10 minutes since image_import_commenced, run the processing anyway.
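Sketched with the WordPress option functions (the option names mirror the question; image_import_commenced and the 10-minute cutoff are my assumptions):
<?php
$running   = get_option('image_import_running', false);
$commenced = (int) get_option('image_import_commenced', 0);

// Treat the lock as stale if it was set more than 10 minutes ago.
if (!$running || (time() - $commenced) > 10 * 60) {
    update_option('image_import_running', true);
    update_option('image_import_commenced', time());

    // ... import and process the images ...

    update_option('image_import_running', false);
}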
What about setting a transient with an expiry time that would throttle the operation?
if ( ! get_transient( 'import_running' ) ) {
    set_transient( 'import_running', true, 30 ); // set a 30-second transient on the import.
    run_the_import_function();
}
I would rather store the job in the database, flag it as pending, and set a cron job to execute the processing one job at a time.
For me, I just use this simple idea with a text document, for example a run.txt file.
At the top of the script use:
if (file_get_contents('run.txt') != 'run') { // here the script will work
    $file = fopen('run.txt', 'w+');
    fwrite($file, 'run');
    fclose($file);
} else {
    exit(); // if it finds 'run' in run.txt the script will stop
}
And add this at the end of your script file:
$file = fopen('run.txt', 'w+');
fwrite($file, ''); // will delete the 'run' word for the next try ;)
fclose($file);
That will check whether the script is already working by reading the contents of run.txt:
if the word 'run' exists in run.txt, the script will not run.
Running a cron job would definitely be a better solution, and the idea of storing the URLs in a table is a good one.
To answer the original question, you can run a ps auxwww command with exec (see this page: How to get list of running php scripts using PHP exec()?) and move your function into a separate PHP file.
exec("ps auxwww|grep myfunction.php|grep -v grep", $output);
Just add the following at the top of your script.
<?php
// Ensures a single instance of the script runs at a time.
$fileName = basename(__FILE__);
$output = shell_exec("ps -ef | grep -v grep | grep $fileName | wc -l");
//echo $output;
if ($output > 2) {
    echo "Already running - $fileName\n";
    exit;
}

// Your php script code.
?>
I am running a script which constantly works on my database. However, it is necessary to restart the script once an hour, and obviously I can't do that manually. I don't want to use a daemon; that's too complex for me right now. The easier solution is to use a cron job, but its biggest drawback is that it won't stop the previous script, which runs in an infinite while(true) loop.
However, is it possible if I make a function in the script, let's say:
function exitScript()
{
    exit;
}
And then on the cron job, if I do something like:
php /home/Path/public_html/webservice/myScript.php exitScript and then
php /home/Path/public_html/webservice/myScript.php
What will the format be, and how can I run both one by one using a cron job, or make another PHP script that does so?
I need advice.
Here is a little trick, easy to make, which you can use.
1st, set your cron job to run every hour:
0 */1 ..... cronjob.php
2nd, at the start of your script, define a constant with the time:
define('SCRIPT_START_TIME', time());
3rd, in your exit script, set up a condition check that exits if 59 minutes have passed from this constant to the current time. :)
function exitScript()
{
    if ((time() - SCRIPT_START_TIME) > 59 * 60) {
        exit();
    }
}
4th, call exitScript() at the start of each while loop iteration. Put together, it would look something like the sketch below.
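A minimal sketch of the whole script (the actual database work is only hinted at by a comment):
<?php
define('SCRIPT_START_TIME', time());

function exitScript()
{
    // exit once 59 minutes have passed; the next hourly cron run takes over
    if ((time() - SCRIPT_START_TIME) > 59 * 60) {
        exit();
    }
}

while (true) {
    exitScript();
    // ... work over the database here ...
}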