Does uploading files affect or break associated infinitely running scripts? - php

Just wanted to know how PHP behaves in the following case.
Suppose I have a cron script which runs every minute,
or an infinite-loop script which processes a queue table.
Now suppose I update a related class file which is used by the infinite-loop script.
Does that generate an error or stop the infinite-loop script?
And what good practices should be followed in such a situation?

Nothing will happen to already running scripts when you change any source code.
The source code is read from the file once at the start of the script, parsed into bytecode, and kept in memory until the script ends. Your program is not actually "running from source" the whole time, so it will not notice any changes to the source files until it needs to load them again.
An infinite loop program will only reflect changes when you stop and restart it.
A cron job will pick up any change the next time it runs.
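A common good practice for the infinite-loop case (a sketch of one approach, not something stated in the answer above) is to have the worker watch its own source file's modification time and exit cleanly when the file changes on disk, letting cron or a supervisor start a fresh copy with the new code:

```php
<?php
// Sketch: detect whether this script's source changed since startup.
// In a real queue worker you would call this check at the top of each
// loop iteration and exit cleanly when it returns true, letting cron
// or a supervisor (systemd, supervisord) restart the updated version.
function sourceChangedSince(int $startMtime): bool
{
    clearstatcache(true, __FILE__);   // bypass PHP's stat cache
    return filemtime(__FILE__) !== $startMtime;
}

$startMtime = filemtime(__FILE__);

// ... inside the queue loop:
if (sourceChangedSince($startMtime)) {
    exit(0);  // let the supervisor launch the new version
}
echo "source unchanged\n";
```

This way a deploy never yanks the code out from under a half-processed queue item; the worker always finishes its current iteration on the old bytecode and restarts on the new.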

Related

Limiting PHP execution time under CLI

I run a large number of PHP scripts via CLI, executed by cron in the following format:
/usr/bin/php /var/www/html/a15432/run.php
/usr/bin/php /var/www/html/a29821/run.php
In some instances there is a problem with one of the files, whereby execution enters a loop or simply does not terminate due to an error in that particular script.
What I would like to achieve is that if either of the above occurs, PHP CLI simply stops executing the script.
I have searched this and all indications are that I need to use this command, which I have entered at the beginning of each file, but it does not seem to be making any difference.
ini_set('max_execution_time', 30);
Is there any other way I can better protect my server, so that one (or more) of these rogue files cannot bring down the server by entering some kind of infinite loop and using all of the server's resources trying to process the file? To make the problem worse, these files can be triggered every 5 minutes, which means that at times there are many instances of the same files trying to process.
Of course I realise that the correct solution is to fix the files so that there is better error handling and that they are never allowed to enter this state, but at the moment I'd like to understand why this doesn't work as expected.
Thank you.
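For what it's worth (this explanation is not part of the original thread): on Linux, `max_execution_time` counts only the CPU time the script itself uses, not wall-clock time, so a script blocked in `sleep()`, a database query, or network I/O never trips the limit; in CLI mode it also defaults to 0 (unlimited). A more robust guard is to enforce a wall-clock limit from outside PHP, for example with GNU coreutils `timeout`:

```shell
#!/bin/sh
# Sketch: `timeout` kills a command after N seconds of wall-clock time
# and exits with status 124 when it does so. In the crontab this becomes:
#   * * * * * timeout 30 /usr/bin/php /var/www/html/a15432/run.php
# Demonstrated here with `sleep` standing in for a hung script:
timeout 1 sleep 10
echo "exit status: $?"   # 124 means the command was killed by timeout
```

Combining this with a lock (e.g. `flock -n`) would also stop overlapping copies from piling up every 5 minutes.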

Deleted process continues creating a file

On my website I had a process which creates a file (executed with a cronjob). However, I removed both the script and the cronjob, but the file is still being created and the process from the cronjob still executes. I know this sounds really unbelievable, but is there any way a process can get stuck in the server's memory and loop? Can you think of any other reason causing this problem?
I had the same issue: the process kept executing endlessly, and changing or removing the code did not help, since the running process keeps its already-loaded copy of the code in memory.
What I did:
Log in to the server over SSH, use the top command to find the PHP process's PID, then use kill to terminate that process.
To prevent this:
I created a sentinel file on the server and, inside every loop iteration and just before starting the function (mine was recursive), checked whether the file exists (or has valid content). If it is not found, the process does not run.
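The kill step described above can be sketched like this, with `sleep` standing in for the stuck PHP process (with a real one you would first find the PID via `top` or `ps aux | grep php`):

```shell
#!/bin/sh
# Start a stand-in for the runaway process and terminate it by PID.
sleep 60 &
pid=$!
kill "$pid"                  # polite SIGTERM first; `kill -9` only if ignored
wait "$pid" 2>/dev/null || true
echo "terminated $pid"
```

SIGTERM gives the process a chance to clean up; SIGKILL (`kill -9`) cannot be caught, so save it for processes that ignore the polite request.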

Creating a File-Based Cron Job in Linux

I am trying to do what I think is a pretty complicated task.
I would like to execute a PHP script, but only if a file named import_products.csv exists in /var/import. The reason why is because that script attempts to import the aforementioned CSV file, and therefore I don't want it to error if it attempts to run the script every minute.
On the other hand, I DO want the script to run every minute (or even 2 or 4 times per minute) if there IS a file. So what I am envisioning is something like this.
I'd create the cron job to run every minute, which does the following:
Check if /var/import/import_products.csv exists
If it does not exist, exit the script now.
Execute product_import.php (this would be stored in the same folder as my cronjob file)
After execution completes, delete /var/import/import_products.csv
I will be using a separate daemon to generate these CSV files, all that is imperative here is that it only runs the PHP script if the file exists.
If there is a way to do this without using cron, that would work too. Basically I just want it to run that PHP script every time a new import_products.csv is written and delete the file afterwards. Cron jobs are the only way I know for sure to do this.
There are a couple of ways I could envision you doing this. The first is, if possible, the easiest, which would be to add in checks to the PHP script itself to see whether or not the file is present. If your product imports will take longer than one minute you'll also have to consider what happens if the file is still there while another import is happening already.
The second way would be to create a bash or shell script of some kind that will check for the existence of the file and then run the command to execute the PHP script if so, then add the shell script to your cron instead of the PHP script itself.
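The shell-script approach from the second suggestion could be sketched like this (the path to product_import.php is a placeholder, and the `flock` lock is an extra guard against the overlapping-import problem the first paragraph warns about):

```shell
#!/bin/sh
# Run the import only when the CSV exists, then delete it on success.
# The product_import.php path is hypothetical; adjust it to yours.
CSV=/var/import/import_products.csv

[ -f "$CSV" ] || exit 0                     # no file: nothing to do

# flock -n refuses to start a second copy while one is still running,
# guarding against imports that outlast the one-minute cron interval.
flock -n /tmp/import_products.lock \
    /usr/bin/php /path/to/product_import.php \
    && rm -f "$CSV"
```

Deleting the CSV only on success (`&&`) means a failed import leaves the file in place, so the next cron run retries it.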
You can also include the file check within the PHP script itself through exception handling, at the cost of a little overhead on the PHP side.

best way to repeat php script after fixed interval

I want my PHP script to be executed every 10 seconds until a stop button is pressed.
Is using a while loop with sleep() the best way, or is there a better way of doing it?
And I want to know: if I run that while loop, will it prevent the other scripts on the page from running?
I mean, during the sleep time, while the PHP script is still running, will the browser wait for this script to end, or will it run other scripts simultaneously?
As far as I understand, cron won't be helpful in this case because I have to run the script only between the times when the start and stop buttons are pressed. Please correct me if I am wrong.
Cron would do the job although, with an interval as small as 10 seconds, you'd probably be better off writing a daemon instead (and running from the command line or init.d, not through a web server).
You just need a way to switch it on and off. That could be something as simple as testing to see if a file exists (and then adding or deleting it as desired).
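The file-existence switch suggested above might look like this as a daemon skeleton (the switch path and job function are hypothetical; the start button would create the file, e.g. with touch(), and the stop button would delete it):

```php
<?php
// Daemon sketch for the on/off file switch described above.
// Run this from the CLI or init.d, not through the web server.
const SWITCH_FILE = '/tmp/runner.enabled';  // hypothetical path

function shouldRun(): bool
{
    clearstatcache(true, SWITCH_FILE);      // re-check the disk each time
    return file_exists(SWITCH_FILE);
}

// Main loop:
// while (true) {
//     if (shouldRun()) {
//         do_the_work();   // your 10-second job (name hypothetical)
//     }
//     sleep(10);
// }
echo shouldRun() ? "on\n" : "off\n";
```

Because the daemon is a separate CLI process, the browser never waits on it; the start/stop buttons only touch or delete the switch file and return immediately.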
And I want to know that if I run that while loop, will it prevent the other scripts on the page from running?
You need to make the script stand-alone for this to make any sense at all.

PHP script modified during parsing/execution

I have a PHP script running on a cron job that can take a long time to complete (e.g. 10 minutes). What would happen if this script were modified while it is being parsed? Or during execution? The reason I ask is that I have a number of scripts across servers that I want to place under version control, and I'm concerned about what might happen if a script happens to get updated while it is processing. If it is an issue, I could place some kind of lock on the file while it is running.
Nothing should happen to the running script, because by the time it starts running, PHP would have already parsed it and the opcodes are already in memory, so there's no more disk access.
