I was running a script in my browser that inserts around 100,000 records into a database. At some point I deleted the file containing the script, but to my surprise it kept inserting records into the database even though the file no longer exists. Why is that?
The process is executing from memory, not from disk.
If you wish to stop a running script, you will need to restart your web server or kill the PHP process, depending on whether it was started from the command line.
When a request is made to the web server:
The PHP file is loaded into memory, compiled, and executed.
Since the file is now in memory, any changes to the file on disk have no effect on the running copy.
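The same behaviour can be reproduced outside PHP. Here is a minimal shell sketch (file names are illustrative): a script keeps running after its file is deleted, because the kernel keeps the underlying file alive as long as a process holds it open.

```shell
# create a small script, run it in the background, then delete it mid-run
cat > /tmp/looper.sh <<'EOF'
#!/bin/sh
for i in 1 2 3 4 5; do echo "tick $i"; sleep 1; done
EOF
chmod +x /tmp/looper.sh
/tmp/looper.sh &
PID=$!
sleep 1
rm /tmp/looper.sh      # the file is gone from disk...
wait "$PID"            # ...but the process still prints all five ticks
```

The deleted script finishes normally, exactly as the PHP insert loop did.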
If that raises a follow-up question for you, namely "Why does it get loaded into memory at all? Why can't it be executed directly from the disk?":
Memory here mostly means RAM, which is fast enough at reads and writes to keep up with the processor.
A hard disk is slow storage, so executing directly from it would make your programs very slow.
To keep pace with the processor, an executable has to be loaded from slow storage (usually a hard disk) into faster memory (usually RAM, and sometimes the processor's cache).
And that is the reason your file on disk is not in sync with the copy in memory.
Also, rest assured that the updated file will be picked up on your next request!
But if you want to stop it immediately, you can stop/restart your Apache or IIS server (whichever applies), as this will kill the process at once.
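If you would rather kill only the one runaway process than restart the whole server, you can kill it by PID. A shell sketch of the idea, using sleep as a stand-in for the PHP process:

```shell
sleep 60 &                       # stand-in for the runaway PHP process
PID=$!
kill "$PID"                      # send SIGTERM, as you would to the php PID
wait "$PID" 2>/dev/null          # reap it; ignore the termination status
if kill -0 "$PID" 2>/dev/null; then
  STATE="still running"
else
  STATE="terminated"
fi
echo "$STATE"
```

In practice you would find the PID with something like ps aux | grep php first.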
@Umair: once you run the script, the request goes to the server and runs there until the process completes. It doesn't matter whether the file is still available, because it is the old, in-flight request that is being processed; that is why this happens.
Related
I've got a server with an action triggered by a frequent cron job.
It's a PHP application built on SilverStripe (4.0).
The issue I'm facing is that the PHP processes stay alive and keep their database connections open. After a few days the site stops working entirely, once the SQL server stops accepting new connections.
The system has two tasks on cron jobs:
One takes a massive CSV file and splits it into smaller sub-files, which are then imported into the database. It uses a lock file to prevent it from colliding with a previously started instance; I'm not sure the lock is actually working.
The second task processes all the records which have been updated in large batches.
Either of these could be the source of the overloading but I'm not sure how to narrow it down.
What's the best way to diagnose the source of the issue?
In terms of debugging, this is like any other task: profile the application with something like Xdebug and KCachegrind. To ensure that processes do not run for too long, you can limit max_execution_time in the php.ini used by the CLI.
To let the CLI process run for a long time, but only just long enough, reset the time limit on a per-row basis:
$allowed_seconds_per_row = 3;
foreach ($rows_to_process as $row) {
    set_time_limit($allowed_seconds_per_row); // resets the timer before each row
    $this->process($row);
}
You can also register a shutdown function to record the state as the script ends.
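The effect of a shutdown function can be sketched with a shell trap (row names here are made up): whatever the reason the script exits, the handler sees the last state that was recorded.

```shell
LAST_ROW=none
trap 'echo "stopped after $LAST_ROW"' EXIT   # fires on any exit, clean or not
for row in row1 row2 row3; do
  LAST_ROW=$row        # the shutdown handler will report this value
done
```

In PHP the equivalent is register_shutdown_function() updating or reading a variable that the processing loop keeps current.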
Memory is a likely cause of failure, so focus debugging on memory usage; it can be kept under control by unsetting variable data once it is no longer needed.
When you overwrite a PHP file on a server (say, via SFTP) while it is being processed somewhere (perhaps it's a script that takes several seconds to complete), does it cancel the currently running script, or does that finish even after the overwrite occurs? I suppose I'm asking: does Apache load a PHP script into memory before executing it (and does it hold on to that copy for the duration of execution)?
does apache load a php script into memory before executing it (and does it hold on to that in memory for the duration of execution)?
Yes.
Nothing at all. The script has already been loaded into memory in its compiled state; no matter how long it takes, the web server won't load the new file until the next request.
I have a PHP script running on a cron that can take a long time to process fully (e.g. 10 minutes). What would happen if this script was modified while it's being parsed? or during execution? The reason I ask is I have a number of scripts across servers that I want to place under version control and i'm concerned about what might happen if the script happens to get updated while it is processing. If it is an issue then I could place some kind of lock on the file while it is running.
Nothing should happen to the running script, because by the time it starts running, PHP would have already parsed it and the opcodes are already in memory, so there's no more disk access.
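The underlying mechanism is visible at the OS level: a file that has already been opened stays readable through its open descriptor even after it is deleted or replaced on disk, which is why the already-parsed opcodes are unaffected. A small shell sketch:

```shell
echo "already parsed" > /tmp/opcode_demo.txt
exec 3< /tmp/opcode_demo.txt    # open it, as PHP does when it parses a script
rm /tmp/opcode_demo.txt         # delete it from disk while it is held open
CONTENTS=$(cat <&3)             # the old contents are still readable
echo "$CONTENTS"
exec 3<&-
```

The name /tmp/opcode_demo.txt is only illustrative; the point is that the open descriptor, like PHP's in-memory opcodes, outlives the file on disk.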
I currently have a website that has twice been suspended by my hosting provider for "overusing system resources". In each case, there were 300 - 400 crashed copies of one of my PHP scripts left running on the server.
The scripts themselves pull an image from a web camera at home and copy it to the server. They use file locks to ensure that only one can write at a time. The scripts are called every 3 seconds by each client viewing the page.
Initially I was confused, as I had understood that a PHP script either completes (returning the result), or crashes (returning the internal server error page). I am, however, informed that "defunct scripts" are a very common occurrence.
Would anyone be able to educate me? I have Googled this to death but I cannot see how a script can end up in a crashed state. Would it not time out when it reaches the max execution time?
My hosting provider runs PHP set up as CGI on a Linux platform. I believe I have actually identified the problem with my script: I did not realise that flock() is a blocking function (and I am not using the LOCK_NB mask). I assume that hundreds of copies of my script somehow end up blocked waiting for the lock to become available, and that this leads to the crashes. Does that sound plausible? I am reluctant to re-enable the site for fear of it being suspended again.
Any insights greatly appreciated.
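The blocking behaviour described in the question is easy to reproduce with flock(1) (the lock-file path is illustrative). A non-blocking attempt fails immediately instead of queueing behind the holder, which is what the LOCK_NB mask would give the PHP scripts:

```shell
LOCK=/tmp/cam.lock
exec 8> "$LOCK"                 # first script opens the lock file
exec 9> "$LOCK"                 # a second script opens it independently
flock 8                         # first script holds the exclusive lock
if flock -n 9; then             # -n is the flock(1) analogue of LOCK_NB
  RESULT="got lock"
else
  RESULT="lock busy, skip this frame instead of queueing"
fi
echo "$RESULT"
flock -u 8
```

Without -n, the second flock would simply sit and wait, which is how hundreds of 3-second camera requests can pile up behind one slow writer.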
The approach I would probably recommend is to use tempnam() first and write the contents there (which may take a while). Once that is done, do the file locking, etc.
I'm not sure whether this applies when a PUT request is being made; typically PHP handles file uploads first, before handing execution over to your script.
A script can crash against either of these two limits while working with resources:
max_execution_time
memory_limit
assuming there are no other errors in the script; check for notice-level errors too.
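A rough analogue of hitting such a limit can be seen with timeout(1): a process that wants more time than its cap is simply killed, much as PHP aborts a script that exceeds max_execution_time.

```shell
timeout 1 sleep 5               # the "script" wants 5s but the cap is 1s
STATUS=$?
echo "exit status: $STATUS"     # 124 means the time limit killed it
```

Note this is only an analogy: PHP's limit counts execution time inside the script, while timeout(1) is a plain wall-clock cap on the whole process.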
I'm helping out on a forum that runs on SMF. The site has been lagging recently, and our host tells us the file uploads are clogging the server's memory and that SMF uses server memory in a non-optimized way. There is at most one file upload per hour, so the load shouldn't be that high.
Any thoughts on this? I don't know PHP well enough to argue against them.
If PHP is run as an Apache module, used memory will not always be returned to the system when the PHP script ends. There are a couple of ways to fix this:
Use less memory in your script (obviously)
Run your script as CGI instead of as an Apache module (this way, the memory will be returned on script exit)
Restart Apache when the memory needs to be reclaimed. This is not really a good solution, but we do it at Levonline twice a day...
Upgrade your hosting to your own server, where you don't have to think about the hosting provider's other customers, and can use as much memory as you want.