I run a PHP script from the command line via a CRON job. Once it completes, the PHP process sometimes keeps running in Task Manager, and the command prompt stays active in Windows forever, unless I manually kill the process from Task Manager. I added a log file, and it shows the code works properly right through to the end of the script. However, the process does not exit, and it keeps consuming server memory. I wonder, is this a bug in PHP?
I run the application on Windows Server 2003, and PHP runs in CLI mode.
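For reference, a commonly suggested mitigation for CLI scripts that refuse to exit is to release resources and terminate explicitly at the end of the script (a sketch, not the asker's code; $logHandle is a placeholder for any open resource):

fclose($logHandle); // close any handle that might keep the process alive
exit(0); // force the CLI process to terminate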
I have an Apache server on my Windows machine.
From a PHP script I start a .bat file using the exec command (synchronously!).
From my batch script I run another executable (WinDbg). I run it asynchronously (using cmd's start command), wait some time, and kill it if it is still running.
If I run the .bat script from cmd, it works as planned (asynchronously).
But if the script is started from PHP's exec command, "start" works synchronously. And it is the same code.
I think the problem is that the PHP script runs as a Windows service, and that is why start works synchronously.
I need to start WinDbg asynchronously so that I am able to terminate it.
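For context, a common way to get a truly detached launch on Windows is to wrap start /b in popen on the PHP side instead of relying on start inside the batch file (a sketch; the batch path is an assumption):

// popen/pclose return immediately here, leaving the batch running detached
pclose(popen('start /b "" "C:\scripts\run_windbg.bat"', 'r'));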
I have a CLI php app with a command that starts a background process (a Selenium server) and then exits:
php app.php server:start # this should return immediately
The app also needs to be able to stop the background process in a later invocation:
php app.php server:stop
Since the server process outlives the start/stop script, I cannot manage it by keeping an open file descriptor to it.
I could store the PID of the process on the file system during start and kill that PID in stop. But if the stop command is run after the background process has died on its own, I risk killing a process that I did not start, because the OS might have reused the PID for some other process.
Right now my approach is to store not just the PID of the background process, but also its start time and the command used. That works, but it is hard to make it behave consistently across different platforms (I need Linux, Mac, and Windows).
Can you think of a better way to implement such a behaviour?
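A minimal sketch of the PID-plus-start-time bookkeeping described above (get_process_start_time() is a hypothetical helper, e.g. reading /proc/<pid>/stat on Linux or querying wmic on Windows; posix_kill needs the posix extension and does not exist on Windows):

// on start: remember the PID and when that PID came to life
file_put_contents($pidFile, json_encode([
    'pid'     => $pid,
    'started' => get_process_start_time($pid), // hypothetical helper
]));

// on stop: only kill if the PID still belongs to the process we started;
// a recycled PID will have a different start time
$info = json_decode(file_get_contents($pidFile), true);
if (get_process_start_time($info['pid']) === $info['started']) {
    posix_kill($info['pid'], 15); // SIGTERM; use taskkill /PID on Windows
}
unlink($pidFile);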
Okay, this is going to be a very weird request/question.
There is a very long-running PHP script that needs to be launched by the user (an admin) who is not very technically adept. When running the script through Apache, it times out (a 502 Bad Gateway or 504 Gateway Timeout).
Let's just assume that apache can't be configured to fix the timeout issues.
I want to create a button in the admin panel that sends an AJAX call to a PHP script on the server; that PHP script will act as a proxy of sorts to launch a shell command. The shell command will then execute the long-running PHP script with certain arguments... but I don't want it to wait for the long-running script to finish. The proxy PHP script can exit and return true/false based on whether the shell command actually started (this part is optional).
Essentially, have PHP launch a shell command which launches a PHP script.
How can I pull something like this off?
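For reference, the usual shape of such a proxy endpoint on a POSIX host (a sketch; the script path and argument are placeholders):

// redirect output and background the command so exec() returns immediately
exec('php /path/to/long-running.php --arg=value > /dev/null 2>&1 &', $out, $code);
echo json_encode(['started' => $code === 0]);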
Have you tried shell_exec()? It worked for me...
http://php.net/manual/en/function.shell-exec.php
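For example (a sketch under the same assumptions as above; nohup keeps the child alive after the proxy script exits):

shell_exec('nohup php /path/to/long-running.php > /dev/null 2>&1 &');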
I have a Windows 2012 Server that runs IIS and SQL Server 2012.
I am trying to run a PHP script from the command prompt. The script takes around 1 hour to run. If I run it straight from the command line like this - c:\PHP>php.exe "D:\Web\phpScript.php" - it runs fine. Again, it takes around 1 hour, but it completes fine.
The thing is, I need to run it from another PHP page. So this code - exec('start c:\php\php.exe "D:\Web\phpScript.php"'); - in PHP runs the script. When I run it from PHP like that, it runs fine for around 30 minutes or so, but then for some reason Windows ends up killing the process.
I have been watching Task Manager and cannot see any difference between the way the process runs when I start it straight from the command prompt and when I use PHP to run the command. Both show up as background processes and look exactly the same in Task Manager, but for some reason Windows kills the one launched from PHP and not the one launched straight from the command prompt.
I have even tried running the PHP-launched one at Realtime priority, thinking that with higher priority it would not get killed, but that did not help.
I am really stuck with this.
Any help would be great.
Thanks!
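For context, one workaround that often comes up for Windows processes being killed when tied to the launching web request (a sketch, not the asker's code; it requires the com_dotnet extension) is to launch through the WScript.Shell COM object so the child is detached from the web server's process:

$shell = new COM('WScript.Shell');
// 0 = hidden window, false = do not wait for the child to finish
$shell->Run('c:\php\php.exe "D:\Web\phpScript.php"', 0, false);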
If it has to do with your PHP configuration, you can force a longer execution time by putting this at the beginning of the script:
set_time_limit(60*60*2); // allows 2 hours execution time
Then to execute the external file just use include('D:\Web\phpScript.php'); in the script.
http://php.net/manual/en/function.set-time-limit.php
Otherwise, if it's a server problem, beats me. You could, of course, run it in your web browser instead of in the command prompt, if PHP is installed on the machine.
I am trying to run my PHP scripts in Gearman worker code, but I also want to monitor them and, if they take more than the expected run time, kill those scripts. Each script has to run in a timely fashion (say, every 10 minutes), and the Gearman client picks the scripts that are ready to run and sends them to the Gearman worker.
I tried the following options:
1) I tried using an independent script, a normal PHP script which monitors the running processes.
But this normal script will not inform Gearman that the job got killed, so Gearman thinks the killed job is still running.
That made me think I have to combine the monitoring and the running of the PHP scripts in the same worker (see the sketch after this list).
These jobs also need to be restarted, and the client takes care of that.
2) I am running my PHP scripts using the following command:
cd /home/amehrotra/include/core/background; php $workload; (this is blocking: it does not go to the next line until the script finishes execution).
I tried using exec, but exec does not execute the scripts:
exec("/usr/bin/php /home/amehrotra/include/core/background/$workload >/dev/null &");
3) I tried running two workers, one for running the PHP scripts and another for monitoring, but the Gearman client does not connect to two workers.
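A minimal sketch of that same-worker idea, running the workload and the timeout watch together via proc_open (the path and the 10-minute limit are assumptions):

$proc = proc_open('php /home/amehrotra/include/core/background/' . $workload, [], $pipes);
$deadline = time() + 600; // expected run time: 10 minutes
while (($status = proc_get_status($proc)) && $status['running']) {
    if (time() > $deadline) {
        proc_terminate($proc); // overran: kill the script
        break;
    }
    sleep(1);
}
proc_close($proc); // the worker returns here, so Gearman sees the job finish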
Not the coolest plan, but try using a database as the central place from which everything is controlled.
It will take some resources and time from your workers, but that is the cost of making it manageable.
The worker will need to check for commands (stop/restart) assigned to it via the DB, and it can also save some data into the DB so you can see what is happening.
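Roughly like this (a sketch; the table and column names are made up):

$pdo = new PDO('mysql:host=localhost;dbname=jobs', 'user', 'pass');
$stmt = $pdo->prepare('SELECT command FROM worker_commands WHERE worker_id = ?');
$stmt->execute([$workerId]);
if ($stmt->fetchColumn() === 'stop') {
    // record the state so the client can see it, then stop this worker
    $pdo->prepare("UPDATE jobs SET status = 'stopped' WHERE id = ?")->execute([$jobId]);
    exit;
}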