set time limit on java code execution through shell_exec - php

I have developed an online Java code editor at http://joomla5.guru99.com/try-java-editor.html. I am invoking javac using PHP's shell_exec function and executing the Java code.
$result = shell_exec('javac ' . $soucejavafile . ' 2>&1');
and running the class file with
$result = shell_exec('java ' . $classfile . ' 2>&1');
Now, for security purposes, I want to set a time limit on this Java code execution. For example, the execution should be stopped after some amount of time and all of its processes must be killed.
I have tried the ulimit and ps commands but wasn't able to achieve this.
Please point me in the right direction and help me make this possible.
Regards.

You can do it in 3 ways:
1) Call pcntl_fork in PHP and check the timeout in the parent process. Kill the child with the Linux kill command if it exceeds the limit.
2) Handle the timeout in a bash script that you invoke via shell_exec; see this example:
http://www.bashcookbook.com/bashinfo/source/bash-4.0/examples/scripts/timeout3
3) Use proc_open / proc_terminate functions
Personally I would go with number 3; it's the cleanest. If you need something quick and dirty, use number 2.
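A minimal sketch of option 3 might look like the following. The Main class name, the 5-second limit, and the polling interval are assumptions; adapt them to your editor. Note that if the command is wrapped in a shell, you may need to kill the whole process group rather than just the immediate child.
<?php
// Run a command with a wall-clock time limit, killing it if it runs too long.
function run_with_timeout($cmd, $timeoutSeconds = 5)
{
    $descriptors = [1 => ['pipe', 'w'], 2 => ['pipe', 'w']];
    $process = proc_open($cmd, $descriptors, $pipes);
    if (!is_resource($process)) {
        return false;
    }
    stream_set_blocking($pipes[1], false);
    stream_set_blocking($pipes[2], false);

    $start = time();
    $output = '';
    while (true) {
        $status = proc_get_status($process);
        $output .= stream_get_contents($pipes[1]) . stream_get_contents($pipes[2]);
        if (!$status['running']) {
            break;                        // finished on its own
        }
        if (time() - $start >= $timeoutSeconds) {
            proc_terminate($process, 9);  // SIGKILL the java process
            $output .= "\n[Execution stopped: time limit exceeded]";
            break;
        }
        usleep(100000);                   // poll every 100 ms
    }
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($process);
    return $output;
}

// Example usage (class name is an assumption):
$result = run_with_timeout('java Main', 5);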

Related

Running PHP file using command line as background process

I have CentOS VPS hosting with WHM/cPanel installed. I want to run a PHP script from the command line for an unlimited time.
My script looks like this:
<?php
set_time_limit(0);
while (true)
{
    // code to send me an email
    sleep(600);
}
?>
I know that this script should run for an unlimited time.
I have used these commands:
php myfile.php &
nohup php myfile.php &
I found these commands on Stack Overflow, and they run fine. But after one hour, the script stops automatically.
I think I am doing this right, but I do not know what is killing that process.
If not, I want to know how to run this script for an unlimited time.
What you are doing is correct. It should run. I have PHP scripts that run for much longer than an hour; they run for days on end. I also have programs that are not PHP and should run forever, but don't, because they die due to a bug in the program. For example, xscreensaver dies on me about once a week. To keep it running, I use this shell script (which you can also use to keep your PHP script running):
while :
do
    xscreensaver &
    wait
done
Now, running that shell script will start the program again if it ever dies for any reason.
If you have console access, try using a cron job to run this file.
see : https://www.centos.org/docs/5/html/Deployment_Guide-en-US/ch-autotasks.html
also : http://alvinalexander.com/linux/linux-crontab-file-format-example
Make sure you use the php command in the cron job.
Note: you'll require admin rights to edit/work with cron jobs.

non-stopping program in PHP

Is it possible to write a non-stopping program in PHP? For example, one using 2% of the processor and some memory all of the time. If it's not possible, can you tell me what direction I should look in for a non-stopping C++ program (on a UNIX server) and how to pass variables from PHP to C++?
EDIT:
First: I have a max execution time which is stopping it (but I need it for the other scripts in case of bugs).
Second: I don't want to burn the server, so a plain while(true) is not the best idea (it has to have some cap on memory and processor usage).
You can use the CLI.
Create your PHP file and run it on the command line; it won't stop unless the code ends.
You can limit the memory usage: php -d memory_limit=128M my_script.php. This overrides the php.ini directive, so you can set it in php.ini instead of defining it every time.
You can do something like this:
// run-forever.php
while (true) {
    // your executive code
    usleep(500); // time in microseconds - something like a yield, so it doesn't occupy the CPU
}
and then you can run: php run-forever.php
By the way, if you run this through web-based PHP, you'll have to call set_time_limit(0); before the while loop.

php shell_exec() help needed for running a script in the background

My project calls for 3 PHP scripts that are run with if-else conditions. The first script is loaded on the index page of the site to check if a condition is set, and if it is, it calls the second script. The second script checks whether other conditions are set and finally calls the last script if everything is good.
Now I could do this by just including the scripts in the if statement, but since the final result is a resource-hogging MySQL dump, I need it to run independently of the original trigger page.
Also, those scripts should continue doing their thing once triggered, regardless of the user's actions on the index page.
One last thing: it should be able to run on Windows and *nix.
How would you do this?
Does the following code make any sense?
if ($blah != $blah_size) {
    shell_exec('php first-script.php > /dev/null 2>/dev/null &');
}
// If the size matches, die
else {
}
Thanks a million in advance.
UPDATE: just in case someone else is going through the same deal.
There seems to be a bug in PHP when running scripts as CGI, but the command line works with all the versions I've tested under Apache.
See the bug https://bugs.php.net/bug.php?id=11430
So instead I call the script like this:
exec("php-cli mybigfile.php > /dev/null 2>/dev/null &");
Or you could call it via the shell. It works on *nix systems, but my local Windows is hopeless, so if anyone runs it on Windows and it works, please update this.
I would not do this by shell exec because you'd have no control over how many of these resource-hogging processes would be running at any one time. Thus, a user could go click-click-click-click and essentially halt your machine.
Instead, I'd build a work queue. Instead of running the dump directly, the script would submit a record to some sort of FIFO queue (could be a database table or a text file in a dir somewhere) and then immediately return. Next you'd have a cron script that runs at regular intervals and checks the queue to see if there's any work to do. If so, it picks the oldest thing, and runs it. This way, you're assured that you're only ever running one dump at a time.
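A minimal sketch of the file-based variant of that queue, assuming a writable queue/ directory next to the scripts (the directory, file naming, and job format are all assumptions, not part of the original answer):
<?php
// A tiny file-based FIFO queue: the trigger page enqueues, a cron worker dequeues.
$jobDir = __DIR__ . '/queue';   // assumed writable directory

// On the trigger page: drop a job file and return immediately.
function enqueue_dump($jobDir)
{
    $job = ['task' => 'mysql-dump', 'requested_at' => time()];
    file_put_contents($jobDir . '/' . microtime(true) . '.job', json_encode($job));
}

// In a cron script run at regular intervals: process the oldest job, one at a time.
function run_next_job($jobDir)
{
    $jobs = glob($jobDir . '/*.job');
    if (empty($jobs)) {
        return;
    }
    sort($jobs);                  // filenames are timestamps, so oldest first
    $job = json_decode(file_get_contents($jobs[0]), true);
    unlink($jobs[0]);             // remove it so it is not picked up twice
    // ... run the resource-hogging dump here, e.g. via shell_exec ...
}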
The easiest way I can think of is to do
exec("screen -d -m php long-running-script.php");
and then it will return immediately and run in the background. screen will allow you to connect to it and see what's happening.
You can also do what you're doing with 'nohup php long-running-script.php', or by writing a simple C app that does daemonize() and then execs your script.
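If you later need to stop or check on the background process, one approach (a sketch, not from the answers above; the script and PID-file paths are placeholders, and posix_kill requires the posix extension) is to capture its PID when you launch it:
<?php
// Launch in the background and capture the PID of the spawned process.
// The "echo $!" trick assumes a POSIX shell.
$pid = (int) shell_exec('nohup php /path/to/long-running-script.php > /dev/null 2>&1 & echo $!');
file_put_contents('/tmp/long-running-script.pid', $pid);

// Later, to stop it:
$pid = (int) file_get_contents('/tmp/long-running-script.pid');
if ($pid > 0 && posix_kill($pid, 0)) {  // signal 0 only checks that the process exists
    posix_kill($pid, 15);               // 15 = SIGTERM
}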

Kill background php script (shared hosting)

I created a script that runs in the background using the ignore_user_abort() function. However, I was foolish enough not to insert any sort of code to make the script stop and now it is sending e-mails every 30 seconds...
Is there any way to stop the script? I am on shared hosting, so I don't have access to the command prompt, and I don't know the PID.
Is there any way to stop the script? I am on shared hosting, so I don't have access to the command prompt, and I don't know the PID.
Then no.
But are you sure you don't have any shell access? Even via PHP? If you do, you could try....
<?php
print `ps -ef | grep php`;
...and if you can identify the process from that then....
<?php
$pid=12345; // for example.
print `kill -9 $pid`;
And even if you don't have access to run shell commands, you may be able to find the PID in /proc (on a Linux system) and terminate it using the POSIX extension....
<?php
$ps = glob('/proc/[0-9]*');
foreach ($ps as $p) {
    if (is_dir($p) && is_writeable($p)) {
        print "proc= " . basename($p);
        $cmd = file_get_contents($p . '/cmdline');
        print " / " . $cmd;
        if (preg_match('/(php).*(myscript.php)/', $cmd)) {
            posix_kill(basename($p), SIGKILL);
            print " xxxxx....";
            break;
        }
        print "\n";
    }
}
I came to this thread yesterday! By mistake I had an infinite loop in a page that was not supposed to be visited, and that pushed my I/O and CPU usage to 100. The I/O was caused by some PHP errors that were being logged, and the log file was growing beyond anything you can imagine.
None of the above tricks worked on my shared hosting.
MY SOLUTION
In cPanel, go to PHP Version.
Select any PHP version other than the current one, for the time being.
Then Apply Changes.
REASON WHY IT WORKED
The script with the infinite loop and the PHP errors was just a process, so I only needed to kill it. Changing the PHP version forces a restart of services like PHP and Apache, and since a restart was involved, the earlier processes were killed, and I was relieved as I/O and CPU usage stabilized. Also, I fixed that bug beforehand, before changing the PHP version :)
How did you deploy the script? Surely you can just remove it (if that's an acceptable option). Otherwise, modify it and insert some logic to only allow it to send a mail once every n mins/hours/days based on the server time.
Re. stopping the script from executing (or rather the system trying to execute it): how did you schedule it for execution? Is it some type of GUI to a crontab or something? Can you not just undo what you did there (seeing as you have no access to the command line/terminal)?
rob ganly
Simply call support and get it cancelled.
Next time, don't execute something you can't control.

PHP and shell_exec

I have a PHP website and I would like to execute a very long Python script in the background (300 MB of memory and 100 seconds). The process communication is done via a database: when the Python script finishes its job, it updates a field in the database, and then the website renders some graphics based on the results of the Python script.
I can execute the Python script "manually" from bash (from any current directory) and it works. I would like to integrate it into PHP, and I tried the shell_exec function:
shell_exec("python /full/path/to/my/script"), but it's not working (I don't see any output).
Do you have any ideas or suggestions? It is worth mentioning that the Python script is a wrapper over other polyglot tools (Java mixed with C++).
Thanks!
shell_exec returns a string; if you run it alone it won't produce any output, so you can write:
$output = shell_exec(...);
print $output;
First off, set_time_limit(0); will make your script run forever, so the timeout shouldn't be an issue. Second, any *exec call in PHP does NOT use the PATH by default (this might depend on configuration), so your script will exit without giving any info on the problem, and it quite often turns out that it simply can't find the program, in this case python. So change it to:
shell_exec("/full/path/to/python /full/path/to/my/script");
If your python script runs on its own without problems, then it's very likely this is the issue. As for the memory, I'm pretty sure PHP won't use the same memory python is using. So if the script uses 300 MB, PHP should stay at its default (say 1 MB) and just wait for shell_exec to finish.
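If it still produces no output, redirecting stderr into the return value usually reveals what went wrong (the python path below is an assumption):
// Assumed full paths; 2>&1 folds Python's error output into the returned string.
$output = shell_exec('/usr/bin/python /full/path/to/my/script 2>&1');
print $output;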
A problem could be that your script takes longer than the request timeout defined for the server (this can be set in php.ini or httpd.conf).
Another issue could be that the server's account does not have the rights to execute or access the code or files needed for your script to run.
I found this a while back and it helped me solve my background execution problem:
function background_exec($command)
{
    if (substr(php_uname(), 0, 7) == 'Windows')
    {
        pclose(popen('start "background_exec" ' . $command, 'r'));
    }
    else
    {
        exec($command . ' > /dev/null &');
    }
}
Source:
http://www.warpturn.com/execute-a-background-process-on-windows-and-linux-with-php/
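Usage would then look something like this (the script path is just a placeholder):
// Fire and forget; the page returns immediately while the script keeps running.
background_exec('php /full/path/to/long-running-script.php');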
Thanks for your answers, but none of them worked :(. I decided to implement it in a dirty way, using busy waiting, instead of triggering an event when a record is inserted.
I wrote a background process that runs forever and at each iteration checks whether there is something new in the database. When it finds a record, it executes the script and everything is fine. The idea is that I launch the background process from the shell.
I found that the issue when I tried this was simply that I had not compiled the source on the server I was running it on. If you compile on your local machine and then upload it to your server, the binary can end up broken in some way. shell_exec() should work if you compile the source you are trying to run on the same server where you run the script.
