I have a problem with my PHP script on Windows. It is run from Task Scheduler as a task at specific times: sometimes 5 tasks per hour, sometimes 30. The script communicates with a database on a localhost XAMPP server to fetch some data. It then runs an external program, ArticleReader, via exec(); the program generates an XML file if a block has been added on the page, and the script afterwards creates a new task for the new block. These blocks are added to the pages roughly every 10 minutes.
Everything works fine when about 30 scripts are running at once. With more than that, it stops working: a few scripts still run, but the others won't start. I have tried increasing memory_limit in php.ini and killing the task after a specific time, but with no result. The PHP script runs as administrator and reports no errors.
<?php
include("include/db_connect.php");
include("include/global_variable.php");
include("include/functions.php");
//ini_set('max_execution_time', '2700');
//ini_set('memory_limit','16M');
if (isset($argv) AND isset($argv[1]) AND is_numeric($argv[1]) AND isset($argv[2]) AND is_numeric($argv[2]) AND isset($argv[3]) AND is_numeric($argv[3])) {
    $article  = new Article_id($argv[1], $argv[2]);
    $category = new Category_id($article->get_category_id());
    $reader_path = "C:\\xampp\\htdocs\\projects\\main_app\\ArticleReader\\ArticleReader.exe";
    $next = $argv[3] ? " -n " : "";
    $file_name = $category->get_name().$argv[1]."-".($argv[3] ? (((int)$argv[3])+1).".part" : "").".xml";
    $result_path = "C:\\xampp\\htdocs\\projects\\main_app\\ArticleReader\\results\\articles\\".$file_name;
    exec(escapeshellcmd($reader_path.' -r -s -w "'.$article->get_url().'" -m '.$next.' -o "'.$result_path.'" & exit'));
    if (file_exists(XML_PATH_LOCAL.$file_name)) {
        if ($xml = utf8_encode(str_replace(array("&", "&"), array("&", "&"), file_get_contents(XML_PATH_LOCAL.$file_name)))) {
            if ($data = json_decode(json_encode(simplexml_load_string($xml)), TRUE)) {
                if (!isset($data["error"])) {
                    if ($argv[3] < 5)
                        Article_services::create_task_scheduler_next($argv[1], $argv[2], $argv[3] + 1);
                } else {
                    Article_services::create_task_scheduler_repeat($argv[1], $argv[2], $argv[3], (isset($argv[4]) ? $argv[4] + 1 : 0));
                }
            }
        }
    }
}
Could you help me with this problem, please? Thank you.
I have a LOT (almost 300) of old SVN repositories to migrate to Git using svn2git.
After considering Go and Python, I finally decided that the easiest way was to use PHP. It might be a questionable decision, but it seemed easy.
So after 15 minutes I had a script that runs more or less OK in tests. An ugly script, but it's a one-timer.
The problem is that the process takes a lot of time: even simple, almost empty repos can take 30 seconds to a minute, and big ones up to 10 minutes. So before taking it into production, I would like to have some feedback mechanism so I can actually see what is going on.
As of now, the script outputs the command feedback like so:
$cmd = "cd ".$GITrepoPath." && svn2git svn://127.0.0.1/". $repoName . " --username " .$SVNusername ." --authors authors.txt --notags --nobranches --notrunk";
$output = shell_exec($cmd);
echo "<pre>$output</pre>";
...but this only appears after each repo has finished processing, not like real cmd execution where I can see the steps.
The only question I found that might be close to what I need was here, but honestly I did not understand much from the answer...
I know it is just a one-timer script, but the use case had me interested in how to actually achieve that (and whether it is possible).
I am on a Win7 local machine, but would also like to know for *nix if possible.
shell_exec() waits until the process closes, and so does exec(): both only hand you the output once the command has finished, so there is nothing to poll while it runs. To stream the output line by line as it is produced, open the process with popen() and read from its pipe:
$cmd = ''; // your command here
$handle = popen($cmd . ' 2>&1', 'r'); // 2>&1 also captures stderr
while (($line = fgets($handle)) !== false) {
    echo $line; // each new line of output, as it arrives
    flush();
}
pclose($handle);
I suggest instead running a script or program in the background that runs the command and then updates a record in a database; you could then use AJAX or similar to poll the server for record changes. This gives the user a nice experience.
The column in the database table could be named something like "finished"; once that boolean is true you know it's complete, and the output can be stored in the database.
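The polling idea above might be sketched like this. Table and column names are illustrative assumptions, an in-memory SQLite database stands in for whatever database you use, and both "sides" run in one process here purely for demonstration; in reality the worker and the AJAX endpoint would be separate processes sharing a real database.

```php
<?php
// Sketch of the database-polling approach (assumed schema, for illustration).
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE jobs (id INTEGER PRIMARY KEY, finished INTEGER DEFAULT 0, output TEXT)');

// Background worker side: run the command, store its output, flip the flag.
$db->exec("INSERT INTO jobs (finished, output) VALUES (0, '')");
$jobId = $db->lastInsertId();
$output = shell_exec('echo hello');                // the long-running command
$stmt = $db->prepare('UPDATE jobs SET finished = 1, output = ? WHERE id = ?');
$stmt->execute([trim($output), $jobId]);

// Web side: what the AJAX poll would check on each request.
$row = $db->query("SELECT finished, output FROM jobs WHERE id = $jobId")
          ->fetch(PDO::FETCH_ASSOC);
echo $row['finished'] ? "done: {$row['output']}\n" : "still running\n";
```

The browser would repeat the SELECT until "finished" flips to true, then display the stored output.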
I was just wondering whether it's possible to detect the current execution time of a running script. I am creating an application to ping some computers on the network, and as this is being done from a Linux machine, the pinging behaviour differs from Windows.
On a Linux machine, if the target computer is off, the command will just hang after the initial ping message with no further output (in my experience with Linux pinging).
So I have this script:
$Computer_Array = array(
    "Managers"          => "192.168.0.5",
    "Domain Controller" => "192.168.0.1", // note: this comma was missing
    "Proxy Controller"  => "192.168.0.214"
);
foreach ($Computer_Array as $Addresses) {
    exec('ping '.$Addresses, $Output); // note: space needed after 'ping'
}
Later on this will be used to display statistics. The problem is that when the managers' computer is off, the ping command just hangs. So I'm wondering if there is a method to capture the microtime() of the currently executing function and, if it exceeds a threshold, move on to the next element. I would rather keep this to core PHP, but if such a solution can only be done via AJAX or another language, I would have to ask the developer whether it's alright to integrate an external method.
The ping command allows you to specify how long it will wait before giving up. On BSD/macOS:
ping -c 5 -t 1 127.0.0.2
This will return after one second, regardless of how many pings have been sent. The exact command-line arguments vary between platforms; on Linux, the overall deadline is set with -w (e.g. ping -c 5 -w 1 127.0.0.2), while -t sets the TTL instead.
Alternatively, if you can use pcntl, look into pcntl_alarm(); it will deliver a SIGALRM signal to your application after a certain amount of time that can be caught.
Lastly, and I haven't tested this myself, you could try using proc_open() and use stream_select() on one of the pipes; if nothing has happened on the pipe after a certain time you can then kill off the process.
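The untested proc_open()/stream_select() idea from the last paragraph might look roughly like this sketch. A throwaway shell command stands in for the real ping call, and the 5-second threshold is an arbitrary example value:

```php
<?php
// Run a command via proc_open() and give up if it outlives a deadline.
$cmd = "sh -c 'echo one; sleep 1; echo two'";   // substitute the ping here
$spec = array(1 => array('pipe', 'w'), 2 => array('pipe', 'w'));
$proc = proc_open($cmd, $spec, $pipes);
stream_set_blocking($pipes[1], false);

$deadline = microtime(true) + 5.0;   // stop waiting after this point
$output = '';
while (microtime(true) < $deadline) {
    $read = array($pipes[1]);
    $write = $except = null;
    // Wait up to 200 ms for activity on the process's stdout
    if (stream_select($read, $write, $except, 0, 200000) > 0) {
        $output .= fread($pipes[1], 8192);
    }
    $status = proc_get_status($proc);
    if (!$status['running']) {
        $output .= stream_get_contents($pipes[1]); // drain what's left
        break;
    }
}
proc_terminate($proc);  // harmless if the process already exited
proc_close($proc);
echo $output;
```

If the loop exits because the deadline passed rather than because the process finished, proc_terminate() is what kills off the hung ping.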
If you want to do this with PHP, or you run into a similar issue, here's an example using code from "php execute a background process":
The PHP script would need write permissions to the output files. This concept would essentially work for anything, from a ping to another PHP script.
function isRunning($pid) {
    try {
        $result = shell_exec(sprintf("ps %d", $pid));
        if (count(preg_split("/\n/", $result)) > 2) {
            return true;
        }
    } catch (Exception $e) {}
    return false;
}
$cmd = "ping 127.0.0.1";
$outputfile = "output";
$pidfile = "pid";
$start = microtime(true);
// Don't run longer than 2 seconds
$threshold = 2;
// Ping and get pid
exec(sprintf("%s > %s 2>&1 & echo $! > %s", $cmd, $outputfile, $pidfile));
$pid = `tail -n 1 $pidfile`;
// Let the process run until you want to stop it
while (isRunning($pid)) {
    // Check output here...
    if ((microtime(true) - $start) > $threshold) {
        $o = `kill $pid`;
        die("Timed out.");
    }
}
$end = microtime(true);
$time = $end - $start;
echo "Finished in $time seconds\n";
I'm using a cron job to run a PHP script every minute.
I also need to make sure only one copy is running, so that if the script is still running after 2 minutes, cron does not start another instance.
Currently I have two options, and I would like your feedback plus any other options you can think of.
Option 1: create a tmp file when the PHP script starts and remove it when the script finishes (checking whether the file exists). The problem with this option is that if the script crashes for any reason, the tmp file is never deleted, so the script will not run again.
Option 2: run a bash script like the one below to control the PHP script's execution. Good, but I'm looking for something that can be done within PHP:
#!/bin/bash
function rerun {
    BASEDIR=$(dirname $0)
    echo $BASEDIR/$1
    if ps -ef | grep -v grep | grep $1; then
        echo "Running"
        exit 0
    else
        echo "NOT running"
        /usr/local/bin/php $BASEDIR/$1 &
        exit $?
    fi
}
rerun myphpscript.php
PS: I just saw the Mutex class at http://www.php.net/manual/en/class.mutex.php, but I'm not sure whether it's stable or whether anyone has tried it.
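Option 1's crash problem can be addressed by storing the PID in the lock file and treating the lock as stale when that PID is no longer alive. A minimal sketch (the lock file path is illustrative; posix_kill() with signal 0 probes for existence without actually signalling, needs the posix extension, so this is Unix-only, and it can report false for live processes owned by other users):

```php
<?php
// PID-file lock that survives crashes: a leftover file whose PID is dead
// is treated as stale and taken over.
$lockFile = sys_get_temp_dir() . '/myscript.pid';   // illustrative path
if (file_exists($lockFile)) {
    $oldPid = (int) file_get_contents($lockFile);
    if ($oldPid > 0 && function_exists('posix_kill') && posix_kill($oldPid, 0)) {
        exit("Already running as pid $oldPid\n");
    }
    // Stale lock left behind by a crash; fall through and take over.
}
file_put_contents($lockFile, (string) getmypid());
echo "Running as pid " . getmypid() . "\n";
// ... the real work goes here ...
unlink($lockFile);
```

Note there is a small race between the file_exists() check and the write; the flock()-based answers below avoid that entirely, at the cost of holding a file handle open.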
You might want to use my library ninja-mutex, which provides a simple interface for handling mutexes. Currently it can use flock, memcache, redis or mysql to handle the lock.
Below is an example which uses memcache:
<?php
require 'vendor/autoload.php';
use NinjaMutex\Lock\MemcacheLock;
use NinjaMutex\Mutex;
$memcache = new Memcache();
$memcache->connect('127.0.0.1', 11211);
$lock = new MemcacheLock($memcache);
$mutex = new Mutex('very-critical-stuff', $lock);
if ($mutex->acquireLock(1000)) {
    // Do some very critical stuff
    // and release the lock after you finish
    $mutex->releaseLock();
} else {
    throw new Exception('Unable to gain lock!');
}
I often use the flock program that comes with many Linux distributions directly in my crontabs, like:
* * * * * flock -n /var/run/mylock.LCK /usr/local/bin/myprogram
Of course it is still possible to start two simultaneous instances of myprogram by hand, but crond will only start one.
flock being a small compiled binary makes it super fast to launch compared to a potentially larger chunk of PHP code. This is especially a benefit if you have many long-running executions, though it is not entirely clear that you actually do.
If you're not on a NFS mount, you can use flock() (http://php.net/manual/en/function.flock.php):
$fh = fopen('guestbook.txt', 'a') or die($php_errormsg);
$tries = 3;
while ($tries > 0) {
    $locked = flock($fh, LOCK_EX | LOCK_NB);
    if (!$locked) {
        sleep(5);
        $tries--;
    } else {
        // don't go through the loop again
        $tries = 0;
    }
}
if ($locked) {
    fwrite($fh, $_REQUEST['guestbook_entry']) or die($php_errormsg);
    fflush($fh) or die($php_errormsg);
    flock($fh, LOCK_UN) or die($php_errormsg);
    fclose($fh) or die($php_errormsg);
} else {
    print "Can't get lock.";
}
From: http://docstore.mik.ua/orelly/webprog/pcook/ch18_25.htm
I found the best solution for me was to create a separate database user for the script and limit the concurrent connections to 1 for that user.
I'm a little sorry to ask this, as I know it has been asked many times before on here, but none of the answers I have found work in my situation. Either I am doing something fundamentally wrong, or I am trying to do something that is just not possible.
I need to be able to fork a background process from a PHP file accessed via a web browser (served by an Apache web server running on Windows). I need the foreground process to finish, and therefore the browser to stop waiting for the server, while the forked process continues in the background. I am using PHP 5.3.
I have tried numerous suggested solutions, all with varying degrees of failure:
shell_exec('D:\php5.3\php.exe sleep.php > out 2>out2' );
Whether run through the command line or through the browser, the foreground process did not complete until the background one did. Adding an "&" at the end didn't seem to make any difference.
pclose(popen("start D:\php5.3\php.exe sleep.php","r"));
This one worked fine through the command line, but when accessed via the browser it waited for both the foreground and background processes to finish.
exec("nohup D:/php5.3/php.exe -f sleep.php > out 2>out2");
This didn't seem to work at all.
$commandString = "start /b D:/php5.3/php.exe d:\\webroot\\other\\tests\\sleep.php";
pclose(popen($commandString, 'r'));
Worked in the command line, waited in the browser.
$WshShell = new COM("WScript.Shell");
$oExec = $WshShell->Run("D:/php5.3/php-win.exe -f d:\\webroot\\other\\tests\\sleep.php", 0, false);
Didn't work at all; it hung when trying to fork the new process.
Can anyone help? Am I missing something really obvious?
I know I could queue the tasks in a database and run them in a batch, but I need this to operate in as close to real time as possible, so I don't want to introduce more delay by queueing things that can then only run at most once a minute.
You might try the pcntl_fork functions. They can fork a child process (letting the parent finish serving the web page) while the rest of the work continues in the background. Note, however, that the pcntl extension is only available on Unix-like systems, not on Windows, and it needs to be used with caution; read through the php.net documentation to get a feel for some of the complexities of this approach.
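A minimal pcntl_fork() sketch, assuming a Unix machine with the pcntl extension (it will not run on the asker's Windows setup, which is why it degrades gracefully here):

```php
<?php
// Fork a background child; the parent continues (and could respond to the
// browser) immediately, while the child does the slow work.
if (function_exists('pcntl_fork')) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        // Child: the long-running background work goes here.
        usleep(100000);
        exit(0);                     // child must exit, never fall through
    } else {
        // Parent: free to finish the request right away...
        echo "parent continues, child pid $pid\n";
        // ...or reap the child if it does want to wait for it.
        pcntl_waitpid($pid, $status);
        $done = true;
    }
} else {
    echo "pcntl extension not available (e.g. on Windows)\n";
    $done = true;
}
```

Beware that forking inside Apache's mod_php duplicates the whole server worker; pcntl is really intended for CLI scripts.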
Even though this post is outdated, it may be useful for others.
This worked for me; it took hours to find the solution!
function execInBack($phpscript, $args = array(), $logfile = '', $stdoutfile = '') {
    if (count($args) == 0) {
        $args[] = md5(mt_rand());
    }
    $arg = '';
    foreach ($args as $a) {
        $arg .= "\"$a\" ";
    }
    $cmd = "start /b c:\php\php5.3.10\php.exe ./$phpscript $arg";
    $p = array();
    $desc = array();
    if ($logfile && $stdoutfile) {
        $desc = array(
            1 => array("file", $stdoutfile, "w"),
            2 => array("file", $logfile, "a")
        );
    }
    proc_close(proc_open($cmd, $desc, $p, getcwd()));
}
I need to build a system where a user sends a file to the server,
and PHP then runs a command-line tool using system() (for example, tool.exe userfile).
I need a way to get the PID of the process, to know which user started the tool,
and a way to know when the tool has stopped.
Is this possible on a Windows Vista machine? I can't move to a Linux server.
Besides that, the code must continue running when the user closes the browser window.
Rather than trying to obtain the ID of a process and monitor how long it runs, I think that what you want to do is have a "wrapper" process that handles pre/post-processing, such as logging or database manipulation.
The first step to the is to create an asynchronous process, that will run independently of the parent and allow it to be started by a call to a web page.
To do this on Windows, we use WshShell:
$cmdToExecute = "tool.exe \"$userfile\"";
$WshShell = new COM("WScript.Shell");
$result = $WshShell->Run($cmdToExecute, 0, FALSE);
...and (for completeness) if we want to do it on *nix, we append > /dev/null 2>&1 & to the command:
$cmdToExecute = "/usr/bin/tool \"$userfile\"";
exec("$cmdToExecute > /dev/null 2>&1 &");
So, now you know how to start an external process that will not block your script, and will continue execution after your script has finished. But this doesn't complete the picture - because you want to track the start and end times of the external process. This is quite simple - we just wrap it in a little PHP script, which we shall call...
wrapper.php
<?php
// Fetch the arguments we need to pass on to the external tool
$userfile = $argv[1];
// Do any necessary pre-processing of the file here
$startTime = microtime(TRUE);
// Execute the external program
exec("C:/path/to/tool.exe \"$userfile\"");
// By the time we get here, the external tool has finished - because
// we know that a standard call to exec() will block until the called
// process finishes
$endTime = microtime(TRUE);
// Log the times etc and do any post processing here
So instead of executing the tool directly, we make our command in the main script:
$cmdToExecute = "php wrapper.php \"$userfile\"";
...and we should have a finely controllable solution for what you want to do.
N.B. Don't forget to escapeshellarg() where necessary!
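To make the escapeshellarg() reminder concrete, here is a small example; the file name is a made-up malicious value, and the exact quoting differs between platforms (single quotes on *nix, double quotes on Windows):

```php
<?php
// Quote user-supplied values before splicing them into a command line,
// otherwise a crafted file name becomes shell injection.
$userfile = "my file;rm -rf ~.txt";          // hypothetical hostile input
$cmd = "php wrapper.php " . escapeshellarg($userfile);
echo $cmd . "\n";
```

Without the escaping, the semicolon would end the php command and run the rest as a second shell command.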