I'm running IIS on a Windows Server w/PHP 5.3. I have two scripts; let's call them initiator.php and worker.php. A user calls initiator.php and in this script a variable is defined; let's call it $input. I would like to take this $input variable and pass it to worker.php like so:
$oShell = new COM('Wscript.Shell');
$oShell->Run("\"C:/Program Files (x86)/PHP/v5.3/php/worker.php -a $input",0,False);
In worker.php I have the following to pick up the $input variable passed from initiator.php.
$aCliOpts = getopt('a:');
$input_from_initiator = $aCliOpts['a'];
This works great. initiator.php's $input variable is successfully passed to worker.php, which picks it up, and initiator.php keeps chugging. However, worker.php then takes its own $input_from_initiator variable, runs through some quick code of its own, and creates a third variable called $output_from_worker. It is this variable that I need initiator.php to read a little way into its processing. This is where I'm getting hung up.
I've tried passing the variable back to initiator.php from worker.php the same way a variable was passed in at the beginning, and this did not work. I've also tried to use:
header('Location: initiator.php?var=value')
using HTTP GET params but to no avail.
My last resort is for worker.php to write this variable's value to disk and then have initiator.php read it from disk. I hate to do this because of the disk I/O latency; speed is very important to this script.
Is there a way two PHP processes can pass variables between each other in memory?
Have a look at file_get_contents() (http://php.net/file_get_contents), which you can pass a URL to. So you could use the query string like:
$var = file_get_contents('http://site.tld/worker.php?input='.$input);
And in worker.php, simply echo your result.
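For example, worker.php might look roughly like this (process_input() is just a placeholder for whatever work worker.php actually does):
<?php
/* worker.php - sketch only; process_input() is a placeholder for your own logic */
$input_from_initiator = isset($_GET['input']) ? $_GET['input'] : '';
$output_from_worker = process_input($input_from_initiator);

// Whatever is echoed here becomes the return value of file_get_contents() in initiator.php.
echo $output_from_worker;
?>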
It's a shame you're running on Windows, because the System V IPC extensions (sysvshm, sysvsem) on *nix are marvelous!
You can always use files or a database etc for communication.
Thanks for the help, although I ended up doing something a little different from my original question: I was trying to run different cURL requests, and curl_multi_exec() ended up working great for me.
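In case it helps anyone else, a rough sketch of the curl_multi pattern (the URLs here are just placeholders):
// Fire several requests in parallel and collect each worker's echoed output.
$urls = array(
    'http://localhost/worker.php?input=' . urlencode($input),
    'http://localhost/other_worker.php?input=' . urlencode($input),
);

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Run all the requests until they are finished.
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

// Collect whatever each worker echoed.
foreach ($handles as $ch) {
    $output = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);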
I'm having trouble running the exec() function on my PHP website. I am able to run it several times with an executable that just takes a variable argument and returns some test message. However, when I use an executable that does some image processing, where I want to pass an image that a user uploads on the website as an argument, it does not seem to execute the executable at all. I even have some cout statements in the executable to confirm it's running, but these are not displayed on the website. So I think that for some reason PHP cannot run the executable? I am able to run it fine from my desktop...
Here's an example of the code that isn't working on my PHP website:
$imgtest1="/uploaded_files/me.jpg";
$imgtest2="/uploaded_files/clusteroutput.jpg";
$nosuppix = 400;
$noweight = 100;
$executabletest = exec("ImgProc $imgtest1 $nosuppix $noweight $imgtest2");
echo $executabletest;
Is there a way to debug or get an error output from the exec function? Is there something I'm missing when passing an image to the executable? The executable uses several DLL files which are in the same folder as the executable. Do they need to be packaged together for some reason? I apologize, but I really don't know what's left to test...
Edit: I'm now able to get it to run if I write out the whole command inside the escapeshellcmd() call itself... how come I'm not able to just pass the variables?
$cmdinput = escapeshellcmd('SuperpixelsFinal "D:/WebPages/TALIA ART/TALIA ART/uploaded_files/me.jpg" 400 100 "D:/WebPages/TALIA ART/TALIA ART/uploaded_files/clusteroutput.jpg"');
For anyone else having this issue: it was because I was using escapeshellcmd() when I should have used escapeshellarg() on each individual string and then run exec() with a double-quoted string containing only the variables, like so:
$executabletest = exec("SuperpixelsFinal $imgtest1 $nosuppix $noweight $imgtest2");
I am using a timer function in MATLAB to continuously execute a certain script. Within this script, I am using urlread to retrieve data from web services, which works like a charm.
I am now trying to use urlread within this script to execute a simple HTTP request that inserts data into a MySQL database. Thus, I simply specify the URL string and append the value to be passed to the PHP script.
Code within the script being executed by the timer function:
db_url = 'http://someurl/update.php?value=';
db_url = strcat(db_url,num2str(value));
urlread(db_url);
clear db_url
My problem is the following: when I run the timer, it works fine for one execution, but then stops, displaying the following error:
"Either this URL could not be parsed or the protocol is not supported."
When I check my MySQL database, I see that one new row has been added, which means it generally works; it just won't execute multiple times within the timer. Any idea what is going wrong? Many thanks in advance!
I figured out what the problem was. The value variable is an array that grows in size with each iteration. Thus, what I needed to do was specify value(end), like so:
db_url = 'http://someurl/update.php?value=';
db_url = strcat(db_url,num2str(value(end)));
urlread(db_url);
clear db_url
I need to load data into memory as an array in PHP. But if I write $array = array("1","2"); in test.php, this $array variable is initialized on every request. If test.php is requested 100 times (say, by clicking the browser's refresh button 100 times), the $array assignment is executed 100 times.
I need the $array assignment to be executed only once, on the first request; subsequent requests of test.php should not execute it again but simply reuse that memory location. How can I do that in PHP?
In a Java servlet this is easy: you put the one-time initialization in the init() lifecycle method, which runs only once; subsequent requests go through service(), and service() always reuses that memory.
All I want is to initialize the $array variable once and use that memory location on subsequent requests in PHP. Is there any possibility of doing this in PHP?
PHP works differently than a Java servlet container. Every new request basically starts an entirely new instance of the PHP interpreter, so you don't have a global address space across requests (you do have a session per user, usually persisted to a file, to keep variables across requests for one user).
Something that might come close would be to use memcached with PHP as your "database", but you would have to send a request to the memcached server every time you need your array. That is why I think your array (if it doesn't change) is best kept and initialized in the PHP file.
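A rough sketch of the memcached idea, assuming the pecl "memcached" extension and a server running on localhost:11211 (the cache key is arbitrary):
$mc = new Memcached();
$mc->addServer('localhost', 11211);

$array = $mc->get('my_array');
if ($array === false) {
    // First request (or the cache expired): build the array once and store it.
    $array = array("1", "2");
    $mc->set('my_array', $array);
}
// $array is now reused across requests, at the cost of one round trip to memcached.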
Use a session.
Start the session when the user opens test.php and set the array in that session:
<?php
session_start();
if (!isset($_SESSION['user_action'])) {
    $_SESSION['user_action'] = array("1", "2");
}
?>
That code just checks whether the session variable "user_action" is set; if it isn't, it is set with that array. You can then change that variable later.
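On any later request you simply read it back out of the session, for example:
<?php
session_start();
$array = $_SESSION['user_action']; // the array("1", "2") stored on the earlier request
?>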
All variables are destroyed at request shutdown; there is no built-in mechanism in PHP to do what you want.
PHP has a different execution model. In general, what you are asking for is impossible in plain PHP, and that's OK. Within a single request, you can at least avoid re-initializing the array by using $GLOBALS:
<?php
/* test.php */
if (empty($GLOBALS['array'])) {
    $GLOBALS['array'] = array("1", "2");
}
?>
I have a PHP script that accepts a POST request as a listener to a web service and then processes all the data into two final arrays.
I'm looking for a way to initiate a second script that gets those serialized arrays and does some more processing.
include() is not good for me, since I actually want to "free" or "end" the first script after passing the data.
Your help is much appreciated as always :)
EDIT: OK, so it looks like a queue might be the solution! I've never done anything like this before; any examples or references?
Does it need to happen immediately? If not, you could set up a cron job that runs every X minutes. You'll have to make some kind of queue in which your first script sticks "requests" for the second script; the cron job then processes the requests in the queue, as sketched below.
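A very simple file-based sketch of that idea (the file name, array names, and the base64 wrapping are just one way to do it):
<?php
/* listener.php (sketch): append each request to a queue file and return immediately */
$payload = base64_encode(serialize(array($array_1, $array_2)));
file_put_contents('/tmp/queue.dat', $payload . PHP_EOL, FILE_APPEND | LOCK_EX);

/* worker.php (sketch): run from cron every X minutes to drain the queue */
$lines = file('/tmp/queue.dat', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
file_put_contents('/tmp/queue.dat', '', LOCK_EX); // naive reset; a real queue needs safer locking
foreach ($lines as $line) {
    list($array_1, $array_2) = unserialize(base64_decode($line));
    // ... heavy processing here ...
}
?>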
You should get into the habit of writing PHP scripts that are just a collection of functions (no auto-run code, per se). This way you can include a script file at the top of the script you're talking about and then call the function that does what you want.
For instance:
<?php
include('common_functions.php');
$array_1 = whatever_you_do_with_post_values();
$array_2 = other_thing_you_do_with_post_values();
// this function is located in 'common_functions.php'
do_stuff_with_arrays($array_1,$array_2);
?>
In fact, just to be consistent with what I'm saying:
<?php
include('common_functions.php');
do_your_stuff();
function do_your_stuff() {
    $array_1 = whatever_you_do_with_post_values();
    $array_2 = other_thing_you_do_with_post_values();
    // this function is located in 'common_functions.php'
    do_stuff_with_arrays($array_1, $array_2);
}
?>
Obviously you should use better function & variable names, haha.
I'd do it all in one request. It cuts down on latency and makes the whole operation more efficient.
Remember you can have a long-running request and still service other requests. Apache will just spawn another PHP process to handle the other request from the web service even though the first has not completed. As long as the script doesn't lock a shared resource (database, file, etc.) this will work just fine.
That said, you should use cURL to call the second script and POST the serialized arrays; cURL will handle the rest.
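A rough sketch of that (the URL and field names are placeholders):
// Fire the second script with cURL and POST the serialized arrays.
$ch = curl_init('http://localhost/second_script.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'array_1' => serialize($array_1),
    'array_2' => serialize($array_2),
));
$response = curl_exec($ch);
curl_close($ch);

// In the second script:
// $array_1 = unserialize($_POST['array_1']);
// $array_2 = unserialize($_POST['array_2']);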
I'm having a little problem with the following:
When I execute this line:
echo exec(createDir($somevariable));
I get this error:
Warning: exec() [function.exec]: Cannot execute a blank command in /home/mydir/myfile.inc.php on line 32
Any ideas? Thanks.
exec() expects a string argument, which it would pass on to your operating system to be executed. In other words, this is a portal to the server's command line.
I'm not sure what function createDir() is, but unless it's returning a valid command line string, it's probably failing because of that.
In Linux, you might want to do something like
exec('/usr/bin/mkdir '.$path);
...on the other hand, you should avoid using exec() at all costs. What you can do here instead is take a look at mkdir().
With exec() you can execute system commands as if you were using the command line. It has nothing to do with executing PHP functions.
To create a directory you could do the following:
exec( 'mkdir [NAME OF DIRECTORY]' );
I'd guess that your createDir() function doesn't return anything. It might also be worth checking that $somevariable is set to something sensible.
You're misunderstanding the purpose of exec(). If all you want to do is create a directory then you should use mkdir().
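For example (assuming $somevariable holds the path of the directory you want to create):
if (!is_dir($somevariable)) {
    mkdir($somevariable, 0755, true); // third argument creates missing parent directories
}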
I think I've derived from other posts and comments what it is you actually want to do:
I think createDir() is a PHP function you've written yourself. It does more than just make a directory - it populates it, and that might take some time.
For some reason you believe that the next command gets run before createDir() has finished working, and you thought that by invoking createDir() using exec() you could avoid this.
Tell me in a comment if this is way out, and I'll delete this answer.
It seems unlikely that createDir() really does keep working after it has returned (if it does, we call that 'asynchronous'). It would require the programmer to go out of their way to make it asynchronous, so check that assumption.
Even so, exec() is not for invoking PHP functions. It is for invoking shell commands (the kind of thing you type in at a command prompt). As many of us have observed, it is to be avoided unless you're very careful - the risk being that you allow a user to execute arbitrary shell commands.
If you really do have to wait for an asynchronous function to complete, there are a couple of ways this can be done.
The first way requires that the asynchronous function has been written in an amenable manner. Some APIs let you start an asynchronous job, which will give you a 'handle', then do some other stuff, then get the return status from the handle. Something like:
handle = doThreadedJob(myParam);
# do other stuff
results = getResults(handle);
getResults would wait until the job finished.
The second way isn't as good, and can be used when the API is less helpful. Unfortunately, it's a matter of finding some clue that the job is finished, and polling until it is.
while( checkJobIsDone() == false ) {
sleep(some time interval);
}
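In PHP, a sketch of that polling loop, assuming the background job signals completion by writing a flag file (the file names are purely illustrative):
// Wait until the background job drops its "done" flag file.
while (!file_exists('/tmp/job_done.flag')) {
    sleep(1); // some time interval
    clearstatcache(); // so file_exists() doesn't keep returning a cached result
}
$results = file_get_contents('/tmp/job_results.dat');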
I'm guessing createDir() doesn't have a return value.
Try exec("mkdir $somevariable");