I have a PHP script that accepts a POST request as a listener to a web service, then processes all the data into two final arrays.
I'm looking for a way to initiate a second script that GETs those serialized arrays and does some more processing.
include() will not work for me, since I actually want to "free" or "end" the first script after passing the data.
Your help is much appreciated, as always :)
EDIT: OK, so it looks like a queue might be the solution! I've never done anything like this before; any examples or references?
Does it need to happen immediately? Otherwise you could set up a cronjob that runs every X minutes. You'll have to make some kind of queue into which your first script sticks "requests" for the second script; the cronjob then processes the requests in the queue.
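For example, a minimal file-based sketch of such a queue (the spool path and the do_stuff_with_arrays() function are placeholders; a database table would work just as well). The listener appends one JSON-encoded job per line:
<?php
// in the listener, after building the two final arrays
$job = json_encode(array($array_1, $array_2));
file_put_contents('/var/spool/myapp/queue.txt', $job . PHP_EOL, FILE_APPEND | LOCK_EX);
?>
The cronjob (e.g. * * * * * php /path/to/worker.php) then claims the batch and processes it:
<?php
// worker.php, run from cron
$queue = '/var/spool/myapp/queue.txt';
if (!file_exists($queue)) {
    exit; // nothing queued
}
rename($queue, $queue . '.work'); // claim the current batch atomically
$jobs = file($queue . '.work', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($jobs as $job) {
    list($array_1, $array_2) = json_decode($job, true);
    do_stuff_with_arrays($array_1, $array_2); // your second-stage processing
}
unlink($queue . '.work');
?>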
You should get into the habit of writing PHP scripts that are just collections of functions (no auto-run code, per se). That way you can include a script file at the top of the script you're talking about and then call the function that does what you want.
For instance:
<?php
include('common_functions.php');
$array_1 = whatever_you_do_with_post_values();
$array_2 = other_thing_you_do_with_post_values();
// this function is located in 'common_functions.php'
do_stuff_with_arrays($array_1,$array_2);
?>
In fact, just to be consistent with what I'm saying:
<?php
include('common_functions.php');

do_your_stuff();

function do_your_stuff() {
    $array_1 = whatever_you_do_with_post_values();
    $array_2 = other_thing_you_do_with_post_values();
    // this function is located in 'common_functions.php'
    do_stuff_with_arrays($array_1, $array_2);
}
?>
Obviously you should use better function & variable names, haha.
I'd do it all in one request. It cuts down on latency and makes the whole operation more efficient.
Remember you can have a long-running request but still service other requests. Apache will just spawn another PHP process to handle the new request from the web service even though the first has not completed. As long as the script doesn't lock a shared resource (database, file, etc.) this will work just fine.
That said, if you do split it up, you should use cURL to call the second script and POST the serialized arrays to it; cURL will handle the rest.
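A rough sketch of that hand-off (the URL and field names here are made up):
<?php
// first script: hand the two arrays to the second script over HTTP POST
$ch = curl_init('http://localhost/process.php'); // placeholder URL
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'array_1' => serialize($array_1),
    'array_2' => serialize($array_2),
)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);
// process.php can then unserialize($_POST['array_1']) and carry on while this script ends
?>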
I have a PHP script that sends data to another script and processes it asynchronously (at least I hope to get it working like that). Here is the code of called.php:
include_once("../caller.php");
chdir(__DIR__);

fclose(STDOUT); // THIS
fclose(STDIN);  // THIS
fclose(STDERR); // THIS

function giveCake($arg1, $arg2) {
    global $mysqli;
    $sleep = 15; // script has to sleep
    (...) code amongst sleep (...)
    sleep($sleep);
    $_SESSION; // would the user's session variables be available if the script is called as described?
    // caller.php is first initiated by a script with a pre-defined $_SESSION
    // now that I think about it, maybe they won't be, since it is called from the command line...
    pcntl_exec("/usr/bin/php", Array($_SERVER['argv'][1]));
}

if (!isset($_SERVER["HTTP_HOST"])) { // check if it comes from within the server? localhost?
    $arg1 = parse_str($argv[1], $_GET);
    $arg2 = parse_str($argv[1], $_POST);
    if ($arg1 && $arg2) {
        giveCake($arg1, $arg2);
    }
}
My concerns, as given in the title, are:
By closing the file descriptors (as at the beginning of called.php), do I affect all other scripts that might be using file operations, or only the one executing at that moment?
If called using cURL, is the script left vulnerable to inappropriate execution? Although I think I would most certainly have access to $_SESSION, that would leave it easily spoofable if someone wanted to execute it. Any way to counter this?
Considering that the arguments I need to transfer between scripts could easily add up to a lot of bytes (around 400 bytes per array, times x arrays), would there be any problem with execution?
Thank you very much for your help. I hope you don't consider this too broad, since I've tried to detail all my concerns explicitly and would like help with the whole process (easier than fragmenting it).
Q1: File operations always affect only the script currently in execution, of course including all libraries loaded via require or include.
Q2: Depending on where the caller and the callee sit, you could limit access, for example by restricting it to certain IPs, and perhaps restrict the access method, via .htaccess.
Like:
<Limit GET POST>
    order deny,allow
    deny from all
    allow from 1.2.3.4
</Limit>
Q3: Also depending on the connection between the two scripts, there should usually be no problem with large amounts of data, provided you have enough bandwidth available.
We have some scripts in operation that regularly handle data in the range of a few hundred megabytes. It may be necessary to extend or turn off the script execution time limit, by setting max_execution_time in php.ini, by using ini_set(), or by calling set_time_limit().
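For instance, either of these lifts the limit for the current script:
<?php
set_time_limit(0);                  // no time limit from this point on
// or, via the ini layer:
ini_set('max_execution_time', '0'); // 0 means unlimited
?>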
pcntl_exec() simply replaces the current process with the new one. There is actually no communication happening, so I'm wondering why you expect some asynchronous communication to take place.
Also, I'm unsure what $_SERVER['argv'][1] is supposed to do here. Don't you mean argv[0]?
So at the moment you have just presented a bunch of non-working code. That's not much to go on.
I'm running IIS on a Windows Server w/PHP 5.3. I have two scripts; let's call them initiator.php and worker.php. A user calls initiator.php and in this script a variable is defined; let's call it $input. I would like to take this $input variable and pass it to worker.php like so:
$oShell = new COM('Wscript.Shell');
$oShell->Run("\"C:/Program Files (x86)/PHP/v5.3/php/worker.php\" -a $input", 0, false);
In worker.php I have the following to pick up the $input variable passed from initiator.php.
$aCliOpts = getopt('a:');
$input_from_initiator = $aCliOpts['a'];
This works great. initiator.php's $input variable is successfully passed to worker.php, which picks it up while initiator.php keeps chugging. However, worker.php then takes its own $input_from_initiator variable, runs through some quick code of its own, and creates a third variable called $output_from_worker. It is this variable that I need initiator.php to read a little way into its processing. This is where I'm getting hung up.
I've tried passing the variable back from worker.php to initiator.php the same way the variable was passed in at the beginning, and this did not work. I've also tried to use:
header('Location: initiator.php?var=value')
using HTTP GET params, but to no avail.
My last resort is for worker.php to write this variable's value to disk and then have initiator.php read it from disk. I hate to do this because of the latent disk I/O; speed is very important to this script.
Is there a way two PHP processes can pass variables between each other in memory?
Have a look at file_get_contents() (http://php.net/file_get_contents), which you can pass a URL to. So you could use the query string like:
$var = file_get_contents('http://site.tld/worker.php?input='.$input);
And in worker.php, simply echo your result.
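For illustration, the receiving side could look like this (strrev() is just a stand-in for your real processing):
<?php
// worker.php
$input = isset($_GET['input']) ? $_GET['input'] : '';
$output_from_worker = strrev($input); // stand-in for the real work
echo $output_from_worker; // this becomes the return value of file_get_contents() in initiator.php
?>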
It's a shame you're running on Windows, because the SysV extensions on *nix are marvelous!
You can always use files or a database etc for communication.
Thanks for the help, although I ended up doing something a little different from my original question. I was trying to run different cURL requests, and curl_multi_exec() ended up working great for me.
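For anyone finding this later, a minimal curl_multi sketch (the URLs are placeholders):
<?php
$urls = array('http://site.tld/worker.php?input=a', 'http://site.tld/worker.php?input=b');
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}
do {
    curl_multi_exec($mh, $running); // drive all transfers at once
    curl_multi_select($mh);         // wait for activity instead of busy-looping
} while ($running > 0);
$results = array();
foreach ($handles as $ch) {
    $results[] = curl_multi_getcontent($ch); // one response body per request
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>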
I have a JavaScript function which calls a PHP function through AJAX.
The PHP function has set_time_limit(0) for its purposes.
Is there any way to stop that function when I want, for example with an HTML button event?
I want to explain the situation better:
I have a PHP file which uses the stream_copy_to_stream($src, $dest) function to retrieve a stream on my local network. The function has to run for as long as I want: I can stop it at the end of the stream or whenever I want, so I can use one button to start and one to stop. The problem is the new instance created by the AJAX call: I can't act on the instance that is recording, because the stop request is handled by another instance. I tried MireSVK's suggestion but it didn't work!
It depends on the function. If it is a while loop checking for a certain condition on each iteration, then you could add a condition that is modifiable from outside the script (e.g. make it check for a file, and create/delete that file as required), as sketched below.
It looks like a bad idea, however. Why do you want to do it?
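A sketch of the file-flag variant (the flag file name is arbitrary):
<?php
// long-running script: keep working until stop.flag appears
set_time_limit(0);
while (!file_exists(__DIR__ . '/stop.flag')) {
    doSomething();    // one unit of work, e.g. copy the next chunk of the stream
    clearstatcache(); // so file_exists() sees a fresh result on the next pass
}
unlink(__DIR__ . '/stop.flag'); // clean up for the next run
?>
The button's request handler then only needs to call touch('stop.flag'); in the same directory.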
var running = true;

function doSomething(){
    // do something........
}

setInterval(function(){ if (running) { doSomething(); } }, 2000); // this runs doSomething() every 2 seconds

On button click, simply set running = false;
Your code looks like:
set_time_limit(0);
while (true) { // infinite loop
    doSomething(); // your code
}
Let's upgrade it:
set_time_limit(0);
session_start();
$_SESSION['do_a_loop'] = true;
session_write_close(); // release the session lock so stopit.php can write to it

function should_i_stop_loop() {
    session_start(); // reopen the session to pick up changes made by stopit.php
    if ($_SESSION['do_a_loop'] == false) {
        // let's stop the loop
        exit();
    }
    session_write_close();
}

while (true) {
    doSomething();
    should_i_stop_loop(); // your new function
}
Create a new file, stopit.php:
session_start();
$_SESSION['do_a_loop'] = false;
All you have to do now is make a request to the stopit.php file (with AJAX or something).
Edit the code according to your needs; this is just the idea, one of many solutions.
Sorry for my English!
Sadly this isn't possible (sort of).
Each time you make an AJAX call to a PHP script, the script spawns a new instance of itself. Thus anything you send to it will be sent to a new operation, not the operation you previously started.
There are a number of workarounds:
Use readyState 3 in AJAX to create a non-closing connection to the PHP script; however, that isn't supported cross-browser and probably won't work in IE (not sure about IE 10).
Look into socket programming in PHP, which allows you to create a script with one instance that you can connect to multiple times.
Have PHP check a third party, i.e. have one script running in a loop checking a file or a database, then connect to another script to modify that file or database. The original script can then be remotely controlled by what you write to the file/database.
Try another programming language (this is a silly option, but I'm a fan of Node). Node.js does this sort of thing very, very easily.
I have to call a PHP function which takes one second to respond, in a "for" loop:
for ($i = 0; $i < count($wsdlTab); $i++) {
    $serverLoadTab[$i] = $this->getServerLoad($wsdlTab[$i]);
}
My problem is that I would like to call my getServerLoad($wsdlTab[$i]) function simultaneously for each row of my $wsdlTab, so that I don't have to wait one second on each iteration.
That is the reason why I need to call that function in a thread.
I have seen various ways to "emulate" threads, but none that works within my constraints:
I have to get the return value of getServerLoad($wsdlTab[$i]) and put it in an array.
The Apache server is on Windows.
Thanks in advance for your responses.
You should check out Gearman for parallel processing: http://gearman.org/
PHP doesn't really have asynchronous or threading built-in, as you've discovered.
What I might do in a case like this is separate the work I need to execute in parallel into its own small, self-contained PHP file. Then I'd execute that in a separate process, storing the result somewhere I could monitor from the original script. Once all the scripts have returned and filled in the results, or after some given timeout, I would continue with the processing.
So, for instance,
prepareResults(); // something like clearing a db row or zeroing out a file or whatever

for ($i = 0; $i < count($wsdlTab); $i++) {
    exec('./doServerLoad.php ' . $wsdlTab[$i] . ' &'); // launch each check in the background
}

while (!waitingForResults()) { // poll the results table/row/file
}

$serverLoadTab = parseTheResults();
Maybe request the WSDLs in multiple threads/forks using pcntl_fork() or curl_multi, and save them to the local drive. Then just parse them:
$soapClient = new SoapClient('WSDLSTemp/wsdl1.wsdl');
// .. do whatever you want...
I need to load data as an array into memory in PHP. But in PHP, if I write $array = array("1","2"); in test.php, then this $array variable is initialized on every request: if test.php is requested 100 times (by clicking the browser's refresh button 100 times), the $array initialization is executed 100 times.
I need the initialization to run only for the first request; subsequent requests of test.php must not execute it, but only use that memory location. How can I do that in PHP?
In a Java servlet this is easy: you do the one-time work in the init() method of the servlet lifecycle, and subsequent requests don't execute init(), only service(), which can always use that memory.
In short: I want to initialize $array once but use that memory location on all subsequent requests in PHP. Is there any possibility in PHP?
PHP works differently from a Java servlet container. Every new request basically starts an entirely new instance of the PHP interpreter, therefore you don't have a global address space across requests (you do have a session per user, which usually gets persisted to a file, to keep variables across requests for one user).
A thing that might come close to it would be to use memcached with PHP as your "database", but you will have to send a request to the memcached server every time you need your array. That is why I think your array (if it doesn't change) is best kept and initialized in the PHP file.
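If you do want to try memcached, a minimal sketch with the Memcached extension looks like this (the server address and cache key are assumptions):
<?php
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211); // placeholder server
$array = $mc->get('test_array');
if ($array === false) {             // cache miss: first request since the cache was emptied
    $array = array("1", "2");       // build the expensive array once...
    $mc->set('test_array', $array); // ...and keep it for subsequent requests
}
?>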
Use a session: start the session when the user opens test.php and set the array in that session.
<?php
session_start();
if (!isset($_SESSION['user_action'])) {
    $_SESSION['user_action'] = array("1", "2");
}
?>
That code just checks whether the session variable "user_action" is set; if it isn't, it sets it to that array.
You can then change that variable later.
All variables are destroyed at request shutdown; there is no built-in mechanism in PHP to do what you want.
PHP has a different execution model. In general, this is impossible in PHP, and that's OK.
You can try the following:
<?php
/* test.php */
// note: $GLOBALS lives only for the current request, so this only prevents
// re-initialization when the file is included more than once in the same request
if (empty($GLOBALS['array'])) {
    $GLOBALS['array'] = array("1", "2");
}
?>