How are PHP scripts buffered (?) on the server? - php

I have a PHP application that runs for about 2-3 minutes before it returns anything to the browser (some database processing).
I want to know whether I can change its PHP files while the script is running. I assume there is some kind of buffer in Apache/PHP.
I have a situation like this:
// This is index.php
include "DatabaseController.php"; // contains a class; I create an instance at the start
include "ImagesController.php";   // contains a class; I create an instance at the start
include "helpers.php";            // just functions, no classes
$db = new Database();
$img = new Images();
// for loop doing the job here (2-3 minutes)
// end
What will happen when I replace the "DatabaseController.php" file while the script is running?
I tried to test it, and it looks like the "job part" keeps using the old version of DatabaseController after I replace the file.
But... what will happen when I replace the "helpers.php" file? It contains only functions, with no classes that get instantiated at the beginning of the script.
How does this buffering work in general?

It's not really being buffered. You should read up on compilers. In summary, the code you write first has to be compiled before it can be executed. Changes you make to the source after it has been compiled will not take effect until the next request, when it is compiled again.
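A minimal sketch of this behaviour, using a hypothetical helpers.php with a version() function: once the file has been included, its functions are compiled into memory for the rest of the request, so replacing it on disk changes nothing until a later request includes the new version.
<?php
// Create a throwaway helpers.php purely for illustration.
file_put_contents(__DIR__ . '/helpers.php', "<?php function version() { return 'old'; }");

include __DIR__ . '/helpers.php'; // compiled and loaded into this request now

// Replace the file on disk while the "long job" is still running.
file_put_contents(__DIR__ . '/helpers.php', "<?php function version() { return 'new'; }");
sleep(2); // stand-in for the 2-3 minute loop

echo version(); // still prints "old": the running request keeps the already-compiled code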

Related

Trigger function just before exit

I'm using DHTMLX Scheduler on the front end and DHTMLX Connector on the backend as part of my radio automation app. Every time a user edits the calendar, an AJAX call is made to a file that looks like this:
require_once("dhtmlxScheduler_v4/connector/scheduler_connector.php");
require_once('QDRAconf.php');
$res = mysql_connect($QDRAconf['mysqlHost'], $QDRAconf['mysqlUser'], $QDRAconf['mysqlPass']);
mysql_select_db($QDRAconf['mysqlDb']);
// init the schedulerconnector
$conn = new SchedulerConnector($res);
// render the table
$conn->render_table("events","id","start_date,end_date,text");
This file is my "shim" that hooks up the front end to the back end. I want to run another PHP script that writes the changes to my crontab, but it needs to happen after the DHTMLX library has updated the database. Trouble is, the DHTMLX library will automatically exit whenever it thinks it's done: sometimes it might not get past the first require_once('...') line, so I can't just put require_once('cronwriter.php'); at the last line of the script.
My solution to this was to create a class with a destructor that updates the crontab with the latest changes. Since the PHP manual states that destructors will still be run if the exit() or die() function is called, I added a dummy class with a destructor that runs the cronwriter.php script (I added this to the beginning of the file):
class ExitCatcher
{
    function __destruct()
    {
        require_once('cronwriter.php');
    }
}

// init the class
$ExitCatcher = new ExitCatcher;
For some reason, it doesn't work.
register_shutdown_function may offer a quick solution (see the sketch after the list below); but you might save yourself some future trouble by inspecting the cause of that library's sporadic early exits.
A good place to start might be...
your browser's JS console for JS errors
your JS console's network tab for AJAX errors
your server's error logs for PHP errors
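A minimal sketch of the register_shutdown_function approach, assuming cronwriter.php lives next to this shim; the callback runs when the request ends, whether the connector calls exit()/die() or the script finishes normally:
<?php
register_shutdown_function(function () {
    // Runs after the connector has exited or the script has ended normally.
    require_once __DIR__ . '/cronwriter.php';
});

require_once("dhtmlxScheduler_v4/connector/scheduler_connector.php");
require_once('QDRAconf.php');
// ... the rest of the shim stays as it was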

PHP: "run" file/script rather than include?

I have a php file that returns html based on certain parameters, but it also saves this output in a separate directory (basically a custom-made caching process).
Now I want to build a separate php file that automatically updates the cache based on an array of known possible parameters.
So I want to "load" or "run" the file, rather than "include" it, several times with different parameters so that it saves the results in the cache folder.
Is there a php function that will allow me to simply load this other file and perhaps tell me when it is done? If not, do I need to use ajax for something like this, or maybe PHP's cURL library?
At present I am thinking about something along the following lines:
<?php
$parameters = array("option1", "option2", "option3");

// loop through parameters and save to cache folder
foreach ($parameters as $parameter) {
    // get start time to calculate process time
    $time_start = microtime(true);
    sleep(1);

    // I wish there was some function called run or load, similar to jQuery's 'load'
    run("displayindexsearch.php?p=$parameter");

    // return total time that it took to run the script and save to cache
    $time_end = microtime(true);
    $time = $time_end - $time_start;
    echo "Process Time: {$time} seconds";
}
?>
Why don't you include the file, but create functions for the things you want to do inside it? That way, when you want to run it, you simply call the function. This seems to be the correct way to do what you are trying to do, if I understand it correctly.
The best solution I found is to use:
file_get_contents($url)
So where the question looks for a replacement for run, I substitute file_get_contents($url).
Note that I was getting errors when I used a relative path here. I only had success when I used a full URL such as http://localhost/displayindexsearch.php?p=parameter.
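A rough sketch of that substitution in the question's loop (the localhost URL is an assumption; adjust it to wherever displayindexsearch.php is served from):
<?php
$parameters = array("option1", "option2", "option3");

foreach ($parameters as $parameter) {
    $time_start = microtime(true);

    // Requesting the script over HTTP runs it as a normal page request,
    // so its own caching logic writes the output to the cache folder.
    file_get_contents('http://localhost/displayindexsearch.php?p=' . urlencode($parameter));

    $time_end = microtime(true);
    echo "Process Time: " . ($time_end - $time_start) . " seconds\n";
}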
Please take a look at this question.
Launch php file from php as background process without exec()
It might be helpful for you.

How to pass variables between scripts running in parallel

I'm running IIS on a Windows Server with PHP 5.3. I have two scripts; let's call them initiator.php and worker.php. A user calls initiator.php, and in this script a variable is defined; let's call it $input. I would like to take this $input variable and pass it to worker.php like so:
$oShell = new COM('Wscript.Shell');
$oShell->Run("\"C:/Program Files (x86)/PHP/v5.3/php/worker.php -a $input",0,False);
In worker.php I have the following to pick up the $input variable passed from initiator.php.
$aCliOpts = getopt('a:');
$input_from_initiator = $aCliOpts['a'];
This works great. initiator.php's $input variable is successfully passed to worker.php, which picks it up while initiator.php keeps chugging. However, worker.php then takes its own $input_from_initiator variable, runs through some quick code of its own, and creates a third variable called $output_from_worker. It is this variable that I need initiator.php to read a little way into its processing. This is where I'm getting hung up.
I've tried passing the variable back to initiator.php from worker.php the same way the variable was passed in the beginning, and this did not work. I've also tried to use:
header('Location: initiator.php?var=value')
using HTTP GET params, but to no avail.
My last resort is for worker.php to write this variable's value to disk and then have initiator.php read it from disk. I hate to do this because of the extra disk I/O latency. Speed is very important to this script.
Is there a way two PHP processes can pass variables between each other in memory?
Have a look at file_get_contents() (http://php.net/file_get_contents), which you can pass a URL to. So you could use the query string, like:
$var = file_get_contents('http://site.tld/worker.php?input='.$input);
And in worker.php, simply echo your result.
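For instance, worker.php could look roughly like this (the actual processing is a placeholder):
<?php
// worker.php -- sketch only: read the input from the query string,
// do the work, and echo the result back to the caller.
$input_from_initiator = isset($_GET['input']) ? $_GET['input'] : '';

$output_from_worker = strtoupper($input_from_initiator); // placeholder for the real processing

echo $output_from_worker; // this is what file_get_contents() in initiator.php receives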
It's a shame you're running on Windows, because the sys5 extension on *nix is marvelous!
You can always use files or a database etc for communication.
Thanks for the help, although I ended up doing something a little different from my original question. I was trying to run different cURL requests, and curl_multi_exec() ended up working great for me.
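For reference, a bare-bones curl_multi_exec() sketch for running several requests in parallel (the URLs are placeholders):
<?php
$urls = array(
    'http://localhost/worker.php?a=1',
    'http://localhost/worker.php?a=2',
);

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all handles until every transfer has finished.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch), "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);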

How can I run a php script exactly once - No sessions

I have the following question: how can I run a PHP script only once? Before people start replying that this is indeed a similar or duplicate question, please continue reading...
The situation is as follows: I'm currently writing my own MVC framework and I've come up with a module-based system so I can easily add new functionality to the framework. To do so, I created a /ROOT/modules directory in which one can add new modules.
As you can imagine, the script needs to read the directory, read all the PHP files, parse them and then be able to execute the new functionality; however, it has to do this for every browser request. That makes the task about O(nAmountOfRequests * nAmountOfModules), which is rather big on websites with a large number of user requests every second.
Then I figured, what if I introduced a session variable like $_SESSION['modulesLoaded'] and simply checked whether it is set? This would reduce the load to O(nUniqueAmountOfRequests * nAmountOfModules), but this is still a large Big O if the only thing I want to do is read the directory once.
What I have now is the following:
/** Load the modules */
require_once(ROOT . DIRECTORY_SEPARATOR . 'modules' . DIRECTORY_SEPARATOR . 'module_bootloader.php');
Which consists of the following code:
<?php
// TODO: Make sure that the foreach only executes once for all requests instead of on every request.
if (!array_key_exists('modulesLoaded', $_SESSION)) {
    foreach (glob('*.php') as $module) {
        require_once($module);
    }
    $_SESSION['modulesLoaded'] = '1';
}
So now the question: is there a solution, like a superglobal variable, that I can access and that exists for all requests, so that instead of the previous Big Os I get a Big O that consists only of nAmountOfModules? Or is there another way of easily reading the module files only once?
Something like:
if (isFirstRequest) {
    foreach (glob('*.php') as $module) {
        require_once($module);
    }
}
In its most basic form, if you want to run it once, and only once (per installation, not per user), have your intensive script change something in the server state (add a file, change a file, change a record in a database), then check against that every time a request to run it comes in.
If you find a match, it means the script has already run, and you can continue with the process without having to run it again.
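For example, a hedged sketch of the database-flag variant (connection details and table name are made up for illustration):
<?php
// Example only: a one-row flag table marks that the intensive script already ran.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$alreadyRan = (int) $pdo->query(
    "SELECT COUNT(*) FROM one_time_flags WHERE name = 'modules_scanned'"
)->fetchColumn();

if (!$alreadyRan) {
    // ... run the intensive module scan here ...
    $pdo->exec("INSERT INTO one_time_flags (name) VALUES ('modules_scanned')");
}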
When the script is called, create a lock file; at the end of the script, delete it. That way it runs only once, and once it is no longer needed the file is gone.
This naturally works the other way round, too:
<?php
$checkfile = __DIR__ . '/.checkfile';

clearstatcache(false, $checkfile);

if (is_file($checkfile)) {
    return; // the script has already run
}

touch($checkfile);

// run the rest of your script.
Just cache the array to a file and, when you upload new modules, delete the cache file. It will be recreated on the next request and then you're all set again.
// Path to the cache file (example location)
$cache = __DIR__ . '/modules.cache';

// If the $cache file does not exist or unserialize fails, rebuild it and save it
if (!is_file($cache) or (($cached = unserialize(file_get_contents($cache))) === false)) {
    // rebuild your array here into $cached
    $cached = call_user_func(function () {
        // rebuild your array here and return it
    });
    // store the $cached data into the $cache file (serialized, since it is unserialized above)
    file_put_contents($cache, serialize($cached), LOCK_EX);
}
// Now you have $cached file that holds your $cached data
// Keep using the $cached variable now as it should hold your data
This should do it.
PS: I'm currently rewriting my own framework and do the same thing to store such data. You could also use an SQLite DB to store all the data your framework needs, but make sure to test performance and see if it fits your needs. With proper indexes, SQLite is fast.
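A hedged sketch of that SQLite alternative using PDO (file and table names are examples; requires the pdo_sqlite extension):
<?php
$pdo = new PDO('sqlite:' . __DIR__ . '/framework.sqlite');
$pdo->exec('CREATE TABLE IF NOT EXISTS modules (name TEXT PRIMARY KEY)');

// Rebuild the module list only when the table is still empty.
$count = (int) $pdo->query('SELECT COUNT(*) FROM modules')->fetchColumn();
if ($count === 0) {
    $stmt = $pdo->prepare('INSERT OR IGNORE INTO modules (name) VALUES (?)');
    foreach (glob('*.php') as $module) {
        $stmt->execute(array($module));
    }
}

// Later requests can read the cached list instead of scanning the directory.
$modules = $pdo->query('SELECT name FROM modules')->fetchAll(PDO::FETCH_COLUMN);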

Stop PHP with ajax

I have a JavaScript function which calls a PHP function through AJAX.
The PHP function has set_time_limit(0) for its purposes.
Is there any way to stop that function when I want, for example with an HTML button event?
I want to explain the situation better:
I have a php file which uses the stream_copy_to_stream($src, $dest) PHP function to retrieve a stream on my local network. The function has to run until I decide otherwise: I can stop it at the end of the stream or whenever I want, so I can use one button to start and one button to stop. The problem is the new instance created by the AJAX call: I can't work with it, because it is not the instance that is recording but a different one. I tried MireSVK's suggestion but it didn't work!
It depends on the function. If it is a while loop checking for a certain condition on every iteration, then you could add a condition that is modifiable from outside the script (e.g. make it check for a file, and create/delete that file as required).
It looks like a bad idea, however. Why do you want to do it?
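A rough sketch of that idea for the stream-copy case, copying in chunks instead of one blocking stream_copy_to_stream() call (the stream URLs and flag-file name are examples):
<?php
set_time_limit(0);

// Hypothetical streams -- replace with whatever $src/$dest you actually open.
$src  = fopen('http://example.local/stream', 'rb');
$dest = fopen(__DIR__ . '/recording.out', 'wb');

// The "stop" request creates this file; the copy loop polls for it.
$stopFlag = sys_get_temp_dir() . '/stop_stream.flag';

while (!feof($src) && !file_exists($stopFlag)) {
    stream_copy_to_stream($src, $dest, 8192); // copy one chunk instead of blocking forever
    clearstatcache(true, $stopFlag);          // make file_exists() re-check the disk
}

fclose($src);
fclose($dest);
@unlink($stopFlag); // clean up so the next run starts fresh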
var running = true;

function doSomething(){
    //do something........
}

setInterval(function(){ if (running) { doSomething(); } }, 2000); // this runs doSomething every 2 seconds
on button click simply set running = false;
Your code looks like:
set_time_limit(0);

while (true) { // infinite loop
    doSomething(); // your code
}
Let's upgrade it
set_time_limit(0);
session_start();
$_SESSION['do_a_loop'] = true;
session_write_close(); // release the session lock so stopit.php can write to the session

function should_i_stop_loop() {
    // re-open the session to pick up changes made by stopit.php
    session_start();
    if ($_SESSION['do_a_loop'] == false) {
        // let's stop the loop
        exit();
    }
    session_write_close();
}

while (true) {
    doSomething();
    should_i_stop_loop(); // your new function
}
Create a new file, stopit.php:
session_start();
$_SESSION['do_a_loop'] = false;
All you have to do now is make a request to the stopit.php file (with ajax or something).
Edit the code according to your needs; this is just one of many possible solutions.
Sorry for my English
Sadly this isn't possible (sort of).
Each time you make an AJAX call to a PHP script, the script spawns a new instance of itself. Thus anything you send to it will go to a new process, not the operation you had previously started.
There are a number of workarounds.
Use readyState 3 in AJAX to create a non-closing connection to the PHP script; however, that isn't supported cross-browser and probably won't work in IE (not sure about IE 10).
Look into socket programming in PHP, which allows you to create a script with one instance that you can connect to multiple times.
Have PHP check a third party, i.e. have one script running in a loop checking a file or a database, then connect to another script to modify that file or database. The original script can be remotely controlled by what you write to the file/database.
Try another programming language (this is a silly option, but I'm a fan of node). Node.js does this sort of thing very very easily.
