I have a PHP file that returns HTML based on certain parameters, but it also saves this output in a separate directory (basically a custom-made caching process).
Now I want to build a separate PHP file that automatically updates the cache based on an array of known possible parameters.
So I want to "load" or "run" the file, rather than "include" it, several times with the different parameters so that it will save the results in the cache folder.
Is there a PHP function that will allow me to simply load this other file and perhaps tell me when it is done? If not, do I need to use AJAX for something like this, or maybe PHP's cURL library?
At present I was thinking about something along the following lines:
<?php
$parameters = array("option1", "option2", "option3");
//loop through parameters and save to cache folder
foreach ($parameters as $parameter) {
    // get start time to calculate process time
    $time_start = microtime(true);
    sleep(1);
    // I wish there was some function called run or load, similar to jQuery's load()
    run("displayindexsearch.php?p=$parameter");
    // report the total time it took to run the script and save to cache
    $time_end = microtime(true);
    $time = $time_end - $time_start;
    echo "Process Time: {$time} seconds";
}
?>
Why don't you include the file, but create functions for the things you want to do inside it? That way, when you want to run it, you simply call the function. This seems to be the correct way to do what you are trying to do, if I understand it correctly.
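A minimal sketch of that idea, assuming displayindexsearch.php is refactored so its page-building and cache-saving logic lives in a function (build_and_cache() is a hypothetical name):
<?php
require_once 'displayindexsearch.php'; // defines build_and_cache(), included only once

$parameters = array("option1", "option2", "option3");
foreach ($parameters as $parameter) {
    // run the page logic directly; it writes its own output to the cache folder
    build_and_cache($parameter);
}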
The best solution I found is to use:
file_get_contents($url)
So where the question looks for a replacement for run, I substitute file_get_contents($url).
Note that I was getting errors when I used a relative path here. I only had success when I used http://localhost/displayindexsearch.php?p=parameter etc.
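Putting that together, a sketch of the original loop with run() replaced by file_get_contents() (note that allow_url_fopen must be enabled for HTTP URLs):
<?php
$parameters = array("option1", "option2", "option3");
foreach ($parameters as $parameter) {
    $time_start = microtime(true);
    // requesting the page over HTTP makes it save its own output to the cache folder
    file_get_contents("http://localhost/displayindexsearch.php?p=" . urlencode($parameter));
    $time = microtime(true) - $time_start;
    echo "Process Time: {$time} seconds\n";
}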
Please take a look at this question: Launch php file from php as background process without exec(). It might be helpful for you.
Related
So, I'm working on a time-sensitive website in PHP on my CentOS server. I have a random time selected in the future, within 24 hours of the present. At that point, I need a PHP file to execute, a new date to be selected, and the same file to be run again. How can I accomplish this? I looked briefly at cron jobs, but I couldn't find a way to make them run at a specific, random time.
You can use the at command: run your PHP file, and at the end, register another call to at for the next time. Something like this:
<?php
// your PHP code in here; then find out when the next call time is
$next = time() + mt_rand(60, 86400); // e.g. a random time within the next 24 hours
$time = date('H:i', $next); // date() output here is safe to use as a shell argument; otherwise run it through escapeshellarg()
$run_me = "/usr/bin/env php " . __FILE__;
exec("echo '$run_me' | at '$time'");
One possible workaround is to run a script from a cron job, say, every 10 minutes. At the top of the script, check a specific file which is supposed to contain a timestamp. If the current time is greater than the value from the file, do the job, and write the new timestamp value into the file.
$time_to_run = intval(file_get_contents('my.timestamp'));
if (time() >= $time_to_run) {
    // do stuff
    // note: the filename is the first argument of file_put_contents()
    file_put_contents('my.timestamp', time() + mt_rand(0, 86400)); // time() plus a random value
}
If you need more granularity, a better option would be to run it as a daemon (see advice here) and just loop forever (probably with some sleep() inside) until the time comes, as in the sketch below.
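A minimal sketch of that daemon-style loop, reusing the my.timestamp file from above:
<?php
// check the timestamp file continuously instead of every 10 minutes
while (true) {
    $time_to_run = intval(file_get_contents('my.timestamp'));
    if (time() >= $time_to_run) {
        // do stuff
        file_put_contents('my.timestamp', time() + mt_rand(0, 86400));
    }
    sleep(10); // ten-second granularity instead of cron's minutes
}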
I have the following question: how can I run a PHP script only once? Before people start to reply that this is indeed a similar or duplicate question, please continue reading...
The situation is as follows: I'm currently writing my own MVC framework and I've come up with a module-based system so I can easily add new functionality to the framework. In order to do so, I created a /ROOT/modules directory in which one can add new modules.
So as you can imagine, the script needs to read the directory, read all the PHP files, parse them, and then be able to execute the new functionality; however, it has to do this for every browser request. This makes the task about O(nAmountOfRequests * nAmountOfModules), which is rather big on websites with a large number of user requests every second.
Then I figured, what if I introduced a session variable like $_SESSION['modulesLoaded'] and then simply checked whether it's set or not? This would reduce the load to O(nUniqueAmountOfRequests * nAmountOfModules), but this is still a large Big O if the only thing I want to do is read the directory once.
What I have now is the following:
/** Load the modules */
require_once(ROOT . DIRECTORY_SEPARATOR . 'modules' . DIRECTORY_SEPARATOR . 'module_bootloader.php');
Which consists of the following code:
<?php
//TODO: Make sure that the foreach only executes once for all the requests instead of every request.
if (!array_key_exists('modulesLoaded', $_SESSION)) {
    foreach (glob('*.php') as $module) {
        require_once($module);
    }
    $_SESSION['modulesLoaded'] = '1';
}
So now the question: is there a solution, like a superglobal variable, that I can access and that exists across all requests, so that instead of the previous Big Os I end up with just O(nAmountOfModules)? Or is there another way of easily reading the module files only once?
Something like:
if (isFirstRequest) {
    foreach (glob('*.php') as $module) {
        require_once($module);
    }
}
In its most basic form, if you want to run it once, and only once (per installation, not per user), have your intensive script change something in the server state (add a file, change a file, change a record in a database), then check against that every time a request to run it is issued.
If you find a match, it would mean the script was already run, and you can continue with the process without having to run it again.
When called, create a lock file; at the end of the script, delete it. That way the script runs only once, and since the file is no longer needed afterwards, it vanishes into nirvana.
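A minimal sketch of that lock-file idea (the filename is just an example):
<?php
$lockfile = __DIR__ . '/.lockfile';
if (is_file($lockfile)) {
    return; // a run is already in progress
}
touch($lockfile);
// run the rest of your script...
unlink($lockfile); // done: the lock vanishes again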
This naturally works the other way round, too:
<?php
$checkfile = __DIR__ . '/.checkfile';
clearstatcache(false, $checkfile);
if (is_file($checkfile)) {
    return; // script did run already
}
touch($checkfile);
// run the rest of your script.
Just cache the array to a file and, when you upload new modules, delete the file. It will be recreated and then you're all set again.
// If the $cache file does not exist or unserialize fails, rebuild it and save it
if (!is_file($cache) or (($cached = unserialize(file_get_contents($cache))) === false)) {
    // rebuild your array here into $cached
    $cached = call_user_func(function () {
        // rebuild your array here and return it
    });
    // store the $cached data into the $cache file (serialized, since it is an array)
    file_put_contents($cache, serialize($cached), LOCK_EX);
}
// Now you have a $cache file that holds your cached data
// Keep using the $cached variable, as it should hold your data
This should do it.
PS: I'm currently rewriting my own framework and do the same thing to store such data. You could also use a SQLite DB to store all such data your framework needs but make sure to test performance and see if it fits your needs. With proper indexes, SQLite is fast.
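For the SQLite route, a minimal key-value sketch using PDO (the database file, table, and key names are just examples):
<?php
$db = new PDO('sqlite:' . __DIR__ . '/framework.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS kv (k TEXT PRIMARY KEY, v TEXT)');

// store the serialized data under a key
$stmt = $db->prepare('REPLACE INTO kv (k, v) VALUES (?, ?)');
$stmt->execute(array('modules', serialize($cached)));

// fetch it back later
$stmt = $db->prepare('SELECT v FROM kv WHERE k = ?');
$stmt->execute(array('modules'));
$cached = unserialize($stmt->fetchColumn());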
I want to set up a simple cache feature with PHP. I want the script to get data from somewhere, but not to do it on every page view, only every hour.
I know I can have a cron job that runs a PHP script every hour.
But I was wondering if this can be achieved without cron, just inside the PHP script that creates the page based on the fetched (or cached) data. I'm really looking for the simplest solution possible. It doesn't have to be accurate.
I would use APC as well, but in either case you still need some logic. Basic file cache in PHP:
if (file_exists($cache_file) and time() - filemtime($cache_file) < 3600)
{
    $content = unserialize(file_get_contents($cache_file));
}
else
{
    $content = your_get_content_function_here();
    file_put_contents($cache_file, serialize($content));
}
You only need to serialize/unserialize if $content is not a string (e.g. an array or object).
Why not just use APC?
You can do:
apc_store('yourkey','yourvalue',3600);
And then you can retrieve the content with:
apc_fetch('yourkey');
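Putting the two together, the usual fetch-or-rebuild pattern would look something like this (the key name and your_get_content_function_here() are placeholders):
<?php
$content = apc_fetch('yourkey');
if ($content === false) {
    // cache miss: rebuild the data and keep it for an hour
    $content = your_get_content_function_here();
    apc_store('yourkey', $content, 3600);
}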
I have PHP scripts that call Perl scripts to do various things, and sometimes a call just goes on and on without a response coming back. This depends on the variable being passed to the Perl script, and I am doing a lot of different calls in succession, so I can't really debug it directly since I don't get a response from Perl.
I would really like to be able to set a PHP function or block of code to time out after a certain number of seconds. I have been searching for this but haven't found anything yet on how to do it.
I was thinking something like this could work, but I don't think it would dynamically update the $time variable. Maybe there is a way to get this to work? Any advice is appreciated.
$time = time();
$timeout = $time + 5; //just as an example
do {
    // do stuff
} while ($time < $timeout);
Your best bet would be to use proc_open, sleep for your timeout amount and then call proc_terminate if the process still hasn't completed.
See http://us3.php.net/manual/en/book.exec.php for details on the proc_* family.
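A rough sketch of that idea (the Perl command and the 5-second timeout are just examples; for chatty scripts you would also need to drain the output pipes):
<?php
$descriptors = array(1 => array('pipe', 'w'), 2 => array('pipe', 'w'));
$proc = proc_open('perl myscript.pl', $descriptors, $pipes);

$deadline = time() + 5;
$status = proc_get_status($proc);
while ($status['running'] && time() < $deadline) {
    usleep(100000); // poll every 0.1 s instead of blocking
    $status = proc_get_status($proc);
}

if ($status['running']) {
    proc_terminate($proc); // still running past the deadline: kill it
}
proc_close($proc);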
Well, I'm not so sure this question has an answer the way I asked it, so what I am going to do is make the Perl call without PHP waiting for a response, have Perl write the output to a text file, and then have PHP read that file after a specified number of seconds. I think this is the simplest way to do this; it's just for a small app I am running on a local server.
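A sketch of that fire-and-forget workaround (the script name and output path are just examples):
<?php
// the redirection plus trailing & makes exec() return immediately
exec('perl myscript.pl > /tmp/perl_output.txt 2>&1 &');

sleep(5); // wait the specified number of seconds

// read whatever Perl managed to write
$output = file_get_contents('/tmp/perl_output.txt');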
I have a PHP script that accepts a POST request as a listener for a web service, then processes all the data into two final arrays.
I'm looking for a way to initiate a second script that GETs those serialized arrays and does some more processing.
include() will not be good for me, since I actually want to "free" or "end" the first script after passing the data.
Your help is much appreciated, as always :)
EDIT: OK, so it looks like a queue might be the solution! I never did anything like this before; any examples or references?
Does it need to happen immediately? Otherwise you could set up a cron job that runs every X minutes. You'll have to make some kind of queue in which your first script sticks "requests" to the second script. The cron job then processes the requests in the queue; a sketch follows.
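A very small file-based queue sketch, since the edit asked for an example (the queue directory is just an example path and must exist and be writable):
<?php
// listener script: drop the job into the queue directory and finish immediately
file_put_contents('/tmp/queue/' . uniqid('job_', true), serialize(array($array_1, $array_2)));

// worker script, run from cron every X minutes: process and remove queued jobs
foreach (glob('/tmp/queue/job_*') as $file) {
    list($array_1, $array_2) = unserialize(file_get_contents($file));
    // ... do the extra processing here ...
    unlink($file);
}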
You should get into the habit of writing PHP scripts that are just a collection of functions (no auto-run code, per se). This way you can include a script file at the top of the script you're talking about and then call the function that does what you want.
For instance:
<?php
include('common_functions.php');
$array_1 = whatever_you_do_with_post_values();
$array_2 = other_thing_you_do_with_post_values();
// this function is located in 'common_functions.php'
do_stuff_with_arrays($array_1,$array_2);
?>
In fact:
Just to be consistent with what I'm saying:
<?php
include('common_functions.php');
do_your_stuff();
function do_your_stuff() {
    $array_1 = whatever_you_do_with_post_values();
    $array_2 = other_thing_you_do_with_post_values();
    // this function is located in 'common_functions.php'
    do_stuff_with_arrays($array_1, $array_2);
}
?>
Obviously you should use better function & variable names, haha.
I'd do it all in one request. It cuts down on latency and makes the whole operation more efficient.
Remember you can have a long-running request but still service other requests. Apache will just spawn another PHP process to handle the other request from the web service even though the first has not completed. As long as the script doesn't lock a shared resource (database file, etc.), this will work just fine.
That said, you should use cURL to call the second script, then POST the unserialized array; cURL will handle the rest, as in the sketch below.
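A minimal sketch of that cURL call from the first script (the URL is a placeholder; the arrays are serialized here because CURLOPT_POSTFIELDS only accepts scalar values inside an array):
<?php
$ch = curl_init('http://localhost/second_script.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'array_1' => serialize($array_1),
    'array_2' => serialize($array_2),
));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response
curl_exec($ch);
curl_close($ch); // the first script is now free to end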