Single PHP script with multiple CRON jobs > optimize into > Single CRON?

I have a script that I run for multiple clients.
Same script, I'm just using a different GET variable to load the client credentials.
e.g.
example.com/script.php?client=lego
example.com/script.php?client=nike
example.com/script.php?client=stackoverflow
I've set up multiple cron jobs to hit the script at midnight, each cron with a different client GET variable.
What would be the best way to run a single cron job but process all clients, so I don't need to set up a new cron job for each client?

There can be various solutions, but without seeing the code, here is what comes to mind.
Delete all the cron jobs and set up just one:
example.com/script.php
Inside script.php, wrap whatever you had before in a function, create an array of clients, and call that function for every client, passing the client name. For example:
<?php
// If you have lots of clients, the script can exhaust the time limit
ini_set('max_execution_time', 0);

$clients = ['lego', 'nike', 'stackoverflow'];

foreach ($clients as $client) {
    myScript($client);
}

function myScript($client)
{
    // Whatever you had in script.php earlier, replacing $_GET['client'] with $client.
}
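If the client list changes often, you could load it from wherever the credentials already live instead of hard-coding the array. A minimal sketch, assuming a hypothetical clients table with a name column and a PDO connection:
<?php
// Load client names from a database instead of a hard-coded array.
// The table, column, and credentials here are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$clients = $pdo->query('SELECT name FROM clients')->fetchAll(PDO::FETCH_COLUMN);

foreach ($clients as $client) {
    myScript($client); // same function as above
}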
Hope it answers your question.

Related

Laravel: Return a value from App::call()

I'm going out on a limb here; I'm trying to direct a long-running script to Artisan. Is it possible for App::call() to return a string value, or maybe even to send an email once the long-running script finishes?
I'm trying to find more info on this, but is it right to assume that, while Artisan is running, I can redirect the user to something like a waiting page, maybe with a looping GIF?
Use Queue::push() with an appropriate driver (database, perhaps) to push the long-running job to a queue.
The last thing the long-running job should do is send some indication that it's finished.
Here's some sample code:
Queue::push(function($job) use ($id)
{
    Artisan::call('my-command', ['arg1', 'arg2']);
    $job->delete();
});

// Then at the end of your my-command script:
$jobModel = LongRunningJob::find($id);
$jobModel->finishedDate = Carbon::now();
$jobModel->save();
Of course you can then poll the database to determine whether the long-running command has finished.
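For the waiting page, a minimal polling endpoint might look like this (a sketch, assuming the LongRunningJob model from the snippet above and Laravel 4-style routing; the route itself is made up):
// Status endpoint the waiting page can poll via AJAX.
Route::get('job/{id}/status', function ($id) {
    $job = LongRunningJob::find($id);
    return Response::json([
        'finished' => !is_null($job->finishedDate),
    ]);
});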

Progress Bar when running function inside foreach loop

I have a foreach loop that calls a function to set values in an array. Sometimes it takes hours to complete, depending on how many times it has to run through the function.
What I would like is a progress bar, or at least a "1/1000 completed" type of progress indicator.
Is this possible? If so, how could I implement it in my code? Would it go in the function or in the foreach loop? I've been researching and found some examples using for and $i++, but I'm not sure how to apply that since I'm already using a foreach loop.
Thanks much.
function scrape_amazon($links) {
    // my code runs here to set all values in $ret array.
}

foreach ($links as $link) {
    $ret = scrape_amazon($link);
}
PHP probably isn't really the right tool for this task; however, what you could do is:
Launch the slow code as a background process, and output progress to a file.
Have a PHP script that polls that file for progress information (either by page refresh or AJAX)
Launching the background process can be done in several ways, including:
Launch via cron every 60 seconds, and poll for new jobs spooled in some readable area
Launch via a fork/exec mechanism from a web page
Launch as a daemon at system startup
It will take some effort to avoid problems with multiple executions and/or overlap.
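A minimal sketch of the write-progress-to-a-file idea, reusing the question's loop; the file path and JSON shape are made up for illustration:
worker.php (the slow background code writes progress as it goes)
<?php
// File path and format are assumptions for this sketch.
$total = count($links);
foreach ($links as $i => $link) {
    $ret = scrape_amazon($link);
    file_put_contents('/tmp/scrape_progress.json', json_encode(array(
        'done'  => $i + 1,
        'total' => $total,
    )));
}
progress.php (polled from the browser by AJAX or page refresh)
<?php
header('Content-Type: application/json');
$json = @file_get_contents('/tmp/scrape_progress.json');
echo $json !== false ? $json : '{"done":0,"total":0}';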
I use this; it's not AJAX, it only flushes output, but it's not so ugly.
I place an image:
<img src='progress.gif' height=18 width=0 name=probar>
Then, for every unit of work done on the server, I echo a script line and flush:
echo "<script language='JavaScript'>\ndocument.probar.width=".(($sys["probar_width"]/$task_all)*$task_i).";\n</script>\n";
flush();
If your server (e.g. Apache) uses output compression (e.g. gzip is enabled), it won't work well.

Can a PHP script be scheduled to run at a specific time or after a specific amount of time has expired?

I am writing a social cloud game for Android and using PHP on the server. Almost all aspects of the game will be user or user-device driven, so most of the time the device will send a request to the server and the server will, in turn, send a response to the device. Sometimes the server will also send out push messages to the devices, but generally in response to one user's device contacting the server.
There is one special case, however, where a user can set a "timer" and, after the given time has elapsed, the server needs to send the push messages to all of the devices. One way to do this would be to keep the timer local to the user's device and, once it goes off, send the signal to the server to send the push messages. However, there were several reasons why I did not want to do it this way. For instance, if the user decides not to play anymore or loses the game, the timer should technically remain in play.
I looked around for a method in PHP that would let me do something like this, but all I came up with were alarms, which are not what I need. I also thought of cron jobs and, indeed, they have been recommended for similar situations on this and other forums, but since this is not a recurring event but rather a one-time event at an arbitrary point in time, I was not sure a cron job is what I want either.
My current best solution involves a cron job that runs once a second and checks to see if one of these events is to occur in the next second and, if so, sends out the push messages. Is this the proper way to handle this situation, or is there a better tool out there that I just haven't found yet?
cron is great for scripts run on a regular basis, but if you want a one-off (or two-off) script to run at a particular time you would use the Unix at command, and you can do it directly from PHP with code like this:
/****
 * Schedule a command using the AT command
 *
 * To do this you need to ensure that the www-data user is allowed to
 * use the 'at' command - check this in /etc/at.deny
 *
 * EXAMPLE USAGE ::
 *
 *   scriptat( '/usr/bin/command-to-execute', 'time-to-run');
 *   The time-to-run should be in this format: strftime("%Y%m%d%H%M", $unixtime)
 *
 **/
function scriptat($cmd = null, $time = null) {
    // Both parameters are required
    if (!$cmd) {
        error_log("******* ScriptAt: cmd not specified");
        return false;
    }
    if (!$time) {
        error_log("******* ScriptAt: time not specified");
        return false;
    }
    // We need to locate php (executable)
    if (!file_exists("/usr/bin/php")) {
        error_log("~ ScriptAt: Could not locate /usr/bin/php");
        return false;
    }
    $fullcmd = "/usr/bin/php -f $cmd";
    $r = popen("/usr/bin/at $time", "w");
    if (!$r) {
        error_log("~ ScriptAt: unable to open pipe for AT command");
        return false;
    }
    fwrite($r, $fullcmd);
    pclose($r);
    error_log("~ ScriptAt: cmd={$cmd} time={$time}");
    return true;
}
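A usage sketch based on the docblock above, scheduling a (made-up) script to run ten minutes from now:
// Schedule /var/www/jobs/push.php to run ten minutes from now.
// The path is hypothetical; the time format follows the docblock above.
$when = strftime("%Y%m%d%H%M", time() + 600);
scriptat('/var/www/jobs/push.php', $when);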
Solution 1:
Your PHP file can contain an endless loop:
$con = true;
while ($con)
{
    // do the work
    if ($end)          // $end is whatever condition tells you to stop
        $con = false;
    else
        sleep(5);      // 5 seconds, for example
}
Solution 2:
Use cron jobs. Depending on your control panel, you can follow its instructions and call your PHP program at the specific times.
Limitation: with cron, the minimum interval between two runs is 1 minute.
Solution 3:
Use a shell script and call your PHP program whenever you want.
You can make PHP sleep for a certain amount of time and it will resume the code afterwards, but this is seriously not recommended: a sleeping script still ties up a web server process and its memory, and if you had multiple scripts sleeping for long periods of time it would put an impossible load on your server.
The only other option that I know of is cron. As @Pete says, you can manage cron jobs from within PHP, e.g.:
http://net.tutsplus.com/tutorials/php/managing-cron-jobs-with-php-2/
This is going to involve a fair bit of coding, but I think it is your best option.
Another option is to have your user's browser call a PHP function using an Ajax request and JavaScript's setTimeout as suggested by the accepted answer in this question:
how to call a function in PHP after 10 seconds of the page load (Not using HTML)

Controlling app flow between different scripts in PHP

I have a PHP script that accepts a POST request as a listener for a web service, then processes all the data into two final arrays.
I'm looking for a way to initiate a second script that GETs those serialized arrays and does some more processing.
include() will not be good for me, since I actually want to "free" or "end" the first script after passing the data.
Your help is much appreciated, as always :)
EDIT: OK, so it looks like a queue might be the solution! I've never done anything like this before. Any examples or references?
Does it need to happen immediately? Otherwise you could set up a cronjob that does that every X minutes. You'll have to make some kind of queue in which your first script sticks "requests" to the second script. The cronjob then processes the requests in the queue.
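A minimal sketch of such a queue, using a database table; the table and column names are made up:
listener.php
<?php
// Store the work instead of processing it inline, then exit immediately.
// Assumes a hypothetical `jobs` table with payload and status columns.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$payload = json_encode(array($array_1, $array_2)); // the two final arrays
$pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")
    ->execute(array($payload));
worker.php (run from cron every X minutes)
<?php
// Drain pending jobs and mark each one done.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows = $pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending'");
foreach ($rows as $row) {
    list($array_1, $array_2) = json_decode($row['payload'], true);
    // ... do the extra processing here ...
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
        ->execute(array($row['id']));
}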
You should get into the habit of writing PHP scripts that are just a collection of functions (no auto-run code, per se). This way you can include a script file at the top of the script you're talking about and then call the function that does what you want.
For instance:
<?php
include('common_functions.php');

$array_1 = whatever_you_do_with_post_values();
$array_2 = other_thing_you_do_with_post_values();

// this function is located in 'common_functions.php'
do_stuff_with_arrays($array_1, $array_2);
?>
In Fact:
Just to be consistent with what I'm saying:
<?php
include('common_functions.php');

do_your_stuff();

function do_your_stuff() {
    $array_1 = whatever_you_do_with_post_values();
    $array_2 = other_thing_you_do_with_post_values();
    // this function is located in 'common_functions.php'
    do_stuff_with_arrays($array_1, $array_2);
}
?>
Obviously you should use better function & variable names, haha.
I'd do it all in one request. It cuts down on latency and makes the whole operation more efficient.
Remember you can have a long-running request and still service other requests. Apache will just spawn another PHP process to handle the other request from the web service even though the first has not completed. As long as the script doesn't lock a shared resource (database, file, etc.) this will work just fine.
That said, if you do split it in two, you should use cURL to call the second script and POST the serialized arrays; cURL will handle the rest.
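A sketch of the cURL hand-off; the URL and field name are assumptions:
<?php
// Send the two arrays to the second script over HTTP and end this one.
$ch = curl_init('http://example.com/second_script.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('payload' => serialize(array($array_1, $array_2))));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);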

Speeding up a PHP App

I have a list of data that needs to be processed. The way it works right now is this:
A user clicks a process button.
The PHP code takes the first item that needs to be processed, takes 15-25 secs to process it, moves on to the next item, and so on.
This takes way too long. What I'd like instead is that:
The user clicks the process button.
A PHP script takes the first item and starts to process it.
Simultaneously, another instance of the script takes the next item and processes it.
And so on, so around 5-6 of the items are being processed simultaneously, and we get 6 items processed in 15-25 secs instead of just one.
Is something like this possible?
I was thinking that I use CRON to launch an instance of the script every second. All items that need to be processed will be flagged as such in the MySQL database, so whenever an instance is launched through CRON, it will simply take the next item flagged to be processed and remove the flag.
Thoughts?
Edit: To clarify something, each 'item' is stored in a MySQL database table as a separate row. Whenever processing starts on an item, it is flagged as being processed in the DB, so each new instance will simply grab the next row that is not being processed and process it. Hence I don't have to supply the items as command-line arguments.
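One way to make that flag-and-grab step safe when several instances run at once is to claim a row atomically before reading it back; a sketch, with a hypothetical items table (the claimed_by column is made up):
<?php
// Atomically claim the next unprocessed row so two concurrent
// instances never grab the same item. Names are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$token = uniqid('worker', true);

// Stamp one pending row with our token...
$pdo->prepare("UPDATE items SET status = 'processing', claimed_by = ?
               WHERE status = 'pending' LIMIT 1")
    ->execute(array($token));

// ...then read back the row we claimed (if any).
$stmt = $pdo->prepare("SELECT * FROM items WHERE claimed_by = ?");
$stmt->execute(array($token));
$item = $stmt->fetch(PDO::FETCH_ASSOC);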
Here's one solution; not the greatest, but it will work fine on Linux:
Split the processing PHP into a separate CLI script in which:
The command line inputs include `$id` and `$item`
The script writes its PID to a file in `/tmp/$id.$item.pid`
The script echoes results as XML (or something else that can be read into PHP) to stdout
When finished, the script deletes the `/tmp/$id.$item.pid` file
Your master script (presumably on your webserver) would:
`exec("nohup php myprocessing.php $id $item > /tmp/$id.$item.xml &");` for each item
Poll the `/tmp/$id.$item.pid` files until all are deleted (a sleep/check poll is enough)
If they are never deleted, kill all the processing scripts and report failure
If successful, read from `/tmp/$id.$item.xml` to format/output to the user
Delete the XML files if you don't want to cache them for later use
A backgrounded, nohup-started application runs independently of the script that started it.
This interested me sufficiently that I decided to write a POC.
test.php
<?php
$dir = realpath(dirname(__FILE__));
$start = time();

// Time in seconds after which we give up and kill everything
$timeout = 25;

// The unique identifier for the request
$id = uniqid();

// Our "items" which would be supplied by the user
$items = array("foo", "bar", "0xdeadbeef");

// We exec a nohup command that is backgrounded and returns immediately
foreach ($items as $item) {
    exec("nohup php proc.php $id $item > $dir/proc.$id.$item.out &");
}

echo "<pre>";

// Run until timeout or all processing has finished
$running = $items;
while (time() - $start < $timeout)
{
    echo (time() - $start), " seconds\n";
    clearstatcache(); // Required since PHP caches file_exists() results

    $running = array();
    foreach ($items as $item)
    {
        // If the pid file still exists the process is still running
        if (file_exists("$dir/proc.$id.$item.pid")) {
            $running[] = $item;
        }
    }
    if (empty($running)) break;

    echo implode(',', $running), " running\n";
    flush();
    sleep(1);
}

// Clean up if we timed out
if (!empty($running)) {
    clearstatcache();
    foreach ($items as $item) {
        // Kill anything still running (i.e. that still has a pid file)
        if (file_exists("$dir/proc.$id.$item.pid")
            && $pid = file_get_contents("$dir/proc.$id.$item.pid")) {
            posix_kill((int)$pid, 9);
            unlink("$dir/proc.$id.$item.pid");
            // Would want to log this in the real world
            echo "Failed to process: ", $item, " pid ", $pid, "\n";
        }
        // delete the useless data
        unlink("$dir/proc.$id.$item.out");
    }
} else {
    echo "Successfully processed all items in ", time() - $start, " seconds.\n";
    foreach ($items as $item) {
        // Grab the processed data and delete the file
        echo file_get_contents("$dir/proc.$id.$item.out");
        unlink("$dir/proc.$id.$item.out");
    }
}
echo "</pre>";
?>
proc.php
<?php
$dir = realpath(dirname(__FILE__));
$id = $argv[1];
$item = $argv[2];

// Write out our pid file
file_put_contents("$dir/proc.$id.$item.pid", posix_getpid());

for ($i = 0; $i < 80; ++$i)
{
    echo $item, ':', $i, "\n";
    usleep(250000);
}

// Remove our pid file (note the full path) to say we're done processing
unlink("$dir/proc.$id.$item.pid");
?>
Put test.php and proc.php in the same folder on your server, load test.php, and enjoy.
You will of course need nohup (Unix) and the PHP CLI to get this to work.
Lots of fun; I may find a use for it later.
Use an external work queue like Beanstalkd, which your PHP script writes a bunch of jobs to. You then have as many worker processes as you like pulling jobs from beanstalkd and processing them as fast as possible; you can spin up as many workers as you have memory/CPU for. Your job body should contain as little information as possible, maybe just some IDs that you hit the DB with. Beanstalkd has a slew of client APIs, and itself has a very basic protocol; think memcached.
We use beanstalkd to process all of our background jobs, and I love it. Easy to use, and it's very fast.
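A minimal sketch with the Pheanstalk client (one common PHP client for beanstalkd; a v4-style API is assumed, and the tube name and payload are made up):
producer.php
<?php
// Push a tiny job body (just an ID) onto a tube.
require 'vendor/autoload.php';
use Pheanstalk\Pheanstalk;

$pheanstalk = Pheanstalk::create('127.0.0.1');
$pheanstalk->useTube('process-items');
$pheanstalk->put(json_encode(array('item_id' => 42)));
worker.php (run as many of these as memory/CPU allows)
<?php
require 'vendor/autoload.php';
use Pheanstalk\Pheanstalk;

$pheanstalk = Pheanstalk::create('127.0.0.1');
$pheanstalk->watch('process-items');

while (true) {
    $job = $pheanstalk->reserve();       // blocks until a job arrives
    $data = json_decode($job->getData(), true);
    // ... fetch row $data['item_id'] from the DB and process it ...
    $pheanstalk->delete($job);           // done; remove it from the queue
}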
There is no multithreading in PHP; however, you can use fork:
php.net: pcntl-fork
Or you could execute a system() command and start another process which is multithreaded.
Can you implement threading in JavaScript on the client side? It seems to me I've seen a JavaScript library (from Google, perhaps) that implements it. Google it and I'm sure you'll find something. I've never done it, but I know it's possible. Anyway, your client-side JavaScript could activate (via AJAX) a PHP script once for each item, in separate threads. That might be easier than trying to do it all on the server side.
-don
If you are running a high-traffic PHP server you are INSANE if you do not use the Alternative PHP Cache: http://php.net/manual/en/book.apc.php. You do not have to make code modifications to run APC.
Another useful technique that can work along with APC is the Smarty template system, which allows you to cache output so that pages do not have to be rebuilt.
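A sketch of Smarty's output caching (Smarty 3-style API; the template name and data loader are hypothetical):
<?php
// Only rebuild the page when the cached copy has expired.
require 'vendor/smarty/smarty/libs/Smarty.class.php';

$smarty = new Smarty();
$smarty->setCaching(Smarty::CACHING_LIFETIME_CURRENT);
$smarty->setCacheLifetime(300); // seconds

if (!$smarty->isCached('page.tpl')) {
    // Expensive work happens only on a cache miss.
    $smarty->assign('rows', load_rows()); // load_rows() is made up
}
$smarty->display('page.tpl');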
To solve this problem, I've used two different products: Gearman and RabbitMQ.
The benefit of putting your jobs into some sort of queuing software like Gearman or Rabbit is that when you have multiple machines, they can all participate in processing items off the queue(s).
Gearman is easier to set up, so I'd suggest poking around with it a bit first. If you find you need something more heavy-duty in terms of queue robustness, look into RabbitMQ.
http://www.danga.com/gearman/
http://pear.php.net/package/Net_Gearman (PEAR library)
You can use pcntl_fork() and family to fork a process; however, you may need something like IPC to communicate back to the parent process that the child process (the one you forked) is finished.
You could have them write to shared memory, for example via memcache or a DB.
You could also have the child processes write the completed data to files that the parent process keeps checking: as each child process completes, its file is created/written/updated, and the parent process can grab the files one at a time and throw them back to the callee/client.
The parent's job is to control the queue, to make sure the same data isn't processed twice, and to sanity-check the children (better to kill that runaway process and start over... etc.).
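A minimal sketch of the fork-and-collect-via-files pattern described above (the paths and the stand-in work are made up):
<?php
// Fork one child per item; each child writes its result to a file
// that the parent collects after reaping the children.
$items = array('foo', 'bar', 'baz');
$pids = array();

foreach ($items as $item) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        // Child: do the work, write the result, exit.
        $result = strtoupper($item); // stand-in for real processing
        file_put_contents("/tmp/result.$item", $result);
        exit(0);
    }
    $pids[] = $pid; // Parent: remember the child PID
}

// Parent: wait for the children, then collect results one at a time.
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
foreach ($items as $item) {
    echo file_get_contents("/tmp/result.$item"), "\n";
    unlink("/tmp/result.$item");
}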
Something else to keep in mind: on Windows platforms you are going to be severely limited; I don't even think you have access to the pcntl_* functions unless you compiled PHP with support for them.
Also, can you cache the data once it's been processed, or is it unique data every time? That would surely speed things up..?
