Let's imagine my situation (it's fake, of course)...
I have a website that has 1000 active users. I need to do some work that will be slow (database clean-up, backups, etc.). For example, it may take 5 minutes.
I want to accomplish this: when a user opens my website and some condition is met (for example, he is the first visitor in a week), he somehow sends a signal to the server that the slow process must run now, but he doesn't wait for its execution (those several minutes). He just sees a notification that the server is making some updates or whatever. Any other user who opens my website during that time also sees that notification.
When the process finishes, the website returns to its normal behavior. And it all happens automatically.
Is this possible without cron?
Check out PHP's execution operators. They may be what you are looking for.
Something like this
// Backticks execute the command via the shell, like shell_exec().
$execute = `/path/to/php/script.php`;
// If you need the output:
echo $execute;
Once this script runs, set a flag (it could be in the database, or a simple write to a file, or something similar). When the script ends, delete this flag.
In all your other pages, check this flag. If it is ON, display the message.
On your page you could have a database value, e.g. script_running; check it, and if the value = 0, send the command; if = 1, do nothing. Then make sure you set the value to 1 at the beginning and 0 at the end of your command.
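A minimal sketch of that flag check, assuming a mysqli connection in $con and a one-row flags table (both hypothetical names); the UPDATE is atomic, so only the first request that sees 0 gets to start the job:

// Atomically claim the flag; affected rows is 0 if another run already holds it.
mysqli_query($con, "UPDATE flags SET value = 1 WHERE name = 'script_running' AND value = 0");
if (mysqli_affected_rows($con) === 1) {
    // We claimed the flag: launch the slow job in the background.
    // The job itself must reset the flag to 0 when it finishes.
    exec('/path/to/php/script.php > /dev/null 2>&1 &');
} else {
    // Someone else is already running it: just show the notification.
    echo "The server is making some updates; please check back in a few minutes.";
}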
You can use cURL for this task. Save the time the script last ran in a database or a text file, and run the script once a week (of course, update the date before executing the script):
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/service_script.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 1); // 1-second timeout: don't wait for the script to finish
$result_curl = curl_exec($ch);
curl_close($ch);
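A sketch of the once-a-week gate described above, using a plain text file (lastrun.txt, a hypothetical name) to store the last run time:

$lastRun = file_exists('lastrun.txt') ? (int)file_get_contents('lastrun.txt') : 0;

if (time() - $lastRun > 7 * 24 * 3600) {
    // Update the date BEFORE executing, so concurrent visitors don't re-trigger it.
    file_put_contents('lastrun.txt', time());

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, "http://www.example.com/service_script.php");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 1); // fire and forget
    curl_exec($ch);
    curl_close($ch);
}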
Related
I have a site that sends URLs via email to my client. Once they receive them, they click the link, it loads in a browser, and it runs some Ajax that calls a PHP script. There are several Ajax functions being called in this script.
They have requested that this process be automated so they don't have to click the link and wait approximately 15 minutes each time for all the Ajax to complete.
Ideally, without recoding the functionality, and continuing to use the exact same Ajax, I would love to automate this process. So I would like to run a cron job that loads a script that calls these URLs instead.
Is this possible?
I have tried several things, but nothing happens when I load the script. By nothing I mean neither errors nor the functionality of the script. (I have error reporting turned on.)
E.g.
cURL
ini_set('display_errors', 1); // note: ini_set, not init_set
error_reporting(E_ALL);

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com/myscript.php?variable=sample_get_content");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
exec
exec('http://example.com/myscript.php');
simply opening the script...
$contents = file_get_contents('http://example.com/myscript.php?variable=sample_get_content');
I know that another option is to rebuild the functionality so that I'm not using Ajax, but I would prefer not to do that, as it will take time.
EDIT: The actual script URL being called changes due to changing GET variables, so I cannot run it directly via cron (or can I?)
Suggested approach.
In the script that sends the link, instead of sending a link with unique GET data, have it do this:
exec("./myscript.php $param_1 $param_2");
In myscript.php replace:
$param_1 = $_GET['param_1'];
$param_2 = $_GET['param_2'];
With
$param_1 = $argv[1];
$param_2 = $argv[2];
http://php.net/manual/en/reserved.variables.argv.php
Also add
#!/path/to/phpcgibin -q
to myscript.php before the <? and make sure to upload the file as ASCII.
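One caveat: if $param_1 and $param_2 come from user input, escape them before passing them to exec(). A minimal sketch using PHP's escapeshellarg():

// Escape each parameter so shell metacharacters in user input can't break the command.
$cmd = './myscript.php ' . escapeshellarg($param_1) . ' ' . escapeshellarg($param_2);
exec($cmd);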
I have a VERY labor-intensive PHP script, which makes several API calls to a server elsewhere.
I need to run this script to keep certain data on my server synchronized with data on the remote server.
I want this script to start every time a specific type of user visits a specific page.
My problem, however, is that if a qualified user goes to this page, the page load time is ridiculous, even though the data the script processes doesn't affect the page itself in any way.
So, what I was wondering is: how can I run this script under the same conditions, but run it only on my server?
In other words, how can I run this script and stop the browser from waiting for its output?
EDIT: useful information: Using XAMPP for Windows, PHP 5.5, Apache 2.4.
EDIT 2: Using cURL seems to be the best option, but it doesn't want to actually run my script.
Here's the call:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://localhost/tool/follow/PriceTableUpdate.php');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1); // 1 ms timeout: fire the request and return immediately
curl_exec($ch);
And here is the actual PriceTableUpdate.php:
<?php
ini_set('max_execution_time', 3600);

// Note: $con must be defined in this file as well (see EDIT 3 below).
$marketData = simplexml_load_file("http://www.eve-markets.net/api/xml?listid=20158&key=JKgLjgYvlY6nP");

foreach ($marketData->marketList->type as $type) {
    $ItemID     = (int)$type['id'];
    $ItemBuy    = $type->buy->price;
    $ItemSell   = $type->sell->price;
    $ItemMedian = $type->median->price;

    mysqli_query($con, "UPDATE piprices SET `ItemBuyPrice` = $ItemBuy, `ItemSellPrice` = $ItemSell, `ItemMedianPrice` = $ItemMedian WHERE `piprices`.`ItemID` = $ItemID");
}
?>
EDIT 3:
Using the above DOES work, in case anyone ever wants to ask this question again.
Remember, though, that since you are calling the file via cURL, it no longer sees variables you've set in the calling script, so you will need to define your database connection again inside that file.
Why not use Ajax for this? When the page loads and meets your specific conditions, make an Ajax request to the server and start the script without waiting for a response back to the browser.
You could probably make a separate call to your PHP script with the onLoad event - that is, wait until the page is loaded, then call the other script "in the background". The latter can be done with the following lines of code (I found this by following a link, http://www.paul-norman.co.uk/2009/06/asynchronous-curl-requests/, posted by @Gavriel in a comment to one of the other answers):
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.yoursite.com/background-script.php');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1); // give up after 1 ms so the page doesn't wait
curl_exec($ch);
curl_close($ch);
?>
Adding this code anywhere in your page should cause the script to be executed without delaying the page load - you won't even need an onLoad event in that case.
If I understand what you want to do, one possible solution is to run the other PHP script in a separate process. In PHP you can do this by calling it via cURL: http://php.net/curl
You should separate the browser request from the background data request.
When the user makes the request, create an item in a message-queue server and put the required data inside that message. The queue can be consumed from the same machine or a different one.
If you don't split the request, your web server worker process stays alive until PHP has fully executed your script - that is to say, it blocks the browser.
You can flush the current output, but the server still waits until PHP has finished before closing the browser connection.
Try using the exec() function.
If your server runs a Unix-like OS, terminating your command with & will launch it in the background without waiting for it to finish:
exec('./mylongcommand > /dev/null 2>&1 &'); // redirect output, or exec() will still wait
Never tried this, but it should work...
Have a long-running background process that consumes jobs from a queue; something similar to beanstalkd.
When this process comes across a job named, for example, 'sync.schizz', it will start your sync. Now you just need to push a job onto the queue when your special visitor swings by - which will be lightning fast.
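A sketch of that idea using the Pheanstalk client for beanstalkd (assumed installed via Composer; the exact API differs between Pheanstalk versions):

require 'vendor/autoload.php';

use Pheanstalk\Pheanstalk;

// In the page the special visitor hits: push the job and return instantly.
$pheanstalk = Pheanstalk::create('127.0.0.1');
$pheanstalk->useTube('sync')->put('sync.schizz');

// In the long-running worker (a separate CLI script), roughly:
// $job = $pheanstalk->watch('sync')->reserve();
// run_the_sync();            // hypothetical placeholder for the API-call work
// $pheanstalk->delete($job);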
There are a couple of ways. One: you can fork the execution:
<?php
# only works if pcntl_fork is available, on a POSIX system
# will NOT work on Windows
# returns the new child's pid in the parent, 0 in the child
$pid = pcntl_fork();
if ($pid == -1) {
    die("could not fork\n");
} else if ($pid) {
    $my_pid = getmypid();
    print " I am the parent. pid = $my_pid \n";
    pcntl_wait($status); // protect against zombie children
} else { # child pid is 0 in child
    $my_pid = getmypid();
    print " I am the child. pid = $my_pid \n";
}
# example output
# I am the parent. pid = 23412
# I am the child. pid = 23414
Or, fork the process at the OS level (by appending & to the command, assuming you are running PHP on Linux/Unix). PHP can then be used to execute a shell script.
Also, some people have suggested Ajax, though you need to be careful to consider the case where multiple scripts can be fired off at the same time. What will this do to the load on the server, and to resource locking? There probably needs to be some locking logic to ensure only one script is executing at a time, along with a 'heartbeat' that lets you know whether the process is alive or dead.
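For the locking part, one possible sketch uses flock(): the OS releases the lock automatically if the process dies, which doubles as a crude liveness check (the lock file path and job function are hypothetical):

$fp = fopen('/tmp/myjob.lock', 'c'); // 'c' creates the file if it doesn't exist
if (flock($fp, LOCK_EX | LOCK_NB)) {
    run_the_slow_job(); // hypothetical placeholder for the actual work
    flock($fp, LOCK_UN);
} else {
    // Another instance holds the lock; skip this run.
}
fclose($fp);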
Using the cURL call shown in the OP, I got it to work.
I had forgotten to add the MySQL connection details, which I didn't need before (when using include).
I've been completely unsuccessful finding an answer to this question. Hopefully someone here can help.
I have a PHP script (a WordPress template, to be specific) that automatically imports and processes images when a user hits it. The problem is that the image processing takes up a lot of memory, particularly if multiple users are accessing the template at the same time and initiating the image processing. My server crashed multiple times because of this.
My solution to this was to not execute the image-processing function if it was already running. Before the function started running, I would check a database entry named image_import_running to see if it was set to false. If it was, the function then ran. The very first thing the function did was set image_import_running to true. Then, after it was all finished, I set it back to false.
It worked great -- in theory. The site hasn't crashed since, I can tell you that. But there are two major problems with it:
If the user closes the page while it's loading, the script never finishes processing the images and therefore never sets image_import_running back to false. The template will never process images again until it's manually set to false.
If the script times out while it's processing images -- and that's a strong possibility if there are many images in the queue -- you have essentially the same problem as No. 1: the script never gets to the point where it sets image_import_running back to false.
To handle No. 1 (the first problem I discovered), I added ignore_user_abort(true) to the script. Did it work? I don't know, because No. 2 is still an issue. That's where I'm stumped.
If I could ask the server whether the script was running or not, I could do something like this:
if ($import_running && $script_not_running) {
    $import_running = false;
}
But how do I set that $script_not_running variable? Beats me.
I've shared this entire story with you just in case you have some other brilliant solution.
Try using ignore_user_abort(true); - it will continue to run even if the person leaves and closes the browser.
You might also want to put a number instead of true/false in the DB record and set a maximum number of processes that can run together.
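A sketch of that counter idea, using a single atomic UPDATE so two requests can't both slip past the cap (the table, column, and import_images() names are hypothetical):

// Increment only while under the cap of 3; affected rows tells us if we got a slot.
mysqli_query($con, "UPDATE flags SET running = running + 1 WHERE name = 'image_import' AND running < 3");
if (mysqli_affected_rows($con) === 1) {
    import_images(); // hypothetical placeholder for the heavy work
    mysqli_query($con, "UPDATE flags SET running = running - 1 WHERE name = 'image_import'");
}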
As others have suggested, it would be best to move the image processing out of the request itself.
As an interim "fix", store a timestamp alongside image_import_running when a processing job begins (e.g., image_import_commenced). This is a very crude mechanism, but if you know the maximum time that a job can run before timing out, the script can check whether that period of time has elapsed.
e.g., if image_import_running is still true but the current time is more than 10 minutes since image_import_commenced, run the processing anyway.
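Since this is a WordPress template, a sketch of that check using options (image_import_commenced, the 10-minute cutoff, and process_images() are the hypothetical values from above):

$running   = get_option('image_import_running');
$commenced = (int) get_option('image_import_commenced');

// Run if no job is flagged, or if the flag is stale (older than 10 minutes).
if (!$running || (time() - $commenced) > 10 * 60) {
    update_option('image_import_running', true);
    update_option('image_import_commenced', time());
    process_images(); // hypothetical placeholder for the image import
    update_option('image_import_running', false);
}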
What about setting a transient with an expiry time that would throttle the operation?
if ( ! get_transient( 'import_running' ) ) {
    set_transient( 'import_running', true, 30 ); // set a 30-second transient on the import
    run_the_import_function();
}
I would rather store the job in the database, flag it as pending, and set up a cron job to execute the processing, one job at a time.
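A rough sketch of that pending-job pattern (the jobs table, DSN, $imageIds, and process_job() are all hypothetical):

// In the page request: just record the job and return immediately.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")
    ->execute([json_encode($imageIds)]);

// In the cron-driven worker: take one pending job at a time.
$job = $pdo->query("SELECT * FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")
           ->fetch(PDO::FETCH_ASSOC);
if ($job) {
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);
    process_job(json_decode($job['payload'], true));
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
}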
For me, I just use this simple idea with a text file, for example run.txt.
At the top of the script, use:
if (file_get_contents('run.txt') != 'run') { // the script will do its work here
    $file = fopen('run.txt', 'w+');
    fwrite($file, 'run');
    fclose($file); // fclose() takes the file handle, not the filename
} else {
    exit(); // if it finds 'run' in run.txt, the script stops
}
And add this at the end of your script file:
$file = fopen('run.txt', 'w+');
fwrite($file, ''); // clears the 'run' word for the next try ;)
fclose($file);
This checks whether the script is already working by looking at the contents of run.txt: if the word 'run' is there, the script will not run.
Running a cron job would definitely be a better solution. The idea of storing the URL in a table is a good one.
To answer the original question, you may run a ps auxwww command with exec() (check this page: How to get list of running php scripts using PHP exec()?) and move your function into a separate PHP file.
exec("ps auxwww|grep myfunction.php|grep -v grep", $output);
Just add the following at the top of your script.
<?php
// Ensures a single instance of the script runs at a time.
$fileName = basename(__FILE__);
// The count includes this script's own process and the shell spawned by
// shell_exec(), hence the threshold of 2.
$output = shell_exec("ps -ef | grep -v grep | grep $fileName | wc -l");
//echo $output;
if ($output > 2)
{
    echo "Already running - $fileName\n";
    exit;
}

// Your PHP script code.
?>
I was wondering if there's a way to have a PHP script on my web server email me whenever a file on another web server changes.
For instance, there's this file that changes frequently: http://media1.clubpenguin.com/play/en/web_service/game_configs/paper_items.json
I blog about a game, and that file is very important for creating posts about updates before my competitors do. I often forget to check it, though.
Is there a way to have a script email me whenever that file updates, or check that file to see if it has updated, and email me if it has?
Use crontab to set up a checking script that runs once a minute and compares this file with your locally stored version (or compare MD5 checksums instead - they will differ if the file changes).
// file_get_contents() does not write to a file; use file_put_contents() to save the copy.
file_put_contents('checkfile.tmp', file_get_contents('http://url-to-file'));

if (md5(file_get_contents('lastfile.tmp')) != md5(file_get_contents('checkfile.tmp')))
{
    // copy checkfile to lastfile
    unlink('lastfile.tmp');
    copy('checkfile.tmp', 'lastfile.tmp');
    // send email or do something you want ;)
}
You need to have these two files in the same folder:
old.json
scriptForCron.php
In scriptForCron.php write:
$url = 'http://media1.clubpenguin.com/play/en/web_service/game_configs/paper_items.json';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$execute = curl_exec($ch);
curl_close($ch);

// Note: opening old.json with 'w+' would truncate it before it could be read,
// so read the previous copy first and only overwrite it when the file changed.
$oldjson = file_exists('old.json') ? file_get_contents('old.json') : '';
if ($execute != $oldjson) {
    mail('your#mail.com', 'Yoohoo', 'File changed');
    file_put_contents('old.json', $execute);
}
Then add scriptForCron.php as a cron job; you can ask your hosting support to set it up.
This code does not check for updates in real time - that would be pretty much impossible - but rather every hour or minute, as scheduled.
First, save a file on your system with the same contents as the remote one. Name it anything, for example paper_items.json.
Now make a file named checkitems.php. Read the file which changes frequently and compare whether its contents are equal to your paper_items.json. If they are equal, there is nothing to do; if not, save the online file over your local paper_items.json and use PHP's mail() to email you something like "there was a change".
Finally, set up a cron job to run this every n (for example 1) hours, or every minute, etc.
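A minimal sketch of that checkitems.php, under the assumptions above (the email address is a placeholder):

// checkitems.php -- sketch of the compare-and-mail approach described above.
$remote = file_get_contents('http://media1.clubpenguin.com/play/en/web_service/game_configs/paper_items.json');
$local  = file_exists('paper_items.json') ? file_get_contents('paper_items.json') : '';

if ($remote !== false && $remote !== $local) {
    file_put_contents('paper_items.json', $remote); // update the local copy
    mail('you@example.com', 'paper_items.json changed', 'There was a change.');
}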
I'm building a forum that will allow users to upload images. Images will be stored on my web server temporarily (so I can upload them with a progress bar) before being moved to an S3 bucket. I haven't figured out a brilliant way of doing this, but I think the following makes sense:
1. Upload image(s) to the web server using XHR, with a progress bar
2. Wait for the user to submit his post
3. Unlink any images he did not end up including in his post
4. Call a URL that uploads the remaining images to S3 (and updates the image URLs in the post body when done)
5. Redirect the user to his post in the topic
Now, since step 4 can take a considerable amount of time, I'm looking for a cron-like solution, where I can call the S3 upload script in the background and not have the user wait for it to complete.
Ideally, I'm looking for a solution that allows me to request a URL within my framework and pass some image id's in the query, i.e.:
http://mysite.com/utils/move-to-s3/?images=1,2,3
Can I use cURL for this purpose? Or if it has to be exec(), can I still have it execute a URL (wget?) instead of a PHP script (php-cli)?
Thanks a heap!
PHP's
register_shutdown_function()
is your friend [reference].
The shutdown function keeps running after your script has terminated.
Thus, if everything is available, send the final page and exit. The registered shutdown function then continues and performs the time-consuming job.
In my case, I prepared a class CShutdownManager, which allows registering several methods to be called after script termination. For example, I use CShutdownManager to delete temporary files that are no longer needed.
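A minimal sketch of that approach (do_slow_sync() is a hypothetical placeholder; fastcgi_finish_request() only exists under PHP-FPM):

ignore_user_abort(true); // keep running even if the client disconnects

register_shutdown_function(function () {
    do_slow_sync(); // the time-consuming job runs after the response is finished
});

echo "Done! The update is running in the background.";

// Under PHP-FPM, this flushes the response to the browser right away,
// so the shutdown function's work doesn't keep the connection open.
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();
}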
Try the following statement:
shell_exec('php scriptname'); // note the quotes: the command must be a string
I found the solution to this problem, which I'm happy to share. I actually found it on SO, but it needed some tweaking. Here goes:
The solution requires either exec() or shell_exec(). It doesn't really matter which one you use, since all output will be discarded anyway. I chose exec().
Since I am using MAMP, rather than a system-level PHP install, I needed to point to the PHP binary inside the MAMP package. (This actually made a difference.) I decided to define this path in a constant named PHP_BIN, so I can set a different path for local and live environments. In this case:
define('PHP_BIN', '/Applications/MAMP/bin/php/php5.3.6/bin/php'); // the constant name must be a quoted string
Ideally, I wanted to execute a script inside my web framework instead of some isolated shell script. I wrote a wrapper script that accepts a URL as an argument and named it my_curl.php:
if (isset($argv[1]))
{
    $url = $argv[1];
    if (preg_match('/^http(s)?:\/\//', $url))
    {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_exec($ch);
        curl_close($ch);
    }
}
In this SO question I found the way to execute a shell command in the background. This is necessary because I don't want the user to have to wait until it's finished.
Finally, I run this bit of PHP code to execute the other (or ANY) web request in the background:
// '> /dev/null 2>&1 &' discards all output and backgrounds the process, so exec()
// returns immediately ('&>' is a bash-ism and may not work via /bin/sh).
exec(PHP_BIN . ' /path/to/my_curl.php http://site.com/url/to/request > /dev/null 2>&1 &');
Works like a charm! Hope this helps.