I have a site that sends URLs via email to my client. Once they receive one, they click the link, it loads in a browser, and it completes some Ajax that calls a PHP script. Several Ajax functions are called in this script.
They have requested that this process be automated so they don't have to click the link and wait approximately 15 minutes each time for all the Ajax to complete.
Ideally, I would love to automate this process without recoding the functionality, continuing to use the exact same Ajax. So I would like to run a cron job that loads a script that calls these URLs instead.
Is this possible?
I have tried several things, but nothing happens when I load the script. By nothing I mean no errors and none of the script's functionality (I have error reporting turned on).
E.g.
cURL
ini_set('display_errors', 1);
error_reporting(E_ALL);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com/myscript.php?variable=sample_get_content");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
exec
exec('http://example.com/myscript.php');
simply opening the script...
$contents = file_get_contents('http://example.com/myscript.php?variable=sample_get_content');
I know that another option is to rebuild the functionality so that I'm not using Ajax, but I would prefer not to do that as it will take time.
EDIT: The actual script URL being called changes because the GET variables change, so I cannot run it directly via cron (or can I?)
Suggested approach.
In the script that sends the link, instead of sending a link with unique GET data, have it do this:
exec("./myscript.php $param_1 $param_2");
In myscript.php replace:
$param_1 = $_GET['param_1'];
$param_2 = $_GET['param_2'];
With
$param_1 = $argv[1];
$param_2 = $argv[2];
http://php.net/manual/en/reserved.variables.argv.php
Also add
#!/path/to/phpcgibin -q
to myscript.php, before the <?php tag, and make sure to upload the file in ASCII mode.
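Putting that together, a minimal sketch of the converted myscript.php (the interpreter path and parameter names are placeholders):
#!/path/to/phpcgibin -q
<?php
// parameters now arrive on the command line instead of in $_GET
$param_1 = isset($argv[1]) ? $argv[1] : null;
$param_2 = isset($argv[2]) ? $argv[2] : null;
// ...the rest of the script continues unchanged...
And the sending script launches it like so, escaping the values since they now pass through the shell:
exec('./myscript.php ' . escapeshellarg($param_1) . ' ' . escapeshellarg($param_2));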
Related
The main reason is that I don't want to hold up the current PHP process. I want users to be able to navigate around during the script's execution.
The script in question (importer.php) updates a txt file with a percentage as it completes. JavaScript reads this txt file on a 5-second timer and outputs the percentage to keep the user updated (all in the form of a loading bar).
I've been able to launch the script like so:
$cmd = '"C:\/path\/to\/v5.4\/php" importer.php';
pclose(popen($cmd, "r"));
exit;
This runs the script, but hangs the current process until importer.php completes. Is there a way to get out of the current process and launch this using another one instead?
I read that using & at the end of the cmd tells the script to not wait, but I believe this is a *nix command and since I'm running on a Windows box, I can't use it... unless perhaps there is an alternative for Windows?
According to the documentation at http://php.net/passthru you should be able to execute your command using that, as long as you redirect your output.
$cmd = '"C:\/path\/to\/v5.4\/php" importer.php';
// Use passthru here, and redirect the output to a temp textfile.
passthru($cmd . ' > %TEMP%\importerOutput.txt');
exit;
I was able to resolve this issue by using a WshShell object (WScript.Shell):
$WshShell = new COM("WScript.Shell");
$WshShell->Run('"C:\/path\/to\/v5.4\/php-win.exe" -f "C:\/path\/to\/code\/snippet\/importer.php" var1 var2 var3', 0, false);
Note: I have spaces in my file structure so I needed to add quotes around the paths to the files. I was also able to pass variables, var1, var2, and var3. I've also used \/ to escape my slashes.
I'll break the Run arguments down a bit for my case:
The first: the command you want to run (path to PHP, path to the script, and variables to pass).
The second: 0 - hides the window and activates another window (see the link below for more options).
The third: false - a Boolean value indicating whether the script should wait for the program to finish executing before continuing to the next statement. If set to true, script execution halts until the program finishes.
For more information on WScript.Shell, see http://msdn.microsoft.com/en-us/library/d5fk67ky(v=vs.84).aspx.
Hope this helps someone else!
I have a VERY labor-intensive PHP script, which makes several API calls to a server elsewhere.
I need to run this script to keep certain data on my server synchronized with data on the remote server.
I want this script to start every time a specific type of user visits a specific page.
My problem, however, is that if a qualified user goes to this page, the page load time is REDONCULOUS, even though the data the script processes doesn't affect the page itself in any way.
So, what I was wondering is: how can I run this script under the same conditions, but only on my server?
In other words, how can I run this script and stop the browser from waiting for its output?
EDIT: useful information: Using XAMPP for Windows, PHP 5.5, Apache 2.4.
EDIT 2: Using curl seems to be the best option, but it doesn't want to actually run my script.
Here's the call:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://localhost/tool/follow/PriceTableUpdate.php');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
// a 1 ms timeout makes curl fire the request and return almost immediately
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1);
curl_exec($ch);
curl_close($ch);
And here is the actual PriceTableUpdate.php:
<?php
ini_set('max_execution_time', 3600);
// note: $con must be a valid mysqli connection (see EDIT 3 below)
$marketData = simplexml_load_file("http://www.eve-markets.net/api/xml?listid=20158&key=JKgLjgYvlY6nP");
foreach ($marketData->marketList->type as $type) {
    $ItemID = (int)$type['id'];
    $ItemBuy = $type->buy->price;
    $ItemSell = $type->sell->price;
    $ItemMedian = $type->median->price;
    mysqli_query($con, "UPDATE piprices SET `ItemBuyPrice` = $ItemBuy, `ItemSellPrice` = $ItemSell, `ItemMedianPrice` = $ItemMedian WHERE `piprices`.`ItemID` = $ItemID");
}
?>
EDIT 3:
Using the above DOES work, in case anyone ever wants to ask this question again.
You have to remember, though, that since you are using curl, the PHP file no longer has access to variables you've set before, so you will need to define your database connection in the PHP file again.
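For example, the top of PriceTableUpdate.php would need its own connection; the credentials here are placeholders:
<?php
ini_set('max_execution_time', 3600);
// curl runs this file as a fresh request, so $con must be created here
$con = mysqli_connect('localhost', 'db_user', 'db_password', 'db_name');
if (!$con) {
    die('Could not connect: ' . mysqli_connect_error());
}
// ...the simplexml_load_file() call and UPDATE loop follow as above...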
Why not use AJAX for this? When the page loads and meets your specific conditions, make an AJAX request to the server and start the script without waiting for a response back to the browser.
You can probably make a separate call to your PHP script with the onLoad event - that is, you wait until the page is loaded, then call this other script "in the background". The latter can be done with the following lines of code (I found this by following a link http://www.paul-norman.co.uk/2009/06/asynchronous-curl-requests/ posted by @Gavriel in a comment to one of the other answers):
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.yoursite.com/background-script.php');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1);
curl_exec($ch);
curl_close($ch);
?>
Adding this code anywhere in your page should cause the script to be executed without delaying the page load - you won't even need to use an onLoad event in that case.
If I understand what you want to do, one possible solution is to run the other PHP script in a separate process. In PHP you can do this by calling it via curl: http://php.net/curl
You should separate the browser request from the background data request.
When the user makes the request, create an item in a message-queue server and put the required data within that message. The queue can then be processed by the same or a different machine.
If you don't split the request, your web server worker process stays alive until PHP has fully executed your script - that is to say, it blocks the browser.
You can flush the current output, but the server still waits until PHP finishes before closing the browser connection.
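As a rough sketch of the flush idea (note that fastcgi_finish_request() only exists under PHP-FPM, so treat this as an approximation to adapt to your setup):
ignore_user_abort(true); // keep running even if the user navigates away
ob_start();
echo 'The server is processing your request.';
header('Content-Length: ' . ob_get_length());
header('Connection: close');
ob_end_flush();
flush(); // the browser now has a complete response
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // under PHP-FPM this closes the connection cleanly
}
// ...the slow work continues here without blocking the browser...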
Try the exec() function.
If your server runs a Unix-like OS, terminating your command with & will launch the command without waiting for it to finish:
exec('./mylongcommand &');
Never tried this, but it should work...
Have a long-running background process that processes jobs from a queue; something similar to beanstalkd.
When this process comes across a job named, for example, 'sync.schizz', it will start your sync. Now you just need to pop a job into the queue when your special visitor swings by - which will be lightning fast.
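A sketch of what that could look like with the Pheanstalk client library for beanstalkd (the tube name and payload are made up; check the exact API of the Pheanstalk version you install):
// page the visitor hits: enqueue the job and return immediately
$pheanstalk = new Pheanstalk\Pheanstalk('127.0.0.1');
$pheanstalk->useTube('sync')->put('sync.schizz');

// long-running worker, started separately from a shell
$worker = new Pheanstalk\Pheanstalk('127.0.0.1');
while (true) {
    $job = $worker->watch('sync')->reserve(); // blocks until a job arrives
    if ($job->getData() === 'sync.schizz') {
        // ...do the expensive synchronization here...
    }
    $worker->delete($job); // remove the finished job from the queue
}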
There are a couple ways. One, you can fork the execution:
<?php
# only works if pcntl_fork is installed on POSIX system
# will NOT work on windows
# returns new child id
$pid = pcntl_fork();
if ($pid == -1) {
die("could not fork\n");
} else if ($pid) {
$my_pid = getmypid();
print " I am the parent. pid = $my_pid \n";
pcntl_wait($status); //Protect against Zombie children
} else { # child pid is 0 in child
$my_pid = getmypid();
print " I am the child. pid = $my_pid \n";
}
# example output
# I am the parent. pid = 23412
# I am the child. pid = 23414
Or, fork the process at the OS level (by appending & to the command, assuming you are running PHP on Linux/Unix). So, PHP may be used to execute a shell script.
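For instance (the script path is a placeholder; the redirection matters, since exec() otherwise waits for output):
// launch a shell script in the background and return immediately
exec('/path/to/long-task.sh > /dev/null 2>&1 &');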
Also, some people have suggested Ajax, though you need to be careful to consider the case where multiple scripts can be fired off at the same time. What will that do to the load on the server, and to resource locking? There probably also needs to be some locking logic to ensure only one script executes at a time, along with a 'heartbeat' that lets you know whether the process is alive or dead.
Using the curl function displayed in the OP, I got it to work.
I forgot to add the MySQL connection details, which I didn't need before (when using include).
I need to build a system where a user sends a file to the server,
and then PHP runs a command-line tool using system() (for example, tool.exe userfile).
I need a way to see the PID of the process, to know which user started the tool,
and a way to know when the tool has stopped.
Is this possible on a Windows Vista machine? I can't move to a Linux server.
Besides that, the code must continue to run when the user closes the browser window.
Rather than trying to obtain the ID of a process and monitor how long it runs, I think that what you want to do is have a "wrapper" process that handles pre/post-processing, such as logging or database manipulation.
The first step is to create an asynchronous process that will run independently of the parent and allow it to be started by a call to a web page.
To do this on Windows, we use WshShell:
$cmdToExecute = "tool.exe \"$userfile\"";
$WshShell = new COM("WScript.Shell");
$result = $WshShell->Run($cmdToExecute, 0, FALSE);
...and (for completeness) if we want to do it on *nix, we append > /dev/null 2>&1 & to the command:
$cmdToExecute = "/usr/bin/tool \"$userfile\"";
exec("$cmdToExecute > /dev/null 2>&1 &");
So, now you know how to start an external process that will not block your script, and will continue execution after your script has finished. But this doesn't complete the picture - because you want to track the start and end times of the external process. This is quite simple - we just wrap it in a little PHP script, which we shall call...
wrapper.php
<?php
// Fetch the arguments we need to pass on to the external tool
$userfile = $argv[1];
// Do any necessary pre-processing of the file here
$startTime = microtime(TRUE);
// Execute the external program
exec("C:/path/to/tool.exe \"$userfile\"");
// By the time we get here, the external tool has finished - because
// we know that a standard call to exec() will block until the called
// process finishes
$endTime = microtime(TRUE);
// Log the times etc and do any post processing here
So instead of executing the tool directly, we make our command in the main script:
$cmdToExecute = "php wrapper.php \"$userfile\"";
...and we should have a finely controllable solution for what you want to do.
N.B. Don't forget to escapeshellarg() where necessary!
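For instance, the command construction with the user-supplied filename escaped (paths are placeholders):
// escape the user-supplied value before it touches the shell
$cmdToExecute = 'php wrapper.php ' . escapeshellarg($userfile);
$WshShell = new COM("WScript.Shell");
$result = $WshShell->Run($cmdToExecute, 0, FALSE);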
I'm building a forum that will allow users to upload images. Images will be stored on my web server temporarily (so I can upload them with a progress bar) before being moved to an S3 bucket. I haven't figured out a brilliant way of doing this, but I think the following makes sense:
1. Upload image(s) to the web server using XHR, with a progress bar
2. Wait for the user to submit his post
3. Unlink images he did not end up including in his post
4. Call a URL that uploads the remaining images to S3 (and update the image URLs in the post body when done)
5. Redirect the user to his post in the topic
Now, since step 4 can take a considerable amount of time, I'm looking for a cron-like solution, where I can call the S3 upload script in the background and not have the user wait for it to complete.
Ideally, I'm looking for a solution that allows me to request a URL within my framework and pass some image id's in the query, i.e.:
http://mysite.com/utils/move-to-s3/?images=1,2,3
Can I use cURL for this purpose? Or if it has to be exec(), can I still have it execute a URL (wget?) instead of a PHP script (php-cli)?
Thanks a heap!
PHP's
register_shutdown_function()
is your friend [reference].
The shutdown function keeps running after your script has terminated.
Thus, if everything is available, send the final page and exit. The registered shutdown function then continues and performs the time-consuming job.
In my case, I prepared a class CShutdownManager, which allows several methods to be registered for calling after script termination. For example, I use CShutdownManager to delete temporary files that are no longer needed.
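A minimal sketch of the idea, assuming a hypothetical moveImagesToS3() helper that performs step 4:
// send the user on their way first...
echo 'Your post has been saved.';

// ...then queue the slow work to run as the script shuts down
register_shutdown_function(function () {
    ignore_user_abort(true);        // keep going if the client disconnects
    set_time_limit(0);              // the S3 transfer may take a while
    moveImagesToS3(array(1, 2, 3)); // hypothetical helper doing the upload
});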
Try the following statement:
shell_exec('php scriptname.php');
I found the solution to this problem which I'm happy to share. I actually found it on SO, but it needed some tweaking. Here goes:
The solution requires either exec() or shell_exec(). It doesn't really matter which one you use, since all output will be discarded anyway. I chose exec().
Since I am using MAMP, rather than a system-level PHP install, I needed to point to the PHP binary inside the MAMP package. (This actually made a difference.) I decided to define this path in a constant named PHP_BIN, so I can set a different path for local and live environments. In this case:
define('PHP_BIN', '/Applications/MAMP/bin/php/php5.3.6/bin/php');
Ideally, I wanted to execute a script inside my web framework instead of some isolated shell script. I wrote a wrapper script that accepts a URL as an argument and named it my_curl.php:
<?php
// my_curl.php: request the URL passed on the command line
if (isset($argv[1])) {
    $url = $argv[1];
    // only allow http(s) URLs
    if (preg_match('/^http(s)?:\/\//', $url)) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_exec($ch);
        curl_close($ch);
    }
}
In this SO question I found the way to execute a shell command in the background. This is necessary because I don't want the user to have to wait until it's finished.
Finally, I run this bit of PHP code to execute the other (or ANY) web request in the background:
exec(PHP_BIN . ' /path/to/my_curl.php http://site.com/url/to/request &> /dev/null &');
Works like a charm! Hope this helps.
Let's imagine my situation (it's fake, of course)...
I have a web site that has 1000 active users. I need to do some stuff that will be slow (database clean-up, backups, etc.). For example, it may take 5 minutes.
I want this: when a user opens my web site and certain conditions are met (for example, he is the first visitor in a week), he somehow sends a signal to the server that the slow process must run now, but he doesn't wait for its execution (those several minutes)... he just sees a notification that the server is making some updates or whatever. Any other user that opens my web site during that time also sees that notification.
When the process has finished, the web site returns to its normal behavior. And it all happens automatically.
Is this possible without cron?
Check Execution Operators. They may be what you are looking for.
Something like this:
$execute = `/path/to/php/script.php`;
// If you needed the output
echo $execute;
Once this script runs, set a flag (it could be in the database, or a simple write to a file). When the script ends, delete this flag.
In all your other pages, check this flag. If it is ON, then display the message.
On your page you could have a database value, e.g. script_running; check if the value = 0, and if so send the command; if = 1, do nothing. Then make sure you set the value to 1 at the beginning and 0 at the end of your command.
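A minimal sketch of that flag check; the settings table and column names are made up, and $con is assumed to be an existing mysqli connection:
$result = mysqli_query($con, "SELECT script_running FROM settings WHERE id = 1");
$row = mysqli_fetch_assoc($result);
if ((int)$row['script_running'] === 0) {
    // claim the flag, then launch the slow job in the background
    mysqli_query($con, "UPDATE settings SET script_running = 1 WHERE id = 1");
    exec('php /path/to/maintenance.php > /dev/null 2>&1 &');
    // maintenance.php must set script_running back to 0 when it finishes
} else {
    echo 'The server is making some updates; please try again in a few minutes.';
}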
You can use cURL for this task. Save the time the script was last run in a database or text file, and run the script one day a week (of course, updating the date before executing the script):
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/service_script.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER,1);
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
$result_curl = curl_exec($ch);
curl_close($ch);