I have a site that sends URLs via email to my client. Once the client receives one, they click the link, it loads in a browser, and it completes some Ajax that calls a PHP script. There are several Ajax functions being called in this script.
They have requested that this process be automated so they don't have to click the link and wait approximately 15 minutes each time for all the Ajax to complete.
Ideally, without recoding the functionality and while continuing to use the exact same Ajax, I would love to automate this process. So I would like to run a cron job that loads a script which calls these URLs instead.
Is this possible?
I have tried several things, but nothing happens when I load the script. By nothing I mean neither errors nor the functionality of the script (I have error reporting turned on).
E.g.
cURL
ini_set('display_errors', 1);
error_reporting(E_ALL);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com/myscript.php?varialbe=sample_get_content");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
exec
exec('http://example.com/myscript.php');
simply opening the script...
$contents = file_get_contents('http://example.com/myscript.php?variable=sample_get_content');
I know that another option is to rebuild the functionality so that I'm not using Ajax, but I would prefer not to do that, as it will take time.
EDIT: The actual script URL being called changes because the GET variables change, so I cannot run it directly via cron (or can I?)
Suggested approach.
In the script that sends the link, instead of sending a link with unique GET data, have it do this (with the arguments escaped for safety):
exec('./myscript.php ' . escapeshellarg($param_1) . ' ' . escapeshellarg($param_2));
In myscript.php replace:
$param_1 = $_GET['param_1'];
$param_2 = $_GET['param_2'];
With
$param_1 = $argv[1];
$param_2 = $argv[2];
http://php.net/manual/en/reserved.variables.argv.php
Also add
#!/path/to/phpcgibin -q
to myscript.php before the <? open tag, and make sure to upload the file in ASCII mode.
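Putting the pieces together, a minimal sketch of what the CLI version of myscript.php might look like (the usage check and the final echo are illustrative additions, not part of the original answer):
#!/path/to/phpcgibin -q
<?php
// Invoked by the sending script via exec('./myscript.php <param_1> <param_2>').
// $argv[0] is the script's own name; real arguments start at index 1.
if ($argc < 3) {
    fwrite(STDERR, "Usage: myscript.php <param_1> <param_2>\n");
    exit(1);
}
$param_1 = $argv[1];
$param_2 = $argv[2];
// ... the same processing the URL version did with its $_GET values ...
echo "Processed $param_1 / $param_2\n";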
The main reason is that I don't want to hold up the current PHP process; I want users to be able to navigate around during the script's execution.
The script in question (importer.php) updates a txt file with a percentage as it progresses; JavaScript polls this txt file every 5 seconds and displays the percentage to keep the user updated (all in the form of a loading bar).
I've been able to launch the script like so:
$cmd = '"C:\/path\/to\/v5.4\/php" importer.php';
pclose(popen($cmd, "r"));
exit;
This runs the script, but it hangs the current process until importer.php completes. Is there a way to break out of the current process and launch the script in another one instead?
I read that appending & to the command tells the shell not to wait, but I believe this is a *nix feature, and since I'm running on a Windows box I can't use it... unless perhaps there is an alternative for Windows?
According to the documentation at http://php.net/passthru, you should be able to execute your command using that function, as long as you redirect the output.
$cmd = '"C:\/path\/to\/v5.4\/php" importer.php';
// Use passthru here, and redirect the output to a temp text file.
passthru($cmd . '>%TEMP%\importerOutput.txt');
exit;
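If redirecting the output alone doesn't return control immediately, another commonly used Windows approach (offered here as an untested sketch) is to launch the command through start /B via popen(), which detaches it from the current request:
$cmd = '"C:\/path\/to\/v5.4\/php" importer.php';
// "start /B" asks cmd.exe to launch the program without waiting;
// the empty "" is the window title that start expects when the command itself is quoted.
pclose(popen('start /B "" ' . $cmd, 'r'));
exit;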
I was able to resolve this issue by using a WshShell object (WScript.Shell):
$WshShell = new COM("WScript.Shell");
$WshShell->Run('"C:\/path\/to\/v5.4\/php-win.exe" -f "C:\/path\/to\/code\/snippet\/importer.php" var1 var2 var3', 0, false);
Note: I have spaces in my file structure, so I needed to add quotes around the paths to the files. I was also able to pass the variables var1, var2, and var3. I've also used \/ to escape my slashes.
I'll break the Run arguments down a bit for my case:
The first is the command you want to run (path to PHP, path to the script, and variables to pass).
The second, 0, hides the window and activates another window (see the link below for more options).
The third, false, is a boolean indicating whether the script should wait for the program to finish executing before continuing to the next statement; if set to true, script execution halts until the program finishes.
For more information on WScript.Shell, see http://msdn.microsoft.com/en-us/library/d5fk67ky(v=vs.84).aspx.
Hope this helps someone else!
I have a script that uploads a file to my web server via HTTP. What I need is for the server to send a response to the HTTP POST but then continue to run some commands.
For example:
the file is posted via HTTP;
the server saves the file, inserts data into the database, and sends a response to the HTTP request;
but I need the server to continue running after the response and perform some editing on the file.
Example PHP:
function uploadFile() {
    $imageName = $tmpname . '.jpg';
    move_uploaded_file($_FILES["foto"]["tmp_name"], $targetDir . $imageName);
    $data = array('ALL THE DATABASE DATA');
    $returnid = $this->uploader_model->addData($data);
    // I NEED IT TO RETURN THIS TO THE AJAX HTTP REQUEST
    echo json_encode(array(
        'returned' => 'Successfully uploaded..',
        'id' => $returnid
    ));
    // I NOW NEED IT TO KEEP RUNNING SOME EDITING ON THE IMAGE HERE, E.G.
    shell_exec(" run some image editing here 2>&1");
}
I have seen ignore_user_abort, like:
ignore_user_abort(1); // keep the script running even if the client disconnects
set_time_limit(0); // remove the script execution time limit
but does this mean the script will just run forever and crash the server if it gets multiple requests? I'm just a bit confused about how to use this function correctly in this scenario.
Any help would be appreciated.
Your script won't run forever unless you make it, e.g. by using an infinite loop. Having said that, that's not necessarily a good way to go about it.
To ensure the best user experience, you should make sure your script executes quickly by deferring any time-consuming tasks. For example, you could create a task queue and run the queued tasks with cron.
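For example, a very small version of such a queue could store jobs in a database table and let a cron-run worker claim them (a sketch only: the tasks table, its columns, and the credentials are invented for illustration):
// During the upload request: enqueue the slow work (fast).
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
$stmt = $pdo->prepare("INSERT INTO tasks (type, payload, done) VALUES ('edit_image', ?, 0)");
$stmt->execute(array($imageName));

// worker.php, run from cron (e.g. * * * * * php /path/to/worker.php):
foreach ($pdo->query("SELECT id, payload FROM tasks WHERE done = 0") as $task) {
    // ... do the image editing on $task['payload'] here ...
    $pdo->exec("UPDATE tasks SET done = 1 WHERE id = " . (int)$task['id']);
}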
Alternatively, if you do decide to use the solution you described in your question, you may be better off running that with AJAX after a file has been successfully uploaded. AJAX requests may be terminated when the user navigates off the webpage, so you'd still need to make sure the server continues executing your script.
By the way, you should also call flush() if you actually want to send something to the client before the script finishes.
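To illustrate, here is a minimal sketch of the "respond first, keep working" pattern under Apache with mod_php (output buffering, compression, and FastCGI setups can change this behaviour, so treat it as an illustration rather than a guarantee):
ignore_user_abort(true); // keep running even if the client disconnects
ob_start();
echo json_encode(array('returned' => 'Successfully uploaded..'));
header('Content-Length: ' . ob_get_length());
header('Connection: close'); // tell the browser the response is complete
ob_end_flush();
flush(); // push the response out to the client

// ... the browser now has its answer; run the slow image editing here ...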
I have a VERY labor-intensive PHP script, which makes several API calls to a server elsewhere.
I need to run this script to keep certain data on my server synchronized with data on the remote server.
I want this script to start every time a specific type of user visits a specific page.
My problem, however, is that if a qualified user goes to this page, the page load time is ridiculous, even though the data the script processes doesn't affect the page itself in any way.
So what I was wondering is: how can I run this script under the same conditions, but only on my server?
In other words, how can I run this script and stop the browser from waiting for its output?
EDIT: useful information: Using XAMPP for Windows, PHP 5.5, Apache 2.4.
EDIT 2: Using curl seems to be the best option, but it doesn't want to actually run my script.
Here's the call:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://localhost/tool/follow/PriceTableUpdate.php');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1); // give up after 1 ms so this request returns immediately
curl_exec($ch);
And here is the actual PriceTableUpdate.php:
<?php
ini_set('max_execution_time', 3600);

// The curl-triggered request runs as a fresh PHP process, so the database
// connection must be re-established here (credentials are placeholders).
$con = mysqli_connect('localhost', 'db_user', 'db_pass', 'db_name');

$marketData = simplexml_load_file("http://www.eve-markets.net/api/xml?listid=20158&key=JKgLjgYvlY6nP");

foreach ($marketData->marketList->type as $type) {
    $ItemID = (int)$type['id'];
    $ItemBuy = $type->buy->price;
    $ItemSell = $type->sell->price;
    $ItemMedian = $type->median->price;
    mysqli_query($con, "UPDATE piprices SET `ItemBuyPrice` = $ItemBuy, `ItemSellPrice` = $ItemSell, `ItemMedianPrice` = $ItemMedian WHERE `piprices`.`ItemID` = $ItemID");
}
?>
EDIT 3:
Using the above DOES work, in case anyone ever wants to ask this question again.
Remember, though, that since you are using curl, the PHP file runs as a separate request and no longer sees variables you've set before, so you will need to define your database connection in that file again.
Why not use AJAX for this? When the page loads and meets your specific conditions, make an AJAX request to the server and start the script, without waiting for a response back to the browser.
You can probably make a separate call to your PHP script with the onLoad event; that is, you wait until the page is loaded, then call the other script "in the background". The latter can be done with the following lines of code (I found this by following a link, http://www.paul-norman.co.uk/2009/06/asynchronous-curl-requests/, posted by @Gavriel in a comment on one of the other answers):
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.yoursite.com/background-script.php');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1);
curl_exec($ch);
curl_close($ch);
?>
Adding this code anywhere in your page should cause the script to be executed without delaying the page load; you won't even need an onLoad event in that case.
If I understand what you want to do, one possible solution is to run the other PHP script in a separate process. In PHP you can do this by calling it via curl: http://php.net/curl
You should separate the browser request from the background data request.
When the user accesses the page, create an item in a message-queue server and put the required data inside that message. The queue can then be processed from the same or a different machine.
If you don't split the request, your web server worker process stays alive until PHP has fully executed your script; that is to say, it blocks the browser.
You can flush() the current output, but the server still keeps the browser connection open until PHP is ready to close it.
Try using the exec() function.
If your server runs a Unix-like OS, terminating your command with & will launch it in the background, and exec() won't wait for it to finish:
exec('./mylongcommand > /dev/null 2>&1 &'); // redirect the output, or PHP will still wait for the command
Never tried this, but it should work...
Have a long-running background process that processes jobs from a queue; something similar to beanstalkd.
When this process comes across a job named, for example, 'sync.schizz', it will start your sync. Now you just need to pop a job into the queue when your special visitor swings by, which will be lightning fast.
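As a sketch of the producer side, assuming the Pheanstalk client library for beanstalkd (v3-style API; the tube name and payload are made up for illustration):
require 'vendor/autoload.php';

// Connect to a beanstalkd server (assumed to be running locally).
$pheanstalk = new Pheanstalk\Pheanstalk('127.0.0.1');

// Enqueue the job; this returns almost instantly, and the long-running
// worker picks up 'sync.schizz' whenever it is ready.
$pheanstalk->useTube('sync')->put('sync.schizz');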
There are a couple of ways. One: you can fork the execution:
<?php
# only works if pcntl_fork is installed on POSIX system
# will NOT work on windows
# returns new child id
$pid = pcntl_fork();
if ($pid == -1) {
die("could not fork\n");
} else if ($pid) {
$my_pid = getmypid();
print " I am the parent. pid = $my_pid \n";
pcntl_wait($status); //Protect against Zombie children
} else { # child pid is 0 in child
$my_pid = getmypid();
print " I am the child. pid = $my_pid \n";
}
# example output
# I am the parent. pid = 23412
# I am the child. pid = 23414
Or, fork the process at the OS level (with a trailing & on the command, assuming you are running PHP on Linux/Unix). So, PHP may be used to execute a shell script that backgrounds itself.
Also, some people have suggested Ajax, though you need to consider the case where multiple scripts are fired off at the same time. What will this do to the load on the server, and to resource locking? There probably needs to be some locking logic to ensure only one copy of the script executes at a time, along with a 'heartbeat' that lets you know whether the process is alive or dead.
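A minimal sketch of that locking logic using flock() (the lock-file path and the work function are placeholders):
$lock = fopen('/tmp/sync.lock', 'c'); // assumed lock-file location

// LOCK_NB makes flock() fail immediately instead of queueing behind the current holder.
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit("Another sync is already running\n");
}

run_sync(); // hypothetical placeholder for the long-running work

flock($lock, LOCK_UN);
fclose($lock);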
Using the curl call shown in the OP, I got it to work.
I had forgotten to add the MySQL connection details, which I didn't need before (when using include).
I want to download a large number of files to my server. I have a list of the files to download and the locations to put them. This is not a problem; I use wget to download each file, executing it with shell_exec:
$command = 'wget -b -O ' . escapeshellarg($filenameandpathtoput) . ' ' . escapeshellarg($submission['url']);
shell_exec($command);
This works great; the server starts all the downloads in the background and the files arrive in no time.
The problem is, I want to notify the user when the files are downloaded... and that does not work with my current way of doing things. So how would you implement this?
Any suggestions would be helpful!
I guess that you are able to check whether all the files are in place with something like:
function checkFiles()
{
    // $_SESSION["targetpaths"] holds the expected location of every file.
    foreach ($_SESSION["targetpaths"] as $p) {
        if (!is_file($p)) return false;
    }
    return true;
}
Now all you have to do is call a script on your server that invokes this function every second (or so). You can accomplish this either with a meta refresh (forcing the browser to reload the page after n seconds) or with AJAX (have a look at jQuery's .getJSON, for example).
If the script is called and the files are not yet all downloaded, print something like "Please wait" and refresh again later. Otherwise, show the success message. That's all.
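For the AJAX variant, the server-side endpoint can be as small as this (a sketch; it assumes checkFiles() from above is available and that the expected paths were stored in the session when the downloads started):
<?php
// status.php - polled by the browser, e.g. with jQuery: $.getJSON('status.php', ...)
session_start(); // makes $_SESSION["targetpaths"] available to checkFiles()
// ... include the file that defines checkFiles() here ...

header('Content-Type: application/json');
echo json_encode(array('done' => checkFiles()));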
You can consider using exec() to run the external wget command synchronously (that is, without the -b flag). Your PHP script will then block until the external command completes; once it does, you can echo the name of the completed file.
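A sketch of that synchronous variant (it assumes the $submission rows from the question are collected in an array; dropping wget's -b flag makes each call block until the download finishes):
foreach ($submissions as $submission) {
    $target = $submission['path']; // assumed field holding the destination path
    // Without -b, exec() waits for wget to complete before returning.
    exec('wget -O ' . escapeshellarg($target) . ' ' . escapeshellarg($submission['url']));
    echo 'Finished downloading ' . basename($target) . "\n";
}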