How can I invoke a PHP script on a remote server from my server-side code?
I'm currently using:
header('Location: http://www.url.com/script.php?arg1=blabla');
in my code, but it doesn't work.
thanks
If by invoking you just mean "calling" it, so that you only need it to run, then you can use cURL.
If by invoking you mean that you want it to act the same as include, then you can't do that over HTTP (the server, of course, does not return the code; it runs it). You might be able to obtain the file through other means (FTP?) and then include it, but that seems like a bit of a hack.
If by invoking you mean that you want to redirect the user to the page, then this should work:
header('Location: http://www.site.nl/');
exit;
(Your script continues to run after a header() call, so you may need that exit.) In what way doesn't your code work for you? (I'm guessing it's because you actually want one of the other options.)
If you only want to invoke the script, you can simply use $result = file_get_contents('http://www.example.com/');.
Your version using header() will, as said above, redirect the user instead.
Use cURL, it gives you much wider manipulation options.
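For example, a minimal cURL sketch (the URL and argument are just the placeholders from the question) that calls the remote script and captures whatever it prints:
<?php
// Call the remote script and capture its output instead of echoing it.
$ch = curl_init('http://www.url.com/script.php?arg1=blabla');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response as a string
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang forever on a slow remote server
$result = curl_exec($ch);
if ($result === false) {
    error_log('Remote call failed: ' . curl_error($ch));
}
curl_close($ch);
?>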
Usually, when I want to redirect from one PHP page to another PHP page of the same project, I use
header("location:somepage.php");
This causes an extra round trip between client and server. What I want to do, instead of sending a redirect header, is to stop execution of the requested page and pass the request object (or the request information) to the page I want to redirect to, so that a single request is enough. I believe this kind of functionality is available in JSP. Is the same thing available in PHP?
As @DanSherwin commented, you probably want to use include. You might do something like this:
firstpage.php:
if (/* some condition when you want to do a redirect */) {
    include 'somepage.php';
    exit;
}
This runs the code from somepage.php immediately, as though it was cut and pasted into firstpage.php**, and then it exits right afterward as though you redirected away from firstpage.php.
** caveat: watch out for variable scope.
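To illustrate the scope caveat with a hypothetical sketch: if the include happens inside a function, variables defined in somepage.php exist only in that function's local scope, and globals that somepage.php expects may not be visible:
<?php
// firstpage.php (sketch): the include runs inside forward()'s local scope.
function forward()
{
    include 'somepage.php'; // variables somepage.php defines live only inside forward()
}
forward();
// echo $title; // undefined here even if somepage.php set $title at its top level
?>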
I am experiencing some very strange behavior when including a php file.
I need to load a script that is not on the same domain as the page that will be calling it.
I have already created a system that works using cURL, but I just recently found out that many of the sites that will need to have access to this script do not have cURL installed.
I did, however, notice that these sites have allow_url_fopen set to on. With this knowledge I got started creating a new system that would let me just include the script on the remote site.
Just testing this out, I coded the script test.php as follows:
<?php
echo("test");
?>
I include this script on the remote page using:
<?php
include("http://mydomain.com/script.php");
?>
and it works without a problem: "test" is printed at the top of the page.
However, if I add a function to the script and try to call the function from the page, it crashes.
To make it worse, this site has PHP errors turned off and I have no way of turning them on.
To fully make sure that I didn't just mess up the code, I made my test.php look like this:
<?php
function myfunc()
{
return "abc";
}
?>
Then on the page including the file:
<?php
include("http://mydomain.com/script.php");
echo(myfunc());
?>
And it crashes.
Any ideas would be greatly appreciated.
This is not odd behavior. Since you load the file over the internet (over HTTP in this case), the file is interpreted by the remote server before its output is sent to your include call.
Since the script is interpreted, no functions will be visible, only the script's output.
Either load it over FTP or create an API for the functions.
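As a sketch of the API route (the endpoint name api.php is made up for illustration), the remote script could expose the function's result as output, for example JSON, and the calling site could fetch and decode it, since allow_url_fopen is available there:
<?php
// On mydomain.com, e.g. api.php (hypothetical endpoint): run the function and emit its result.
function myfunc()
{
    return "abc";
}
header('Content-Type: application/json');
echo json_encode(array('result' => myfunc()));
?>
Then, on the site that previously tried to include the source:
<?php
// Fetch the endpoint's output over HTTP and decode it.
$response = file_get_contents('http://mydomain.com/api.php');
$data = json_decode($response, true);
echo $data['result']; // prints "abc"
?>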
My guess: The PHP of http://mydomain.com/script.php is interpreted by the web server of mydomain.com. All you're including is the result of that script. For a simple echo("test"), that's "test". Functions do not produce any output and are not made available to the including script. Confirm this by simply visiting http://mydomain.com/script.php in your browser and see what you get. You would need to stop mydomain.com from actually interpreting the PHP file and just returning it as pure text.
But: this sounds like a bad idea to begin with. Cross-domain includes are an anti-pattern. Not only do they open you up to security problems, they also make every page load unnecessarily slow. If a cross-domain include is the answer, the question is wrong.
You are including the client-side output of test.php rather than the server-side source code. Rename test.php to test.phpc to prevent the server from executing the script; however, this is dangerous from a security point of view.
My web hosting provider does not permit use of the cURL FOLLOWLOCATION option, so I'm trying to
do it manually by using the header function.
My problem is that I need to keep my PHP script running and to be able to get the redirected URL data for parsing.
How do I do that?
Technically the PHP script continues running after the header() function is called. How you get the URL data is another question. Can you not use file_get_contents() or readfile() on the URL?
You read the RAW data the request returns, check for the redirect header(s), fetch the related URL(s), and do a new GET with that URL (lather, rinse, repeat). As simple as that...
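A minimal sketch of that loop using cURL, assuming the host allows CURLOPT_HEADER even though it blocks FOLLOWLOCATION (the function name and hop limit are just illustrative):
<?php
// Follow redirects by hand: fetch, look for a Location header, repeat with the new URL.
function fetch_following_redirects($url, $maxHops = 5)
{
    for ($i = 0; $i < $maxHops; $i++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HEADER, true); // include the response headers in the output
        $raw = curl_exec($ch);
        $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
        $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);

        $headers = substr($raw, 0, $headerSize);
        $body = substr($raw, $headerSize);

        // A 3xx status with a Location header means: repeat with the new URL.
        if ($status >= 300 && $status < 400 && preg_match('/^Location:\s*(\S+)/mi', $headers, $m)) {
            $url = trim($m[1]);
            continue;
        }
        return $body; // final page, ready for parsing
    }
    return false; // gave up: too many redirects
}
?>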
Alternatively, check the curl_setopt documentation in the PHP reference manual; the comments at the bottom of that page cover ways to solve this exact problem.
I have a very similar setup to the person here:
PHP Background Processes
i.e. a very long script that takes up to 10 minutes. However, I need the person who calls the script to be redirected back to the homepage while the script works. In other words, I need the user experience to be something like this:
click the update button
script begins to execute, can take up to 10 minutes
user is redirected back to the home page immediately
Is this possible using only PHP? I gather I will need
ignore_user_abort(true);
set_time_limit(0);
But how do I redirect the user? I can't use JavaScript because output only gets flushed to the page at long intervals, and I want the redirection to be immediate. Can I use headers? Or will that mess with things?
Alternatively, I could use the cron job approach, but I have zero experience in making a cron job or having it run PHP code (is that even possible?)
Thanks,
Mala
Update:
Using headers to redirect does not work - the page will not load until the script is done. However, eventually the webserver times out and says "Zero-Sized Reply: The requested URL could not be retrieved" (although the script continues running). I guess my only option is to go with the cron job idea. Ick!
The most obvious solution to me would be splitting the redirect and the background calculation into two separate files and letting the redirect script launch the 10-minute script:
job.php:
<?php
// do the nasty calculation here
redirect.php:
<?php
// start the script and redirect output of the script to nirvana, so that it
// runs in the background
exec ('php /path/to/your/script/job.php >> /dev/null 2>&1 &');
// now the redirect
header('Location: /index.php');
Assumptions for this to work: you should be on a Linux host with either safe_mode disabled or safe_mode_exec_dir set appropriately. When you're running under Windows, the exec string needs to be adapted, while the notes about safe_mode still apply.
Note: when you need to pass arguments to the script, run them through escapeshellarg() before passing them on; see also the PHP manual on exec.
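For example (a sketch; the user_id parameter is made up for illustration), passing a user-supplied value to the background job safely might look like this:
<?php
// redirect.php (sketch): quote the argument before handing it to the shell.
$userId = isset($_GET['user_id']) ? $_GET['user_id'] : '';
$cmd = 'php /path/to/your/script/job.php ' . escapeshellarg($userId) . ' > /dev/null 2>&1 &';
exec($cmd);
header('Location: /index.php');
exit;
?>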
I've tried several methods and none seems to work; I've even tried register_shutdown_function(), but that also failed. I guess you're stuck with making a cron job.
I just remembered something (but I haven't tested it); you could try something like this:
set_time_limit(0);
ignore_user_abort(true);
ob_start(); // not sure if this is needed
// meta refresh or javascript redirect
ob_flush(); // not sure if this is needed
flush();
// code to process here
exit();
Not sure if it'll work but you can try it out.
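A filled-in version of that idea, equally untested and offered only as a sketch (the meta refresh target is a placeholder):
<?php
set_time_limit(0);       // let the long job run past the usual time limit
ignore_user_abort(true); // keep running even if the browser disconnects

ob_start();
// Tell the browser to navigate away immediately.
echo '<meta http-equiv="refresh" content="0;url=/index.php">';
ob_end_flush();
flush(); // try to push the redirect out to the client right now

// ... the long processing happens here ...
exit;
?>
Whether the client actually receives that output before the processing finishes depends on server-level buffering, which would be consistent with the questioner's update that header-based redirects only arrived once the script was done.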
I have a similar situation with processing logins.
To keep it short...
I get a PDT and an IPN, and each sends me a logging email.
An email is sent to the client on IPN VERIFIED to give the serial number and password to the client.
In both the PDT and IPN scripts I use goto to send myself a logging email instead of a bunch of sequential ifs.
After reading many answers, I studied each to figure out what would suit my issue.
I finally used...
<?php
ignore_user_abort(TRUE); // at very top
As I worked through the progressive checks (rather than a chain of nested ifs), when one failed I used, for example...
$mcalmsg .= "Check [serialnbr]\r\n";
if (empty($_POST['serialnbr']))
{ header('Location: '.$returnurl.'?error=1');
$mcalmsg .= "Missing [serialnbr]\r\n";
goto mcal_email; // Last process at end of script
}
else
{$serialnbr=strtoupper(htmlspecialchars(trim($_POST['serialnbr'])));
$mcalmsg .= "[serialnbr]=$serialnbr\r\n";
}
This (so far) is working just as needed.
Of course there is more in the script, but it follows the same concept.
Where this says Location, there are also 3 information pages that can each be displayed using the same concept.
mcal_email: // last process before ending; execution always gets here, either via goto or after clearing all checks
// compose email and send
?> // end of script
Why not try the header approach and see what happens? A call to PHP's header() function might do the trick. I would use trial and error to see what solves your problem.
Preamble: my app is mod_rewrite enabled, and I have an index.php page that downloads various pages based on the REQUEST_URI and prints them as page content.
Problem: the file() or file_get_contents() function is excellent at downloading my other app pages, but as soon as I try to use it to download a page that is session enabled, I start having problems.
The main problem is that when I try to pass the existing session id in the URL of the page I download, e.g.
$url = "http://localhost/EmplDir/AdminMenu.php";
return implode('',file($url. "&" . session_name() . "=". session_id()));
My page never loads (or file() never loads content).
I suspect I should use the cURL functions here, but cURL has too many options.
Advice on which cURL options to use to make the downloaded pages aware of the current PHP session would be helpful.
P.S. The above seems to be true both for Windows and Linux.
You didn't separate the query string from the rest of the URL with a ?
Try
return file_get_contents($url. "?" . session_name() . "=". session_id());
You will also need to be sure the server doesn't have the session.use_only_cookies configuration setting enabled.
There's no reason why the script shouldn't see the query string and act on it; you can persuade yourself by writing a script which just does var_dump($_GET) and requesting that as above. If you see the query arguments in the output, then you simply need to debug your script to see why it doesn't behave as expected given the session id.
NOTE: I'm assuming that you want to request a file from the same domain as your application; otherwise, using your session id for a remote site doesn't make much sense.
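For example, a throwaway test script (the name dumpget.php is arbitrary) to confirm the query arguments arrive:
<?php
// dumpget.php (hypothetical): shows exactly which query arguments this request carried.
var_dump($_GET);
?>
and then request it the same way:
<?php
echo file_get_contents('http://localhost/EmplDir/dumpget.php?' . session_name() . '=' . session_id());
?>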
If your script doesn’t alter any superglobal variables, you could just include it:
ob_start();
include $_SERVER['DOCUMENT_ROOT'].'/EmplDir/AdminMenu.php';
return ob_get_clean();
session_name() and session_id() give you the current script's session, not the remote server's. You need to use something that understands HTTP. cURL would do, or you can use something like SimpleBrowser, which completely emulates a browser.
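A sketch of the cURL route (assuming both scripts run on the same host and share the session store): forward the current session cookie with the request, and call session_write_close() first so the sub-request isn't blocked waiting on the session file lock, which is a common reason such self-requests appear to hang:
<?php
$url = "http://localhost/EmplDir/AdminMenu.php";

// Release our lock on the session file so AdminMenu.php can open the same session.
session_write_close();

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Send the session id as a cookie rather than in the query string.
curl_setopt($ch, CURLOPT_COOKIE, session_name() . '=' . session_id());
$content = curl_exec($ch);
curl_close($ch);

return $content;
?>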