I'm using a PHP script to detect whether a referral URL is a proxy. This is a very simplified version, but it works great.
The problem is that I'm trying to use the same script on my other web server, but for various reasons I am not copying the script over. What I'm doing instead is using file_get_contents.
My problem is that when I use file_get_contents, it detects the request as a proxy. Is there any way around this, possibly by changing the port?
<?php $stop = file_get_contents("http://mysite.com/file.php"); echo $stop; ?>
Any help would be great, Thanks!
file_get_contents with a remote URL is very different from a local URL -- you are actually running the script on mysite.com and simply getting the output of that script on your local server. This actually sends another HTTP request to mysite.com, so the referrer for that request is different from the referrer for your original request.
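If the proxy check in file.php relies on the referrer, one workaround is to forward the visitor's original referrer along with the request. A minimal sketch using a stream context; note that the remote script will still see your second server's IP as the client, so this only helps if the detection is referrer-based:
<?php
// Forward the visitor's original Referer header to the remote script,
// so mysite.com sees the same referrer the first request carried.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$context = stream_context_create(array(
    'http' => array(
        'header' => "Referer: $referer\r\n",
    ),
));
$stop = file_get_contents("http://mysite.com/file.php", false, $context);
echo $stop;
?>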
Related
Server has nginx and php on it.
When I try to echo file_get_contents() with any external website as input, everything works fine.
When I try to echo file_get_contents() with the URL of a page that is on the same server, I get a timeout.
I tried using cURL as well. It doesn't work either; it also times out.
Check the access limitations on your network! That is how I solved my own case.
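If the timeout happens because the server cannot reach its own public hostname (common behind NAT or a strict firewall), one workaround is to request the page over the loopback interface and pass the Host header explicitly. A hedged sketch, where www.example.com and page.php stand in for your real host and path:
<?php
// Ask the web server on this same machine for the page, telling it
// which virtual host we want via the Host header.
$context = stream_context_create(array(
    'http' => array(
        'header'  => "Host: www.example.com\r\n",
        'timeout' => 5, // fail fast instead of hanging
    ),
));
echo file_get_contents('http://127.0.0.1/page.php', false, $context);
?>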
The URL in question : http://www.roblox.com/asset/?id=149996624
When accessed in a browser, it will correctly download a file (which is an XML document). I wanted to get the file in php, and simply display its contents on a page.
$contents = file_get_contents("http://www.roblox.com/asset/?id=149996624");
The above is what I've tried using (as far as I know, the page does not expect any headers). I get a 500 HTTP error. However, in Python, the following code works and I receive the file.
r = requests.get("http://www.roblox.com/asset/?id=147781188")
I'm confused as to what the distinction is between how these two requests are sent. I am almost 100% sure it is not a header problem. I've also tried the cURL library in PHP to no avail. Nothing I've tried in PHP succeeds with the URL (with any valid id parameter), but Python succeeds effortlessly.
Any insight as to why this issue may be happening would be great.
EDIT : I have already tried copying Python's headers into my PHP request.
EDIT2 : It also appears that there are two requests happening upon navigating to the link.
Is this on a Linux/Mac host by chance? If so, you could use ngrep to see the differences in the requests themselves on the wire. Something like the following should work:
ngrep -t '^(GET) ' 'src host 127.0.0.1 and tcp and dst port 80'
EDIT - The problem is that the remote server is responding with a 302 and your PHP request is not following it automatically. Cheers!
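If the 302 is indeed the problem, telling cURL to follow redirects (and, just in case, sending a browser-like User-Agent, since some hosts reject PHP's default one) should be enough. A minimal sketch against the URL from the question:
<?php
$ch = curl_init('http://www.roblox.com/asset/?id=149996624');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);     // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);     // follow the 302 redirect
curl_setopt($ch, CURLOPT_MAXREDIRS, 5);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0'); // assumption: a browser-like UA may be required
$contents = curl_exec($ch);
if ($contents === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
echo $contents;
?>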
$output = file_get_contents("http://www.canadapost.ca/cpc2/addrm/hh/current/indexa/caONu-e.asp");
var_dump($output);
HTTP 505 Status means the webserver does not support the HTTP version used by the client (in this case, your PHP program).
What version of PHP are you running, and what HTTP/Web package(s) are you using in your PHP program?
[edit...]
Some servers deliberately block some browsers -- your code may "look like" a browser that the server is configured to ignore. I would particularly check the user agent string that your code is passing along to the server.
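One way to test that theory with file_get_contents is to send a browser-like User-Agent via a stream context (the UA string below is just an example):
<?php
$context = stream_context_create(array(
    'http' => array(
        'header' => "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64)\r\n",
    ),
));
$output = file_get_contents(
    "http://www.canadapost.ca/cpc2/addrm/hh/current/indexa/caONu-e.asp",
    false,
    $context
);
var_dump($output);
?>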
Check in your PHP installation (php.ini file) whether allow_url_fopen is enabled.
If not, any call to file_get_contents with a remote URL will fail.
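You can also check it from a script instead of digging through php.ini, for example:
<?php
// "1" means remote URLs are allowed in file_get_contents()
var_dump(ini_get('allow_url_fopen'));
?>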
It works fine for me.
That site could be blocking the server that you're using to access it.
When you run the URL from your browser, your own ISP is used to get the information and display it in your browser. But when you run it from PHP, the ISP of your web host is used to get the information, which is then passed back to you.
Maybe you can do this to check and see what kind of headers it's returning for you:
$headers=get_headers("http://www.canadapost.ca/cpc2/addrm/hh/current/indexa/caONu-e.asp");
print_r($headers);
I have an SSL certificate on my web site. When images on the page are loaded from another site, the browser shows warnings along the lines of "the page contains both secure and nonsecure items", so you have to press OK or you see a "broken" SSL indicator in the browser. One way to avoid those warnings would be to use an http page instead of https, correct?
But, as far as I know, there is another way to get rid of those warnings using PHP or just JavaScript: I believe the images are downloaded to a temporary folder on my server and served as https images at the same time.
Could anybody tell me the best way to do that?
Browsing the forum didn't help me a lot.
Thank you.
So, how do I load
<?php echo '<img src="http://www.not_my_site.com/image.jpg" alt="">'; ?>
with no warnings on my page https://my_site.com/index.php ?
You cannot suppress the warning, as it's a browser thing.
The only way would be to wrap those calls using an https call on your site. Something like:
<?php echo '<a href="https://my_site.com/external.php?resource=http://www.not_my_site.com/image.jpg" alt="">'; ?>
You will have to write the external.php script to make the request on the client's behalf and then return the content over your existing SSL connection. You only NEED to do this for external HTTP-only resources. (A sketch of such a script follows the steps below.)
The process would work as follows:
1. The end user's web browser makes an HTTPS request to your external.php script.
2. Check for a saved copy of the resource. If you've got it cached, skip to step 6, returning the cached resource.
3. Your server forwards the call on to the HTTP resource specified in the resource parameter.
4. The remote server responds to the request.
5. Save a copy of the resource for caching.
6. Your external.php script then returns that response over the SSL connection.
The web browser only makes one request; your web server just has to make an additional one.
This is the only way you'll be able to get rid of the message.
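A minimal sketch of what such an external.php could look like. The whitelist, the cache folder, and the hard-coded content type are assumptions, and a real version needs stricter validation so the script can't be abused as an open proxy:
<?php
// external.php -- fetch an external HTTP resource and serve it over this site's HTTPS.
$allowedHosts = array('www.not_my_site.com');       // assumption: only proxy known hosts
$resource     = isset($_GET['resource']) ? $_GET['resource'] : '';
$host         = parse_url($resource, PHP_URL_HOST);

if (!in_array($host, $allowedHosts, true)) {
    http_response_code(403);
    exit('Resource not allowed');
}

$cacheFile = __DIR__ . '/cache/' . md5($resource);   // assumption: a writable cache/ folder exists
if (!file_exists($cacheFile)) {
    $data = file_get_contents($resource);            // steps 3-4: fetch from the remote server
    if ($data === false) {
        http_response_code(502);
        exit('Could not fetch resource');
    }
    file_put_contents($cacheFile, $data);            // step 5: save a cached copy
}

header('Content-Type: image/jpeg');                  // assumption: adjust to the real content type
readfile($cacheFile);                                // step 6: return it over the SSL connection
?>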
It looks even simpler to retrieve the image another way: use cURL to download the image file indirectly.
It happens because you're making non-secure (HTTP) calls from a secured page (HTTPS).
Try changing your code to:
<?php echo '<img src="https://www.not_my_site.com/image.jpg" alt="">'; ?>
I tried using cURL to post to a local file and it fails. Can it be done? My two management systems are on the same server, and it seems unnecessary to have the request traverse the entire internet just to reach a file on the same hard drive.
Using localhost didn't do the trick.
I also tried posting the data to $_SERVER['DOCUMENT_ROOT'] . '/dir/to/file.php'. It's for an API that is encrypted, so I'm not sure exactly how it works. It's for a billing system I have, and I just realized that it sends data back (an API).
It's simply POST data and an XML response. I could write an HTML form tag and input fields and get the same result, but there isn't really anything else to know.
The main question is: Does curl have the ability to post to a local file or not?
"it is post data. it's for an API that is encrypted so i'm not sure exactly how it works"
Without further details nobody can tell you what you should do.
But if it's indeed a POST receival script on the local server, then you can send a POST request to it using the URL:
$url = "https://$_SERVER[SERVER_NAME]/path/to/api.php";
And then receive its output from the cURL call.
$data = curl($url)->post(1)->postdata(array("billing"=>1234345))
->returntransfer(1)->exec();
// (you would use the cumbersome curl_setopt() calls instead)
So you get a XML or JSON or whatever response.
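For completeness, the same call spelled out with the actual curl_setopt() options might look like this (the URL and the "billing" field are taken from the pseudocode above):
<?php
$url = "https://$_SERVER[SERVER_NAME]/path/to/api.php";

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('billing' => 1234345));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch);

if ($data === false) {
    die('cURL error: ' . curl_error($ch));
}
curl_close($ch);
// $data now holds the XML/JSON response from api.php
?>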
If they're on the same drive, then use file operations instead:
file_put_contents('/path/to/the/file', $contents);
Using cURL should only be done if you absolutely NEED the HTTP layer to get involved for some reason, or if you're dealing with a remote server. Using HTTP would also mean the 'target' script has to be able to handle a file upload plus whatever other data you need to send, and then that script would end up having to do file operations ANYWAY, so in effect you've gone on a round-the-world flight just to move from your living room to the kitchen.
file://localfilespec.ext worked for me. I had two files in the same folder on a Linux box, in a folder that is not served by my webserver, and I used the file:// wrapper to post to file://test.php and it worked great. It's not pretty, but it'll work for dev until I move it to its final resting place.
Does curl have the ability to post to a local file or not?
To curl a local file, you need to set up an HTTP server, as file:// won't work, so:
npm install http-server -g
Then run the HTTP server in the folder where the file is:
$ http-server
See: Using node.js as a simple web server.
Then test the cURL request from the command line against localhost, for example:
curl http://127.0.0.1:8081/file.html
Then you can do the same in PHP.
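A sketch of the equivalent request in PHP, using the same host and port as the command-line test above:
<?php
$ch = curl_init('http://127.0.0.1:8081/file.html');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Add CURLOPT_POST and CURLOPT_POSTFIELDS here if you need to send POST data.
$response = curl_exec($ch);
curl_close($ch);
echo $response;
?>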