Selective 302 redirect - php

I am making cURL calls from PHP scripts on one domain (mac2cash.com) to another (thebookyard.com), both hosted on the same Apache server and the same IP address. This has been working fine, but I need to add some new functionality to the site, so I created a new PHP script at the root level of the same target domain as the working cURL call. When I call this new script using the same code I used for the working script, it returns the message "Found: the document has moved here".
The target scripts for the working and failing cURL calls are both at the root level of the same domain, and I have checked that they have the same Unix permissions. But if I simply change the PHP file name in the working call to the name of the target script in the failing call, that now fails too, with the same 302 redirect message.
I even duplicated the 'working' target script (byasd_api.php) on the target domain to a new file (byasd_api_copy.php), and I get the 302 message if I make a cURL call to that from the calling script that was working, even though the code is exactly the same!
I cannot see what the difference is between the two files. Is there some kind of caching going on where newly created files are not being treated the same?
For reference, here is the calling code:
$header = array("Host: thebookyard.com");
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, HTTP_SERVER_IP."/byasd_api.php");
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_REFERER, 'http://www.mac2cash.com');
curl_setopt($ch, CURLOPT_POST, true); // CURLOPT_POST expects a boolean, not a field count
curl_setopt($ch, CURLOPT_POSTFIELDS, $post_data);
$output = curl_exec($ch);
curl_close($ch);
The 'byasd_api.php' script name is the only thing I am changing.
I've spent some hours googling for a solution so would appreciate any suggestions.

Your Apache may be configured to request favicon.ico on each call, and the 302 is returned because the file is not found:
GET http://theboo....com/favicon.ico [HTTP/1.1 302 Found 151ms]
Change the configuration or add a favicon.ico file. The configuration may only be looking for the file in the document root.

It turns out the reason for the difference in behaviour was that the working script name was included as a rewrite condition in the .htaccess file that redirects HTTP to HTTPS. Changing the cURL URL to "https://".HTTP_SERVER_IP."/byasd_api.php" stopped the "Found: the document has moved here" error, but the call then failed because cURL was trying to validate the SSL certificate against the IP address rather than the domain.
The solution to that was to add the following:
curl_setopt($ch, CURLOPT_RESOLVE, array("www.thebookyard.com:443:".HTTP_SERVER_IP,));
This still allows the call to go to the IP address (which is much faster than going via the domain name), but cURL validates the SSL certificate against the domain name.
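For completeness, a minimal sketch of the whole call using CURLOPT_RESOLVE; the IP value assigned to HTTP_SERVER_IP here is a placeholder, not the real server address:

```php
<?php
// Hypothetical server IP; in the question this comes from an
// existing HTTP_SERVER_IP constant.
define('HTTP_SERVER_IP', '203.0.113.10');

// Build a curl handle that connects directly to a known IP while
// still validating the TLS certificate against the domain name.
function build_resolved_handle(string $host, string $ip, string $path)
{
    $ch = curl_init();
    // The URL uses the domain, so certificate validation matches...
    curl_setopt($ch, CURLOPT_URL, "https://{$host}{$path}");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    // ...but the host:port pair is pinned to the IP, skipping DNS.
    curl_setopt($ch, CURLOPT_RESOLVE, array("{$host}:443:{$ip}"));
    return $ch;
}

$ch = build_resolved_handle('www.thebookyard.com', HTTP_SERVER_IP, '/byasd_api.php');
// $output = curl_exec($ch); // actual network call omitted here
curl_close($ch);
```

Note the difference from the original code: instead of putting the IP in the URL and forcing a Host header, the domain goes in the URL and CURLOPT_RESOLVE pins the connection to the IP.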

Related

Problem with getting source code of a web page using PHP curl

I have absolutely no problem getting the source code of the web page on my local server with this:
$html = file_get_contents('https://opac.nlai.ir');
And I was also fine on my host using the code below, until just a few days ago:
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, 'http://opac.nlai.ir');
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 10);
$result = curl_exec($curl);
But today I figured out that the site now uses SSL: it no longer works over http and forces a redirect to https. So I did some searching and found this suggested fix:
curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($curl, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
The code above works just fine for sites like "https://google.com" (and any other https site I've tried!),
but not for that specific website ("https://opac.nlai.ir").
In that case, the page takes about a minute to load (!), and var_dump($result) finally prints "bool(false)".
I don't know how that website differs from other websites, and I really want to know what causes the problem.
Sorry for my English.
Just filing this answer for the record.
As said in the question comments, the website you were trying to read was blocking your requests based on the location of the server issuing them. It seems it only responds to requests from specific IP locations.
The solution verified by @Hesam was to run cURL through a proxy whose IP is in an allowed location, and he found at least one that works well.
He followed the instructions found in this other SO post:
How to use cURL via a proxy
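A minimal sketch of routing a cURL request through a proxy; the proxy address and credentials below are hypothetical placeholders, not values from the original post:

```php
<?php
// Fetch a URL through an HTTP proxy. The proxy host/port and
// credentials are placeholder examples.
function fetch_via_proxy(string $url, string $proxy, ?string $auth = null)
{
    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, $url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($curl, CURLOPT_PROXY, $proxy);            // e.g. '198.51.100.7:8080'
    if ($auth !== null) {
        curl_setopt($curl, CURLOPT_PROXYUSERPWD, $auth);  // 'user:password'
    }
    $result = curl_exec($curl);
    curl_close($curl);
    return $result;
}

// $html = fetch_via_proxy('https://opac.nlai.ir', '198.51.100.7:8080');
```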

Cannot identify local issuer

I'm getting this error:
SSL problem: can't identify local issuer
once I call the function to save an image or retrieve user's Facebook image:
file_get_contents()
What I've done is:
I have my website running on Azure *.azurewebsites.com
I added my custom domain and ssl certificate which were both bought from GoDaddy
I created a Certificate Signing Request (CSR), which GoDaddy asked for, using OpenSSL on my Mac
GoDaddy signed it, and I downloaded the result to get the .p7b file and .crt file
I added the .crt file to Azure and everything works fine; my custom domain now has the lock beside it
After those steps, logging in with Facebook also produced that error, so I applied a temporary fix:
curl_setopt($rest, CURLOPT_SSL_VERIFYPEER, false);
This is not recommended of course, but it allows me to test the rest of the site. The error still occurred but only when invoking file_get_contents(). I've tried these fixes from what I've seen scouring around:
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($ch, CURLOPT_CAINFO, getcwd() . "/cacert/cacert.pem");
But no dice. Can someone explain exactly what the error means and how to combat it? That'd be great. Also, this might be due to creating my Certificate Signing Request with OpenSSL... not sure. Please confirm.
curl_setopt() flags apply only to the handle you pass into those calls. file_get_contents() has no idea about CURLOPT_SSL_VERIFYPEER, CURLOPT_SSL_VERIFYHOST or any other cURL flag.
Change the file_get_contents() calls to cURL calls.
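A sketch of such a replacement, reusing the CA-bundle idea already tried in the question; the cacert.pem location and the Facebook URL are assumed examples:

```php
<?php
// Equivalent of file_get_contents($url), but via cURL so that the CA
// bundle can be pointed at explicitly. The cacert.pem path is a
// hypothetical example location.
function fetch_with_ca(string $url, string $caBundle)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);      // keep verification on
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);         // check cert matches host
    curl_setopt($ch, CURLOPT_CAINFO, $caBundle);         // e.g. __DIR__.'/cacert/cacert.pem'
    $body = curl_exec($ch);
    curl_close($ch);
    return $body;
}

// $image = fetch_with_ca('https://graph.facebook.com/me/picture', __DIR__.'/cacert/cacert.pem');
```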

Drupal page sending a request (curl) to a subdomain site: fails while same url works from client

I'm trying to connect two of our websites to each other. The first one sends a request to the second, which should reply to it. However, when doing this from the backend it fails, while from the client it works fine.
A login is required, and its parameters are sent with the request. From the debug log I can see that cURL follows the various redirects returned by the other site, but it always ends up at the login page.
Is this related to cookies, or something else? How can the backend obtain a cookie so it can behave as logged in? Or can I reuse other cookies under the same domain?
I've tried to use these configurations along with different temp files and variables:
curl_setopt($ch, CURLOPT_COOKIEJAR, $ckfile);
curl_setopt($ch, CURLOPT_COOKIEFILE, $ckfile);
but these always fail with the error:
curl_setopt() expects parameter 1 to be resource, null given.
Please check this:
If the cookie is generated by a script, then you can send that cookie manually along with the cookies from the file (using the cookie-file option). For example:
Sending a manually set cookie:
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Cookie: test=cookie"));
Sending cookies from a file:
curl_setopt($ch, CURLOPT_COOKIEFILE, $ckfile);
In this case cURL will send your defined cookie along with the cookies from the file.
If the cookie is generated through JavaScript, then you have to trace how it is generated, and then you can send it using the method above (through the HTTP header).
The __utma, __utmc and __utmz cookies appear when the cookies were captured from a browser such as Firefox; you shouldn't need to worry about them.
Finally, the way you are doing it is alright. Just make sure you use absolute paths for the file names (i.e. /var/dir/cookie.txt) instead of relative ones.
Always enable verbose mode when working with cURL; it will help you a lot in tracing requests, and it will save you a lot of time.
curl_setopt($ch, CURLOPT_VERBOSE, true);
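Putting the cookie-jar pieces together, here is a sketch of a login-then-fetch sequence; the URLs and form field names are hypothetical placeholders:

```php
<?php
// Log in once, storing the session cookies in a jar file, then reuse
// that jar for the authenticated request. URLs and form fields are
// placeholder examples.
$ckfile = tempnam(sys_get_temp_dir(), 'cookies');

// Step 1: log in; response cookies are written to $ckfile.
$ch = curl_init('https://example.com/user/login');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'name' => 'user',
    'pass' => 'secret',
)));
curl_setopt($ch, CURLOPT_COOKIEJAR, $ckfile);   // write cookies here on close
curl_setopt($ch, CURLOPT_COOKIEFILE, $ckfile);  // and read them back
// curl_exec($ch); // actual network call omitted
curl_close($ch);

// Step 2: make the authenticated request with the same jar.
$ch = curl_init('https://example.com/protected/page');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, $ckfile);
// $page = curl_exec($ch);
curl_close($ch);
```

Note that the handle is created with curl_init() before any curl_setopt() call; passing a null handle is exactly what produces the "expects parameter 1 to be resource, null given" error from the question.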

PHP Send Custom HTTP Request

How can I send a custom HTTP Request to a server whose URL is "http://[IP]:[Port]/"?
What I mean is that, instead of the first line being a normal GET or POST like so:
GET /index.html HTTP/1.1
Host: www.example.com
How can this be replaced with something just like:
CUSTOM
Host: [IP]
I don't mind having to use any additional libraries if necessary such as cURL.
UPDATE:
I've tried using the following cURL code:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://[IP]:[Port]/");
curl_setopt($ch, CURLOPT_PORT, [Port]);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
//curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "CUSTOM");
$output = curl_exec($ch);
curl_close($ch);
print($output);
But it just keeps loading for 2 minutes until it reports an internal error (both with and without CURLOPT_CUSTOMREQUEST). However, if I use a standard website such as http://google.com, it works fine.
Also, I forgot to mention: the port my server is using is 7899. Is this OK? Opening it in my web browser works fine, but cURL doesn't seem to be able to reach it.
Looks like there's nothing wrong with your code. If you're using a shared hosting provider, ask them to open up outbound TCP port 7899.
You can register a custom HTTP request method using the http_request_method_register function (note that this is part of the PECL pecl_http extension, not core PHP):
http://www.php.net/manual/en/function.http-request-method-register.php
This registration needs to be done on the server that will be handling the request. If you send a non-standard HTTP request method to any old server, it may ignore it or return an error code, depending on the server.
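On the client side, the commented-out CURLOPT_CUSTOMREQUEST line from the question's update is the right tool. A sketch, with the IP and port as placeholders:

```php
<?php
// Send a request whose request line reads "CUSTOM / HTTP/1.1" instead
// of GET or POST. The IP and port are hypothetical placeholders.
function send_custom_method(string $ip, int $port, string $method)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, "http://{$ip}:{$port}/");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // fail fast if the port is blocked
    // Replace only the method token in the request line; everything
    // else about the request is built as usual.
    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, $method);
    $output = curl_exec($ch);
    curl_close($ch);
    return $output;
}

// $response = send_custom_method('192.0.2.5', 7899, 'CUSTOM');
```

A two-minute hang followed by an error is consistent with the outbound port being blocked, which is why setting a short connect timeout (as above) makes the failure mode easier to diagnose.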

PHP curl and file_get_contents returns the main website located on the server when I enter a valid URL without DNS resolution

When I call the following:
file_get_contents('http://whgfdw.ca');
or
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://whgfdw.ca');
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_exec($ch);
The return value is the HTML of the homepage of the main site located on the local (dedicated) webserver. I would be grateful if anyone could help me understand this? Perhaps there is something I'm overlooking.
I can't use simple URL validation because http://whgfdw.ca is a perfectly fine URL; it just doesn't have a DNS entry.
My ideal functionality is to be able to catch a DNS lookup failure or a 404 or a case of no content and then act on it. Thanks!
If you got a valid response, then that DNS entry exists somewhere. It may be on an internal DNS server, in the /etc/hosts file of the local server, or somewhere else in the stack, but the bottom line is that it is being resolved in some way. So the question becomes: where is the entry it resolves to configured? It's possible that something is set to resolve all lookups to the local server (similar to how OpenDNS and many ISPs resolve unresolved DNS names to their own search page).
Given that it is somehow being resolved, there really isn't a way to validate it unless you compare the content of the response to content you expect. Catching a 404 is pretty easy, and I believe you can also do your own DNS lookup in PHP to catch unresolved names. But you need to tackle that resolution first, I should think.
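A sketch of the kind of pre-check suggested above, using PHP's built-in checkdnsrr() for the DNS lookup and curl_getinfo() for the status code; the hostname is the one from the question:

```php
<?php
// Check whether a hostname has a public DNS record before fetching it,
// then inspect the HTTP status code of the response.
function fetch_if_resolvable(string $host)
{
    // checkdnsrr() queries DNS directly, so it can catch names that
    // only "resolve" via a local catch-all entry.
    if (!checkdnsrr($host, 'A') && !checkdnsrr($host, 'AAAA')) {
        return null; // no DNS entry: treat as a lookup failure
    }

    $ch = curl_init("http://{$host}");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($body === false || $body === '' || $status === 404) {
        return null; // transport failure, empty content, or 404
    }
    return $body;
}

// var_dump(fetch_if_resolvable('whgfdw.ca'));
```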
