.htaccess password is blocking cURL - PHP

I have locked my development environment with an .htaccess password.
I'm now working on a script that makes a cURL request to that .htaccess-protected folder, and it doesn't work. When I remove the .htaccess protection it works fine.
Is there a way to block other user agents, like GoogleBot and ordinary browser requests, but allow cURL?

You can define the HTTP Auth username and password like this:
curl -u username:password http://...
That way you don't have to disable HTTP auth when accessing the folder from a browser, and your script can still get in.
EDIT: If you are working with PHP's cURL functions, you can also set it like this:
curl_setopt($ch, CURLOPT_USERPWD, "$username:$password");
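For completeness, a minimal sketch of the full request (the URL and credentials are placeholders, not from the original post):
// Sketch: fetch a basic-auth-protected URL with PHP cURL.
// The URL and credentials below are placeholders.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://dev.example.com/protected/");
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);      // .htaccess auth is usually Basic
curl_setopt($ch, CURLOPT_USERPWD, "$username:$password");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);          // return the body instead of printing it
$response = curl_exec($ch);
if ($response === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);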

Related

Selective 302 redirect

I am making cURL calls from PHP scripts on one domain (mac2cash.com) to another (thebookyard.com), both hosted on the same Apache server and the same IP address. This has been working fine, but I needed to add some new functionality, so I created a new PHP script at the root level of the same target domain as the working cURL call. When I call this new script using the same code I used in the working script, it returns the message "Found: the document has moved here".
The target scripts for the working and failing cURL calls are at the root level of the same domain, and I have checked that they have the same Unix permissions. But if I simply change the PHP file name in the working script to the name of the target script in the failing call, it now fails too, with the same 302 redirect message.
I even duplicated the working target script (byasd_api.php) on the target domain to a new file (byasd_api_copy.php), and I get the 302 message when I make a cURL call to that from the calling script that was working, even though the code is exactly the same!
I cannot see any difference between the two files. Is there some kind of caching going on where newly created files are not treated the same?
For reference, here is the calling code:
$header = array("Host: thebookyard.com");
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, HTTP_SERVER_IP . "/byasd_api.php");
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_REFERER, 'http://www.mac2cash.com');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $post_data);
$output = curl_exec($ch);
curl_close($ch);
The 'byasd_api.php' script name is the only thing I am changing.
I've spent some hours googling for a solution so would appreciate any suggestions.
Your Apache is configured to look for favicon.ico on every request; the 302 happens because it cannot find the icon:
GET http://theboo....com/favicon.ico [HTTP/1.1 302 Found 151ms]
Change the configuration or add a favicon.ico file. The configuration may be looking for the icon file only in the root.
It turns out the reason for the difference in behaviour was that the working script's name was included in a rewrite condition in the .htaccess file that redirects HTTP to HTTPS. Changing the cURL URL to "https://".HTTP_SERVER_IP."/byasd_api.php" stopped the "Found: the document has moved here" error, but the call then failed because cURL was trying to validate the SSL certificate against the IP address rather than the domain.
The solution to that was to add the following:
curl_setopt($ch, CURLOPT_RESOLVE, array("www.thebookyard.com:443:" . HTTP_SERVER_IP));
This still allows the call to go to the IP address (which is much faster than going via the domain name), while cURL validates the SSL certificate against the domain name.
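For reference, the conventional CURLOPT_RESOLVE pattern looks like the sketch below (assuming HTTP_SERVER_IP is a constant holding the server's IP). The request URL uses the host name, and CURLOPT_RESOLVE pins that name to the given IP, so the DNS lookup is skipped while the certificate is still checked against the domain:
// Sketch: pin www.thebookyard.com to a known IP for this handle only.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://www.thebookyard.com/byasd_api.php");
curl_setopt($ch, CURLOPT_RESOLVE, array("www.thebookyard.com:443:" . HTTP_SERVER_IP));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$output = curl_exec($ch);
curl_close($ch);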

Different cURL behaviour on Ubuntu and Windows

I'm using cURL to make requests to an external API. Everything works fine: the first request gets me a token, which I then use for calls that require an authorization header.
I add the HTTP header with:
curl_setopt($ch, CURLOPT_HTTPHEADER, $httpHeader);
And here's the problem: on Ubuntu I get the correct response; on Windows the request times out (which is probably their API's way of not responding to bad data). I send the same headers in both environments but get different results.
Are there any environment-specific settings that could make the Windows request incorrect?
You can try setting a value for CURLOPT_CONNECTTIMEOUT. Set it to 0 to wait indefinitely for the connection, or to any other number of seconds:
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30); // wait up to 30 seconds
Try this; maybe it works for you.
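When the two environments disagree like this, it can also help to capture cURL's own diagnostics rather than guessing. A small sketch (the URL and header value are placeholders):
// Sketch: compare environments by logging what cURL actually sends and receives.
$ch = curl_init("https://api.example.com/endpoint"); // placeholder URL
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Authorization: Bearer " . $token));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);  // fail fast instead of hanging
curl_setopt($ch, CURLOPT_TIMEOUT, 60);         // cap the whole transfer
curl_setopt($ch, CURLOPT_VERBOSE, true);       // dump the request/response dialogue to STDERR
$response = curl_exec($ch);
if ($response === false) {
    // curl_errno()/curl_error() often pinpoint the difference (DNS, SSL, timeout...)
    echo 'Error ' . curl_errno($ch) . ': ' . curl_error($ch);
}
curl_close($ch);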

Drupal page sending a request (cURL) to a subdomain site: fails while the same URL works from the client

I'm trying to connect two of our websites with each other. The first one sends a request to the second, which should reply to it. However, when doing this from the backend it fails, while from the client it works fine.
A login is required, and its parameters are sent with the request. From the debug log I can see that cURL follows the various redirects returned by the other site, but it always ends up at the login page.
Is this related to cookies? How could the backend get a cookie so it can behave as logged in? Or can I use other cookies under the same domain?
I've tried to use these configurations along with different temp files and variables:
curl_setopt($ch, CURLOPT_COOKIEJAR, $ckfile);
curl_setopt($ch, CURLOPT_COOKIEFILE, $ckfile);
but these always fail with the error:
curl_setopt() expects parameter 1 to be resource, null given
If the cookie is generated by a script, you can send a manually set cookie along with the cookies from the file (using the cookie-file option). For example:
Sending a manually set cookie:
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Cookie: test=cookie"));
Sending cookies from a file:
curl_setopt($ch, CURLOPT_COOKIEFILE, $ckfile);
In this case curl will send your defined cookie along with the cookies from the file.
If the cookie is generated through JavaScript, you have to trace how it is generated, and then you can send it using the above method (through the HTTP header).
The __utma, __utmc and __utmz values are Google Analytics tracking cookies set in the browser; you don't need to worry about them.
Finally, the way you are doing it is alright. Just make sure you use absolute paths for the file names (i.e. /var/dir/cookie.txt) instead of relative ones.
Always enable verbose mode when working with cURL. It helps a lot in tracing requests and will save you a lot of time:
curl_setopt($ch, CURLOPT_VERBOSE, true);
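As an aside, the "expects parameter 1 to be resource, null given" error usually means $ch was never assigned from curl_init() at the point curl_setopt() is called, so check the handle's scope first. Putting the advice above together, a minimal cookie-jar round trip might look like this sketch (URLs and form field names are placeholders, not from the original posts):
// Sketch: log in once, persist cookies, then make an authenticated request.
$ckfile = '/var/tmp/cookie.txt';               // absolute path, as advised above

// 1) Log in; the session cookie is written to the jar.
$ch = curl_init('https://other.example.com/login');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('user' => $user, 'pass' => $pass));
curl_setopt($ch, CURLOPT_COOKIEJAR, $ckfile);  // write cookies here on curl_close()
curl_setopt($ch, CURLOPT_COOKIEFILE, $ckfile); // read cookies from here
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);

// 2) Reuse the jar so the backend behaves as a logged-in client.
$ch = curl_init('https://other.example.com/protected-page');
curl_setopt($ch, CURLOPT_COOKIEFILE, $ckfile);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);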

PHP Send Custom HTTP Request

How can I send a custom HTTP Request to a server whose URL is "http://[IP]:[Port]/"?
What I mean is that, instead of the first line being a normal GET or POST like so:
GET /index.html HTTP/1.1
Host: www.example.com
How can this be replaced with something like:
CUSTOM
Host: [IP]
I don't mind having to use additional libraries if necessary, such as cURL.
UPDATE:
I've tried using the following cURL code:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://[IP]:[Port]/");
curl_setopt($ch, CURLOPT_PORT, [Port]);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
//curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "CUSTOM");
$output = curl_exec($ch);
curl_close($ch);
print($output);
But it just keeps loading for two minutes and then reports an internal error (both with and without CURLOPT_CUSTOMREQUEST). However, if I use a standard website such as http://google.com it works fine.
Also, I forgot to mention: the port my server uses is 7899. Is this OK? I can open it in my web browser fine, but cURL doesn't seem to be able to.
Looks like there's nothing wrong with your code. If you're using a shared hosting provider, ask them to open up outbound TCP port 7899.
You can register a custom HTTP request method using the http_request_method_register function (from the pecl_http extension):
http://www.php.net/manual/en/function.http-request-method-register.php
This code needs to run on the server that will be handling the request. If you try to use a non-standard HTTP request method against any old server, it may ignore it or return an error code, depending on the server.
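If CURLOPT_CUSTOMREQUEST isn't enough (it only swaps the method token in an otherwise normal request line), a raw socket lets you send any first line you like. A minimal sketch using PHP's fsockopen (the IP and port are placeholders):
// Sketch: hand-craft the request line instead of relying on cURL.
// Replace $ip and $port with the real values.
$ip   = '203.0.113.10';   // placeholder IP
$port = 7899;
$fp = fsockopen($ip, $port, $errno, $errstr, 30);
if (!$fp) {
    die("Connection failed: $errstr ($errno)");
}
// Arbitrary request line, then headers, then a blank line.
fwrite($fp, "CUSTOM / HTTP/1.1\r\n");
fwrite($fp, "Host: $ip\r\n");
fwrite($fp, "Connection: close\r\n\r\n");
while (!feof($fp)) {
    echo fgets($fp, 1024);  // print the raw response
}
fclose($fp);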

Writing a cURL proxy under Linux

I am working with a legacy system in which an ASP.NET application posts an XML file to a server via curl.exe (the URL to post to is configurable in a .config file).
Due to legacy-system limitations, I need to point that .config file at my Ubuntu server, modify the received XML as I need, and finally curl-post it on to the real server.
How can this be done? My guess is a PHP or Python script running under an Apache 2 server, listening for POSTs: once the XML file is received, make the required modifications and post it to the real server.
How can this be done in PHP or Python?
Since the ASP.NET application is posting XML, you simply need to handle a normal POST request, modify the XML to match your requirements, and post it with cURL to the real server. In PHP it would look something like this (more or less pseudocode; error checking and additional logic are needed):
$xml = $_POST['xml'];
// do something with the posted XML
.....
// post it on to the "real" server
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('xml' => $xml));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the response in $result
$result = curl_exec($ch);
curl_close($ch);
That's about it; check the cURL documentation and use whatever is necessary for the POST to work with your server, and you're all good.
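One caveat worth a sketch: if curl.exe posts the file as a raw body (e.g. curl --data-binary @file.xml) rather than as a form field, $_POST['xml'] will be empty and you need to read php://input instead. A sketch under that assumption ($url being the real server's address from your configuration):
// Sketch: relay a raw XML POST body instead of a form field.
$xml = file_get_contents('php://input');    // raw request body

// ... modify $xml here ...

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
curl_setopt($ch, CURLOPT_POSTFIELDS, $xml); // string body, forwarded as-is
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);
echo $result; // hand the real server's reply back to the ASP.NET caller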
