Why could cURL be slower than a web browser? - PHP

I am using this class to make one GET and one POST request to a website (the first request is to set a cookie). I am testing in a Windows XP virtual machine under VirtualBox, using WAMP from wampserver.com. The two requests take 10 to 18 seconds with cURL, but if I make those requests directly in the web browser inside that same virtual machine, the website loads in just a few seconds, and it retrieves all the images, CSS, etc.
What could be causing cURL to be so slow? Is there a way to fix it?
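For reference, a minimal sketch of the kind of two-request flow described above; the URLs and field names are placeholders, since the original class isn't shown:
<?php
// First request: GET, which receives the session cookie.
$cookieJar = tempnam(sys_get_temp_dir(), 'cookies');
$ch = curl_init('http://example.com/');              // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookieJar);     // store received cookies here
curl_exec($ch);
curl_close($ch);

// Second request: POST, replaying the stored cookie.
$ch = curl_init('http://example.com/submit');        // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookieJar);    // send the stored cookies
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('field' => 'value'));
$response = curl_exec($ch);
curl_close($ch);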

I faced the same issue using the curl command.
As suggested above, forcing an IPv4-only DNS lookup fixed it:
curl -4 $url # nice and fast
(I already had ::1 localhost in my hosts file, but that didn't help.)

cURL is probably trying to do a reverse DNS lookup on the server and, as it can't, it just hangs there a little while waiting for the timeout.
If the delay is caused by IPv6, you can try CURL_IPRESOLVE_V4 to bypass it altogether. It really depends on your machine configuration and is more a question for Server Fault.

Check your web server logs and try to find any difference between the requests from the normal web browser and the requests from cURL.
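If the logs look the same, you can also time the individual phases of the cURL request itself; curl_getinfo() breaks the total down. A rough diagnostic sketch (the URL is a placeholder):
<?php
$ch = curl_init('http://example.com/');   // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$info = curl_getinfo($ch);
curl_close($ch);
// A namelookup_time close to the total points at DNS, which matches
// the IPv6/reverse-DNS theories in the other answers.
printf("DNS: %.3fs, connect: %.3fs, total: %.3fs\n",
    $info['namelookup_time'], $info['connect_time'], $info['total_time']);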

It is probably due to IPv6.
Try adding:
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
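In context, a minimal sketch (the URL is a placeholder):
<?php
$ch = curl_init('http://example.com/');   // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Resolve over IPv4 only, skipping the AAAA (IPv6) lookup that can
// otherwise stall until it times out.
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
$response = curl_exec($ch);
curl_close($ch);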

I ran into this issue with a local web server. I was able to fix it by adding
::1 localhost
to the /etc/hosts file.
::1 is the IPv6 loopback address, the equivalent of 127.0.0.1.

Related

Accessing URLs on same machine timing out

At some point in the last week, one of our servers stopped being able to "talk to itself", so to speak.
We have a cronjob which calls a script via cURL at https://example.com/scripts/curl-script.php - example.com is hosted on the same machine - and this no longer works. When run via the command line, you can see cURL looking up the correct external IP for the domain, but the connection simply times out. The same goes for another URL hosted on the same machine.
The same thing happens with wget, or when telnetting to example.com on port 80 or 443. Ping also times out (expected, as there's a firewall in place here), and so does traceroute (all hops are just * * *).
If I add an entry in /etc/hosts to map example.com to 127.0.0.1, the script starts working as expected - it just can't talk to itself via its external IP any more.
I haven't changed anything on this server for a while and I don't believe any automated updates have updated any of the related components. This has been working fine for weeks and I can't understand why it would suddenly stop.
Does anyone have any suggestions for a proper fix for this issue instead of the hosts file amendment?

PHP cURL error: Could not resolve proxy

I'm trying to send a cURL request from a Windows Server 2008 machine using PHP (version 5.3.12) and keep receiving the error Could not resolve proxy: http=127.0.0.1; Host not found. As far as I can tell, I'm not using a proxy - CURLOPT_PROXY is not set, I've run netsh winhttp show proxy to make sure there's no system-wide setting in place, and I've even checked all the browsers on my machine to confirm none are configured to use a proxy (just in case that could possibly have an effect). I'm having trouble figuring out why cURL insists that 1) I'm using a proxy and 2) it can't connect to it.
I'm able to resolve the error by explicitly disabling the use of a proxy via curl_setopt($curl, CURLOPT_PROXY, ''), but this isn't the greatest solution - a lot of the places I use cURL are in libraries, and it'd be a pain (not to mention less than maintainable) to go around and hack this line into all of them. I'd rather find the root cause and fix it there.
If it helps, this has happened to me only with POST requests so far. Command-line cURL (from a Git bash prompt) works fine. These calls also work fine from our dev machine, so it seems to be something specific to my machine.
If I need to apply the above hack, I will, but I thought before I resorted to that I'd ask the good folks of SO - is there anywhere I'm missing that could be configuring the use of a proxy? Let me know if there's any additional helpful info I forgot to add.
cURL relies on environment variables for proxy settings. You can set these on Windows via "Advanced System Settings".
The variables you need to set and/or change for optimum control are "http_proxy", "HTTPS_PROXY", "FTP_PROXY", "ALL_PROXY", and "NO_PROXY".
However, if you just want to avoid using a proxy at all, you can probably get away with creating the system variable http_proxy and setting it to localhost and then additionally creating the system variable NO_PROXY and setting it to *, which will tell cURL to avoid using a proxy for any connection.
In any case, be sure to restart any command windows to force recognition of the change in system environment variables.
[source - cURL Manual]
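To see which of these variables the PHP process actually inherits, and to apply the in-code workaround from the question in one place rather than in every library, something like this rough sketch may help (the wrapper function is hypothetical, not part of any library mentioned above):
<?php
// Which proxy-related environment variables does PHP see?
foreach (array('http_proxy', 'HTTPS_PROXY', 'FTP_PROXY', 'ALL_PROXY', 'NO_PROXY') as $var) {
    printf("%s = %s\n", $var, var_export(getenv($var), true));
}

// Hypothetical factory: create handles with the proxy explicitly disabled,
// so the fix lives in one place instead of being pasted everywhere.
function curl_init_no_proxy($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_PROXY, '');   // empty string = ignore environment proxies
    return $ch;
}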

Mac OS X Lion 10.7.3 DNS lookup very slow with MAMP

This question has been asked in many forms, and I have spent more than six hours scouring the internet for an answer that solves my problem. So far, I've been unsuccessful. I use MAMP to develop PHP applications, and when I upgraded from Snow Leopard to Lion yesterday, my local applications immediately started running much slower. I believe that it's a DNS lookup issue around how Lion handles IPv6. I tried the following steps to fix the problem:
Changed all of the entries in my hosts file to no longer use the .local TLD
Put all of the entries in my hosts file onto separate lines
Ensured that my hosts file had the correct encoding
Added IPv6 entries to all local entries in my hosts file
Installed dnsmasq (may have not done this correctly)
Put all of my hosts file entries before the fe80::1%lo0 localhost line
This fixed some problems, but there's still one problem that I haven't figured out. In our PHP applications, we define our SOAP endpoints like so:
api:8080/contract/services/SomeService?wsdl
On each server, there is an "api" entry in the hosts file that points to the IP address for the SOAP API. So, when I want to point to our dev server, I change my hosts file to look like this:
132.93.1.4 api
(not a real IP)
The DNS lookup for the api entry in the hosts file still takes 5 seconds every time. When I ping api, the result comes back immediately. However, when I ssh to api, it takes about 5 seconds before I can connect to the server. This means that when I load up my PHP application, any SOAP query takes 5 seconds plus however long the actual query takes, making local development totally impossible. I realize that the way we're defining our endpoint may not be the best design decision, but it's what I have to work with.
From other questions I've read, I believe it's trying to look up "api" in IPv6 first, failing, and then looking in /etc/hosts. I tried using dnsmasq to switch this order, but had no luck. Does anybody know how to force it to read /etc/hosts first, or skip IPv6 altogether?
Update: I changed the entry in the hosts file to api.com, api.foo, anything with a "." in it, and it responded immediately. However, I would still like to find a solution that doesn't require changing the name "api".
I had been having the same issues ever since I upgraded to a modem which supports IPv6. Adding both hosts entries (IPv6 and IPv4) fixed the issue for me:
::1 domain.dev # <== localhost on crack
127.0.0.1 domain.dev

cURL proxy server

I've got two PCs, each with a different IP. The simple question is how to use one of the PCs as a proxy for the other, using cURL, so requests from both PCs will have the same IP.
Is there a way to turn one PC into a proxy server and then have cURL make requests through it, using that IP?
Yes, there are lots of proxy packages that run out of the box (you could even configure Apache to do it). I wouldn't recommend rolling your own in PHP, if that's what you're after. You can easily configure cURL to use a proxy; see the curl_setopt possibilities, as in the sketch below.
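A minimal sketch of pointing cURL at a proxy; the host, port, and credentials here are placeholders (3128 happens to be Squid's default port):
<?php
$ch = curl_init('http://example.com/');                   // target URL (placeholder)
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_PROXY, '192.168.1.50');          // the other PC (placeholder IP)
curl_setopt($ch, CURLOPT_PROXYPORT, 3128);                // Squid's default port
// curl_setopt($ch, CURLOPT_PROXYUSERPWD, 'user:pass');   // if the proxy requires auth
$response = curl_exec($ch);
curl_close($ch);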
If you are running a webserver on each machine, then you can install a PHP proxy script.
See the Google "PHP Proxy" search results: at least 4 choices on the first page.
If you are not running a webserver, then I suggest you download a standalone proxy such as Squid.
This option works for Windows or Linux. You can download Squid for Windows here; just unzip and run squid, no setup required.

Write transparent HTTP Proxy script in PHP

Is there an easy forwarding/transparent php proxy script that I can host on my web server? These are my conditions:
I'm using free web hosting, so I have pretty much no control over my machine. Otherwise I could use Perl's HTTP::Proxy module. This means no root password. It does run PHP, though.
I already have a server running on port 80. What I mean is I would like to put a PHP script as index.php on my server that will forward all requests.
I don't want a script like PHProxy or Glype where I go to the site, then enter a URL. I want a server so I can enter proxy.example.com:80 in Firefox's or IE's or whatever's proxy settings and it will forward all requests to the server.
Preferably (though it's not fatal if this isn't possible), I would like it to pass on the USER_AGENT environment variable (that is, the browser's identity) instead of setting itself as the USER_AGENT.
I can't start a new Daemon. My server won't allow it.
Is there a script that will do this? If so, which?
No, I'm fairly sure this is not possible on shared hosting; it will fail your condition number 3. This needs support at the web server level (e.g. using Apache's mod_proxy).
For this to work, you would have to set up the remote server to be able to deal with proxied requests. No sane web server will offer that possibility.
