Tell cURL to read hosts file - php

We are running nginx on CentOS 6.4 and yesterday upgraded the curl packages from 7.19 to 7.41. Since then, cURL no longer checks the hosts file to resolve host names (and so will not connect to xyz.local).
We use cURL via Guzzle and are no longer able to connect to the various services on our local machines.
John Hart posted this answer, which is helpful, but it would require a fairly significant change to how our site (a LOT of legacy code) manages connections for our local and dev environments.
Is it possible to just tell cURL to use the hosts file?

As a workaround, you can install a DNS server that reads your hosts file and use it as the system DNS resolver. For example, dnsmasq does this by default.
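If I understand the referenced answer correctly, the per-request alternative is to pin the host name to an address on each cURL handle via CURLOPT_RESOLVE, which is why it touches every place connections are made. A minimal sketch, assuming PHP >= 5.5 and libcurl >= 7.21.3; the host name, port, and address are placeholders:

<?php
// Pin xyz.local to 127.0.0.1 for this handle only, bypassing DNS.
$ch = curl_init('http://xyz.local/api/status');
curl_setopt($ch, CURLOPT_RESOLVE, array('xyz.local:80:127.0.0.1'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

With Guzzle the same option can usually be passed through the client's curl options, but that is exactly the kind of per-connection change the question is trying to avoid, which is why a system-level fix such as dnsmasq is attractive.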

Related

Is it possible to get FTP to work in VirtualBox using NAT networking or get php's ftp functions to work

I'm trying to get php's ftp methods to work from within a VM. I can connect using ftp_connect but not actually do anything afterwards.
HOST: Ubuntu 14.10
GUEST: Debian 7
Stack: Vagrant - VirtualBox - Debian - LAMP
I'm using Vagrant to run a VirtualBox VM that runs a LAMP stack. In PHP I'm making some function calls (ftp_pasv, ftp_nlist) that are not working.
I discovered that because the FTP protocol uses random ports for its data connections, the issue is caused by the use of NAT networking in VirtualBox. I have the perfect Vagrant/VirtualBox setup except for this one issue. Does anyone know of a way to get FTP to work on the guest OS in this scenario? I know I could try using a bridged setup, but that means a bunch more work setting it up and the machine would be available to the public. So I would prefer to get it working behind the NAT.
I have also tried using ftp_pasv to turn passive mode on, which would fix the issue, but the function returns false when I call it.
As far as I know this isn't possible. Maybe if you want to hack some source code and compile a custom solution it will work, but that's harder than just using a different setup. I've resorted to using curl to make the FTP connections, which works for listing files and downloading them (a rough sketch is below).
Anyone that comes across this question and actually finds a solution please post it here.
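A minimal sketch of that curl-based workaround, assuming a plain directory listing is what's needed; the host, credentials, and path are placeholders:

<?php
// List a remote FTP directory over curl; the trailing slash on the URL
// asks for a directory listing rather than a file download.
$ch = curl_init('ftp://ftp.example.com/some/dir/');
curl_setopt($ch, CURLOPT_USERPWD, 'user:password');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the listing as a string
// curl uses passive FTP by default, which is what lets this work behind NAT.
$listing = curl_exec($ch);

if ($listing === false) {
    echo 'FTP error: ' . curl_error($ch) . PHP_EOL;
} else {
    echo $listing;
}
curl_close($ch);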
The problem is most likely related to the network configuration. The fact that, for example, creating a directory works while getting the directory listing does not indicates that there's an issue with the data (back) channel.
A potential root cause is the configuration of the network router. It seems that some routers handle packets differently if they are sent from different MAC addresses (host vs. guest system).
I had this issue and it turned out that upgrading VirtualBox solved it; possibly some bug in the NAT interface.

How does PHP remote debugging work?

I have a PHP web server set up on a remote site; let's call it remote.com.
I have WAMP installed on my local Windows machine (localhost), did the necessary Xdebug configuration, and have NetBeans installed. Now my questions are:
My IP is not a static IP but a dynamic one, so will specifying it in the remote server's php.ini work?
To debug a PHP file that resides on the remote server, should I have an exact copy of the source file available on localhost too? What will happen if there are small changes, mismatched lines, etc.?
What happens in the background technically?
Thanks in advance.
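For reference, a typical Xdebug 2 remote-debugging block in the remote server's php.ini looks roughly like this (extension filename and port are defaults/placeholders); xdebug.remote_connect_back is the usual answer to a dynamic client IP, since Xdebug then connects back to whichever address made the HTTP request instead of a hard-coded xdebug.remote_host:

[xdebug]
zend_extension = xdebug.so          ; filename/path varies by platform
xdebug.remote_enable = 1
; With a dynamic client IP, connect back to the requesting address:
xdebug.remote_connect_back = 1
; Port the IDE (e.g. NetBeans) listens on; 9000 is the default
xdebug.remote_port = 9000

In the background, Xdebug opens a DBGp connection from the server back to the IDE; the IDE maps the server's file paths to your local copies, which is why the local sources should match the deployed ones: if lines don't match, breakpoints land in the wrong place.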

ADO and mysqli connections very slow

I am experiencing a very slow ADO and mysqli connection for my production web server. The current software setup is windows 2008 server R2 Standard Edition SP1, Apache 2.4, PHP 5.3.10, MySql 5.5.24, Pear 1.94, Zend Engine version 2.30.
I've profiled the code using XDEBUG and it shows the initial connections taking around 1200ms each (regardless of the page being visited), whereas on my local development machine and another test server the connections only take around 8ms. The code for the website is all kept in sync through SVN, except for the php, pear, mysql, and apache ini and conf files. I've done diffs on these to check for differences and there aren't any. The DB contents are a complete copy as well. Everything for the production server is hosted on the same machine, so there aren't any firewall or internet issues.
The first connection profile has the following call stack:
ADOConnecton->Connect
ADODB_mysql->_connect
php::mysql_connect
The second one:
php::mysqli->mysqli
Any suggestions?
Usually slowness in the (first) connection depends on DNS resolution.
It may be:
the client trying to resolve the server name
the server trying to resolve the client name to match an access rule
Let the client and server know each other's address using the hosts file:
http://en.wikipedia.org/wiki/Hosts_(file)
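For illustration, hosts-file entries look like the lines below (names and addresses are placeholders); the point is that each side can resolve the other without a DNS lookup:

# On the web server, so the database host name resolves instantly:
192.168.0.10   dbhost

# On the database server, so the client's name resolves for access rules:
192.168.0.20   webhost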
I edited the MySQL my.ini to make the mysql service bind only to the IPv4 loopback adapter.
[mysqld]
...
bind-address=127.0.0.1
I also changed the \public_html\conf\face.ini to use the IPv4 loopback address instead of localhost (changed "localhost" to "127.0.0.1").
After that all issues went away. I'm not sure if it is because the machine has half a dozen IP addresses or because it was trying to decide whether to use IPv6 or IPv4.
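The same idea can be applied in the PHP code itself; a minimal sketch (credentials and database name are placeholders) that connects to the IPv4 loopback address directly, so no host-name resolution happens on each new connection:

<?php
// Using 127.0.0.1 instead of "localhost" skips name resolution and the
// IPv6-vs-IPv4 ambiguity; note that with mysqli, "localhost" can also mean
// a socket/named-pipe connection, while 127.0.0.1 forces TCP.
$mysqli = new mysqli('127.0.0.1', 'db_user', 'db_password', 'db_name', 3306);

if ($mysqli->connect_error) {
    die('Connect failed: ' . $mysqli->connect_error);
}
echo 'Connected via ' . $mysqli->host_info . PHP_EOL;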

cURL proxy server

I've got two PCs, each with a different IP. The simple question is how to use one of the PCs as a proxy for the other one, using cURL, so requests from both PCs will appear to come from the same IP.
Is there a way to turn one PC into a proxy server and then make cURL make requests using that IP?
Yes, there are lots of proxy packages that run out of the box (you could even configure Apache to do it). I wouldn't recommend rolling your own in PHP if that's what you're after. You can easily configure curl to use a proxy; see the curl_setopt possibilities.
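A minimal sketch of pointing a PHP cURL request at a proxy; the proxy address, port, and target URL are placeholders (3128 is Squid's default port):

<?php
$ch = curl_init('http://example.com/');
curl_setopt($ch, CURLOPT_PROXY, '192.168.1.50');   // the other PC running the proxy
curl_setopt($ch, CURLOPT_PROXYPORT, 3128);         // proxy port
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// curl_setopt($ch, CURLOPT_PROXYUSERPWD, 'user:password');  // if the proxy needs auth
$response = curl_exec($ch);
curl_close($ch);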
If you are running a webserver on each machine, then you can install a PHP proxy script.
See the Google "PHP proxy" search results: at least 4 choices on the first page.
If you are not running a webserver, then I suggest you download a standalone proxy such as Squid.
This option works for Windows or Linux. You can download Squid for Windows here; just unzip and run Squid, no setup required.

setting up remotely accessible wamp server along with iis

I am new to Windows IIS and I need to run a php/mysql application on it. For local php development on windows, I have found WAMP to be the easiest.
But can WAMP be used in this case instead of installing PHP and MySQL separately?
This needs to be done on an ec2 Windows 2003 instance. So far, I have already tried installing WAMP and setting up apache to listen on port 8080 instead of 80. From inside the remote desktop, both IIS and WAMP work properly in parallel on their respective ports.
However, when I try to connect from another computer using the IP address http://184.**.***.***, IIS works fine serving the default web page, but I cannot connect to Apache on http://184.**.***.***:8080.
Is it possible to use WAMP at all for this purpose, and if yes, would there be any disadvantages in using it instead of installing PHP/MySQL separately?
Edit :
I don't know if this is a problem of a blocked 8080 port. To verify this I stopped IIS and configured Apache to listen on 80. Even then http://184.**.***.*** doesn't show the WAMP homepage. Does anything need to be configured in IIS?
RESOLVED :
Added port 80 to the Windows Firewall exceptions and it started working.
Also, it's necessary to select "Put Online" in the WAMP tray, otherwise it gives a Forbidden response, as suggested by some answers.
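For reference, on Windows Server 2003 the same firewall exception can be added from a command prompt; a rough equivalent of the GUI step (the rule name is arbitrary):

netsh firewall add portopening TCP 80 "Apache HTTP"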
Thanks
I haven't used EC2 in this way before, but broadly speaking, I'd encourage you to use the same server for development and production environments if at all possible - the installation effort can be a bit of a pain, but it's nothing compared to developing an app locally and then finding an IIS configuration issue causes it to break on production.
This approach also lets you keep your PHP configurations in source code control - php.ini and any modules you're using - and automatically deploy them alongside your application; again, forgetting to deploy the correct PHP.ini usually makes your application do crazy things...
So, your choices appear to be:
- switch off IIS and have WAMP listen to port 80. Not sure WAMP is designed for production level traffic, but in the past, I've run low-traffic public websites in this way.
- work out why port 8080 is blocked, and if it can be unblocked. This would still require you to run your website on an unusual port, which makes for ugly and hard-to-communicate URLs.
- install PHP on your IIS instance. One benefit of having installed WAMP is that MySQL should already be up and running, and the basic PHP installation should also be there; getting PHP to run on IIS is no longer a dark art (http://php.iis.net).
For my money, I'd go for the last option... IIS is a production-quality server, and it's clearly what Amazon wants you to use in this instance.
Of course, running IIS on your development environment may be a problem.
Have you put the server online? Think it is offline by default, meaning it's only accessible from your local machine. It's an option in the systray menu to put it online.
If I remember correctly, there is a "go public" (or "put online") option on the WAMP system tray icon.
This option modifies the httpd.conf to authorize public access.
You should give it a try.
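For context, and going from memory of WAMP's defaults, the relevant httpd.conf section looks roughly like the following (Apache 2.2 syntax, default c:/wamp/www path assumed); "Put Online" effectively switches it from the first form to the second:

# Offline (default): only the local machine may connect
<Directory "c:/wamp/www/">
    Order Deny,Allow
    Deny from all
    Allow from 127.0.0.1
</Directory>

# Online: anyone may connect
<Directory "c:/wamp/www/">
    Order Allow,Deny
    Allow from all
</Directory>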
On a side note, you can make PHP work with IIS. This is another option to help you manage your server. (MySQL and PHP have to be installed separately, but this is very easy to do as far as I remember :) )
