Can a PHP script act as a SOCKS proxy server?

As the title suggests, I'm wondering if it's technically possible for a PHP script to act as a SOCKS proxy. If not, what are the technical limitations?
I have access to paid hosting that lets me execute PHP scripts, and a domain name (e.g. example.com) is connected to the host.
Is there any SOCKS proxy written in PHP so I could upload it to a directory on the host (e.g. example.com/proxy) and configure a client (like Firefox) to connect via that proxy?
cURL and other extensions are supported.
I'm not yet sure about SSH access.
I have seen projects like php-proxy or glype, but these are not what I need, because they can only be used by browsing the proxy's homepage. (They are web proxies, but I need a proxy server.)

What you describe will not work. While PHP does have the ability to create a TCP server, a proxy server in particular must already be running and listening for connections before a client tries to connect to it. Hosting providers, however, execute a PHP script only on demand, whenever a client requests the script via HTTP/S, and they run it only for the duration of that request. For what you want, you need a dedicated server running your PHP application separately from a web server. You won't get that with a hosting provider.
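For illustration, this is roughly what a long-running PHP TCP listener looks like (a minimal sketch, not a working SOCKS implementation; the port and the missing SOCKS handshake are placeholders). It only does its job when started as a persistent process from the command line, e.g. php server.php, which is exactly what shared PHP hosting does not offer:

<?php
// server.php - minimal sketch of a persistent TCP listener in PHP (CLI only).
// A real SOCKS5 server would additionally implement the SOCKS handshake and
// relay traffic in both directions; port 1080 is just the conventional choice.
$server = stream_socket_server('tcp://0.0.0.0:1080', $errno, $errstr);
if ($server === false) {
    die("Could not bind: $errstr ($errno)\n");
}
while (true) {
    // Waits for a client to connect (retries after the default socket timeout)
    // - the kind of always-on behaviour a per-request PHP script cannot provide.
    $client = @stream_socket_accept($server);
    if ($client === false) {
        continue;
    }
    $firstBytes = fread($client, 1024); // the client's greeting would be parsed here
    // ... SOCKS negotiation and traffic relaying would go here ...
    fclose($client);
}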

Related

Access files in remote computer with php

I need to create an application using PHP which is hosted on a server and needs to communicate with a CSV file located on the client's local machine. Is there any way we could do this? How can I connect to a remote CSV file? Is this possible?
A server accessing a CSV file directly on a client machine is not a good idea; it is in fact a security threat. Imagine browsing some website and its server being able to access your computer's file system.
There are various alternatives to achieve this; some of them might be:
- Make the user upload the CSV files to the server, so they become available to the server application (see the sketch after this list).
- If the client and server are on the same network, share the folder on the client machine so the server can access it.
- etc.
I would prefer the first option mentioned above.
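For the first option, a minimal sketch of an upload handler could look like the following; the form field name csv and the uploads/ directory are just assumptions for the example:

<?php
// upload.php - hypothetical handler for a form like:
//   <form method="post" enctype="multipart/form-data">
//     <input type="file" name="csv"> <button>Send</button>
//   </form>
if (isset($_FILES['csv']) && $_FILES['csv']['error'] === UPLOAD_ERR_OK) {
    $target = __DIR__ . '/uploads/' . basename($_FILES['csv']['name']);
    if (move_uploaded_file($_FILES['csv']['tmp_name'], $target)) {
        // The server-side application can now read the CSV from $target,
        // e.g. with fgetcsv().
        echo "File received";
    } else {
        http_response_code(500);
        echo "Could not store the uploaded file";
    }
}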
As @AnthonyB mentioned in a comment under your question, the server can't directly call the client, and that is true. A server is called a "server" because it serves requests from the client.
To be able to serve files in response to remote requests, your client needs its own server application, for example Apache HTTPD.
If you need to continuously request the client's server to collect files with your PHP server, what you are looking for is called a "worker". One of the AWS tools, Elastic Beanstalk, lets you choose between a server and a worker application in its start-up wizard for PHP. It is pretty straightforward and easy to use.
Please note that your client must have a dedicated IP address, or use a dynamic DNS approach by pushing its IP to a database (or directly to a server) from which the worker will read it.
If you don't need a dedicated worker, you can configure a cron job to send requests to the clients' server applications.
IMHO, this whole scenario is only worth it if you are building a corporate-grade application. In most cases (and if you REALLY do need to collect files from clients) you have to install an Apache + PHP server on the client side and make it wait for requests from YOUR remote PHP server. Without that, you cannot get files from client computers via the browser without user interaction. At least not legally :)
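As a rough illustration of the cron-based polling idea above, the worker could be a small PHP CLI script that pulls a CSV from the client's own web server; the URL, paths and schedule below are placeholders:

<?php
// poll_client.php - run from cron, e.g. every 15 minutes:
//   */15 * * * * php /var/www/jobs/poll_client.php
// In the dynamic-IP scenario the client URL would be looked up in the DB
// the client pushes its current address to; here it is hard-coded.
$clientUrl = 'http://client.example.com/export/data.csv';

$ch = curl_init($clientUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
$csv = curl_exec($ch);

if ($csv !== false && curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200) {
    // Store the collected file where the server application can process it.
    file_put_contents('/var/www/data/client-' . date('YmdHis') . '.csv', $csv);
}
curl_close($ch);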

Sending GET Requests from Online Server to Local Server?

So this is the situation: I have a bunch of Arduinos and Raspberry Pis along with an Ubuntu server on a local network. The Arduinos and Pis communicate with that local server routinely using PHP GET & POST requests.
Now this local server sometimes "fetches" something from a remote server in the cloud (also using PHP GETs) to respond to local requests from Arduinos and Pis.
Now here's the problem: The local server has no issues communicating with the remote server by GETs, but what if I want the server in the cloud to send a GET to the local server?
This part is kind of confusing to me as the local server is on a regular LAN and connects to the internet via a router through a local commercial ISP that issues dynamic IPs.
How can I send PHP GETs from an "online" server to a local server?
Please note that both servers are running Apache/PHP/MySQL on Ubuntu 14.04.
Thanks a ton in advance!
You will need two steps to accomplish that.
step 1 - make the router forward external requests to the LAN server
step 2 - make the external server aware of the current dynamic WAN IP
step 1:
The router has to be configured to forward WAN requests to your LAN server. Assuming you use a normal home router, you typically point your browser at the router's IP and log in to the router. Then you have to find where forwarding is configured (unfortunately the naming of this feature varies from router to router).
While you can typically define an "exposed host" to which all external requests go, you are better off in terms of security if you only forward specific ports to your server. As you are going to use the HTTP protocol, the standard ports here would be 80 (http) and 443 (https). So assuming you use HTTPS on the default port, a typical forwarding rule would be:
router WAN IP, port 443 --> server LAN IP, port 443
This forwards any external request that reaches the router on port 443 to your internal server on port 443.
Now your server should be able to receive those requests, but you still need to know your router's current dynamic WAN IP.
step 2:
As your router's WAN IP changes from time to time, you need to somehow announce that IP to your external server.
One easy way of doing this is to use an external service that provides you with a hostname resolving to your current IP. This is often referred to as DDNS or dynamic DNS. One quite well-known DDNS provider is https://dyn.com/dns/ - but there are plenty of others, and you will even find free ones. After registering with such a provider you will be given a hostname which your external server can use instead of the IP.
Now you still have to let the DDNS provider know your current dynamic WAN IP. The easiest way to do this again involves your router. Check its configuration for DDNS settings; routers typically support this feature, and often some specific providers are even pre-configured. Set up your router with the credentials you got from the DDNS provider.
Now everything is set. You should be able to send requests to your internal server using the hostname you got from your DDNS provider, while your router both forwards those requests and notifies the DDNS provider about any IP changes.
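Once the forwarding and DDNS are in place, the cloud server can address the local server like any other host; for example, a plain cURL GET from PHP (the hostname and path below are placeholders for your DDNS name and local API):

<?php
// On the cloud server: GET request to the local server via its DDNS hostname.
$url = 'https://myhome.example-ddns.net/api/status.php?device=pi1';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$response = curl_exec($ch);
curl_close($ch);

if ($response !== false) {
    echo "Local server answered: " . $response;
}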
A word of warning - you just exposed your local server to the internet. So you will have to treat it like any server on the internet to keep it safe, including careful configuration, installing security updates and so on.
You have to open a port on your router and specify where the router should lead the request to. Let's assume your external IP is 80.82.71.24; going to this IP address (e.g. http://80.82.71.24) will lead to your router. The router then decides what to do with this request; normally the request would be timed out or refused. But if you specify on the router that such a request (TCP or UDP, on a specified port) should be forwarded to a certain internal IP (the local server's IP), then it's possible to do what you want.
To do this, you need to read up on your router - first of all, see if you can log in to it. Could you specify what router you use and whether your internet connection is your own or shared (e.g. campus, school, etc.)?
By the way, it would not be a good idea to open the port to the whole world, so you should consider allowing only your cloud server's IP to access that specific port.

HTTP and nodeJS on separate servers

I have two websites:
1) httpwebsite.com, where I run my web application using Apache, PHP and MySQL;
2) wss.com, where I run a nodeJS WebSocket server used for a multiplayer game;
I want to host the JavaScript client-side files that communicate with the WebSocket server on httpwebsite.com, so I don't have to configure an HTTP server in nodeJS, for many reasons, such as security and my lack of experience with using nodeJS as an HTTP server.
I want to use nodeJS only for the websocket server, for performance and flexibility reasons, among many others.
I've heard that the same-origin policy restricts communication between httpwebsite.com and wss.com, but can this be reconfigured to allow communication between two different domains that deliberately want to talk to each other?
Do I have other options than actually running a HTTP server on the nodeJS server?
You can use CORS for secure requests from one domain to another domain.
http://www.html5rocks.com/en/tutorials/cors/
2 options:
You can add CORS headers on wss.com to allow website.com to load its resources. The link Matt gave should explain how this works; you just need to add this HTTP header on each Node server you need to access.
You can proxy your requests through your Apache server to the node server, so the web browser thinks it's talking to a service on the same origin. This is often used to make only your web server publicly available, with your app server (running node) not directly reachable and protected behind a firewall - though obviously Apache needs to be able to access it.
You can use the following Apache config to achieve option 2, forwarding http://website.com/api calls to a service running on wss.com on port 3000 (the mod_proxy and mod_proxy_http modules must be enabled):
# send all /api requests to node
ProxyPass /api http://wss.com:3000
# optionally rewrite redirect (Location) headers from wss.com back to this domain:
ProxyPassReverse /api http://wss.com:3000

Auto discover a web server hosting PHP script

Given a simple PHP script running on a web server in my local network, how can I auto-discover this script (or better, the server hosting the script) from an external client (e.g., an Android app)?
I am aware that this will not be possible with a pure PHP script hosted on a web server. I need to bind a socket to the broadcast address 255.255.255.255 or some multicast address.
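For reference, this is roughly what I have in mind, sketched in PHP as a long-lived CLI process on the server (not something that can run per web request); the port and discovery message are arbitrary choices:

<?php
// discover_responder.php - answers UDP discovery broadcasts so a client
// (e.g. an Android app) broadcasting on the LAN can learn the server's address.
$sock = socket_create(AF_INET, SOCK_DGRAM, SOL_UDP);
socket_set_option($sock, SOL_SOCKET, SO_REUSEADDR, 1);
socket_bind($sock, '0.0.0.0', 54321);   // arbitrary discovery port

while (true) {
    $len = socket_recvfrom($sock, $msg, 512, 0, $clientIp, $clientPort);
    if ($len !== false && trim($msg) === 'DISCOVER_PHP_SERVER') {
        // Reply with the URL of the hosted script.
        $reply = 'http://' . gethostbyname(gethostname()) . '/script.php';
        socket_sendto($sock, $reply, strlen($reply), 0, $clientIp, $clientPort);
    }
}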
Maybe a python script could do, e.g., http://stuvel.eu/blog/186/start-xbmc-from-remote
Apache ZooKeeper seems interesting but too big and complicated at the same time.
What other options are there? Do any of the "big" web servers provide some kind of easy-to-use service discovery?

PHP, Two connections, VPN and SSL

I need to consume two services from two different providers.
I need to connect to one SOAP server (WSDL); this code works correctly, and the server requires SSL. The problem is that, in the same application, we also need to connect to another server that uses a VPN and XML over HTTP. How can I make this work correctly?
How do I separate these two kinds of connection?
Configure your network properly. This is nothing that PHP can influence. All PHP can do is connect via the network to a target server using HTTP or HTTPS. So if the server can ping and connect to both services on the command line (try to download the WSDL or any other resource with wget or curl), it will work.
If not, you have to find out how the servers are addressed (domain names), which IPs they have, whether the domain names properly resolve to those IPs, and whether those IPs are actually reachable over the network (using a VPN does not really make a difference; it is simply another network connection).
Unfortunately going into these network details is probably beyond the scope of an answer here.
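Still, to make the PHP side concrete: once both routes work at the network level, the two connections can simply live side by side in one script. A rough sketch with placeholder endpoints (the private 10.x address stands in for whatever host the VPN routes to):

<?php
// Provider 1: SOAP over HTTPS (WSDL URL and operation name are placeholders).
$soap = new SoapClient('https://provider-one.example.com/service?wsdl');
$soapResult = $soap->__soapCall('SomeOperation', [['param' => 'value']]);

// Provider 2: plain XML over HTTP, reached through the VPN tunnel.
$xmlRequest = '<?xml version="1.0"?><request><id>123</id></request>';
$ch = curl_init('http://10.8.0.10/api/endpoint');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $xmlRequest,
    CURLOPT_HTTPHEADER     => ['Content-Type: text/xml'],
]);
$xmlResult = curl_exec($ch);
curl_close($ch);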
