How to open a PHP socket connection and request/maintain data from a client - php

So I want to make a daemon.php file which can open a port on a given IP address, request data from client.php, and let the browser listen to/read from the earlier pipe (would that be a problem for security or not?).
I found this funphp tutorial which seems good but works more like a cronjob. I also found the function stream_socket_server, but it can handle only one request and then shuts down, and I really don't know where exactly to put the listener.
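As a side note on the stream_socket_server point: it only handles one request and then shuts down if the accept call is not wrapped in a loop. A minimal, hedged sketch of a looping daemon.php (the address, port, and echo-style reply are made up for illustration):

<?php
// daemon.php - sketch only: keep accepting connections in a loop so the
// server does not exit after the first request.
$server = stream_socket_server('tcp://127.0.0.1:9000', $errno, $errstr);
if ($server === false) {
    die("Could not bind: $errstr ($errno)\n");
}
while (true) {
    // Block until a client (e.g. client.php) connects; -1 = no timeout.
    $conn = @stream_socket_accept($server, -1);
    if ($conn === false) {
        continue;
    }
    $request = fread($conn, 1024);        // read whatever the client sent
    fwrite($conn, "received: $request");  // reply to the client
    fclose($conn);
}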

You can use the PHP built-in server:
Add PHP to your PATH environment variable.
Start your server on a given port, e.g. php -S localhost:8000, OR
start with a router script, e.g. php -S localhost:8000 rt.php.
Now just open your browser and type: localhost:8000
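For reference, the router script named above (rt.php is just an example name) is a plain PHP file that the built-in server runs for every request; returning false tells the server to serve the requested file as-is. A minimal sketch:

<?php
// rt.php - minimal router sketch for `php -S localhost:8000 rt.php`.
if (preg_match('/\.(css|js|png|jpe?g|gif)$/', $_SERVER['REQUEST_URI'])) {
    return false; // let the built-in server serve static assets directly
}
// Everything else is handled by this script.
echo 'Routed: ' . htmlspecialchars($_SERVER['REQUEST_URI']);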

Related

Is there a way to know when the PHP -S server is ready to serve requests?

One may start a local PHP server, e.g. for testing:
php -S localhost:8080
One can also execute a PHP statement, e.g.
php -r "echo 'Hello';"
We initially hoped we could use this to tell when the server was started, i.e. using systemd-notify or some other process readiness protocol. However, using -r and -S together seems to ignore -r.
My question is thus, when starting a local server using php -S, is it possible to execute some code after the server is ready to receive incoming connections? This would allow us to execute something like systemd-notify --ready and enable the parent process to know when to proceed with testing.
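Since php -S offers no startup hook, a common workaround is to detect readiness externally: a wrapper process polls the port until a connection succeeds, then signals readiness. A hedged sketch (port, timeout, and the systemd-notify call are illustrative; systemd-notify only works with a NOTIFY_SOCKET set by the service manager):

<?php
// wait-ready.php - run alongside `php -S localhost:8080` and poll the port.
$deadline = time() + 30;
while (time() < $deadline) {
    $conn = @fsockopen('localhost', 8080, $errno, $errstr, 1);
    if ($conn !== false) {
        fclose($conn);                  // server accepted a connection
        exec('systemd-notify --ready'); // or any other readiness signal
        exit(0);
    }
    usleep(100000); // wait 100 ms between attempts
}
exit(1); // server never came up within the deadline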

Tunnelling / Proxy SSH

I need to connect to a particular API, but that API only accepts requests from my mate's server. I then need to include that information back in our website.
So basically I need to connect to the server, make the request, receive the answer, and transfer it to my host so that I can play around with it and integrate it into my website through a PHP file.
I've already exchanged SSH keys and I can connect to my server easily. I know I probably need to use ssh -L (not -R or -D) for the tunneling, though I don't know what to do with my PHP files to make that request, or what the other steps are once I've entered that command.
If anyone can help that would be lovely :) !
You can create an SSH tunnel using the -L command line switch:
$ ssh -L [port on local]:[apiserver hostname]:[port on apiserver] [user]@[your friend's server hostname]
E.g.
$ ssh -L 8080:apiserver.com:80 bob@friend.server.com
After the above command successfully connects to friend.server.com, any requests sent to your localhost:8080 will be tunneled through the friend.server.com host and arrive at apiserver.com:80. From apiserver's perspective, the request's origin is the friend's server.
(This will actually open an SSH session in the terminal window where it is executed, i.e. you get the prompt of the remote server; this is not required and you can ignore that prompt. It is possible to run this in the background without a console login using other switches, e.g. -N and -f.)
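With the tunnel from the example above in place, the PHP side needs nothing SSH-specific: it just talks to localhost:8080. A hedged sketch (the /endpoint path is made up, and the Host header is only needed if the API routes by hostname):

<?php
// Query the API through the local end of the SSH tunnel.
$ch = curl_init('http://localhost:8080/endpoint');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    // Present the real hostname in case the API routes by Host header.
    CURLOPT_HTTPHEADER     => ['Host: apiserver.com'],
]);
$response = curl_exec($ch);
if ($response === false) {
    die('cURL error: ' . curl_error($ch));
}
curl_close($ch);
echo $response; // integrate the API data into the site from here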

cURL to remote server through SSH tunnel

I have to write a PHP script that will be executed on my client's network when finalized. My computer (A) cannot connect to that network, but I have SSH access to a single server (B) on it.
My script has to do a cURL request (with certificate and private key) to a web server (C) on a specific port on that network. Another difficulty is that I do not have the IP of the C server, only a URL that resolves only from within the network. But servers B and C can communicate with each other.
Basically I see 3 steps (but there may be more):
Open SSH connection from computer A to server B
Send cURL request to server C (https://my.remote.server.domain.com:8444) and store response
Close SSH connection
The thing is, I have no idea how to do that (I'm basically ignorant in all things network related). Does anyone have a clue?
Using Bash:
$ ssh user@ssh_server << EOM
curl http://remote.server/ > /home/user/file
EOM
$ scp user@ssh_server:/home/user/file local_file
The first part connects to your SSH server (ssh_server), executes cURL, and saves the file locally (on the SSH server). Then, scp is used to download the file to your local machine.
Creating a temporary file is probably the easiest way of doing this. You could create it in /tmp (and, if you really can't stand having that file there, delete it afterwards using ssh + rm):
$ ssh user@ssh_server 'rm /tmp/file'
Finally, a dirty (and not recommended) way for not creating files is the following:
$ ssh user@ssh_server << EOM
curl http://remote.server/ | nc -l 1234 &
exit
EOM
$ nc ssh_server 1234 > file
I should probably mention once again that this technique should be avoided at all costs, since it transfers unencrypted data and requires no authentication whatsoever. Also, keep in mind that someone else could connect to the server using that same port (1234) before your command executes, thus retrieving the result for themselves, and leaving your script hanging.
So, one last time, don't use that.
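Coming back to the PHP requirement in the question: the recommended ssh + curl approach can be driven from PHP by shelling out to it. A hedged sketch, assuming key-based auth to server B is already set up (the certificate and key paths are placeholders; hostnames are from the question):

<?php
// Run cURL on server B over SSH and capture its output on computer A.
$remote = 'curl -s --cert /path/to/cert.pem --key /path/to/key.pem '
        . 'https://my.remote.server.domain.com:8444';
$output = shell_exec('ssh user@ssh_server ' . escapeshellarg($remote));
if ($output === null || $output === false) {
    die("SSH/cURL command failed or produced no output\n");
}
echo $output; // the response from server C, fetched via B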

Nginx and PHP-cgi - can't file_get_contents of any website on the server

This one is best explained by code I think. From the web directory:
vi get.php
Add this php to get.php
<?php
echo file_get_contents("http://IPOFTHESERVER/");
?>
IPOFTHESERVER is the IP of the server that nginx and PHP are running on.
php get.php
Returns the contents of the (default) website hosted at that IP. BUT
http://IPOFTHESERVER/get.php
...returns a 504 Gateway Time-out. It's the same with cURL, and the same using PHP's exec command with GET. However, from the command line directly it all works fine.
I've replicated it on 2 nginx servers. For some reason nginx won't allow me to make an HTTP connection to the server it's running on via PHP (unless it's via the command line).
Anyone got any ideas why?
Thanks!
Check that you're not running into worker depletion on the PHP side of things; this was the issue on my lab server setup, which was configured to save RAM.
Basically, I forgot that a single worker is already busy processing the main page being displayed to the end user, and the file_get_contents() call then generates a separate HTTP request to the same web server, effectively requiring 2 workers for a single page load.
As the first page was using the last worker, there was none available for the file_get_contents() call, so nginx eventually replied with a 504 on the first page because there was no reply on the reverse proxy request.
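If worker depletion is indeed the cause, the fix is to give PHP-FPM enough workers that a page and its internal sub-request can run at the same time. An illustrative pool config snippet (the file path and numbers vary per distro and workload):

; e.g. /etc/php/8.2/fpm/pool.d/www.conf (path depends on distro/version)
pm = dynamic
pm.max_children = 10     ; must be at least 2 for a page that calls itself
pm.start_servers = 2
pm.min_spare_servers = 1
pm.max_spare_servers = 3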
Check if allow_url_fopen is set to true in your php.ini.

Why does file_get_contents work with google.com but not with my site?

$page1 = file_get_contents('http://www.google.com');
$page2 = file_get_contents('http://localhost:8000/prueba');
When I echo the results, it works with Google but not with my site, even though the address works when I put it in the browser. And this happens with all the sites that I make in Django. :(
Warning: file_get_contents(http://localhost:8000/prueba) [function.file-get-contents]: failed to open stream: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond. in C:\xampp\htdocs\squirrelmail\plugins\captcha\backends\b2evo\b2evo.php on line 138
Fatal error: Maximum execution time of 60 seconds exceeded in C:\xampp\htdocs\squirrelmail\plugins\captcha\backends\b2evo\b2evo.php on line 138
For anyone having this problem using the PHP built-in web server (with Laravel in my case), it is caused by your request being blocked by the file_get_contents() / cURL call.
The docs of the dev server say that
PHP applications will stall if a request is blocked.
Since the PHP built-in server is single threaded, requesting another URL on your server will halt the first request, and it gets timed out.
As a solution, you can use a proper web server (nginx, Apache, etc.).
Edit: As of now, I really suggest using Laravel Sail as a development environment for PHP projects. It saves you lots of time with the setup and configuration of different services (web server, databases, queues, etc.).
As zub0r pointed out, the built-in PHP server is single threaded. If you do not want to install and configure a web server like nginx, and do not want to use Homestead or Valet, there is another easy solution:
Start another instance of your built-in PHP server on another port and use this for the internal requests of your app.
php -S localhost:8000
# in another console
php -S localhost:8001
I use this in my Laravel app when I request some local dummy API via Guzzle and it works fine.
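For example, with the two servers above running, the app served on :8000 can safely call the dummy API on :8001, so the single-threaded built-in server never has to call itself. A hedged Guzzle sketch (the /api/dummy path is made up):

<?php
// From the app on localhost:8000, request the second instance on :8001.
require 'vendor/autoload.php';

$client = new \GuzzleHttp\Client(['base_uri' => 'http://localhost:8001']);
$response = $client->get('/api/dummy'); // hypothetical endpoint
echo $response->getBody();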
To get the output of a local PHP file without going through HTTP, you can use:
exec('php file.php', $content);
Note that exec() fills $content with an array of output lines, so point to the correct key, like $content[3].
Hope this helps you.
