Why does running a Ratchet WebSocket server from PHP hang the browser request? - php

I'm building a hybrid mobile app with a framework. It all works, and now I want to update views when needed, so I thought about WebSockets.
After reading the Ratchet docs, I was able to get the server running from the command line. I know about deployment solutions for starting the server automatically, but that's a bit complex for me. So I tried to run the server from PHP with the following code:
...
$port = 8080;
//echo("Starting server on port " . $port);
$server = IoServer::factory(
    new HttpServer(
        new WsServer(
            $MessageHandler
        )
    ),
    $port
);
$server->run();
I make a request to http://localhost/app/ws/run-server and it starts the server as expected, but the browser hangs on that request and on any others made after it.
I then have to comment out the $server->run(); line and restart Apache via XAMPP before I can continue.
I could really use some help; I've been searching for days.
Thanks in advance!

I was in the same position: I wanted to quickly start/stop Ratchet from a web interface without having to resort to hacks that involve exec() or having to install ZMQ.
So what I did was call the script that kicks off the Ratchet WebSocket server via cURL:
$options = [
    CURLOPT_URL            => 'url/to/server.php',
    CURLOPT_RETURNTRANSFER => false,
    CURLOPT_TIMEOUT_MS     => 30, // return almost immediately instead of waiting for a response
    CURLOPT_HTTP_VERSION   => CURL_HTTP_VERSION_1_1,
    CURLOPT_CUSTOMREQUEST  => $method // $method is defined elsewhere, e.g. 'GET'
];
// init
$curl = curl_init();
curl_setopt_array($curl, $options);
// exec
curl_exec($curl);
curl_close($curl);
The parts of interest: CURLOPT_TIMEOUT_MS set to a really low value, and CURLOPT_RETURNTRANSFER set to false (not totally sure that one matters, though). This cURL call returns almost immediately and does not block the rest of your script.
Stopping the server is not possible without ZMQ or without talking to the web-server layer of the Ratchet WebSocket server (I'm not sure how to do that, or whether it's possible at all). The script runs forever unless it crashes. From then on I communicate with it over WebSockets, including for administrative actions like stopping it, but there may be other (better) ways to do that.
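For reference, the script behind url/to/server.php is essentially the bootstrap from the question, kept alive after cURL drops the connection. A minimal sketch, assuming a Composer autoloader and a MessageHandler class implementing Ratchet's MessageComponentInterface (both names are my own placeholders):
<?php
// server.php - started fire-and-forget via the cURL call above (sketch)
use Ratchet\Http\HttpServer;
use Ratchet\Server\IoServer;
use Ratchet\WebSocket\WsServer;

require dirname(__FILE__) . '/vendor/autoload.php';

ignore_user_abort(true); // keep running after the cURL client disconnects
set_time_limit(0);       // no execution time limit for the long-running loop

$server = IoServer::factory(
    new HttpServer(
        new WsServer(
            new MessageHandler()
        )
    ),
    8080
);
$server->run(); // blocks forever; this HTTP request never completes normally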

Related

PHP: relay a resource request through a different server

I have the following situation, which I'll try to clarify with a 'scheme' (diagram).
On a web application we show resources coming from server 3. But because server 3 is an intranet endpoint, we load the resources through some extra steps. Each server runs Apache to cap the incoming requests.
The infrastructure setup has some restrictions:
Server 1 hosts the web application.
Server 2 is only reachable from server 1 and can't host the web application. It acts as an 'entry' point to server 3, with access restricted to requests coming from server 1 (intranet server).
Server 3 (the resources that need to be shown in the web application) is only reachable from server 2. Server 3 is an intranet server.
As you can see in the scheme above, loading a resource takes several steps. I'm wondering whether there are better ways, within PHP, to stream content from server 3 through server 2, initiated on server 1, without having to buffer the full content at each hop.
I've been looking at file_get_contents in combination with stream_context_create, but I'm not sure whether it's possible to relay the request through a server in the middle this way.
I've tried something like this (stripped down):
$opts = array(
    'http' => array(
        'timeout'         => 10,
        'proxy'           => 'tcp://server2:port',
        'request_fulluri' => true,
    ),
);
$context = stream_context_create($opts);
$data = file_get_contents('server3/files/footer.png', false, $context);
echo $data;
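A streaming variant of that snippet, which I've also been considering (just a sketch; it assumes server 2 really does act as a plain HTTP forward proxy on that port), would avoid buffering the whole file on server 1:
// Hypothetical streaming variant: relay the bytes as they arrive instead of
// collecting the whole response in $data first.
$opts = array(
    'http' => array(
        'timeout'         => 10,
        'proxy'           => 'tcp://server2:port',
        'request_fulluri' => true,
    ),
);
$context = stream_context_create($opts);

header('Content-Type: image/png'); // whatever matches the resource being relayed
// readfile() accepts a stream context and writes straight to the output buffer.
readfile('http://server3/files/footer.png', false, $context);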
Any expertise is welcome.
Thanks!

PHP Curl works with native library but not with ZF2

I'm connecting to a remote server using TLS1.1 on PHP 5.3.
When using Zend Framework 2, I get an error:
$client = new Client('https://www.example.com/');
$curlAdapter = new Client\Adapter\Curl();
$curlAdapter->setCurlOption(CURLOPT_SSLVERSION, CURL_SSLVERSION_TLSv1_1);
$client->setAdapter($curlAdapter);
$client->send();
Result: Error in cURL request: SSL connect error
Adding this resolves the issue, but it is obviously less secure:
$curlAdapter->setCurlOption(CURLOPT_SSL_VERIFYHOST, 2);
$curlAdapter->setCurlOption(CURLOPT_SSL_VERIFYPEER, false);
Result: It works
Making the request using native PHP commands works fine:
$c = curl_init('https://www.example.com/');
$options = array(
CURLOPT_SSLVERSION => CURL_SSLVERSION_TLSv1_1,
);
curl_setopt_array($c, $options);
curl_exec($c);
Returns the contents of the page.
So PHP works, but ZF2 doesn't unless VerifyPeer = false. What's the issue?
It is probably because you are missing one parameter:
CURLOPT_CAINFO => '/etc/ssl/certs/ca-bundle.pem' // replace with your cert.
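Applied to the adapter from the question, that looks roughly like this (the bundle path is only an example; point it at wherever your distribution keeps its CA bundle):
$client = new Client('https://www.example.com/');
$curlAdapter = new Client\Adapter\Curl();
$curlAdapter->setCurlOption(CURLOPT_SSLVERSION, CURL_SSLVERSION_TLSv1_1);
// Tell cURL which CA bundle to trust so peer verification can stay enabled.
$curlAdapter->setCurlOption(CURLOPT_CAINFO, '/etc/ssl/certs/ca-bundle.pem');
$client->setAdapter($curlAdapter);
$client->send();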
It is also possible that you are using different PHP configurations (web/CLI) that point to different locations for the SSL certificates. Some details are also available here: Security consequences of disabling CURLOPT_SSL_VERIFYPEER (libcurl/openssl)

cURL - Protocol https not supported or disabled in libcurl (post PHP update)

I know there are dozens of questions about this particular topic, but the usual suggestions don't really seem to help. I have checked the server under the root account and the specific account I'm working with and in both cases (if they even can be different) SSL is listed as a feature and HTTPS is listed as a protocol.
My cURL functions were working fine until yesterday when we upgraded PHP from ~5.1/5.2 to 5.4.26. My assumption was that PHP and/or cURL were compiled without SSL support, but that doesn't seem to be the case.
If it helps, the functions are calling Appcelerator's cloud services. This is one of the login functions, the first to throw the "trying to get property of non-object" error because $res is false:
function login() {
    $url = 'https://api.cloud.appcelerator.com/v1/users/login.json?key=<MY_APP_KEY>';
    $options = array(
        CURLOPT_RETURNTRANSFER => TRUE,
        CURLOPT_POST           => TRUE,
        CURLOPT_POSTFIELDS     => array(
            'login'    => '<MY_APP_LOGIN>',
            'password' => '<MY_APP_PASSWORD>'
        )
    );
    $curl_session = curl_init($url);
    curl_setopt_array($curl_session, $options);
    $res = curl_exec($curl_session);
    curl_close($curl_session);
    $this->session_id = json_decode($res)->meta->session_id;
}
Is it possible that, even though SSL and HTTPS are listed, they're not actually in effect? Is there a way to check, and if necessary fix, that?
One possible problem is that you have more than one libcurl version installed on the machine, so checking as root reports a global installation while your PHP environment runs a different build.
From within your PHP program you can instead call curl_version() and see what it reports about SSL support, versions, and so on.
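Something along these lines (a sketch; run it through the web server rather than the CLI so you see the build your requests actually use):
// Inspect the libcurl build this PHP environment is linked against.
$info = curl_version();
echo 'libcurl: ' . $info['version'] . "\n";
echo 'SSL library: ' . $info['ssl_version'] . "\n";
// 'protocols' lists what this build supports; https has to appear here.
echo 'protocols: ' . implode(', ', $info['protocols']) . "\n";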

cURL call with a port in the url/location not working - couldn't connect to host

I am trying to make a cURL call to a url that looks like this:
https://example.com:9000/test
When I execute the following code, I get curl error 7 couldn't connect to host.
$headers = array(
    CURLOPT_RETURNTRANSFER => TRUE,
    CURLOPT_CONNECTTIMEOUT => 5,
    CURLOPT_TIMEOUT        => 10,
    CURLOPT_URL            => 'https://example.com:9000/test',
);
$headers[CURLOPT_SSL_VERIFYPEER] = FALSE;
$headers[CURLOPT_SSL_VERIFYHOST] = 2;
$ch = curl_init();
curl_setopt_array($ch, $headers);
$response = curl_exec($ch);
If I set the url to https://example.com/test, I am able to connect to the host, just not to what I need to get.
I have also tried setting CURLOPT_PORT => 9000 with the same result (error 7).
One other note, I am able to use cURL with the url on some machines but not others. My Windows machine works fine, but the linux server I'm on is the one having issues. Another linux server seems to work fine as well.
EDIT:
Server is shared hosting on hostgator.com.
If anyone else has this problem, contact your server host. I contacted hostgator.com and they responded with the following:
I have opened the outbound port 9000 for you.
Everything works now.
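Before contacting them, a quick way to confirm that it really is the outbound port being blocked (rather than anything cURL-specific) is a plain socket test from the affected server; this is just a sketch with the real host substituted for example.com:
// If outbound traffic to port 9000 is blocked, this fails just like cURL does.
$fp = @fsockopen('example.com', 9000, $errno, $errstr, 5);
if ($fp === false) {
    echo "Cannot reach port 9000: ($errno) $errstr";
} else {
    echo "Port 9000 is reachable";
    fclose($fp);
}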
Check the URL in the IE browser; if it works fine there, remove the timeout parameters and try again.

PHP locally hosted, connecting to external resources through a proxy server

I'm trying to do some locally hosted Facebook development, but I'm on a university network, so all outgoing connections from my computer need to pass through our proxy server. The main problem is that I can't find any documentation on setting up Apache to USE a proxy server, rather than to ACT as one.
Thinking about it, though, when I do a cURL request or an fopen it's probably not Apache that retrieves the data; PHP does that itself. Older versions let you set a global proxy in php.ini, but PHP 5 does not.
So I have to set the defaults in code and can't find any config file where I can set them permanently. For example, this sets up the default stream context so that fopen() works:
$r_default_context = stream_context_get_default(
    array(
        'http' => array(
            // All HTTP requests are passed through the local NTLM proxy server on port 8080.
            'proxy'           => 'tcp://proxy.munged.edu:8080',
            'request_fulluri' => true,
        ),
    )
);
but that does not cover everything, because to use cURL I also have to do this:
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_PROXY, "http://proxy.munged.edu:8080");
curl_setopt($ch, CURLOPT_PROXYPORT, 8080);
Does anyone know how to make everything that opens outgoing connections use this proxy? I don't want code that's specific to this computer, because my plan is to work on the code locally and then upload it to some web space when it's done; the change/upload/refresh cycle is a lot more time consuming than just change/refresh.
Edit:
Just to clarify, I have been putting all of this in a file called "proxyconfig.php", then checking for its existence and include()-ing it at the top. If there's no way to set the defaults in config files, a way to configure everything that facebook.php uses for its API requests would be awesome.
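The include at the top of each script looks roughly like this (sketch):
// Pull in the proxy defaults only where the file exists, so the same code
// runs unchanged once uploaded to the remote web space.
if (file_exists(dirname(__FILE__) . '/proxyconfig.php')) {
    include dirname(__FILE__) . '/proxyconfig.php';
}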
Your method is correct, assuming that the application is in iframe mode (FBML applications require that Facebook be able to call back to your server).
If the issue is wanting to be able to develop locally and deploy to a remote site with minimal modification to your files, I'd recommend extending BaseFacebook as a new class called LocalBaseFacebook and changing CURL_OPTS to:
public static $CURL_OPTS = array(
    CURLOPT_CONNECTTIMEOUT => 10,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_TIMEOUT        => 10,
    CURLOPT_USERAGENT      => 'facebook-php-3.0',
    CURLOPT_PROXY          => 'http://proxy.munged.edu:8080',
    CURLOPT_PROXYPORT      => 8080
);
When deploying, switch which class you instantiate based on hostname or some other uniquely identifying property/configuration; you could even use a $_GET variable such as ?is_local=1 and attach that to the end of your Canvas URL.
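A rough sketch of that switch (the class name, hostname, and credentials are placeholders; use whatever subclass carries your proxied $CURL_OPTS):
// Hypothetical bootstrap: the proxied subclass on the dev machine,
// the stock SDK class once deployed.
$config = array(
    'appId'  => 'YOUR_APP_ID',
    'secret' => 'YOUR_APP_SECRET',
);

$isLocal = (gethostname() === 'my-dev-machine') || !empty($_GET['is_local']);

$facebook = $isLocal
    ? new LocalFacebook($config)  // your subclass with the proxy CURL_OPTS
    : new Facebook($config);      // unmodified SDK class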
