I don't understand when we should use a stream wrapper and when a socket. Can anyone tell me when to use each of them in PHP?
Please give me some examples.
Stream Wrappers
Quoting the PHP Manual at Streams: Introduction:
A wrapper is additional code which tells the stream how to handle specific protocols/encodings. For example, the http wrapper knows how to translate a URL into an HTTP/1.0 request for a file on a remote server. There are many wrappers built into PHP by default (See Supported Protocols and Wrappers)
You use stream wrappers whenever you open URLs, FTP connections, etc. with functions like fopen or file_get_contents. Stream wrappers have the benefit that you do not need to know much about the protocol (unless you write your own custom wrapper).
Since you funnel all access through the regular file functions (Docs), you do not need to learn another API, which is a benefit. You have likely already used stream wrappers without noticing it, for instance, when you did
$pageContent = file_get_contents('http://example.com');
somewhere in your code. Another benefit of stream wrappers is that you can put filters in front of them and modify the stream with minimal effort; for instance,
$unzipped = file_get_contents('compress.zlib://http://example.com');
would run the content from that webpage through gzip decompression.
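Filters can also be attached to a stream you have already opened with stream_filter_append. The following is only a minimal sketch, not part of the original answer; it assumes allow_url_fopen is enabled and uses example.com and the built-in string.toupper filter purely for illustration:
$fp = fopen('http://example.com', 'r');
if ($fp === false) {
    die('Could not open the stream');
}

// Everything read from $fp from now on passes through the filter first.
stream_filter_append($fp, 'string.toupper', STREAM_FILTER_READ);

echo stream_get_contents($fp);
fclose($fp);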
Sockets
Quoting the PHP Manual at Sockets: Introduction:
The socket extension implements a low-level interface to the socket communication functions based on the popular BSD sockets, providing the possibility to act as a socket server as well as a client.
Since PHP provides a number of stream wrappers out of the box and also has an API for almost everything, there is rarely a use case for using sockets directly.
You use sockets when you need to work at the protocol level, that is, to implement a client or a server for a certain protocol yourself. This usually requires in-depth knowledge of the protocol being implemented; for instance, to do the same as the file_get_contents call in the example above, you would need to do the following (example based on the manual; in practice you would even need to do more):
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    // Build the raw HTTP request by hand.
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    // Read and output the raw HTTP response, headers included.
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
As you can see, instead of just calling the URL and letting the stream wrapper handle all the nitty-gritty details, you need to know how to construct an HTTP request and how to parse an HTTP response.
You might also find this tutorial about Socket Programming helpful:
http://christophh.net/2012/07/24/php-socket-programming/
Related
I am trying to connect to a server using a PHP script. The server is set up with SSLv3, and I think I may need to use SSL_Write() to process the message that will be sent to the server. But I cannot find a related function in PHP, so I wonder which function I should use.
What you are looking for is the TLS stream transport. There is no direct equivalent to the SSL_Write() function; PHP does not implement a functional interface to SSL.
Here is a simple example of using the TLS transport to make a connection to an HTTPS web server. This is not an ideal application of the transport, as PHP already natively supports an HTTPS file wrapper; I'm simply using it here as an example of interacting with a publicly accessible TLS server.
$fh = stream_socket_client("tls://example.com:443", $errno, $errstr);
if ($fh === false) {
    die("Failed to connect: error=$errno $errstr");
}

fwrite($fh, "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n");

while (!feof($fh)) {
    print fgets($fh);
}
If you need to set specific options on the TLS connection, you can do so by creating a stream context (using stream_context_create()) before making the connection. Refer to the PHP documentation for more details.
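For example, a minimal sketch (not part of the original answer; the CA bundle path is a placeholder) of passing SSL context options such as verify_peer, peer_name, and cafile to the connection:
$context = stream_context_create([
    'ssl' => [
        'verify_peer'      => true,                   // verify the server certificate
        'verify_peer_name' => true,
        'peer_name'        => 'example.com',          // name expected in the certificate
        'cafile'           => '/path/to/cacert.pem',  // placeholder path to a CA bundle
    ],
]);

$fh = stream_socket_client(
    "tls://example.com:443",
    $errno,
    $errstr,
    30,                     // connection timeout in seconds
    STREAM_CLIENT_CONNECT,
    $context
);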
I have a small script which uses cURL and retrieves specific contents from a defined URL. I had this tested on my localhost and it worked.
Now I have to retrieve data from an HTTPS-only website (plus, the certificate is invalid, but I know the owners) from my free hosting, but my server supports neither cURL nor the file_get_contents("https://other-server.com") function. By the way, http://other-server.com isn't accessible.
Is there any method to fetch a file from this server using the HTTPS port but with the HTTP protocol? Or is there some method to use HTTPS although my server doesn't support it? (It isn't my server; I don't have access to its configuration.)
Try this:
<?php
$fp = fsockopen("ssl://other-server.com", 443, $errno, $errstr);
if (!$fp) die($errno . " : " . $errstr);
$send =
    "GET / HTTP/1.0\r\n" .
    "Host: other-server.com\r\n" .
    "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8\r\n" .
    "\r\n";
fwrite($fp, $send);
while (!feof($fp)) {
    echo fread($fp, 512);
}
?>
Should you run into an 'ssl transport not available' error message, see Socket transport "ssl" in PHP not enabled.
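To find out whether the transport is available on a given host, you can list the registered socket transports; this quick diagnostic sketch is not part of the original answer:
// List the socket transports enabled in this PHP build.
$transports = stream_get_transports();

if (in_array('ssl', $transports, true) || in_array('tls', $transports, true)) {
    echo "SSL/TLS transport is available\n";
} else {
    echo "SSL/TLS transport is NOT available on this host\n";
}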
If your host is an external, perhaps free, webhosting service, you are fresh out of luck. Your best option would be to figure out which webhosts have the SSL transport enabled; otherwise, working with the HTTPS protocol simply will not be possible.
Your last 'out' is to try to load the extension into PHP dynamically. You will need the exact extension (dll/so) which matches
the PHP version on the host (see phpinfo),
the CPU architecture of the host (on UNIX, see passthru("cat /proc/cpuinfo");), e.g. amd64, i386,
the OS 'layout': .dll is for a Windows host (IIS etc.) and .so for UNIX.
The function to use is dl, aka dynamic-link, to load the library. For a Windows host, you will need php_openssl.dll and php_sockets.dll - and in turn for UNIX, oops - you would need to recompile the PHP core.
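A minimal sketch of that approach, assuming a Windows host; note that dl() is only available in the CLI/CGI/embed SAPIs on modern PHP versions, so this may simply not be possible on shared webhosting:
// Try to load the OpenSSL and sockets extensions at runtime (Windows filenames assumed).
foreach (['openssl' => 'php_openssl.dll', 'sockets' => 'php_sockets.dll'] as $ext => $lib) {
    if (!extension_loaded($ext)) {
        if (!function_exists('dl') || !@dl($lib)) {
            die("Could not load the $ext extension\n");
        }
    }
}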
Happy hacking :)
We have SHOUTcast/Icecast audio streams. I'd like to be able to provide a link such as mobiledroid.php on our web site that will open in the default player. I've seen this done on another site, so I do know it's possible.
I assume it uses PHP headers and streams via the PHP file as a stream?
Using Brad's instructions, the Android actually gives the option to open with a sound player. Nice one.
It also plays in WMP on a PC, but not on the Android the way the above link plays:
header("Content-type: audio/mpeg");
header("Transfer-Encoding: chunked");
header("Connection: close");
$sock = fsockopen($streamname,$port); //$streamname is the IP
fputs($sock, "GET $path HTTP/1.0\r\n"); //path in my case is /;stream.mp3
fputs($sock, "Host: $ip\r\n");
fputs($sock, "User-Agent: WinampMPEG/2.8\r\n");
fputs($sock, "Accept: */*\r\n");
fputs($sock, "Icy-MetaData:1\r\n");
fputs($sock, "Connection: close\r\n\r\n");
fpassthru($sock);
fclose($sock);
On the android, it says "Sorry, this player does not support this type of audio file"
Update 2:
Removing "Transfer-Encoding" will play on android but as usual will take a long time to begin stream with "Preparing" status due to a live stream not having "Content-Length"
header("Transfer-Encoding: none"); also removed from above code:
Quoting Brad:
Android 2.3 and later has an issue with Transfer-Encoding set to "none". Removing that hard-set header puts the stream back to chunked Transfer-Encoding. This works great for Android 2.3+. Originally I had disabled chunked encoding as VLC doesn't support it. It also seems that Android 2.2 and older does not support chunked encoding.
Noting here that although it works on Android, most live streams will take an awfully long time to begin.
Problem
Traditionally, you would serve a playlist file which would be saved by your browser, and then opened in whatever player software was configured to open such playlists. The player would then read the URL of the stream, connect to it, and begin streaming.
On Android, they (for some crazy reason) chose not to support playlist files as supported media. Because of this, there is no direct way to launch the player using the traditional method.
Solution
On Android it seems that if you link directly to an MP3 file/stream, instead of downloading the file the default action is to open up a media player and try to stream it. This means that you can easily link directly to the URL of your stream (bypassing the playlist step entirely).
There is a catch... streams served with SHOUTcast aren't exactly HTTP compliant. They are close, but not perfect. The real hangup here seems to be one of content length. Typically, HTTP servers send a Content-Length response header. With streaming media, the length is unknown. There are two ways around the problem:
1. Use chunked transfer encoding. You can do this by setting the following header:
Transfer-Encoding: chunked
Then, use chunked encoding for the stream, and you're good to go.
2. Use no transfer encoding, and specify that the connection will be closed when the resource is finished transferring, with these headers:
Transfer-Encoding: none
Connection: close
Your example link uses method one. I'm using method two successfully as well.
Once your server is talking valid HTTP, playback on Android seems to be a breeze.
Implementation
There are a handful of ways to go about implementing this fix. You have mentioned a PHP solution. If cURL won't work, you can always fsockopen() or stream_socket_client() to make a direct TCP connection, and handle the incoming stream data yourself. From there, you simply need to send the correct response headers and relay the data as it comes in. Keep in mind that a PHP instance will be running for each connected client, and each instance will make a connection to your server as well. This isn't ideal, but depending on what you have available, might be the only method for you.
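The following is only a rough sketch of such a relay (not the poster's code); the stream host, port, and mount point are placeholders, and real-world use would need the Transfer-Encoding handling discussed above plus proper error handling:
// Relay a SHOUTcast/Icecast stream to the connected client.
// Host, port, and mount point below are placeholders, not real values.
header('Content-Type: audio/mpeg');
header('Connection: close');

$src = stream_socket_client('tcp://stream.example.com:8000', $errno, $errstr, 30);
if ($src === false) {
    die("Could not reach the stream source: $errstr ($errno)");
}

// Ask the source server for the stream.
fwrite($src, "GET /;stream.mp3 HTTP/1.0\r\nHost: stream.example.com\r\nUser-Agent: WinampMPEG/2.8\r\n\r\n");

// Skip the source's (not quite HTTP-compliant) response headers.
while (($line = fgets($src)) !== false && trim($line) !== '') {
    // discard header lines up to the blank line
}

// Relay the audio data to the client as it arrives.
fpassthru($src);
fclose($src);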
We are currently using sockets to open and write to an HTTP connection for requests where we don't necessarily care about the response, like tracking pings etc.
This worked on our old servers and on our Windows development environments, but not on our new Ubuntu servers.
The code we use is as follows:
$aUrlParts = parse_url($sUrl);
$fp = fsockopen(
    $aUrlParts['host'],
    isset($aUrlParts['port']) ? $aUrlParts['port'] : 80,
    $errno, $errstr, 30
);
$sHeader = "GET {$aUrlParts['path']}?{$aUrlParts["query"]} HTTP/1.1\r\n";
$sHeader .= "Host: {$aUrlParts['host']}\r\n";
$sHeader .= "Connection: Close\r\n\r\n";
fwrite($fp, $sHeader);
fclose($fp);
If I do a read after the fwrite, I can get it all to work from the servers, but this defeats the point of doing the request this way compared to just cURLing the URL.
I have tried flushing the socket and setting it to non-blocking, but none of that works! Doing a read afterwards is the only thing that works!
Any help is appreciated
Edit: I will mention that these new servers are AWS-based, and I have a feeling the socket implementation on them may be different.
I am not sure that this is worthy of being an answer, but I had the exact same problem and that's how I came here. The only difference was that I was trying to execute a request against the server itself.
My solution was in the access management of the server. I had an .htaccess file that was blocking everyone except my own network from viewing, and the hitch was that it was also blocking my server from requesting itself.
So maybe it has something to do with the server's access management. Let me know if this helps.
We were using an fsockopen and fwrite combo; then it up and stopped working one day, or it was kind of intermittent. After a little research and testing, and if you have fopen wrappers enabled, I ended up using the file_get_contents and stream_context_create functions with a timeout set to a hundredth of a second. The timeout parameter accepts floating-point values (https://www.php.net/manual/en/context.http.php). I wrapped it in a try...catch block so it would fail silently. It works beautifully for our purposes. You can do logging in the catch if needed.
$context = stream_context_create([
    "http" => [
        "method"  => "GET",
        "timeout" => .01   // seconds; floating-point values are accepted
    ]
]);

try {
    // file_get_contents() returns false and emits a warning on failure rather
    // than throwing, so the warning is suppressed to keep the failure silent.
    @file_get_contents($url, false, $context);
} catch (Exception $e) {
    // Fail silently
}
I have URLs of the following format (written here as a regex):
http://exampledomain\.com/files/.+
The URL is requested by a client over my proxy server.
I inject a custom authorization header into this request.
This is no problem, I can do that with mod_proxy and mod_headers in apache.
However, this URL returns a 302 redirect to another source, which should finally arrive at the client.
This other URL is SSL-secured and looks like:
https://example.+\.exampledomain.\com/files/.+/.+
It needs the same authorization header as the other one. But because it is HTTPS, I cannot simply inject it using mod_headers.
Therefore my plan is to set up a reverse proxy that I use to request the first HTTP URL with an authorization header. Behind it is a script that takes the request, parses out the second HTTPS URL, downloads the source with the custom authorization header, and passes the received file (bytes and HTTP headers) through to the client.
I am familiar with Apache2 and PHP, so it would be nice to stick with these, but it's not necessary.
Just to clarify, we are talking about Rapidshare links here. My company keeps backup files and storage there because it's the cheapest cloud hosting. There are some job servers that need to fetch files, but I cannot configure a user/password on them, so the proxy server should add the premium authentication information.
I thought about something like:
proxy.php - this file is requested via the reverse proxy and served by a webserver.
This proxy.php is reached through an .htaccess rewrite rule and the reverse proxy at http://one-of-my-domains/request-uri-from-the-original-request (that part is generated in Apache).
$url = "http://example.com/".$_SERVER["REQUEST_URI"];
$opts = array(
'http'=>array(
'header'=>"Cookie: myauthvalue=example\r\n"
)
);
$context = stream_context_create($opts);
stream_context_set_default($context);
$http_content = get_headers($url, 1);
$ssl_location = $http_content['Location'];
$fp = fsockopen("ssl://".parse_url($ssl_location, PHP_URL_HOST), 443, $errno, $errstr, 30);
if (!$fp) {
echo "$errstr ($errno)<br />\n";
} else {
$out = "GET ".parse_url($PHP_URL_PATH, PHP_URL_HOST)." HTTP/1.1\r\n";
$out .= "Host: ".parse_url($ssl_location, PHP_URL_HOST)."\r\n";
$out .= "Cookie: myauthorization=something";
fwrite($fp, $out);
while (!feof($fp)) {
echo fgets($fp, 128);
}
fclose($fp);
}
So this should basically pass the SSL download through to the standard HTTP connection that the user initiated, instead of the redirect.
My question now is: will this work? Did I miss something? Are there any caveats?
I think it will be a reasonable amount of hassle, including the reverse proxy setup and all.
Well, I ended up just doing passthru('wget ....'), which works just fine ;)
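For illustration only - the actual wget options were not given, so the flags and the output header below are assumptions, and the cookie name is simply reused from the question's code:
// A hypothetical sketch of the passthru('wget ...') approach; the exact
// options used originally were not shown. $ssl_location is the HTTPS URL
// parsed out of the 302 redirect, as in the question's code above.
$cmd = 'wget -q -O - --no-check-certificate '
     . '--header=' . escapeshellarg('Cookie: myauthvalue=example') . ' '
     . escapeshellarg($ssl_location);

header('Content-Type: application/octet-stream');  // assumed content type
passthru($cmd);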