I know that HTTP requires a Content-Length header for POST and PUT requests. Is there a way around this limitation without using WebSockets? Maybe somehow using chunked transfer encoding?
I mean, how would someone create a client application that streams from a client's webcam to an (Apache/PHP) server, which would then save or broadcast the file in "real time"?
Actually I would use it for another use-case: zipping files and streaming the zip to the server at the same time, but the aforementioned use-case seems more mainstream (sorry for the pun).
I've tried the obvious: just doing it without a Content-Length, but the server just complains with a "Bad Content-Length!" error.
You should be able to use chunked encoding.
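To make that concrete, here is a small sketch (in Python, just to show the wire format) of what a chunked request body looks like. Each chunk is prefixed with its size in hexadecimal, and a zero-length chunk terminates the body; the function name is illustrative, not from any library:

```python
# Sketch: the HTTP/1.1 chunked transfer encoding wire format.
# Each chunk is "<hex size>\r\n<data>\r\n"; a zero-size chunk ends the body,
# which is why no Content-Length is needed up front.

def encode_chunked(parts):
    """Encode an iterable of byte strings as an HTTP/1.1 chunked body."""
    out = b""
    for part in parts:
        if part:  # a zero-length chunk would terminate the body early
            out += b"%x\r\n" % len(part) + part + b"\r\n"
    return out + b"0\r\n\r\n"

body = encode_chunked([b"hello ", b"world"])
# b'6\r\nhello \r\n5\r\nworld\r\n0\r\n\r\n'
```

In practice a client library can do this for you; for example, Python's requests module sends a chunked body when you pass a generator as the request data instead of a fixed-size string.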
Related
I am writing a service to receive data from an energy monitoring device in my house. The only way the device can push data is by using an HTTP POST, with Transfer-Encoding: chunked, and no Content-Length, and the data as XML in the request body.
I would like to be able to read this data in a PHP script (behind Apache) and store it in my database.
It appears that the most correct way to read the HTTP request body in PHP these days is to open and read php://input. However, comments from users on PHP's website indicate (and my testing confirms) that if no Content-Length header is specified by the client, php://input returns no data (even if the client is sending data with chunked encoding).
Is there an alternate way to get access to the request body? Can I configure Apache (with .htaccess) to decode the chunked data and call my script once it gets it all, and include Content-Length? Or configure PHP so it will be able to handle this?
Thanks!
Might as well write this as an answer I suppose.
The issue you describe is documented here: https://bugs.php.net/bug.php?id=60826
Highlights from the thread:
So here is what I found. If you send a chunked HTTP request but do not send the Content-Length (as we should do using the HTTP 1.1 protocol), the body part of the request is empty (php://input returns nothing). This result remains valid for the following configuration, where PHP runs via FastCGI:
The same configuration with PHP running as mod_php is fine; the request body is available through php://input.
And
Essentially, PHP is following the spec, and not reading beyond the FastCGI CONTENT_LENGTH header value - so correct fixes are necessarily above PHP, in the FastCGI or web server implementation.
Possible Workarounds:
0) Use httpd with mod_php
1) Use a different web server.
2) Look at patches for mod_fcgid
3) Maybe (not recommended) flip always_populate_raw_post_data and see if there is anything in $HTTP_RAW_POST_DATA
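If you do end up reading the raw stream yourself (working around the FastCGI limitation above), the chunk framing is simple to undo. A minimal sketch in Python - the function name and the lack of error handling are illustrative, not from any library:

```python
# Sketch: decoding an HTTP/1.1 chunked body back into the original bytes.
# Assumes the complete raw body is already in memory.

def decode_chunked(raw):
    """Decode an HTTP/1.1 chunked body; returns the de-chunked payload."""
    data, pos = b"", 0
    while True:
        nl = raw.index(b"\r\n", pos)
        # the size line may carry ";extension" parameters - ignore them
        size = int(raw[pos:nl].split(b";")[0], 16)
        if size == 0:
            return data  # terminating zero-size chunk reached
        data += raw[nl + 2 : nl + 2 + size]
        pos = nl + 2 + size + 2  # skip the chunk data and its trailing CRLF
```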
I have recently created a Digital Ocean server and on it, I have installed the pecl_http-1.7.6 PHP extension so that I may write PHP scripts that can respond to http requests.
I am curious about two different things. First, how does someone stream data via PHP code using HttpResponse? I know there are 'setStream' methods and such, which leads me to believe that you can set up a data stream in response to an HTTP request.
Second, how will I go about streaming audio data once I can set up a stream in the general case? I will most likely use Java to perform HTTP requests against my server, which I have already tested and which works.
I'm writing an API using php to wrap a website functionality and returning everything in json\xml. I've been using curl and so far it's working great.
The website has a standard file upload Post that accepts file(s) up to 1GB.
So the problem is: how do I redirect the file upload stream to the corresponding website?
I could download the file and upload it afterwards, but I'm limited by my server to just 20MB, and that seems a poor solution.
Is it even possible to control the stream and redirect it directly to the website?
I preserved the original answer at the bottom for posterity, but as it turns out, there is a way to do this.
What you need is a combination of the HTTP PUT method (which unfortunately isn't available in native browser forms), the PHP php://input wrapper, and a streaming PHP socket. This gets around several limitations - PHP disallows php://input for multipart POST data, but it does nothing of the sort for PUT file data - clever!
If you're going to attempt this with Apache, you're going to need mod_actions installed and activated. You're also going to need to specify a PUT script with the Script directive in your virtualhost/.htaccess.
http://blog.haraldkraft.de/2009/08/invalid-command-script-in-apache-configuration/
This allows the PUT method for only one URL endpoint. This is the file that will open the socket and forward its data elsewhere. In my example case below, this is just index.php.
I've prepared a boilerplate example of this using the Python requests module as the client sending the PUT request with an image file. If you run remote_server.py, it will open a service that just listens on a port and awaits the forwarded message from PHP. put.py sends the actual PUT request to PHP. You're going to need to set the hosts in put.py and index.php to the ones you define in your virtual host, depending on your setup.
Running put.py will open the included image file and send it to your PHP virtual host, which will, in turn, open a socket and stream the received data to the Python pseudo-service, printing it to stdout in the terminal. A streaming PHP forwarder!
There's nothing stopping you from using any remote service that listens on a TCP port in the same way, in another language entirely. The client could be rewritten the same way, so long as it can send a PUT request.
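The core of the forwarder, on whichever side you implement it, is nothing more than a copy loop: read a chunk from the incoming stream, write it to the outbound socket, repeat - so memory use stays bounded regardless of file size. A sketch in Python with file-like objects (stream_copy is a hypothetical helper, not part of the linked repo):

```python
import io

def stream_copy(src, dst, chunk_size=8192):
    """Copy from a readable stream to a writable one in fixed-size chunks,
    so the whole payload is never buffered in memory at once."""
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:  # EOF
            break
        dst.write(chunk)
        total += len(chunk)
    return total
```

The same loop works whether src is an open request body, a file, or a socket wrapped as a file object, which is why the PHP-to-Python handoff in the example stays so small.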
The complete example is here:
https://github.com/DeaconDesperado/php-streamer
I actually had a lot of fun with this problem. Please let me know how it works and we can patch it together.
Begin original answer
There is no native way in PHP to pass a file along asynchronously as it comes in with the request body, without saving its state down to disk in some manner. This means you are hard-bound by the memory limit on your server (20MB). The manner in which the $_FILES superglobal is initialized after the request is received depends upon this, as it will attempt to migrate that multipart data to a tmp directory.
Something similar can be achieved with the use of sockets, as this will circumvent the HTTP protocol at least, but if the file is passed in the HTTP request, PHP is still going to attempt to save it statefully in memory before it does anything at all with it. You'd have the tail end of the process set up with no practical way of getting that far.
The Stream library comes close, but it still relies on reading the file out of memory on the server side - it's got to already be there.
What you are describing is a little bit outside of the HTTP protocol, especially since the request body is so large. HTTP is a request/response based mechanism, and one depends upon the other... it's very difficult to accomplish an in-place, streaming upload at an intermediary point, since this would imply some protocol that forwards the bits while they are still streaming in.
One might argue this is more a limitation of HTTP than PHP, and since PHP is designed expressly with the HTTP protocol in mind, you are moving about outside its comfort zone.
Deployments like this are regularly attempted with high success using other scripting languages (Twisted in Python, for example; a lot of people are getting on board with NodeJS for its concurrent design pattern; there are alternatives in Ruby and Java that I know much less about).
Is there a way in PHP, using an HTML form, to upload only the header of a file? The length of the header is defined within the first few bytes of the file; I want only that part, nothing else. After I have received the last byte of the header, I don't want any more of the file. Is there a graceful way to do this in PHP, or am I going to have to do some socket programming?
On the PHP side, you cannot abort an upload (HTTP connection) gracefully. The client would face a "connection reset by peer" error (or whatever friendlier flavored message the web browser makes out of the error). The best you can do is ignore the bytes read, but that's not easy in PHP since it's a pretty high-level language.
On the HTML side, there's absolutely no way to upload only the "header" of a file. It's the complete file or nothing. There are countless different file formats, and HTML has no notion of any of them, let alone how to distinguish the "header" part of the file in question.
Your best bet is to write a little application which gets served by the webpage, is downloaded to the client's machine, and runs locally there. This application should have a file chooser/picker/opener and then programmatically determine the "header" and send it to the server. You could do that as a Java applet, MS Silverlight, or probably Adobe Flash.
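Whatever client technology you pick, the local logic is the same: parse the length field, read exactly that many bytes, and send only those to the server. A sketch in Python, assuming a hypothetical file format whose first four bytes hold the total header length as a big-endian uint32 (your real format's length field will differ):

```python
import io
import struct

def read_header(stream):
    """Read only the header of a (hypothetical) file format whose first
    4 bytes store the total header length as a big-endian uint32.
    Nothing past the header is ever read from the stream."""
    prefix = stream.read(4)
    (header_len,) = struct.unpack(">I", prefix)
    # the declared length includes the 4-byte prefix itself
    return prefix + stream.read(header_len - 4)
```

The bytes this returns are all you would upload; the rest of the file never leaves the client.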
For various reasons, I need to play the intermediary between an HTTP Request and a file on disk. My approach has been to populate headers and then perform a readfile('/path/to/file.jpg');
Now, everything works fine, except that it returns even a medium-sized image very slowly.
Can anyone provide me with a more efficient way of streaming the file to the client once the headers have been sent?
Note: it's a linux box in a shared hosting environment if it matters
Several web servers allow an external script to tell them to do exactly this. X-Sendfile on Apache (with mod_xsendfile) is one.
In a nutshell, all you send is headers. The special X-Sendfile header instructs the web server to send the named file as the body of the response.
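Assuming mod_xsendfile is installed, the Apache side might look like the fragment below (paths are illustrative). Your PHP script then emits only headers, including one like X-Sendfile: /var/www/protected/file.jpg, and the web server streams the file body itself:

```apache
# Illustrative mod_xsendfile configuration (virtual host or .htaccess).
# XSendFilePath whitelists the directory the server may serve this way.
XSendFile On
XSendFilePath /var/www/protected
```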
You could start with implementing conditional GET request support.
Send out a "Last-Modified" header with the file and reply with "304 Not Modified" whenever the client requests the file with "If-Modified-Since" and you see that the file has not been modified. Some sensible freshness-information (via "Cache-Control" / "Expires" headers) also is advisable to prevent repeated requests for an unchanged resource in the first place.
This way at least the perceived performance can be improved, even if you should find that you can do nothing about the actual performance.
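The decision logic for the 304 response is small. A sketch in Python (the function name is made up; note that HTTP dates have one-second resolution, hence the truncation to whole seconds):

```python
from email.utils import formatdate, parsedate_to_datetime

def is_not_modified(if_modified_since, file_mtime):
    """Return True if the client's cached copy is still fresh, i.e. the
    file's mtime is not newer than the If-Modified-Since header value."""
    if not if_modified_since:
        return False  # no conditional header: send the full response
    try:
        cached = parsedate_to_datetime(if_modified_since).timestamp()
    except (TypeError, ValueError):
        return False  # unparseable date: fall back to a full response
    # compare at one-second resolution, matching HTTP date precision
    return int(file_mtime) <= int(cached)

# formatdate(mtime, usegmt=True) produces the matching Last-Modified value
```

When this returns True you reply "304 Not Modified" with no body; otherwise you send the file along with a fresh Last-Modified header.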
This should actually be fairly fast. We have done this with large images without a problem. Are you doing anything else before outputting the image that might be slowing the process, such as calculating some meta data on the image?
Edit:
You may need to flush the output and use fread, i.e.:
// Open the file in binary mode and stream it out in 16KB chunks,
// flushing PHP's output buffer and then the system buffer each time.
$fp = fopen($strFile, "rb");
if ($fp === false) {
    exit; // could not open the file
}
while (!feof($fp)) {
    print(fread($fp, 1024 * 16));
    ob_flush(); // flush PHP's output buffer first...
    flush();    // ...then push it out to the client
}
fclose($fp);
Basically you want to build a server... That's not trivial.
There is a very promising project for a PHP-based server: Nanoweb.
It's free, and fully extensible.