I have recently created a DigitalOcean server and installed the pecl_http-1.7.6 PHP extension on it so that I can write PHP scripts that respond to HTTP requests.
I am curious about two things. The first is: how does someone stream data via PHP code using HttpResponse? I know there are 'setStream' methods and such, which leads me to believe that you can set up a data stream in response to an HTTP request.
The second is: how do I go about streaming audio data once I can set up a stream in the general case? I will most likely use Java to perform the HTTP requests to my server, which I have already tested and confirmed working.
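For the first part, the setStream route appears to boil down to something like this (a minimal sketch assuming pecl_http 1.x's static HttpResponse class; the file path and MIME type are placeholders):

```php
<?php
// Minimal sketch (pecl_http 1.x): stream an audio file to the client.
// '/var/www/audio/sample.mp3' is a placeholder path.
HttpResponse::setContentType('audio/mpeg');                 // declare the payload type
HttpResponse::setStream(fopen('/var/www/audio/sample.mp3', 'rb'));
HttpResponse::send();                                       // streams the resource out
```

A Java client would then simply issue a GET for this script's URL and read the response body as it arrives.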
I know that HTTP requires a Content-Length header for POST and PUT requests. Is there a way around this limitation without using WebSockets? Perhaps by somehow using chunked transfer encoding?
I mean, how would someone create a client application that streams from a client's webcam to an (Apache/PHP) server, which would then save or broadcast the file in "real time"?
Actually, I would use it for another use case: zipping files and streaming the zip to the server at the same time, but the aforementioned use case seems more mainstream (sorry for the pun).
I've tried the obvious: just doing it without a Content-Length, but the server just complains with a "Bad Content-Length!" error.
You should be able to use chunked encoding.
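For example (a sketch assuming PHP 5.3+ and a hypothetical endpoint; when CURLOPT_UPLOAD is set and no Content-Length is supplied, libcurl switches to chunked transfer encoding on HTTP/1.1):

```php
<?php
// Sketch: a chunked PUT upload with PHP's cURL extension. The body is
// produced on the fly, so its total size never needs to be known.
$src = fopen('php://stdin', 'rb');   // any readable stream works here

$ch = curl_init('http://example.com/upload');   // hypothetical endpoint
curl_setopt($ch, CURLOPT_UPLOAD, true);         // issue a PUT with a body
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Transfer-Encoding: chunked'));
curl_setopt($ch, CURLOPT_READFUNCTION, function ($ch, $fd, $length) use ($src) {
    $data = fread($src, $length);
    return ($data === false) ? '' : $data;      // empty string ends the body
});
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
echo curl_exec($ch);
curl_close($ch);
```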
Aim: to send a stream of data to the server and have that data streamed back in real time, while the transfer is still happening, using PHP (with only one web server request).
Objectives completed so far: I have worked out that PUT is the only method whose body PHP/Apache can read bit by bit as it is being received (using file_get_contents and the php://input wrapper). I have worked out that the only way to make a PUT request in a browser (without add-ons) is via Ajax; fine.
The next objective is to deliver the data back to the browser (in real time). If it were possible to deliver more responses than requests, this could work, right? Without binding to ports for sockets or installing any third-party software or Apache modules, is there a way to do this? (On screen would be a stream of data, and your browser's scroll bar would get progressively smaller, just to visualise it.)
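Putting the two objectives together, the PHP side might look like this (a sketch; whether the client actually sees the data in real time depends on output and proxy buffering along the way):

```php
<?php
// Sketch: echo the PUT body back as it arrives, within the same request.
// PHP's own output buffers are disabled first; Apache modules such as
// mod_deflate, or any intermediate proxy, may still buffer the response.
header('Content-Type: text/plain');
while (ob_get_level() > 0) {
    ob_end_flush();              // flush and disable PHP's output buffers
}

$in = fopen('php://input', 'rb');
while (!feof($in)) {
    echo fread($in, 8192);       // relay each chunk as soon as it is read
    flush();                     // push it toward the client immediately
}
fclose($in);
```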
I'm trying to stream MP4 files through Apache/Nginx using a PHP proxy for authentication. I've implemented byte ranges to stream for iOS, as outlined here: http://mobiforge.com/developing/story/content-delivery-mobile-devices. This works perfectly fine in Chrome and Safari, but... the really odd thing is that if I monitor the server requests to the PHP page, three of them occur per page load in a browser. Here's a screenshot of Chrome's inspector (going directly to the PHP proxy page):
As you can see, the first one gets canceled, the second remains pending, and the third works. Again, the file plays in the browser. I've tried alternate methods of reading the file (readfile, fgets, fread, etc.) with the same results. What is causing these three requests, and how can I get a single working request?
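For reference, the byte-range handling follows this general shape (a condensed sketch of the mobiforge approach, not the exact code; suffix ranges such as bytes=-500 and multi-range requests are not handled):

```php
<?php
// Condensed byte-range sketch: serve the requested slice of a local file.
// Assumes authentication has already happened; $path is a placeholder.
$path  = '/var/media/video.mp4';
$size  = filesize($path);
$start = 0;
$end   = $size - 1;

if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d*)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    if ($m[1] !== '') { $start = (int) $m[1]; }
    if ($m[2] !== '') { $end   = (int) $m[2]; }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Content-Type: video/mp4');
header('Accept-Ranges: bytes');
header('Content-Length: ' . ($end - $start + 1));

$fp = fopen($path, 'rb');
fseek($fp, $start);
$left = $end - $start + 1;
while ($left > 0 && !feof($fp)) {
    $chunk = fread($fp, min(8192, $left));
    echo $chunk;
    $left -= strlen($chunk);
}
fclose($fp);
```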
The first request is for the first range of bytes, preloading the file. The browser cancels the request once it has downloaded the specified amount.
The second one I'm not sure about...
The third one is when you actually start playing the media file; it requests and downloads the full thing.
Not sure whether this answers your question, but serving large binary files with PHP isn't the right thing to do.
It's better to let PHP handle authentication only and pass the file reference to the web server to serve, freeing up resources.
See also: Caching HTTP responses when they are dynamically created by PHP
It describes in more detail what I would recommend doing.
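In practice that hand-off can look like this (a sketch assuming Apache with mod_xsendfile; the session-based auth check is hypothetical, and nginx uses X-Accel-Redirect instead):

```php
<?php
// Sketch: PHP only authenticates; the web server serves the bytes.
session_start();                              // hypothetical session-based auth
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
header('Content-Type: video/mp4');
header('X-Sendfile: /var/media/video.mp4');   // Apache with mod_xsendfile
// header('X-Accel-Redirect: /protected/video.mp4'); // nginx equivalent
```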
I'm writing an API in PHP to wrap a website's functionality, returning everything as JSON/XML. I've been using cURL, and so far it's working great.
The website has a standard file-upload POST that accepts files up to 1 GB.
So the problem is: how do I redirect the file-upload stream to the corresponding website?
I could download the file and upload it afterwards, but my server limits me to just 20 MB, and it seems like a poor solution anyway.
Is it even possible to control the stream and redirect it directly to the website?
I preserved the original answer at the bottom for posterity, but as it turns out, there is a way to do this.
What you need is a combination of the HTTP PUT method (which unfortunately isn't available in native browser forms), the PHP php://input wrapper, and a streaming PHP socket. This gets around several limitations: PHP disallows php://input for POST data, but it imposes no such restriction on PUT file data - clever!
If you're going to attempt this with Apache, you'll need mod_actions installed and activated. You'll also need to specify a PUT script with the Script directive in your virtualhost/.htaccess.
http://blog.haraldkraft.de/2009/08/invalid-command-script-in-apache-configuration/
This allows PUT requests for only one URL endpoint. That endpoint is the file that will open the socket and forward its data elsewhere; in my example below, this is just index.php.
I've prepared a boilerplate example of this using the Python requests module as the client sending the PUT request with an image file. If you run remote_server.py, it will open a service that just listens on a port and awaits the forwarded message from PHP. put.py sends the actual PUT request to PHP. You'll need to set the hosts in put.py and index.php to the ones you define in your virtual host, depending on your setup.
Running put.py will open the included image file and send it to your PHP virtual host, which will, in turn, open a socket and stream the received data to the Python pseudo-service, printing it to stdout in the terminal. A streaming PHP forwarder!
There's nothing stopping you from using any remote service that listens on a TCP port in the same way, in another language entirely. The client could be rewritten the same way, so long as it can send a PUT request.
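The heart of the PHP side boils down to something like this (a condensed sketch of the idea, not the repo's exact code; the host and port of the remote service are placeholders):

```php
<?php
// index.php sketch: relay the incoming PUT body over a TCP socket as it
// arrives, rather than buffering the whole upload first.
$remote = fsockopen('127.0.0.1', 9000, $errno, $errstr, 5); // placeholder service
if (!$remote) {
    header('HTTP/1.1 502 Bad Gateway');
    exit("$errstr ($errno)");
}

$in = fopen('php://input', 'rb');       // PUT bodies remain readable here
while (!feof($in)) {
    fwrite($remote, fread($in, 8192));  // forward each chunk immediately
}
fclose($in);
fclose($remote);
```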
The complete example is here:
https://github.com/DeaconDesperado/php-streamer
I actually had a lot of fun with this problem. Please let me know how it works and we can patch it together.
Begin original answer
There is no native way in PHP to pass a file asynchronously as it comes in with the request body without saving its state to disk in some manner. This means you are hard-bound by the memory limit on your server (20 MB). The manner in which the $_FILES superglobal is initialized after the request is received depends upon this, as PHP will attempt to migrate that multipart data to a tmp directory.
Something similar can be achieved with sockets, as this will at least circumvent the HTTP protocol, but if the file is passed in the HTTP request, PHP is still going to attempt to save it statefully in memory before doing anything with it. You'd have the tail end of the process set up with no practical way of getting that far.
The Stream library comes close, but it still relies on reading the file out of memory on the server side - the file has to already be there.
What you are describing is a little bit outside the HTTP protocol, especially since the request body is so large. HTTP is a request/response mechanism, and one depends upon the other... it's very difficult to accomplish an in-place, streaming upload at an intermediary point, since this would imply some protocol that uploads while the bits are streamed in.
One might argue this is more a limitation of HTTP than of PHP, and since PHP is designed expressly with the HTTP protocol in mind, you are moving outside its comfort zone.
Deployments like this are regularly attempted with great success using other scripting languages (Twisted in Python, for example; a lot of people are getting on board with Node.js for its concurrent design pattern; there are alternatives in Ruby and Java that I know much less about).
I was wondering whether the same-origin policy is violated if you retrieve an image using PHP and cURL and then manipulate it using HTML5 canvas. I know getImageData and putImageData don't work for images fetched from different (sub)domains. I'm not too familiar with the cURL library, but I hear that the output you fetch can either be echoed immediately to the browser or saved on the server for later use. With the immediate-output method, does the same-origin policy still prevent me from manipulating the remote image data on my local machine however I want?
In the end, what I intend to do is use a CDN to store images and have a web server retrieve them and manipulate them (using canvas) upon client request.
cURL seems easy, and I'll take the time to learn it if anyone has experience on the subject.
Do you think hosting the images on the CDN in base64 and translating them on the server is a practical idea? I'm just throwing ideas out there.
Provided the PHP script you use is in the same domain: if you retrieve an object from a remote server and deliver it down to a client connected to your server, then from the client's perspective it does indeed come from your server, hence it is not a same-origin violation. This is true whether you retrieve it with cURL and immediately dump it down to the client browser, or hang onto it in memory, modify it, and then send it down to the browser.
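A same-domain proxy of that kind can be as small as this (a sketch; the CDN URL is a placeholder, and a real proxy should validate or whitelist what it is willing to fetch):

```php
<?php
// Sketch: fetch an image server-side and re-serve it from this domain,
// so that canvas getImageData/putImageData see a same-origin image.
$ch = curl_init('https://cdn.example.com/photo.png');   // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch);
curl_close($ch);

header('Content-Type: image/png');
echo $data;
```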