I have spent three weeks looking for this:
I have a page which has to download a file after verifying the data given by the user. I do the validation and execute an external script which gives me a URL, and I use that URL to download another file which I'm going to execute later.
I've tried to download the file with curl, wget, file_put_contents, JavaScript, AJAX and jQuery with no luck. The file is 150MB+, so I created a nice progress bar to tell the user how the download is going, and I already have a method to read the downloaded size.
In fact I'm able to download the file with cURL, but the problem is that "do_form.php" won't load until the file is completely downloaded. I want to load the page first, then download the file in the background, so I can show the user the progress of the download.
Please tell me that this is possible... Thanks!!
Have a look at Gearman. Maybe a Gearman background job fits your needs; the job can even run on the same machine as the web server. Of course, responding to the page early has the disadvantage that the response contains no success information, so you will need to poll for that via AJAX.
If you decide to do so, you should also have a look at GearmanManager, as it eases things a lot.
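A minimal sketch of what the Gearman approach could look like (assuming gearmand is running locally and the pecl/gearman extension is installed; the job name `download_file`, the URL and the destination path are made up for illustration):

```php
<?php
// Helper: percentage downloaded so far (pure, no Gearman needed).
function download_percent(int $bytesSoFar, int $totalBytes): float {
    if ($totalBytes <= 0) {
        return 0.0;
    }
    return min(100.0, $bytesSoFar / $totalBytes * 100.0);
}

// Client side (do_form.php): queue the job and return to the browser
// immediately; the page then polls a progress endpoint via AJAX, e.g.
// comparing filesize() of the destination against the expected size.
if (class_exists('GearmanClient')) {
    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730);
    // doBackground() returns right away with a job handle.
    $handle = $client->doBackground('download_file', json_encode([
        'url'  => 'http://example.com/big-file.bin',  // from your external script
        'dest' => '/tmp/big-file.bin',
    ]));
}

// Worker side (run separately, e.g. via GearmanManager):
// $worker = new GearmanWorker();
// $worker->addServer('127.0.0.1', 4730);
// $worker->addFunction('download_file', function (GearmanJob $job) {
//     $args = json_decode($job->workload(), true);
//     $in  = fopen($args['url'], 'rb');
//     $out = fopen($args['dest'], 'wb');
//     stream_copy_to_stream($in, $out);  // progress = filesize($args['dest'])
// });
```

The progress endpoint only needs `download_percent(filesize($dest), $expectedTotal)` to feed the progress bar the asker already built.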
I am trying to process a user-uploaded file in real time on the web server,
but it seems Apache invokes PHP only once the complete file has been uploaded.
When I uploaded the file using cURL and set
Transfer-Encoding: chunked
I had some success, but I can't do the same thing via the browser.
I used Dropzone.js, but when I tried to set the same header it said Transfer-Encoding is an unsafe header and refused to set it.
This answer explains the issue:
Can't set Transfer-Encoding: "chunked" from the browser
In a nutshell, the problem is: when a user uploads a file to the web server, I want the web server to start processing it as soon as the first byte is available.
By "process" I mean piping it to a named pipe.
I don't want 500MB to first get uploaded to the server and only then start processing it.
But with the current web server stack (Apache + PHP), I can't seem to accomplish this.
Could someone please explain what technology stack or workarounds to use, so that I can upload a large file via the browser and start processing it as soon as the first byte is available?
It is possible to use Node.js/multiparty to do that. Here they have an example of a direct upload to Amazon S3: this is the form, which sets the content type to multipart/form-data, and here is the function for processing form parts. The part parameter is a ReadableStream, which allows per-chunk processing of the input using the data event.
More on readable streams in Node.js is here.
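On the PHP side, something similar is possible when the body is sent raw (e.g. a PUT or a non-multipart POST): php://input can be read in chunks while the request is still arriving. A hedged sketch of that per-chunk loop (the function name and chunk size are my own, for illustration):

```php
<?php
// Copy a stream in small chunks, invoking a callback per chunk -- the PHP
// analogue of multiparty's per-chunk "data" event. With a raw request body
// (not multipart/form-data), php://input can often be read before the full
// body has arrived, so each chunk can be piped onward immediately.
function copy_in_chunks($in, $out, int $chunkSize, ?callable $onChunk = null): int {
    $total = 0;
    while (!feof($in)) {
        $chunk = fread($in, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        fwrite($out, $chunk);
        $total += strlen($chunk);
        if ($onChunk !== null) {
            $onChunk($chunk, $total);  // e.g. update a progress counter
        }
    }
    return $total;
}

// In an upload handler the streams would be:
//   $in  = fopen('php://input', 'rb');          // request body
//   $out = fopen('/path/to/named_pipe', 'wb');  // your FIFO
```

Whether this helps in practice depends on the server buffering the request body; with Apache's default handling of multipart uploads it will not, which is exactly the asker's problem.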
If you really want that (sorry, I don't think it's a good idea) you should look for a FUSE filesystem that does the job.
Maybe there is already one: https://github.com/libfuse/libfuse/wiki/Filesystems
Or you could write your own.
But remember: as soon as the upload completes and the POST script finishes its job, the temp file will be deleted.
You can upload the file with an HTML5 resumable-upload tool (like Resumable.js) and process the uploaded parts as soon as they are received.
Alternatively, as a workaround, you could find the path of the partially uploaded file (usually in /tmp) and write a background job to stream it to the third-party app. That may be harder, though.
There may be other solutions...
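A rough sketch of the receiving end for the Resumable.js suggestion (the `resumableChunkNumber`/`resumableFilename` field names are that library's defaults; the paths are illustrative):

```php
<?php
// Append one uploaded chunk to the assembled file. Resumable.js sends each
// chunk as an ordinary POST with metadata fields, so processing can start
// as soon as chunk 1 lands rather than after the whole 500MB.
function append_chunk(string $destPath, string $chunkData): int {
    // 'ab' keeps earlier chunks. With out-of-order delivery you would
    // write per-chunk temp files instead and concatenate at the end.
    $fh = fopen($destPath, 'ab');
    $written = fwrite($fh, $chunkData);
    fclose($fh);
    return $written;
}

// Sketch of the endpoint itself (untested, for illustration):
// $n    = (int) $_POST['resumableChunkNumber'];
// $dest = '/tmp/upload_' . basename($_POST['resumableFilename']);
// append_chunk($dest, file_get_contents($_FILES['file']['tmp_name']));
```

Each `append_chunk()` call is also the natural place to push the same bytes into the named pipe the asker mentioned.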
I need to let the user download a file (for example, a PDF). Which will take longer:
sending the file via PHP (with specific headers),
or putting it in a public HTTP folder and giving the user the public link to download it (without PHP's help)?
In the first case the original file can stay in a private zone,
but I suspect it will take some time for PHP to send the file.
So how can I measure the time PHP spends sending the file, and how much memory it consumes?
P.S. In the first case, once PHP has sent the headers and the browser (if a PDF plugin is installed) tries to open the file inline, is PHP still working, or does it push out the whole file immediately after the headers are sent? And if the plugin is not installed and the browser shows a "save as" dialog, is PHP still working?
There will be very little in it if you are worried about download speeds.
I guess it comes down to how big your files are, how many downloads you expect, whether your documents should be publicly accessible, and the download speed of the client.
Your main issue with PHP is the memory it consumes: each download will occupy a PHP process, which could be 8MB-20MB depending on what your script does, whether you use a framework, etc.
Out of interest, I wrote a symfony application to offer downloads, and to do things like concurrency limiting, bandwidth limiting etc. It's here if you're interested in taking a look at the code. (I've not licensed it per se, but I'm happy to make it GPL3 if you like).
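To answer the measurement part of the question, here is a rough sketch (the function name is mine; the header lines are commented out so the measurement works from the command line too):

```php
<?php
// Rough measurement of the time and peak memory PHP spends sending a file.
// readfile() streams the file to the output buffer in fixed-size chunks,
// so peak memory stays small even for large files.
function send_and_measure(string $path): array {
    $t0 = microtime(true);
    // In a real download script these headers come first:
    // header('Content-Type: application/pdf');
    // header('Content-Disposition: attachment; filename="doc.pdf"');
    // header('Content-Length: ' . filesize($path));
    $bytes = readfile($path);  // returns the number of bytes read
    return [
        'bytes'       => $bytes,
        'seconds'     => microtime(true) - $t0,
        'peak_memory' => memory_get_peak_usage(true),
    ];
}
```

On the P.S.: readfile() keeps the PHP process busy until the output has been flushed toward the client, so PHP is still "working" for the duration of the transfer, regardless of whether the browser opens the PDF inline or shows a save-as dialog.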
I'm using PHP and my script outputs a list of links for files that can be downloaded by the user. To download each file, as a user I would have to copy the url and paste it into something like Free Download Manager.
I would like to improve the user experience and have a "Download" button that would handle or initiate the download process.
I'm thinking of either writing PHP code to act as a download manager, or tying the "Download" button to the functionality of a Firefox (or similar) add-on that acts as a download manager. Does anyone have good suggestions for the sort of thing I'm trying to do?
Update:
Let's say that:
- my script presents the user with a list of files that can be downloaded.
- next to each file, there's a checkbox, then at the bottom a button that says "download selected".
If this is the setup I have, then with force-download, clicking the "download selected" button will force-download 12 files at the same time, so it's not exactly like a download manager. I'm thinking this probably requires something that takes both PHP and the browser's behavior into account.
You can use PHP's header() to force a download, one file at a time, as many times as needed.
Some links for reference:
http://w-shadow.com/blog/2007/08/12/how-to-force-file-download-with-php/
http://www.ryboe.com/tutorials/php-headers-force-download
Another good example from php.net: readfile()
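What those tutorials boil down to is roughly this (the filename sanitizer is my own addition, to keep untrusted names from breaking the Content-Disposition header):

```php
<?php
// Strip path components and anything that could break the header value.
function safe_download_name(string $name): string {
    return preg_replace('/[^A-Za-z0-9._-]/', '_', basename($name));
}

// Force the browser's save-as dialog with header() + readfile().
function force_download(string $path): void {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . safe_download_name($path) . '"');
    header('Content-Length: ' . filesize($path));
    readfile($path);
    exit;
}
```

For the "download selected" case, pointing each selection at this script one at a time (for example sequentially from JavaScript, or by zipping the selection server-side first) avoids firing 12 simultaneous downloads.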
How can I create a file upload progress bar with PHP and jQuery? Please don't refer me to Flash stuff like Uploadify. I want to create my own.
Just store and update the progress in a server-side session and use repeated AJAX calls from the client side to obtain the current progress from that session until it reaches 100%. Long story short, here's a clear tutorial on how to do it with PHP and jQuery: How to build an ajax progress bar with jQuery and PHP.
For the server-side part you need at least PHP 5.2 with the PECL Uploadprogress extension. You can find a blog about it here: PECL Uploadprogress example. This comment by jazfresh on php.net is also helpful.
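The progress endpoint the jQuery side polls could look roughly like this (assuming the PECL uploadprogress extension; the form must include a hidden UPLOAD_IDENTIFIER field whose value is passed back here as ?id=...):

```php
<?php
// Pure helper: percent complete, guarded against a zero total.
function progress_percent(int $uploaded, int $total): int {
    return $total > 0 ? (int) round($uploaded / $total * 100) : 0;
}

// Endpoint sketch: return JSON like {"percent": 42} for the AJAX poller.
if (function_exists('uploadprogress_get_info') && isset($_GET['id'])) {
    $info = uploadprogress_get_info($_GET['id']);
    $pct  = $info
        ? progress_percent((int) $info['bytes_uploaded'], (int) $info['bytes_total'])
        : 100;  // no info left usually means the upload has finished
    header('Content-Type: application/json');
    echo json_encode(['percent' => $pct]);
}
```

The jQuery side then just calls this URL on an interval and widens the bar until the percent hits 100.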
If you don't want to use a prebuilt one like SWFUpload, you'll have to get your ActionScript-fu ready and use the ExternalInterface API to make it talk to jQuery.
Basically you need to control how much data is sent in a given time span. Since you control neither the browser's data transfer nor how the browser reads data from your file, you can't do that with plain JavaScript.
You'll need some 3rd-party control, like Silverlight, Flash, or a Java applet. With those you are granted filesystem access, so you can control how the source file is read. Then, to build your progress bar, you just make several HTTP calls to your server application, sending the source file in small pieces.
To get file-upload progress you could use Flash. A tutorial with more info can be found here. Note that it uses .NET, though, not PHP.
Here is my situation:
I want to create an AJAX file-upload script which uploads to an external site (i.e. not the one the script is located on) and at the same time reports the progress of the upload. How would I go about doing this? Note that the process must be secure.
If you are POSTing the file to another server, there is no way to know the status of the upload, since the transfer happens between the user's browser and the remote site.
If you have access to the script that handles the file upload on the other site, you could use Zend_File_Transfer and Zend_ProgressBar to fetch the upload-progress information from the other site and display it on your page.
Note: to use Zend_ProgressBar you need APC or uploadprogress extension.
There are two ways to do it:
Using AJAX and CGI
Using Flash
The advantage of the Flash method is that it does not require you to rewrite any server-side scripts. This is especially good if you upload to a server other than your own; you do need to put a cross-domain XML file on that server, though.
The advantage of the AJAX version is that it does not require your users to have Flash installed.
There is no way to get the exact progress using AJAX and plain PHP. With PHP and AJAX you can only know whether the upload is in progress or finished. That is the reason all AJAX/PHP applications have a loading indicator but no progress bar. If you explicitly want a progress bar you should use a Perl CGI.
Flash (SWFUpload) is probably the easiest; Vimeo.com uses SWFUpload to achieve this as well. The only other method I know of involves PHP and APC; a tutorial can be found at http://phpriot.com/articles/php-ajax-file-uploads.
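The APC approach from that phpriot article works roughly like this (it requires apc.rfc1867 = 1 in php.ini, and the upload form must place a hidden APC_UPLOAD_PROGRESS field before the file input; the ?key= parameter name here is my own):

```php
<?php
// Pure helper: percent complete from an APC upload-status array, which
// carries (among others) 'current' and 'total' byte counts.
function apc_percent(array $status): int {
    $total = (int) ($status['total'] ?? 0);
    return $total > 0 ? (int) round(((int) $status['current']) / $total * 100) : 0;
}

// Endpoint sketch polled via AJAX: APC stores the status of an in-flight
// upload under the key 'upload_' plus the APC_UPLOAD_PROGRESS value.
if (function_exists('apc_fetch') && isset($_GET['key'])) {
    $status = apc_fetch('upload_' . $_GET['key']);
    header('Content-Type: application/json');
    echo json_encode(['percent' => $status ? apc_percent($status) : 0]);
}
```

This is the same polling pattern as the uploadprogress-extension approach, just backed by APC instead.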