Accessing file data before the upload completes in PHP

I am currently uploading a file to my server via HTTP. All is well with the upload, but I wanted to know if there is a way to access the data of an upload in the $_FILES global before the upload has completed, so I can open another stream to a server and push the data over as it comes in.
Previously in Java I have had read and write streams open at once, rather than reading and then writing (working with parsing flat files), and this dramatically increased the speed, so I guess I'm searching for a PHP equivalent for file uploads. Any suggestions?

Maybe this will give you a path to explore.
https://developer.mozilla.org/en-US/docs/Web/API/FileReader
Check the FileReader.readAsDataURL() section in particular. The FileReader object is mostly used when an image upload needs a preview before uploading. Search for it and you may get an idea.

Related

Process Uploaded file on web server without storing locally first?

I am trying to process a user-uploaded file on the web server in real time,
but it seems Apache invokes PHP only once the complete file has been uploaded.
When I uploaded the file using cURL and set
Transfer-Encoding: "chunked"
I had some success, but I can't do the same thing via the browser.
I used Dropzone.js, but when I tried to set the same header it said Transfer-Encoding is an unsafe header and refused to set it.
This answer explains what the issue is there:
Can't set Transfer-Encoding: "Chunked" from the browser
In a nutshell, the problem is: when a user uploads a file to the web server, I want the web server to start processing it as soon as the first byte is available.
By "process" I mean piping it to a named pipe.
I don't want 500 MB to be uploaded to the server first and only then start processing it.
But with the current web server stack (Apache + PHP) I can't seem to accomplish that.
Could someone please explain what technology stack or workarounds to use so that I can upload a large file via the browser and start processing it as soon as the first byte is available?
It is possible to use Node.js/Multiparty to do that. Here they have an example of a direct upload to Amazon S3. This is the form, which sets the content type to multipart/form-data. And here is the function for processing form parts. The part parameter is of type ReadableStream, which allows per-chunk processing of the input using the data event.
More on readable streams in Node.js is here.
If you really want that (sorry, I don't think that's a good idea), you should try looking for a FUSE filesystem that does the job.
Maybe there is already one: https://github.com/libfuse/libfuse/wiki/Filesystems
Or you could write your own.
But remember: as soon as the upload has completed and the POST script has finished its job, the temp file will be deleted.
You can upload the file with HTML5 resumable upload tools (like Resumable.js) and process the uploaded parts as soon as they are received; a sketch of the receiving side follows below.
Or, as a workaround, you may find the path of the uploaded file (usually in /tmp) and then write a background job to stream it to the third-party app. That may be harder.
There may be other solutions...
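For the chunked-upload route, the receiving script might look roughly like the following. This is only a minimal sketch: the endpoint name, the "file" form field, and the /tmp/upload_fifo FIFO are assumptions, and a real version would also need to handle chunk ordering and authentication.

<?php
// upload_chunk.php -- hypothetical endpoint; assumes the client (e.g. Resumable.js)
// POSTs one part at a time as a normal multipart upload in a field named "file".
// Each part is appended to a named pipe so a consumer process can start working
// on the data long before the whole file has arrived.

$fifo = '/tmp/upload_fifo';   // assumed FIFO, created beforehand with posix_mkfifo()

if (!isset($_FILES['file']) || $_FILES['file']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('missing or failed chunk');
}

$in  = fopen($_FILES['file']['tmp_name'], 'rb');
$out = fopen($fifo, 'ab');    // blocks until some reader has the FIFO open

while (!feof($in)) {
    fwrite($out, fread($in, 8192));   // push the chunk through in 8 KB blocks
}

fclose($in);
fclose($out);
echo 'ok';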

Large file upload through Browser (100 GB)

Is there any way to upload large files (more than 80 GB) through a web browser? Previously I have been uploading files (img, png, jpg) using Plupload, but it does not seem to work for larger files. I would also like to know how to implement a web page where users could upload like Mega.co.nz or Drive.google.com.
If it is impossible to do it using web development tools, can anyone guide me about how I can divide & upload a file in segments?
Thanks.
You can use the JavaScript Blob object to slice large files into smaller chunks and transfer these to the server to be merged together. This has the added benefit of being able to pause/resume uploads and indicate progress.
If you don't fancy doing it yourself, there are existing solutions that use this approach. One example is HTML5 Uploader by Filkor.
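If you do roll your own, the server has to reassemble the pieces in order. A rough PHP sketch is below; the field names chunkIndex, totalChunks, fileId and chunk are hypothetical and would be whatever your JavaScript sends along with each slice.

<?php
// merge_chunks.php -- sketch of the receiving side of a Blob.slice() upload.
// Every request carries one slice plus the (assumed) fields listed above.

$dir   = sys_get_temp_dir() . '/chunks_' . preg_replace('/\W/', '', $_POST['fileId']);
$index = (int) $_POST['chunkIndex'];
$total = (int) $_POST['totalChunks'];

if (!is_dir($dir)) {
    mkdir($dir, 0700, true);
}

// store this slice under its index
move_uploaded_file($_FILES['chunk']['tmp_name'], "$dir/$index");

// once every slice has arrived, stitch them together in order
if (count(glob("$dir/*")) === $total) {
    $out = fopen(sys_get_temp_dir() . '/merged_' . basename($dir), 'wb');
    for ($i = 0; $i < $total; $i++) {
        fwrite($out, file_get_contents("$dir/$i"));
        unlink("$dir/$i");
    }
    fclose($out);
    rmdir($dir);
}
echo 'ok';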
If I were you I would use something like FTP to accomplish this. If you can use ASP.NET there are already good libraries for file transfer.
Here is a post that shows an example of uploading a file: Upload file to ftp using c#
The catch is that you will need a server. I suggest FileZilla. https://filezilla-project.org/

Mp3 streaming/downloading website - apache server memory issue

I have a website on which users can upload MP3 files (Uploadify), stream them using an HTML5 player (jPlayer) and download them using a PHP script (www.zubrag.com/scripts/).
When a user uploads a song, the path to the audio file is saved in the database, and I'm using that data in order to play the song and show a download link for it.
The problem I'm experiencing is that, according to my host, this method is using a lot of memory on the server, which is dedicated.
Link to script: http://pastebin.com/Vus8SRa7
How should I handle the script properly? And what would be the best way to track down the problem? Any ideas on cleaning up the code?
Any help much appreciated.
I would recommend storing your files on disk (named something random [check for collisions!] or sequential, without file extension, and outside of the doc root), and only store information in your DB. It's much easier to stream a file from disk this way than it is out of a database result.
When you retrieve an entire file's contents out of a database result, that data has to be in memory. readfile() doesn't have this issue. Use headers to return the original file name when sending the file back to the client, if you wish.
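A minimal sketch of such a download script, assuming a PDO connection in $pdo and a table with (hypothetical) disk_path and original_name columns:

<?php
// download.php?id=... -- look up the on-disk location and original name,
// then stream the file straight from disk to the client.

$stmt = $pdo->prepare('SELECT disk_path, original_name FROM songs WHERE id = ?');
$stmt->execute([$_GET['id']]);
$row = $stmt->fetch();

if (!$row) {
    http_response_code(404);
    exit;
}

header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($row['disk_path']));
header('Content-Disposition: attachment; filename="' . $row['original_name'] . '"');

// readfile() streams from disk without holding the whole file in memory
readfile($row['disk_path']);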
I would suggest that you not buffer the content when you are writing the binary data of the MP3 to your HTTP output. That way you'd save a lot of physical and virtual memory.

How can I upload an image from source URL to some destination URL?

Folks
I have an image at some server (SOURCE)
i.e. http://stagging-school-images.s3.amazonaws.com/2274928daf974332ed4e69fddc7a342e.jpg
Now I want to upload it to somewhere else (DESTINATION)
i.e. example.mysite.com/receiveImage.php
First, I copy the image from the source to my local server and then upload it to the destination.
It's working perfectly but taking too much time, as it copies the image and then uploads it...
I want to make it more simple and optimized by directly uploading image from source URL to destination URL.
Is there a way to handle this?
I am using php/cURL to handle my current functionality.
Any help would be very much appreciated.
Cheers !!
If example.mysite.com/receiveImage.php is your own service, then you may:
pass the SOURCE URL to your PHP script as a GET or POST parameter;
in the PHP script, use the file_get_contents() function to obtain the image by URL and save it to your storage (a sketch follows below).
Otherwise it's impossible by means of HTTP.
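A minimal sketch of that receiveImage.php, assuming the source URL arrives in a POST field named "src" (the field name and the uploads/ directory are assumptions):

<?php
// receiveImage.php -- sketch of the "pass the source URL" approach described above.

$src  = $_POST['src'];
$name = basename(parse_url($src, PHP_URL_PATH));   // crude; validate properly in real code
$dest = __DIR__ . '/uploads/' . $name;

$data = file_get_contents($src);   // download from the source...
if ($data === false) {
    http_response_code(502);
    exit('could not fetch source');
}

file_put_contents($dest, $data);   // ...and save it into local storage
echo 'stored as ' . $name;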
However, there are some ways to increase the upload speed a little:
If the files are huge, you may use two threads: one for downloading (it stores all downloaded data in a buffer) and one for uploading (it takes all available data from the buffer and uploads it to the site). As far as I know, this can't be done easily with PHP, because multi-threading is not supported out of the box.
If there are many files, you may use multiple threads/processes that download and upload simultaneously.
By the way, these measures do not eliminate the double traffic through your intermediate service.
One of the services may have a form somewhere that will allow you to specify a URL to receive from/send to, but there is no generic HTTP mechanism for doing so.
The only way to transfer the image directly from source to destination is to initiate the transfer from either the source or the destination. You can't magically beam the data between these two locations without them talking to each other directly. If you can log in to your mysite.com server via SSH, you could download the image directly from there. You could also write a script that runs on mysite.com and downloads the image directly from the source.
If that's not possible, the best alternative may be to play around with fread/fwrite instead of curl. This should allow you to read a little bit from the source, then directly upload that bit to the destination so download and upload can work in parallel. For huge files this should make a real difference, for small files on a decent connection it probably won't.
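One way to get that fread/fwrite-style overlap with the tools PHP already has is to hand cURL a read handle opened on the source URL, so the upload consumes the download stream as it arrives. This is only a sketch and assumes the destination accepts a raw PUT body, which the real receiveImage.php may well not:

<?php
// Streaming copy sketch: cURL reads its request body from the source stream
// while it uploads, so download and upload overlap instead of running
// one after the other.

$source      = 'http://stagging-school-images.s3.amazonaws.com/2274928daf974332ed4e69fddc7a342e.jpg';
$destination = 'http://example.mysite.com/receiveImage.php';

$in = fopen($source, 'rb');                       // lazy HTTP read stream

$ch = curl_init($destination);
curl_setopt($ch, CURLOPT_UPLOAD, true);           // send a PUT request...
curl_setopt($ch, CURLOPT_INFILE, $in);            // ...whose body is read from $in as needed
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);
fclose($in);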
Create two text fields, one for the URL and one for the file name.
In PHP, use the following, where $uploadDir is the path to your upload directory ;)
copy($url, $uploadDir.'/'.$fileName);

PHP upload filename

I'd like to have my PHP script upload a file with a certain filename to a directory of my choosing. However, the catch is that I need it to exist there immediately upon upload so I can monitor it on my server. I don't want to use a PHP extension or something - this should be very easy to transfer to any PHP setup.
So basically: Is there a way to guarantee that, from the very beginning of the file upload process, the file has a certain name and location on the server?
Not that I'm aware of.
PHP will use the php.ini-defined tmp folder to store uploads until you copy them to their correct location with move_uploaded_file(). So it's very easy to know its location, but the file name is random and I don't think you can define it.
If you're not going to have multiple concurrent uploads (for example if only you are going to upload files and you know you won't upload 2 files at the same time), you could check the most recent upload file in the tmp directory.
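The closest you can get with stock PHP is to give the file its final name and location the moment your script starts, i.e. right after the upload has finished. A minimal sketch, assuming a form field named userfile and a target directory of /var/www/incoming (both assumptions):

<?php
// The upload lands in the php.ini upload_tmp_dir under a random name;
// this is the earliest point at which you can choose the name and location.

$target = '/var/www/incoming/' . basename($_FILES['userfile']['name']);

if (move_uploaded_file($_FILES['userfile']['tmp_name'], $target)) {
    echo "stored at $target";
} else {
    http_response_code(400);
    echo 'upload failed';
}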
The common solution for monitoring uploads is apc.rfc1867.
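A rough sketch of the polling side with apc.rfc1867 enabled (it assumes the default upload_ key prefix, and that the upload form contains a hidden APC_UPLOAD_PROGRESS field, placed before the file field, whose value matches the key polled here):

<?php
// progress.php -- polled via Ajax while the upload is running.
// Requires apc.rfc1867 = 1 in php.ini.

$status = apc_fetch('upload_' . $_GET['key']);

header('Content-Type: application/json');

if ($status === false) {
    echo json_encode(['percent' => 0]);           // upload not started yet
} else {
    $percent = $status['total'] > 0
        ? round($status['current'] / $status['total'] * 100)
        : 0;
    echo json_encode(['percent' => $percent, 'done' => (bool) $status['done']]);
}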
I know of three options:
RFC 1867 (as mentioned by others), which allows you to poll upload progress using Ajax
Flash-based uploaders like SWFUpload, which allow you to poll upload progress using JavaScript
Create a PHP command-line daemon listening on port 80 that accepts file uploads and uses shared memory (or some other mechanism) to communicate upload progress. Wish I could find the link, but I read a great article about a site that allowed users to upload their iTunes library XML file, and it was processed live by the server as it was being uploaded. Very cool, but obviously more involved than the previous options.
I have had decent luck with SWFUpload in the past.
I don't think you can configure the name, as it will be a random name in the temporary folder. You should be able to change the directory, but I can't seem to find the answer on Google (check out php.ini).
As far as I know, this isn't possible with PHP, as a file upload request submits the entire file to the system in one request. So there is no way for the PHP server to know what is happening until it receives the whole request.
There is no way to monitor file upload progress using PHP only, as PHP does not dispatch progress events during the upload. It is possible to do this with a Flash uploader even if Flash is uploading via a PHP script; Flash polls the temporary file on the server during the upload to dispatch progress events. Some JavaScript frameworks like YUI use a SWF to manage uploads. Check out YUI's Uploader widget.
http://developer.yahoo.com/yui/uploader/
