I would like to understand whether this kind of application is suitable for Apache/PHP.
I want to build something like the WeTransfer service: I upload a big file to the web application and it produces a public link to download the file.
I have some doubts about the streaming/buffering capabilities of an Apache/PHP web server.
Can you confirm that, if I want to upload or download a 2GB file with a PHP script, I need to set memory_limit to at least 2GB because POST data is not buffered?
I want to upload a large file from my computer to an S3 server without editing php.ini. First, I choose a file with the browse button and submit the upload form, and the file is then uploaded to the S3 server. But I can't post the form's file data when the file is large, and I don't want to edit php.ini. Is there any way to upload a large local file to an S3 server?
I've done this by implementing Fine Uploader's PHP integration for S3. As of recently it is available under an MIT license. It's an easy way to upload huge files to S3 without changing your php.ini at all.
It's not the worst thing in the world to set up. You'll need to set some environment variables for the public/secret keys, set up CORS settings on the bucket, and write a PHP page based on one of the examples, which will call a PHP endpoint that handles the signing.
One thing that was not made obvious to me was that, when setting the environment variables, they expect you to create two separate AWS users with different privileges, for security reasons.
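For reference, the signing endpoint itself can be quite small. This is only a rough sketch, assuming the older v2-style signature scheme and a hypothetical AWS_SECRET_KEY environment variable; Fine Uploader's bundled examples do more, such as validating the policy and handling chunked uploads:

<?php
// s3-signer.php - rough sketch of a POST-policy signing endpoint (v2-style signing)
$secret = getenv('AWS_SECRET_KEY'); // placeholder name: the signing-only user's secret key

$policyJson = file_get_contents('php://input'); // policy document POSTed by the uploader
$encodedPolicy = base64_encode($policyJson);

// HMAC-SHA1 over the base64-encoded policy, base64-encoded again for the response
$signature = base64_encode(hash_hmac('sha1', $encodedPolicy, $secret, true));

header('Content-Type: application/json');
echo json_encode(array('policy' => $encodedPolicy, 'signature' => $signature));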
ini_set("upload_max_filesize","300M");
try this
If I do
// copy() can read from a URL when allow_url_fopen is enabled in php.ini
$file = 'http://notmywebsite.com/verybigimage.png';
$newfile = 'test.png';
copy($file, $newfile);
will my server download http://notmywebsite.com/verybigimage.png?
If yes, how can I make my user download http://notmywebsite.com/verybigimage.png without my server downloading it?
It's not possible to have the client (the user) download a file and then copy that file to the server in a seamless way, because the two environments live on different systems. To do this, the client would have to upload the file back to the server after downloading it, which would be a pain for the user and slower than simply letting the server download the file and copy it there, if the ultimate goal is having a copy of the file on the server.
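If you do let the server download it, a stream-based copy keeps memory use low even for very big files (a minimal sketch, assuming allow_url_fopen is enabled):

$src = fopen('http://notmywebsite.com/verybigimage.png', 'rb');
$dst = fopen('test.png', 'wb');
stream_copy_to_stream($src, $dst); // copies chunk by chunk, never the whole file in memory
fclose($src);
fclose($dst);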
I need to write a script to upload big files (~2GB+) to a server.
I don't think HTTP is the right way to do this so I want to use (S)FTP.
There are several tutorials about this (using cURL or ftp_connect) and I understand that I have to set several things in php.ini.
But all these tutorials upload the file to a remote server; what I want to do is upload it to the server the script is running on, without having to upload the file to that server over HTTP first.
Is this possible? If so, how would I do that?
HTTP can be the right way to upload large files. You can use resumable.js or a similar library to split the file into "chunks" and then reassemble the file on the server.
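A rough sketch of the server side of such a chunked upload; the parameter names follow resumable.js defaults (resumableChunkNumber and friends), but treat them as assumptions and check the library's documentation:

<?php
// chunk-upload.php - reassemble chunks as they arrive (sketch, no error handling)
$identifier  = basename($_POST['resumableIdentifier']); // basename() strips any path parts
$chunkNumber = (int) $_POST['resumableChunkNumber'];    // chunks are numbered from 1
$totalChunks = (int) $_POST['resumableTotalChunks'];

$chunkDir = sys_get_temp_dir() . '/chunks_' . $identifier;
if (!is_dir($chunkDir)) {
    mkdir($chunkDir);
}
move_uploaded_file($_FILES['file']['tmp_name'], $chunkDir . '/' . $chunkNumber);

// Once every chunk is present, stitch them together in order
if (count(glob($chunkDir . '/*')) == $totalChunks) {
    $out = fopen('/path/to/uploads/' . $identifier, 'wb'); // hypothetical target directory
    for ($i = 1; $i <= $totalChunks; $i++) {
        fwrite($out, file_get_contents($chunkDir . '/' . $i));
    }
    fclose($out);
}

Each chunk is an ordinary small POST, so upload_max_filesize and post_max_size only need to cover a single chunk (say 1MB), not the whole 2GB file.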
If you decide not to go with HTTP and have shell access, I recommend you use rsync (with the --partial flag), which will do the heavy lifting for you.
I need a Flash uploader to use in my CMS project.
I need something like this, but with a greater max upload size (it doesn't allow uploading files larger than ini_get('upload_max_filesize')).
My server doesn't allow me to override ini settings, so I'm looking for an uploader that can upload large files independently of the ini settings.
If you want to get around the ini limit, one option would be to switch to an FTP uploader.
I once used net2ftp and it was easy enough to install; I haven't used it since (almost a year and a half ago), but I see from their page that the project is still updated and not dead, so you might give it a try.
You just download the package, place it in your webapp, customize it, and you're set.
You might want to create a dedicated FTP user with appropriate permissions, and not use the root one, of course.
You won't be able to post more data to the server than upload_max_filesize and post_max_size allow.
As a workaround you can upload the data to Amazon S3 and sync it back via s3sync.
We have a setup with plupload in place for one of our clients and are able to upload up to 2GB per file (that's a client restriction; I don't know about S3's limits).
Mind you that S3 costs some money.
I'd like to have my PHP script upload a file with a certain filename into a directory of my choosing. The catch is that I need it to exist there immediately upon upload so I can monitor it on my server. I don't want to use a PHP extension or anything like that; this should be very easy to transfer to any PHP setup.
So basically: Is there a way to guarantee that, from the very beginning of the file upload process, the file has a certain name and location on the server?
Not that I'm aware of.
PHP will use the php.ini-defined tmp folder (upload_tmp_dir) to store uploads until you copy them to their correct location with move_uploaded_file(). So it's very easy to know the location, but the file name is random and I don't think you can define it.
If you're not going to have multiple concurrent uploads (for example if only you are going to upload files and you know you won't upload 2 files at the same time), you could check the most recent upload file in the tmp directory.
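A sketch of that approach; note it is fragile and only works while the upload request is still in flight, because PHP deletes the temp file when the request finishes:

<?php
// Find the newest upload temp file; PHP typically names them "php" + a random suffix
$tmpDir = ini_get('upload_tmp_dir') ?: sys_get_temp_dir();
$newest = null;
foreach (glob($tmpDir . '/php*') as $candidate) {
    if ($newest === null || filemtime($candidate) > filemtime($newest)) {
        $newest = $candidate;
    }
}
// $newest now holds the most recently modified temp upload, or null if none was found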
The common solution for monitoring uploads is the APC extension's apc.rfc1867 upload-progress feature.
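A minimal sketch, assuming apc.rfc1867 = 1 is enabled in php.ini: the form includes a hidden APC_UPLOAD_PROGRESS field before the file input, and a second script polls the cache while the upload runs.

<!-- the hidden field must come BEFORE the file input -->
<form enctype="multipart/form-data" method="POST" action="upload.php">
    <input type="hidden" name="APC_UPLOAD_PROGRESS" value="myUploadKey"/>
    <input type="file" name="userfile"/>
    <input type="submit" value="Upload"/>
</form>

<?php
// progress.php - poll via Ajax during the upload, e.g. progress.php?key=myUploadKey
$status = apc_fetch('upload_' . $_GET['key']); // APC stores progress under "upload_" + key
if ($status !== false) {
    echo json_encode(array(
        'current' => $status['current'], // bytes received so far
        'total'   => $status['total'],   // total bytes expected
        'done'    => $status['done'],
    ));
}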
I know of three options:
apc.rfc1867 (as mentioned by others), which allows you to poll upload progress using Ajax
Flash-based uploaders like SWFUpload, which allow you to poll upload progress using JavaScript
Create a PHP command-line daemon listening on port 80 that accepts file uploads and uses shared memory (or some other mechanism) to communicate upload progress. I wish I could find the link, but I read a great article about a site that allowed users to upload their iTunes library XML file, and the file was processed live by the server as it was being uploaded. Very cool, but obviously more involved than the previous options.
I have had decent luck with SWFUpload in the past.
I don't think you can configure the name, as it will be a random name in the temporary folder. You should be able to change the directory via the upload_tmp_dir setting in php.ini.
As far as I know, this isn't possible with PHP, as a file upload request submits the entire file to the system in one request. So there is no way for the PHP server to know what is happening until it receives the whole request.
There is no way to monitor file upload progress using PHP alone, as PHP does not dispatch progress events during the upload. It is possible with a Flash uploader, even when Flash uploads via a PHP script: Flash polls the temporary file on the server during the upload to dispatch progress events. Some JavaScript frameworks like YUI use a SWF to manage uploads. Check out YUI's Uploader widget.
http://developer.yahoo.com/yui/uploader/