I'm currently using SWFUpload to upload files to my S3 bucket. And it's working great.
I'm using the script from a website here: http://www.anedix.com/news/article/50
Again, the upload to my S3 bucket works fine; however, I've been running into an issue when attempting to upload larger files. It seems that I cannot upload anything over 50MB. I have tried this from both my webhost and locally, using my local testing environment.
My question is this: when uploading with SWFUpload, it should be going straight to Amazon S3, correct? If so, then PHP settings such as upload_max_filesize should not affect it? (Even though in my local environment, I've set it to 1024MB.)
Essentially, the script shows the file uploading (it takes the appropriate amount of time), redirects to the success page, and does not throw any errors.
Any ideas on why this would be happening, or how I can troubleshoot this?
Thanks!
file_size_limit is also a SWFUpload setting; have you checked it?
Also allow roughly 30% extra, since the posted content is encoded before it's sent.
I use a jQuery upload script with PHP. It works fine; users can upload files of up to 2GB. I have one user struggling to upload files. I tested it remotely on his computer, and anything over 500KB does not even reach my PHP script. The same file uploads fine from my computer, using the same browser. I suspect a setting on the user's side, maybe in the browser (Chrome), but I don't even know where to begin. I'm not posting any code, since the code is fine for most users, and it's a bit involved to post here. Thank you.
I've been stuck on this specific problem for two days and can't find a solution.
I have a Laravel 7.0 project hosted on AWS Elastic Beanstalk, which is running fine. I also have an S3 bucket, used for saving videos that are uploaded to the server via a user form.
The problem is that smaller files (< 10 MB) are uploaded without a problem. But once it comes to bigger files, the Storage::disk('s3')->put('videos/lorem.mp4', fopen($request->file('file'), 'r+')); call returns false and the file is not uploaded to S3. If I use the 'public' disk instead of the 's3' disk, the file is uploaded without a problem.
I also tried to upload a file manually via the AWS CLI with the same IAM user, and it was uploaded without a problem.
PHP and nginx are correctly configured to accept big files.
I know this is a very specific question, but if anyone has a hint or a solution, please do share.
This is likely a timeout on the upload. Increasing PHP's max_execution_time might help; I'd try that first.
Otherwise, you should look into uploading to local disk and having an entirely separate process pick the files up from disk and upload them to S3; I was amazed at how much throughput you can gain from that approach. It's also worth checking out the job queues feature in Laravel (a sketch of the idea follows).
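A minimal sketch of that local-disk-plus-queue idea, assuming a standard Laravel 7 queue setup; the class name, disk names, and paths are illustrative, not from the original project:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;

class PushVideoToS3 implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $localPath;

    public function __construct(string $localPath)
    {
        $this->localPath = $localPath;
    }

    public function handle()
    {
        // Stream from local disk so large files are never held fully in memory.
        $stream = Storage::disk('local')->readStream($this->localPath);

        Storage::disk('s3')->put('videos/' . basename($this->localPath), $stream);

        if (is_resource($stream)) {
            fclose($stream);
        }

        Storage::disk('local')->delete($this->localPath);
    }
}

In the controller you would do something like $path = $request->file('file')->store('pending-videos'); PushVideoToS3::dispatch($path); and return immediately, so there is no long-running request left to time out.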
I want to upload a large file from my computer to the S3 server without editing php.ini. First, I choose a file with the browse button and submit it with the upload button, and it should then be uploaded to the S3 server. But the form's file data does not post when I upload a large file, and I don't want to edit php.ini. Is there any way to upload a large local file to the S3 server?
I've done this by implementing Fine Uploader's PHP implementation for S3. As of recently it is available under an MIT license. It's an easy way to upload huge files to S3 without changing your php.ini at all.
It's not the worst thing in the world to set up. You'll need to set some environment variables for the public/secret keys, set up CORS settings on the bucket, and write a PHP page, based on one of the examples, that calls a PHP endpoint to handle the signing.
One thing that was not made obvious to me: when setting the environment variables, they expect you to create two separate AWS users with different privileges, for security reasons.
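For reference, the heart of that signing endpoint is quite small. This is a rough sketch of the policy-signing step only, assuming the older AWS signature version 2 scheme used by Fine Uploader's classic (non-chunked) S3 flow; the environment variable name is illustrative:

<?php

// Fine Uploader POSTs the policy document as the raw request body.
$policy        = file_get_contents('php://input');
$encodedPolicy = base64_encode($policy);

// Sign the base64-encoded policy with the AWS secret key (signature v2).
$secret    = getenv('AWS_SECRET_ACCESS_KEY');
$signature = base64_encode(hash_hmac('sha1', $encodedPolicy, $secret, true));

// Return both values; the client attaches them to its POST to S3.
header('Content-Type: application/json');
echo json_encode(array(
    'policy'    => $encodedPolicy,
    'signature' => $signature,
));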
You might be tempted to try ini_set("upload_max_filesize","300M"); but that won't work: upload_max_filesize and post_max_size are PHP_INI_PERDIR settings, so they cannot be changed with ini_set() at runtime. They have to be raised before the request is processed.
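If editing php.ini is off the table, the usual alternatives are a .htaccess file (Apache with mod_php) or a .user.ini file (PHP-FPM/CGI); the values below are only examples, and post_max_size should be a bit larger than upload_max_filesize:

# .htaccess (Apache with mod_php)
php_value upload_max_filesize 300M
php_value post_max_size 320M

# .user.ini (PHP-FPM / CGI)
upload_max_filesize = 300M
post_max_size = 320M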
I am having a lot of trouble handling large file uploads on the client side. I am looking for a way to show the progress of an upload while a person uploads; then, when it finishes, I need the file to be handled by a PHP script.
I am currently using SWFUpload (http://code.google.com/p/swfupload/), but it is giving me trouble. It works 100% of the time for small files of 5MB or so, but for larger files over 100MB I get weird behavior. For example, when it finishes, the upload script sometimes does not receive some of the posted variables. It seems to break for reasons I cannot diagnose, and I am quite frankly completely sick of it. (PS: all my PHP settings are fine.)
I am just looking for a simple solution for upload progress that does not have too many bells and whistles. I just want the ability to upload large files of 100MB-500MB and then have the form posted to an upload script, without the client-side solution hanging or causing problems.
Has anyone worked on a project that required uploading large files and displaying progress? If so, what was your solution?
Did it involve Flash?
Does anyone have any recommendations or a reliable solution?
Thanks in advance.
PHP has a restriction on uploaded file size. You can change this setting in php.ini, but if you don't have access to php.ini (some web hosts don't provide it), you can try uploading the file via FTP instead.
You can try this one (it's in Spanish) or another similar tool.
I'm working on a website and I have a big problem when I try to upload files. I increased upload_max_filesize and post_max_size, but the code still only accepts a maximum of 10M. For any other folder, PHP accepts 100M, but inside the site folder (the one I'm working in) the new limit is not picked up. I checked for a local php.ini or .htaccess.
Note: I'm running a Linux server.
For uploading bigger files I would suggest a dedicated uploader plug-in,
like SWF or Java, for these reasons:
Security - you can easily encode the sent data (encoding a ByteArray in AS3.0 is very easy, and it can even be tokenized so the stream is hard to intercept)
Reliability - with plain HTTP requests it is hard to monitor upload progress, so the user might choose to close the uploader because they think it got stuck
User friendly - again, a progress bar
Not limited by the server - if you accept the data directly with custom PHP code, you won't need to configure annoying things like the max upload file size
On the server side you will need either a socket listener, or an HTTP tunnel if sockets are unavailable (a bare-bones sketch of the listener idea follows).
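To make the server-side half concrete, here is a bare-bones sketch of the socket-listener idea as a PHP CLI script; the port, output path, and lack of framing are all arbitrary, and a real receiver would need authentication, framing, and error handling:

<?php

// Listen for a single incoming connection from the uploader.
$server = stream_socket_server('tcp://0.0.0.0:9001', $errno, $errstr);
if ($server === false) {
    die("Could not listen: $errstr ($errno)\n");
}

$conn = stream_socket_accept($server);
$out  = fopen('/tmp/upload.bin', 'wb');

// Copy the incoming stream straight to disk in 8 KB chunks,
// so the file size is never limited by PHP's upload settings.
while (!feof($conn)) {
    fwrite($out, fread($conn, 8192));
}

fclose($out);
fclose($conn);
fclose($server);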
You can use JumpLoader, which is a Java applet that can split large files into partitions and upload them one by one. A PHP script then rebuilds the original file from the uploaded partitions on the server.
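The rebuild step can be as simple as concatenating the partitions in order. This is a rough sketch, not JumpLoader's actual handler, and the partition naming scheme is hypothetical:

<?php

// Assume the applet uploaded numbered partition files like
// video.mp4.part0, video.mp4.part1, ... into $dir (hypothetical naming).
$dir    = '/tmp/partitions';
$target = '/tmp/video.mp4';

$parts = glob($dir . '/video.mp4.part*');
natsort($parts); // natural sort, so part10 sorts after part9

$out = fopen($target, 'wb');
foreach ($parts as $part) {
    // Append each partition onto the end of the target file.
    $in = fopen($part, 'rb');
    stream_copy_to_stream($in, $out);
    fclose($in);
    unlink($part);
}
fclose($out);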
Plupload can split large files into smaller chunks. See the documentation.
Do you run Apache? Then check whether a LimitRequestBody directive is in effect; if mod_security is installed, also check its SecRequestBodyLimit setting.
Here is a good tutorial about settings for uploading files with PHP.
Thanks guys,
I found the problem. I don't know why, but the file /public_html/site/.htaccess was not visible to me.
I overwrote it, and that seems to have fixed it.
Thanks a lot for your efforts.