Large HTTP uploads failing with Uploadify/PHP on Dreamhost - php

I've been having trouble with a PHP and JavaScript upload script accepting large file uploads on Dreamhost. I realize that you are supposed to edit php.ini to change post_max_size and memory_limit, but it isn't behaving as it should.
The only way I have ever had a large file upload succeed was switching to Dreamhost PS and setting the memory limit as high as the file (1GB), but there has to be a more cost-effective way; otherwise, how do sites like YouTube survive? I get I/O errors if I do not have all this memory available.
Could anyone help? I've struggled with this for over a month.

You usually can't increase the maximum file size on a shared hosting webspace. Run phpinfo() to see what the exact size limit is. Anything beyond that is probably not going to work on that web space without an upgrade.
Don't confuse upload_max_filesize and memory_limit, though. memory_limit controls how much RAM one instance of your PHP script is allowed to use and has nothing to do with file uploads.
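If you'd rather not wade through the full phpinfo() output, a quick check script works too. A minimal sketch; the filename check_limits.php is just an example:
<?php
// Print the settings that govern uploads on this host.
// Values are read-only here; shared hosts often ignore override attempts.
foreach (array('upload_max_filesize', 'post_max_size', 'memory_limit', 'max_execution_time') as $key) {
    echo $key, ' = ', ini_get($key), "\n";
}
?>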

Looks like the answer was to make a Perl script instead. Using Perl I don't see even a blip in the server memory usage.

Editing php.ini is ultimately the solution. You have to change upload_max_filesize and probably post_max_size, and you may also want to increase the script's max_execution_time. These settings will raise what you can upload. You also need to place the modified php.ini file in the directory where you want the changes to apply.
Reference: PHP Core Variables
Try creating/copying php.ini into a directory, then visiting info.php for details. (info.php: just create a new PHP file with the contents <?php phpinfo(); ?>.)
Also, never leave an info.php file on your server. I change its extension on my sites.
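If your host runs PHP as an Apache module, an .htaccess override is an alternative to the per-directory php.ini. A sketch, assuming AllowOverride permits it; under CGI/FastCGI these directives are unrecognized and produce a 500 error instead:
# .htaccess - only works under mod_php, not CGI/FastCGI
php_value upload_max_filesize 100M
php_value post_max_size 100M
php_value max_execution_time 300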

I agree; with changes to php.ini you can choose just how large a file you want to allow. With no limit on size, you are opening yourself up to a world of problems. I think even in Perl you should bail out with a fatal error if the number of bytes in a file exceeds a certain amount. Depending on how tech-savvy your users are, you may end up with more than you bargained for, and you wouldn't want to crash your whole web server because one user uploaded a 200GB file.

Related

Is there any way to upload a large file (over 2GB) to a webserver through 32-bit PHP?

I have a home server set up on a Raspberry Pi 4 running Raspbian Lite. I have built a small webserver on it running Apache and PHP. I have file upload options on this web server that allow me and others to upload files through an HTML form that posts to a PHP script, which handles the file upload, updates the database, etc.
This server works great for smaller files; however, I recently tried to add a file that was around 2GB in size. The webpage would load for over 10 minutes before giving me a "Connection Reset" error.
I have edited both my php.ini and apache2.conf to change all relevant options to allow large uploads, increase timeouts, etc. I discovered, after much searching, that the problem was that the processor on my server is 32-bit, and therefore so is the PHP build. Apparently 32-bit PHP doesn't work for uploading files of this size.
I have thought about possible solutions, such as somehow directly uploading via FTP from the html form and bypassing http or somehow splitting the file into smaller pieces while uploading, but I don't know where to start.
Does anyone have any ideas?
PHP does have limitations on upload sizes, especially on 32-bit systems. However, uploading a big file (2GB) over HTTP may not be a good idea anyway; one good solution is using PHP's FTP extension functions.
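For illustration, a minimal sketch of the FTP-extension approach; the hostname, credentials, and paths are placeholders, and real code needs error handling:
<?php
// Push a large local file to another server over FTP instead of HTTP.
$conn = ftp_connect('ftp.example.com');        // placeholder host
ftp_login($conn, 'username', 'password');      // placeholder credentials
ftp_pasv($conn, true);                         // passive mode is friendlier to firewalls/NAT
// FTP_BINARY avoids newline mangling on non-text files
ftp_put($conn, '/remote/path/big.iso', '/local/path/big.iso', FTP_BINARY);
ftp_close($conn);
?>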
As far as I know there's no limit to the size of the file itself; what you have to check is that both upload_max_filesize and post_max_size are set to allow at least 2048 MB, like this:
upload_max_filesize = 2048M
post_max_size = 2048M
The next step would be to check the memory limit assigned to PHP; it must always be higher than those two previous values. So basically your server would need at least 3 or 4 GB of RAM, because you'd need to assign at least 2049 MB or more to the PHP process.
With something like this:
memory_limit = 2560M
This should work, but that also depends on your server's configuration and specs.

Optimize WordPress image upload, slow and only 1 at a time on Dedicated

So I have been at this for weeks with different configurations, and I can usually figure things out by searching, but I am not as savvy with server configuration, so I need to ask. Hopefully someone can point me in the right direction.
I have a dedicated server with 5 different WordPress installations. EVERYTHING is blazing fast in terms of site load speed, posting, etc. EXCEPT when I start uploading images using the WordPress image upload. IT'S SLOW.
When I drag photos over (size doesn't matter), it takes forever, and it seems to process one image at a time; when I was on shared hosting, the image upload was a lot faster. This happens on all my WordPress installations with different themes and plugins, so I assume this is a server configuration issue somewhere.
The progress bar goes to 99% for the first image and sits there for a few minutes, then goes on to the next; also, my admin stalls, so I cannot do anything else in the admin area until all the images are done uploading. The site itself doesn't stall and is still functional if I go in from a different browser that isn't logged in as the admin.
If I go to my process manager, I see that async-upload.php is running and is only taking up 0.3% of the CPU and 0.6% of the memory.
It always finishes, but it seems like I am only allowed one connection or process (sorry, I do not know the correct terms) at a time, and then I can request another. Does anyone know what server configuration I am missing, or have, that is causing this? I am on WHM/cPanel with SSH access. I have tried a few PHP, MySQL, and Apache optimizations I found, but it hasn't resolved the problem. I am of course doing something wrong with my configuration; can anyone shed some light?
Near the end of its image upload operation, WordPress attempts to resize the incoming images to create thumbnails and medium-sized images. To do this it has to load the images, decompressed, into memory.
That can take a lot of RAM. Try increasing the RAM available to your PHP instance. Look in php.ini for a stanza like this:
; Maximum amount of memory a script may consume
; http://php.net/memory-limit
memory_limit = 32M
and increase it.
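For example, a starting point rather than a definitive value; how much you need depends on the size of the images being resized:
memory_limit = 256M
WordPress also enforces its own ceiling on top of this: as far as I know, adding define('WP_MEMORY_LIMIT', '256M'); to wp-config.php raises that one as well, which is worth checking if the php.ini change alone doesn't help.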
You may also have issues with these settings.
; Maximum allowed size for uploaded files.
; http://php.net/upload-max-filesize
upload_max_filesize = 8M
; Maximum number of files that can be uploaded via a single request
max_file_uploads = 20
Take a look in your php_error.log and apache_error.log files, too, and see if any problems show up. And if you're using Google Chrome, open the JavaScript console and see if there are any errors.

Can't set upload_tmp_dir to a dir on non-PHP drive?

PHP experts, I've been working on this problem for about a day and a half, and I'm at a loss. If I'm right, I've discovered an arguably huge oversight in PHP, one I can't believe isn't causing problems for others. While I've found others having the same problem via Google, all of them have accepted defeat. One even filed a bug with PHP, and it was eventually closed due to lack of activity.
I'm running PHP 5.3 w/ Apache in a WAMP stack.
I'm doing a simple file upload (followed by a move from the tmp location), and for security and speed reasons I'd like the initial upload temp directory (upload_tmp_dir) to be on a different disk from PHP: the one where the file will end up after the move, which happens to be a network share, Z:\Temp. Normally it resides on the same disk as PHP, but this means I have to upload a potentially large file and then wait for it to copy to another disk, rather than just uploading it to the other disk AS the temp location and performing a quick move.
example:
// upload_tmp_dir = Z:\Temp (set in php.ini)
$targetpath = 'Z:\\Data\\pdfs\\' . basename($_FILES['uploadedfile']['name']); // final destination
move_uploaded_file($_FILES['uploadedfile']['tmp_name'], $targetpath);
Theoretically, this should upload the file temporarily to Z:\Temp, then, once it completes without errors, move it to its final destination of Z:\Data\pdfs\filename.pdf.
I've confirmed that file uploading and moving work as long as upload_tmp_dir is on the same drive PHP is on, e.g. C:\MyTemp, but they will NOT work if I traverse to a different drive like Z:\Temp. The value I specified in php.ini for upload_tmp_dir is simply ignored, and the file gets stored in C:\Windows\Temp if I try to traverse to a different drive/partition. It seems to make no difference that it's a mapped drive. Permissions are all set correctly; both share and security have Everyone allowed (the equivalent of chmod 777).
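A quick way to confirm where uploads are actually landing is a diagnostic like this sketch (it reuses the uploadedfile field name from the example above):
<?php
// Compare the configured tmp dir with where PHP actually put the file.
echo 'upload_tmp_dir  = ', ini_get('upload_tmp_dir'), "\n";
echo 'system temp dir = ', sys_get_temp_dir(), "\n";  // the fallback PHP uses when upload_tmp_dir is unusable
if (isset($_FILES['uploadedfile'])) {
    echo 'actual tmp file = ', $_FILES['uploadedfile']['tmp_name'], "\n";
}
?>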
I've read in multiple locations that "PHP", and specifically move_uploaded_file(), "has trouble with traversing drives". This is presumably for security reasons, but it doesn't seem it can be overridden anywhere. However, with an increasing number of servers using virtualization and SSDs with limited space, I don't understand how this can be. If one were running a site with several hundred users uploading files at a time, how does it make sense that the temporary files (which could be huge) MUST reside on C:\ where PHP is? Have any of you dealt with this before, or do you have any tips?
I keep assuming it's a permissions issue, so I've looked into other PHP and Apache settings that might be limiting access, but even setting all the paths in open_basedir, for example, doesn't seem to help.
Some posts I found outlining the same or similar issues:
http://www.experts-exchange.com/Web_Development/Web_Languages-Standards/PHP/PHP_Databases/Q_21190386.html
https://serverfault.com/questions/128017/php-ignores-upload-tmp-dir
https://bugs.php.net/bug.php?id=44420
This guy says:
mapped drives are user specific, the webserver, and consequentially php, run under a different username
http://forums.phpfreaks.com/index.php?topic=175349.0
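Building on that quote: since mapped drive letters belong to the logged-in user rather than to the service account Apache runs under, a commonly suggested workaround (untested here) is to point upload_tmp_dir at the UNC path directly and grant the service account rights on the share:
; php.ini - use the UNC path instead of the per-user drive letter
upload_tmp_dir = "\\fileserver\temp"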

PHP upload_max_filesize problem

I am working on a website and I have a big problem when I try to upload files. I increased upload_max_filesize and post_max_size, and the code still only accepts a maximum of 10M. In any other folder PHP accepts 100M, but inside the site folder (the one I am working in) it doesn't take effect. I checked for a local php.ini or .htaccess.
Note: I am running a Linux server.
For uploading bigger files I would suggest a dedicated uploader plug-in,
like SWF or Java, for these reasons:
Security - you can easily encode the sent data (encoding a ByteArray in AS3.0 is very easy, and it can even be tokenized so the stream is hard to intercept)
Reliability - with simple HTTP requests it is hard to monitor the upload progress, so the user might choose to close the uploader (because he thinks it got stuck)
User friendly - again, a progress bar
Not limited by server - if you accept the data directly with custom PHP code, you won't need to configure annoying things like max file size on upload.
On the server side you will need either a socket listener, or an HTTP tunnel if that's unavailable.
You can use JumpLoader, a Java applet that can split large files into partitions and upload them one by one. A PHP script then rebuilds the original file from the uploaded partitions on the server.
Plupload can split large files into smaller chunks. See the documentation.
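For reference, a minimal sketch of what the receiving side of a chunked upload can look like. It assumes the client sends Plupload-style name, chunk, and chunks fields with the file under $_FILES['file']; production code needs validation and locking:
<?php
// Append each incoming chunk to a partial file; the last chunk completes it.
$chunk  = isset($_POST['chunk'])  ? (int)$_POST['chunk']  : 0;  // 0-based index of this chunk
$chunks = isset($_POST['chunks']) ? (int)$_POST['chunks'] : 1;  // total number of chunks
$name   = basename($_POST['name']);                             // never trust raw client paths

$partial = '/tmp/uploads/' . $name . '.part';                   // example staging path
$out = fopen($partial, $chunk === 0 ? 'wb' : 'ab');             // first chunk truncates, the rest append
$in  = fopen($_FILES['file']['tmp_name'], 'rb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

if ($chunk === $chunks - 1) {
    rename($partial, '/tmp/uploads/' . $name);                  // all chunks received: finalize
}
?>
Because each chunk is a small, ordinary POST, none of them trips upload_max_filesize or post_max_size even when the assembled file is huge.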
Do you run Apache with mod_security? Then check if LimitRequestBody is in effect.
Here is a good tutorial about Settings for uploading files with PHP.
Thanks guys,
I found the problem. I don't know why, but the file /public_html/site/.htaccess was not visible to me.
I tried overwriting it, and it seems to be working.
Thanks a lot for your efforts.

Writing direct to disk with PHP

I would like to create an upload script that isn't subject to the PHP upload limit.
There might be an occasion where I need to upload a 2GB or larger file, and I don't want to have to raise the limit for the whole server above 32MB.
Is there a way to write directly to disk from PHP?
What method might you propose someone would use to accomplish this? I have read around stack overflow but haven't quite found what I am looking to do.
The simple answer is that you can't, due to the way Apache handles POST data.
If you're adamant about having larger file uploads while still using PHP for the backend, you could write a simple file-upload receiver using the PHP sockets API and run it as a standalone service. Some good details can be found at http://devzone.zend.com/article/1086#Heading8
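A bare-bones sketch of such a standalone receiver, assuming you run it from the CLI on a port of your choosing; a real service needs authentication, size limits, and error handling:
<?php
// Standalone upload receiver: accepts one connection at a time and
// streams whatever the client sends straight to disk, 8KB at a time,
// so memory use stays flat regardless of file size.
$server = stream_socket_server('tcp://0.0.0.0:9000', $errno, $errstr);
while ($conn = stream_socket_accept($server, -1)) {    // -1: block until a client connects
    $file = fopen('/tmp/upload-' . uniqid() . '.bin', 'wb');  // example destination path
    while (!feof($conn)) {
        fwrite($file, fread($conn, 8192));
    }
    fclose($file);
    fclose($conn);
}
?>
Because the data never passes through Apache's POST handling, none of the php.ini upload limits apply to it.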
Though this is an old post, you find it easily via Google when looking for a way to handle big file uploads with PHP.
I'm still not sure whether file uploads larger than the memory limit are possible, but I think there is a good chance they are. While looking for a solution to this problem, I found contradicting sources. The PHP manual states:
post_max_size: Sets max size of post data allowed. This setting also affects file upload. To upload large files, this value must be larger than upload_max_filesize. If memory limit is enabled by your configure script, memory_limit also affects file uploading. Generally speaking, memory_limit should be larger than post_max_size. (http://php.net/manual/en/ini.core.php)
...which implies that your memory limit should be larger than the file you want to upload. However, another user (ragtime at alice-dsl dot com) on php.net states:
I don't believe the myth that 'memory_size' should be the size of the uploaded file. The files are definitely not kept in memory... instead uploaded chunks of 1MB each are stored under /var/tmp and later on rebuilt under /tmp before moving to the web/user space. I'm running a Linux box with only 64MB RAM; setting the memory_limit to 16MB and uploading files of sizes about 100MB is no problem at all! (http://php.net/manual/en/features.file-upload.php)
He reports some other related problems with the garbage collector but also states how they can be solved. If that is true, the uploaded file may well exceed the memory limit. (Note, however, that processing the uploaded file is another matter: then you might have to load it into memory.)
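If that account is accurate, the thing to avoid is slurping the temp file into memory when you process it; streaming it keeps usage flat. A sketch, with the uploadedfile field name and destination path as placeholders:
<?php
// Bad: file_get_contents($_FILES['uploadedfile']['tmp_name']) loads the whole file into RAM.
// Better: read and write in fixed-size chunks.
$in  = fopen($_FILES['uploadedfile']['tmp_name'], 'rb');
$out = fopen('/data/uploads/' . basename($_FILES['uploadedfile']['name']), 'wb');
while (!feof($in)) {
    fwrite($out, fread($in, 1048576));   // 1MB at a time, mirroring the chunking described above
}
fclose($in);
fclose($out);
?>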
I'm writing this before having tried handling large file uploads with PHP myself, since I'm evaluating PHP and Python for this task.
You can do some interesting things with PHP's sockets. Have you considered writing a Java applet to upload the file to a listening PHP daemon? This probably won't work with most professional hosting providers, but if you're running your own server, you could make it work. Consider the following sequence:
The applet starts up and sends a request to PHP to open a listening socket.
(You'll probably have to write a basic HTTP client in Java to make this work.)
The applet reads the file from the file system and uploads it to PHP through the socket created in step 1.
Not the cleanest way to do it, but if you disable the PHP script timeout in your php.ini file, you could make something work.
It isn't possible to upload a file larger than the PHP-allowed limits with PHP; it's that simple.
Possible workarounds include using a client-side technology, like Java (I'm not sure whether Flash or JavaScript can do this), to split the original file into smaller chunks.
