How can I get the IIS upload limit using PHP? - php

I want to check the IIS upload limit of the server my PHP script is running on, in the same manner that ini_get('upload_max_filesize') works for the PHP settings.
I'm thinking about parsing the web.config file to get the value of maxAllowedContentLength, but I was wondering if there is a standard way to ask the webserver directly.

There is a link telling a bit about the other configuration settings you should read: Which gets priority, maxRequestLength or maxAllowedContentLength? And I don't think there is any other way than to read the configuration file yourself (PHP has a simple INI file reader, and SimpleXML for XML).
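If you do end up parsing web.config yourself, a minimal sketch with SimpleXML could look like the following; the file location and the assumption that PHP may read it are guesses about your setup, since there is no standard PHP API to query IIS for this limit.
<?php
// Sketch only: assumes web.config sits in the document root and is readable.
// maxAllowedContentLength is given in bytes; IIS defaults to 30000000.
$config = simplexml_load_file($_SERVER['DOCUMENT_ROOT'] . '/web.config');

$limit = null;
if ($config !== false) {
    $nodes = $config->xpath(
        '//system.webServer/security/requestFiltering/requestLimits/@maxAllowedContentLength'
    );
    if (!empty($nodes)) {
        $limit = (int) $nodes[0];
    }
}

echo $limit !== null ? "IIS upload limit: $limit bytes" : "maxAllowedContentLength not set";
?>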
I could propose using a library like plupload that splits files: it can split your files into smaller pieces, upload each piece, and then recombine them into one big file, bypassing the server's maximum upload file / body size limits.

Related

php file_exists on FTP Uploading file [duplicate]

My application is keeping watch on a set of folders where users can upload files. When a file upload is finished I have to process it, but I don't know how to detect that a file has not finished uploading yet.
Is there any way to detect whether a file has not yet been released by the FTP server?
There's no generic solution to this problem.
Some FTP servers lock the file being uploaded, preventing you from accessing it, while the file is still being uploaded. For example IIS FTP server does that. Most other FTP servers do not. See my answer at Prevent file from being accessed as it's being uploaded.
There are some common workarounds to the problem (originally posted in SFTP file lock mechanism, but relevant for the FTP too):
You can have the client upload a "done" file once the upload finishes. Make your automated system wait for the "done" file to appear.
You can have a dedicated "upload" folder and have the client (atomically) move the uploaded file to a "done" folder. Make your automated system look to the "done" folder only.
Have a file naming convention for files being uploaded (".filepart") and have the client (atomically) rename the file after upload to its final name. Make your automated system ignore the ".filepart" files.
See (my) article Locking files while uploading / Upload to temporary file name for an example of implementing this approach; a client-side PHP sketch of the same convention also follows this list of workarounds.
Also, some FTP servers have this functionality built-in. For example ProFTPD with its HiddenStores directive.
A gross hack is to periodically check for file attributes (size and time) and consider the upload finished, if the attributes have not changed for some time interval.
You can also make use of the fact that some file formats have a clear end-of-file marker (like XML or ZIP), so you can tell when a file is still incomplete.
Some FTP servers allow you to configure a hook to be called, when an upload is finished. You can make use of that. For example ProFTPD has a mod_exec module (see the ExecOnCommand directive).
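As an illustration of the ".filepart" naming convention mentioned above, here is a minimal client-side sketch using PHP's FTP extension; the host, credentials, and file names are placeholders.
<?php
// Sketch only: upload under a temporary name, then rename atomically once the
// transfer is complete, so the watching process never sees a partial file.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);

ftp_put($conn, '/incoming/report.csv.filepart', 'report.csv', FTP_BINARY);
ftp_rename($conn, '/incoming/report.csv.filepart', '/incoming/report.csv');

ftp_close($conn);
?>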
I use ftputil to implement this work-around:
connect to ftp server
list all files of the directory
call stat() on each file
wait N seconds
For each file, call stat() again. If the result differs, skip that file, since it was modified during the last N seconds.
If the stat() result is unchanged, download the file.
This whole ftp-fetching is old and obsolete technology. I hope that the customer will use a modern http API the next time :-)
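The answer above uses ftputil, which is a Python library; a rough PHP equivalent of the same "stat twice and compare" idea, using the built-in FTP functions, might look like this. The host, credentials, directory, and wait interval are placeholders.
<?php
// Sketch only: a file whose size and modification time are unchanged after
// waiting N seconds is assumed to be fully uploaded.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);

$files = ftp_nlist($conn, '/incoming');

// First pass: record size and modification time of every file.
$first = [];
foreach ($files as $file) {
    $first[$file] = [ftp_size($conn, $file), ftp_mdtm($conn, $file)];
}

sleep(30); // wait N seconds

// Second pass: download only the files whose attributes did not change.
foreach ($files as $file) {
    $second = [ftp_size($conn, $file), ftp_mdtm($conn, $file)];
    if ($second === $first[$file]) {
        ftp_get($conn, basename($file), $file, FTP_BINARY);
    }
}

ftp_close($conn);
?>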
If you are reading files of particular extensions, then use WinSCP for the file transfer. It creates a temporary file with the extension .filepart, which is renamed to the actual file name once the file has fully transferred.
I hope it will help someone.
This is a classic problem with FTP transfers. The only mostly reliable method I've found is to send a file, then send a second short "marker" file just to tell the recipient the transfer of the first is complete. You can use a file naming convention and just check for existence of the second file.
You might get fancy and make the content of the second file a checksum of the first file. Then you could verify the first file. (You don't have the problem with the second file because you just wait until file size = checksum size).
And of course this only works if you can get the sender to send a second file.
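On the receiving side, the marker-plus-checksum convention could be checked with something like the following sketch; the file names and the choice of MD5 are assumptions for illustration.
<?php
// Sketch only: the sender uploads data.bin, then data.bin.md5 containing the
// checksum of the first file. Seeing the marker means the transfer is done.
$dataFile   = '/incoming/data.bin';
$markerFile = '/incoming/data.bin.md5';

if (file_exists($markerFile)) {
    $expected = trim(file_get_contents($markerFile));
    if (md5_file($dataFile) === $expected) {
        // The upload is complete and intact; safe to process $dataFile now.
    }
}
?>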

PHP receive file as byte array using Zend_Form

We are using Zend_Form inside a PHP application for building an input file html element. We can set the 'destination' of this element, and when calling receive() the file will be saved to the specified location.
We want to avoid saving the file to disk at all, and instead grab the file as a byte array and do something else with it.
Is this possible? If it is not possible with Zend_Form(), can it be done any other way?
EDIT: The reason why we cannot write to disc is because the application runs on Azure, and it seems that it does not have write access rights anywhere, not even in the temp folder. We get an exception from Zend saying that 'The given destination is not writeable'.
The only thing that seems viable would be to save the file using the php://memory stream wrapper.
I've never had reason to implement it, but it looks as simple as setting the save location of the file to php://memory; here is the link to the manual page: PHP I/O Wrappers.
All PHP uploads are written to the file system regardless of whether you use Zend or not (see upload_tmp_dir and POST method uploads):
Files will, by default, be stored in the server's default temporary directory, unless another location has been given with the upload_tmp_dir directive in php.ini.
Instead of using receive to process the upload, try accessing it directly using the $_FILES array which would let you read the file into a string using file_get_contents() or similar functions. You can however, still use Zend_Form to create and handle the form in general.
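A minimal sketch of that approach, assuming the form's file input is named "upload" (the field name is an assumption, not something Zend_Form dictates):
<?php
// Sketch only: read the temporary upload straight into a string (byte array)
// without calling receive() or moving the file to a permanent location.
if (isset($_FILES['upload']) && $_FILES['upload']['error'] === UPLOAD_ERR_OK) {
    $bytes = file_get_contents($_FILES['upload']['tmp_name']);

    // ... do something with $bytes here ...
    echo strlen($bytes) . ' bytes received';
}
?>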
You could point upload_tmp_dir at a shared-memory filesystem so uploaded files are held in memory. Be cautious with this: if someone attempts to upload a very large file, it will go into memory, which could affect performance or your cost of service.
Ultimately, Zend_File_Transfer_Adapter_Http::receive() calls move_uploaded_file() to move the file from its temporary location to the permanent location. In addition it makes sure the upload is valid and filters it, and marks it as received so it cannot be moved again (as that would fail).

big size multi image upload

I built a website where I upload ten fairly large (10 MB) images. When the upload starts, it runs for some time and then a blank page appears. I tried to change php_value settings in the .htaccess file, because I don't have permission to change settings in the php.ini file (it's a shared server). I have some doubts regarding this.
1) What happens to the files while the POST request is in progress? I want the files to upload quickly.
2) Does posting the request or uploading the files take extra time because I am cropping the images in a loop using PHP GD functions?
It is because of the limits your web hosting provider set. Which values did you try to change in the .htaccess?
You could try using some flash uploader, it should work despite the limits imposed by the server. A good one is SWFUpload.
That is because of the execution time limit of the script. You can edit your php.ini file; if that is not permitted, you can set max_execution_time for the script using your .htaccess file.
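For what it's worth, the execution time limit can also be raised from the script itself, while the upload size limits cannot (they have to come from php.ini or .htaccess); a small sketch:
<?php
// Sketch only: set_time_limit() works at runtime even on most shared hosts,
// but upload_max_filesize and post_max_size are PHP_INI_PERDIR and ignore
// runtime changes.
set_time_limit(300); // give the GD cropping loop up to 5 minutes

echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . "\n";
echo 'post_max_size: '       . ini_get('post_max_size') . "\n";
echo 'max_execution_time: '  . ini_get('max_execution_time') . "\n";
?>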

Download while uploading

How can I use PHP (or any other language) to read a file that is still being uploaded, so that the file can be downloaded while it is uploading?
Example sites that does this are:
http://www.filesovermiles.com/
http://host03.pipebytes.com/
Use this: http://www.php.net/manual/en/apc.configuration.php#ini.apc.rfc1867
In the array the file name is included as temp_filename - so you can pass that to your other program, which can read from the file and stream it live. The array also includes a file size so that program can make sure not to try to read beyond the end of the file.
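A minimal sketch of reading that progress record, assuming apc.rfc1867 is enabled and the form contains a hidden APC_UPLOAD_PROGRESS field with the value "my_upload_id" (the id is a placeholder):
<?php
// Sketch only: field names follow the APC documentation for RFC 1867 upload
// progress; the "upload_" key prefix is APC's default (apc.rfc1867_prefix).
$status = apc_fetch('upload_my_upload_id');

if ($status !== false) {
    $tempFile = $status['temp_filename']; // partial file on disk, still growing
    $total    = $status['total'];         // expected size in bytes
    $current  = $status['current'];       // bytes received so far

    // Another process could open $tempFile and stream up to $current bytes.
    printf('%d of %d bytes written to %s', $current, $total, $tempFile);
}
?>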
I don't think this is possible in PHP, because PHP takes care of receiving the upload and only hands control to your script when it has the complete file. When writing CGI programs or Java servlets, you read the upload from the socket, so you are in control while receiving the file: you can track whether it is still uploading and how much has been received, so another process could read this data and start sending what is already there.
One of the sites you've given as an example simply downloads a file from a URL or from the client computer, stores it temporarily, and assigns a code to that file to make it identifiable.
After uploading, any other user who has the code can then download that file again.
This is more a question of how you operate a server system than of writing code.
You can download files to the local system by making use of file_get_contents and file_put_contents.
If you want to stream file data from the server to the browser, you can make use of readfile (see the PHP manual).
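A small sketch combining the two, with a placeholder URL and path; allow_url_fopen must be enabled for file_get_contents() to read a remote URL.
<?php
// Sketch only: fetch a remote file to local disk, then stream it back to the
// browser as a download.
$url  = 'http://example.com/somefile.zip';
$path = '/tmp/somefile.zip';

file_put_contents($path, file_get_contents($url));

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
readfile($path);
?>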

Overcome host's max file size limit for PHP/POST data when uploading files?

I'm with a fairly mediocre low-cost (shared) host at the moment (as it's all I can afford just now) and want to implement a very basic file upload feature on a page. I want to allow files of up to 100MB to be uploaded to the server but my free host limits the PHP_MAX_FILESIZE to 32MB and the POST_FILESIZE to 64MB.
Is there any way of overcoming this without requiring the user to split the larger files into smaller chunks? Either a JavaScript or Flash-based solution which could perhaps somehow route parts of the file through different HTTP requests or some other way to get around the 32MB limit?
Or are there any commands I can attempt to make which might over-ride the host's limits? I've already tried using .htaccess without success. EDIT: also tried ini_set. I'm thinking the only way is some kind of chunking/splitting/multi-streaming or another way to solve the inability to set higher PHP values.
Any suggestions are hugely appreciated. Thanks.
You can use Flash. Start with this: http://soenkerohde.com/2010/01/chunk-file-upload/
OR
use https://github.com/moxiecode/plupload
You might also possibly be able to use ini_set('upload_max_filesize','100M'); though note that this particular directive can normally only be changed in php.ini or .htaccess, not at runtime.
But I have a sneaking suspicion that your host might not be happy with you trying to circumvent their limit...
If your 'free host' already limits you, there is nothing you can do about it. Try reading
http://www.php.net/manual/en/ini.core.php#ini.post-max-size
and
http://www.php.net/manual/en/ini.core.php#ini.upload-max-filesize
Here is where you can check whether a setting can be changed at runtime (via ini_set) or not:
http://www.php.net/manual/en/configuration.changes.modes.php
I suggest you just do multiple file uploads. 100 MB, right? Are you planning to host videos and movies? Try looking for a better paid host rather than a free one :)
If your host allows Java applets, there is already a package for this on SourceForge. It lets you upload a file from the user's machine to your host in small chunks via a Java applet. It works because the applet handles the upload code on the user's machine, and on the server side you receive small chunks that you can join together later, so a file of any size can be uploaded.
I found the link; here it is:
http://sourceforge.net/projects/juploader/
Some hosts allow using a local copy of php.ini (mine does, for example), so you could change parameters at will.
Keep in mind that 100 MB per file can rapidly fill up your hosting space, especially if it's not top of the range, so be careful.
Since ini_set doesn't seem to be working, you could try to set it via the .htaccess file. I'm not sure about the exact syntax, but it involves the php_value / php_flag directives (e.g. php_value upload_max_filesize 100M), and it only works when PHP runs as an Apache module.
I wouldn't be surprised if this doesn't work either.
From my experience, choosing a good host based on their advertising is impossible. I know of no other way than to simply try out a bunch and hope you run across one that isn't super restrictive.
Upload limits are a common problem. If it's common for your customers to upload very large files then perhaps it would be wise to look into some other hosting plan. If it's not very common at all you could just have them send you the file so you can FTP it. Sometimes the best solution is the simplest solution.
You could have your users upload to an outside site and then give you the URL from that outside site. If you have enough space, you can get around the outside site's retention limits by downloading the file to a directory on your own site.
It's certainly not the best option, but your users will have a bigger upload quota and probably faster upload speeds (shared servers and speed mix like oil and water).
Abstract:
1) Read the file before sending it (catch the form's onsubmit event), split it, and send the chunks as textarea fields.
2) On the server side, recover the chunks and write them back into one single file.
Proposal:
Depending on the environment in which your script runs and where the file resides, your options include the following:
An XMLHttpRequest object (for reading files available via URLs on your website)
A FileSystemObject (if you use Windows Scripting Host or Internet Explorer in a trusted environment)
A "helper" Java applet that reads a file or URL for your script
(Extract from http://www.javascripter.net/faq/reading2.htm)
If that is possible in your case, remove the file input element from the form.
Then split the string into many chunks:
mySplitResult = myReadedDocument.match(/[\s\S]{1,1048576}/g); // ~1 MB each
That makes an array of pieces of your document.
Add the values to the form (remember to give all the new controls the same name, ending with []). Assume the form id is 'myForm':
var formName = document.getElementById('myForm');
mySplitResult.forEach(function(item) {
    var ta = document.createElement('textarea');
    ta.appendChild(document.createTextNode(item));
    ta.setAttribute('name', 'chunk[]');
    formName.appendChild(ta);
});
On the server side, you can reconstruct the chunks and save them as a file.
<?php
$chunks = $_POST['chunk'];
$fileContent = implode('', $chunks);
file_put_contents('dirname/filename.foo', $fileContent);
?>
The key to success is being able to read the file on the client side.
Good luck!
