I have the most annoying problem; it boils down to this simple HTML:
<form enctype="multipart/form-data" method="post" action="/testupload.php">
<input type="file" name="thefile">
<input type="submit">
</form>
When selecting a small file, all is good.
When selecting a 3 MB file (as an example of a big file), the process simply hangs. No error, no breaks, no nothing. If I use Firebug, I see the file being submitted is 0 size.
I am thinking this is something on the Apache side, but my precious host (InMotion VPS) gives me the "GoDaddy" response "it's in the code", so they can't take care of it, even though the "code" is 4 lines of the most basic HTML.
I do see an error in my logs, "Handler for (null) returned invalid result code 70007". Googling it, I get possible mod_security issues or suggestions to restart httpd, which I did to no avail.
It used to work when I was developing that piece of the application, and I uploaded files like crazy for a couple of days. Then it stopped.
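For what it's worth, the browser can be taken out of the equation by reproducing the same multipart POST from the command line (the host name below is a stand-in; any 3 MB file will do):

```shell
# create a 3 MB dummy file and post it exactly as the form would
dd if=/dev/zero of=/tmp/big.bin bs=1M count=3
curl -v -F 'thefile=@/tmp/big.bin' http://example.com/testupload.php
```

If this hangs the same way, the browser is cleared and the suspicion moves squarely to Apache/mod_security.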
Any takers?
I need to upload a file to a web server using cURL. This is the relevant part of the HTML (the complete code is 300 lines; it's a very simple web app on an embedded system to run 3D printers called Smoothieboard):
<h2> Upload File </h2>
<input type="file" id="files" name="files[]" onchange="upload();">
<h3>Uploading file(s)</h3>
<output id="list"></output>
<div id="progress"></div>
<div id="uploadresult"></div>
<script>
document.getElementById('files').addEventListener('change', handleFileSelect, false);
</script>
When the user clicks the "browse" button, a window pops up to browse the file system and pick a file. My file is called "firmware.bin". Upon selecting the file, the upload begins immediately (there is no "upload" button, the file's transfer is done right after picking it). I need to automate this task using cURL. I'm currently doing the following:
curl -i -F files[]=#/home/pi/P18/firmware.bin http://192.168.3.222/upload
The output is:
HTTP/1.0 200 OK
Server: uIP/1.0
Connection: close
Access-Control-Allow-Origin: *
Content-Type: text/plain
OK
However, it doesn't seem to be working. I can access the server through other means, and I can assure you that the usual human-friendly upload procedure works, but what I'm doing with cURL doesn't. Something DOES seem to be going on, since the OK message takes a few seconds to pop up, which also happens in the web app. The file seems to be transferred, but I feel I need to do something more to complete the process.
Something that caught my attention is that, whether I type files[] or potatoes[], the same thing happens.
Without the upload() function it's hard to reproduce the problem, but what you can do is simply extract the curl request with the Chrome developer console:
In the Network tab, check "Preserve log", reload the page, upload the file, then right-click on the request / Copy / Copy as cURL.
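As an aside, the command in the question uses '#' where curl's -F option expects '@' to attach a file's contents, and the [] should be quoted so the shell does not try to glob it. A corrected invocation, assuming the same paths as in the question, would be:

```shell
# '@' makes -F read and attach the file's contents as a multipart part;
# the quotes keep the shell from expanding the [] in the field name
curl -i -F 'files[]=@/home/pi/P18/firmware.bin' http://192.168.3.222/upload
```

With '#', curl sends the literal string "#/home/pi/P18/firmware.bin" as the field value, which would explain the zero-effect upload.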
Sorry for yet another question about the session upload_progress feature in PHP, but it has me stumped. I developed a prototype file upload facility on a Windows server and the progress feature worked just fine. But when I moved the prototype to our Linux server, it didn't work. The progress array never appears in the $_SESSION superglobal, though the files transfer without issue.
Here is the environment:
Apache version: 2.2
PHP version: 5.4.31
Server API: Apache 2.0 Handler (not CGI!)
Here are the session variables:
session.upload_progress.cleanup: Off (turned off so I could check $_SESSION after the transfer)
session.upload_progress.enabled: On
session.upload_progress.freq: 1000
session.upload_progress.min_freq: 1
session.upload_progress.name: PHP_SESSION_UPLOAD_PROGRESS
session.upload_progress.prefix: upload_progress_
Here are my core file transfer settings:
post_max_size: 0
upload_max_filesize: 0
max_input_time: -1
max_execution_time: -1
output_buffering: 4096
(I set the output_buffering parameter to match the setting on the Windows server where the upload progress worked. Initially, it had "no value" on the Linux server).
Here is the HTML that sets up the form:
<form name="UploadForm" id="UploadForm" method="post" enctype="multipart/form-data">
<input type="hidden" name="<?php echo ini_get('session.upload_progress.name');?>" id="<?php echo ini_get('session.upload_progress.name');?>" value="1" />
Enter a note to be sent along with your file:<br />
<textarea name="UploadNote" id="UploadNote" rows="5" cols="50"></textarea><br /><br />
<div style="border: thin solid black; padding: 10px;">
Choose the file you wish to transfer and then click the "Upload" button.
<input type="file" name="UploadFile" id="UploadFile" />
<br /><br /><input type="button" id="SubButton" value="Upload" onclick="submit_form('SU')" />
The PHP program to retrieve the uploaded file is launched in a hidden iframe. The status checking program is called every 1.5 seconds via jQuery/Ajax.
Like I said, the upload_progress code worked on a Windows server and was moved unchanged to the Linux server. Some of the things I've since checked:
The session integrity is fine. I added my own session variable in the main program (that sets up the form) and was able to retrieve it in the status program and the upload program (running in the iframe). The session IDs are all the same.
Nothing outside of my $_SESSION item ever gets added to $_SESSION.
Again, the file transfers themselves work fine.
Some other SO posts pointed to possible issues involving the buffering of POST data by Apache, so that PHP only receives the file when it's completely uploaded. Would that cause the upload progress array to never be posted in $_SESSION? I read that mod_security could be interposed in the transfer of POST data, so I temporarily disabled mod_security in my Apache virtual host, but that unfortunately didn't help.
I have to think there is some module that is part of the standard cPanel Apache build that is affecting the operation of the session.upload_progress feature. Does anyone have any suggestions of what to try? Thanks very much.
I know it's an old thread, but I'll leave this here for others coming across the same problem. It is indeed mod_security interfering with session.upload_progress. SecRequestBodyAccess needs to be set to Off for it to work. The reason is that SecRequestBodyAccess makes mod_security buffer the request body so it can inspect it, which means PHP only sees the POST once the upload is already complete.
ModSecurity documentation: SecRequestBodyAccess
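A minimal sketch of the directive, placed wherever the other ModSecurity settings live (e.g. modsecurity.conf or the relevant virtual host); note the trade-off that ModSecurity can then no longer inspect POST bodies:

```apache
# Don't buffer request bodies; PHP then sees the upload as it streams in,
# so session.upload_progress can be populated during the transfer.
SecRequestBodyAccess Off
```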
I have a strange problem: I cannot send a form if one of the fields has a string of more than 1333 characters.
Here's my simple html page:
<form method="POST">
<input type="hidden" name="a" value="WffapzB...truncated" />
<input type="submit" value="OK" />
</form>
The problem comes from Apache or my computer, since when I load the html file in the browser (file://localhost/Users/etienne/Developpement/htdocs/test/SendForm/index.html), without passing through the server, the form gets sent.
If I access it via the server (http://tests.localhost/SendForm/), the page times out.
Here's what I have tried:
I've set post_max_size = 500M
I've set LimitRequestBody 0
Also, the php and apache logs do not show anything...
You can see my phpinfo page here: http://jsfiddle.net/etiennenoel/VZfeQ/
What can cause a page to not accept strings longer than 1333 characters on the server side? It is clearly not a browser problem, since the same failure happens in other browsers, while everything works when using the local html file. Therefore, the problem appears when the server is involved.
Update 2
I completely removed MAMP Server and reinstalled the Mac OS X native server without success. Therefore, it is not linked to MAMP PRO but to something else that I have no clue about...
Update 3
I also found out that the same problem occurs when sending the data via GET.
Update 4
Using Wireshark on my local adapter, I see no POST data being sent. I see the HTTP requests for the page, but it doesn't show any POST requests. If I load another site, I do see the POST requests for that site: https://docs.google.com/file/d/0B2quoUxT9OnJdmN3ajJVR2dPbUk/edit?usp=sharing
I finally found it! After hours and days of searching, I decided to reactivate the firewall. Then I got a request from a program called HideMyIp (which I had completely forgotten I had installed). I deleted that program, restarted my computer, and everything was working fine! Thanks to everyone who tried to help me!
Here is a simple file upload form HTML.
<form enctype="multipart/form-data" action="upload.php" method="POST">
Send this file: <input name="userfile" type="file" />
<input type="submit" value="Send File" />
</form>
And the php file is pretty simple too.
<?php
die();
As you can see, the PHP script does nothing on the server side. But when we upload a big file, the process still takes a long time.
I know my PHP code will be executed only after the POST process has ended: PHP MUST prepare the $_POST and $_FILES arrays before the first line of code is parsed.
So my question is: can PHP (with Apache or Nginx) check the HTTP headers before the POST request is finished?
For example, via some PHP extension or Apache module.
I was told that Python or Node.js can solve this problem; I just want to know whether PHP can or not.
Thanks.
================ UPDATE 1 ================
My goal is to block some unexpected file-upload requests. For example, we generate a unique token as the POST target url (like http://some.com/upload.php?token=268dfd235ca2d64abd4cee42d61bde48&t=1366552126). On the server side, my code looks like:
<?php
define('MY_SALT', 'mysalt');
if (!isset($_GET['t']) || !isset($_GET['token']) || abs(time()-$_GET['t'])>3600 || md5(MY_SALT.$_GET['t'])!=$_GET['token']) {//token check
die('token incorrect or timeout');
}
//process the file uploaded
/* ... */
The code looks like it makes sense :-P but it cannot save bandwidth as I expected. The reason is that the PHP code runs too late: we cannot check the token before the file upload has finished. If someone uploads a file without the correct token in the url, my server's network and CPU are still wasted.
Any suggestion is welcome. Thanks a lot.
The answer is always yes because this is Open Source. But first, some background: (I'm only going to talk about nginx, but Apache is almost the same.)
The upload request isn't sent to your PHP backend right away -- nginx buffers the upload body so your PHP app isn't tying up 100MB of RAM waiting for some guy to upload via a 300 baud modem. The downside is that your app doesn't even find out about the upload until it's done or mostly done uploading (depending on client_body_buffer_size).
But you can write a module to hook into the different "phases" internal to nginx. One of the hooks is called when the headers are done. You can write modules in Lua, but it's still fairly complex. There may be a module that will send a "pre-upload" hook out to your script via HTTP, but that's not great for performance.
It's very likely you won't even need a module. The nginx.conf file can do what you need (e.g. route the request to different scripts based on headers, or return different error codes based on headers). See this page for examples of header checking (especially "WordPress w/ W3 Total Cache using Disk (Enhanced)"): http://kbeezie.com/nginx-configuration-examples/
Read the docs, because some common header-checking needs already have directives of their own (e.g. client_max_body_size will reject a request if the Content-Length header is too big).
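For instance, a sketch of the header-level checks described above (the directive names are real nginx ones; the location, limit, and backend socket are made up for illustration):

```nginx
server {
    # reject anything whose Content-Length header exceeds 10 MB with a 413,
    # before the body is ever read off the socket
    client_max_body_size 10m;

    location /upload.php {
        # cheap pre-body check: refuse requests with no token in the query string
        if ($arg_token = "") {
            return 403;
        }
        fastcgi_pass unix:/var/run/php-fpm.sock;  # hypothetical PHP backend
        include fastcgi_params;
    }
}
```

The token itself still has to be validated in PHP, but the empty-token and oversized cases get rejected before any upload bandwidth is spent.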
There is no solution at the HTTP level, but it is possible at the TCP level. See the answer I accepted on another question:
Break HTTP file uploading from server side by PHP or Apache
PHP 4.4 and PHP 5.2.3 under Apache 2.2.4 on Ubuntu.
I am running Moodle 1.5.3 and have recently had a problem when updating a course. The $_POST variable is empty but only if a lot of text was entered into the textarea on the form. If only a short text is entered it works fine.
I have increased the post_max_size from 8M to 200M and increased the memory_limit to 256M but this has not helped.
I have doubled the LimitRequestFieldSize and LimitRequestLine to 16380 and set LimitRequestBody to 0 with no improvement.
I have googled for an answer but have been unable to find one.
HTTP Headers on Firefox shows a content size of 3816 with the correct data, so it's just not getting to $_POST.
The system was running fine until a few weeks ago. The only change was to /etc/hosts to correct a HELO issue with the exim4 email server.
I can replicate the issue on a development machine where exim4 is not running, so I think it is just a coincidence.
Thanks for your assistance.
I don't know enough to really provide a useful answer, so the following is more of a well-educated guess (at least I hope so).
First, you should debug the entire request, either via the access_log or, for example, through Firebug (good to have Firebug anyway). To me your problem sounds like a redirect is happening in between. Here is an example:
Assume this is your structure:
/form.php
/directory/index.php
This is your form:
<form action="/directory" method="post">
...
</form>
The problem in this case is that even though /directory is a valid url, Apache will redirect you one more time to /directory/, and thus you are losing your payload (what is supposed to be in $_POST).
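The extra hop is easy to see with curl against such a setup (example.com stands in for the real host):

```shell
# Apache answers the extension-less URL with "301 Moved Permanently";
# the browser then re-requests /directory/ as a GET, dropping the POST body.
curl -si -d 'a=1' http://example.com/directory | head -n 1
```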
Could it be not the size of the post, but how long a line is before a newline? If they use the Moodle WYSIWYG view, the html would get put onto one line with no breaks. If you go into the html and hit return around every 1000 characters, does it work?
What is the enctype of the form? It could be that it limits the amount of data sent through the form.
You might also want to check the incoming raw POST data with:
file_get_contents('php://input');
to make sure there is actually data being sent.
It sounds like an Apache or Apache/PHP integration problem. If $_POST is empty, it hints that the HTTP server is not handing the POST data to PHP. If I were you, I'd investigate the Apache configuration.
This is obvious, but did you /etc/init.d/apache2 restart?
Also, the error_log file should show some information about whether the post size exceeded the set limit. Consider increasing the log verbosity while troubleshooting the issue.