I have been facing a read-timeout problem for a couple of days.
There is a facility to upload users in my application (xls and xlsx are the allowed extensions); it is entirely an admin-panel feature.
I am using PHPExcel to read the data from the sheet and insert each row into the database on the fly.
Large files can be uploaded here. Right now I have a 16 MB file containing nearly 200k records.
I have increased the settings below through .htaccess:
php_value memory_limit 1024M
php_value max_execution_time 259200
php_value max_input_time 3000
php_value post_max_size 700M
php_value upload_max_filesize 100M
I have also placed set_time_limit(0) in the specific controller.
My problem is a read timeout in the production environment. The request runs for around 15 minutes and then returns the error below:
The requested URL could not be retrieved
While trying to retrieve the URL: http://example.com/upload/url
The following error was encountered:
Read Timeout
The system returned:
[No Error]
A Timeout occurred while waiting to read data from the network. The network or server may be down or congested. Please retry your request.
Will Keep-Alive do anything here? It is set to 5 in production, and the Apache Timeout is 300.
I have searched many similar error posts on this site, but no luck.
I am planning to set up a cron job and only upload the files from the front end. I hope that will solve this, but I also want to know what factor is causing the error.
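If you do move the processing into a cron job, reading the workbook in chunks keeps memory bounded instead of loading all 200k rows at once. Here is a sketch using PHPExcel's read-filter mechanism; the interface name is from the PHPExcel documentation, while the file path, chunk size, and row count are purely illustrative:

```php
<?php
// Chunked reading with PHPExcel (illustrative paths and sizes).
require_once 'PHPExcel/IOFactory.php';

class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    // Tell the filter which block of rows to read next.
    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Always read the heading row; otherwise only the current chunk.
        return ($row == 1) || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$inputFile = '/path/to/users.xlsx';   // hypothetical path
$chunkSize = 2000;
$filter    = new ChunkReadFilter();

$reader = PHPExcel_IOFactory::createReader('Excel2007');
$reader->setReadFilter($filter);

for ($startRow = 2; $startRow <= 200000; $startRow += $chunkSize) {
    $filter->setRows($startRow, $chunkSize);
    $objPHPExcel = $reader->load($inputFile);   // loads only this chunk
    $rows = $objPHPExcel->getActiveSheet()->toArray();
    // ... insert $rows into the database here ...
    $objPHPExcel->disconnectWorksheets();       // free memory before the next chunk
    unset($objPHPExcel);
}
```

Run from the CLI by cron, this avoids the web server's read timeout entirely, since no HTTP request is kept open while the rows are inserted.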
Related
I have a problem with multiple file uploads in PHP. I have set the following php.ini settings in the .htaccess file:
php_value upload_max_filesize 1024M
php_value post_max_size 1024M
php_value max_execution_time 120
php_value max_input_time 120
php_value max_file_uploads 40
So when I upload images of approximately 40 MB total size, the server responds with a failed status or a connection reset, instead of uploading the images within a minute. If I upload a little less than 40 MB, it works fine. Is there any other setting I have to change? How can I fix this issue?
Most likely you're running against the clock or have conflicting settings, but it's hard to tell from the amount of information you provided.
Even though you set up your PHP instance to accept uploads of up to 1024M (are you sure you need this, by the way?), there's a lot more you need to consider:
php_value max_execution_time 120
php_value max_input_time 120
The above means that, no matter what else happens, your PHP processes will be stopped after 120 seconds. It may well be that almost 40 MB is exactly what you can upload in under 120 seconds.
Now, even if your connection speed allowed you to upload more than 40 MB in less than 120 seconds, there are more settings that may be conflicting, as the ones above only apply to the PHP process.
Check your Apache settings (I assume you use Apache, given the tag on your question) and look for Apache's directives regarding execution times and upload limits. Even if PHP were configured to allow 1 terabyte per file and 24 hours per process, if Apache has more restrictive limits, Apache will constrain your upload sizes and running times.
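As a concrete illustration, these are the Apache-side directives to look at; the values here are examples, not recommendations:

```apache
# httpd.conf or the relevant <VirtualHost> -- example values only
Timeout 600                    # seconds Apache waits for network reads/writes
LimitRequestBody 1073741824    # max request body in bytes (1 GB; 0 means unlimited)

# If PHP runs under mod_fcgid, its limits apply as well:
FcgidIOTimeout 600
FcgidMaxRequestLen 1073741824
```

Whichever limit is lowest across Apache and PHP is the one that wins, so check both sides.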
I am facing a problem uploading files via a PHP script on an external server whose configuration files I have no privileges to edit. With small files everything is OK, but when I try to upload a bigger file (not much, about 27 MB), after 5 minutes I get an error message:
Internal Server Error
The server encountered an internal error or misconfiguration and was
unable to complete your request.
I tried all the possible solutions:
adding a php.ini file in the folder
adding a .htaccess file
I changed the following parameters:
max_input_time = 900
upload_max_filesize = 40M
post_max_size = 40M
memory_limit = 256M
I even tried the set_time_limit(900) function in my script.
With all these changes, phpinfo() shows that the time and size limits have been increased, but I still hit the timeout after about 300 seconds.
Is there a solution to this problem?
I want to upload files of up to 10 GB using a normal PHP form. But even after increasing the values below,
upload_max_filesize
post_max_size
php_value upload_max_filesize
php_value post_max_size
request_terminate_timeout
FcgidMaxRequestLen
I am able to upload a file of up to 3.5 GB without any problem, but above that I get the error UPLOAD_ERR_PARTIAL. I also referred to a lot of sites; the answer I thought relevant was adding header('Connection: close'), but adding that line made no difference. Could anyone guide me on this?
This error usually means that the upload was cancelled by the user, though sometimes a server problem can cause it as well. Try contacting your hosting provider and see what they can do.
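Beyond contacting the host: UPLOAD_ERR_PARTIAL often indicates the request was cut off between the web server and PHP, and with FastCGI setups the web server's own limits sit in front of PHP's. A hedged example of the mod_fcgid directives the question already mentions (values illustrative):

```apache
# Apache mod_fcgid -- FcgidMaxRequestLen is in bytes and defaults to
# only 131072 (128 KB) in recent releases:
FcgidMaxRequestLen 10737418240   # ~10 GB
FcgidIOTimeout 3600              # seconds before FastCGI I/O times out
```

Under PHP-FPM, the analogous knob is request_terminate_timeout in the pool configuration, which kills the worker mid-upload if it is lower than the transfer time.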
Background: I'm planning on using the JW FLV Media Player for streaming some videos:
http://www.longtailvideo.com/players/jw-flv-player/
Question: What settings, both PHP globals and php.ini, would I need to change in order to handle the uploading of large video files?
Sub-question: Is there any way, perhaps through the .htaccess file, to have the settings apply to a single domain only? I host several of my websites on the same server, and if I raised the execution timeout to a few minutes for videos, I wouldn't want someone on one of my other sites to have to sit through that kind of timeout for a regular upload if an error occurred.
In .htaccess, this will let you upload a 20 MB file and raise the script's time limit to 200 seconds. Most shared hosts won't let you do this, though, and will enforce a global limit.
php_value upload_max_filesize 20M
php_value post_max_size 20M
php_value max_execution_time 200
php_value max_input_time 200
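To answer the sub-question: a .htaccess file already applies per directory, so placing the block above only in the video site's document root scopes it to that domain. With mod_php you can alternatively put the same directives in that site's virtual host (domain and paths below are placeholders):

```apache
<VirtualHost *:80>
    ServerName videos.example.com
    DocumentRoot /var/www/videos

    # These php_value directives affect only this vhost:
    php_value upload_max_filesize 20M
    php_value post_max_size 20M
    php_value max_execution_time 200
    php_value max_input_time 200
</VirtualHost>
```

Either way, your other sites on the same server keep the default limits.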
I have an image upload for a slideshow, and users are continually uploading files of 2 MB and more. Files under this size work fine, but files over it cause what looks like a browser timeout.
Here are my php ini settings:
Max memory allocation: 12M
Max file upload size: 10M
Max HTTP Post size: 10M
Max execution time: 60
Max input parsing time: 120
These settings are in the configuration file itself, and I can change them directly. Changes show up when using phpinfo().
I am running on an Apache server with PHP 4.3.9 (the client's choice, not mine). The Apache server's request limit is set to the default, which I believe is somewhere around 2 GB?
When I use the Firebug network monitor, it does look like I am not receiving a full response from the server, though I am not very experienced with this tool. Things seem to time out at around 43 seconds.
All the help I can find on the net points to the settings above as the culprits, but all of those limits are much higher than this 2 MB file and the 43-second timeout.
Any suggestions at where I can go from here to solve this issue?
Here are the relevant php.ini settings from phpinfo(), shown as local value then master value. Let me know if I need to post any more.
file_uploads On On
max_execution_time 60 60
max_input_nesting_level 64 64
max_input_time 120 120
memory_limit 12M 12M
post_max_size 10M 10M
safe_mode Off Off
upload_max_filesize 10M 10M
upload_tmp_dir no value no value
Make sure you have error reporting activated in php.ini: display_errors = On. This might give you a clue about what's going on. Production servers usually have (and should have) error reporting disabled.
I recently had a similar problem, and increasing the memory_limit setting worked for me. If you read file contents into variables, each variable takes roughly as much memory as the file size, increasing the script's memory requirements.
Where are those settings? If you're using .htaccess then your Apache configuration might not be allowing you to override those settings.
I'd suggest checking with a phpinfo() call if those settings are indeed being applied or not:
<?php
phpinfo();
?>
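If you prefer a targeted check over scanning the full phpinfo() page, ini_get() reports the effective value of each directive (directive names as in the PHP manual):

```php
<?php
// Print the effective values of the upload-related directives.
$directives = array('upload_max_filesize', 'post_max_size',
                    'max_execution_time', 'max_input_time', 'memory_limit');
foreach ($directives as $d) {
    echo $d, ' = ', ini_get($d), PHP_EOL;
}
```

If the printed values don't match what you set in .htaccess, the override is being ignored or superseded somewhere upstream.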
If it's a shared host, your host might have set a limit that overrides yours.
Otherwise, try making the POST size limit higher than the file upload limit. AFAIK, uploads are POSTed.
If the problem is the host overriding your timeout, you could look for a host that still runs Apache 1: in Apache 1, the local .htaccess overrides the global setting, even for the timeout.
Otherwise, there are dozens of Java applet uploaders available for just a few dollars (Google it). They split the file, upload the parts, and put them back together transparently.
This is a guaranteed fix for the timeout, and it has the added advantage of letting users pause and resume their uploads, see progress, and so on.
(Flash-based uploaders don't have this advantage.)