I have developed a page to upload a big CSV file (more than 7000 rows) to the database. Sometimes it works fine, sometimes not. Sometimes I get an error like "Data not received" in the middle of the upload, but the values are inserted into the database up to that moment. I have checked the php.ini on the server:
max_execution_time = 30 ; Maximum execution time of each script, in seconds
max_input_time = 60 ; Maximum amount of time each script may spend parsing request data
;max_input_nesting_level = 64 ; Maximum input variable nesting level
memory_limit = 128M ; Maximum amount of memory a script may consume (128MB)
The server hosts a lot of projects, all of them using Zend Framework, but I am using plain PHP for my project. My system admin said the php.ini file is here: /usr/local/zend/etc/php.ini
Any suggestions?
Related
In a PHP application I am uploading 20-30 files at once. Each file is around 100-200 MB, which means I am uploading more than 2 GB of data to the server.
Because the upload takes around 20-30 minutes, a general AJAX polling job gets cancelled after some time.
I have the following configuration:
upload_max_filesize = 4096M
post_max_size = 4096M
max_input_time = 600
max_execution_time = 600
During this process my CPU usage only goes up to 10-20%. I have a 12-core Linux machine with 32 GB of RAM.
The application is running on PHP 8.0, Apache 2, MySQL 8 and Ubuntu 20.
Can anyone suggest what else I can check?
max_execution_time: This sets the maximum time in seconds a script is
allowed to run before it is terminated by the parser. This helps
prevent poorly written scripts from tying up the server. The default
setting is 30. When running PHP from the command line the default
setting is 0.
max_input_time: This sets the maximum time in seconds a script is
allowed to parse input data, like POST and GET. Timing begins at the
moment PHP is invoked at the server and ends when execution begins.
The default setting is -1, which means that max_execution_time is used
instead. Set to 0 to allow unlimited time.
I think you should change it to:
max_input_time = 1800 & max_execution_time = 1800 (30 minutes)
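For what it's worth, max_execution_time can also be raised from within the script, while max_input_time is a per-directory setting that has to go into php.ini, a .user.ini file, or (with mod_php) .htaccess. A minimal sketch of the runtime part:
<?php
// Equivalent to max_execution_time = 1800 for this request only.
set_time_limit(1800);
// max_input_time cannot be changed at runtime; put it in php.ini, a
// .user.ini file, or .htaccess (php_value max_input_time 1800) instead.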
I have a WordPress site with a custom CSV import script. The file I want to import is 24 MB and contains 12,000 products. At over 10,500 products the script stops.
It worked until I reached this number of products.
Here is my configuration:
upload_max_filesize 500M
post_max_size 500M
max_execution_time 18000
max_input_time 18000
wait_timeout 60
What do I need to change?
If you get any imports at all, it means that upload limitations are not to blame. If you were hitting those, none of the import would take place.
The two most probable "candidates" are: the execution time limit was hit, or the memory limit was reached.
For the former, you already have max_execution_time set to quite a large number, and I assume your import script is not taking that long (correct me if I'm wrong).
So the most obvious one is that your script reaches memory_limit and just halts, hence the incomplete import.
If increasing the memory_limit does not help, you will need to enable error reporting in order to find out what's going on.
To do that in WordPress, simply enable debug mode by adding the following line in your wp-config.php:
define('WP_DEBUG', true);
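If you would rather have the errors written to a log file (wp-content/debug.log) than printed on the page, the standard companion constants are (not part of the original answer, just a common addition):
define('WP_DEBUG_LOG', true);
define('WP_DEBUG_DISPLAY', false);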
Optional side note
Having said that, importing large amounts of data by way of unreasonably increasing allowed resources is probably not the right way to go.
Try implementing incremental imports, i.e. the receiving script just parses the submitted data, then uses AJAX to import the records one by one. Or the import submission form takes index parameters (import records 0 to 1000), and so on.
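A rough sketch of that batched approach (my own illustration, not WordPress-specific; the CSV path and handle_product_row() are placeholders for your actual importer):
<?php
// import.php?offset=0&limit=1000 - each request imports one slice of the CSV,
// so no single request comes close to the time or memory limits.
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$limit  = isset($_GET['limit'])  ? (int) $_GET['limit']  : 1000;

$handle = fopen('/path/to/products.csv', 'r');   // placeholder path
$row  = 0;
$done = 0;
while (($fields = fgetcsv($handle)) !== false) {
    if ($row++ < $offset) {
        continue;                  // rows already handled by earlier requests
    }
    handle_product_row($fields);   // placeholder for the real import logic
    if (++$done >= $limit) {
        break;
    }
}
fclose($handle);

// Tell the AJAX caller whether to ask for the next slice.
echo json_encode(array('next_offset' => $offset + $done, 'finished' => $done < $limit));
Reading the file with fgetcsv() also keeps memory flat, because only one row is held in memory at a time.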
Allowing a lot of resources taken by PHP is asking for trouble. Malicious users can exploit that to easily bring your website down.
When I try to paste a large (5000-line) SQL file into phpMyAdmin, I get this error. I know I can use the upload, but on my old version of phpMyAdmin this used to work without a problem.
ALERT - configured request variable value length limit exceeded - dropped variable
'sql_query' (attacker '111.171.123.123', file '/usr/share/apache2/phpmyadmin/import.php'),
referer: https://example.co.uk/phpmyadmin/db_sql.php?db=test&server=1&
token=0f355f8bbc6fc09d5c512e0409e9cac9&db_query_force=1
I have already tried setting $cfg['ExecTimeLimit'] = 0;
php.ini
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
; Maximum execution time of each script, in seconds
max_execution_time = 120
; Maximum amount of time each script may spend parsing request data
max_input_time = 60
;max_input_nesting_level = 64 ; Maximum input variable nesting level
;Maximum amount of memory a script may consume (128MB)
memory_limit = 100M
As far as I'm concerned, this message means that Suhosin (a security patch for PHP) is blocking your request because of its length. The simplest way to solve your problem without changing Suhosin's config is to import a file with the same SQL statements into phpMyAdmin (it allows uploading files for import).
So basically all you need is to create a simple text file, paste the same SQL statements into it, and upload this file to phpMyAdmin; it has the appropriate page for such imports.
If you really want to use phpMyAdmin for pasting, try version 3.4.3.2 or higher, as I am not sure whether your version has this feature:
Partial import
Allow the interruption of an import in case the script detects it is close to the PHP timeout limit. (This might be good way to import large files, however it can break transactions.)
http://www.phpmyadmin.net/home_page/index.php
I hope it helps.
I know how to do this on a user form using HTML. However, malicious users can bypass that form, call the server action page directly and send abnormally large text.
Is there any way to deny such requests on the server? Perhaps there is a mechanism by which we can know the size of the POST data in advance, before it actually arrives, similar to uploads of huge files.
Edit the php.ini file and set post_max_size to the number of megabytes you want to allow. Keep in mind you need it high enough for long blog posts and whatnot.
post_max_size = 4M
Other settings you should check are
; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
; Note: This directive is hardcoded to 0 for the CLI SAPI
max_execution_time = 30
; Maximum amount of time each script may spend parsing request data. It's a good
; idea to limit this time on productions servers in order to eliminate unexpectedly
; long running scripts.
; Note: This directive is hardcoded to -1 for the CLI SAPI
; Default Value: -1 (Unlimited)
; Development Value: 60 (60 seconds)
; Production Value: 60 (60 seconds)
; http://php.net/max-input-time
max_input_time = 60
; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
memory_limit = 30M
If you are using Apache or Nginx you can also set the maximum request size in the server config, so the request is blocked before it even reaches PHP.
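On the PHP side, you can also reject an oversized request yourself before doing any work: when the body exceeds post_max_size, PHP leaves $_POST and $_FILES empty, but the declared Content-Length header is still visible. A minimal sketch (return_bytes() is my own helper, not a built-in):
<?php
// Convert shorthand ini values like "4M" into bytes.
function return_bytes($val)
{
    $val  = trim($val);
    $unit = strtolower(substr($val, -1));
    $val  = (int) $val;
    switch ($unit) {
        case 'g': $val *= 1024;   // fall through
        case 'm': $val *= 1024;   // fall through
        case 'k': $val *= 1024;
    }
    return $val;
}

$declared = isset($_SERVER['CONTENT_LENGTH']) ? (int) $_SERVER['CONTENT_LENGTH'] : 0;

if ($declared > return_bytes(ini_get('post_max_size'))) {
    header('HTTP/1.1 413 Request Entity Too Large');
    exit('Request body too large.');
}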
You can use Suhosin.
It's a protection system for PHP, and among its settings you can forbid requests over a certain length.
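If I remember correctly, the directives involved are suhosin.post.max_value_length and suhosin.request.max_value_length (exceeding them produces the "configured request variable value length limit exceeded" message shown in the phpMyAdmin question above). The values below are only an example:
suhosin.post.max_value_length = 1000000
suhosin.request.max_value_length = 1000000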
By the time PHP processes the POST data, it's a little late. This is better accomplished at the web server level. If you're using Apache, check out the LimitRequestBody directive.
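For illustration, in Apache that is a one-line directive (the value is in bytes; this example caps request bodies at 4 MB), and Nginx's counterpart is client_max_body_size:
# Apache: in the server config, a <Directory> block or .htaccess
LimitRequestBody 4194304
# Nginx equivalent
client_max_body_size 4m;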
I'm confused... I can't seem to upload files in the 2 GB range. When I try using curl to send a 1.92 GB file to my site (through an API), it doesn't report anything at all; the response is just blank. When I send a 1 KB file, it reports back like it should.
When I try uploading via the upload form, it ends up freezing midway, around 33%. I'm not sure whether only the progress bar has frozen or whether the actual file upload itself has been suspended. I suspect it is only the progress bar, because it still says data is being sent even after the bar freezes.
My php.ini (yes, it's reflected by phpinfo() as well):
register_globals = Off
magic_quotes_gpc = Off
post_max_size = 2047M
upload_max_filesize = 2047M
max_execution_time = 25200 ; Maximum execution time of each script, in seconds
max_input_time = 25200 ; Maximum amount of time each script may spend parsing request data
memory_limit = 2048M ; Maximum amount of memory a script may consume (16MB)
short_open_tag = On
My VPS doesn't actually have 2 GB of RAM at its disposal, but does memory_limit really need to be set this high?
How should I go about testing this? I know 400 MB files work; I haven't tested anything between 400 MB and 1.92 GB.
You will need a premium account to test up to 2 GB, so here is one you can play with:
User: testreferral
Pass: 1234
http://filefx.com
I don't understand where this problem is arising.
Check for:
Memory limit. Try uploading files above and below the actual memory limit.
Time limit. Your uploads aren't taking 7+ hours, are they?
The effective settings. Some settings might be overridden by server-level or per-directory settings.
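A quick way to see the values PHP is actually running with, from the script's own point of view (a plain illustrative snippet):
<?php
foreach (array('upload_max_filesize', 'post_max_size', 'memory_limit',
               'max_execution_time', 'max_input_time') as $directive) {
    echo $directive, ' = ', ini_get($directive), PHP_EOL;
}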
PHP: mysql query skipped/ignored after large file uploads?
MySQL was timing out during the file upload, so the file wasn't showing up in the DB.
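One way around that (my own sketch, not from the original thread; the path, credentials and table are placeholders) is to open the MySQL connection only after the long-running upload handling is done, so the connection never sits idle past MySQL's wait_timeout:
<?php
// Do the slow part first: receive/move/process the uploaded file.
move_uploaded_file($_FILES['file']['tmp_name'],
                   '/path/to/storage/' . basename($_FILES['file']['name']));

// Only now open the connection and run the query, so wait_timeout cannot hit.
$db   = new mysqli('localhost', 'user', 'pass', 'mydb');   // placeholder credentials
$name = $_FILES['file']['name'];
$stmt = $db->prepare('INSERT INTO uploads (filename) VALUES (?)');
$stmt->bind_param('s', $name);
$stmt->execute();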