I have a WordPress site with a custom CSV import script. The file I want to import is 24 MB and contains 12,000 products. At around 10,500 products the script stops.
It worked until I reached this number of products.
Here is my configuration:
upload_max_filesize 500M
post_max_size 500M
max_execution_time 18000
max_input_time 18000
wait_timeout 60
What do I need to change?
If you get any imports at all, the upload limits are not to blame. If you were hitting those, none of the import would take place.
The two most probable candidates are: the execution time limit was hit, or the memory limit was reached.
For the former, you already have max_execution_time set to quite a large number, and I assume your import script is not taking that long (correct me if I'm wrong).
So the most likely cause is that your script reaches memory_limit and simply halts, hence the incomplete import.
If increasing memory_limit does not help, you will need to enable error reporting in order to find out what's going on.
To do that in WordPress, simply enable debug mode by adding the following line in your wp-config.php:
define('WP_DEBUG', true);
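If you also want the errors written to a log file instead of the page, and want to give WordPress more memory while you test, a minimal wp-config.php sketch could look like this (the 256M value is only an example, adjust it to your server):
define('WP_DEBUG', true);
define('WP_DEBUG_LOG', true);      // write errors to wp-content/debug.log
define('WP_DEBUG_DISPLAY', false); // keep errors out of the page output
define('WP_MEMORY_LIMIT', '256M'); // raise the memory available to WordPress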
Optional side note
Having said that, importing large amounts of data by unreasonably increasing the allowed resources is probably not the right way to go.
Try implementing incremental imports instead: the receiving script just parses the submitted data, then AJAX requests import the records one chunk at a time, or the import submission form takes index parameters (import records 0 to 1000, and so on); see the sketch after this note.
Allowing PHP to consume a lot of resources is asking for trouble; malicious users can exploit that to bring your website down easily.
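As a rough illustration of the index-parameter approach (the CSV path and the import_one_product() helper are placeholders for your own logic, not part of WordPress):
<?php
// Hypothetical batch endpoint, e.g. import-batch.php?offset=0&limit=1000
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$limit  = isset($_GET['limit'])  ? (int) $_GET['limit']  : 1000;

$handle = fopen('products.csv', 'r');
$header = fgetcsv($handle); // read the heading row once

// Skip the rows that earlier requests have already imported
for ($i = 0; $i < $offset && fgetcsv($handle) !== false; $i++);

$imported = 0;
while ($imported < $limit && ($row = fgetcsv($handle)) !== false) {
    import_one_product(array_combine($header, $row)); // your existing per-product logic
    $imported++;
}
fclose($handle);

// Report progress so the calling page (or AJAX loop) can request the next chunk
header('Content-Type: application/json');
echo json_encode(array('offset' => $offset + $imported, 'done' => $imported < $limit));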
Related
I managed to build a project similar to the video: "Import CSV in Laravel 5.5 + Matching Fields"
Using Excel import instead of CSV, it works fine with small Excel files (fewer than 1,000 rows), but with Excel files of more than 13,000 rows the app keeps throwing the following error:
Maximum execution time of 30 seconds exceeded
Level: ERROR
Exception:
{
  "class": "Symfony\\Component\\Debug\\Exception\\FatalErrorException",
  "message": "Maximum execution time of 30 seconds exceeded",
  "code": 1,
  ...
I tried different approaches and read the Laravel Excel documentation (Import section: Chunk Reading and Queued Reading), but that didn't work either, because I import the Excel file into a collection, then match the fields, and then create the new models and save them.
Please advise: any tip that helps me import large Excel files and match database fields with the Excel column headings is welcome.
Change your PHP configuration. You can increase the execution time in the four ways below.
Increase the execution time at php.ini level,
max_execution_time=300
Use PHP at runtime to increase it,
ini_set('max_execution_time',300); //300 seconds = 5 minutes
Increase it inside .htaccess,
php_value max_execution_time 300
Set the time limit in the __construct method, or in your index controller, wherever you need the large time limit.
public function __construct()
{
set_time_limit(500000);
}
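Separately, since the question mentions the Chunk Reading section of the Laravel Excel docs, a minimal sketch of that approach could look roughly like the following (the Product model, the column names and a Laravel Excel 3.x install are assumptions on my part):
<?php
use App\Product;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class ProductsImport implements ToModel, WithChunkReading, WithHeadingRow
{
    // Called once per row; thanks to WithHeadingRow the keys are the column headings
    public function model(array $row)
    {
        return new Product([
            'name'  => $row['name'],
            'price' => $row['price'],
        ]);
    }

    // Read the spreadsheet 1,000 rows at a time instead of loading it all into memory
    public function chunkSize(): int
    {
        return 1000;
    }
}

// Usage, e.g. in a controller:
// Excel::import(new ProductsImport, $request->file('import'));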
Sometimes I receive a 504 Gateway Time-out, and sometimes I get the Laravel error:
Maximum execution time of 30 seconds exceeded (or whatever value I set in max_execution_time, as suggested by @Vaibhavi Sojitra).
With these errors, any of the following PHP parameters in php.ini may be causing the PHP process to end abruptly, which makes Nginx see a "reset" from its peer (in this case, the php-fpm backend).
I couldn't find a better Laravel-level technique to solve this, so here is the solution that worked for me (until I find a better one):
Increase the following parameters in php.ini:
max_input_time = ...
max_execution_time = ...
default_socket_timeout = ...
After that the import process worked well.
I have a relatively small store, about 20k SKUs, all simple products. I'm using Magento 1.7.2, but I had the same problem with all the older versions. I simply cannot export my products to a CSV: out of memory when running directly from the dataflow profiles in the Magento backend, and the same error when running it from the shell.
Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 71 bytes) in /home/public_html/app/code/core/Mage/Eav/Model/Entity/Attribute/Source/Table.php on line 62
I've increased the memory limits and execution times in Magento's .htaccess to 512M, in Magento's php.ini to 512M and in my VPS PHP configuration to 512MB. It still burns through it in about 4 minutes and runs out of memory.
I'm so confused; my entire database (zipped) is only 28 MB! What am I missing to make Magento's "export all products" function work?
Magento dataflow does have a tendency to use massive amounts of memory making exports on large stores difficult. For stores with large product catalogues it is often a lot quicker and easier to write a script to export directly from the database rather than through dataflow.
This might be a problem where your .htaccess file(s) aren't overriding the memory_limit setting globally set in php.ini.
Another option is to set memory_limit to unlimited in your index.php for testing. Then you will know whether your changes in .htaccess are taking effect or not.
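For example, as a temporary test only (remove it afterwards), something like this near the top of Magento's index.php:
ini_set('memory_limit', '-1'); // -1 means no limit; if the export now completes, the .htaccess value is not being applied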
I solved this problem by exporting 500, 1,000 or however many products I wanted at a time, with a custom export script.
I made a file that received $start and $productsToExport as parameters. The file took the collection of products and then used
LIMIT $start * $productsToExport, $productsToExport
This script only returned the number of products exported.
I made a second, master script that made recursive AJAX calls to the first file, starting with $start = 0 and $productsToExport = 500. When that AJAX call finished, it did the same with $start = 1, and so on, until no products were left.
The advantage of this is that it doesn't overload the server (each AJAX call runs only after the previous one has finished), and if an error occurs, the script carries on. Also, the memory_limit and max_execution_time are safe, since each request only handles a small batch.
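A rough sketch of what such a worker file could look like in Magento 1 (the file paths, the selected attributes and the chunk handling are my assumptions, not the exact script from this answer):
<?php
// Hypothetical worker, e.g. export_chunk.php?start=0&count=500, placed in the Magento root
require_once 'app/Mage.php';
Mage::app();

$start = isset($_GET['start']) ? (int) $_GET['start'] : 0;
$count = isset($_GET['count']) ? (int) $_GET['count'] : 500;

$collection = Mage::getModel('catalog/product')->getCollection()
    ->addAttributeToSelect(array('sku', 'name', 'price'))
    ->setPageSize($count)
    ->setCurPage($start + 1); // pages are 1-based, so chunk 0 is page 1

$fh = fopen('var/export/products_' . $start . '.csv', 'w');
foreach ($collection as $product) {
    fputcsv($fh, array($product->getSku(), $product->getName(), $product->getPrice()));
}
fclose($fh);

echo count($collection); // the master script stops requesting chunks once this drops below $count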
If by 20k SKUs you mean 20,000, then this is entirely possible. The export is very hungry for memory, unfortunately. I always increase memory_limit to 2000M in this case; it then takes a while to create the file, but it succeeds in the end.
I'm calling an MLS service that responds with 4,000+ records, and I need to process each and every one of them, as well as insert all of the meta data per listing.
I'm able to get to about 135 records (times roughly 150 meta records each), and then the script apparently stops responding, or at least stops processing the rest of the data.
I've added the following to my .htaccess file:
php_value memory_limit 128M
But this doesn't seem to help at all. Do I need to process the data in chunks, or is there another way to ensure that the script will actually finish?
You should probably enable display_errors and error_reporting to get a better picture of why the script isn't finishing.
However, you should also make sure the time limit isn't being hit, by calling:
set_time_limit( 0 );
This gives you an unlimited time period. You can also just set it to something relatively high, like 600 (10 minutes).
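Putting those together, the top of the processing script might look like this while you debug (development settings only; don't leave display_errors on in production):
error_reporting(E_ALL);         // report every notice, warning and error
ini_set('display_errors', '1'); // show them in the output while debugging
set_time_limit(0);              // 0 removes the execution time limit; something like 600 also works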
It isn't the memory; it's most likely the script execution time.
Try adding this to your .htaccess, then restart Apache:
php_value max_execution_time 259200
When I try to paste a large (5,000-line) SQL file into phpMyAdmin, I get the error below. I know I can use the upload feature, but on my old version of phpMyAdmin pasting used to work without a problem.
ALERT - configured request variable value length limit exceeded - dropped variable
'sql_query' (attacker '111.171.123.123', file '/usr/share/apache2/phpmyadmin/import.php'),
referer: https://example.co.uk/phpmyadmin/db_sql.php?db=test&server=1&
token=0f355f8bbc6fc09d5c512e0409e9cac9&db_query_force=1
I have already tried setting $cfg['ExecTimeLimit'] = 0;
php.ini
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
; Maximum execution time of each script, in seconds
max_execution_time = 120
; Maximum amount of time each script may spend parsing request data
max_input_time = 60
;max_input_nesting_level = 64 ; Maximum input variable nesting level
;Maximum amount of memory a script may consume (128MB)
memory_limit = 100M
As far as I can tell, this message means that Suhosin (a security patch for PHP) is blocking your request because of its length. The simplest way to solve your problem without changing Suhosin's config is to import a file with the same SQL statements into phpMyAdmin (it allows uploading files for import).
So basically all you need to do is create a simple text file, paste the same SQL statements into it, and upload this file to phpMyAdmin; it has a dedicated page for such imports.
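If the file-based import then runs into a size limit of its own, the usual php.ini directives to check are the upload and post limits (the 50M values below are only examples):
upload_max_filesize = 50M
post_max_size = 50M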
If you really want to paste the SQL into phpMyAdmin, try using version 3.4.3.2 or higher, as I am not sure whether your version has this:
Partial import
Allow the interruption of an import in case the script detects it is close to the PHP timeout limit. (This might be a good way to import large files; however, it can break transactions.)
http://www.phpmyadmin.net/home_page/index.php
I hope it helps.
I am having a very common problem, and it seems that none of the available solutions work for us.
We have a LAMP server which receives a high amount of traffic. Using this server, we perform a regular file submission upload. On small file uploads it works perfectly. With files of around 4-5 MB, this submission upload fails intermittently (sometimes it works, but many times it fails).
We have the following configuration on our PHP:
max_input_time: 600
max_execution_time: 600
max_upload_size: 10M
post_max_size: 10M
Apache setting:
Timeout: 600
Keep-Alive Timeout: 15
Keep-Alive: On
Per Child: 1000
Max Conn: 100
I wonder if anyone can help me with this. We have found similar issues and solutions online, but none of them work in our case.
Thank you so much. Any input / feedback is much appreciated!
The connection could be terminating at several places:
Apache
Post size limit inside of php.ini
Memory limit inside of php.ini
Input time limit inside of php.ini
Execution time limit inside of php.ini or set_time_limit()
I would increase all of these and see if the problem persists. But you will have to restart Apache for the changes in php.ini to take effect.
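For example, the relevant php.ini directives could be raised roughly like this (example values only, not recommendations; restart Apache afterwards):
post_max_size = 20M
upload_max_filesize = 20M
memory_limit = 256M
max_input_time = 1200
max_execution_time = 1200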
These are also affected by the end user's connection speed: if the upload fails only for certain users, it is because their connection is slower than others' and their connection with the server is terminating.