I managed to build a project similar to the video: "Import CSV in Laravel 5.5 + Matching Fields"
Using Excel import instead of CSV, it works fine with small Excel files (fewer than 1,000 rows), but with Excel files of more than 13,000 rows the app keeps throwing the following error:
Maximum execution time of 30 seconds exceeded
Level
ERROR
Exception
{
"class": "Symfony\\Component\\Debug\\Exception\\FatalErrorException",
"message": "Maximum execution time of 30 seconds exceeded",
"code": 1,
...
I tried different approaches and read the Laravel Excel documentation (Import section > Chunk Reading and Queued Reading), but that didn't work either, because I import the Excel file into a collection, then match the fields, and then create new models and save them.
Please advise on any tip that can help me import large Excel files and match database fields with the Excel file's column headings.
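For reference, a minimal sketch of what chunked reading with heading-based field matching can look like in Laravel Excel 3.x is shown below; the ProductsImport class, the $columnMap array, and the Product model are hypothetical names used only for illustration:

<?php

namespace App\Imports;

use App\Product;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithBatchInserts;

class ProductsImport implements ToModel, WithHeadingRow, WithChunkReading, WithBatchInserts
{
    // Map of Excel heading => database column, chosen by the user.
    // Keys must match the slugged headings WithHeadingRow produces
    // (e.g. 'Product Name' becomes 'product_name').
    protected $columnMap;

    public function __construct(array $columnMap)
    {
        $this->columnMap = $columnMap;
    }

    // Called once per row; rows are read in chunks, so the whole file
    // is never loaded into memory at once.
    public function model(array $row)
    {
        $attributes = [];
        foreach ($this->columnMap as $heading => $dbField) {
            $attributes[$dbField] = $row[$heading] ?? null;
        }

        return new Product($attributes);
    }

    public function chunkSize(): int
    {
        return 1000; // rows read per chunk
    }

    public function batchSize(): int
    {
        return 1000; // rows inserted per query
    }
}

// Usage (e.g. in a controller):
// Excel::import(new ProductsImport($columnMap), $request->file('file'));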
Change your PHP configuration. You can increase the execution time in the following four ways.
Increase the execution time at the php.ini level:
max_execution_time=300
Use ini_set() at runtime to increase it:
ini_set('max_execution_time', 300); // 300 seconds = 5 minutes
Use .htaccess to increase it:
php_value max_execution_time 300
Set the time limit in the __construct method, or in whichever controller needs the larger limit:
public function __construct()
{
set_time_limit(500000);
}
Sometimes I receive a 504 Gateway Time-out, and sometimes I get the Laravel error:
Maximum execution time of 30 seconds exceeded (or whatever value I set in max_execution_time, as suggested by #Vaibhavi Sojitra).
With these errors, it may be any of the following PHP parameters in the php.ini that is causing the PHP process to end abruptly, thus causing Nginx to get a "reset" from its peer (in this case, the php-fpm backend).
I couldn't find a better Laravel technique to solve this issue. The solution that worked for me (until I find a better one) was to
increase the following parameters in php.ini:
max_input_time = ...
max_execution_time = ...
default_socket_timeout = ...
After that the import process worked well.
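For example (illustrative values only, not recommendations; pick numbers that fit your workload and server):

max_input_time = 300
max_execution_time = 300
default_socket_timeout = 600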
Related
I'm trying to import a 150MB .sql.zip file into WAMP phpMyAdmin using this method: saving the import file in C:\wamp\sql and editing C:\wamp\apps\phpmyadmin5.0.2\config.inc.php to include $cfg['UploadDir'] = 'C:\wamp\sql'; at the end, which then offers an option during the import process to import a file saved to C:\wamp\sql.
However, I keep receiving a timeout error in phpMyAdmin, with advice to rerun the import selecting the same file, but on the 2nd run I always receive SQL errors.
So, I've set max_execution_time = 4500 in Wamp's PHP.ini (4500 seconds equals 75 minutes) and restarted Apache and MySQL.
However, the same timeout error occurs when using the same import process. The timeout error occurs after about 5 minutes.
Why is the timeout error occurring within 5 minutes and not within the time set by PHP.ini max_execution_time = 4500 ?
Edit
phpinfo says localhost has:
memory_limit 200M
post_max_size 200M
upload_max_filesize 200M
max_execution_time 4500
I continue to receive the message that a timeout has occurred and to rerun the import again with the same file, but on the 2nd run, I receive an error:
Error
Static analysis:
2 errors were found during analysis.
Unexpected beginning of statement. (near "RT" at position 0)
Unrecognized statement type. (near "INTO" at position 3)
SQL query:
RT INTO `cache_menu`
It looks like the SQL INSERT command is being cut by the process.
How can I safely split the SQL into smaller chunks?
Apart from the maximum execution time, PHP also has limits on the post maximum size, memory limit, and upload max filesize (all of which can be relevant to the file-upload part of a data import). If any one of them reaches its allowed limit, execution fails and a time-out error is thrown.
Hence you may set/change the following:
ini_set('memory_limit', '40M');
ini_set('max_execution_time', 80000);
// Note: post_max_size and upload_max_filesize are PHP_INI_PERDIR settings and
// cannot be changed with ini_set() at runtime; set them in php.ini or .htaccess:
// post_max_size = 40M
// upload_max_filesize = 40M
I am unsure whether this is an adequate solution for someone encountering the same problem, but my workaround was to split the large SQL file into smaller files using SQL Dump File Splitter.
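If you prefer to script the split yourself, here is a rough sketch that only starts a new chunk file once the current statement has ended, so no INSERT gets cut in half. It assumes the dump (as mysqldump typically produces) ends each statement with a ';' at the end of a line; the file names are made up:

<?php
// split_dump.php - naive splitter for a mysqldump-style .sql file.

$source       = 'dump.sql';   // input dump (hypothetical name)
$linesPerFile = 5000;         // rough size of each chunk

$in = fopen($source, 'r');
if ($in === false) {
    die("Cannot open $source\n");
}

$chunk = 1;
$count = 0;
$out = fopen(sprintf('chunk_%03d.sql', $chunk), 'w');

while (($line = fgets($in)) !== false) {
    fwrite($out, $line);
    $count++;

    // Only break to a new file once the current statement is finished.
    if ($count >= $linesPerFile && substr(rtrim($line), -1) === ';') {
        fclose($out);
        $chunk++;
        $count = 0;
        $out = fopen(sprintf('chunk_%03d.sql', $chunk), 'w');
    }
}

fclose($out);
fclose($in);
echo "Wrote $chunk chunk file(s)\n";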
I have a WordPress site with a custom import script for CSV. The file I want to import is 24 MB and has 12,000 products. At just over 10,500 products the script stops.
It worked until I reached this number of products.
Here is my configuration:
upload_max_filesize 500M
post_max_size 500M
max_execution_time 18000
max_input_time 18000
wait_timeout 60
What do I need to change?
If you get any imports at all, it means that upload limitations are not to blame. If you were hitting those, none of the import would take place.
The two most probable "candidates" are: the execution time was hit, or the memory limit was reached.
For the former, you already have max_execution_time set to quite a large number, and I assume your import script is not taking that long (correct me if I'm wrong).
So the most obvious one is that your script reaches the memory_limit and just halts, hence the incomplete import.
If increasing the memory_limit does not help, you will need to enable error reporting in order to find out what's going on.
To do that in WordPress, simply enable debug mode by adding the following line in your wp-config.php:
define('WP_DEBUG', true);
Optional side note
Having said that, importing large amounts of data by way of unreasonably increasing allowed resources is probably not the right way to go.
Try implementing incremental imports: the receiving script just parses the submitted data, then uses AJAX to import it piece by piece. Or the import submission form takes index parameters (import records 0 to 1000), and so on. A rough sketch of this follows below.
Allowing PHP to take a lot of resources is asking for trouble. Malicious users can exploit that to easily bring your website down.
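As a sketch of the batched approach on the WordPress side (the action name, the my_import_row() helper, the CSV path, and the request parameters are all hypothetical; a real implementation should also verify a nonce and capabilities):

// functions.php (or a small plugin): import one slice of the CSV per AJAX call.
add_action('wp_ajax_csv_import_batch', 'csv_import_batch');

function csv_import_batch() {
    $offset = isset($_POST['offset']) ? (int) $_POST['offset'] : 0;
    $limit  = isset($_POST['limit'])  ? (int) $_POST['limit']  : 500;

    $file = new SplFileObject('/path/to/products.csv'); // hypothetical path
    $file->setFlags(SplFileObject::READ_CSV | SplFileObject::SKIP_EMPTY);
    $file->seek($offset + 1); // +1 to skip the header row

    $processed = 0;
    while (!$file->eof() && $processed < $limit) {
        $row = $file->current();
        $file->next();
        if (!is_array($row) || $row === array(null)) {
            continue; // skip blank lines
        }
        my_import_row($row); // hypothetical function that creates one product
        $processed++;
    }

    // The client-side loop calls this endpoint again with offset += limit
    // until 'processed' comes back smaller than 'limit'.
    wp_send_json(array('processed' => $processed, 'next_offset' => $offset + $processed));
}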
I have a relatively small store, about 20k SKUs, all simple products. I'm using Magento 1.7.2 but had the same problem with all the older versions. I simply cannot export my products to a CSV: out of memory when running directly from the Dataflow profiles in the Magento backend, and the same error when running it from the shell.
Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 71 bytes) in /home/public_html/app/code/core/Mage/Eav/Model/Entity/Attribute/Source/Table.php on line 62
I've increased the memory limits and execution times in Magento's .htaccess to 512M, in Magento's php.ini to 512M, and in my VPS PHP configuration to 512M. It still burns through it in about 4 minutes and runs out of memory.
I'm so confused; my entire database (zipped) is only 28MB! What am I missing to make Magento's export-all-products function work?
Magento dataflow does have a tendency to use massive amounts of memory making exports on large stores difficult. For stores with large product catalogues it is often a lot quicker and easier to write a script to export directly from the database rather than through dataflow.
The problem might be that your .htaccess file(s) aren't overriding the memory_limit setting globally set in php.ini.
Another option is to set memory_limit to unlimited in your index.php for testing. Then you will know whether the changes in .htaccess are taking effect or not.
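For example, at the very top of index.php (for testing only; remove it afterwards):

ini_set('memory_limit', '-1'); // -1 = no memory limit, for testing only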
I solved this problem by exporting 500, 1000 or as many as I wanted at a time (with a custom export script).
I made a file that received as parameters $start and $productsToExport. The file took the collection of products, and then used
LIMIT ($start-1)*$productsToExport, $productsToExport
This script only returned the number of products exported.
I made a second, master script that made a recursive AJAX call to the first file, with the parameters $start = 1, $productsToExport = 500. When that AJAX call finished, it did the same with $start = 2, and so on, until no products were left.
The advantage of this is that it doesn't overload the server (one AJAX call runs only after the previous one has finished), and if an error occurs, the script continues. Also, the memory_limit and max_execution_time are safe, since each batch stays well within them.
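A simplified sketch of what such a per-batch export file could look like in Magento 1.x (the file name, output path, and attribute list are made up; error handling omitted):

<?php
// export_batch.php?start=1&count=500 - exports one page of products to CSV.
require_once 'app/Mage.php';
Mage::app('admin');

$start = isset($_GET['start']) ? max(1, (int) $_GET['start']) : 1;   // 1-based page
$count = isset($_GET['count']) ? max(1, (int) $_GET['count']) : 500; // products per page

$collection = Mage::getModel('catalog/product')->getCollection()
    ->addAttributeToSelect(array('sku', 'name', 'price')) // pick what you need
    ->setPageSize($count)   // equivalent to LIMIT ($start-1)*$count, $count
    ->setCurPage($start);

$fh = fopen('var/export/products_' . $start . '.csv', 'w');
$exported = 0;

foreach ($collection as $product) {
    fputcsv($fh, array(
        $product->getSku(),
        $product->getName(),
        $product->getPrice(),
    ));
    $exported++;
}

fclose($fh);

// The master script reads this number and fires the next AJAX call
// (start + 1) until 0 is returned.
echo $exported;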
If by 20k SKUs you mean 20,000, then this is totally possible. The export is very memory-hungry, unfortunately. I always increase the memory_limit to 2000M in this case, and then it takes a while to create the file, but it succeeds in the end.
When I try to paste a large (5,000-line) SQL file into phpMyAdmin, I get this error. I know I can use the upload, but on my old version of phpMyAdmin this used to work without a problem.
ALERT - configured request variable value length limit exceeded - dropped variable
'sql_query' (attacker '111.171.123.123', file '/usr/share/apache2/phpmyadmin/import.php'),
referer: https://example.co.uk/phpmyadmin/db_sql.php?db=test&server=1&
token=0f355f8bbc6fc09d5c512e0409e9cac9&db_query_force=1
I have already tried changing the $cfg['ExecTimeLimit'] = 0;
php.ini
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
; Maximum execution time of each script, in seconds
max_execution_time = 120
; Maximum amount of time each script may spend parsing request data
max_input_time = 60
;max_input_nesting_level = 64 ; Maximum input variable nesting level
;Maximum amount of memory a script may consume (128MB)
memory_limit = 100M
As far as I'm concerned, this message means that Suhosin (a security patch for PHP) is blocking your request because of its length. The simplest way to solve your problem without changing Suhosin's config is to import a file with the same SQL statements into phpMyAdmin (it allows uploading files for import).
So basically all you need - is to create a simple text file, paste the same SQL statements into it, and upload this file to PHPMyAdmin - it has the appropriate page for such imports.
If you really want to use phpMyAdmin, try using version 3.4.3.2 or higher, as I am not sure whether your version has this feature:
Partial import
Allow the interruption of an import in case the script detects it is close to the PHP timeout limit. (This might be good way to import large files, however it can break transactions.)
http://www.phpmyadmin.net/home_page/index.php
I hope it helps.
I am running a huge import to my database (about 200k records) and I'm having a serious issue with my import script timing out. I used my cell phone as a stopwatch and found that it times out at exactly 45 seconds every pass (internal server error)... it only does about 200 records at a time, sometimes fewer. I scanned my phpinfo() and nothing is set to 45 seconds, so I am clueless as to why it would be doing this.
My max_execution_time is set to 5 minutes and my max_input_time is set to 60 seconds. I also tried setting set_time_limit(0); ignore_user_abort(1); at the top of my page but it did not work.
It may also be helpful to note that my error file reads: "Premature end of script headers" as the execution error.
Any assistance is greatly appreciated.
I tried all the solutions on this page and, of course, running from the command line:
php -f filename.php
as Brent says is the sensible way round it.
But if you really want to run a script from your browser that keeps timing out after 45 seconds with a 500 internal server error (as I found when rebuilding my phpBB search index) then there's a good chance it's caused by mod_fcgid.
I have a Plesk VPS and I fixed it by editing the file
/etc/httpd/conf.d/fcgid.conf
Specifically, I changed
FcgidIOTimeout 45
to
FcgidIOTimeout 3600
3600 seconds = 1 hour. Should be long enough for most but adjust upwards if required. I saw one example quoting 7200 seconds in there.
Finally, restart Apache to make the new setting active.
apachectl graceful
HTH someone. It's been bugging me for 6 months now!
Cheers,
Rich
It's quite possible that you are hitting an enforced resource limit on your server, especially if the server isn't fully under your control.
Assuming it's some type of Linux server, you can see your resource limits with ulimit -a on the command line. ulimit -t will also show you just the limits on cpu time.
If your cpu is limited, you might have to process your import in batches.
First, you should be running the script from the command line if it's going to take a while. At the very least your browser would time out after 2 minutes if it receives no content.
php -f filename.php
But if you need to run it from the browser, try adding header("Content-type: text/html"); before the import kicks off.
If you are on a shared host, then it's possible there are restrictions on the system whereby long-running queries and/or scripts are automatically killed after a certain length of time. These restrictions are generally loosened for non-web scripts, so running it from the command line would help.
The 45 seconds could be a coincidence; it could be how long it takes for you to reach the memory limit. Increasing the memory limit would look like:
ini_set('memory_limit', '256M');
It could also be the actual db connection that is timing out.. what db server are you using?
For me, mssql times out with an extremely unhelpful error, "Database context changed", after 60 seconds by default. To get around this, you do:
ini_set('mssql.timeout', 60 * 10); // 10 min
First of all, max_input_time and set_time_limit(0) will only work on a VPS or dedicated server. Instead, you can follow some rules in your implementation, like the ones below.
First, read the whole CSV file.
Then grab only 10 entries (rows) or fewer and make an AJAX call to import them into the DB.
Call the AJAX endpoint each time with the next 10 entries, and after each call echo something out to the browser. With this method your script will never time out.
Follow the same method until the CSV rows are finished. (A sketch of this approach follows below.)
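A bare-bones sketch of that idea as a plain PHP endpoint the AJAX loop would call repeatedly (the CSV path, table, column names, and DB credentials are placeholders; in a real app, validate the input and use your framework's DB layer):

<?php
// import_chunk.php?offset=0 - imports up to 10 CSV rows per request.

$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$batchSize = 10;

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare('INSERT INTO products (sku, name, price) VALUES (?, ?, ?)');

$csv = new SplFileObject('data.csv'); // hypothetical file
$csv->setFlags(SplFileObject::READ_CSV | SplFileObject::SKIP_EMPTY);
$csv->seek($offset + 1); // +1 to skip the header row

$done = 0;
while (!$csv->eof() && $done < $batchSize) {
    $row = $csv->current();
    $csv->next();
    if (!is_array($row) || count($row) < 3) {
        continue; // skip blank or malformed lines
    }
    $stmt->execute(array($row[0], $row[1], $row[2]));
    $done++;
}

// Echo progress so the browser-side AJAX loop knows whether to call again
// with offset + 10; a response of 0 means the file is finished.
echo $done;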