Long-time reader, first-time poster.
I have a script which imports rows from a CSV file, processes them, then inserts them into a MySQL database. The CSV file itself has around 18,800 rows.
This script runs as part of a WordPress plugin installation and appears to be very temperamental. Sometimes it completes and loads the page as normal; other times, roughly two-thirds of the time, it only imports around 17.5k of the rows before silently terminating and reloading the page without any GET or POST vars.
I have tried everything I can think of to work out why it's doing this, but with no luck.
- The server software is Apache on Linux
- The server error log doesn't have any entries
- max_execution_time is set to 0
- max_input_time is 1800
- register_long_arrays is on
- The script runs under PHP 5.3.5 (CGI)
- The database is hosted on the same server
- memory_limit is 256M
- post_max_size is 7M
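For reference, stripped of the per-row processing, the import loop is roughly this shape (the file path, table, and column names below are simplified placeholders):
global $wpdb;

$handle = fopen($csv_path, 'r'); // placeholder path
$count  = 0;

while (($row = fgetcsv($handle)) !== false) {
    // ... clean up / process the row here ...
    $wpdb->insert('wp_my_import', array( // placeholder table and columns
        'sku'   => $row[0],
        'name'  => $row[1],
        'price' => $row[2],
    ));
    $count++;
}

fclose($handle);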
Is there anything I am missing that may be causing an error?
Any help would be appreciated, as I am totally stumped!
Thanks in advance!
EDIT:
If I use a CSV file of 15k rows instead of 18k, it completes correctly. Could it be a time issue?
Related
I am running a cron job to update the inventory in our database from the ERP inventory CSV file. The CSV file contains almost 19K records. The cron job picks the records up one by one and updates the matching inventory in the database. But for a few days now, only 13K-14K of the 19K records get parsed before the script breaks in the middle.
I have also tried running the script directly from the browser, but it raises the same issue. No error is displayed in the error log.
I thought it was a timeout issue and increased max_execution_time to 1500 (25 minutes), but the issue is still not resolved.
Can anyone suggest how to solve this issue? Thanks in advance!
Did you check the cron log? Which operating system are you using?
> No error is displayed in the error log.
Then your first course of action is to verify that your error logging is working as expected. If it is a (PHP) timeout or a memory limit issue, the reason will be flagged, but it might not be getting reported.
You forgot to tell us how cron runs the task: is this via the CLI SAPI, or are you using an HTTP client (wget, curl, etc.) to invoke it via the webserver? The SAPIs have very different behaviours and usually use separate php.ini files.
> I thought it was a timeout issue
Because you've checked and it always bombs out at the same interval after starting?
> But for a few days now, only 13K-14K of the 19K records get parsed
And previously it was taking less than the identified amount of time to complete?
> and increased max_execution_time
How? In the script? In the (right) php.ini? Note that if the script is running via the webserver, then there may also be a timeout configured on the webserver.
You might consider prefixing your script with:
set_time_limit(3000);
error_reporting(E_ALL);
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
And capturing the stderr and stdout from the script.
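If the output is awkward to capture (for example under cron), a rough alternative is to log errors to a file and catch any fatal error in a shutdown function. A sketch, assuming PHP 5.3+ for the closure; the log path is just an example:
ini_set('log_errors', 1);
ini_set('error_log', '/tmp/inventory-import.log'); // example path

register_shutdown_function(function () {
    $error = error_get_last();
    if ($error !== null) {
        // fatal errors (including "Maximum execution time exceeded") end up here
        error_log('Script ended with: ' . print_r($error, true));
    }
});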
Try dividing the CSV into two files and running the cron on each; I think your CSV file has a problem. If one of them executes without trouble, then the process itself isn't the issue; repeat the split to find the block with the problem.
A few years ago I worked with an interface which read a file to export to SAP; sometimes a single special character would make the script break.
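Along the same lines, rather than splitting the file by hand, one way to narrow down the offending block is to log the row number as you go; the last value written is where the script stopped. A sketch, assuming the loop reads the file with fgetcsv() (the file name and log path are just examples):
$handle = fopen('inventory.csv', 'r'); // hypothetical file name
$row    = 0;

while (($data = fgetcsv($handle)) !== false) {
    $row++;
    file_put_contents('/tmp/import-progress.log', (string) $row); // overwritten each row; the last value is where it stopped

    // ... update the matching inventory record here ...
}

fclose($handle);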
I have a small website / web application (HTML/jQuery/PHP/MySQL) that loads an HTML document, then simultaneously calls PHP files in the backend to fetch data sets.
For example, a contacts.php page is loaded, then AJAX calls 4 PHP scripts at the same time to load different data sets:
contacts-names.php
contacts-groups.php
contacts-tags.php
contacts-locations.php
These data sets are rather small, each less than 100 rows in the DB.
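Each endpoint is essentially just a small query whose result is sent back to the page; roughly this (simplified; it assumes a JSON response and an existing $pdo connection, and the table name is a placeholder):
// contacts-names.php (simplified)
header('Content-Type: application/json');

$stmt = $pdo->query('SELECT id, name FROM contacts ORDER BY name'); // well under 100 rows
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));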
Individually these scripts run fine, but when they are called from the main page (initial page load) I hit the memory limit.
If it were just one file causing the issue, I could dig in and optimize it, but every time I load the page, one or two of the above calls error out (hitting the memory limit), and they seem completely random.
I went as far as putting an exit(); at the top of two of the files (to stop them from running), and I would still get the max resource hit randomly, sometimes on a file with the exit() call! It doesn't make sense; the darn file isn't running any code anymore.
The only thing that seems to fix this is removing some of the calls (2 scripts), which renders my app useless.
Or delaying the call to 2 of the files (with a JS timeout).
So it seems that calling all the PHP scripts at the same time causes the memory limit issue. Is this normal? I could simply settle for the delayed-call strategy, but I wanted to know whether you have had to do this as well (on limited shared hosts).
Other notes:
- I'm on a very good cloud Unix-type shared server
- Even if I terminate 2 of the scripts being called simultaneously with the other 2, I still get issues, so it must not be my code and optimization won't do me any good.
- I profiled my scripts individually via Xdebug and Cachegrind and they all seem fine.
Best
I have tried an activity to check whether a
> FOR/WHILE LOOP
fails or not if it loops through a few million records.
My Activity
I have more than 1,000,000 records in a table. I fetched them and printed them through a loop; when I executed the script from the command line it worked successfully, but when I tried it in a web browser it got stuck and crashed my browser.
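The test script is essentially just this (credentials and table name are placeholders):
// simplified test script
$mysqli = new mysqli('localhost', 'user', 'pass', 'test');

$result = $mysqli->query('SELECT id, name FROM big_table'); // ~1,000,000 rows
while ($row = $result->fetch_assoc()) {
    echo $row['id'] . ' ' . $row['name'] . "\n";
}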
So I want to know: is it a failure of PHP (obviously not, as it worked on the command line), or is it a failure of the web server?
What should I do if I face such a situation in a real web application?
Should I switch to a more powerful web server? If yes, which one?
I'm guessing you're running into the max execution time limit, which is unlimited on the command line but has a default of 30s on the web server. Try adding the following line at the beginning of your script to remove the limit:
set_time_limit(0);
From the official PHP docs:
> max_execution_time
> This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. This helps prevent poorly written scripts from tying up the server. The default setting is 30. When running PHP from the command line the default setting is 0.
If you want to print 1,000,000 records to the browser, you're gonna have a bad time...
First, what FuzzyTree said: execution time. But if you say that it works in the CLI, I guess that's not the problem.
The problem with this amount of data is probably the browser, which simply cannot handle that much! Check your CPU and RAM usage while trying to visit that page.
If you REALLY need to show the user a million records, use pagination or load new records with AJAX when the user reaches the bottom of the page.
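For example, a paginated version only fetches and prints one page of rows per request. A minimal sketch, assuming an existing $pdo connection; the table and column names are placeholders:
// page.php?page=2 - one page of rows per request
$perPage = 100;
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;

$sql = sprintf('SELECT id, name FROM big_table ORDER BY id LIMIT %d OFFSET %d', $perPage, $offset);
foreach ($pdo->query($sql) as $row) {
    echo htmlspecialchars($row['id'] . ' ' . $row['name']) . "<br>\n";
}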
I have a script that loads a CSV via cURL. Once I have the CSV, it adds each of the records to the database and, when finished, displays the total number of records added.
With fewer than 500 records it executes just fine. The problem is that whenever the number of records is too big, the execution is interrupted at some point, and the browser displays a download dialog with an empty file named after the last part of my URL, without an extension. No warning, error, or any kind of message. The database shows that it added some of the records, and if I run the script several times it adds a small amount more.
I have tried to find someone with a similar situation but haven't yet.
I would appreciate any insight into the matter; I'm not sure whether this is a Symfony2 problem, a server configuration problem, or something else.
Thanks in advance.
Your script is probably reaching the maximum PHP execution time, which is 30 seconds by default. You can change it in the controller doing the lengthy operation with PHP's set_time_limit() function. For example:
set_time_limit(300); // 300 seconds = 5 minutes
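In a Symfony2 controller this typically sits at the top of the action doing the import. A minimal sketch; the bundle, class, and variable names below are made up:
// src/Acme/ImportBundle/Controller/ImportController.php (hypothetical)
namespace Acme\ImportBundle\Controller;

use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Response;

class ImportController extends Controller
{
    public function importAction()
    {
        set_time_limit(300); // give the CSV import up to 5 minutes

        $added = 0;
        // ... fetch the CSV via cURL and insert each record, incrementing $added ...

        return new Response(sprintf('%d records added', $added));
    }
}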
That's more a limitation of your webserver/environment PHP is running in.
Increase max_execution_time to allow your webserver to run the request longer. An alternative would be writing a console command; the CLI environment isn't restricted in many cases.
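A skeleton for such a command might look like this (the bundle, namespace, and command name are made up):
// src/Acme/ImportBundle/Command/ImportCsvCommand.php (hypothetical)
namespace Acme\ImportBundle\Command;

use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class ImportCsvCommand extends Command
{
    protected function configure()
    {
        $this
            ->setName('acme:import-csv')
            ->setDescription('Import the remote CSV without the web request time limit');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        // ... fetch the CSV via cURL and insert the records here ...
        $output->writeln('Import finished.');
    }
}
In Symfony2 it would then be run with php app/console acme:import-csv, either by hand or from cron.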
I have a very painfully slow script that gets lots of data from MySQL and creates a large report out of it, which it serves to the user at the end as application/force-download.
Long story short, on the production server it keeps terminating after about 30 seconds (quite consistently) and spitting out an empty file instead. On the development server it works fine, but it does take significantly longer to execute: about 90 seconds. Just to be 'safe', I set max_execution_time = 2000 in my php.ini and also call set_time_limit(4000) at the beginning of my script (numbers way over the expected completion time, but just to be sure ;)).
What could be causing my web server to ignore the time limits I set and quit on me after only 30 seconds?
Edit: one thing I know for sure is that the MySQL portion of the code takes 8-9 seconds to complete, and it successfully gets past that point every time.
Maybe PHP's safe_mode; when safe_mode is enabled, set_time_limit() has no effect.
Try doing a
die(ini_get('max_execution_time'));
to read the value after you have called set_time_limit(), to see if it actually gets overridden.
If it does get overridden and your script still dies, then the cause could be somewhere else in your code.
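Putting both checks together, a quick probe at the very top of the report script could look like this (purely diagnostic, not a fix):
set_time_limit(4000);

// set_time_limit() has no effect while safe_mode is enabled
var_dump(ini_get('safe_mode'));
die(ini_get('max_execution_time'));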