I have a script that loads a CSV via cURL. Once it has the CSV, it adds each of the records to the database and, when finished, displays the total number of records added.
With fewer than 500 records it executes just fine. The problem is that whenever the number of records is too big, the execution is interrupted at some point and the browser shows the download dialog for a file named like the last part of my URL, with no extension and containing nothing. No warning, error or any kind of message. The database shows that it added some of the records, and if I run the script several times it adds a small amount more each time.
I have tried to find someone with a similar situation but haven't found one yet.
I would appreciate any insight into the matter; I'm not sure if this is a Symfony2 problem, a server configuration problem or something else.
Thanks in advance.
Your script is probably reaching the maximum PHP execution time, which is 30 seconds by default. You can change it in the controller doing the lengthy operation with the PHP set_time_limit() function. For example:
set_time_limit(300); // 300 seconds = 5 minutes
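In a Symfony2 project that call would typically sit at the top of the controller action doing the import. A minimal sketch, with the bundle, class and variable names as hypothetical placeholders:

<?php
// Hypothetical Symfony2 controller; only the set_time_limit() call is the point here.

namespace Acme\ImportBundle\Controller;

use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Component\HttpFoundation\Response;

class ImportController extends Controller
{
    public function importAction()
    {
        set_time_limit(300); // raise the limit before starting the lengthy import

        $added = 0;
        // ... fetch the CSV with cURL and insert each record, incrementing $added ...

        return new Response(sprintf('%d records added', $added));
    }
}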
That's more a limitation of the webserver/environment PHP is running in.
Increase max_execution_time to allow the webserver to run the request longer. An alternative would be writing a console command; the CLI environment isn't restricted in many cases.
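For the console command route, here is a hedged sketch for Symfony2 (the bundle, class and command names are placeholders). You would run it with php app/console acme:import-csv, so the import uses the CLI SAPI, where max_execution_time defaults to 0 (unlimited):

<?php
// Hypothetical Symfony2 console command for the CSV import.

namespace Acme\ImportBundle\Command;

use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class ImportCsvCommand extends ContainerAwareCommand
{
    protected function configure()
    {
        $this->setName('acme:import-csv')
             ->setDescription('Fetches the remote CSV and inserts the records');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        // ... reuse the cURL fetch and database insert logic from the controller here ...
        $output->writeln('Import finished.');
    }
}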
Related
I am running a cron job to update the inventory in our database from the ERP inventory CSV file. The ERP inventory CSV file contains almost 19K records. The cron job picks up the records one by one and updates the matched inventory in the database. But for the past few days, out of the 19K records only 13K-14K records are parsed before the script breaks in the middle.
I have also tried to run the script directly from the browser, but it raises the same issue. No error is displayed in the error log.
I was thinking it is a timeout issue and increased max_execution_time to 1500 (25 min), but the issue is still not resolved.
Can anyone suggest how to solve this issue? Thanks in advance!
Did you check the cron log? Which operating system are you using?
> No error is displayed in the error log.

Then your first course of action is to verify that your error logging is working as expected. If it is a (PHP) timeout or a memory limit issue then the reason will be flagged - but it might not be getting reported.
You forgot to tell us how cron runs the task - is it via the CLI SAPI, or are you using an HTTP client (wget, curl etc.) to invoke it via the webserver? The SAPIs have very different behaviours and usually use separate php.ini files.

> I was thinking it is a timeout issue

Because you've checked and it always bombs out at the same interval after starting?

> out of the 19K records only 13K-14K records are parsed

And previously it was taking less than the identified amount of time to complete?

> and increased max_execution_time to 1500

How? In the script? In the (right) php.ini? Note that if the script is running via the webserver, then there may also be a timeout configured on the webserver.
You might consider prefixing your script with:
set_time_limit(3000);
error_reporting(E_ALL);
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
And capturing the stderr and stdout from the script.
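If the output of the cron run is being thrown away, a small addition to that prefix also writes the reason for any termination to a file you can check afterwards. The log path below is just an assumption; point it anywhere the PHP user can write to:

ini_set('log_errors', 1);
ini_set('error_log', '/tmp/inventory_import_errors.log'); // assumed path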
Try dividing the CSV into two files and running the cron on each half; I think your CSV file has a problem. If one half runs without trouble, then the process itself is fine; repeat the split on the failing half to narrow down the block with the problem (see the sketch below).
A few years ago I worked with an interface which read a file to export to SAP; sometimes a single special character would make the script break.
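A minimal sketch of the split, assuming the file is small enough to read into memory (the file names are placeholders):

// Read the CSV, write the first and second halves to separate files,
// then run the import against each half to see which one breaks.
$rows = file('inventory.csv');
$half = (int) floor(count($rows) / 2);
file_put_contents('inventory_part1.csv', implode('', array_slice($rows, 0, $half)));
file_put_contents('inventory_part2.csv', implode('', array_slice($rows, $half)));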
I have tried an activity to check whether a
> FOR/WHILE LOOP
fails or not if it loops through a few million records ....
My Activity
I have more than 1,000,000 records in a table. If I fetch them and print them
through a for loop and execute the script from the command line, it works successfully,
but when I try it in a web browser it gets stuck and crashes my browser...
So I want to know: is this a failure of PHP (obviously not, as it worked on the command line),
or is it a failure of the WEB SERVER?
What should I do if I face such a situation in a live web application?
Should I switch to a more powerful web server? If yes, which one?
I'm guessing you're running into the max execution time limit, which is unlimited on the command line but defaults to 30 seconds on the web server. Try adding the following line at the beginning of your script to remove the limit:
set_time_limit(0);
From the official PHP docs:

> max_execution_time
>
> This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. This helps prevent poorly written scripts from tying up the server. The default setting is 30. When running PHP from the command line the default setting is 0.
If you want to print 1,000,000 records to the browser, you're gonna have a bad time...
First - what FuzzyTree said - execution time. But since you say it works in the CLI, I guess that's not the problem.
The problem with this amount of data is probably the browser, which simply cannot handle that much! Check your CPU and RAM usage while trying to visit that page.
If you REALLY need to display a million records to the user, use pagination, or load new records via AJAX when the user reaches the bottom of the page.
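A minimal pagination sketch, assuming a PDO connection in $pdo and a table named records (both hypothetical), showing 100 rows per page instead of the whole million at once:

// Work out which slice of the table the current page needs.
$page    = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$perPage = 100;
$offset  = ($page - 1) * $perPage;

// Both values are cast to int above, so interpolating them here is safe.
$sql  = sprintf('SELECT * FROM records ORDER BY id LIMIT %d OFFSET %d', $perPage, $offset);
$stmt = $pdo->query($sql);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    echo htmlspecialchars(implode(', ', $row)), "<br>\n";
}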
A PHP page of our web application has a rather complex query. On the production website this takes about 20 seconds to complete, and then the results are displayed. I have a test version of the site running on my local desktop. When I run the (Postgres) query directly in PGAdmin, it takes about 3 minutes to complete.
So my desktop is slow - not a real problem right now. PHP does, however, quit loading the page and displays no results. Plain text that should be displayed right after the results is not shown either. Searching for solutions, I found I could set the max_execution_time for PHP on the page itself. I did so using the following code on the first line of the page:
ini_set('max_execution_time', 300); //300 seconds = 5 minutes
This doesn't make a difference. The page loading stops after 6 seconds. What can I do to display the results?
Use set_time_limit()
You should read the documentation for this function, where it says:

> Warning: This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.

So make sure your PHP isn't running in safe mode, otherwise there won't be any way to set or modify the script execution time.
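A quick way to check which situation you are in, as a sketch:

// If safe mode is on, set_time_limit() is silently ignored and the limit
// has to be raised in php.ini instead.
if (ini_get('safe_mode')) {
    echo 'Safe mode is on: change max_execution_time in php.ini instead.';
} else {
    set_time_limit(300); // give the long-running query page up to 5 minutes
}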
I have a script that updates my database with listings from eBay. The amount of sellers it grabs items from is always different and there are some sellers who have over 30,000 listings. I need to be able to grab all of these listings in one go.
I already have all the data pulling/storing working since I've created the client side app for this. Now I need an automated way to go through each seller in the DB and pull their listings.
My idea was to use CRON to execute the PHP script which will then populate the database.
I keep getting Internal Server Error pages when I'm trying to execute a script that takes a very long time to execute.
I've already set
ini_set('memory_limit', '2G');
set_time_limit(0);
error_reporting(E_ALL);
ini_set('display_errors', true);
in the script but it still keeps failing at about the 45 second mark. I've checked ini_get_all() and the settings are sticking.
Are there any other settings I need to adjust so that the script can run for as long as it needs to?
Note the warnings from the set_time_limit function:
> This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.
Are you running in safe mode? Try turning it off.
This is the bigger one:
> The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
Are you using external system calls to make the requests to eBay, or long calls to the database?
Profile your PHP script and look for particularly long operations (> 45 seconds). Try to break those operations into smaller chunks.
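A rough way to spot them without a full profiler, as a sketch (the logged message is just an example):

// Wrap each suspect section with microtime() to see which one blows past
// the ~45 second mark.
$start = microtime(true);

// ... eBay API request or long database query goes here ...

$elapsed = microtime(true) - $start;
error_log(sprintf('eBay fetch took %.1f seconds', $elapsed));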
Well, as it turns out, I overlooked the fact that I was testing the script through the browser, which means Apache was handling the PHP process. It was executed with mod_fcgid, which had a timeout of exactly 45 seconds.
Executing the script directly from shell and CRON works just fine.
Long time reader, first time poster
I have a script which imports rows from a CSV file, processes them, then inserts them into a MySQL database. The CSV file itself has around 18,800 rows.
This script runs as part of a WordPress plugin installation and appears to be very temperamental. Sometimes it will complete the entire import and load the page as normal; other times, let's say two thirds of the time, it will only import around 17.5k of the rows before silently terminating the script and reloading the page without any GET or POST vars.
I have tried everything I can think of to see why it's doing this, but with no luck.
The server software is Apache on Linux.
The server error log doesn't have any entries.
Max execution time is set to 0.
PHP max input time is 1800.
PHP register long arrays is set to on.
The script is running as PHP 5.3.5 (CGI).
The database is hosted on the same server.
The max memory limit is 256M.
The max post size is 7M.
Is there anything I am missing that may be causing an error?
Any help would be appreciated, as I am totally stumped!
Thanks in advance!
EDIT:
If I use a CSV file of 15k rows instead of 18k, it completes correctly. Could it be a time issue?