I have a JSON file with more than 100,000 members that I am trying to import into a database. I wrote a script, it runs fine, and the data does get inserted, but not all of it: after some time I get an internal server error and only 100 or so members from the JSON file end up inserted.
I have already tried setting the max execution time to unlimited, but still no luck:
error_reporting(E_ALL);
ini_set('display_errors',true);
set_time_limit(0);
ini_set('memory_limit', '-1');
ini_set('max_execution_time', 0);
It only gives a server timeout after some time. If it were possible to run the script even for an hour, that would be great.
I can't run the script on my local WAMP setup: my internet speed is very slow, the script also downloads and resizes the members' profile images, and if my connection drops that would interrupt the script as well.
Is there any way I can bypass the timeout?
I have heard about doing it as a cron job, but I am not sure whether a cron job has the same time limit.
The JSON file I am trying to import was 122 MB, so I divided it into 34 parts, which took some time.
The file I am running now is only 4 MB, but after a minute or two I still get an internal server error.
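For illustration, a minimal sketch of what a batched, cron-driven import could look like (the file name, table and column names are placeholders, not my real schema):
<?php
// Run from the command line or cron, not through the browser, e.g.:
//   php import_members.php 0
// where the argument is the offset to start from.
set_time_limit(0);

$offset    = isset($argv[1]) ? (int) $argv[1] : 0;
$batchSize = 500; // members inserted per run

$members = json_decode(file_get_contents('members.json'), true);
$batch   = array_slice($members, $offset, $batchSize);

$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO members (name, email) VALUES (?, ?)');

foreach ($batch as $member) {
    $stmt->execute(array($member['name'], $member['email']));
}

echo "Imported " . count($batch) . " members starting at offset $offset\n";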
Related
So I am using a form with a file input to submit an Excel file and upload the data into a MySQL database. The script imports everything just fine.
The only problem is that the process seems to time out in the client browser, even though it appears to keep running on the server. I have already added all the common fixes, i.e.:
ini_set('max_execution_time', 0);
set_time_limit(0);
ini_set('memory_limit', '-1');
ignore_user_abort(true);
I am currently using flush() and ob_end_flush() to output a message every 1000 records that get scanned for upload.
Unfortunately, after exactly 2 minutes, the page stops loading.
On the flip side, all records do get scanned and uploaded regardless of what the page does. This leads me to believe that the request keeps being processed on the web server, but the browser stops getting data from the server after a certain amount of time. Based on my calculations, the browser stops receiving the response after exactly 2 minutes.
I have tried this in both Internet Explorer and Chrome and get the same result. I have not made any changes to the php.ini file, for security reasons.
Using Old Zend Server with Apache2.
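For reference, the flushing pattern looks roughly like this; the row source and the INSERT logic are stubbed out here, the real script reads the uploaded spreadsheet:
<?php
ini_set('max_execution_time', 0);
set_time_limit(0);
ignore_user_abort(true);

// flush any existing output buffers so progress messages reach the browser
while (ob_get_level() > 0) {
    ob_end_flush();
}

// $rows stands in for the parsed spreadsheet rows
$rows = range(1, 5000);

$count = 0;
foreach ($rows as $row) {
    // ... the actual INSERT into MySQL happens here ...
    $count++;

    if ($count % 1000 === 0) {
        echo "Processed $count records<br>\n";
        flush(); // push the progress message out to the client
    }
}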
I am working on a script that needs to detect whether any of the job pages whose URLs are stored in my database have been updated, i.e. whether a job has been posted or the page has changed. I fetch the headers of those pages and check whether the Last-Modified date is newer than the one stored in my database, or whether the Content-Length differs from the stored value (the script stores the Last-Modified date and Content-Length on the first run and compares the records for each URL on subsequent runs).
The script works fine locally, but when it runs on the Bluehost server it breaks after an unpredictable number of records or amount of time and shows the error [an error occurred while processing this directive] (that is when I trigger the script from my browser). When I run it from cron it never returns anything; I have added code to send me an email once the script has run fully, with or without errors, and I am skipping over errors when the script cannot update a record (which happens for around 15 records).
Does anyone know what the error could be? I was previously using wget --delete-after and am now using php -f, and my cron runs on a dedicated IP. The execution time is around 15-20 minutes.
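For clarity, the per-URL check is essentially the following; the URL and the stored values are hard-coded placeholders here, while the real script reads them from the database:
<?php
$url = 'http://example.com/jobs'; // placeholder URL

// fetch only the response headers for this job page
$headers = get_headers($url, 1);

$lastModified  = isset($headers['Last-Modified'])  ? strtotime($headers['Last-Modified']) : null;
$contentLength = isset($headers['Content-Length']) ? (int) $headers['Content-Length']     : null;

// placeholder stored values; the real script loads these from the database
$storedModified = strtotime('2013-01-01 00:00:00');
$storedLength   = 48213;

if (($lastModified !== null && $lastModified > $storedModified)
    || ($contentLength !== null && $contentLength !== $storedLength)) {
    echo "$url appears to have been updated\n";
}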
PHP's configuration does have provisions for the maximum execution time and the resources a script is permitted to use, e.g. maximum memory.
You can set these programmatically.
See http://www.php.net/set_time_limit
It is not possible to raise the limit beyond the max_execution_time set in the environment's PHP configuration.
If you are on a shared server, these limits are often set quite aggressively to ensure resources do not get monopolised.
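For example, something like this at the top of the script, then read the values back to see what the host actually allows (purely illustrative):
// request unlimited execution time and a higher memory ceiling
set_time_limit(0);
ini_set('max_execution_time', 0);
ini_set('memory_limit', '512M');

// check what actually took effect; on a restrictive shared host
// the values may not change at all
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";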
I have a script that updates my database with listings from eBay. The number of sellers it grabs items from is always different, and some sellers have over 30,000 listings. I need to be able to grab all of these listings in one go.
I already have all the data pulling/storing working since I've created the client side app for this. Now I need an automated way to go through each seller in the DB and pull their listings.
My idea was to use CRON to execute the PHP script which will then populate the database.
I keep getting Internal Server Error pages when I try to execute a script that takes a very long time to run.
I've already set
ini_set('memory_limit', '2G');
set_time_limit(0);
error_reporting(E_ALL);
ini_set('display_errors', true);
in the script, but it still keeps failing at about the 45-second mark. I've checked ini_get_all() and the settings are sticking.
Are there any other settings I need to adjust so that the script can run for as long as it needs to?
Note the warnings from the set_time_limit function:
This function has no effect when PHP is running in safe mode. There is no workaround other than turning off safe mode or changing the time limit in the php.ini.
Are you running in safe mode? Try turning it off.
This is the bigger one:
The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
Are you using external system calls to make the requests to eBay, or long calls to the database?
Profile your PHP script and look for particularly long operations (> 45 seconds), then try to break those operations into smaller chunks.
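Even a crude timer around each suspect call will tell you where the time goes, along these lines (the sleep() is just a stand-in for your eBay request or database query):
$start = microtime(true);

sleep(1); // stand-in for the eBay request or database query being timed

$elapsed = microtime(true) - $start;
if ($elapsed > 0.5) {
    error_log(sprintf('Slow operation: %.1f seconds', $elapsed));
}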
Well, as it turns out, I had overlooked the fact that I was testing the script through the browser, which means Apache was handling the PHP process. It was executed with mod_fcgid, which had a timeout of exactly 45 seconds.
Executing the script directly from the shell and from CRON works just fine.
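For anyone who hits the same thing, a guard like the following keeps the import from ever being started through Apache by accident (just a sketch, not part of the original script):
// refuse to run under the web server, where mod_fcgid will kill the request
if (PHP_SAPI !== 'cli') {
    die("Run this script from the shell or cron instead, e.g.: php import.php\n");
}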
Long-time reader, first-time poster.
I have a script which imports rows from a CSV file, processes them, then inserts them into a MySQL database; the CSV file itself has around 18,800 rows.
This script runs as part of a WordPress plugin installation and appears to be very temperamental. Sometimes it completes the entire import and loads the page as normal; other times, say two-thirds of the time, it imports only around 17.5k of the rows before silently terminating the script and reloading the page without any GET or POST vars.
I have tried everything I can think of to work out why it is doing this, but with no luck.
The server software is Apache on Linux.
The server error log doesn't have any entries.
Max execution time is set to 0.
PHP max input time is 1800.
PHP register long arrays is set to On.
The script is running as PHP 5.3.5 CGI.
The database is hosted on the same server.
The max memory limit is 256M.
The max post size is 7M.
Is there anything I am missing that may be causing an error?
Any help would be appreciated, as I am totally stumped!
Thanks in advance!
EDIT:
If I use a CSV file of 15k rows instead of 18k, it completes correctly. Could it be a time issue?
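In case it helps, the import loop is roughly this shape (real column mapping and processing stripped out; credentials, table and file names are placeholders); logging memory every 1000 rows would at least show whether the 256M limit is being approached:
<?php
set_time_limit(0);

$pdo  = new PDO('mysql:host=localhost;dbname=wordpress', 'user', 'pass'); // placeholder credentials
$stmt = $pdo->prepare('INSERT INTO my_table (col_a, col_b) VALUES (?, ?)'); // placeholder table/columns

$handle = fopen('import.csv', 'r'); // placeholder file name
$count  = 0;

while (($row = fgetcsv($handle)) !== false) {
    $stmt->execute(array($row[0], $row[1]));
    $count++;

    if ($count % 1000 === 0) {
        // log memory use every 1000 rows to see whether we creep towards the 256M limit
        error_log("Row $count, memory: " . memory_get_usage(true));
    }
}

fclose($handle);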