I'm calling an MLS service that responds with 4000+ records, and I need to process every one of them, as well as insert all of the metadata per listing.
I'm able to get through about 135 listings (× 150 meta records each), and then the script apparently stops responding, or at least stops processing the rest of the data.
I've added the following to my .htaccess file:
php_value memory_limit 128M
But this doesn't seem to help. Do I need to process the data in chunks, or is there another way to ensure that the script will actually finish?
You should probably enable display_errors and error_reporting to get a better picture of why the script stops processing.
However, you should also consider making sure the time limit isn't being hit by calling:
set_time_limit( 0 );
This removes the time limit entirely. You can also just set it to something relatively high, like 600 (10 minutes).
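Putting both suggestions together, a minimal sketch of the top of such a long-running script might look like this (the exact values are just examples, and some shared hosts disable runtime overrides like these):

```php
<?php
// Sketch: place this at the top of the long-running import script.
// It surfaces errors and lifts the limits for this request only.
error_reporting(E_ALL);
ini_set('display_errors', '1');

set_time_limit(600);             // 10 minutes; 0 means no limit
ini_set('memory_limit', '256M'); // raise memory at runtime, if the host allows it

// ... the 4000-record processing loop goes here ...
```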
It isn't the memory; it's most likely the script execution time.
Try adding this to your .htaccess, then restart Apache:
php_value max_execution_time 259200
Related
I have a WordPress site with a custom CSV import script. The file I want to import is 24 MB and contains 12000 products. At just over 10500 products the script stops.
It worked until I reached this number of products.
Here is my configuration:
upload_max_filesize 500M
post_max_size 500M
max_execution_time 18000
max_input_time 18000
wait_timeout 60
What do I need to change?
If you get any imports at all, it means that upload limitations are not to blame. If you were hitting those, none of the import would take place.
The two most probable "candidates" are: the execution time was hit, or the memory limit was reached.
For the former, you already have max_execution_time set to quite a large number, and I assume your import script is not taking that long (correct me if I'm wrong).
So the most likely culprit is that your script reaches memory_limit and simply halts, hence the incomplete import.
If increasing the memory_limit does not help, you will need to enable error reporting in order to find out what's going on.
To do that in WordPress, simply enable debug mode by adding the following line in your wp-config.php:
define('WP_DEBUG', true);
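On a live site, you may want errors logged rather than printed on the page; WordPress supports this with two more constants alongside WP_DEBUG:

```php
// wp-config.php: log errors instead of showing them to visitors.
define('WP_DEBUG', true);
define('WP_DEBUG_LOG', true);      // write notices/errors to wp-content/debug.log
define('WP_DEBUG_DISPLAY', false); // keep them off rendered pages
```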
Optional side note
Having said that, importing large amounts of data by way of unreasonably increasing allowed resources is probably not the right way to go.
Try implementing incremental imports. I.e. the receiving script just parses the submitted data, then uses AJAX to import records one batch at a time. Or the import submission form takes index parameters (import records 0 to 1000), etc.
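A minimal sketch of that idea, with hypothetical names (`import_batch`, and the actual insert call left as a stub):

```php
<?php
// Sketch of an incremental import: the client calls the endpoint
// repeatedly (e.g. import.php?offset=0&limit=1000, then offset=1000, ...)
// until 'next' comes back null. Names here are hypothetical.
function import_batch(array $rows, int $offset, int $limit): array
{
    $batch = array_slice($rows, $offset, $limit);
    foreach ($batch as $row) {
        // insert_product($row); // your real per-row insert goes here
    }
    $next = ($offset + $limit < count($rows)) ? $offset + $limit : null;
    return ['imported' => count($batch), 'next' => $next];
}
```

Each request now stays well under any execution-time or memory limit, no matter how many products the file contains.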
Allowing PHP to consume a lot of resources is asking for trouble. Malicious users can exploit that to easily bring your website down.
I have a WordPress blog, and there is a plugin that does around 20000 SQL inserts on a user request. I noticed that the process takes a long time, which is normal, but the request times out at 30 seconds.
I checked the PHP settings and noticed that max_execution_time was 30 seconds, so I increased it to 90, but the request still times out at 30 seconds (I even logged what ini_get('max_execution_time') returns, and it says "30"). Then I checked whether there are any Apache directives that limit request time and found the "TimeOut" directive ( http://httpd.apache.org/docs/2.2/mod/core.html#timeout ).
Its value was 60 and I increased it to 90 as well, but the problem persists: the request times out after 30 seconds, just as before I changed anything.
As a note: I restart the server after every modification.
By modifying your PHP Settings
That’s not always easy, as you need access to your server, or a way to change your PHP settings. If you have access to your php.ini, look for the max_execution_time variable and set it to the number of seconds you would like, 60 seconds for example.
max_execution_time = 60
If that doesn’t work, or you can’t access your php.ini, you can also try to set this variable using the .htaccess file (at the root of your WordPress install). You can add this line:
php_value max_execution_time 60
If you set the value to 0 (instead of 60 here), the process will be allowed to run forever. Don’t do this: you will run into much bigger issues that are extremely difficult to resolve.
By calling a function in PHP
Doing this is generally not recommended. But if you know what you are doing, you can tell PHP that you would like the process to run for more time. For this, you could place this call in your theme, in functions.php for example.
set_time_limit(100);
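Since each call to set_time_limit() restarts the execution-time counter, another pattern is to reset it once per record inside the loop, so the limit bounds a single record rather than the whole run. A sketch, with the per-record worker left as a hypothetical callable:

```php
<?php
// Sketch: reset the execution-time clock for every record, so a
// 30-second limit applies per record instead of to the whole batch.
function process_all(array $records, callable $work): int
{
    $done = 0;
    foreach ($records as $record) {
        set_time_limit(30); // restart the 30-second clock
        $work($record);     // hypothetical per-record processing
        $done++;
    }
    return $done;
}
```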
I have a PHP script that keeps timing out at 45 seconds. I have tried everything I can find related to execution time and currently have the following.
In my php.ini:
max_execution_time = 0
In my .htaccess:
# set timeout to 10 minutes
<IfModule mod_php5.c>
php_value max_execution_time 600
</IfModule>
And in the script that times out:
ini_set("memory_limit", "-1");
set_time_limit(0);
In the script, when I echo ini_get('max_execution_time') I get 0, so it looks like everything is right. Are there other resource limits at play that are keeping the script from running? I've researched memory_limit, max_input_time, etc., but I'm thinking there's something here I don't know about.
The script runs a while loop against a table and then crawls different sites according to each record. When I limit the result to 1 or 2 records it works fine, but with any more than that it goes to a 404 page not found. To me this means it times out, but does the 404 error indicate something else is going on? Thanks
A 404 error does not mean your script timed out; it just means that those URLs you hit were Not Found.
You need to evaluate your script and maybe send a HEAD request to check the status code.
See here a list of HTTP status code and their meaning.
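A sketch of such a pre-flight check with cURL (assuming the extension is available) could look like this:

```php
<?php
// Sketch: send a HEAD request and read the status code before
// committing to a full crawl of each URL.
function head_status(string $url): int
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);        // HEAD: headers only, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE); // e.g. 200, 404
    curl_close($ch);
    return (int) $code;
}

// if (head_status($url) !== 200) { continue; } // skip dead links in the loop
```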
I am using PHP to fetch a lot of data from several sites and write it to the server, producing files larger than 500 MB, but the process fails partway through with a 500 INTERNAL ERROR. How do I adjust PHP's timeout so that the process runs until it is completed?
If you want to increase the maximum execution time for your scripts, then just change the value of the following setting in your php.ini file:
max_execution_time = 60
If you want more memory for your scripts, then change this:
memory_limit = 128M
One more thing: if you keep on processing the input (GET or POST), then you need to increase this as well:
max_input_time = 60
You have to change some settings in your php.ini to solve this problem.
There are several options that could be the cause.
Could you please post your php.ini config?
Which kind of web server do you use? Apache?
I am having a very common problem, and it seems that none of the available solutions are working.
We have a LAMP server which is receiving a high amount of traffic. On this server, we run a regular file submission upload. Small file uploads work perfectly. With files of around 4-5 MB, the upload fails intermittently (sometimes it works, but it often fails).
We have the following configuration on our PHP:
max_input_time: 600
max_execution_time: 600
max_upload_size: 10M
post_max_size: 10M
Apache setting:
Timeout: 600
Keep-Alive Timeout: 15
Keep-Alive: On
Per Child: 1000
Max Conn: 100
Thus, I wonder if anyone can help me with this. We have found the issues and solutions online but none of them work in our case.
Thank you so much. Any input / feedback is much appreciated!
The connection could be terminating at several places:
Apache
Post size limit inside of php.ini
Memory limit inside of php.ini
Input time limit inside of php.ini
Execution time limit inside of php.ini or set_time_limit()
I would increase all of these and see if the problem still persists. But you will have to restart Apache for the changes inside of php.ini to take effect.
These are also affected by the end user's connection speed: if uploads are failing only for certain users, it's because their connection is slower than others', and their connection with the server is terminating.
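As a starting point for ruling candidates out one by one, a small diagnostic sketch that dumps every limit relevant to a large upload:

```php
<?php
// Sketch: dump each setting that can kill a large upload, so you can
// see at a glance which limit is the bottleneck on this server.
$limits = [
    'post_max_size'       => ini_get('post_max_size'),
    'upload_max_filesize' => ini_get('upload_max_filesize'),
    'memory_limit'        => ini_get('memory_limit'),
    'max_input_time'      => ini_get('max_input_time'),
    'max_execution_time'  => ini_get('max_execution_time'),
];
print_r($limits);
```

Run this through the same web server that handles the uploads (not the CLI), since each SAPI can load a different php.ini.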