set_time_limit not working on Heroku - PHP

I am using PHP on Heroku. I keep getting a request timeout error due to some database insertions and queries.
I added this line to all my PHP files in order to avoid the error:
set_time_limit(0);
However, I am still getting the error. Does Heroku ignore this command?
I did a simple check to see if the time limit is being changed:
echo 'TIME : '.ini_get('max_execution_time');
set_time_limit(0);
echo 'TIME : '.ini_get('max_execution_time');
It changes from 30 (the default value) to 0. Despite the change, I am still getting the error.
I would also like to add that the PHP file is being called via Ajax.
Furthermore, as far as I know, PHP is not running in safe mode, so there is no reason the command should be ignored.
Heroku suggests using a background job, and as far as I can tell, it forces you into one if the task takes more than 30 seconds. Has anybody managed without using a background job?
Update: I tried using:
ini_set('max_execution_time', 0);
It still does not work.

If you have to go over the 30-second request timeout on Heroku, you'll need to use a background job - there is no way around that (Heroku will simply kill the request if it takes longer than 30 seconds). Heroku has some documentation on this.
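For illustration, a minimal sketch of that pattern, assuming a PDO connection and a hypothetical jobs table (the table, columns, and process() helper are placeholders): the web request only enqueues the work and returns immediately, while a worker dyno drains the queue.
// Web request: enqueue the slow work instead of running it inline.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$data = $_POST; // the payload for the long-running task
$pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")
    ->execute([json_encode($data)]);
echo json_encode(['status' => 'queued']); // responds well under 30 seconds

// worker.php, run on a worker dyno (declared in the Procfile), drains the queue:
while (true) {
    $job = $pdo->query("SELECT * FROM jobs WHERE status = 'pending' LIMIT 1")->fetch();
    if ($job) {
        process($job); // placeholder for the long-running insertions/queries
        $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
            ->execute([$job['id']]);
    } else {
        sleep(1); // queue empty; avoid busy-waiting
    }
}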

Related

MySQL server is responding too slowly or not responding

I use MySQL through the phpMyAdmin interface. I have no problem with the Apache server; it responds as it did before. But when I try to access the phpMyAdmin page, it takes a very long time to load. After a long time it comes back with the message:
`Fatal error: Maximum execution time of 30 seconds exceeded in
C:\xampp\phpMyAdmin\libraries\classes\Dbi\DbiMysqli.php on line 213
`
I have changed the value of the variable
$cfg['ExecTimeLimit']
from 300 to 1200. I think that is why I can now see the loaded page. But after it loads, I can't do anything with the interface because it takes too long to respond.
I have tried the things mentioned in the following link:
WAMP/XAMPP is responding very slow over localhost
Can anyone help me get rid of this problem? It has been wasting a huge amount of my time for several days.
I was having the same problem. You could edit your php.ini file and set max_execution_time = 120, but that doesn't always work. Another suggestion is to paste these two lines into your code; this solved my problem. They affect the execution time of the script itself, so it gets more time to run than the default 30 seconds.
ini_set('max_execution_time', 300);
set_time_limit(0);
I don't know if you use this already, but it is the easiest way to make errors visible:
error_reporting(E_ALL);
ini_set('display_errors', 1);
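Since the error in this question comes from phpMyAdmin itself, another knob worth knowing about is the one the asker already touched: $cfg['ExecTimeLimit'] lives in phpMyAdmin's config.inc.php, and setting it to 0 removes phpMyAdmin's own time limit entirely.
// config.inc.php (phpMyAdmin)
$cfg['ExecTimeLimit'] = 0; // 0 = no limit; the asker used 1200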

PHP page loading script

I have a site hosted on Rackspace Cloud Sites. I have a troubleshooting script I am trying to run to figure out some problems with the site.
Cloud Sites has a 30-second timeout, and the script times out before the results page can load. I spoke with their support and they advised me to put a page-loading script at the top of the PHP file to keep the connection open, but I have no idea how to do that, and the googling I have done hasn't been much help.
The script I am trying to run is too long to include here, but if anyone needs it you can find it here: http://forum.joomla.org/viewtopic.php?f=621&t=582860
Edit: No matter what I set the execution time to in the script, the load balancers Rackspace uses will still time out after 30 seconds. They have told me to run a "page loading" script at the beginning of the script to keep the connection open, so I am about to start looking into how to do that.
You could try the set_time_limit() function:
http://php.net/manual/en/function.set-time-limit.php
By default, a PHP script times out after 30 seconds.
Use the set_time_limit( int $seconds ) function to extend the maximum execution time.
You can also use ini_set() and set the max_execution_time:
ini_set("max_execution_time", 300);
EDIT
If the above doesn't work, then they probably use a secondary mechanism to time out blocking connections. What you could try in this situation is to flush some data at a regular interval, for example (the work loop below is a placeholder):
ob_start(); // enable output buffering

// Hypothetical work loop: emit a byte at intervals to keep the connection alive
foreach ($workItems as $item) {
    do_work($item); // placeholder for one chunk of the long task
    echo ' ';
    ob_flush();     // push PHP's output buffer...
    flush();        // ...and the web server's buffer to the client
}

ob_end_flush(); // at end of script
Hope this helps.

Why might my PHP script stop?

I'm running a really long task in PHP. It's a website crawler, and it has to be polite and sleep for 5 seconds per page to prevent overloading the server. The script starts with:
ignore_user_abort(1);             // keep running if the client disconnects
session_write_close();            // release the session lock
ob_end_clean();                   // discard the current output buffer
while (@ob_end_flush());          // flush and close all nested output buffers
set_time_limit(0);                // no execution time limit
ini_set('max_execution_time', 0);
After a few hours (between 3 and 7) the script dies without any visible reason.
I've checked:
the Apache error log (nothing)
php_errors.log (nothing)
the output for errors (10,578,467 bytes of debug output, no errors)
memory consumption (stable, around 3M from memory_get_usage(true), checked every 5 seconds; the limit is set to 512M)
It's not the browser, because I checked with both wget and Chrome, with the same result.
Output is sent to the browser every 2-3 seconds, so I don't think that's the fault; plus, I ignore user aborts.
Is there any other place I can check to find the issue?
I think there's a problem in the rest of your script, not in Apache.
Try profiling your application using register_tick_function() with a light profiler and logging the memory usage; the problem may be there.
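A minimal sketch of that idea (the log path and the 5-second interval are arbitrary choices): a tick function that appends a timestamped memory reading to a file, so you can see what the script was doing right before it dies.
declare(ticks=1000); // run the tick handler every 1000 low-level statements

function log_memory() {
    static $last = 0;
    if (time() - $last >= 5) { // log at most once every 5 seconds
        $last = time();
        file_put_contents(
            '/tmp/crawler-memory.log', // hypothetical log path
            date('c') . ' ' . memory_get_usage(true) . "\n",
            FILE_APPEND
        );
    }
}
register_tick_function('log_memory');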

Weird PHP segfaults on mysqli_stmt_bind_result

When migrating a PHP script from PHP 5.2 to PHP 5.3, I stumbled upon the following problem:
The general purpose of the script is data mining.
I have a procedure inside that adds data to the MySQL server.
Since it is really repetitive, I've rewritten it (a while ago) to use MySQLi, in particular prepared statements, since there are a total of 3 possible queries to perform.
Anyway, now, on the PHP 5.3 server, the script crashes on the following line:
mysqli_stmt_bind_result($prepCheck, $id1);
Where $prepCheck is created with $prepCheck = mysqli_prepare($con, $checkQuery) or die("Error");. The query runs fine on the MySQL server ($checkQuery, that is) and the PHP code was working, too, on the previous server.
Running the script with strace didn't reveal anything, since the last thing in it is the system call for echo "Execute";, which is 29936 19:44:18 write(1, "Execute\n", 8) = 8.
The connection object is not FALSE, and even if it was, it should fail with another error, right?
Here comes the weirdest part:
This procedure does not fail when I run the script with a limit on the number of pages visited; the script completes successfully. However, when I set a higher limit, it fails, always on the first call to this procedure, and precisely on this line.
If anyone has any suggestions what could be causing this, they would be deeply appreciated.
I can paste code if anyone needs to see the larger picture, but the procedure is very long and boring to death (maybe that's why the script is failing :).
Here is how the script starts: error_reporting(E_ALL); ini_set('display_errors', '1');.
No error is reported besides the 'magical' Segmentation fault. I'm not using APC.
Not sure if it's relevant, but I'm using CLI to run the script, not a web-interface.
PHP version is 5.3.8, MySQL version is 5.1.56. The memory limit is set to 64MB.
EDIT: The procedure failing + some of the other code is uploaded here: http://codepad.org/KkZTxttQ. The whole file is huge and ugly, and I believe irrelevant, so I'm not posting it for now. The line that's failing is 113.
An answer to my own question, since I've solved the issue, and there are no other answers...
Credit goes to @jap1968 for pointing me to the function mysqli_stmt_error (which I had assumed I would not need, since I have error_reporting(E_ALL)).
The problem was that MySQL had a very unusual default configuration, in particular:
connect_timeout = 10
wait_timeout = 30
This caused the MySQL server to close the connection after only 30 seconds of inactivity (the default is eight hours, according to the MySQL website). This, in turn, caused the mysqli_stmt_bind_result function to fail with a segmentation fault.
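For anyone hitting something similar, a minimal sketch of how to surface this from PHP, assuming an existing mysqli connection in $con (the session value below is just an example):
// Inspect the server-side idle timeout that was the culprit here:
$res = mysqli_query($con, "SHOW SESSION VARIABLES LIKE 'wait_timeout'");
print_r(mysqli_fetch_assoc($res));

// Raise it for this session (28800 seconds is the usual MySQL default):
mysqli_query($con, "SET SESSION wait_timeout = 28800");

// And check mysqli_stmt_error() explicitly; error_reporting(E_ALL)
// does not surface these errors for you:
if (!mysqli_stmt_execute($prepCheck)) {
    die('Statement error: ' . mysqli_stmt_error($prepCheck));
}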

set_time_limit() timing out

I have an upload form that uploads MP3s to my site. I have some intermittent issues with a few users, which I suspect are due to slow upload connections...
Anyway, the first line of code is set_time_limit(0);, which did fix it for SOME users whose connections were taking a while to upload, but some are still getting timed out and I have no idea why.
It says the script has exceeded the maximum execution time of 60 seconds. The script has no loops, so it's not some kind of infinite loop.
The weird thing is that no matter what code is on the first line, it always says "error on line one, two, etc.", even if it's set_time_limit(0);. I tried erasing it, and the very first line of code always seems to be the error; it doesn't even give me a hint of why it can't execute the PHP page.
This is an issue only a few users are experiencing; no one else seems to be affected. Could anyone throw out some ideas as to why this could be happening?
set_time_limit() only affects the actual execution of the PHP code on the page. What you want to set is the PHP directive max_input_time, which controls how long the script will accept input (like file uploads). The catch is that you need to set this in php.ini: if the default max_input_time is exceeded, the request never reaches the script that is attempting to change it with ini_set().
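For example, the relevant php.ini directives might look like this (the values are illustrative; tune them to your users' connections and file sizes):
; php.ini
max_input_time = 300       ; seconds allowed for receiving input, e.g. uploads
max_execution_time = 60    ; seconds of actual script execution
upload_max_filesize = 20M  ; largest single file accepted
post_max_size = 25M        ; must be at least upload_max_filesize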
Sure, here are a couple of things noted in the PHP manual.
Make sure PHP is not running in safe mode: set_time_limit() has no effect when PHP is running in safe_mode.
Second, and this is where I assume your problem lies:
Note: The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
So your stream may be the culprit.
Can you post a little of your upload script? Are you calling a separate file to handle the upload using headers?
Try ini_set('max_execution_time', 0); instead.
