sleep() Timeout - PHP

I have an issue with my host's timeout settings, but I don't know exactly which one is responsible. When I run this code in a PHP page:
<?php
sleep(30);
echo "Done";
?>
It gives me a "tcp error operation timeout".
But when I change it to sleep(20), it runs successfully. So please help me find out which setting in my php.ini file is responsible for this timeout. I have already Googled it and tried a lot of suggestions, but with no luck, and I am now stuck at this point trying to get my PHP script live.

Based on the error message, I suspect that this is not PHP timing out, but the actual web server "pulling the plug" (although 30 seconds is a ridiculously short time limit for that).
That is certainly not the normal PHP error message, and you shouldn't be able to induce a PHP timeout using sleep, since that uses up no actual CPU time.
Without knowing what kind of host you're running on (Apache? Nginx? IIS? Shared hosting? A VPS? etc.), it's hard to know where, or even if, you could change this.
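If it is the server side cutting the connection, the usual suspects are the web server's (or a front-end proxy's) request timeout settings rather than anything in php.ini. A hedged sketch of where to look (the directive names are real, the values are only examples):
Timeout 300                        # Apache httpd.conf
fastcgi_read_timeout 300;          # nginx, when PHP runs behind FastCGI/PHP-FPM
request_terminate_timeout = 300    ; PHP-FPM pool configuration
On shared hosting you typically cannot change these yourself and will have to ask the host.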

Take a look at your time limit in your configuration.
You can modify the script's time limit with this line of code:
set_time_limit(60);
The default value is 30s. Note that calling set_time_limit() restarts the timeout counter from zero, so each call gives the script that many seconds again from that point onward.
Take a look at the set_time_limit documentation.
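A minimal sketch of how the counter behaves (do_some_work() is a hypothetical stand-in for the real workload):
<?php
set_time_limit(60);   // from here the script may use 60 seconds of execution time
do_some_work();       // hypothetical helper representing the actual work
set_time_limit(60);   // restarts the counter: another 60 seconds from this point
echo "Done";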

30 seconds is the default PHP execution limit; if you need to sleep(30), you must increase it. There are two ways to change it from the PHP script:
set_time_limit(60);
OR
ini_set('max_execution_time', 60);
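If you control the server configuration, the same limit can also be raised outside the script (the value 60 is only an example):
max_execution_time = 60
in php.ini, or:
php_value max_execution_time 60
in an .htaccess file (mod_php only).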

I would like to share this answer with everyone who is searching for a similar issue. After spending several hours looking for the root cause, I finally found, without changing any piece of code, that the Network Error (tcp_error) A communication error occurred: "Operation timed out" error disappeared when I ran my script on another machine. So the issue was with either my machine or my internet connection; I honestly don't know which. In any case, the error may not be in the code at all.

Related

Possible causes for connection interrupted, LAMP stack

MySQL 5.1.73
Apache/2.2.15
PHP 5.6.13
CentOS release 6.5
CakePHP 3.1
After about 4 minutes (3 min, 57 seconds) the import process I'm running stops. There are no errors or warnings in any log that I can find. The import process consists of a lot of SQL calls and data processing, nothing too crazy, but it can take about 10 minutes to get through 5500 records if it's doing a full compare for updates.
Firefox: Secure Connection Failed - The connection to the server was reset while the page was loading.
Chrome: ERR_NO RESPONSE
The PHP time limit is set to 900, which is working: I can set it to 5 seconds and get an error, so the 900-second limit is not being reached.
I can make another controller sleep for 10 minutes, and this error does not happen, indicating that something in the actual program is causing the failure, not the hosting service killing the request for taking too long (I have read about VPS providers doing this to prevent spam).
PHP error reporting is turned all the way up in php.ini and, just to be sure, in the controller itself.
The import process completes if I reduce the size of the file being imported. If it's just long enough, it will complete AND show the browser message. This indicates to me it's not failing at the same point of execution each time.
I have deleted all the cache and restarted the server.
I do not see any output in the Apache logs other than that the request was made.
I do not see any errors in the MySQL log; however, I don't know whether that's because it is not turned on.
The exact same code works on my localhost without any issue. It's not a perfect match for the server, but it's close: Ubuntu Desktop vs CentOS, PHP 5.5 vs PHP 5.6.
I have kept an eye on the memory usage and don't see any issues there.
At this point I'm looking for any good suggestions on what else to look at or insights into what could be causing the failure. There are a lot of possible places to look, and without an error, it's really difficult to narrow down where the issue might be. Thanks in advance for any advice!
UPDATE
After taking a closer look at the memory usage during the request, I noticed it was getting much higher than it ideally should.
The httpd (Apache) process gets killed and a new thread is spawned. Once the new thread runs out of memory, the error shows up on the screen. When I had looked at it previously, it was only at 30%, probably because it had just killed the old process. Watching it the whole way through, I saw it get as high as 80%, which, together with the other processes, was enough to make it run out of memory; and a killed process can't log anything, hence the lack of errors or warnings. It is interesting to me that the process just starts right back up.
I found a command to show which processes had been killed due to memory which proved very useful:
dmesg | egrep -i 'killed process'
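If you run into something similar, a cheap way to watch memory from inside the script is to log the peak periodically. A minimal sketch, assuming a loop over records and a hypothetical importRow() helper:
<?php
foreach ($records as $i => $row) {
    importRow($row);  // stands in for the real per-record import logic
    if ($i % 500 === 0) {
        error_log(sprintf('row %d: peak memory %.1f MB',
            $i, memory_get_peak_usage(true) / 1048576));
    }
}
memory_get_peak_usage(true) reports the real allocated size, so the growth is visible in the log before the OOM killer steps in.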
I did have similar problems with DebugKit. I had a bug in my code that caused a memory peak, and the request context was written to HTML in the error "log".

PHP script runs much longer than max_execution_time?

PHP is running as an Apache module.
The script start with a: ini_set('max_execution_time', 300);
What it does is basically connect to a database, run a big SELECT query, and loop through the results, writing them to a file and echoing a "result ok" after each write, with an explicit flush();
There is no sleep() call.
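Simplified, the script looks roughly like this (table name, file path, and credentials are placeholders):
<?php
ini_set('max_execution_time', 300);

$mysqli = new mysqli('localhost', 'user', 'pass', 'db'); // placeholder credentials
$result = $mysqli->query('SELECT * FROM big_table');     // the big SELECT
$fh     = fopen('/path/to/backup.csv', 'w');

while ($row = $result->fetch_assoc()) {
    fputcsv($fh, $row);       // write the row to the backup file
    echo "result ok<br>\n";
    flush();                  // push the output to the browser immediately
}
fclose($fh);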
This is a "test" script made by a co-worker of mine for backup purposes, and it is intended to run for up to a few hours! I thought I was aware of the script execution time limit and expected his script to time out after 300 seconds...
But it didn't !
It's invoked from a web browser. The page is left open and we can see the results being echoed in real-time.
Why doesn't it time out?
Even stranger, one of the tests did produce a "Maximum execution time of 300 seconds exceeded", but it appeared only after at least 2 hours of execution!
What's going on here? Is there something to understand between max_execution_time and flush() or a browser window being kept opened?
As the manual page for set_time_limit explains, the limit only counts the execution time of the script itself. Time spent on database queries, stream operations, or other external calls is not counted (unless the OS is Windows, where the measured time is real wall-clock time).
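That is also why sleep() never triggers the limit on Linux; a minimal demonstration:
<?php
ini_set('max_execution_time', 5);
sleep(10);               // 10 seconds of wall-clock time, ~0 seconds of counted execution time
echo "still alive\n";    // on Linux this prints; on Windows the 5-second limit would trip first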
The only thing I can think of that may cause this is if PHP is running in safe mode. ini_get('max_execution_time') should tell you if it's actually being set.
Edit: Spotted your comment above.
echo ini_get('max_execution_time'); // says 300
If 300 is reported and you're not on Windows, @mishu is likely right: your SELECT query is probably taking hours.

PHP ZipArchive Timeout

I'm currently trying to find a work around to prevent a PHP script from timing out whilst creating a .zip using the ZipArchive class.
I've managed to do this no problem by overriding the max script execution time with set_time_limit(). However, it's not guaranteed that safe mode will be turned off in php.ini on the servers the script will be running on, and I don't have access to the php.ini file. Is there any other way to stop the script timing out? Can the ZipArchive work be run as a background task?
I think the timeout is not the problem; it can be solved with an ini_set of max_execution_time. The memory limit is the real problem: I am not facing a timeout issue when creating a zip of a 5.8G directory, but I am facing a memory limit issue.
Try
ini_set('max_execution_time', 60); //60 seconds = 1 minute
ini_set('max_execution_time', 0); //0=NOLIMIT
But, there may be restrictions put in place on a shared host, so if these functions don't work as they should, ask some admin of the server.
I'm trying to solve this as well; set_time_limit is no good on shared hosting when safe mode is on or the function is disabled on the server.
I'm using Ajax calls from the client side to PHP to process the archiving in steps. However, that opened up a few new issues: script timeouts, and the need for some kind of "check-pointing" so I can resume on a continue operation.
So my recommendation is to use an Ajax/client-side implementation that hits PHP to do the work, knowing the script may not finish in one call but could take N calls. And although an operation could still take long enough in PHP to time out the script, you can put logic in to check the elapsed time and try to save state before a timeout, covering both cases in the client (timeout and manual kick-out). A sketch of what such a resumable endpoint might look like follows.
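A hedged sketch of such a resumable endpoint (the archive path, file list, and batch size are assumptions; the client keeps calling it with the returned offset until done is true):
<?php
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
$batch  = 50; // files per request, small enough to finish well under the limit

$zip = new ZipArchive();
$zip->open('/tmp/backup.zip',
    $offset === 0 ? ZipArchive::CREATE | ZipArchive::OVERWRITE : 0);

$files = glob('/path/to/data/*');              // assumed source of files to archive
$slice = array_slice($files, $offset, $batch);
foreach ($slice as $file) {
    $zip->addFile($file, basename($file));     // add only this batch per request
}
$zip->close();

// tell the client where to resume; it calls back with ?offset=... until done
header('Content-Type: application/json');
echo json_encode(array(
    'next' => $offset + count($slice),
    'done' => $offset + count($slice) >= count($files),
));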

PHP script stops running after some time when run from the browser

I have made a PHP script which would probably take about 3 hours to complete. I run it from the browser, and after about 45 minutes it stops doing anything. I know this because it polls certain web addresses and then saves some data to a database; it simply stops putting any data into the database, which led me to the conclusion that it has stopped. The browser still acts as if the page were loading, but it never finishes.
There aren't any errors, so it is probably some kind of timeout... but where it occurs is a mystery, as is how to prevent it from happening. In my case I can't use the CLI; I must initiate the script from a browser client.
I have tried to put
set_time_limit(0);
But it had no apparent effect. Any suggestions what could cause the timeout and a fix for it?
Try this:
set_time_limit(0);
ignore_user_abort(true);
ini_set('max_execution_time', 0);
Most webhosts kill processes that run for a certain length of time. This is intended as a failsafe against infinite loops.
Ask your host about this and see if there's any way it can be disabled for this particular script of yours. In some cases, the killer doesn't apply to Cron tasks, or processes run by SSH. However, this varies from host to host.
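If your host does exempt cron jobs from the killer, moving the work there is often the simplest fix; a hypothetical crontab entry (paths are placeholders):
0 2 * * * /usr/bin/php /var/www/app/import.php >> /var/log/import.log 2>&1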
It might be the browser that's timing out; I'm not sure whether browsers do that or not, but then I've also never had a page require so much time.
Suggestion: I'm assuming you are running a loop. Load a page, then run each iteration of the loop in an Ajax call to another page, not firing the next iteration until the previous one returns.
There's a setting in PHP to kill processes after some time. This is especially important on shared servers (you do not want one process slowing down the whole server).
You need to ask your host whether you can make modifications to php.ini (for instance through .htaccess). In particular, look at the max_execution_time setting.
If you are using sessions, you would also need to look at session.cookie_lifetime and not just set_time_limit. If you are accumulating results in an array, the array might also fill up memory.
Without more info on how your script handles the task, it is difficult to identify the cause.

PHP: Maximum execution time of 30 seconds exceeded

I have a PHP app that calls a class called Client. Every so often I get a sort of timeout error. I thought it was SQL at first, but it turns out it's pointing to the class itself.
Fatal error: Maximum execution time of 30 seconds exceeded in C:\Program Files (x86)\Apache Software Foundation\Apache2.2\htdocs\ClientPortal\classes\Connections.php on line 3
<?php
session_start();
class Connections { //line 3
Does anyone know what's going on here?
thanks,
Billy
PHP scripts have a maximum time they're allowed to execute for, as declared in the php.ini.
You can circumvent this if you really want by adding the following line:
ini_set('max_execution_time', 123456);
where 123456 is the number of seconds you want the limit to be.
You can also use the set_time_limit function, which I only just found out about and assume does the same thing. I've always just done the former though.
You can change it in the php.ini file, but you might be using your script to do a batch operation or something. You wouldn't want a PHP script that is being accessed by an end user to sit there hanging for 30 seconds or more though, so you're better off leaving it at the default or even turning it down in the php.ini file, and setting the max_execution_time on an as-needed basis.
As seengee points out in the comment below, you can set the max_execution_time to 0 to stop the error from ever happening, but seengee is right to say that at least for a web request, you really shouldn't do this. For the php command line interpreter, this behaviour is the default though.
If you're seeing this problem for things that are supposed to be used by end-users through a web request, you might have to do some profiling to work out the real cause. If you're doing MySQL queries, start by turning on the slow query log. It's particularly good at letting you know when you've forgotten an index, or if you're doing something else inefficient.
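Enabling it looks something like this in my.cnf (MySQL 5.1-era variable names; the log path is a placeholder):
slow_query_log = 1
slow_query_log_file = /var/log/mysql/slow.log
long_query_time = 1
log_queries_not_using_indexes = 1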
You can also shove a few $s = microtime(true); yourstuff(); var_dump(microtime(true) - $s); snippets around to get a vague overview of which bits are slowing things down; just make sure you don't leave any of them in afterwards!
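A slightly tidier variant of the same idea (a sketch; $db and the query are placeholders for whatever you suspect is slow):
<?php
function timed($label, $fn) {
    $start  = microtime(true);
    $result = $fn();          // run the suspect piece of work
    error_log(sprintf('%s took %.3f s', $label, microtime(true) - $start));
    return $result;
}

$clients = timed('load clients', function () use ($db) {
    return $db->query('SELECT * FROM clients');
});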
If you're still struggling to find the root cause, set xdebug up on your local machine and run the profiler. The extension is available as a precompiled Windows binary (although there seems to be a confusing array of versions). You can inspect the results of running the profiler using WinCacheGrind.
