Ajax error 0 after a long PHP script

I am using jQuery Ajax to send the URL of a file (a CSV file) located on the server to my PHP script so it can be processed.
The CSV file contains telephone calls. If I have a file with even 10,000 calls, everything is OK. But if I try a big file with, for example, 20,000 calls, then I get an Ajax error 0. I checked for a server response with Firebug but got none.
This behaviour occurs after roughly 40 minutes of waiting for the PHP script to finish. So why do I get this error on big files only? Does it have to do with Apache, MySQL, or the server itself? Anyone able to help will be my personal hero, because this is driving me nuts.
I need a way to figure out what exactly is happening, but Firebug won't return a server response. Is there any other way I can find out what's happening?
I checked the PHP error log and it reports nothing on the matter.
Thanks in advance.

The script may have timed out.
Check these settings in your php.ini file:
max_execution_time
max_input_time ; the maximum time input can be processed for
Where your php.ini lives depends on your environment; for more information see http://php.net/manual/en/ini.php

Check:
max_input_time
This sets the maximum time in seconds a script is allowed to parse input data, like POST and GET. It is measured from the moment of receiving all data on the server to the start of script execution.
max_execution_time
This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. This helps prevent poorly written scripts from tying up the server. The default setting is 30. When running PHP from the command line the default setting is 0.
Also
Your web server can have other timeout configurations that may also interrupt PHP execution. Apache has a Timeout directive and IIS has a CGI timeout function. Both default to 300 seconds. See your web server documentation for specific details.
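If you want to see which limits are actually in effect for your script, and try raising the execution limit at runtime, here is a minimal sketch, assuming ini_set() is not disabled on your host (note that max_input_time cannot be raised this way at runtime; it must be set in php.ini or per-directory config):
<?php
// Print the limits currently in effect for this script:
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
echo 'max_input_time: ' . ini_get('max_input_time') . "\n";
// Raise the execution limit at runtime (0 would mean no limit):
ini_set('max_execution_time', '3600');
?>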

First, enable PHP error reporting by placing the code below at the top of the PHP file:
error_reporting(E_ALL);
ini_set('display_errors', 1); // without this, errors are reported but may not be displayed
Then, as Shamil explained in this answer, check the max_execution_time setting of your PHP installation.
To check max_execution_time, open your php.ini file, search for it, and change it to a larger value, say one hour (3600).
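The relevant php.ini line would then read like this (a sketch; the web server or PHP-FPM usually needs a restart before the change takes effect):
max_execution_time = 3600 ; one hour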
I hope this will fix your issue.
Thank you

Related

PHP: 500 - Internal server error

I have a PHP script for uploading multiple files.
I noticed that when the upload takes more than about two minutes, I get the following error:
500 - Internal server error. There is a problem with the resource you
are looking for, and it cannot be displayed.
Some info:
PHP Version: 5.4.23
System: Windows NT SDADMIN32263436 6.1 build 7601 (Windows Server 2008
R2 Standard Edition Service Pack 1) i586
Any tips?
Thank you
By default, PHP only allows uploads of a couple of megabytes. You could try changing the following directives in the php.ini file:
memory_limit = 32M
upload_max_filesize = 24M
post_max_size = 32M
Obviously use values that are appropriate to you.
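One way to tell whether an upload is being rejected for size reasons is to check the per-file error code PHP reports. A minimal sketch ('userfile' is a hypothetical form field name):
<?php
if (isset($_FILES['userfile'])) {
    switch ($_FILES['userfile']['error']) {
        case UPLOAD_ERR_OK:
            echo 'Upload succeeded.';
            break;
        case UPLOAD_ERR_INI_SIZE:
            echo 'File exceeds upload_max_filesize in php.ini.';
            break;
        case UPLOAD_ERR_PARTIAL:
            echo 'File was only partially uploaded.';
            break;
        default:
            echo 'Upload failed with error code ' . $_FILES['userfile']['error'];
    }
}
?>
Note that if the request body as a whole exceeds post_max_size, $_FILES arrives empty, so none of these codes ever show up.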
It might, however, not be linked to the upload size at all. As PHP is server-side, the 500 error is incredibly generic. You can try looking at your PHP log files (you can do this through IIS on Server 2008).
It might also help you to turn on some error reporting in your application. For development, one way to do this is to put the following at the top of your PHP script:
ini_set('display_startup_errors',1);
ini_set('display_errors',1);
error_reporting(-1);
This means PHP will show any errors it encounters in the browser. It is NOT a good idea to do this in production though, as it can give out sensitive information about your server and hosting.
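For production, a safer pattern is to log errors to a file instead of displaying them; a sketch (the log path is hypothetical):
<?php
ini_set('display_errors', 0); // nothing sensitive goes to the browser
ini_set('log_errors', 1);     // but everything still gets recorded
ini_set('error_log', '/path/to/php_errors.log'); // hypothetical path
error_reporting(E_ALL);
?>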
I refer to this question. That user seems to have the same problem as yours, and in this answer he was advised to make some changes in the configuration file:
"max_execution_time" integer
This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. This helps prevent poorly written scripts from tying up the server. The default setting is 30. When running PHP from the command line the default setting is 0.
The maximum execution time is not affected by system calls, stream operations etc. Please see the "set_time_limit()" function for more details.
[...]
"max_input_time" integer
This sets the maximum time in seconds a script is allowed to parse input data, like POST, GET and file uploads.
[...]
Additionally here's some info on checking/setting CGI Timeout in IIS5 and 6.
I also suggest you check the PHP error logs in order to retrieve more information about the upload execution.
Finally, this question and this question also discuss the IIS configuration needed to allow PHP to handle bigger uploads.

Internal Server Error after long time running of php script

I have some DB queries in my PHP file. Approximately after 40 seconds, an INTERNAL SERVER ERROR happens, even though these settings are set in the php.ini file:
memory_limit 8192M
max_execution_time 120
I think these settings are enough. What other reasons may cause an INTERNAL SERVER ERROR after a long-running PHP script?
Set:
ini_set('max_execution_time', 0);
ini_set('memory_limit', '-1');
The question is old, but this might help someone.
Pause briefly on every iteration of your loop, for example usleep(500000); for half a second (sleep() only accepts whole seconds, so use usleep() for fractions).
The pause length is up to you; change it as needed.
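A minimal sketch of that pattern, assuming a hypothetical row-processing loop:
<?php
set_time_limit(0);        // no execution time limit
foreach ($rows as $row) { // $rows and process_row() are hypothetical
    process_row($row);
    usleep(500000);       // pause half a second between iterations
}
?>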

PHP Script ends before timeout without error

I have a PHP script that needs an execution time of at least 1000 seconds to complete.
It terminates after around 265 seconds each time, with no errors. Since I am using loops, I tested the number of iterations; the failure is independent of that, ruling out the possibility of an error occurring inside the loop.
I have set max_execution_time to 10800 in php.ini, and changing memory_limit doesn't affect the results either.
Please help! I have scratched my head thoroughly!
Did you check your log file? If you see error 6 or a segmentation fault, then your script is actually crashing PHP without showing any errors in the browser (if it is a browser and not the CLI).
If you are using Apache on Unix, you should find this log in /var/log/apache2/error.log.
Otherwise you can define the path of the log file in .htaccess by adding this line:
php_value error_log "/path/to/somewhere/convenient/php_errors.log"
Change the path to somewhere where your httpd has write permission and where you have read permission.
Put:
set_time_limit(0);
at the beginning of the script, so that code execution will not time out until it has completed.
This same error happened to me. One of my PHP functions died without any errors sent to stderr, stdout, or any log file.
What happened was that I was using a helper PHP script, written by another developer, that set the memory limit to 512MB halfway through the operation of my program. The sub-module poisoned the well by also silencing the error log settings at some point midway through the processing of my script.
You can prove whether this is happening to you by printing out the PHP system settings available to your script on EVERY iteration of the loop. In my case, once the sub-script had done the dirty deed, the PHP engine threw a fit after the garbage collector ran at some random later point, then died immediately without error. It appears to be a bug in the PHP garbage collector, triggered when sub-modules mess with system settings while the collector is doing its work.
Solution: edit the PHP helper sub-modules and make sure they don't tweak the system settings while the garbage collector is doing its work. Otherwise the PHP interpreter can die without any errors or output at a random interval after the PHP system variables have been poisoned.
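A sketch of the per-iteration diagnostic described above, logging the relevant settings on every pass so you can spot the exact iteration where a sub-module changes them ($items and do_work() are hypothetical):
<?php
foreach ($items as $i => $item) {
    do_work($item);
    error_log(sprintf(
        'iteration %d: memory_limit=%s error_reporting=%d max_execution_time=%s',
        $i,
        ini_get('memory_limit'),
        error_reporting(), // current reporting level
        ini_get('max_execution_time')
    ));
}
?>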
In PHP, the only way to hide errors from the logs, even when display_errors and error_reporting are properly set, is to use the error control operator @.
For example, the following code raises a fatal error, but a "hidden" one: it will appear neither in the logs nor in the browser, and the script simply stops.
@trigger_error("a fatal, but hidden, error", E_USER_ERROR);
So set display_errors to On, set error_reporting according to the PHP version you are using (see the differences here), and check whether you are using an @ symbol anywhere in your code.

PHP backup script timing out

I have a backup script which backs up all files for a website to a zip file (using a script similar to the answer to this question). However, for large sites the script times out before it can complete.
Is there any way I can extend the length of time available for the script to run? The websites run on shared Windows servers, so I don't have access to the php.ini file.
If you are in a shared server environment, and you don’t have access to the php.ini file, or you want to set php parameters on a per-site basis, you can use the .htaccess file (when running on an Apache webserver).
For instance, in order to change the max_execution_time value, all you need to do is edit .htaccess (located in the root of your website, usually accessible by FTP) and add this line:
php_value max_execution_time 300
where 300 is the number of seconds you wish to set the maximum execution time for a php script.
There is also another way, using the ini_set function in the PHP file itself.
E.g. to set the maximum execution time to 5 minutes, you can use:
ini_set('max_execution_time', 300); // 300 seconds = 5 minutes
Please let me know if you need any more clarification.
set_time_limit() comes to mind, but it may still be restricted by php.ini settings (it has no effect in safe mode, for example):
set_time_limit(0);
http://php.net/manual/en/function.set-time-limit.php
Simply put: don't make an HTTP request to start the PHP script. The boundaries you're experiencing exist because you're using an HTTP request, which means you can hit a time-out. A better solution would be to implement this as a "cronjob", or what Microsoft calls a "Scheduled Task". Most hosting providers will allow you to run such a task at set times. By calling the script from the command line, you don't have to worry about the time-outs any more, but you're still at risk of running into memory issues.
If you have a decent hosting provider though, why doesn't it provide daily backups to start with? :)
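For example, a crontab entry that runs the backup through the PHP CLI every night at 2 a.m. could look like this (all paths are hypothetical):
0 2 * * * /usr/bin/php /var/www/scripts/backup.php >> /var/log/backup.log 2>&1
On Windows hosting, the equivalent is a Scheduled Task that invokes php.exe with the script path.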
You can use the following in the start of your script:
<?php
if (!ini_get('safe_mode')) {
    set_time_limit(0); // 0 seconds = unlimited time
}
?>
And at the end of the script, use the flush() function to tell PHP to send out what it has generated.
Hope this solves your problem.
Is the script giving the "Maximum execution time of xx seconds exceeded" error message, or is it displaying a blank page? If it's a blank page, ignore_user_abort might be what you're looking for. It tells PHP not to stop script execution if communication with the browser is lost, which may protect you from other timeout mechanisms involved in the communication.
Basically, I would do this at the beginning of your script:
set_time_limit(0);
ignore_user_abort(true);
This said, as advised by Berry Langerak, you shouldn't be using an HTTP call to run your backup. A cronjob is what you should be using. Along with set_time_limit(0), it can run forever.
In shared hosting environments where a change to the max_execution_time directive might be disallowed, and where you probably don't have access to any kind of command line, I'm afraid there is no simple (and clean) solution to your problem; the simplest option is very often to use the backup solution provided by the host, if any.
Try the function:
set_time_limit(300);
On Windows, there is a slight possibility that your web host allows you to override settings by uploading a php.ini file to the root directory of your web server. If so, upload a php.ini file containing:
max_execution_time = 300
To check if the settings work, do a phpinfo() and check the Local Value for max_execution_time.
Option 1: Ask the hosting company to place the backups somewhere accessible by PHP, so the PHP file can redirect to the backup.
Option 2: Split the backup script into multiple parts, perhaps use some Ajax to call the script a few times in a row, give the user a nice progress bar, and combine the results of the script calls into a zip with PHP, offered as a download; see the sketch below.
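A rough sketch of option 2: the endpoint adds one batch of files to the archive per request, and the client keeps calling it with an increasing offset until it reports done (get_site_file_list() and the file layout are hypothetical):
<?php
$offset = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
$chunk  = 50;                   // files per request
$files  = get_site_file_list(); // hypothetical helper returning all file paths

$zip = new ZipArchive();
$zip->open('backup.zip', ZipArchive::CREATE); // created on first call, reopened afterwards
foreach (array_slice($files, $offset, $chunk) as $file) {
    $zip->addFile($file);
}
$zip->close();

header('Content-Type: application/json');
echo json_encode(array(
    'done'   => $offset + $chunk >= count($files),
    'offset' => $offset + $chunk,
));
?>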

set_time_limit() timing out

I have an upload form that uploads MP3s to my site. I have some intermittent issues with some users, which I suspect come down to slow upload connections...
But anyway, the first line of code is set_time_limit(0);, which did fix it for SOME users whose connections were taking a while to upload, but some are still getting timed out and I have no idea why.
It says the script has exceeded the execution limit of 60 seconds. The script has no loops, so it's not like it's some kind of infinite loop.
The weird thing is that no matter what code is on the first line, it will always say "error on line one, two, etc.", even if it's set_time_limit(0);. I tried erasing it, and the very first line of code always seems to be the error; it doesn't even give me a hint of why it can't execute the PHP page.
This is an issue only a few users are experiencing, and no one else seems to be affected. Could anyone throw out some ideas as to why this could be happening?
set_time_limit() will only affect the actual execution of the PHP code on the page. You want to set the PHP directive max_input_time, which controls how long the script will accept input (like file uploads) for. The catch is that you need to set this in php.ini: if the default max_input_time is exceeded, execution never reaches the script that attempts to change it with ini_set().
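So the change has to go into php.ini itself (or a per-directory mechanism your host honours), for example:
max_input_time = 300 ; allow up to 5 minutes for the upload data to arrive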
Sure, a couple of things are noted in the PHP manual.
First, make sure PHP is not running in safe mode; set_time_limit() has no effect when PHP is running in safe_mode.
Second, and this is where I assume your problem lies:
Note: The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
So your stream may be the culprit.
Can you post a little of your upload script? Are you calling a separate file to handle the upload using headers?
Try ini_set('max_execution_time', 0); instead.
