Limits set in the php.ini file - php

I have a system where I want users to be able to download large amounts of data. Some of the files can be over 800 MB. The problem is that PHP times out before the download is complete. I can get just under 250 MB worth; someone on a slower computer got considerably less.
I think the problem lies in the php.ini file and have increased some of the values, which hasn't made any difference. I've found three sections of the file that possibly need to be changed, but I don't know what they do and can't seem to find out in the PHP manual. I was wondering if somebody could tell me what they do and whether they could affect my issue.
; Default timeout for socket based streams (seconds)
default_socket_timeout = 360
; Connect timeout
;mssql.connect_timeout = 5
; Query timeout
;mssql.timeout = 60
; Default timeout in seconds.
pfpro.defaulttimeout = 30
Can anyone help me?

Have you tried this?
set_time_limit(0);
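For context, here is a minimal sketch of where that call would go in a download script (the file path and name are hypothetical):
// Remove this request's execution time limit entirely
set_time_limit(0);
// Stream the file to the client instead of building it in memory
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="export.zip"');
readfile('/path/to/export.zip'); // hypothetical path
Note that set_time_limit() has no effect when PHP is running in safe mode.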

This is a likely candidate to have its value raised:
max_execution_time = 1800

Related

How to prevent timeout when running a time-consuming PHP script

I am using PHP to fetch a lot of data from several sites and write it to the server, producing files greater than 500 MB, but the process fails partway through with a 500 INTERNAL ERROR. How do I adjust PHP's timeout so that the process runs until it is complete?
If you want to increase the maximum execution time for your scripts, then change the value of the following setting in your php.ini file:
max_execution_time = 60
If you want more memory for your scripts, then change this:
memory_limit = 128M
One more thing: if your script spends a long time processing the input (GET or POST), then you need to increase this as well:
max_input_time = 60
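If you can't edit php.ini directly (on shared hosting, for example), a rough runtime equivalent is sketched below; note that max_input_time can only be changed in php.ini or .htaccess, not at runtime:
ini_set('max_execution_time', '300'); // seconds; some hosts block this override
ini_set('memory_limit', '256M'); // raise the per-script memory ceiling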
You have to change some settings in your php.ini to solve this problem.
There are several options that could be the cause.
Could you please post your php.ini config?
Which kind of web server do you use? Apache?

RESTful CodeIgniter limitation with files?

I use Phil Sturgeon's RESTful libraries (Client and Server).
I want to download a file from my client. It works with most files, but when the file is big (>= 6 MB), I get a blank page with no error.
Can I handle big files with this system? My code is simply:
$data = file_get_contents('my_big_file.bmp');
$this->response(base64_encode($data), 200);
I didn't find any configuration dealing with timeouts or execution times in either library.
Yes, you can. The problem you will face is that your server's RAM will hit its limit quickly if the file is too big, and if the file takes a lot of processing time the script will probably time out. This has to do with your php.ini settings.
To fix this, open your php.ini and change "max_execution_time" to the number of seconds you want. 0 is unlimited, but that isn't good on a production server, so set a time you think it will take. If you want to be able to process files bigger than 128 MB, change the "memory_limit" to something higher.
It is also possible to set these from your PHP script/controller with http://www.php.net/manual/en/function.ini-set.php, e.g.
ini_set('memory_limit', '256M'); // Sets the memory_limit to 256 MB
ini_set('max_execution_time', '900'); // Sets the max_execution_time to 900 seconds (15 minutes)
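As an alternative to simply raising memory_limit, here is a sketch of streaming the file in base64-encoded chunks so the whole file never sits in memory at once. It writes output directly rather than going through $this->response(), so treat it as an illustration, not a drop-in replacement:
$handle = fopen('my_big_file.bmp', 'rb');
while (!feof($handle)) {
    // 8190 is a multiple of 3, so the concatenated base64 pieces
    // decode to the same bytes as encoding the whole file at once
    echo base64_encode(fread($handle, 8190));
    flush();
}
fclose($handle);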

PHP script timeout in Log Analyzer

I have been trying to get Log Analyzer to work for longer than I care to admit. I can't seem to get syslog messages to display in the Log Analyzer web-GUI, but this morning I got the following error:
"While reading the logstream, the php script timeout forced me to abort at this point. If you want to avoid this, please increase the LogAnalyzer script timeout in your config.php. If the user system is installed, you can do that in Admin Center."
I was not getting this error on Friday; only "No syslog records found." The timeout is set to 30 seconds in the config file, but I read that the setting gets overwritten back to the default anyway. The database grew to over 4 GB over the weekend. Does the db size have anything to do with this?
It's pretty clear I am new to php and Log Analyzer, so any help with both would be greatly appreciated. I can post config file settings if needed.
Try increasing the time; 30 seconds may not be enough to parse a large file. For example, set it to 600 seconds.
Another way to increase the timeout is to edit the following lines in your php.ini file:
; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
max_execution_time = 30
Too many records in the database may mean the script does not have enough time to process everything. I keep only the last 14 days in the database; that's enough.
USE Syslog;
DELETE FROM Syslog.SystemEvents WHERE ReceivedAt < DATE_SUB(NOW(), INTERVAL 14 DAY);

PHP Connection Reset on Large File Upload Despite Correct Settings

I am having a very common problem for which, it seems, none of the available solutions work.
We have a LAMP server which receives a high amount of traffic. Using this server, we perform a regular file submission upload. On small file uploads, it works perfectly. On files of around 4-5 MB, the upload fails intermittently (sometimes it works, but many times it fails).
We have the following configuration on our PHP:
max_input_time: 600
max_execution_time: 600
upload_max_filesize: 10M
post_max_size: 10M
Apache setting:
Timeout: 600
Keep-Alive Timeout: 15
Keep-Alive: On
Per Child: 1000
Max Conn: 100
I wonder if anyone can help me with this. We have found similar issues and solutions online, but none of them work in our case.
Thank you so much. Any input / feedback is much appreciated!
The connection could be terminating at several places:
Apache
Post size limit inside of php.ini
Memory limit inside of php.ini
Input time limit inside of php.ini
Execution time limit inside of php.ini or set_time_limit()
I would increase all of these and see if the problem still persists. Note that you will have to bounce Apache for the changes inside php.ini to take effect.
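For reference, the php.ini directives behind those items look roughly like this (the values are illustrative, not recommendations):
post_max_size = 20M ; post size limit
upload_max_filesize = 20M ; per-file upload limit
memory_limit = 256M ; memory limit
max_input_time = 600 ; input time limit, in seconds
max_execution_time = 600 ; execution time limit, in seconds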
These are also affected by the end user's connection speed: if the upload is failing only for certain users, it may be because their connection is slower than others' and their connection with the server is terminating.

PHP Script Times out after 45 seconds

I am running a huge import to my database (about 200k records) and I'm having a serious issue with my import script timing out. I used my cell phone as a stopwatch and found that it times out at exactly 45 seconds every pass (internal server error)... it only does about 200 records at a time, sometimes fewer. I scanned my phpinfo() and nothing is set to 45 seconds, so I am clueless as to why it would be doing this.
My max_execution_time is set to 5 minutes and my max_input_time is set to 60 seconds. I also tried setting set_time_limit(0); ignore_user_abort(1); at the top of my page but it did not work.
It may also be helpful to note that my error file reads: "Premature end of script headers" as the execution error.
Any assistance is greatly appreciated.
I tried all the solutions on this page and, of course, running from the command line:
php -f filename.php
as Brent says is the sensible way round it.
But if you really want to run a script from your browser that keeps timing out after 45 seconds with a 500 internal server error (as I found when rebuilding my phpBB search index) then there's a good chance it's caused by mod_fcgid.
I have a Plesk VPS and I fixed it by editing the file
/etc/httpd/conf.d/fcgid.conf
Specifically, I changed
FcgidIOTimeout 45
to
FcgidIOTimeout 3600
3600 seconds = 1 hour. Should be long enough for most but adjust upwards if required. I saw one example quoting 7200 seconds in there.
Finally, restart Apache to make the new setting active.
apachectl graceful
HTH someone. It's been bugging me for 6 months now!
Cheers,
Rich
It's quite possible that you are hitting an enforced resource limit on your server, especially if the server isn't fully under your control.
Assuming it's some type of Linux server, you can see your resource limits with ulimit -a on the command line. ulimit -t will show you just the limit on CPU time.
If your CPU time is limited, you might have to process your import in batches.
First, you should be running the script from the command line if it's going to take a while. At the very least, your browser would time out after 2 minutes if it receives no content.
php -f filename.php
But if you need to run it from the browser, try adding header('Content-Type: text/html') before the import kicks off.
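A rough sketch of that idea, sending a little output between batches so the browser connection stays active (both helper functions are hypothetical):
header('Content-Type: text/html');
while ($batch = fetch_next_batch()) { // hypothetical: returns rows, or false when done
    import_batch($batch); // hypothetical: inserts one batch into the DB
    echo '.'; // give the browser something so it doesn't give up
    flush(); // push the output to the client immediately
}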
If you are on a shared host, it's possible there are restrictions on the system where long-running queries and/or scripts are automatically killed after a certain length of time. These restrictions are generally loosened for non-web scripts, so running it from the command line would help.
The 45 seconds could be a coincidence; it could be how long it takes for you to reach the memory limit. Increasing the memory limit would look like:
ini_set('memory_limit', '256M');
It could also be the actual DB connection that is timing out. What DB server are you using?
For me, mssql times out with an extremely unhelpful error, "Database context changed", after 60 seconds by default. To get around this, you do:
ini_set('mssql.timeout', 60 * 10); // 10 min
First of all, max_input_time and set_time_limit(0) will generally only work on a VPS or dedicated server. Instead, you can structure your implementation along these lines (see the sketch after this list):
First, read the whole CSV file.
Then grab only 10 entries (rows) or fewer and make an AJAX call to import them into the DB.
Keep calling the AJAX endpoint with 10 entries at a time, and echo something to the browser after each call. This way your script will never time out.
Follow the same method until the CSV rows are finished.
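A rough PHP sketch of the server side of that approach (the script name, CSV file name, and import step are all hypothetical); each AJAX request imports one small slice, so no single request runs long enough to hit the timeout:
// import_chunk.php - called repeatedly from the browser via AJAX
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$all = array_map('str_getcsv', file('data.csv')); // hypothetical CSV file
$rows = array_slice($all, $offset, 10); // take 10 rows per request
foreach ($rows as $row) {
    // insert $row into the DB here, ideally with a prepared statement
}
// Tell the client where to resume; an empty slice means the import is done
echo json_encode(array('next' => $offset + count($rows), 'done' => count($rows) === 0));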
