RESTful CodeIgniter limitation with files? - php

I use Phil Sturgeon's RESTful libraries (Client and Server).
I want to download a file from my client. It works with most files, but when the file is big (>= 6 MB), I get a blank page with no error.
Can I handle big files with this system? My code is simply:
$data = file_get_contents('my_big_file.bmp');
$this->response(base64_encode($data), 200);
I didn't find any configuration dealing with timeouts or execution times in either library.

Yes, you can. The problem you will face is that your server's RAM will hit its maximum quickly if the file is too big, and if the file takes a lot of processing time, the script will probably time out. This has to do with your php.ini settings.
To fix this, open your php.ini and change "max_execution_time" to the number of seconds you want. 0 is unlimited, but that isn't good at the production level, so set a time you think it will take. If you want to be able to process files bigger than 128 MB, change "memory_limit" to something higher.
It is also possible to set these from your PHP script/controller with http://www.php.net/manual/en/function.ini-set.php, e.g.
ini_set('memory_limit', '256M'); // Sets the memory_limit to 256 MB
ini_set('max_execution_time', '900'); // Sets the max_execution_time to 900 seconds (15 minutes)
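Putting it together, a minimal sketch of the answer applied to the question's own controller code (the limit values are illustrative, not prescriptive):

// Raise the limits before reading the file. Note that base64_encode()
// adds roughly one third on top of the raw file size, so budget for it.
ini_set('memory_limit', '256M');
ini_set('max_execution_time', '900');

$data = file_get_contents('my_big_file.bmp');
$this->response(base64_encode($data), 200);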

Related

PHP running on IIS - time out after 15 minutes

I have a php script which creates (in a loop) 20 PDF documents from HTML files.
Each file contains quite a lot of data (pictures) and generally takes about 2 minutes per file.
It works perfectly, until a timeout seems to occur at around 15 minutes.
I am overriding the php.ini file for this script with
ini_set('max_execution_time', 0);
as a test.
The ini_set command is clearly working as it is overriding the default 30 seconds.
Is there an IIS setting that I don't know about that is killing scripts/threads after around 15 minutes?
Any insight gratefully received.
Set max_input_time to 0 as well. Also, if you are using FastCGI, you need to check the FastCGI timeout configuration: https://learn.microsoft.com/en-us/iis/application-frameworks/install-and-configure-php-applications-on-iis/using-fastcgi-to-host-php-applications-on-iis#php-process-recycling-behavior
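As an untested sketch, the FastCGI timeouts on IIS can typically be raised with appcmd; the php-cgi.exe path below is a placeholder for your own install:

REM Sketch (untested): raise the FastCGI activity/request timeouts to 30 minutes.
appcmd set config -section:system.webServer/fastCgi "/[fullPath='C:\PHP\php-cgi.exe'].activityTimeout:1800" /commit:apphost
appcmd set config -section:system.webServer/fastCgi "/[fullPath='C:\PHP\php-cgi.exe'].requestTimeout:1800" /commit:apphost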

How to prevent timeout when running a time-consuming PHP script

I am using PHP to fetch a lot of data from several sites and write it to the server, producing files larger than 500 MB, but the process fails partway through with a 500 INTERNAL ERROR. How do I adjust PHP's timeout so that the process runs until it is completed?
If you want to increase the maximum execution time for your scripts, then just change the value of the following setting in your php.ini file:
max_execution_time = 60
If you want more memory for your scripts, then change this:
memory_limit = 128M
One more thing: if you keep on processing input (GET or POST), then you need to increase this as well:
max_input_time = 60
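For reference, the first two can also be raised at runtime from the script itself; a minimal sketch (values are illustrative):

ini_set('max_execution_time', '300'); // changeable at runtime
ini_set('memory_limit', '512M');      // changeable at runtime
// max_input_time, however, cannot be changed with ini_set() once the
// request is already running; it has to go in php.ini or per-directory config.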
You have to adjust some settings in php.ini to solve this problem; several options could be the culprit.
Could you please post your php.ini config?
Which kind of web server do you use? Apache?

PHP backup script timing out

I have a backup script which backs up all files for a website to a zip file (using a script similar to the answer to this question). However, for large sites the script times out before it can complete.
Is there any way I can extend the length of time available for the script to run? The websites run on shared Windows servers, so I don't have access to the php.ini file.
If you are in a shared server environment and you don't have access to the php.ini file, or you want to set PHP parameters on a per-site basis, you can use the .htaccess file (when running on an Apache web server).
For instance, in order to change the max_execution_time value, all you need to do is edit .htaccess (located in the root of your website, usually accessible by FTP) and add this line:
php_value max_execution_time 300
where 300 is the number of seconds you wish to set as the maximum execution time for a PHP script.
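A short sketch combining the two common overrides (note: php_value directives only work when PHP runs as an Apache module; under CGI/FastCGI they produce a 500 error instead):

# .htaccess sketch - per-site PHP overrides under mod_php
php_value max_execution_time 300
php_value memory_limit 256M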
There is also another way, using the ini_set function in the PHP file itself. E.g., to set the execution time to 5 minutes, you can use:
ini_set('max_execution_time', 300); // 300 seconds = 5 minutes
Please let me know if you need any more clarification.
set_time_limit() comes to mind, but it may still be limited by php.ini settings:
set_time_limit(0);
http://php.net/manual/en/function.set-time-limit.php
Simply put: don't make an HTTP request to start the PHP script. The boundaries you're experiencing exist because you're using an HTTP request, which means you can hit a time-out. A better solution would be to implement this using a "cronjob", or what Microsoft calls "Scheduled Tasks". Most hosting providers will allow you to run such a task at set times. By calling the script from the command line, you don't have to worry about time-outs any more, but you're still at risk of running into memory issues.
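As a sketch, a hypothetical crontab entry for such a scheduled run (both paths are placeholders):

# Run the backup every night at 02:00, outside any HTTP time-out.
0 2 * * * /usr/bin/php /path/to/backup.php >> /var/log/backup.log 2>&1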
If you have a decent hosting provider though, why doesn't it provide daily backups to start with? :)
You can use the following in the start of your script:
<?php
if (!ini_get('safe_mode')) {
    set_time_limit(0); // 0 seconds = no time limit
}
?>
At the end of the script, use the flush() function to tell PHP to send out what it has generated.
Hope this solves your problem.
Is the script giving the "Maximum execution time of xx seconds exceeded" error message, or is it displaying a blank page? If it's the latter, ignore_user_abort might be what you're looking for. It tells PHP not to stop script execution if communication with the browser is lost, which may protect you from other timeout mechanisms involved in the communication.
Basically, I would do this at the beginning of your script:
set_time_limit(0);
ignore_user_abort(true);
This said, as advised by Berry Langerak, you shouldn't be using an HTTP call to run your backup. A cronjob is what you should be using. Along with a set_time_limit(0), it can run forever.
In shared hosting environments where a change to the max_execution_time directive might be disallowed, and where you probably don't have access to any kind of command line, I'm afraid there is no simple (and clean) solution to your problem, and the simplest option is very often to use the backup solution provided by the host, if any.
Try the function:
set_time_limit(300);
On Windows, there is a slight possibility that your web host allows you to override settings by uploading a php.ini file to the root directory of your web server. If so, upload a php.ini file containing:
max_execution_time = 300
To check if the settings work, do a phpinfo() and check the Local Value for max_execution_time.
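A one-line sketch of that check:

phpinfo(INFO_CONFIGURATION); // compare the "Local Value" shown for max_execution_time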
Option 1: Ask the hosting company to place the backups somewhere accessible by PHP, so the PHP file can redirect to the backup.
Option 2: Split the backup script into multiple parts; perhaps use some AJAX to call the script a few times in a row, give the user a nice progress bar, and combine the results of the script calls into a zip with PHP, offering that as a download.

PHP: Uploading large files fails

I'm confused... I can't seem to upload files in the 2 GB range. When I try using curl to send a 1.92 GB file to my site (through an API), it doesn't report anything at all; the response is just blank. When I send a 1 KB file, it reports back like it should.
When I try uploading via the upload form, it ends up freezing midway, around 33%. I'm not sure if only the progress bar has frozen or if the actual file upload itself has been suspended. I suspect it's only the progress bar, because it still says data is being sent even after the bar freezes.
My php.ini (yes, it's reflected by phpinfo as well):
register_globals = Off
magic_quotes_gpc = Off
post_max_size = 2047M
upload_max_filesize = 2047M
max_execution_time = 25200 ; Maximum execution time of each script, in seconds
max_input_time = 25200 ; Maximum amount of time each script may spend parsing request data
memory_limit = 2048M ; Maximum amount of memory a script may consume (16MB)
short_open_tag = On
My VPS doesn't actually have 2 GB of RAM at its disposal, but does memory_limit really need to be set this high?
How should I go about testing this? I know 400 MB files work; I haven't tested anything between 400 MB and 1.92 GB.
You will need a premium account to test up to 2 GB, so here is one you can play with:
User: testreferral
Pass: 1234
http://filefx.com
I don't understand where this problem is arising.
Check for:
Memory limit. Try uploading files above and below the actual memory limit.
Time limit. Your uploads don't take 7+ hours, do they?
The effective settings. Some settings might be overridden by server/directory-level configuration (a quick check is sketched below).
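A small sketch for that last check, printing the values PHP actually runs with (the directive names are the ones quoted from the php.ini above):

foreach (['post_max_size', 'upload_max_filesize', 'memory_limit',
          'max_execution_time', 'max_input_time'] as $directive) {
    echo $directive, ' = ', ini_get($directive), PHP_EOL;
}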
PHP: MySQL query skipped/ignored after large file uploads?
MySQL was timing out during the file upload, so the file wasn't showing up in the DB.

Fatal Error: Allowed Memory Size of 134217728 Bytes Exhausted (CodeIgniter + XML-RPC)

I have a bunch of client point of sale (POS) systems that periodically send new sales data to one centralized database, which stores the data into one big database for report generation.
The client POS is based on PHPPOS, and I have implemented a module that uses the standard XML-RPC library to send sales data to the service. The server system is built on CodeIgniter, and uses the XML-RPC and XML-RPCS libraries for the webservice component. Whenever I send a lot of sales data (as little as 50 rows from the sales table, and individual rows from sales_items pertaining to each item within the sale) I get the following error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 54 bytes)
128M is the default value in php.ini, but I assume that is a huge amount to exceed. In fact, I have even tried setting this value to 1024M, and all it does is take longer to error out.
As for steps I've taken, I've tried disabling all processing on the server-side, and have rigged it to return a canned response regardless of the input. However, I believe the problem lies in the actual sending of the data. I've even tried disabling the maximum script execution time for PHP, and it still errors out.
Changing the memory_limit by ini_set('memory_limit', '-1'); is not a proper solution. Please don't do that.
Your PHP code may have a memory leak somewhere and you are telling the server to just use all the memory that it wants. You wouldn't have fixed the problem at all. If you monitor your server, you will see that it is now probably using up most of the RAM and even swapping to disk.
You should probably try to track down the offending code and fix it.
ini_set('memory_limit', '-1'); overrides the default PHP memory limit.
The correct way is to edit your php.ini file.
Edit memory_limit to your desired value.
As your question notes, 128M (the default limit) has been exceeded, so there is something seriously wrong with your code, as it should not take that much.
If you know why it takes that much and you want to allow it, set memory_limit = 512M or higher and you should be good.
The memory allocation for PHP can be adjusted permanently, or temporarily.
Permanently
You can permanently change the PHP memory allocation two ways.
If you have access to your php.ini file, you can edit the value for memory_limit to your desired value.
If you do not have access to your php.ini file (and your webhost allows it), you can override the memory allocation through your .htaccess file. Add php_value memory_limit 128M (or whatever your desired allocation is).
Temporarily
You can adjust the memory allocation on the fly from within a PHP file. You simply have the code ini_set('memory_limit', '128M'); (or whatever your desired allocation is). You can remove the memory limit (although machine or instance limits may still apply) by setting the value to "-1".
It's very easy to get memory leaks in a PHP script - especially if you use abstraction, such as an ORM. Try using Xdebug to profile your script and find out where all that memory went.
When adding 22.5 million records into an array with array_push, I kept getting "memory exhausted" fatal errors at around 20M records, using 4G as the memory limit in php.ini. To fix this, I added the statement
$old = ini_set('memory_limit', '8192M');
at the top of the file. Now everything is working fine. I do not know if PHP has a memory leak. That is not my job, nor do I care. I just have to get my job done, and this worked.
The program is very simple:
$fh = fopen($myfile, 'r'); // fopen() needs a mode; $file is assumed initialized earlier
while (!feof($fh)) {
    array_push($file, stripslashes(fgets($fh)));
}
fclose($fh);
The fatal error pointed to line 3 (the array_push call); boosting the memory limit eliminated the error.
I kept getting this error, even with memory_limit set in php.ini and the value reading out correctly with phpinfo(). Changing it from this:
memory_limit=4G
to this:
memory_limit=4096M
rectified the problem in PHP 7.
You can properly fix this by changing memory_limit on FastCGI/FPM:
$ vim /etc/php5/fpm/php.ini
Change the memory limit, e.g. from 128M to 512M, as below:
; Maximum amount of memory a script may consume (128 MB)
; http://php.net/memory-limit
memory_limit = 128M
to
; Maximum amount of memory a script may consume (128 MB)
; http://php.net/memory-limit
memory_limit = 512M
When you see the above error - especially if the "(tried to allocate __ bytes)" value is low - that could be an indicator of an infinite loop, like a function that calls itself with no way out:
function exhaustYourBytes()
{
    return exhaustYourBytes();
}
In your site's root directory:
ini_set('memory_limit', '1024M');
After enabling these two lines, it started working:
; Determines the size of the realpath cache to be used by PHP. This value should
; be increased on systems where PHP opens many files to reflect the quantity of
; the file operations performed.
; http://php.net/realpath-cache-size
realpath_cache_size = 16k
; Duration of time, in seconds for which to cache realpath information for a given
; file or directory. For systems with rarely changing files, consider increasing this
; value.
; http://php.net/realpath-cache-ttl
realpath_cache_ttl = 120
Rather than changing the memory_limit value in your php.ini file, if there's a part of your code that could use a lot of memory, you can remove the memory_limit before that section runs and then restore it after:
$limit = ini_get('memory_limit');
ini_set('memory_limit', -1);
// ... do heavy stuff
ini_set('memory_limit', $limit);
In Drupal 7, you can modify the memory limit in the settings.php file located in your sites/default folder. Around line 260, you'll see this:
ini_set('memory_limit', '128M');
Even if your php.ini settings are high enough, you won't be able to consume more than 128 MB if this isn't set in your Drupal settings.php file.
Change the memory limit in the php.ini file and restart Apache. After the restart, run the phpinfo() function from any PHP file to confirm the memory_limit change.
memory_limit = -1
A memory_limit of -1 means there is no memory limit set; memory use is now unbounded.
Alternatively, just add an ini_set('memory_limit', '-1'); line at the top of your web page.
You can also set the memory to whatever you need in place of -1, e.g. '16M'.
For Drupal users, Chris Lane's answer of:
ini_set('memory_limit', '-1');
works, but you need to put it just after the opening
<?php
tag in the index.php file in your site's root directory.
PHP 5.3+ allows you to change the memory limit by placing a .user.ini file in the public_html folder.
Simply create the above file and type the following line in it:
memory_limit = 64M
Some cPanel hosts only accept this method.
Crash page?
(It happens when MySQL has to query large rows. By default, memory_limit is set small, which is safer for the hardware.)
You can check your system's existing memory status before increasing the value in php.ini:
# free -m
             total       used       free     shared    buffers     cached
Mem:         64457      63791        666          0       1118      18273
-/+ buffers/cache:      44398      20058
Swap:         1021          0       1021
Here I have increased it as follows, and then ran service httpd restart to fix the crash-page issue.
# grep memory_limit /etc/php.ini
memory_limit = 512M
For those who are scratching their heads wondering why on earth this little function should cause a memory leak: sometimes, through a small mistake, a function starts calling itself recursively, forever.
For example, a proxy class that has a method with the same name as a method of the object it proxies.
class Proxy {
    private $actualObject;
    public function doSomething() {
        // Note: "actualObjec" (missing "t") is the typo being illustrated.
        return $this->actualObjec->doSomething();
    }
}
Sometimes you may forget to bring in that little actualObjec member, and because the proxy itself has that doSomething method, PHP wouldn't give you any error; for a large class, it could stay hidden from view for quite a while before you find out why it is leaking memory.
I had the error below while running on a dataset smaller than one that had worked previously.
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 4096 bytes) in C:\workspace\image_management.php on line 173
As the search for the fault brought me here, I thought I'd mention that it's not always the technical solutions in the previous answers, but something simpler. In my case, it was Firefox: before I ran the program, it was already using 1,157 MB.
It turns out that I'd been watching a 50-minute video a bit at a time over a period of days, and that had messed things up. It's the sort of fix that experts correct without even thinking about it, but for the likes of me it's worth bearing in mind.
In my case on Mac (Catalina, XAMPP), there was no loaded php.ini file, so I had to do this first:
sudo cp /etc/php.ini.default /etc/php.ini
sudo nano /etc/php.ini
Then change memory_limit = 512M.
Then restart Apache and check whether the file is loaded:
php -i | grep php.ini
The result was:
Configuration File (php.ini) Path => /etc
Loaded Configuration File => /etc/php.ini
Finally, check:
php -r "echo ini_get('memory_limit').PHP_EOL;"
Using yield might be a solution as well. See Generator syntax.
Instead of changing the php.ini file for bigger memory headroom, implementing yield inside a loop might fix the issue. Rather than building all the data in memory at once, a generator yields it one item at a time, saving a lot of memory.
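As a minimal sketch (PHP 7+ syntax; the file name is a placeholder), here is the earlier line-reading example rewritten as a generator, so only one line is held in memory at a time:

function readLines(string $path): Generator
{
    $fh = fopen($path, 'r');
    while (($line = fgets($fh)) !== false) {
        yield stripslashes($line); // hand back one line at a time
    }
    fclose($fh);
}

foreach (readLines('my_big_file.txt') as $line) {
    // process $line; the whole file is never built up in an array
}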
The reason for this error is that your server configuration has a very low memory limit. Try adding this to wp-config.php (put it after <?php in this file):
define('WP_MEMORY_LIMIT', '96M');
Please note that this limit is OK for the theme and the plugins that come with the theme. If you want to enable other plugins you may need to increase the limit further.
define('WP_MEMORY_LIMIT', '256M');
Running the script like this (a cron case, for example): php5 /pathToScript/info.php produces the same error.
The correct way: php5 -cli /pathToScript/info.php
If you're running a WHM-powered VPS (virtual private server), you may find that you do not have permission to edit php.ini directly; the system must do it. In the WHM host control panel, go to Service Configuration → PHP Configuration Editor and modify memory_limit there.
I find it useful to include or require _dbconnection.php_ and _functions.php_ in the files that are actually processed, rather than in the header, which is itself included everywhere.
So if your header and footer are included, simply include all your functional files before the header is included.
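A sketch of that ordering (all file names here are hypothetical):

require_once 'dbconnection.php'; // functional files first...
require_once 'functions.php';
require_once 'header.php';       // ...then the header, which no longer pulls them in
// ... page body ...
require_once 'footer.php';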
Greetings. This is a very common problem: if you have very little memory allocated to PHP and your website is growing, it will require more resources.
I worked on a site where modifying just some products gave a 500 error; the problem was that very heavy images had been used in those specific products. The solution:
1. Increase "memory_limit" in php.ini.
2. Reduce the size of the images.
3. Set "memory_limit" back to an acceptable value ("512M" was more than enough, at least for me).
Now it is important to verify that the changes are taking effect, because PHP can have several versions and several types of installations on the server; you may modify one php.ini and see no change, simply because you are not modifying the correct file.
How do you verify that you are modifying the correct file?
In the PrestaShop dashboard, go to Advanced Settings → Information, where you can see "Memory limit".
Always remember that after making a change in the php.ini file, it is advisable to restart Apache or Nginx.
Ubuntu: sudo service apache2 restart
IMPORTANT NOTE: Never set "memory_limit = -1", as many people suggest here. The problem is that if there is a problem in a file or module, you could end up in a continuous loop consuming all of the server's memory and processor. A simple example: a module has an error and keeps calling a function until it gets a positive result; this creates an infinite loop that never stops, because PHP no longer imposes a limit.
I hope it helps colleagues who have this problem.
The most common cause of this error message for me is omitting the "++" operator from a PHP "for" statement. This causes the loop to continue forever, no matter how much memory you allow to be used. It is a simple syntax error, yet it is difficult for the compiler or runtime system to detect. It is easy for us to correct if we think to look for it!
But suppose you want a general procedure for stopping such a loop early and reporting the error? You can simply instrument each of your loops (or at least the innermost loops) as discussed below.
In some cases such as recursion inside exceptions, set_time_limit fails, and the browser keeps trying to load the PHP output, either with an infinite loop or with the fatal error message which is the topic of this question.
By reducing the allowed allocation size near the beginning of your code you might be able to prevent the fatal error, as discussed in the other answers.
Then you may be left with a program that terminates, but is still difficult to debug.
Whether or not your program terminates, instrument your code by inserting BreakLoop() calls inside your program to gain control and find out what loop or recursion in your program is causing the problem.
The definition of BreakLoop is as follows:
function BreakLoop($MaxRepetitions = 500, $LoopSite = "unspecified")
{
    static $Sites = [];
    // The @ suppresses the "undefined index" notice on a site's first use.
    if (!@$Sites[$LoopSite] || !$MaxRepetitions)
        $Sites[$LoopSite] = ['n' => 0, 'if' => 0];
    if (!$MaxRepetitions)
        return; // a zero count resets the site and disables the check
    if (++$Sites[$LoopSite]['n'] >= $MaxRepetitions)
    {
        $S = debug_backtrace();
        $info = $S[0];
        $File = $info['file'];
        $Line = $info['line'];
        exit("*** Loop for site $LoopSite was interrupted after $MaxRepetitions repetitions. In file $File at line $Line.");
    }
} // BreakLoop
The $LoopSite argument can be the name of a function in your code. It isn't really necessary, since the error message you will get will point you to the line containing the BreakLoop() call.
In my case, it was an issue with the way a function was written. A memory leak can be caused by assigning a new value to a function's input variable, e.g.:
/**
 * Memory-leak function that illustrates unintentional bad code.
 * @param $variable - input variable that will be assigned a new value
 * @return null
 **/
function doSomething($variable) {
    $variable = 'set value';
    // Or
    $variable .= 'set value';
}
Increasing the memory_limit fixed the problem. However, I had problems finding the memory limit. I am working on my project directly on the live server, so if you're doing the same, on cPanel you can find the memory_limit by going to Software → MultiPHP INI Editor and selecting the location. I increased mine from 256M to 512M. You can also find instructions here.
