What Are the Potential Problems With Setting a Higher Memory Limit in PHP?

What are the potential problems that could be encountered when the memory limit in php.ini is set to, say, 100MB?

I can see only the obvious one: That a greedy or broken script can hog a lot of your resources. If such a script gets requested by multiple clients simultaneously, it could become a serious resource problem for your server.
Depending on your use case, maybe it's wise to increase the memory limit on a per-script basis instead.
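For example, a single memory-hungry script can raise its own limit at runtime with ini_set(), so the global php.ini value can stay conservative. A minimal sketch (the 256M figure is an arbitrary example, and the host must allow the override):

```php
<?php
// Raise the limit for this script only; the global php.ini value stays low.
// '256M' is an illustrative figure -- size it to the script's actual need.
ini_set('memory_limit', '256M');

// ... memory-intensive work here, e.g. resizing a large uploaded image ...
```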

On Linux, if the server runs out of memory, it starts swapping (using the much slower disk as virtual memory).
If it runs out of swap, it starts killing processes semi-randomly (the OOM killer's heuristics are complex and usually hard to predict).

Related

Drawbacks of increasing PHP memory limit

Recently my app started throwing fatal errors about exhausting the maximum allowed memory. After some research, I found out that the limit is set in the .htaccess file, and it is set to 64M.
The attempted allocation is around 80MB, and we are able to provide these resources, but I wanted to ask the community whether increasing the value of this variable is a good solution to the problem.
Thank you very much
The answer depends on what your script is doing :) It sounds like the increase in your case was quite small (from 64 to 80MB; I'd recommend sticking to powers of two, though, and upping it to 128MB), so it shouldn't make much of a difference on modern machines. But to know whether it was the right thing to do, you need to find out why you needed more memory.
If your processing simply requires more memory (e.g. you're processing uploaded files in memory, decoding big JSON or XML structures, or doing something else memory-intensive), upping your limit is OK and common.
If, however, your application has memory leaks or is written inefficiently, then upping the memory limit just masks the problem instead of solving it. You'll likely keep running into the issue and end up raising the limit again and again, which is not sustainable.
If you don't know what caused the sudden increase in memory consumption I'd recommend profiling your application using e.g. xhprof. You can also look at the last few changes to your app and see what might have caused it. If you can justify it then give your script more memory, otherwise try optimising your code first.
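If a full profiler feels like overkill as a first step, PHP's built-in functions can give a rough picture of where the memory goes. A minimal sketch (processOrders() is a hypothetical stand-in for your own suspect code):

```php
<?php
// Hypothetical stand-in for the code you suspect of eating memory.
function processOrders() { /* ... */ }

$before = memory_get_usage();
processOrders();
$after = memory_get_usage();

// Peak usage is what counts against memory_limit.
printf("delta: %.2f MB, peak: %.2f MB\n",
    ($after - $before) / 1048576,
    memory_get_peak_usage() / 1048576);
```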
PHP is typically deployed in a way where one PHP process serves multiple requests, one after another. During a single request a script can allocate memory; at the end of the request that memory is freed. So far so good. However, most operating systems keep memory that was allocated bound to a process even after it is freed, on the assumption that a program which needed that much memory once will need it again, and that keeping it available to the process is cheaper than taking it back. Thus in a PHP deployment it can happen that one request uses a lot of memory, and that memory then stays bound to the process and is no longer available to the rest of the system. Additionally, a process taking far more memory than anticipated is a possible indication of a bug. For those two reasons, memory_limit serves as a safety net.
If your application needs more memory, it's generally fine to increase the limit. The absolute maximum value depends on the system (available RAM / number of worker processes is a rough formula; rough because it doesn't account for other memory the system needs). Typically you should only increase it by the amount actually needed.
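To make the formula concrete with purely illustrative numbers: on a server with 4GB of RAM, of which roughly 1GB goes to the OS, database and other services, 20 PHP worker processes would suggest a ceiling of about (4096 - 1024) / 20 ≈ 150MB per worker.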
Of course, when you change this setting you have to remember to carry it over when moving to other systems. Also, less memory usage typically means faster execution, so you should try to see whether you can optimise your code first.
Side note: I purposely simplified the memory model above, ignoring virtual memory pages and how operating systems optimize there

Laravel 4 memory consumption concerns

Case
Currently I am developing an application using Laravel 4. I installed a profiler to see stats about my app. This is the screenshot:
Questions
You can see that it consumes 12.25MB of memory for each request (a very simple page) in my Vagrant box (Ubuntu 64-bit + Nginx + PHP 5.3.10 + MySQL). Do you think this is too much? It means that if I have 100 concurrent connections, the memory consumption will be about 1GB. I think this is too much, but what do you think?
It loads 237 files for each request. Do you think this is too much?
When I deploy this app to my server (CentOS 6.4 with Apache + PHP 5.5.3 with Zend OPcache + MySQL), the memory consumption decreases dramatically. This is the screenshot from the server:
What do you think about this difference between my Mac and the server?
No, you don't really need to worry about this.
12MB is not really a large amount for a PHP program. And 100 concurrent connections is a lot.
To put it into context: assuming your PHP page takes half a second to run, you'd need 12,000 page loads per minute to sustain a consistent 100 concurrent connections (100 connections × 2 requests per second each). That's a lot more traffic than any of my sites get, I can tell you that.
Of course, if your page takes longer than half a second to load, this number will come down quickly, and your 100 concurrent connections can become a possibility much more easily.
This is one reason why it's a really good idea to focus on performance‡ -- the quicker your program can finish running, the quicker it can free up its memory for the next visitor. In fact unless you have a really major memory usage problem (which you don't), performance is probably more important in this context than the amount of memory being used.
In any case, if you do have 100 concurrent connections, you're likely to hit limits in your server software before you hit them in PHP. Apache has a configured limit on the maximum number of simultaneous connections (MaxClients), and that kind of traffic can easily exhaust it. (You can raise it, of course, but if you really are getting that kind of traffic, you'll likely want more servers anyway.)
As for the 12M memory usage, you're not really ever likely to get much less than that for a PHP program. PHP needs a chunk of memory just in order to run in the first place, and the framework will need a chunk too, so most of your 12M will be due to that. This means that although your small program may be using 12M, it does not follow that a larger program would use twice as much. So you probably don't need to worry too much about it.
If you do have high traffic, and performance issues as a result, there are various ways to mitigate the problem. The main one is caching. PHP 5.5 comes with an OpCache module built in, which caches your compiled scripts so that PHP doesn't have to redo bootstrap work such as re-reading and re-compiling all those files on every request. For some systems, this can have a dramatic impact on performance.
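For PHP 5.5+, enabling the bundled OpCache usually takes only a few php.ini directives. A minimal sketch with commonly cited starting values (illustrative, not tuned recommendations):

```ini
; php.ini -- illustrative starting values, tune to your own app
opcache.enable=1
opcache.memory_consumption=128    ; MB of shared memory for cached bytecode
opcache.max_accelerated_files=4000
opcache.revalidate_freq=60        ; seconds between file timestamp checks
```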
There are also other layers of caching you can use, such as a server-level page cache like Varnish, which will cache your static pages so that PHP doesn't even need to be called if the page content hasn't changed.
(‡ of course there are other reasons for focussing on performance too, like keeping your visitors happy)

PHP memory issue

I set memory_limit to -1. Still I am getting out-of-memory issues.
I am working with a legacy system, which is poorly coded ( :) ). I ran Apache Bench to check concurrent user access to the system:
ab -n2000 -c100 http://......com/
In the log file I see many memory-related issues.
The code uses object buffering. Could this be the issue? Is object buffering related to memory_limit?
Changing the memory limit in PHP stops a script from being killed when it passes a certain value. However, it does NOT physically give your hardware more memory (or swap). Ultimately, if PHP needs memory that you don't physically have, things will break.
Object buffering in PHP: I don't know what that means. If you mean output buffering with ob_start and ob_end_flush, it is not related to objects and has no real impact on PHP's memory usage.
PHP's memory usage depends on the size of the objects created while you build the response to the request. If you perform the same request several times, the memory usage of each PHP execution should be the same.
With 'no limit' on memory usage, the only thing you achieve is avoiding a request crash from excessive memory usage. That means that if your problem is memory usage on your index page, you can easily test it by setting some values for this limit and decreasing them until it crashes (64MB, 32MB, 16MB, 8MB, etc.). You do not need ab for that.
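You can run that experiment from the command line without editing php.ini, using PHP's -d flag to override the setting per run (index.php stands in for your entry script):

```sh
php -d memory_limit=64M index.php
php -d memory_limit=32M index.php   # keep halving until it crashes
```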
Now, when you use ab you make your Apache server respond to several parallel requests. For each PHP request a new Apache process is created, and this new Apache process will execute an independent PHP run, taking the same amount of memory as the other processes doing the same thing (you're requesting the same page, nothing is shared between separate PHP executions, and each PHP execution happens in one Apache process).
I assume you're using Apache with mpm_prefork and mod_php, not php-fpm or FastCGI PHP.
So if you have a memory problem in that situation, maybe you're allowing too many processes for Apache. By default it's 150; if each process takes 30MB of RAM (check that with top), that makes 30MB × 150 ≈ 4.4GB. See the problem?
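That arithmetic is worth doing with your own numbers; a throwaway sketch (the figures are placeholders to replace with real values from free and top):

```php
<?php
// Placeholder figures -- replace with real values from `free` and `top`.
$ramForApacheMb = 2048; // RAM you can afford to give Apache, in MB
$perProcessMb   = 30;   // resident size of one Apache+PHP process, in MB

// Rough ceiling for MaxClients: how many such processes fit in that RAM.
$maxClients = (int) floor($ramForApacheMb / $perProcessMb);
echo "MaxClients should not exceed ~{$maxClients}\n"; // prints ~68 here
```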
3 easy solutions:
Decrease the number of Apache processes (MaxClients), and set MinSpareServers, MaxSpareServers and StartServers to that same amount, so you won't lose time creating and destroying Apache processes (a config sketch follows this list).
Limit the PHP application's memory usage; then you'll be able to handle more processes (well, not so easy, it can mean a long rewrite).
Use APC; it decreases memory usage (and speeds up execution).
And after that, the other solutions are more complex:
Use Apache in worker mode, or nginx, and move PHP out of the webserver with php-fpm.
Use a proxy cache like Varnish to catch requests that can be cached (pseudo-static content) and avoid hitting Apache & PHP so much.
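Putting the first suggestion into config form, an mpm_prefork snippet might look like this (the values are illustrative, derived from the kind of calculation above, not recommendations):

```apache
# httpd.conf (mpm_prefork) -- illustrative values, size them to your own RAM
<IfModule mpm_prefork_module>
    StartServers        40
    MinSpareServers     40
    MaxSpareServers     40
    MaxClients          40
    # Recycle workers periodically to cap slow per-process memory growth.
    MaxRequestsPerChild 1000
</IfModule>
```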

Up-to-date chatroom

I am looking to create an ajax powered chatroom for my website.
I have used yshout and it seems very good but crashes when there are too many connections.
What is the best way to go about doing this using the minimum resources possible?
Probably one of the following:
Exceeding the number available threads. Depending on your configuration, you'll have a limit to how many requests can be simultaneously served. Since yshout will be maintaining open connections for longer than normal requests, you're far more likely to exhaust your thread/process limit. See the relevant Apache documentation for more info (assuming Apache, of course).
Exceeding PHP memory limits. For the reasons above, you're likely to need more memory to handle multiple long-running HTTP requests. Try putting more memory in your server and bumping up PHP's memory limit.

LAMP and memory/swap space problem

My LAMP application seems to eventually use up all of my server's memory and swap space. My gut feeling is that it has something to do with the external processes I have to call (as that is the only time the problem manifests).
I need to call GhostScript, ImageMagick's "convert", PDFTK, etc. constantly. When those processes are running, that's when I see my memory running out. So, questions:
Which techniques should I use to conclusively identify which process is actually causing the memory problems? My plan right now is to run the processes individually and just observe the memory usage via the *nix command top. Is there a way I can do this programmatically?
Is there any "memory flushing" solutions I can use? Would this be a good idea to do?
Another problem you might face is that when forking, the application you fork from is "doubled", so its memory consumption doubles. If you have an app server which is resident and keeps a lot of data cached, this can be very significant.
A solution to this problem is to run a small resident script/program that listens on a socket or named pipe and launches the external programs on the big application's behalf.
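A sketch of that idea in PHP: a tiny standalone daemon that accepts commands over a Unix socket and runs only whitelisted binaries, so the fork happens from a small process rather than the big app (the socket path and whitelist are made-up examples):

```php
<?php
// launcher.php -- small resident process, started separately from the app.
// Forking from here is cheap because this process itself stays tiny.
$whitelist = ['convert', 'gs', 'pdftk'];  // allowed binaries (example list)
$server = stream_socket_server('unix:///tmp/launcher.sock', $errno, $errstr);

while (true) {
    $client = @stream_socket_accept($server, 5); // wait up to 5s, then retry
    if ($client === false) {
        continue;
    }
    $argv = explode(' ', trim((string) fgets($client))); // e.g. "convert a.pdf b.png"
    if (in_array($argv[0], $whitelist, true)) {
        $cmd = implode(' ', array_map('escapeshellarg', $argv));
        fwrite($client, (string) shell_exec($cmd));
    }
    fclose($client);
}
```

The web app then writes one command line to /tmp/launcher.sock instead of calling exec() itself.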
You can use top -b (or similar) to get machine-readable output and monitor the memory consumption from a script.
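For the programmatic angle, something like this can log one process's resident memory over time from PHP itself (a rough sketch that shells out to ps; the one-second interval is arbitrary):

```php
<?php
// monitor.php <pid> -- print a process's resident set size once per second.
$pid = (int) ($argv[1] ?? 0);
while (true) {
    $rssKb = trim((string) shell_exec('ps -o rss= -p ' . $pid));
    if ($rssKb === '') {
        break; // the process has exited
    }
    printf("%s  %.1f MB\n", date('H:i:s'), (int) $rssKb / 1024);
    sleep(1);
}
```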
BTW: do not count swap space as "real" memory; your app should run without ever hitting swap. Once you start hitting swap, performance drops so fast that requests start piling up, causing more memory to be used, causing more stuff to be swapped. If you see significant swap space being allocated, then add memory (or buy a bigger hosting plan).
