We have a website that previously worked fine with a low memory limit (12 MB in php.ini, and 16 MB in settings.php).
After moving to a new server it started giving memory limit errors and displaying a half-blank screen.
We increased the limit in both files (php.ini and settings.php) and now it works, but I don't understand how it is possible that it now needs a considerably larger amount of memory (it used to work with 12 MB; now it won't work with less than 20 MB).
I assume you did not change the OS in the process. Moving from Windows to Linux or vice versa is quite likely to change resource usage.
And this is a long shot, but perhaps you moved from a 32-bit system to a 64-bit one? This would slightly increase memory usage as addresses (pointers) are twice as large on 64-bit architectures, and code with lots of small objects uses plenty of pointers.
On the whole though, we can't tell you much without seeing what changed about the system.
12 MB is too low unless you run nothing but a bare Drupal installation. A higher limit is recommended: the more modules you install, the more memory you need. Usually 96 MB is enough, even with image processing...
12 MB is really very low. I would tend to ignore the discrepancy and move on.
Ideas about what could have changed, though:
The old server could have had modules installed that reduced memory usage, e.g. memcache
The new server may have to rely on the GD library for image processing, while the old server maybe had ImageMagick (an external tool whose memory doesn't count towards the PHP memory limit). A quick way to compare the two servers is sketched below.
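To check both ideas, you could run a small script on each server and compare the output. This is only a minimal sketch; which extensions actually matter depends on your setup:

<?php
// Compare what each server has available (illustrative checks).
var_dump(extension_loaded('memcache')); // caching extension present?
var_dump(extension_loaded('gd'));       // GD: its work counts towards memory_limit
var_dump(extension_loaded('imagick'));  // ImageMagick PHP extension, if any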
My PHP application on Windows+Apache has stopped, showing “Out of memory (allocated 422313984) (tried to allocate 45792935 bytes)”.
I can't understand why it stopped, because my machine has 4 GB of physical memory and I've set the memory_limit directive to -1 in php.ini. I've also restarted Apache.
I would think 4 GB is enough to allocate more than 422313984 + 45792935 bytes.
Is there another setting that limits memory for PHP or Apache?
I also checked the performance counters: maximum memory usage was 2 GB for the whole machine, and the httpd process used 1.3 GB.
I can't show the code, but it fetches 30,000 rows, 199 bytes each, from the DBMS and parses them into XML using simplexml_load_string() in a loop.
The code finishes normally if the data set is small or if I shorten the loop, e.g. from 30,000 iterations to 1,000.
Also, the first run after starting Apache always succeeds.
I think some memory leak is happening.
I echoed PHP_INT_SIZE and PHP printed 4, so my PHP is apparently a 32-bit build.
If the memory usage problem stems from this version of PHP, as Álvaro G. Vicario points out below, can I fix it by switching to a 64-bit version of PHP? And where can I get a 64-bit version of PHP for Windows? I can't find one at http://windows.php.net
«Out of memory» messages (not to be confused with «Allowed memory size exhausted» ones) always indicate that the PHP interpreter literally ran out of memory. There's no PHP or Apache setting you can tweak: the computer is simply not able to feed PHP with more RAM. Common causes include:
Scripts that use too much memory.
Memory leaks or bugs in the PHP interpreter.
SimpleXML is by no means a lightweight extension. On the contrary, its ease of use and handy features come at a cost: high resource consumption. Even without seeing a single line of code, I can assure you that SimpleXML is totally unsuitable for creating an XML file with 30,000 items. A PHP script that uses 2 GB of RAM can easily take down the whole server.
Nobody likes changing a base library in the middle of a project, but you'll eventually need to do so. PHP provides a streaming XML writer called XMLWriter. It's really not much harder to use, and it provides two benefits:
It's way less resource intensive, since it doesn't build the complex object tree that SimpleXML uses.
You can flush partial results to a file, or even write directly to a file.
With it, I'm sure your 2 GB script could run with just a few MB.
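As an illustration, here is a minimal sketch of streaming generation with XMLWriter. The connection details, table, and column names are made up; in real code you would also want an unbuffered query so the database driver doesn't hold all 30,000 rows in memory either:

<?php
// Stream rows straight to a file instead of building an object tree.
$writer = new XMLWriter();
$writer->openUri('output.xml');           // write directly to a file
$writer->startDocument('1.0', 'UTF-8');
$writer->startElement('rows');

$pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$stmt = $pdo->query('SELECT id, name FROM items'); // hypothetical table

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $writer->startElement('row');
    $writer->writeElement('id', (string) $row['id']);
    $writer->writeElement('name', (string) $row['name']);
    $writer->endElement();                // </row>
    $writer->flush();                     // push buffered output to disk
}

$writer->endElement();                    // </rows>
$writer->endDocument();
$writer->flush();

Memory stays flat because each row is written out and discarded instead of accumulating in a SimpleXML tree.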
I'm trying to troubleshoot a memory issue I've run into with Wordpress, and rather than bore you with the whole problem, I was hoping to get a nice compact answer to three parts of it:
Normal Memory Footprint. I know there is no real "normal" Wordpress script, and yet I think it would be quite useful to hear from people what a typical Wordpress script's memory footprint is. For the sake of argument, let's call "normal" an installation with very few plugins, a base theme like twenty-twelve, and a script that does some DB retrieval but nothing monumental... maybe a typical blog roll page or something. What I'm trying to understand is the baseline memory footprint (a range, not a discrete number) that a more complicated script would be starting from.
Memory Ceiling Versus memory_get_usage(). I have been putting lots of logging in my scripts that pulls out the memory usage with PHP's memory_get_usage(true) call. This seems like one of the few troubleshooting techniques that can determine where the memory is being used, but what perplexes me is this: I see memory usage ranging from 15M to 45M at the script level (note this is with the "true" parameter, so it includes the overhead of the memory manager), and yet in many instances I'll see a 27M script suddenly fall over with the message "Allowed memory size of 268435456 bytes exhausted". It is possible that one very large memory request takes place after the logging, but I'm interested to hear whether other people have found differences between the memory limit and the memory reported by memory_get_usage().
New Memory Ceiling Ignored. In a desperate attempt to get the site back to working (and buy myself time to troubleshoot) I thought I'd just raise the memory limit in the php.ini file to 512M, but doing this seems to have had no impact: the fatal error continues to report the old 256M limit. (A quick check for this is sketched below.)
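A minimal diagnostic sketch for that last point: print which php.ini the interpreter actually loaded and the limit in effect. A raised limit being ignored usually means a different ini file is in use, or the web server wasn't restarted after the change:

<?php
// Confirm which configuration PHP is really using.
echo php_ini_loaded_file(), "\n";   // full path of the loaded php.ini
echo ini_get('memory_limit'), "\n"; // the limit PHP actually sees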
Any help would be appreciated. Thanks in advance.
Hopefully someone can answer your questions in detail. For my part:
Q: What is a normal amount of memory for a Wordpress script to use?
A1.- Since WP is a plugin-driven CMS, memory usage depends on those plugins. As you surely know, some are very badly coded. But an out-of-the-box WP performs very well.
A2.- To help you find bottlenecks, I recommend using BlackBox (a WordPress Debug Bar plugin):
... As for the information you will find in the profiler, it includes the time passed since the profiler was started and the total memory WordPress was using when each checkpoint was reached ...
I just found this interesting article:
WordPress Memory Usage & Website Outage Issues Resolved.
I ran a test for Wordpress 4.4 with a clean install on a Windows 7 PC (a local install).
Memory Used / Allocated:
9.37 MB / 9.5 MB
Total Files: 89
Total File Size: 2923.38 KB
Ran in 1.27507 seconds
This was all done in the index file, with timing started before anything is called and memory/file usage measured after everything is 100% finished.
I tried a few pages (category, archive, single post, etc..) and all were very similar (within 1% difference) in files and memory usage.
I think it stands to reason this would be the best possible performance, so adding plugins/content will only bump these numbers up. A caching plugin might offer slightly better performance, though.
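For reference, a rough sketch of how such a measurement can be taken from the index file. The bootstrap line is WordPress's standard front controller; the output format is just illustrative:

<?php
// index.php: time and memory measurement around the WordPress bootstrap.
$start = microtime(true);

require __DIR__ . '/wp-blog-header.php'; // standard WordPress front controller

printf("Memory Used / Allocated: %.2f MB / %.2f MB\n",
    memory_get_peak_usage(false) / 1048576,  // memory actually used
    memory_get_peak_usage(true) / 1048576);  // memory allocated by the engine
printf("Total Files: %d\n", count(get_included_files()));
printf("Ran in %.5f seconds\n", microtime(true) - $start);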
Joomla 1.5: is 48 MB of memory enough in the PHP settings, or is at least 64 MB better?
I ask because the site sometimes responds really slowly.
Joomla is designed to run within the 8 MB of memory that php.ini provides by default.
48 MB should be fine. Unlike operating systems and classical applications that swap memory, increasing the memory limit in PHP will very rarely improve performance: if the memory limit is hit, the script usually simply crashes.
The reason for the slowness of your site most likely lies elsewhere, maybe at the server level, maybe inside the Joomla application.
You may find the best resources on that in the Joomla community, which is very active. Googling turns up joomlaperformance.com which seems to have some articles.
Does PHP have a built-in limitation on how much memory it can use? In other words, if I have a machine with many gigabytes of RAM and change php.ini to allocate most of it, will scripts still hit some lower limit?
(If you're curious, the goal is to run an automatic documentation generator, written in PHP, on a very large PHP code base.)
PHP will consume as much memory as it is allowed to; the ceiling depends on your operating system. You cannot extend PHP's memory limit beyond what your OS has to offer.
Apart from the PHP ini directive memory_limit, you are only bound by the machine's available RAM. Note that memory_limit is per script, so running multiple scripts at the same time can eventually sum to more memory than your server has.
The maximum amount of memory per process can also be limited by the operating system and/or some configurable resource limits.
E.g. on a Windows system a 32-bit process is limited to 2/3/4 GB of memory per process (depending on whether you use a 64-bit version of Windows and the setting of IMAGE_FILE_LARGE_ADDRESS_AWARE). A 64-bit process might be limited to 2 GB as well (with IMAGE_FILE_LARGE_ADDRESS_AWARE cleared).
On a Linux system there are similar restrictions, and limits are often set via ulimit.
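To see where a given installation stands, a two-line check (the values printed are of course machine-dependent):

<?php
// Inspect the per-script limit and the pointer width of this PHP build.
var_dump(ini_get('memory_limit')); // e.g. "128M"; -1 means no per-script limit
var_dump(PHP_INT_SIZE);            // 4 = 32-bit build, 8 = 64-bit build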
In a system I am currently working on, there is one process that loads a large amount of data into an array for sorting/aggregating/whatever. I know this process needs optimising for memory usage, but in the short term it just needs to work.
Given the amount of data loaded into the array, we keep hitting the memory limit. It has been increased several times, and I am wondering: is there a point where increasing it becomes a generally bad idea, or is it only a matter of how much RAM the machine has?
The machine has 2GB of RAM and the memory_limit is currently set at 1.5GB. We can easily add more RAM to the machine (and will anyway).
Have others encountered this kind of issue? And what were the solutions?
The configuration of memory_limit for PHP running as an Apache module to serve webpages has to take into consideration how many Apache processes you can have at the same time on the machine -- see the MaxClients configuration option for Apache.
If MaxClients is 100 and you have 2,000 MB of RAM, a very quick calculation shows that you should not use more than 20 MB for the memory_limit value (because 20 MB * 100 clients = 2 GB of RAM, i.e. the total amount of memory your server has).
And this is without considering that there are probably other things running on the same server, like MySQL, the system itself, ... And that Apache is probably already using some memory for itself.
Of course, this is also a "worst case scenario" that assumes each PHP page is using the maximum amount of memory it can.
In your case, if you need such a large amount of memory for only one job, I would not increase the memory_limit for PHP running as an Apache module.
Instead, I would launch that job from the command line (or via a cron job), and specify a higher memory_limit specifically in this one and only case.
This can be done with the -d option of php, like this:
$ php -d memory_limit=1G temp.php
string(2) "1G"
Considering, in this case, that temp.php contains only:
var_dump(ini_get('memory_limit'));
In my opinion, this is way safer than increasing the memory_limit for the PHP module inside Apache -- and it's what I usually do when I have a large dataset, or some really heavy stuff I cannot optimize or paginate.
If you need to define several values for the PHP CLI execution, you can also tell it to use another configuration file instead of the default php.ini, with the -c option:
php -c /etc/phpcli.ini temp.php
That way, you have:
/etc/php.ini for Apache, with low memory_limit, low max_execution_time, ...
and /etc/phpcli.ini for batches run from command-line, with virtually no limit
This ensures your batches will be able to run -- and you'll still have security for your website (memory_limit and max_execution_time being security measures).
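For illustration, the relevant directives in each file might look like this (the exact values are hypothetical; pick ones that fit your site and your batches):

; /etc/php.ini (used by Apache): keep the limits tight
memory_limit = 32M
max_execution_time = 30

; /etc/phpcli.ini (used for command-line batches): virtually no limits
memory_limit = -1        ; -1 disables the per-script memory limit
max_execution_time = 0   ; 0 means no time limit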
Still, if you have the time to optimize your script, you should; for instance, in that kind of situation where you have to deal with lots of data, pagination is a must-have ;-)
Have you tried splitting the dataset into smaller parts and processing only one part at a time?
If you fetch the data from a file on disk, you can use the fread() function to load smaller chunks, or use some sort of unbuffered DB query in the case of a database.
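A minimal sketch of chunked reading, assuming a hypothetical data.csv and an arbitrary 8 KB chunk size:

<?php
// Process a large file in small chunks instead of loading it whole.
$fh = fopen('data.csv', 'rb');
if ($fh === false) {
    exit("cannot open file\n");
}
while (!feof($fh)) {
    $chunk = fread($fh, 8192); // read 8 KB at a time
    // ... aggregate/sort this chunk here, then let it go out of scope ...
}
fclose($fh);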
I haven't kept up with PHP since v3.something, but you could also use a form of cloud computing: a 1 GB dataset seems big enough to be processed on multiple machines.
Given that you know that there are memory issues with your script that need fixing and you are only looking for short-term solutions, then I won't address the ways to go about profiling and solving your memory issues. It sounds like you're going to get to that.
So, I would say the main things you have to keep in mind are:
Total memory load on the system
OS capabilities
PHP is only one small component of the system. If you allow it to eat up a vast quantity of your RAM, then the other processes will suffer, which could in turn affect the script itself. Notably, if you are pulling a lot of data out of a database, then your DBMS might require a lot of memory in order to create the result sets for your queries. As a quick fix, you might want to identify any queries you are running and free their results as soon as possible, to give yourself more memory for a long job run.
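As a sketch of that quick fix (mysqli is used for illustration; the connection details and query are made up):

<?php
// Free each result set as soon as it has been consumed.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$result = $db->query('SELECT id, amount FROM big_table');

$totals = array();
while ($row = $result->fetch_assoc()) {
    $id = $row['id'];
    $totals[$id] = (isset($totals[$id]) ? $totals[$id] : 0) + $row['amount'];
}

$result->free(); // release the result set's memory before the long run continues
unset($row);     // drop leftover references too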
In terms of OS capabilities, you should keep in mind that 32-bit systems, which you are likely running on, can only address up to 4GB of RAM without special handling. Often the limit can be much less depending on how it's used. Some Windows chipsets and configurations can actually have less than 3GB available to the system, even with 4GB or more physically installed. You should check to see how much your system can address.
You say that you've increased the memory limit several times, so obviously this job is growing larger and larger in scope. If you're up to 1.5 GB, then even installing 2 GB more RAM sounds like it will be just a short reprieve.
Have others encountered this kind of issue? And what were the solutions?
I think you probably already know that the only real solution is to break down and spend the time to optimize the script soon, or you'll end up with a job that will be too big to run.