Suppose I have two scripts and my memory_limit option is set to 64M (for example). If I run script 1 and it takes 40M, and I simultaneously run script 2, how much memory does script 2 have free?
Does every script get up to 64M, or do they share this memory?
Each PHP script running independently is subject to its own memory limit (as set by the configuration option memory_limit). This limit is inherited by all included scripts (as those are dependent on the parent script).
An example - two scripts running in parallel:
you open a webpage /script1.php
you also open a webpage /script2.php
As these two are completely independent (there is no simple way to discover that "the other script" is running, or even that it exists), they will get a 64M limit each - so the amount of memory they are allowed to use is 64M for one, which comes out to 128M for two. (To clarify, it is not possible to "share" this memory between scripts: if script1.php only consumes 1MB, it can not "give" the rest of its limit to script2.php)
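A minimal sketch of this (filenames as above; assuming a 64M limit in php.ini) - each request can only inspect its own limit and usage:

<?php  // script1.php - one request, subject to its own 64M limit
echo ini_get('memory_limit'), "\n";   // "64M", read from php.ini
echo memory_get_usage(true), "\n";    // bytes used by this request alone
// script2.php, requested at the same time, would print the same
// independent values; neither request can see or borrow the other's quota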
A different scenario - one script including another script:
you open a webpage /script1.php
that script has a line require('script2.php')
In this case, there is only a single 64MB limit: this is still considered a single script, no matter how many files it include()s or require()s. In other words, script2.php inherits the limit (as well as the other PHP settings) from script1.php and all the memory used here is counted towards that limit.
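A minimal sketch of that inheritance (same hypothetical filenames):

<?php  // script1.php
require 'script2.php';

<?php  // script2.php
echo ini_get('memory_limit');   // prints "64M" - the parent's limit, inherited
echo memory_get_usage(true);    // includes memory already used by script1.php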
Note that it is possible to change this limit from inside the script (if your server's configuration allows this - most do). Using ini_set('memory_limit', '128M'); sets the new limit to 128 MB - but all the memory used by the script so far still counts towards this limit.
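For example (a minimal sketch, assuming the server permits changing memory_limit at runtime):

<?php
$buf = str_repeat('x', 30 * 1024 * 1024);   // roughly 30 MB already allocated
ini_set('memory_limit', '128M');            // raise the ceiling to 128 MB
echo memory_get_usage(true);                // the ~30 MB above still counts
                                            // towards the new 128 MB limit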
Scripts including other scripts and dynamically changing the memory limit:
you open the webpage /script1.php (in script1.php, limit 64 MB set from configuration)
that runs require('script2.php') (in script1.php, limit 64 MB inherited)
that runs ini_set('memory_limit','200M') (in script2.php, changed explicitly to 200 MB; note that PHP's shorthand suffix is M, not MB)
that runs require('script3.php') (in script2.php, limit 200 MB as set above)
that runs require('script4.php') (in script3.php, limit 200 MB inherited)
that runs ini_set('memory_limit','-1') (in script4.php, changed explicitly to "no limit")
that runs require('script5.php') (in script4.php, "no limit on memory" inherited)
Note: It is possible to set a lower limit than the current one, but this risks immediately overrunning it ("we're using 80 MB" - "set limit to 64 MB" - "out of memory error"); because of this risk, it is rarely attempted.
Note also that memory_limit is a configuration setting and as such it can be set in various places; or its modification can be prevented by the system administrator.
From the PHP Manual:
This sets the maximum amount of memory in bytes that a script is allowed to allocate. This helps prevent poorly written scripts from eating up all available memory on a server. Note that to have no memory limit, set this directive to -1.
Meaning each script gets its own 64M of memory.
Related
Could it be that a POST request is limited in size? I have a large procedure whose output I want to cache. Basically, I want to store a large HTML table in the cache because, with the growth of a particular project, the number of queries and thereby the response time is getting out of hand.
Now I'm sending the large output, which is retrieved by one AJAX call, in another AJAX call (after the first one completes), but I only get a small piece of the data back. I think my AJAX function is correct because the stored output is always the same (in characters). But I'm missing about 90% of the output in the cache...
There is an 8 MB maximum size for the POST method by default (it can be changed by setting post_max_size in the php.ini file).
"Could it be that a POST request is limited to size?"
Yes, there is a PHP setting: post_max_size
I just ran into this problem myself and found that since PHP 5.3.9 there is a new setting available which restricts the total number of POST variables (not just their size).
max_input_vars = 1000
This may have been the same issue you were running into if you were using a nightly or release candidate version of PHP at the time.
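If you suspect this, here is a rough runtime check (a heuristic sketch only - COUNT_RECURSIVE also counts nested array elements, so it can overestimate):

<?php
$max = (int) ini_get('max_input_vars');
if ($max > 0 && count($_POST, COUNT_RECURSIVE) >= $max) {
    // PHP truncates the input and raises a warning when the cap is exceeded
    error_log("POST data may have been truncated by max_input_vars ($max)");
}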
Look for these settings in your php.ini
; Maximum size of POST data that PHP will accept.
; http://php.net/post-max-size
post_max_size = 8M
; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
memory_limit = 128M
post_max_size specifies the maximum size of a POST, and since it has to be contained in memory, memory_limit has to be bigger. The memory will have to hold the running program and all the heap variables, including the POST data.
If you increase only post_max_size to (or near) the memory limit and forget to increase memory_limit, the effective POST size will be capped at a lower value once memory is exhausted.
In this example you can see the default settings.
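A small runtime sanity check for the relationship between the two settings (a sketch; the toBytes() helper is illustrative, and a memory_limit of -1 means "no limit"):

<?php
// convert PHP's shorthand notation ("8M", "1G", ...) to bytes
function toBytes($v) {
    $n = (int) $v;
    switch (strtoupper(substr($v, -1))) {
        case 'G': $n *= 1024;   // gigabytes fall through to megabytes...
        case 'M': $n *= 1024;   // ...then to kilobytes...
        case 'K': $n *= 1024;   // ...then to bytes
    }
    return $n;
}

$mem = ini_get('memory_limit');
if ($mem !== '-1' && toBytes(ini_get('post_max_size')) >= toBytes($mem)) {
    error_log('post_max_size should stay comfortably below memory_limit');
}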
Is there any fixed limit on how long a background task can run?
This is how I run the script (background task) manually:
php /var/www/html/app_v2/console.php massbulkinsert app.example.com 10 > /dev/null &
This script processes a huge data set; it takes about 1 hour to complete.
The first time it stopped at the 10100th record; the second time it stopped at the 9975th record. There is no pattern to when it terminates.
In top, the script's and MySQL's processes were at 98%, 100% and 130% CPU most of the time, and there was about 200 MB of free memory. There is enough disk space.
It's a bit of a wild guess, but usually when you succeed with a smaller amount of data and then get crashes with larger amounts, it has to do with memory issues.
You should have a look at /etc/php5/cli. There is probably also a folder named cgi in there - depending on how your framework executes the background script, I would expect either of these two configurations to be used.
Files with the .ini extension are PHP configuration files, and these are among the values you're interested in (the values below are the defaults on Debian 8):
; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
; Note: This directive is hardcoded to 0 for the CLI SAPI
max_execution_time = 30
; Maximum amount of memory a script may consume
; http://php.net/memory-limit
memory_limit = -1
Note that there is also a timeout for how long the script can spend reading data sent to it through, say, a pipe (max_input_time). But judging from your command, you're not piping values to it via stdin - you're most likely reading a file already on the disk.
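If the CLI limits look fine, a quick way to confirm or rule out a memory problem is to log usage as the job progresses. A minimal sketch ($records and processRecord() are hypothetical stand-ins for the existing loop in console.php):

<?php
foreach ($records as $i => $record) {
    processRecord($record);   // the existing per-record work
    if ($i % 1000 === 0) {
        error_log(sprintf('row %d: %.1f MB in use, %.1f MB peak',
            $i,
            memory_get_usage(true) / 1048576,
            memory_get_peak_usage(true) / 1048576));
    }
}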
Hope it helps
I have 2 servers
Both servers have the same PHP memory_limit of 128M.
My Dev Server runs a script just fine, while on my prod server I am receiving a Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes) in ...
My question is: what are other reasons I would be running out of memory in the prod environment even though my PHP memory_limits are the same?
Preface
PHP is a module that runs on top of Apache (the HTTPD server); this involves linking the PHP interpreter against a library of hooks published by the web server.
Cause
Memory can be exhausted when running scripts allocate RAM, reach the threshold, and trigger errors like this one.
A typical example is a big loop that keeps saving data in memory until it overruns the limit.
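A trivial sketch of that pattern (fetchNextRow() is a hypothetical data source):

<?php
$rows = array();
while ($row = fetchNextRow()) {
    $rows[] = $row;    // every row is kept in memory...
}                      // ...until the limit is hit mid-loop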
Possible optimizations you can make:
memory_limit = 32M in your server's main php.ini file (recommended, if you have access)
php_value memory_limit 32M in the .htaccess file in your script's directory
These are some workarounds for pages where you run out of memory:
ini_set('memory_limit', '-1'); overrides the default PHP memory limit (use it on individual PHP pages wherever you need extra memory)
You can also do some optimization on the HTTP server (apache2.conf or httpd.conf):
The RLimitCPU, RLimitNPROC and RLimitMEM parameters can be adjusted.
Since you are running out of memory, you can adjust RLimitMEM.
Syntax: RLimitMEM soft-bytes [hard-bytes]
Example: RLimitMEM 1048576 2097152
This directive sets the soft and hard limits for maximum memory usage of a process in bytes. It takes one or two parameters. The first parameter sets the soft resource limit for all processes. The second parameter sets the maximum resource limit. Either parameter can be a number, or "max", which indicates to the server that the limit should match the maximum allowed by the operating system configuration. Raising the maximum resource limit requires the server to be running as the user "root" or in the initial start-up phase.
You can put these lines in .htaccess files too, since on shared hosting you usually don't have access to the php.ini and httpd.conf files.
I have a relatively small store, about 20k SKUs, all simple products. I'm using Magento 1.7.2 but had the same problem with all the older versions. I simply cannot export my products to a CSV: I get an out-of-memory error when running directly from the dataflow profiles in the Magento backend, and the same error when running it from the shell.
Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 71 bytes) in /home/public_html/app/code/core/Mage/Eav/Model/Entity/Attribute/Source/Table.php on line 62
I've increased the memory limits and execution times to 512M in Magento's .htaccess, in Magento's php.ini, and in my VPS PHP configuration. It still burns through it all in about 4 minutes and runs out of memory.
I'm so confused; my entire database (zipped) is only 28 MB! What am I missing to make Magento's export-all-products function work?
Magento dataflow does have a tendency to use massive amounts of memory, making exports on large stores difficult. For stores with large product catalogues it is often a lot quicker and easier to write a script that exports directly from the database rather than going through dataflow.
This might be a problem with your .htaccess file(s) not overriding the memory_limit setting globally set in php.ini.
Another option is to set memory_limit to unlimited in your index.php for testing; then you will know whether the changes in .htaccess are taking effect or not.
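For example, at the very top of index.php (a testing-only sketch; remove it afterwards):

<?php
ini_set('memory_limit', '-1');   // temporarily lift the limit for this request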
I solved this problem by exporting 500, 1000 or as many as I wanted at a time (with a custom export script).
I made a file that received $start and $productsToExport as parameters. The file took the collection of products and then used
LIMIT $start * $productsToExport, $productsToExport
This script only returned the number of products exported.
I made a second, master script that called the first file recursively via AJAX, starting with the parameters $start = 0 and $productsToExport = 500. When the AJAX call finished, it did the same with $start = 1, and so on, until no products were left.
The advantage of this is that it doesn't overload the server (one AJAX call runs only after the previous one has finished), and if an error occurs, the script continues. Also, the memory_limit and max_execution_time limits apply to each small request separately, so they stay safe.
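A minimal sketch of such a batch endpoint (everything here - the DSN, credentials, table name, and the appendToCsv() helper - is hypothetical, not Magento's actual API):

<?php
// export.php?start=0&count=500 - exports one batch per request
$start = isset($_GET['start']) ? (int) $_GET['start'] : 0;
$count = isset($_GET['count']) ? (int) $_GET['count'] : 500;

$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_EMULATE_PREPARES, false); // so LIMIT binds as ints

$stmt = $pdo->prepare('SELECT * FROM products LIMIT :offset, :batch');
$stmt->bindValue(':offset', $start * $count, PDO::PARAM_INT);
$stmt->bindValue(':batch', $count, PDO::PARAM_INT);
$stmt->execute();

$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
appendToCsv('export.csv', $rows);   // hypothetical helper
echo count($rows);                  // the calling script stops when this is 0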
If by 20k SKUs you mean 20,000, then this is totally possible. The export is very memory-hungry, unfortunately. I always increase the memory_limit to 2000M in this case; then it takes a while to create the file, but it succeeds in the end.
We have recently replaced our flash-based uploader with this ajax one:
http://valums.com/ajax-upload/
Unfortunately, this uses the POST mechanism to transfer data to the server, which means that if I have a 50 MB upload limit, I need to allow at least 50 MB extra in my PHP configuration, as the data ends up in the $_POST array.
Unfortunately, this means that all pages now have this huge added limit, which is not acceptable. I cannot even set the limit for the page on the fly, since ini_set would run AFTER the $_POST processing has already occurred. Can anyone think of an alternative solution?
In addition, anything exceeding the max limit causes a PHP segfault / fatal error! Any ideas?
You can do something like this in Apache in the .conf file for your site's configuration:
<Location /upload.php>
php_value memory_limit 60M
</Location>
That makes the higher memory limit apply only to scripts invoked as /upload.php in the URL. You can do similar overrides with <Files>, <FilesMatch>, <Directory>, etc.
The override has to be done at this level since, as you've found out, by the time ini_set would get executed, the script has already been killed.
The memory limit will not preallocate the 50 MB; it's a hard cap on how much the script may use.
If no script was failing previously, it's quite probable that you won't even notice the increased memory limit.
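A quick sketch to see that nothing is preallocated:

<?php
ini_set('memory_limit', '512M');
echo memory_get_usage(true);   // still only a few hundred KB in use -
                               // the limit is a ceiling, not a reservation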