I have an issue with my Capistrano deployment when this command runs:
/usr/bin/env composer install --prefer-dist --no-interaction --no-progress --optimize-autoloader
It takes a few minutes before I receive this error message:
composer stdout: Warning: proc_open(): fork failed - Cannot allocate memory in phar:///usr/local/bin/composer/vendor/symfony/console/Application.php on line 954
composer stderr: Loading composer repositories with package information
The following exception is caused by a lack of memory and not having swap configured
Check https://getcomposer.org/doc/articles/troubleshooting.md#proc-open-fork-failed-errors for details
If I go to the current release of the project on the server, remove the vendor folder, and then run "composer install", it works.
Locally, on my computer (MAMP environment), I run this command with a memory limit:
php -dmemory_limit=2G /usr/local/bin/composer install
It works as well.
The problem occurs only when I deploy with Capistrano.
Any idea?
For information:
I checked the server; it has 4 GB of RAM.
Swap is activated.
The PHP memory limit is set to -1.
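For reference, these are standard commands to check the RAM, swap, and PHP CLI memory limit (nothing project-specific assumed):
free -m                                          # RAM and swap usage in MB
swapon --show                                    # active swap devices/files
php -r "echo ini_get('memory_limit').PHP_EOL;"   # current CLI memory_limit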
Thanks,
UPDATE: I just upgraded my server to 6 GB of RAM. composer install seems to be working ... but why does it need so much memory?
Related
I bought the source code from https://codecanyon.net/item/addchat-laravel-pro/25750268 and tried to install it on my hosting server.
My hosting provider is GoDaddy.
https://addchat-laravel-pro-docs.classiebit.com/docs/1.1/installation
This is the install guide for AddChat.
Using PuTTY, in a terminal, I run the installation command from the guide above, and afterwards I see the result shown in the attached screenshot.
If you have experience with addchat-laravel-pro and GoDaddy, please let me know.
I am hoping for your help.
Thanks
When I run the command "free -m", I see the output shown in the attached screenshot.
It's better to upgrade to Composer v2; it's very fast.
Try:
COMPOSER_MEMORY_LIMIT=-1 composer require PACKAGE_NAME
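To move from Composer 1 to 2, the built-in self-update command should do it (assuming composer was installed with the official installer and you can write to the binary):
composer self-update --2   # switch to the latest 2.x release
composer --version         # should now report Composer version 2.x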
Problem:
You got the following error on your production server after using composer require classiebit/addchat-laravel-pro.
./composer.json has been updated
Loading composer repositories with package information
Updating dependencies (including require-dev)
Killed
That is because your server is running out of memory.
Solution:
You need to upgrade your server to one with more RAM.
You should install that package first on your local development machine and git push it to your remote repository. Once you have git pulled on your production server, run composer install rather than composer update.
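A rough sketch of that workflow (the branch name and the composer install flags are just examples):
# on the local development machine
composer require classiebit/addchat-laravel-pro
git add composer.json composer.lock
git commit -m "Add addchat-laravel-pro"
git push origin master

# on the production server
git pull origin master
composer install --no-dev --optimize-autoloader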
UPDATED:
$>which composer
/usr/local/bin/composer
$>php -d memory_limit=512M /usr/local/bin/composer require classiebit/addchat-laravel-pro
UPDATED AGAIN:
The solution is to use the COMPOSER_MEMORY_LIMIT environment variable, setting its value to -1. It can be added to the current Terminal session with
$ export COMPOSER_MEMORY_LIMIT=-1
Then run:
$ COMPOSER_MEMORY_LIMIT=-1 composer require classiebit/addchat-laravel-pro
UPDATED (2020/12/20):
I see that 49930 MB of memory is being used for cache. You could try flushing the memory cache on your system:
sync; echo 3 > /proc/sys/vm/drop_caches
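Note that with plain sudo the redirection is done by your own (unprivileged) shell, so if you are not root you may need to wrap the whole command:
sudo sh -c 'sync; echo 3 > /proc/sys/vm/drop_caches'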
I am using Symfony 4...
I run this command:
php -d memory_limit=-1 composer.phar require form
The script runs successfully through these points...
./composer.json has been updated
Loading composer repositories with package information
Updating dependencies (including require-dev)
Nothing to install or update
Package symfony/lts is abandoned, you should avoid using it. Use symfony/flex instead.
Generating autoload files
ocramius/package-versions: Generating version class...
ocramius/package-versions: ...done generating version class
Executing script cache:clear [KO]
[KO]
Script cache:clear returned with error code 255
!!
!! // Clearing the cache for the dev environment with debug
!! // true
Then I get this error:
!! Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32768 bytes) in /Applications/XAMPP/xamppfiles/htdocs/xxxx/vendor/symfony/var-dumper/Cloner/Data.php on line 306
I can run this script without issue:
php -d memory_limit=-1 bin/console cache:clear
I do not know how to get around this, because my command already tells PHP to ignore the memory limit. What can I do to get past this error? I cannot get anything installed at this point.
When Composer executes a script, it runs in a separate PHP process spawned by Composer, so your command-line directive doesn't apply to it.
The typical solution would be to configure PHP via its configuration file, so that all instances are affected.
If you don't have access to change the PHP configuration, your best option is probably to run Composer with --no-scripts and then run the necessary scripts individually with the memory limit option.
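A rough sketch of that approach for the case above (reusing the commands from the question; --no-scripts skips the cache:clear hook so it can be run separately with the limit lifted):
php -d memory_limit=-1 composer.phar require form --no-scripts   # install without running Composer scripts
php -d memory_limit=-1 bin/console cache:clear                   # then run the failing script manually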
Setting the memory limit for the whole PHP process can be risky, as it could lead to memory problems when executing PHP in a web context, so be sure to only update the configuration for the CLI. Composer also provides a configuration point via the COMPOSER_MEMORY_LIMIT environment variable. You can either set it with export COMPOSER_MEMORY_LIMIT=-1 (add it to your .bashrc if you always want it enabled), or just prepend it to the command itself if you only need it once during installation:
COMPOSER_MEMORY_LIMIT=-1 composer require form
You can find all the different ways of getting around memory limit-related problems in Composer's Troubleshooting Guide as well.
I have a small project made in Symfony 2. When I try to build it on my server, it always fails while unzipping Symfony. The build used to be OK, and suddenly Composer won't unzip Symfony even though I didn't change anything. I tried building with Jenkins and also manually from bash, with the same result. It's not a permissions problem, and the internet connection on my server is OK.
Loading composer repositories with package information
Installing dependencies (including require-dev) from lock file
- Installing symfony/symfony (v2.3.4)
Downloading: 100%
[Symfony\Component\Process\Exception\ProcessTimedOutException]
The process "unzip '/path/vendor/symfony/symfony/6116f6f3d4125a757858954cb107e64b' -d 'vendor/composer/b2f33269' && chmod -R u+w 'vendor/composer/b2f33269'" exceeded the timeout of 300 seconds.
Check with composer update/install -o -vvv whether the package is being loaded from Composer's cache.
If yes, try clearing composer's cache or try adding --cache-dir=/dev/null.
To force downloading an archive instead of cloning sources, use the --prefer-dist option in combination with --no-dev.
Otherwise you could try raising composer's process timeout value:
export COMPOSER_PROCESS_TIMEOUT=600 # default is 300
composer config --global process-timeout 2000
The easiest method is to add a config option to the composer.json file: set process-timeout to 0. That's all; it works anywhere.
{
.....
"scripts": {
"start": "php -S 0.0.0.0:8080 -t public public/index.php"
},
"config": {
"process-timeout":0
}
}
Composer itself imposes a limit on how long it allows remote git operations to take. A look at the Composer documentation confirms that the COMPOSER_PROCESS_TIMEOUT environment variable governs this. The variable defaults to 300 (seconds), which is apparently not enough for a large clone operation over a slow internet connection.
Raise this value using:
COMPOSER_PROCESS_TIMEOUT=2000 composer install
It's an old thread, but in my case the reason for the timeout was a running PHP debugger (PhpStorm was listening for Xdebug connections). When I closed PhpStorm or disabled the Xdebug extension, no timeout occurred.
Old thread, but it was a new problem for me. None of the solutions here worked when trying to install google/apiclient (it failed on google/apiclient-services) on an Ubuntu VM inside a Windows 10 host.
After noticing Windows' "Antimalware Service Executable" taking up considerable CPU cycles during the composer install/update, I disabled "real-time protection" on the Windows 10 machine, and my composer update/install worked!
Hope that helps someone.
Deleting composer cache worked for me.
rm -rf ~/.composer/cache/*
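If you prefer not to delete the directory by hand, Composer has a built-in command for this:
composer clear-cache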
The Symfony Process component has its process timeout set to 60 seconds by default. That's why you get errors like this:
[Symfony\Component\Process\Exception\ProcessTimedOutException]
The process "composer update" exceeded the timeout of 60 seconds.
Solution
Set the timeout to 5 minutes or more:
use Symfony\Component\Process\Process;

$process = new Process("composer update"); // in Symfony 4.2+, use Process::fromShellCommandline('composer update')
$process->setTimeout(300); // 5 minutes
$process->run();
I agree with most of what has been suggested above, but I had the same issue, and what worked for me was deleting the vendor folder and re-running composer install.
Regards
None of the solutions worked for me on Windows 10 WSL Ubuntu (disabling the firewall, removing debuggers, clearing the cache, increasing the timeout, deleting vendor).
The only thing that worked was deleting vendor and composer.lock from the main machine, copying composer.json to a fresh machine, installing PHP and Composer there, running composer install (it should take less than 1 second to execute), then copying the vendor directory back to the original machine and running composer update.
The problem here is slow NFS. Composer writes its cache into an NFS directory. You should install Composer globally and point the cache at a different path.
This doesn't work:
php composer.phar install
Using this:
composer install
Before running this, you must set up Composer globally. See https://getcomposer.org/doc/00-intro.md#globally
Also, you must add these lines to your config.json:
"config": {
"cache-dir": "/var/cache/composer"
}
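Equivalently, the cache directory can be set from the command line (this writes the same setting into your global Composer config):
composer config --global cache-dir /var/cache/composer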
Works for me.
I am installing Passport in Laravel with Composer using the command:
$ composer require laravel/passport
Using version ^6.0 for laravel/passport
./composer.json has been updated
Loading composer repositories with package information
Updating dependencies (including require-dev)
mmap() failed: [12] Cannot allocate memory
mmap() failed: [12] Cannot allocate memory
Fatal error: Out of memory (allocated 483401728) (tried to allocate 8388608 bytes) in phar:///opt/cpanel/composer/bin/composer/src/Composer/DependencyResolver/Solver.php on line 220
I got the above errors. Please help me if you have any solutions.
I was able to install Passport by temporarily removing PHP's memory limit. I found this idea here: https://laravel.io/forum/02-11-2014-composer-running-out-of-memory
$ php -d memory_limit=-1 /usr/local/bin/composer require laravel/passport --verbose --profile
I like this solution because it overrides the PHP limit only once, so it allows you to push forward without any lasting effects. This also lets you wait and see whether you continue to get issues later, such as in the production environment, etc.
The default PHP installation allows roughly 500 MB of RAM, I believe, and when I ran the above command, it consumed 712 MB of RAM.
Extra note
At the above URL, there is also mention of committing the composer.lock file for the production environment. Historically, this can be a concern if, for example, you develop locally on macOS or Windows and your production environment is Linux. It might not be likely, but it is possible to run into issues because some packages choose their dependencies based on the detected operating system. If you commit the lock file, you are effectively caching the resolved packages/versions. The performance benefits stem from that, but caching breeds rigidity.
I'm not sure how likely this really is. I'm saying this about Composer, but I've seen it happen with npm and JavaScript.
Try the following steps:
sudo fallocate -l 2G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
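Optionally, verify that the swap is active and make it persist across reboots (the /swapfile path matches the commands above):
swapon --show                                                # should list /swapfile
free -m                                                      # the swap line should now show roughly 2 GB
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab   # keep the swap file after a reboot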
Then, in the root of your project, add the package to the require section of composer.json (for example "require": {"laravel/ui": "^1.1"}) and run composer update.
Source
Memory limit errors.
Composer may sometimes fail on some commands with this message:
PHP Fatal error: Allowed memory size of XXXXXX bytes exhausted <...>
In this case, the PHP memory_limit should be increased.
Note: Composer internally increases the `memory_limit` to 1.5G.
To get the current memory_limit value, run:
php -r "echo ini_get('memory_limit').PHP_EOL;"
Try increasing the limit in your php.ini file (ex. /etc/php5/cli/php.ini for Debian-like systems):
Use -1 for unlimited or define an explicit value like 2G
memory_limit = -1
Composer also respects a memory limit defined by the COMPOSER_MEMORY_LIMIT environment variable:
COMPOSER_MEMORY_LIMIT=-1 composer.phar <...>
Or, you can increase the limit with a command-line argument:
php -d memory_limit=-1 composer.phar <...>
This issue can also happen on cPanel instances, when the shell fork bomb protection is activated. For more information, see the documentation of the fork bomb feature on the cPanel site.
This answer might also be useful.