Finding the source of a VPS memory spike (LemonStand) - php

I'm running LemonStand, a PHP-based e-commerce application, on a DreamHost VPS.
About once or twice a week I get an email that my VPS went more than 10% over its memory allocation and is being automatically rebooted.
How do I find the source of this memory spike? I tried to get New Relic installed to hunt it down, but it apparently didn't install correctly when I ran the install command via SSH, and I just don't know where to go next with it.
Is there another application or another way to hunt down the source of my memory spike?
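In the absence of New Relic, a low-tech way to hunt this down is to log the top memory consumers once a minute from cron, so that when the reboot email arrives you can see what was resident just before. A minimal sketch, assuming a GNU/Linux `ps` and `free` (the log path and cron schedule are placeholders):

```shell
#!/bin/sh
# memlog.sh - snapshot overall memory use and the top processes by RSS.
# Run it from cron, e.g.:  * * * * * /home/user/memlog.sh
LOGFILE="${LOGFILE:-$HOME/mem-spike.log}"
{
  date
  free -m                         # overall memory picture
  ps aux --sort=-rss | head -n 6  # header plus top 5 processes by RSS
  echo "----"
} >> "$LOGFILE"
```

After the next spike, the tail of mem-spike.log usually makes it obvious whether the culprit is a pile-up of PHP processes (a runaway cron job, a crawler hammering uncached pages) or something else like MySQL.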

Related

Docker 2.3.0.4 with Devilbox on Windows 10 WSL2 is extremely slow

I have Windows 10 2004, an 8-core AMD CPU and 12 GB RAM, and I run an existing WSL2 Debian instance just fine. I can run a typical web stack (php-fpm, nginx, MariaDB, redis-server) with existing WSL and the performance is fine. I have been reading that WSL2 and Docker are much better now and wanted to try it out.
I installed Docker 2.3.0.4 with WSL2 and installed Devilbox as the web stack.
I have limited the WSL process to 4 GB and 4 CPUs using the .wslconfig file, and that's all cool and works fine.
What doesn't work fine is that a simple PHP page (e.g. <?php echo time(); ?>) can take 15-30 seconds to show up! Anything that talks to a database incurs a 90-second load time. Running a stock-standard empty WordPress site is out of the question. Running the same script / site / database on nginx/Debian on WSL2 works perfectly.
I can't run the Docker Desktop application after Windows rebooted (it worked before the reboot). The icon the Docker installer placed on my desktop doesn't do anything when I launch it, and I don't see a process turning up in Task Manager. Nothing seems to crash, it just doesn't work. I can right-click on the Docker icon in the tray and get to the dashboard that way. I generally work at the command line anyway, so it's no big deal; I thought I'd mention it.
The Devilbox installation sets up a local web server and that all works reasonably well. I had to docker-compose up several times to get it to pull the containers properly, but then I do have terrible internet. The web interface is pretty snappy, and I can launch tools like phpMyAdmin and they LOAD fine. But if I try to restore a 10 MB database with 100 tables, it will time out and crash and burn. Restoring the same database through the shell via mysql works fine, even if my disk load goes to 90% and stays there for 5 minutes.
Any virtual-host web server I create performs abysmally. PHP pages take around 90 seconds to show up; a page with phpinfo() might take 45-90 seconds of white screen.
HOW do I figure out why Docker is just so bad compared to straight-up Linux on WSL2 in the same environment? I'm guessing it's something to do with the I/O, but Windows Task Manager says the disk I/O is averaging 2% load.
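For reference, the 4 GB / 4 CPU limit the question mentions amounts to a couple of lines in %UserProfile%\.wslconfig on the Windows side (apply with wsl --shutdown and restart the distro):

```ini
[wsl2]
memory=4GB
processors=4
```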
I think simply that your 2004-generation processor does not support any of the virtualization features.
Docker runs a LAMP stack reasonably fast on my mediocre i5 setup.
During the past months I had some performance issues and tried various things, like copying source from the regular disk (i.e. C:, a.k.a. /mnt/c/) to inside a WSL directory (i.e. inside \\WSL$\UBUNTU\home\user\). git and PhpStorm had various hard-to-diagnose issues, although there was an improvement in latency, and as an additional bonus the file system is actually case sensitive then, so 1:1 with the server environment, whereas otherwise it's on NTFS.
However, the biggest boost to performance was minimizing the number of mapped directories. Initially I had set it up so that the MariaDB database would be accessible to the host machine:
mariadb:
  image: mariadb:10
  volumes:
    - ./docker/mariadb:/shared # removed this for a huge boost
Removing it made working with Docker bearable.
I think it's a problem of Docker volumes and Windows file sharing. Even though you use WSL2, it doesn't mean you use the Linux file system. If you use files from the Windows side, for example d:/workspace/myproject, they are still Windows files and folders, and Windows will read and scan them before anything executes from Docker.
I found the solution: install your project inside the WSL distribution. Open your Linux distribution (for me it's Ubuntu 18.04); you will start at /home/%USER_PATH%. Install your project there, for example /home/mypc/workspace/myproject.
You can connect your IDE to WSL; in VS Code this is supported by the Remote - WSL extension. That makes your project live on the Linux file system only, not shared with Windows, and it is much faster.
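The move described above can be sketched as a few shell commands inside the distro; the source and destination paths are the hypothetical ones from the answer, and the snippet is guarded so it is a no-op if the Windows mount isn't present:

```shell
#!/bin/sh
# Copy the project off the slow Windows mount (/mnt/...) into the
# Linux file system, where Docker bind mounts are fast.
SRC="${SRC:-/mnt/d/workspace/myproject}"
DST="${DST:-$HOME/workspace/myproject}"
if [ -d "$SRC" ]; then
  mkdir -p "$(dirname "$DST")"
  cp -r "$SRC" "$DST"
  echo "copied $SRC -> $DST; open it with 'code .' from inside WSL"
else
  echo "source $SRC not found; set SRC to your project path"
fi
```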

Devilbox (Docker) extremely slow on Mac

I started to use Devilbox on Mac instead of Valet Plus. Devilbox is great, but it is extremely slow. I found "Performance issues on Docker for Mac" in the documentation, so I added MOUNT_OPTIONS=,cached to the .env file. The result is better performance, but still too slow (30 seconds to load a page in Symfony). Devilbox as such runs fast, but projects with a cache folder do not.
This is my current Docker setting (I enabled the maximum resources):
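For context, the MOUNT_OPTIONS=,cached switch in the Devilbox .env corresponds to Docker for Mac's per-volume consistency hint in Compose; a hypothetical bind mount with the same hint looks like:

```yaml
# "cached" relaxes host/container consistency so reads inside the
# container are served faster; the paths here are placeholders.
services:
  php:
    volumes:
      - ./data/www:/shared/httpd:cached
```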
That might be related to this answer, which I posted last week.
Docker in macOS is very slow
Because of macOS, the Docker client doesn't match real Docker performance on Linux.
OK, I tried docker-sync and did not notice any speed-up. I decided to install Valet Plus instead, as I need multiple PHP versions (easily switchable), MailHog, Xdebug, SSL on local domains, DnsMasq, etc. All of this comes out of the box in Valet Plus. I thought it would be much better to develop in Docker, but Symfony uses really a lot of cached files on disk, so this was really unusable (page loads were between 30 and 60 seconds).

Magento 2 goes terribly slow (Developer mode)

Recently I started developing Magento 2 projects.
First I tried on Windows with XAMPP and it was a mess... every page refresh was a nightmare, about 30-40 seconds to load the page. I read that the Windows file system is very slow working with Magento because of the large directory structure it has, and the article almost forced you to use Linux for developing Magento projects.
The problem is I need Windows for other company apps that only work on Windows. I tried to install a virtual machine with VirtualBox, and it improved a bit... but the fact that I'm working in a virtual machine annoyed me...
The next solution, which I'm currently working with, is Vagrant. Okay, I feel good developing this way, but it's still slow... 15-20 s...
My Vagrant config has 5120 MB (the PC has 8 GB) and uses all 4 of my PC's cores.
I'm feeling so bad working like this... when I was working on my previous projects, with Symfony/Laravel/CodeIgniter, it was like:
write some lines of code, tab to browser, F5, INSTANTLY see changes.
On M2: write some lines of code, tab to the browser, F5, wait... wait... okay, now it refreshes the page, but it's not loaded, wait... wait... hmmm, almost... okay. No changes, but I cleaned the cache... ohhh, I guess I had to remove static files too. Go for it... wait again...
God... Is there no way to make M2 go faster? I'm only asking for 5 s or something like that... I just feel so dumb staring at the screen waiting all the time...
For clarification, I'm only asking about developer mode. I had to install another Magento project in production mode for testing things faster, and then it's fluid as hell compared with developer mode... because... omg... just try to do an order workflow again and again...
Well, that's all... The only thing I didn't try is using a Linux environment on the computer... but that's just the same as using Vagrant... I don't understand... how are you developing, M2 developers? Especially frontend developers... I don't believe they are working the same way as me... waiting 20 s for pages to load + cleaning the cache + removing static files, etc.
Details: I tried everything with Vagrant but it doesn't improve. I'm currently on Ubuntu 15.04, Apache 2.4, PHP 5.6 (I tried 7 but it's still the same), MySQL 5.6.
This is the network tab:
http://i.imgur.com/HG7mbeX.png
2018 Update, Magento 2.2.4
Vagrant + Windows + Magento2 = disaster. Vagrant + Apple + Magento2 = disaster.
Ubuntu + Magento2 = cooking on gas.
Simple modules, e.g. a widget, take many days instead of the expected 2-3 hours, and it is not possible to remember what you are doing if it takes a minute to open a page, particularly if you also have to clear caches, compile, upgrade or do anything else that should take no time at all.
This I have experienced first hand, from working in an office where the options are Mac or Windows. After spending a whole day trying to change the template directive and failing to make one configuration change in 8 hours, I thought about giving it a go on a linux box to see if I had gone mad or if this Vagrant contrivance is as helpful as that drunken bum sleeping rough in the park down the road.
The aged linux box with anaemic RAM, an old SSD, stock Apache and no fancy cache things completed the task without problem, I was able to switch between developer and production modes effortlessly and get what had taken me days to not do done in minutes.
The work machine was 8th generation i7, the Vagrant setup was very much someone's baby and a lot of time had been spent building the beast. Yet tectonic plates move faster. Vagrant and virtualisation might be fashionable but it is no use for M2 development. In fact I installed M2 and did all the db and vhost setup for it in less time than it takes for a Vagrant box to build.
As for performance, since M2 on a basic linux setup is 10x faster than some clumsy Vagrant effort, it is easy to see where the real speed problems of Magento 2 are. If you fire up Lighthouse in Chrome you will see TTFB is absolutely fine but the performance halves if you minify and merge the JS + CSS. This is because M2 has a megabyte of scripts to download. This is the performance killer. If you are working on a Vagrant box then you will never see this and not have the speed to fix it. By fix it I mean write a proper theme that doesn't have nonsense such as jQuery loading on every page.
For production you need something that scales so you can get the normal speed enhancements going for that, e.g. Redis, opcode caching, Varnish, tweaked php-fpm, tweaked MySQL/MariaDB. If you are developing on Linux then you can test these things on localhost knowing they will work fine on production. With that abomination that is Vagrant you will be dabbling with these optimisations prematurely because you are hoping and praying for a performant machine because you need to get work done. However, in so doing, and with the absence of native speed, you will not get anything done.
If you don't have a spare machine to put linux on then just go to the local tip, get any PC, shove an SSD in it and you are good to go.
This is my recipe for developing themes/modules in localhost for Magento 2.2 and 2.3:
MacBook Pro
Valet Plus (Nginx, MySQL 5.7, PHP7.1 and 7.2 - you can easily switch between PHP versions with valet use 7.1 or valet use 7.2) https://github.com/weprovide/valet-plus
memory_limit set to 4G
Be sure Magento is set to developer mode: php bin/magento deploy:mode:set developer
ALL CACHES ENABLED except FPC. Whenever I need to test a change involving config files, etc I manually delete the content of the var/cache folder or the generated/code folder for DI changes. The cache type that specially slows down everything is the Configuration cache, so it must be enabled or the frontend/backend pages will load painfully slow.
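The cache policy above boils down to two commands from the Magento root plus the manual cleanup; a sketch (guarded so it is a no-op outside a Magento installation):

```shell
#!/bin/sh
# Enable every cache type, then switch the full-page cache back off;
# after config-XML or DI changes, clear the generated artifacts by hand.
if [ -x bin/magento ]; then
  bin/magento cache:enable
  bin/magento cache:disable full_page
  rm -rf var/cache/* generated/code/*
fi
```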
I use Grunt Watch and the Livereload Chrome extension to see my changes to .less files without having to deploy static files with every change. https://devdocs.magento.com/guides/v2.3/frontend-dev-guide/css-topics/css_debug.html
Whenever I change a JS file I navigate to pub/static/[adminhtml/frontend]/[theme]/[locale]/ and delete ONLY the folder where the static file corresponding to the JS file I changed lives in. This prevents me from having to deploy ALL the static files. Magento will regenerate just the static files for the deleted folder saving a LOT of time (be sure to do a hard refresh in your browser every time you delete a static file)
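As an illustration of the step above, with hypothetical Vendor/theme/locale/module names (run from the Magento root):

```shell
#!/bin/sh
# Delete only the generated folder that holds the compiled copy of the
# JS file that changed; Magento rebuilds just this folder on the next
# request. The path components are placeholders for your own theme.
TARGET="pub/static/frontend/Vendor/theme/en_US/Vendor_Module"
if [ -d "$TARGET" ]; then
  rm -rf "$TARGET"
  echo "removed $TARGET; hard-refresh the browser"
fi
```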
It’s still not a perfect setup but it’s the fastest way I’ve found so far to be productive without pulling my hair out.
I tried everything, and the only thing that works is the virtual machine that Bitnami provides: https://bitnami.com/stack/magento/virtual-machine
Seriously, I don't know what this VM has, but it goes really fast. I tried creating my own VM using a fresh installation of Ubuntu, CentOS, etc., but nothing works as well as this VM.
If you work in developer mode you need to disable JS/CSS merging, disable Xdebug and enable OPcache. Feel free to run these MySQL queries on your dev DB and flush the cache. This will increase the site performance in developer mode.
UPDATE core_config_data SET value = '0' WHERE path = 'dev/css/merge_css_files';
UPDATE core_config_data SET value = '0' WHERE path = 'dev/css/minify_files';
UPDATE core_config_data SET value = '0' WHERE path = 'dev/js/merge_files';
UPDATE core_config_data SET value = '0' WHERE path = 'dev/js/minify_files';
UPDATE core_config_data SET value = '0' WHERE path = 'dev/js/enable_js_bundling';
UPDATE core_config_data SET value = '0' WHERE path = 'dev/static/sign';
Try disabling synchronisation with the default Vagrant synced folder (just comment out config.vm.synced_folder in the Vagrantfile and reload) - it's too slow when you need to work with a lot of files...
Also in developer mode it is useful to generate static files:
bin/magento setup:static-content:deploy and ensure that all caches are enabled: bin/magento cache:status
If that doesn't help, you can try the Magento DevBox tool based on Docker: http://devdocs.magento.com/guides/v2.1/install-gde/docker/docker-over.html
In "developer" mode, all caches are disabled. That is why Magento becomes slow.
I suggest enabling the caches by executing
./bin/magento cache:enable
However, you then need to clean the cache (./bin/magento cache:clean) every time you modify XML files or configuration.
my recipe:
Use *nix as your main OS
Use docker with PHP 7 and Nginx
use gulp for generating css and js (faster than grunt)
use redis and varnish
disable only the caches you actually need disabled
And the most valuable advice: you really need an SSD to work with Magento 2 if you are still trying to develop on an HDD.
P.S. Magento 2 is more complicated than Symfony/Laravel/CI (M2 is built on Symfony components, by the way) and can't be as fast as the pure frameworks.
For production environment:
You must use Redis to handle the cache, full-page cache and sessions
(http://devdocs.magento.com/guides/v2.0/config-guide/redis/config-redis.html)
You must use Varnish for HTTP cache built in with Magento
(http://devdocs.magento.com/guides/v2.1/config-guide/varnish/config-varnish.html)
You need to set up production Magento mode.
(http://devdocs.magento.com/guides/v2.1/config-guide/bootstrap/magento-modes.html)
You must use Elasticsearch as the search engine (EE only)
(http://devdocs.magento.com/guides/v2.1/config-guide/elasticsearch/es-overview.html)
You must use PHP 7
You may use MariaDB even if it is not supported by Magento 2.
You must use CSS minification and JS minification and JS bundling (which works only on production mode).
Check the official Magento 2 documentation in order to set up this production configuration.
A bit late here, but I think the answer while working on Vagrant/Docker is mostly that file I/O is terribly slow.
My solution was simply to disable the whole shared folder and replace it with a remote project (SFTP connection) in PhpStorm. All files are then stored within the virtual machine and don't have to be synced every time the page needs a reload.
The main benefit of course is, that it is amazingly fast while working on developer mode.
But also there are some minor problems while working with this setup:
You can't run commands straight from your terminal; you have to SSH into your Vagrant box to run Magento 2 CLI commands.
After running composer updates you may have to download the whole folder again, because PhpStorm does not download remote changes automatically.
I made this Vagrant box, which allows you to customize mount options and has great performance:
NFS mount or regular mount
mount only the app directory /var/www/magento/app or the whole project /var/www/magento
https://github.com/zepgram/magento2-fast-vm
You can work on a fast Magento installation and adapt the parameters depending on your work practice and your host machine's performance.
For example, if your host machine doesn't support the NFS option or has bad performance, you can mount only the app directory, which is enough for development.
#Henry's Cat is right. Non-Linux OS + Magento 2 = disaster.
If you are not working heavily with XML files you can turn on the Magento cache
bin/magento cache:enable
and use bin/magento cache:clean when you modify something in these files,
or better, just disable certain cache types: bin/magento cache:disable db_ddl full_page. #Igor Sydorenko is absolutely right: disabling CSS/JS merging/minifying will IMPROVE developer-mode performance A LOT.
In order to give flexibility to developers, Magento generates a lot of files. If it runs in production mode, the slowest part is the disk read which can be optimized.
But while running Magento 2 in developer mode, disk read and write operations make it too slow.
I was also experiencing the same while developing Magento 2 applications. My first suggestion is to move to an SSD. However, that is not possible for everyone every time.
It was also not possible for me to install an SSD in my high-end laptop with lots of RAM and CPU power.
I found a workaround which made my development considerably faster on localhost, using a Redis cache. Cache cleaning and warming became extremely fast, which drastically reduced my waiting time to see changes. Here is the full article to use Redis cache in localhost with Magento 2.
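The linked article aside, on Magento 2.3+ pointing the default and full-page caches at a local Redis can be done from the CLI; a sketch with the usual local defaults (127.0.0.1, databases 0 and 1), guarded so it is a no-op outside a Magento root:

```shell
#!/bin/sh
# Store the default cache and the full-page cache in local Redis
# databases 0 and 1 respectively.
if [ -x bin/magento ]; then
  bin/magento setup:config:set \
    --cache-backend=redis \
    --cache-backend-redis-server=127.0.0.1 \
    --cache-backend-redis-db=0 \
    --page-cache=redis \
    --page-cache-redis-server=127.0.0.1 \
    --page-cache-redis-db=1
fi
```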
OK, so I have been working with Magento 2.2.7 for approximately 6-8 months, and there are some notes you should consider:
1. Use an SSD hard disk (if possible).
2. Configure Grunt in Magento. It will surely help make frontend development in Magento fast, because Grunt compiles .less files without the need to run the setup:static-content:deploy command.
grunt with magento
3. Do not enable Xdebug.
4. Disable the cache only if you are reloading a page too many times in a row.
I tried many machines and many configurations, like:
Windows 10 - vagrant machine debian
Windows 10 - vagrant machine debian - docker
Windows 10 - vagrant machine ubuntu - docker
Windows 10 - vagrant machine ubuntu
The problem with the Bitnami machine: it is not really easy to configure for Xdebug.
In my experience the best one is a Vagrant machine, for those who want to work on Windows:
https://app.vagrantup.com/certiprosolutions
So use this config on your Vagrant file:
config.vm.box = "certiprosolutions/ubuntu-lnmp"
config.vm.box_check_update = false
# box modifications, including memory limits and box name.
config.vm.provider "virtualbox" do |vb|
  vb.name = "Magento 2.3.3 ubuntu nginx"
  vb.memory = 8240
  vb.cpus = 2
  #vb.customize [ "modifyvm", :id, "--uartmode1", "disconnected" ]
end
The advantages:
you can switch between many PHP configurations
(5.6, 7.0, 7.1, 7.2, 7.3)
you can work on many versions of Magento in the same environment
A little note: to make Xdebug work you should change the Xdebug configuration to this:
[XDEBUG]
zend_extension=xdebug.so
xdebug.default_enable = 1
xdebug.remote_enable = 1
xdebug.remote_connect_back = 1
xdebug.remote_autostart = true
xdebug.remote_handler = dbgp
xdebug.remote_port = 9001
xdebug.remote_host=127.0.0.1
xdebug.remote_log="/tmp/xdebug72.log"
;xdebug.max_nesting_level = 1000

Composer Hanging on Installing dependencies

After reading others with this question, the advice was mostly to be patient.
It's been running for roughly 15 hours now and still nothing.
I've got it on the AWS EC2 Micro, which I know the ram is low but I added a 512mb swap.
Both the memory and cpu sit around 90%.
Is it safe to say that it's not going to finish, or will it eventually do something? Is there any way to log this process to see what it's doing?
Composer is supposed to finish a reasonably sized install job within seconds. It might need some minutes to finish if there is a really huge number of packages to install (by huge I mean more than I have experienced yet, i.e. probably more than 100 packages).
The process uses an unusual amount of RAM compared with other PHP scripts, i.e. hundreds of megabytes is not uncommon.
I'd advise not running Composer on such a badly equipped machine. Adding swap space will not help in any way; it will just make the machine read and write to disk heavily, delaying the whole process by orders of magnitude (like 100 or 1000 times slower). You should run the install step on your development machine and then copy everything to the Amazon instance.
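On the logging part of the question: Composer can narrate what it is doing. --profile prints timing and memory usage per step and -vvv gives maximum verbosity; a sketch (guarded so it is a no-op where Composer isn't installed, and the log filename is a placeholder):

```shell
#!/bin/sh
# Run the install with per-step timing/memory output and keep a log,
# so a hang shows exactly which package or step it is stuck on.
if command -v composer >/dev/null 2>&1; then
  composer install --profile -vvv 2>&1 | tee composer-install.log
fi
```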
How long does it run on your local machine?

How To Run HipHop/HHVM Using Build Ubuntu

I followed this tutorial for installing here:
https://github.com/facebook/hiphop-php/wiki/Building-and-installing-HHVM-on-Ubuntu-13.04
But I can't figure out how to run it. I've gone to the hphp/hhvm/hhvm directory and run ls:
root#hhvm-ubuntu:~/dev/hiphop-php/hphp/hhvm# ls
CMakeFiles CMakeLists.txt hhvm main.cpp process_init.cpp
cmake_install.cmake global_variables.cpp link_hphp.sh Makefile process_init.h
The problem is that each time I run it, the server crashes. The server is actually slow with HHVM installed; it's a 1 GB instance on Rackspace. But how am I supposed to run HipHop after compiling from source?
You just run hphp/hhvm/hhvm some_file.php if you want it in command line or hphp/hhvm/hhvm -m server /some/document_root/ for a server. Look on the wiki for more config information.
I don't have the link handy, but 1 GB is not enough to run HHVM. The process itself will easily chew up that amount of RAM by itself. When it uses more RAM than you have, it will slow to a crawl and then eventually crash.
Try using a 4 GB instance. You may have better luck.
Take a look at this article for some more info on configuring hhvm.
