Last week I tried to deploy a simple Symfony app on Azure.
I chose the App Service B2 plan (2 cores / 3.5 GB RAM).
PHP version: 5.6.
First, the composer install took forever to complete. (I tried moving to the S3 plan; it was a little faster, but not by much.)
So I tried to optimize the PHP config: opcache, realpath_cache_size, etc. (Xdebug was already disabled).
I even tried enabling WinCache, but with no real improvement.
So now my app is deployed, but it is too slow to be usable.
A simple php app/console (in dev mode) takes ~23 seconds.
It seems to recreate the cache every time. On my local Unix environment (similar specs), it takes 6 seconds when the cache is cold and 500 ms when the dev cache is warm.
I think the main problem is filesystem performance, because removing the dev cache folder takes 16 seconds.
On my local Unix environment, with similar specs, it takes ~200 ms to remove the same folder.
As I said, I tried the S3 plan with a small improvement, but not enough to explain this slowness.
One weird thing: if I rerun php app/console just after it finishes, the command takes 5 seconds (much better). But if I rerun it 5 seconds after it finishes, it takes 23 seconds.
I have already tried these solutions (even though the environment is different):
https://stackoverflow.com/a/17021255/6309878
Update: I tried pointing the Symfony app/cache folder at the local filesystem D:\local\cache, but there was no improvement; it may even be worse.
Please try the steps below and let me know if they improve performance:
1) In the wwwroot directory of your site, create a .user.ini file (if it doesn't already exist) and add "wincache.fcenabled=0". This will disable WinCache.
2) Go to the Azure Portal and open the Application Settings for your app. In the App Settings section, add "WEBSITES_DYNAMIC_CACHE" with a value of 1.
3) Restart the site.
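If you would rather script these steps than click through the portal, something along these lines should work (the resource group and app name are placeholders; the first command is meant to be run from the site's wwwroot folder in the Kudu console):

echo wincache.fcenabled=0 > .user.ini
az webapp config appsettings set --resource-group my-rg --name my-app --settings WEBSITES_DYNAMIC_CACHE=1
az webapp restart --resource-group my-rg --name my-app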
I'm working on a "legacy" Symfony app (it's using Symfony 4, but it's not maintained anymore). The problem is that the cache folder grows every day, reaching 50 GB after a few months.
It's running in the dev environment, but since the original developer left the company we would like to just "patch" the problem by cleaning the cache after some time, instead of switching to the production environment (which could lead to different problems and might not solve the cache issue anyway). Something like rotating Symfony logs, where you can configure Symfony to log to a different file every day and remove old files automatically.
There is no ready-made way to do this from within the application.
Just clear the cache every now and then (bin/console cache:clear). You could even schedule this as a cron job to run overnight; the process usually takes only a few seconds at most.
Make sure you run the command as the same user that runs the application (e.g. www-data); if you run the command as root, the cache will be warmed up with the wrong permissions.
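For example, a crontab entry along these lines would do it (the project path, log path, and schedule are just placeholders):

# Edit www-data's crontab with: crontab -u www-data -e
# Clear the Symfony dev cache every night at 03:00
0 3 * * * cd /var/www/myapp && php bin/console cache:clear --env=dev --no-warmup >> /var/log/myapp-cache-clear.log 2>&1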
Mind you, running a production system in development mode is inherently dangerous, as it's more likely to leak configuration data in unexpected situations.
At work we took back our existing store running on Magento 2 from an external development agency. I need to get the project running in local development (with Docker).
I familiarized myself with a vanilla project from the official docs and managed to get it running by downloading the vanilla template with Composer, granting the proper permissions on files and folders, and running the magento setup:install command.
My question is: how does one go about kick-starting from an existing project that is already running in production?
Do I need to run setup:install again? If I do, why?
What do I need to import from production to ensure any content or configuration created via the admin is also present in my local setup? Should I import the complete database from production?
I know our setup uses more than just PHP and MySQL, but env.php seems to list only the DB configuration and the admin URL. Where can I find the complete service configuration information about what our setup uses?
Anything else I am missing to get started with an existing project for local development?
As someone who is running Magento 2 on a local environment myself, hopefully I can shed some light on this.
If you have a direct copy of the live site, you do not need to run setup:install again.
Ensure you have a copy of the entire Magento 2 site (you can technically ignore the vendor folder, as you can run composer install and it will redownload those files, but that's up to you). Also get a copy of the entire database. Magento 2 is notorious for copying the same data to multiple tables, so something could break if you don't have everything.
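For the database part, a plain mysqldump/import round trip is usually enough (the user and database names below are placeholders):

# On the production server
mysqldump -u produser -p --single-transaction magento_prod > magento_prod.sql
# Locally, after copying the dump over
mysql -u localuser -p magento_local < magento_prod.sql

You will likely also need to point the base URLs (web/unsecure/base_url and web/secure/base_url in core_config_data) at your local hostname so the site resolves correctly.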
What do you mean by "service configurations"? If you are referring to Magento 2 extensions, that data is saved in the database, not in env.php. env.php is only for server-side configuration, such as the DB information, caching, and things of that nature. On mine, I use Redis for the site cache, so that is included in that file as well, as an example.
When you first unpack the site in your local environment, run composer update in the directory. This ensures you have all the proper files installed. If you are going to run a local dev environment, set the mode to development with the following command: bin/magento deploy:mode:set developer. This will allow you to make changes and view them by just refreshing the page, rather than flushing the cache all the time.
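Put together, the local bootstrap might look roughly like this, run from the project root once the code and database are in place (composer install keeps the locked versions; composer update, as suggested above, also works):

composer install
bin/magento deploy:mode:set developer
bin/magento setup:upgrade
bin/magento cache:flush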
Eric has answered all the queries correctly. I am also not sure about the "service configurations" you mention here; if this is about third-party extensions/services, you can check the config.php file.
I'm using AWS for my application. The current configuration is:
Load balancer --> multiple EC2 instances (auto scaled) --> all mount an NFS drive with SVN sync
Every time we want to update the application, we log in to the NFS server (another EC2 instance) and run svn update in the application folder.
I wanted to know if there is a better way (architecture) to work, since I sometimes get permission changes after an SVN update and the servers take a while to update. (I thought about mounting S3 as a drive.)
My app is PHP + the Yii framework + a MySQL DB.
Thanks for the help,
Danny
You could use a slightly more sophisticated solution:
Move all your dynamic directories (protected/runtime/, assets/, etc.) outside the SVN directory (use svn:ignore) and symlink them into your app directory. This may require a configuration change in your webserver to follow symlinks, though.
/var/www/myapp/config
/var/www/myapp/runtime
/var/www/myapp/assets
/var/www/myapp/htdocs/protected/config -> ../../config
/var/www/myapp/htdocs/protected/runtime -> ../../runtime
/var/www/myapp/htdocs/assets -> ../assets
On deployment, start with a fresh SVN copy in htdocs.new, where you create the same symlinks and can fix permissions:
/var/www/myapp/htdocs.new/protected/config -> ../../config
/var/www/myapp/htdocs.new/protected/runtime -> ../../runtime
/var/www/myapp/htdocs.new/assets -> ../assets
Move htdocs to htdocs.old and htdocs.new to htdocs. You may also have to HUP the webserver.
This way you can completely avoid the NFS mount, as you have time to prepare steps 1 and 2. The only challenge is to synchronize step 3 on all machines. You could, for example, use at to schedule the update on all machines at the same time.
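A rough per-instance deployment script following the steps above might look like this (the paths, repository URL, webserver user, and reload command are all assumptions):

#!/bin/sh
set -e
APP=/var/www/myapp
# Step 2: fresh checkout next to the live docroot, recreate the symlinks, fix permissions
svn export https://svn.example.com/myapp/trunk "$APP/htdocs.new"
ln -s ../../config "$APP/htdocs.new/protected/config"
ln -s ../../runtime "$APP/htdocs.new/protected/runtime"
ln -s ../assets "$APP/htdocs.new/assets"
chown -R www-data:www-data "$APP/htdocs.new"
# Step 3: swap the docroots and reload the webserver
mv "$APP/htdocs" "$APP/htdocs.old"
mv "$APP/htdocs.new" "$APP/htdocs"
apachectl graceful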
As a bonus you can always undo a deployment if something goes wrong.
The situation might be more complex if you have to run migrations, though. If the migrations are not backwards compatible with your app, you probably can't avoid some downtime.
I have developed a small web app with Symfony 2 and Doctrine 2.
Can I deploy it to a web host that doesn't give SSH access?
I ask this because I see there are a lot of tasks that must be done from the terminal, like updating the database schema, creating symlinks for the assets, clearing the cache, etc.
Should not be a problem:
Create a copy of the system somewhere, ideally with the same DB connection params as the production system.
Run all the necessary tasks with the --env=prod parameter, if your DB settings allow it.
Clone the database you created to the production system (with phpMyAdmin). Alternatively, you can clone the schema from the production database, run app/console doctrine:schema:update --dump-sql locally, and then run the generated SQL on the production server.
Copy all the files, excluding the app/cache and app/logs directories.
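Concretely, the local side of these steps might look something like this (the project name and paths are assumptions; the generated SQL then goes through phpMyAdmin, and the files through FTP or the host's file manager):

# Generate the schema SQL and install the assets as plain copies (no symlinks needed on the host)
php app/console doctrine:schema:update --dump-sql --env=prod > schema.sql
php app/console assets:install web --env=prod
# Pack the project, leaving out the local cache and logs
tar czf myapp.tar.gz --exclude='app/cache/*' --exclude='app/logs/*' myapp/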
I have done this many times with SF 1.4, and it should be just as easy with SF 2.
Some low-end hosts have restrictions that will cause issues for Symfony, so it's important to run the Symfony compatibility checker script (you can upload it and then enter its URL in your browser to get the output). Once that's done, follow these simple steps:
Copy over all the files for the project. I usually zip/tar the project folder, upload it, and unpack.
Export the database from your development environment and upload it to your new server.
Edit the config and update your database settings. If you have hardcoded paths somewhere in your code, now is the time to fix those as well.
Make sure that the user for Apache (or whatever server software your host uses) has full access to the cache and log directories. This can be tricky on some hosts; I have had to contact support in the past to have someone log in and change permissions.
In your web hosts configuration tool, set the webroot for your site to the web folder in your project.
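If the host gives you any way to run commands at all (a control-panel terminal or a one-off cron job, for instance), the permissions step usually comes down to something like this (the webserver user is an assumption and varies by host):

chmod -R 775 app/cache app/logs
chown -R www-data:www-data app/cache app/logs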
Maybe there is a way (with SFTP, for example), but it would be like trying to ride a bike with square wheels ;)
I work a lot with the WindowsAzure4E(clipse) IDE, and it's always a pain to wait for the local test deployment.
Isn't there a way to develop directly on the deployed PHP files, which must be stored somewhere under inetpub or elsewhere?
Thanks for your ideas.
Yes! In fact, I just got this working myself yesterday.
After installing PHP 5.3 with CGI support for IIS (making the necessary php.ini modifications of course), I simply created a new site in IIS that mapped to a role in the workspace for my Eclipse project.
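If it helps, the IIS site itself can also be created from an elevated command prompt with appcmd rather than through the IIS manager (the site name, port, and physical path are placeholders for your Eclipse workspace role folder):

%windir%\system32\inetsrv\appcmd add site /name:MyAzurePhpSite /bindings:http/*:8080: /physicalPath:C:\workspace\MyProject\MyWebRole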
Keep in mind that there's one hiccup to this: the php_azure.dll file, used to access the service configuration and mount Azure drives, was built to run in the Azure fabric (either development or hosted). In my case, I don't NEED these features, so I removed references to things like getconfig and, poof, the project loads in IIS just fine. I only need to make sure I start Azure Storage prior to launching the application.
I've been told that some folks are able to update their system's PATH environment variable with the location of the Azure diagnostics DLL (diagnostics.dll) and have it work without this modification. But this route didn't work for me. :(
I'll actually be blogging on this more this weekend as it took me a week of evenings to get things sorted out.
I found out that after the deployment the project files are copied to the ServiceDefinition.csx folder.
If you edit the source code in that location, you can see the changes directly, without another deployment.