Does Magento or Zend cache DB connection strings? - php

I have a Magento app running on a pretty standard Apache/Zend platform. I am pointing Magento to a new db by modifying local.xml; however, it does not seem to honor my changes, even after a restart of the app.
More specifically, I have db1 and db2. I am changing Magento to point to db2, but the app continues pointing to db1. As a matter of fact, I can completely shut down db2, and even though the app is configured to point to db2, it doesn't care and continues running just fine. If I shut down db1, the app fails, even though its config has it pointing to db2.
I am hoping that it is just a caching setting somewhere in Magento or Zend that I am unaware of, but I find it odd that it persists after an app restart.
Any thoughts?

Yes, the entire Magento configuration tree (of which local.xml is a part) gets cached. Use the UI in System -> Cache Management to clear the configuration cache, or manually blow away your cache storage (i.e. remove `var/cache/*`).
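For the manual route, a minimal sketch; the mkdir/touch lines only simulate a populated cache directory for illustration, and on a real install you would run just the `rm` from the Magento root:

```shell
# Simulate a populated Magento var/cache directory (illustration only; a
# real install already has one under the Magento root).
mkdir -p var/cache/mage--0
touch var/cache/mage--0/cached_config_entry

# Blow away the cache storage so the configuration tree, including
# local.xml, is re-read on the next request.
rm -rf var/cache/*
```

After clearing, the next page load rebuilds the cache from the current local.xml.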

Extension Modules disappeared

I don't have any error logs or other information from which I can figure out the issue, and it's not the first time this has happened. In my case it occurs after I install an extension. I do what is recommended after uploading the extension: clean the cache and log in to the admin again, and everything is fine after installation. The only problem is that sometimes the module disappears as if nothing had been installed, and it only shows up again if I go and "clean Cache" in the admin panel.
If you use multiple web nodes with separate admin nodes, it is possible that the new code has not been deployed to one of your servers and the old XML files are still cached.
I also understand that you are using Redis as your caching layer; flushing the cache from the Magento admin panel sometimes doesn't flush the entire Redis cache.
In that case, connect to the Redis server with the redis-cli command and issue `flushall` there.
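A sketch of that, assuming Redis is reachable on the default host and port (adjust `-h`/`-p` for your setup):

```shell
# Connect to the Redis instance backing the Magento cache and clear every
# key. FLUSHALL removes all keys from all databases, so make sure nothing
# else shares this Redis instance before running it.
redis-cli -h 127.0.0.1 -p 6379 flushall
```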

Deploy Simple Symfony application to Azure : Slow?

Last week, I tried to deploy a simple Symfony app on Azure.
I chose the App Service B2 plan (2 cores / 3.5 GB RAM).
PHP version: 5.6.
First, it took forever to complete `composer install`. (I tried the S3 plan; it was a little faster, but not very different.)
So I tried to optimize the PHP config: opcache, realpath_cache_size, etc. (xdebug already disabled).
I even tried to enable WinCache, but with no real improvement.
So now my app is deployed, but it is too slow to be usable.
A simple `php app/console` (in dev mode) takes ~23 seconds.
It seems to recreate the cache every time. On my local Unix environment (similar specs), it takes 6 seconds when the cache is cold and 500 ms when the dev cache is warm.
I think the main problem is a filesystem issue, because removing the dev cache folder takes 16 seconds.
On my local Unix environment, with similar specs, it takes ~200 ms to remove the same folder.
Like I said, I tried the S3 plan with a small improvement, but not enough to explain this slowness.
One weird thing: if I rerun `php app/console` just after it finishes, the command takes 5 seconds (much better). But if I rerun it 5 seconds after it finishes, it takes 23 seconds.
I already tried these solutions (even if the environment is different):
https://stackoverflow.com/a/17021255/6309878
Update: I tried to set the Symfony app/cache folder to the local filesystem D:\local\cache, but there was no improvement; it may even be worse.
Please try the steps below and let me know if they improve performance:
1) In the wwwroot directory of your site, create a .user.ini file (if it doesn't already exist) and add `wincache.fcenabled=0`. This will disable WinCache.
2) Go to the Azure Portal and open the Application Settings for your app. In the App Settings section, add `WEBSITES_DYNAMIC_CACHE` with a value of 1.
3) Restart the site.
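The same three steps can be sketched with the Azure CLI; `<app>` and `<group>` are placeholders for your app and resource-group names, and the wwwroot path assumes the default App Service layout:

```shell
# Step 1: disable WinCache via .user.ini in the site's wwwroot
# (run from a Kudu/SSH console on the App Service instance).
echo "wincache.fcenabled=0" >> /home/site/wwwroot/.user.ini

# Step 2: add the WEBSITES_DYNAMIC_CACHE app setting.
az webapp config appsettings set --name <app> --resource-group <group> \
    --settings WEBSITES_DYNAMIC_CACHE=1

# Step 3: restart the site.
az webapp restart --name <app> --resource-group <group>
```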

moving web development environment (xampp) from old computer (xp) to new computer (windows 7)

I'd like to set up XAMPP on a different computer. The problem is that I've changed php.ini, added databases, etc. I could redo the changes, but I feel like that's not the principled thing to do.
Is it as simple as installing XAMPP on my new machine and then copying all my files from the old installation? Thank you in advance.
I would always install a new environment on a different system rather than copy it from somewhere else, because I never remember (and don't want to remember) which settings are platform-specific and which aren't. Also, it's usually easier.
However, you can try to just copy the php.ini (and maybe other configuration files, like the ones for apache and mysql) to the new system. Remember to backup the old files first.
For the database you should have a schema file (as SQL-file) to create the database structure anyway. For the db-content (and maybe -- if missing -- the schema) you can create a SQL-dump file from the old database.
For the PHP stuff it's enough to copy the PHP files and configuration. Same with the HTTPd stuff. For the database, you'll want to use mysqladmin/mysqldump/mysql to dump the database on the old system and load it on the new.
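A sketch of that dump-and-load flow, with a hypothetical database name `mydb` (adjust users and passwords for your setup):

```shell
# On the old machine: dump schema and data to a single SQL file.
mysqldump -u root -p mydb > mydb.sql

# On the new machine: create the empty database, then load the dump.
mysqladmin -u root -p create mydb
mysql -u root -p mydb < mydb.sql
```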
You can copy all your source files to the new server and that should be enough. You will have to redo your settings and databases, or, if you are comfortable, you may copy your settings files. Regarding the creation of databases, I always have an SQL file containing all database-creation statements checked into source control. Whenever I need a fresh database, I just run the script.
As an aside, I'm assuming you're comfortable with the various components and since you're starting from a fresh instance, you might want to consider installing all the components separately instead.

Setting up an efficient and effective development process

I am in the midst of setting up the development environment (PHP/MySQL) for my start-up. We use three sets of servers:
LIVE - the servers which provide the actual application
TEST - providing a testing version before it is actually released
DEV - the development servers
The development servers run SVN with each developer checking out their local copy. At the end of each day completed fixes are checked in and then we use Hudson to automate our build process and then transfer it over to TEST. We then check the application still functions correctly using a tester and then if everything is fine move it to LIVE. I am happy with this process but I do have two questions:
How would you recommend we do local testing - as each developer adds new pages or changes functionality I want them to be able to test what they are doing. Would you just setup local Apache and a local database and have them test locally on their own machine?
How would you recommend dealing with data layer changes?
Is there anything else you would recommend doing to really make our development process as easy and efficient as possible?
Thanks in advance
+1 to each developer running her own setup, complete with Apache and database.
Keep the database schema under version control.
Possibly you could keep (maybe in a separate repository) a small but representative set of data in a test database. Each morning you check out the latest copy of this test database and start hacking. When you change schemas, update your test-data repository accordingly.
Anyone doing development SHOULD have their own local environment. I use a Mac, so I run MAMP so that I can have my own LAMP environment, local and independent of any other environment. This also lets me know that nobody else is changing or working on the same components I am, and removes any possible confusion. If you are a Windows user, there are also easy-to-install local versions of the LAMP stack, such as XAMPP. If you are running Linux as your desktop, you will most likely already know how to install LAMP for the flavor of Linux you are running.
The database schema under version control is a great idea; it is what we use as well. In addition to keeping the schema under version control, we add a schema-version table to the schema and keep it updated, so we can quickly tell what version is in production/QA/dev when we need to compare.
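A minimal sketch of such a schema-version table (the table and column names here are hypothetical); it is shown with sqlite3 so it runs anywhere, but the same DDL works in MySQL with minor changes:

```shell
# Create the version table and record the current schema version.
sqlite3 app.db <<'SQL'
CREATE TABLE IF NOT EXISTS schema_version (
    version    INTEGER NOT NULL,
    applied_at TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
);
INSERT INTO schema_version (version) VALUES (1);
SQL

# Ask any environment which schema version it is running.
sqlite3 app.db 'SELECT MAX(version) FROM schema_version;'
```

Every migration script then bumps the version as its last statement, so comparing environments is a one-line query.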
As for the data layer changes there are two things I would recommend.
Always create your migration path, forward and backward. This means that when you have the schema you want to apply to production to upgrade an existing schema, it should always be part of the release: a clear, concise process for ALTERing the tables. By the same token, you need a working and tested ROLLBACK version as well, in case something goes wrong.
What I have found helpful is loading a backup of production onto my local machine (or QA/DEV) so that I have the most up-to-date data/schema to play with without affecting production. If you are not performing regular backups of production, maybe now is a good time to implement a policy; then you will kill two birds with one stone: you will have backups for any outage, and a useful live schema with data you can load for testing on another machine. This also tends to surface any possible issues with schema changes, since the data will match production. So if it works locally (and on DEV/QA), it reduces the risk of something going wrong in production.

Upgrades to Drupal in production

Does anyone have a good Drupal upgrade strategy for an install that is in production? No one talks about this in books and it's hard to find a definitive answer in forums and email lists.
Ex:
Lock down prod, don't allow updates to data
copy prod
copy prod database to dev
turn off all modules in dev
upgrade core Drupal in dev (update db if necessary)
upgrade modules in dev (update db if necessary)
turn on modules
test
migrate code and db to prod
turn site back on
Your strategy sounds good, but it would require the site to be in "read only" mode for quite a while, which is not always feasible. Also, I am not quite sure why you would turn all of the modules off and on?
May I propose a slightly different approach
copy prod database to dev
replicate prod code in dev
upgrade core Drupal in dev
run update.php
test
For each module:
. upgrade the module in dev
. run update.php
. test
Put into maintenance mode
Backup database
Migrate code to production
Run update.php
Put back online
test
This way there is a lot more testing but less downtime, and you will be able to work out which module breaks things if there is an error. It also doesn't rely on you uploading the DB from dev to live.
I don't think there's any need to turn off modules before running update.php anymore (except maybe between major versions). And I definitely wouldn't run update.php once per module - that doesn't make sense with the way update hooks work.
If you're at all comfortable with the command-line (and running on a Linux server) then definitely have a look at Drush. It can simplify the process and allow parts of it to be scripted.
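As a sketch, the production step could look like this with classic (Drupal 7-era) Drush commands; exact command names vary by Drush version:

```shell
# Put the site into maintenance mode before touching code or database.
drush vset maintenance_mode 1

# Back up the database first.
drush sql-dump > backup-$(date +%F).sql

# After deploying the new code, run the pending update hooks
# (the CLI equivalent of visiting update.php).
drush updatedb -y

# Clear all caches and bring the site back online.
drush cc all
drush vset maintenance_mode 0
```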
In addition, if you're looking for a formal update process to move stuff from your dev server to production for a large site, you should also be getting up to speed on the hooks that run during install and update.