Drupal execution memory limit in hosting provider - php

We developed a Drupal-based portal. After testing and deploying it locally, the site worked perfectly. But when I moved the site to the online hosting server, it ran into the PHP memory limit.
The site expects a minimum of 96 MB, but the hosting provider is not able to raise the limit that far. Is there any other way to fix this issue? Or is there any other hosting provider in Australia that offers this much memory?

In my experience, 96MB is a lot for a Drupal site to require. Look over the modules you have installed on the site and disable and uninstall anything you don't actually need. For example, disable the core Color and Comment modules (enabled by default in Drupal 6) if you're not using them. If you have modules like Views UI, Beautytips UI, ImageCache UI, or Rules Administration UI installed, disable them once you're done configuring everything.
If only specific pages require too much memory, try optimizing those pages. For example, if a view displays lots of nodes, try adding a pager to it to split the content across multiple pages.
If you have custom modules, try separating out the administrative UI code into a separate modulename.admin.inc file, so it will only be loaded when needed. Heck, consider doing that for publicly contributed modules too, then consider posting patches to the issue queues of those modules.
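For Drupal 6/7-style modules, that split works through hook_menu()'s 'file' key; a minimal sketch, in which 'mymodule' and the menu path are hypothetical names:

```php
<?php
// mymodule.module -- only lightweight hooks live here.

/**
 * Implements hook_menu().
 */
function mymodule_menu() {
  $items['admin/settings/mymodule'] = array(
    'title' => 'My module settings',
    'page callback' => 'drupal_get_form',
    'page arguments' => array('mymodule_admin_settings'),
    'access arguments' => array('administer site configuration'),
    // Drupal loads this file only when the admin page is requested,
    // keeping the settings-form code out of every other page load.
    'file' => 'mymodule.admin.inc',
  );
  return $items;
}
```

The form builder mymodule_admin_settings() then lives in mymodule.admin.inc instead of the .module file.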
UPDATE: You may want to consider VPS hosting next. There you'll have more control over the system you're running on.
UPDATE 2: Depending on the needs and features of your site, you may be able to reduce your site's memory footprint by installing the Boost module, which creates a flat-file version of your site.
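If the host allows per-site overrides, you may also be able to raise the limit yourself; a sketch (whether this takes effect depends entirely on how the host has PHP configured):

```php
<?php
// In Drupal's sites/default/settings.php -- only works if the host
// lets scripts change memory_limit at runtime.
ini_set('memory_limit', '96M');
```

On Apache with mod_php, the equivalent .htaccess line is `php_value memory_limit 96M`; again, many shared hosts block both.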


Drupal 8 (with twig) development

We're developing a Drupal 8 site with several developers. Therefore, each of us has his own Drupal instance on the dev machine (to avoid errors from differing template file states and the like), and they are set up like this:
/srv/www/devDudeA/html (instance A root)
/srv/www/devDudeB/html (instance B root)
/srv/www/devDudeC/html (instance C root)
and so on...
These instances are all connected to the SAME database. After writing a twig extension module (delivering a necessary twig filter) and activating it, some of my colleagues are getting several exceptions (saying the module couldn't be found), though they definitely have the same module files as I do.
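(For context, such a module is registered roughly like this in Drupal 8; a sketch in which 'mymodule' and the 'shout' filter are hypothetical names:)

```yaml
# mymodule.services.yml
services:
  mymodule.twig_extension:
    class: Drupal\mymodule\TwigExtension\MyFilters
    tags:
      - { name: twig.extension }
```

```php
<?php
// src/TwigExtension/MyFilters.php
namespace Drupal\mymodule\TwigExtension;

class MyFilters extends \Twig_Extension {

  public function getName() {
    return 'mymodule.twig_extension';
  }

  public function getFilters() {
    return [
      // {{ node.title|shout }} in a template would upper-case the title.
      new \Twig_SimpleFilter('shout', 'strtoupper'),
    ];
  }

}
```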
Do you know of problems with using the same database from different hosts? I suspect the cache is causing trouble, because I found some path entries containing "devDudeA" or "devDudeB" in the database.
In my opinion the requirements should be satisfied:
same module files in the same directory on each instance
same module information from (same) DB
IMHO there is the opcache and also a memcache. The integrated twig engine also has its own cache. Could it be a problem that one of us has his caching enabled while the others don't?
I'm slowly going crazy over this (O.x). Do you have any relevant instructions for me?
Thanks in advance & kind regards!
If you are going to have multiple devs actively working on multiple sites with a shared database, odd bugs like this are going to come up. Likely a cache rebuild will resolve it, but a similar error may appear for another user. Basically, you are doing something Drupal isn't designed to do.
Since you're on Drupal 8, the reasons people used shared databases in Drupal 7 and earlier are largely gone. The Drupal 8 configuration management system is a vast improvement over the previous versions. You can now move site configuration between instances of the site using YML files. Those files can be managed with the rest of your code.
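A typical workflow, assuming you deploy with Drush (subcommand names as of Drush 8; the config sync directory location is site-specific):

```sh
# On the instance where the change was made:
drush config-export          # write active configuration out as YML
git add config/ && git commit -m "Export config"

# On every other instance, after pulling the code:
drush config-import          # load the YML files into that database
drush cache-rebuild          # Drupal 8 caches aggressively; rebuild after
```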
And you can look at using the deploy module (and supporting modules) to move content between instances.

What CMS can be used with Phalcon framework?

I've been using Wordpress for the past 4 years for developing small and medium websites. Now I have an enterprise project and I'm considering using Phalcon PHP framework.
My enterprise project will be handling a large amount of users and will be publishing articles with images. This is why I still want to use some sort of CMS.
I think a framework like Phalcon is great for service and business layers, but it lacks the GUI / services found in various CMSs like Wordpress. I know Phalcon Eye is in development, but it's at a very early stage (I think).
Can Phalcon MVC be used alongside any CMS? If yes, wouldn't the speed of Phalcon be compromised by a much slower CMS? (And what CMS is recommended?)
Update
The first version of my enterprise project is currently using WP for handling user registration, page / template handling, articles etc. But that's just a small part of the solution. All other code is custom, and I've realized that it should use a solid framework like Phalcon, Laravel, Symfony etc.
Update 2
What if I use a framework like Phalcon for my custom code, data presentation and form handling, and then build a service that retrieves articles from the WP database? That way I would not need to use Wordpress for presentation, but I could still use WP for handling articles, images and maybe even users. Bad idea?
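(That pattern is workable; a hedged sketch of the read side, using PDO against the stock WP schema. The DSN, credentials and the 'wp_' table prefix are assumptions:)

```php
<?php
// Reading published WordPress posts from custom (e.g. Phalcon) code.
// Column names follow the stock wp_posts schema.
$pdo = new PDO('mysql:host=localhost;dbname=wordpress;charset=utf8', 'user', 'secret');
$stmt = $pdo->query(
  "SELECT ID, post_title, post_content, post_date
     FROM wp_posts
    WHERE post_status = 'publish' AND post_type = 'post'
    ORDER BY post_date DESC
    LIMIT 10"
);
$articles = $stmt->fetchAll(PDO::FETCH_ASSOC);
```

Writes (comments, users) are riskier, since WP applies logic beyond the tables; for those, going through WP itself or its XML-RPC interface is safer.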
You can use Yona CMS (built with Phalcon), whose code is hosted on GitHub. It has a modular structure and the great speed of the Phalcon framework.
A few large projects are running on this CMS.
Using an existing CMS for the admin and writing a phalcon frontend for it is a very intriguing idea I have pondered on and off over the years. (I haven't done it yet because I have a custom CMS to maintain, which I am not sure how to replace with WP or joomla etc)
I think it would be possible to have a site that is much faster than a WP site by using Phalcon, but I think the tradeoff is that no WP plugins will work, and the more PHP you use to make them work, the more you erode the benefit of Phalcon; you might as well have just used WordPress.
I have never used Phalcon 2.0 with Zephir, so can't comment on that.
----- extra comment stuff -----
I see a comment about updating Phalcon, which I thought I would address: you can update Phalcon with 3 or 4 commands (or a single shell script), and it only takes effect when you restart your webserver. Apache can do a graceful restart, which shouldn't affect any of your users.
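(Roughly the documented compile-from-source route; exact service commands differ per system:)

```sh
git clone --depth=1 https://github.com/phalcon/cphalcon.git
cd cphalcon/build
sudo ./install              # compiles and installs the phalcon extension
sudo apachectl graceful     # reload Apache without dropping active requests
```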
Whether Phalcon is harder to update than a framework written in PHP comes down to your update method. Updating Phalcon with git is far quicker, easier and safer than FTPing individual files, for example. Using git for both, I don't see much of a difference, as long as the webserver is clever enough not to open a PHP file just as you are copying it, of course...
re: speed - Phalcon is very fast (up to 10x faster than Zend Framework v1 IMO, YMMV). It might not be as fast as Node depending on what you are doing, but if your PHP is far better than your JS and your server admin has never used Node (like me), then the difference in speed doesn't look like it's worth the extra effort.
I think that, as per your requirements, you should go for a CMS; Phalcon does not provide you the functionality of a CMS, it has its own advantages. If you are using Wordpress and are not satisfied with its performance, there are many other popular CMS solutions available in PHP, like Joomla or Drupal; you can look into those too and choose the one that best fits your requirements.
Only a CMS based on Phalcon, like PhalconEye, would get the benefits of Phalcon's speed.
If you want speed, avoid Drupal; that's not where it shines.

Combine database users/login for two sites- different directories- Joomla 3.0 / Jomsocial 3.0

I have two Joomla 3.0 sites installed on the same hosting account, each in its own directory with its own database. Both are subdomains of the same domain. They are hosted with Bluehost shared hosting at the moment, but I will be upgrading to VPS hosting once I go live.
Hosting environment:
Apache version 2.2.25
PHP version 5.4.17
MySQL version 5.5.33-log
Architecture x86_64
Operating system linux
Site #1:
subdomain1.domain.com
- is an online magazine with login/registration and the ability to comment on articles. I am using the Zoo component for content articles rather than the Joomla default. I am also using the JFBConnect component and SCLogin module.
Site #2:
subdomain2.domain.com
- is an installation of Jomsocial 3.0 with login/registration only, also using the JFBConnect component and SCLogin module.
Both are using the same responsive Yootheme warp framework template. I intentionally installed them in different directories under the theory that if one were to malfunction, at least the other would be viable in the interim.
My question(s):
Is it possible for users to login and register just once in order to be recognized by both sites/databases and have full use of all functions in both?
Is my installation of Jomsocial on a separate directory the best course of action, given my rationale?
Are there any other considerations to the above scenario that I might have overlooked?
Thank you in advance for your help!
Hmm, I have to say it's probably not the best method you've chosen here.
It's not so much the files that you have to worry about, it's the database, as all data is stored there. I would personally suggest having your main site on the main domain and taking daily backups of the database, plus a backup of the folders/files every time you, for example, install an extension which adds new files and folders.
If something goes wrong, simply upload the backup.
Yes, it is possible using JFusion. The Warp framework, JomSocial etc. have zero relevance. http://www.jfusion.org
That said, your assumptions for implementing this method for having a viable site available are seriously flawed.

How to configure Varnish VCL file for caching OSCommerce site?

I'm building a simple server setup for development purposes, with Nginx, PHP-FPM, APC, Varnish and MySQL, using Ubuntu Server 12.04.
Now I want to deploy an OSCommerce app to this testing environment. After some googling, I couldn't find a way to properly configure OSCommerce with Varnish.
I actually have Varnish configured for Wordpress (varnish file and default.vcl file) like the one from GitHub (Nicolargo), just slightly modified.
So, should I use the same configuration as for WP? If not, does anyone know how (or where I should look) to configure it properly for OSCommerce?
Hugs
Configuring Varnish for open source platforms such as Wordpress, Joomla, Drupal or OSCommerce is tricky: you can have a default VCL that works just fine in 50% of the cases, but as soon as extra modules are activated in the CMS, the caching stops working. That's because each module can alter cookies, caching headers, and so on.
That pretty much makes each Varnish implementation a custom job: you can start from a basic VCL that works for the base CMS, but it'll require fine-tuning specific to that site.
I would therefore recommend starting with a basic VCL file, such as the one you linked or that you can find here. Afterwards, it's a matter of running varnishlog/varnishhist/varnishstat to find out which pages are not getting cached, determining why (a combination of cookies/headers/invalid VCL) and modifying the appropriate VCL file(s).
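As a very rough starting point, here is a sketch in Varnish 3 syntax (the Ubuntu 12.04 era). OSCommerce typically keys its sessions on the osCsid cookie/URL parameter; the URL patterns below are common defaults and may differ per shop:

```vcl
sub vcl_recv {
    # Never cache session-bound pages: cart, checkout, account, login.
    if (req.url ~ "(login|account|checkout|shopping_cart)" ||
        req.url ~ "osCsid=" ||
        req.http.Cookie ~ "osCsid") {
        return (pass);
    }
    # Static assets are safe: strip cookies so they can hit the cache.
    if (req.url ~ "\.(png|gif|jpg|jpeg|css|js|ico)$") {
        unset req.http.Cookie;
        return (lookup);
    }
}
```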
I know it's not the answer you were looking for directly, but if you could post some output of varnishlog where a request that -should- be cached is -not- being cached, we could take it from there.

How to efficiently manage multiple installations of a web application?

From my experience, one of the bigger problems we come across during our web development process is keeping different setups updated and secure across different servers.
My company has its own CMS, which is currently installed across 100+ servers. At the moment, we use a hack-ish FTP-based approach, combined with upgrade scripts at specific locations, to upgrade all of our CMS setups. Efficiently managing these setups becomes increasingly difficult and risky when several custom modules are involved.
What is the best way to keep multiple setups of a web application secure and up-to-date?
How do you do it?
Are there any specific tips regarding modularity in applications, in order to maintain flexibility towards our clients while still being able to efficiently manage multiple "branches" of an application?
Some contextual information: we mainly develop on the LAMP stack. One of the main factors that helps us sell our CMS is that we can plug in pretty much anything our client wants. This can vary from 10 to 10,000 lines of custom code.
A lot of custom work consists of very small pieces of code; managing all these small pieces of code in Subversion seems quite tedious and inefficient to me (since we deliver around 2 websites every week, this would result in a lot of branches).
If there is something I am overlooking, I'd love to hear it from you.
Thanks in advance.
Roundup: first of all, thanks for all of your answers. All of these are really helpful.
I will most likely use an SVN-based approach, which makes benlumley's solution closest to what I will use. Since the answer to this question might differ in other use cases, I will accept the answer with the most votes at the end of the run.
Please examine the answers and vote for the ones that you think have the most added value.
I think using a version control system and "branching" the parts of the code that you have to modify could turn out to be the best approach in terms of robustness and efficiency.
A distributed version system could be best suited to your needs, since it would allow you to update your "core" features seamlessly on different "branches" while keeping some changes local if need be.
Edit: I'm pretty sure that keeping all that up to date with a distributed version system would be far less tedious than what you seem to expect: you can keep the changes you are sure you're never going to need elsewhere local, and the distributed aspect means each of your deployed applications is actually independent from the others, and only the fixes you mean to propagate will propagate.
If customizing your application involves changing many little pieces of code, this may be a sign that your application's design is flawed. Your application should have a set of stable core code, extensibility points for custom libraries to plug into, the ability to change appearance using templates, and the ability to change behavior and install plugins using configuration files. In this way, you don't need a separate SVN branch for every client. Rather, keep the core code and extension plugin libraries in source control as normal. In another repository, create a folder for each client and keep all their templates and configuration files there.
For now, creating SVN branches may be the only solution that helps you keep your sanity. In your current state, it's almost inevitable that you'll make a mistake and mess up a client's site. At least with branches you are guaranteed to have a stable code base for each client. The only gotcha with SVN branches is if you move or rename a file in a branch, it's impossible to merge that change back down to the trunk (you'd have to do it manually).
Good luck!
EDIT: For an example of a well-designed application using all the principles I outlined above, see Magento E-Commerce. Magento is the most powerful, extensible and easy to customize web application I've worked with so far.
I may be wrong, but it seems to me what Aron is after is not version control. Versioning is great, and I'm sure they're using it already, but for managing updates on hundreds of customized installations, you need something else.
I'm thinking something along the lines of a purpose-built package system. You'll want every version of a module to keep track of its individual dependencies and 'guaranteed compatibilities', and use this information to automatically update only the 'safe' modules.
E.g. let's say you've built a new version 3 of your 'Wiki' module. You want to propagate the new version to all the servers running your application, but you've made changes to one of the interfaces within the Wiki module since version 2. Now, for all default installations, that is no problem, but it would break installations with custom extensions on top of the old interface. A well-planned package system would take care of this.
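(A manifest along these lines could carry that information; the format here is entirely hypothetical, with "breaks" declaring incompatibility with consumers of the old interface:)

```json
{
  "module": "wiki",
  "version": "3.0.0",
  "requires": { "core": ">=2.4" },
  "provides": { "wiki-api": "3.0" },
  "breaks": { "wiki-api": "<3.0" }
}
```

An updater could then refuse, or defer, the Wiki 3 package on any installation whose custom extensions declare a dependency on wiki-api 2.x.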
To address the security question, you should look into using digital signatures on your patches. There are lots of good libraries available for public-key-based signatures, so just go with whatever seems to be the standard for your chosen platform.
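(For instance, with plain OpenSSL; file names are made up:)

```sh
# Vendor side: sign the update archive with the private key.
openssl dgst -sha256 -sign vendor_private.pem \
        -out update-3.0.0.tar.gz.sig update-3.0.0.tar.gz

# Client side: verify against the shipped public key before installing;
# any tampering with the archive makes this check fail.
openssl dgst -sha256 -verify vendor_public.pem \
        -signature update-3.0.0.tar.gz.sig update-3.0.0.tar.gz
```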
Not sure whether someone's said this, there are a lot of long responses here, and I've not read them all.
I think a better approach to your version control would be to have your CMS sit on its own in its own repository and each project in its own (or all of these could be subfolders within one repo, I guess).
You can then use its trunk (or a specific branch/tag if you prefer) as an svn:external in each project that requires it. This way, any updates you make to the CMS can be committed back to its repository, and will be pulled into other projects as and when they are svn updated (or the external is svn switch'ed).
As part of making this easier, you will need to make sure the CMS and the custom functionality sit in different folders, so that svn externals works properly.
I.e.:
project
project/cms <-- cms here, via svn external
project/lib <-- custom bits here
project/www <-- folder to point apache/iis at
(you could have cms and lib under the www folder if needed)
This will let you branch/tag each project as you wish. You can also switch the svn:external location on a per branch/tag basis.
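(Setting up the external is a one-off property change; the repository URL is hypothetical:)

```sh
# Run inside a working copy of the project root.
svn propset svn:externals 'cms https://svn.example.com/cms/trunk' .
svn commit -m "Pull the CMS in as an svn:external"

# From now on, a plain 'svn update' in the project also updates project/cms.
```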
In terms of getting changes live, I'd suggest that you immediately get rid of ftp and use rsync or svn checkout/exports. Both work well, the choice is up to you.
I've got the most experience with the rsync route, rsyncing an svn export to the server. If you go down this route, write some shell scripts; using rsync's -n flag, you can make a test script that shows which files it would upload without actually uploading them. I generally use a pair of scripts for each environment: one to test, and one to actually do it.
Shared-key authentication, so that you don't need a password to upload, may also be useful, depending on how locked down the server being given access needs to be.
You could also maintain another shell script for doing bulk upgrades, which simply calls the relevant shell script for each project you want to upgrade.
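(The per-project script pair could be as small as this; host, paths and repository URL are made up:)

```sh
#!/bin/sh
# deploy-test.sh: -n makes rsync a dry run, so this only REPORTS what
# would change on the server. The real script is identical minus -n.
svn export --force https://svn.example.com/project/trunk /tmp/project-export
rsync -avzn --delete /tmp/project-export/ deploy@www.example.com:/var/www/project/
```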
Have you looked at Drupal? No, not to deploy and replace what you have, but to see how they handle customizations and site-specific modules?
Basically, there's a "sites" folder which has a directory for every site you're hosting. Within each folder is a separate settings.php which allows you to specify a different database. Finally, you can (optionally) have "themes" and "modules" folders within sites.
This allows you to make site-specific customizations of particular modules and to limit certain modules to certain sites. As a result, you end up with sites where the vast majority of the code is perfectly identical and only the differences get duplicated. Combine that with the way it handles upgrades and updates, and you might have a viable model.
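(The layout looks roughly like this; the domains are placeholders, and each settings.php carries its own database definition, e.g. $db_url in Drupal 6:)

```
sites/
  default/
    settings.php
  client-a.example.com/
    settings.php   <- e.g. $db_url = 'mysqli://user:pass@localhost/client_a';
    modules/       <- modules available to this site only
    themes/
  client-b.example.com/
    settings.php
```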
Build into the code a self-updating process.
It will check for updates and run them when/where/how you have configured it for the client.
You will have to create some sort of a list of modules (custom or not) that need to be tested with the new build prior to roll-out. When deploying an update you will have to ensure these are tested and integrated correctly. Hopefully your design can handle this.
Updates are ideally a few key steps.
a) Backup so you can back out. You should be able to back out the entire update at any time, so that means creating a local archive of the application and database first.
b) Update Monitoring Process - Have the CMS system phone home to look for a new build.
c) Schedule Update on availability - Chances are you don't want the update to run the second it is available. This means you will have to create a cron/agent of some kind to do the system update automatically in the middle of the night. You can also consider client requirements to update on weekends, or on specific days. You can also stagger rolling out your updates so you don't update 1000 clients in 1 day and get tech support hell. Staggered roll-out of some kind might be beneficial for you.
d) Add maintenance mode to update the site -- Kick the site into maintenance mode.
e) SVN checkout or downloadable packages -- ideally you can deploy via svn checkout, and if not, set up your server to deliver svn-generated packages into an archive that can be deployed on client sites.
f) Deploy DB Scripts - Backup the databases, update them, populate them
g) Update site code - All this work for one step.
h) Run some tests on it. If your code has self-tests built in, it would be ideal.
Here's what I do...
Client-specific include path
Shared, common code is in shared/current_version/lib/
Site specific code is in clients/foo.com/lib
The include path is set to include from clients/foo.com/lib first, and then shared/current_version/lib
The whole thing is in a version control system
This ensures that the code uses shared files wherever possible, but if I need to override a particular class or file for some reason, I can write a client specific version in their folder.
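(In PHP terms, a minimal sketch; the absolute paths are placeholders:)

```php
<?php
// Front controller: client-specific code wins, shared code is the fallback.
set_include_path(
    '/var/www/clients/foo.com/lib' . PATH_SEPARATOR .
    '/var/www/shared/current_version/lib' . PATH_SEPARATOR .
    get_include_path()
);
// require/include (and include-path-based autoloaders) now look in the
// client folder first and fall back to the shared copy.
```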
Alias common files
My virtual host configuration will contain a line like
Alias /common <path>/shared/current_version/public_html/common
Which allows common UI elements, icons, etc to be shared across projects
Tag the common code with each site release
After each site release, I tag the common code by creating a branch to effectively freeze that point in time. This allows me to deploy /shared/version_xyz/ to the live server. Then I can have a virtual host use a particular version of the common files, or leave it pointing at the current_version if I want it to pick up the latest updates.
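(With Subversion, the freeze is a cheap server-side copy; the URLs are hypothetical:)

```sh
svn copy https://svn.example.com/shared/trunk \
         https://svn.example.com/shared/tags/version_xyz \
         -m "Freeze shared code at foo.com release"
# Check out /shared/tags/version_xyz on the live server and point the
# virtual host's Alias at it, or stay on current_version for the latest.
```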
Have you looked at tools such as Puppet (for system administration incl. app deployment) or Capistrano (deployment of apps in RubyOnRails but not limited to these)?
One option would be to set up a read-only version control system (Subversion). You could integrate access to the repository into your CMS and invoke the updates through a menu, or automatically if you do not want the user to have a choice about an update (which could be critical). Using a version control system would also allow you to keep different branches easily.
As people have already mentioned, using version control (I prefer Subversion for its functionality) and branching would be the best option. Another open source tool, available on SourceForge, is CruiseControl. It's amazing: you configure CruiseControl with Subversion in such a way that whenever code is modified or added in Subversion, CruiseControl knows automatically and runs a build for you. It will save you a hell of a lot of time.
I have done it the same way in my company. We have four projects that have to be deployed to different servers. I set up CruiseControl in such a way that any modification in the code base triggers an automatic build, and another script deploys that build to the server. You're good to go.
If you use a LAMP stack, I would definitely turn the solution's files into a package for your distribution and use it to propagate changes. For that matter I recommend Redhat/Fedora, because of RPM and because it's what I have experience with. Anyway, you can use any Debian-based distribution too.
Some time ago I made a LAMP solution for managing an ISP's hosting servers. They had multiple servers taking care of web hosting, and I needed a way to deploy changes to my manager app, because every machine was self-contained and had an online manager. I made an RPM package containing the solution files (PHP mostly) and some deployment scripts that ran with the RPM.
For automated updating, we had our own RPM repository configured on every server in yum.conf. I set up a crontab job to update the servers daily with the latest RPMs from that trusted repository.
Trust can be achieved too, because you can use the trust settings in RPM packages, like signing them with your key and accepting only signed packages.
Hm, could it be an idea to add configuration files? You wrote that a lot of small scripts are doing something. If you built them into the sources and steered them with configuration files, shouldn't that ease things?
On the other hand, having a branch for every customer looks like exponential growth to me. And how would you know in which areas you've changed something, and not forget to make those changes in all the other branches as well? That looks quite ugly to me.
It seems a combination of revision control, configuration options and/or deployment recipes would be a good idea...
With that many variations on your core software, I think you really need a version control system to stay on top of pushing updates from the trunk to the individual client sites.
So if you think Subversion would be tedious, you've got a good sense for what the pain points will be... Personally, I wouldn't recommend Subversion for this, since it's not really that good at managing & tracking branches. Although benlumley's suggestion to use externals for your core software is a good one, this breaks down if you need to tweak the core code for your client sites.
Look into Git for version control, it's built for branching, and it's fast.
Check out Capistrano for managing your deployments. It's a Ruby tool, often used with Rails, but it can be used for all sorts of file management on remote servers, even non-Ruby sites. It can get the content to the remote end through various strategies including ftp, scp and rsync, as well as automatically checking out the latest version from your repository. The nice features it provides include callback hooks for every step of the deploy process (e.g. so you can copy your site-specific configuration files, which might not be in version control), and a release log system, done through symlinks, so you can quickly roll back to a previous release in case of trouble.
I'd recommend a config file with the list of branches and their hosted location, then run through that with a script that checks out each branch in turn and uploads the latest changes. This could be cron'd to do nightly updates automatically.
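(A sketch of that idea; the config format, repository URL and paths are made up:)

```sh
#!/bin/sh
# branches.conf holds lines like: "client-a  deploy@hostA:/var/www/a"
while read branch target; do
  svn export --force "https://svn.example.com/cms/branches/$branch" /tmp/site
  rsync -az --delete /tmp/site/ "$target"
done < branches.conf
```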
