Symfony 2 without SSH access - php

I have developed a small web app in Symfony 2 and Doctrine 2.
Can I deploy it to a web host that doesn't provide SSH access?
I ask because I see there are a lot of tasks that must be done from the terminal, like updating the database schema, creating symlinks for the assets, clearing the cache, etc.

It should not be a problem:
Create a copy of the system somewhere, ideally with the same DB connection parameters as the production system.
Run all the necessary tasks with the --env=prod parameter, if your DB settings allow it.
Clone the resulting database to the production system (with phpMyAdmin). To update an existing production schema instead, run app/console doctrine:schema:update --dump-sql locally and then run the generated SQL on the production server.
Copy all the files, excluding the contents of app/cache and app/logs.
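For illustration, the console part of this might look like the following on the local copy (a minimal sketch, assuming the standard Symfony 2 layout and a parameters.yml that mirrors production):

    php app/console cache:clear --env=prod
    php app/console assets:install web --env=prod
    # generate the schema SQL locally instead of running it over SSH
    php app/console doctrine:schema:update --dump-sql --env=prod > schema-update.sql
    # then run schema-update.sql on the production DB through phpMyAdmin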

I have done this many times with SF 1.4, and it should be just as easy with SF 2.
Some low-end hosts have restrictions that will cause issues for Symfony, so it's important to run the Symfony compatibility checker script first (you can upload it and then open its URL in your browser to get the output). Once that's done, follow these simple steps:
Copy over all the files for the project. I usually zip/tar the project folder, upload it, and unpack (see the sketch after these steps).
Export the database from your development environment and upload it to your new server.
Edit the config and update your database settings. If you have hardcoded paths somewhere in your code, now is the time to fix those as well.
Make sure that the user for Apache (or whatever server software your host uses) has full access to the cache and log directories. This can be tricky on some hosts; I have had to contact support in the past to have someone log in and change permissions.
In your web host's configuration tool, set the webroot for your site to the web folder in your project.
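As an illustration of the first two steps, the packaging on the development machine could look like this (a sketch; the project and database names are assumptions):

    # exclude cache/logs, which should be regenerated on the host
    tar czf project.tar.gz --exclude='app/cache/*' --exclude='app/logs/*' myproject/
    mysqldump -u dev_user -p myapp_dev > myapp.sql
    # upload both via the host's file manager or FTP; unpack the archive
    # and import myapp.sql through phpMyAdmin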

Maybe there is a way (with sftp for example), but it would be like trying to ride a bike with square wheels ;)

Related

How to get a local dev environment running from an existing and complex Magento 2 project

At work we took back our existing store running on Magento 2 from an external development agency. I need to get the project running in local development (with Docker).
I familiarized myself with a vanilla project from the official docs and managed to get it running by downloading the vanilla template with Composer, granting the proper permissions on files and folders, and running the magento setup:install command.
My question is: how does one go about kick-starting from an existing (production-running) project?
Do I need to run setup:install again? If I do, why?
What do I need to import from production to ensure any content or configuration created via the admin is also present on my local setup? Should I import the complete database from production?
I know our setup is using more than just PHP and MySQL, but env.php seems to list only the DB configuration and the admin URL. Where can I get the complete service configuration information about what our setup uses?
Anything else I am missing to get started with an existing project for local development?
As someone who is running Magento 2 on a local environment myself, hopefully I can shed some light on this.
If you have a direct copy of the live site, you do not need to run setup:install again.
Ensure you have a copy of the entire Magento 2 site (you can technically ignore the vendor folder, as you can run composer install and it will redownload those files, but that's up to you). Also get a copy of the entire database. Magento 2 is notorious for copying the same data to multiple tables, so something could break if you don't have everything.
What do you mean by "service configurations"? If you are referring to Magento 2 extensions, that data is saved in the database, not the env.php file. env.php is only for server-side configuration, such as the DB information, caching, and things of that nature. On mine, I use Redis for the site cache, so that is included in that file as well, as an example.
When you first unpack the site to your local environment, run composer update in the directory. This will ensure you have all the proper files installed. If you are going to run a local dev environment, set the mode to developer with the following command: bin/magento deploy:mode:set developer. This will allow you to make changes and view them by just refreshing the page, rather than flushing the cache all the time.
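To make these steps concrete, a typical bootstrap on the local copy might look like this (a sketch; the DB credentials and the magento.local hostname are assumptions; composer install stays on the versions pinned in composer.lock, while composer update, as suggested above, would also bump them):

    composer install                        # restore vendor/ from composer.lock
    bin/magento setup:upgrade               # re-sync module schema/data with the imported DB
    bin/magento deploy:mode:set developer   # changes show on refresh, no constant cache flushing
    # point the store at the local hostname instead of the production domain
    mysql -u magento -p magento -e "UPDATE core_config_data SET value='http://magento.local/' WHERE path IN ('web/unsecure/base_url','web/secure/base_url');"
    bin/magento cache:flush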
Eric has already answered all the questions correctly. I am also not sure what you mean by "service configurations"; if this is about third-party extensions/services, you can check the config.php file.

Handling file ownership issues in a PHP/Apache application

Env: Linux
The PHP app runs as "www-data"
PHP files in /var/www/html/app are owned by "ubuntu". Source files are pulled from a git repository; /var/www/html/app is the local git repository (origin: Bitbucket)
Issue: Our developers and DevOps would like to pull the latest sources frequently, and would like to initiate this over the web (rather than via PuTTY and running the git pull command).
However, since the PHP app runs as "www-data", it cannot run a git pull (as the files are owned by "ubuntu").
I am not comfortable with either alternative:
Running the Apache server as "ubuntu", due to the obvious security issues.
Making the git repository files owned by "www-data", as that makes it very inconvenient for developers logging into the server and editing the files directly.
What is the best practice for handling this situation? I am sure this must be a common issue for many setups.
Right now, we have a mechanism where DevOps triggers the git pull from the web (a PHP job running as "www-data" creates a temp file). A cron job, running as "ubuntu", reads the temp-file trigger and then issues the git pull command. There is a time lag between the trigger and the actual git pull, which is a minor irritant for now. I am in the process of setting up Docker containers, and have the requirement to update the repo running in multiple containers on the same host. I wanted to use this opportunity to solve this problem in a better way, and I am looking for advice on this.
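For reference, the polling mechanism described here can be as small as one crontab entry for the "ubuntu" user (a sketch; the trigger path is an assumption, and the one-minute poll interval is exactly where the time lag comes from):

    * * * * * [ -f /tmp/app-pull.trigger ] && rm -f /tmp/app-pull.trigger && cd /var/www/html/app && git pull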
We use Rocketeer and groups to deploy. Rocketeer deploys with the user set to the deployment user (ubuntu in your case) with read/write permission for it, and the www-data group with read/execute permission. Then, as a last step, it modifies the permissions on the web-writable folders so that PHP can write to them.
Rocketeer executes over SSH, so it can be triggered from anywhere, as long as it can connect to the server (public keys help). You might be able to set up your continuous integration/automated deployment to trigger a deploy automatically when a branch is updated/tests pass.
In any case, a setup where the files are owned by one user that can modify them and the web group can read them should solve the main issue.
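That owner/group scheme might look like this (a sketch; the web-writable paths are assumptions):

    # deploy user owns everything; the web group can read and traverse
    chown -R ubuntu:www-data /var/www/html/app
    chmod -R u=rwX,g=rX,o= /var/www/html/app
    # only the genuinely web-writable dirs get group write
    chmod -R g+w /var/www/html/app/uploads /var/www/html/app/cache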
If you are planning on using Docker, the simplest way would be to generate a new Docker image for each build that you can distribute to your hosts. The image build would simply pull the latest changes at creation time and never update itself. If a new version needs to be deployed, a new immutable image with the latest code is created and distributed.
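In shell terms, that immutable-image workflow could look like this (a sketch; the image name and registry are assumptions):

    # build once per release, tagged with the version being shipped
    docker build -t registry.example.com/myapp:1.4.2 .
    docker push registry.example.com/myapp:1.4.2
    # on each host: pull the new image and replace the running container
    docker pull registry.example.com/myapp:1.4.2
    docker stop myapp && docker rm myapp
    docker run -d --name myapp -p 80:80 registry.example.com/myapp:1.4.2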

How to move Symfony2 application from localhost to deployment server (LAMP CentOS)

I have developed a Symfony2 app on my local machine. I access it using PHP's built-in server, via http://localhost:8000/
Question: how do I move this application to my VPS (LAMP CentOS) so that it can be accessed like this:
http://example.com
and is it any different for installing it on:
http://subdomain.example.com
Please, if by any chance you can provide step-by-step instructions, it will be much appreciated and, I'm sure, useful to many others too. I'm used to working with stuff where you can simply move files from one place to another, update some config data, and it works (WordPress, coding without frameworks, etc.).
The best solution to me is to have your project in a versioning system like git or svn, without the vendors dir of course...
This way, you simply have to:
1) git clone your project into the prod dir
2) php composer.phar install to install your vendors
2b) Create the MySQL user with the correct login and password according to your parameters.yml
3) php app/console doctrine:database:create to create your database with the credentials you set up in MySQL
4) php app/console doctrine:schema:update --force to perform the database table creation
5) Test the project :)
If you are not using a versioning system, just upload your project to your server with an FTP client, without the vendors directory (it will be populated by step 2), then perform steps 3, 4, and 5!
For the subdomain part of your request, you have to create a subdomain folder on your server (by using Plesk if you have it) or manually create a vhost config to specify the subdomain path. I can't provide a full procedure right now (I'm writing this from my mobile device), but a rough sketch follows below.
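A minimal vhost for the subdomain could look like this (a sketch for Apache 2.4 on CentOS; the paths and names are assumptions):

    sudo tee /etc/httpd/conf.d/subdomain.example.com.conf <<'EOF'
    <VirtualHost *:80>
        ServerName subdomain.example.com
        DocumentRoot /var/www/subdomain.example.com/web
        <Directory /var/www/subdomain.example.com/web>
            AllowOverride All
            Require all granted
        </Directory>
    </VirtualHost>
    EOF
    sudo systemctl restart httpd    # on older CentOS: sudo service httpd restart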
[Install your environment]
So, first you need to install and run an Apache server. You will find here the minimum basics to get your application running under Apache2. Then make sure you have PHP 5 and MySQL up and running. Otherwise check:
Install and Configure MySQL Database Server
Installing and Configuring PHP
[Deploy your application]
Deploying can be a complex and varied task depending on your setup and needs - Symfony.com
It's then up to you to choose the right way to deploy your application; you can do it:
Using Basic File Transfer
Using Source Control
Using Build scripts
I would recommend using Capifony, which was built on top of Capistrano to adapt it to Symfony applications.
[Post-Deployment Tasks]
Your deployment process should be tailored to guarantee that all the required post-deployment tasks (like updating your dependencies, setting your application configuration files, clearing the cache, and dumping your assets) are executed.
To get the big picture, read the How to deploy a Symfony2 application article in the Cookbook.
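For instance, a first deployment with Capifony boils down to a few commands on the development machine (a sketch; deploy.rb still needs your server and repository details):

    gem install capifony
    capifony .              # generates the Capfile and app/config/deploy.rb
    # edit app/config/deploy.rb with your server and repository, then:
    cap deploy:setup        # prepares the directory structure on the server
    cap deploy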

Multiple server update architecture

I'm using AWS for my application. The current configuration is:
Load balancer --> multiple EC2 instances (auto-scaled) --> all mount an NFS drive with SVN sync
Every time we want to update the application, we log in to the NFS server (another EC2 instance) and run svn update in the application folder.
I wanted to know if there is a better way (architecture) to work, since I sometimes get permission changes after an SVN update, and the servers take a while to update. (I have thought about mounting S3 as a drive.)
My app is PHP + the Yii framework + a MySQL DB.
Thanks for the help,
Danny
You could use a slightly more sophisticated solution:
Move all your dynamic directories (protected/runtime/, assets/, etc.) outside the SVN directory (use svn:ignore) and symlink them into your app directory. This may require a configuration change in your webserver to follow symlinks, though.
/var/www/myapp/config
/var/www/myapp/runtime
/var/www/myapp/assets
/var/www/myapp/htdocs/protected/config -> ../../config
/var/www/myapp/htdocs/protected/runtime -> ../../runtime
/var/www/myapp/htdocs/assets -> ../assets
On deployment, start with a fresh SVN copy in htdocs.new, where you create the same symlinks and can fix permissions:
/var/www/myapp/htdocs.new/protected/config -> ../../config
/var/www/myapp/htdocs.new/protected/runtime -> ../../runtime
/var/www/myapp/htdocs.new/assets -> ../assets
Move htdocs to htdocs.old and htdocs.new to htdocs. You may also have to HUP the webserver.
This way you can completely avoid the NFS mount, as you have time to prepare steps 1 and 2. The only challenge is to synchronize step 3 on all machines. You could, for example, use at to schedule the update on all machines at the same time (see the sketch below).
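Scripted, the deployment could look like this (a sketch; the repository URL is an assumption):

    svn export http://svn.example.com/myapp/trunk /var/www/myapp/htdocs.new
    # recreate the symlinks into the dynamic dirs and fix permissions
    ln -s ../../config  /var/www/myapp/htdocs.new/protected/config
    ln -s ../../runtime /var/www/myapp/htdocs.new/protected/runtime
    ln -s ../assets     /var/www/myapp/htdocs.new/assets
    chown -R www-data:www-data /var/www/myapp/runtime /var/www/myapp/assets
    # the swap is the only part that must run on all machines at once
    mv /var/www/myapp/htdocs /var/www/myapp/htdocs.old && mv /var/www/myapp/htdocs.new /var/www/myapp/htdocs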
As a bonus you can always undo a deployment if something goes wrong.
The situation might be more complex if you have to run migrations, though. If the migrations are not backwards-compatible with your app, you probably can't avoid some downtime.

php, user-uploaded files, version control, and website deployment

I have a website that I regularly update the code to. I keep it in version control. When I want to deploy a new version of the site, I do an export and then symlink the served directory name to the directory of the deployment.
There is a place where users can upload files, and I noticed once that, after I had deployed a new version, the user files were gone! Of course, I hadn't added them to the repository, and since the served site came from an export, they weren't uploaded into a version-controlled directory anyway.
PHP doesn't yet have integrated svn functionality, so I couldn't do much programmatically with the user-uploaded files. My solution was to create an additional website, files.website.com, which sits in a directory parallel to the served website and is served out of a directory that is under version control. That way the files don't get obliterated when I upgrade the website. From time to time, I manually add uploaded files to the svn project, delete user-deleted ones, and commit the new version. I'm working on a shell script to run from cron to do this, but it isn't my forte, so it's on the back burner, as it's not a pressing need.
Is there a better way to do this?
I usually don't keep user-generated data/files in svn; only the code and the DB schema/test data. What I usually do to deploy is an rsync from an up-to-date working copy, which excludes the upload dir and the .svn dirs. IMO content should be handled by more traditional filesystem/DB backup mechanisms and not version control.
EDIT:
Just to be clear, your symlinking strategy seems like good practice; you're just missing the backup part, I think. I'd probably just tar | gzip the uploaded stuff in the cron job instead of interacting with SVN, and then have a separate job that uses mysqldump to dump the DB and gzips that as well.
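The rsync-plus-backup approach could look like this (a sketch; the paths, host, and credentials are assumptions):

    # deploy: push a clean working copy, leaving uploads and svn metadata alone
    rsync -av --delete --exclude='.svn' --exclude='uploads/' ./working-copy/ deploy@example.com:/var/www/site/
    # nightly backup script (run it from cron): archive user content and the DB
    tar czf /backups/uploads-$(date +%F).tar.gz /var/www/site/uploads
    mysqldump -u backup_user -p'secret' sitedb | gzip > /backups/sitedb-$(date +%F).sql.gz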
I would continue with the practice of exporting the site as upgrades are needed, but have a symbolic link to a directory outside of the version-controlled tree holding the user-uploaded content. This way, when you do an export you only need to recreate the symlink if it gets blown away. You should then, of course, be backing up that user content as needed.
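In practice that is just one directory and one link to recreate after each export (a sketch; the paths are assumptions):

    mkdir -p /var/www/shared/uploads
    # -sfn replaces any stale link left over from the previous export
    ln -sfn /var/www/shared/uploads /var/www/site/current/uploads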
Rather than manually doing the export and managing symlinks, you could make your deployment directory a subversion checkout (from a production branch). That way, deploying is as simple as checking in your updates to the production branch.
This works as long as you have sufficient control of your subversion server and hosting setup, and your subversion repository is "ready to run." In this situation, your user directory could be an empty placeholder in subversion and would be left alone by the update process that runs on commit (business as usual for svn update). I'd still recommend (as mentioned by @Flash84x and @prodigitalson) a separate process to back up the user content.
There's an Ars Technica article with a description of how to set this up.
Update: If you follow this approach, make sure that your web server does not allow access to the .svn files in the deployment checkout.
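A minimal way to block that on a Debian/Ubuntu-style Apache 2.4 setup (a sketch; the conf path and helper commands are distribution-specific assumptions):

    sudo tee /etc/apache2/conf-available/deny-svn.conf <<'EOF'
    <DirectoryMatch "/\.svn">
        Require all denied
    </DirectoryMatch>
    EOF
    sudo a2enconf deny-svn && sudo service apache2 reload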
