I am developing an application with Ruby on Rails (with more to come) and it has got to the stage where I would like to remotely host a development version, followed by a production one. I have done lots of Rails development, but I am reviewing my current setup and would like to make sure I do things the industry-recommended way from now on.
I already have a dedicated server which is running Parallels Plesk and has several domains on it. I have had some success so far by creating a new user "passenger" to run the Rails app and deploying via that user to an apps directory under
/var/www/vhosts/myrailsapp.com/subdomains/dev/
which is the Parallels format for site directories, deploying using Capistrano and running the Passenger module for Apache. I have basically been putting my Rails files where I would put them if it were a plain PHP site or similar, and I was wondering if this is the way things are usually done?
I also found some information online which suggests putting my Rails apps under
/var/apps/
or similar, but then it would conflict with the Parallels Plesk way of doing things, which could potentially cause issues. Or could it?
I have already looked at solutions like Heroku and they won't quite work, as I need to run other programs alongside my Rails app on the same server to handle some real-time server-to-server syncing of files uploaded through the app. Added to this, I ideally need to be able to host normal PHP applications alongside my RoR ones to make the best use of the server.
How should I ideally go about implementing this sort of setup for secure hosting and deployment? If need be (i.e. my current setup is far less than ideal), you could assume I am starting from a vanilla Ubuntu server install, which I would be open to if it produced a nicer system to manage.
I figured many people would have had similar situations and so any advice from any of you veteran Rails/PHP developers or server admins would be greatly appreciated.
Many Thanks,
Peter
Normally it's a bad idea to put your Rails project files anywhere in your public HTML space, because you don't want anybody to be able to request something like http://yoursite.com/config/database.yml and access sensitive information. Even if that's not possible under normal circumstances, it could still happen if you have problems with Passenger starting up correctly, or something similar.
So I would recommend putting your Rails apps in /var/apps or /srv/apps (as we've done) and setting up the Apache config to point your domain or subdomain to that directory.
If you want to have your app accessible under a subdirectory of an existing domain, it takes some additional setup, but that can also be done.
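For illustration, a minimal Apache vhost for Passenger along these lines might look like the sketch below. The domain, the /var/apps path, and the Capistrano-style "current" symlink are assumptions based on the question, not a known-good config for your server:

    <VirtualHost *:80>
        ServerName dev.myrailsapp.com
        # Passenger serves the app whose public/ directory is the DocumentRoot
        DocumentRoot /var/apps/myrailsapp/current/public
        <Directory /var/apps/myrailsapp/current/public>
            Options -MultiViews
            Allow from all
        </Directory>
        # run this vhost's app in the development environment
        RailsEnv development
    </VirtualHost>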
I have been developing a website that uses Laravel (v6) on the backend and Nuxt.js (v2) on the frontend. The idea was for Laravel to act as an API & OAuth2 server that also server-side rendered the Nuxt.js app. From my research, it seemed like this was not only a common route, but also not too much hassle to implement.
While developing, I have kept the backend and frontend as completely separate projects with their own git repos and all that jazz. This is my first time deploying/developing a project like this, where there are two completely separate applications for the backend and frontend, so all this is very new and a little challenging at times. When it came time to deploy them, I always imagined that I would somehow merge the projects and be able to set up Laravel to server-side render the Nuxt.js app. However, I am now at that stage and trying to merge them with great difficulty.
Currently I am using the "laravel-nuxt" composer package and the "laravel-nuxt" npm package in an attempt to connect the projects in one repo. However, I am having difficulty doing this. I've searched far and wide for a good resource on this process and have yet to find one that explains it thoroughly. I even purchased a course on Udemy on the topic, only to find out they didn't merge the projects! They deployed Nuxt to Firebase and didn't even cover the deployment of Laravel.
Anyway, this is my question(s): should or could I keep the projects separate and have two completely separate deployments? Or rather, if I keep them separate, how do I deploy Nuxt in a way that still gets server-side rendered? To me it doesn't matter if they are separate or together, but the most important part is that the Nuxt app utilizes SSR (server-side rendering) for SEO purposes. So am I on the right track? Should I keep these projects separate, or should I continue trying to merge them?
Sorry if this is unclear; I am rather frustrated and kind of losing my mind. I would really appreciate any feedback or a pointer in the right direction. Thank you for your time in reading this, and I otherwise hope you have a good day :)
I recently developed something with a similar structure, Nuxt.js frontend and Directus CMS as backend.
I kept the backend and frontend as separate repositories and also deployed both separately. The reason I decided to do it that way was that both need different packages on the server side and use different ecosystems.
The frontend needs only Node.js; the backend needs a web server, a database, and PHP. I think these should not be mixed.
For the backend I used my existing server, where I already have stuff running like Nextcloud or a blog behind an nginx web server.
For the frontend I used Dokku, which I can only recommend for deploying Node.js apps. Nuxt.js has instructions on how to deploy to it.
Most important for you: SSR is done by Nuxt.js itself, so you don't need a separate web server for that. Just build it and use npm start. Depending on your installation/deployment, you may have to use nginx as a proxy to avoid calling the app with a port number; that is another thing Dokku does for me automatically, as long as the app respects the PORT environment variable.
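For illustration, a minimal nginx proxy block for a Nuxt app listening on port 3000 might look like this (the server name and port are assumptions):

    server {
        listen 80;
        server_name example.com;

        location / {
            # forward requests to the Node.js process started with `npm start`
            proxy_pass http://127.0.0.1:3000;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }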
Good day to you all,
I am currently developing a project on Laravel. So far I have always developed online, directly editing my files on the web server through FTP (using PSPad or similar simple editing tools).
What I want to do now (and what I believe most people actually do) is set up a (W)LAMP stack on my local machine and program locally. However, it is a little unclear to me how to keep my local code (including databases) in sync with the live website. How do you folks do that? I know there are probably lots of ways and tools to do that, but what would be your advice for a best practice? Any advice would be very welcome :)
What many companies do is build offline, then push their edits up to a server using git.
I'm no expert on the software, so I'll describe what you do in a basic form:
My advice would be to create an online repo (repository) to store your project while you edit/update.
There are several git project management systems, such as GitHub or Bitbucket. I personally use Bitbucket.
What git does: when you have built or added what you need offline on your local (W)LAMP stack, you then git push the changes up to your repo or server. The changed files then get merged with what already exists on the repo or the server. If you'd like the most recent version of your project, you simply git pull it down.
Read the full documentation here to see the wide range of options available when using git.
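As a rough illustration of that cycle (the repository URL and branch name are placeholders):

    # one-time setup: link your local project to the remote repo
    git init
    git remote add origin git@bitbucket.org:youruser/yourproject.git

    # daily cycle: record your changes, then push them up to the repo
    git add .
    git commit -m "Describe what you changed"
    git push origin master

    # on the server (or another machine), pull down the latest version
    git pull origin master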
We have a settings array within our platform available as $res::Config.
At runtime, a variable is changed from 'dev' to 'live' after checking the HTTP host (or the IP address).
Within our framework bootstrapping, depending on the value of $res::Config->$env (the environment set previously as either dev or live), the settings for the database connection are set. You store these settings in the Config array as db_live or db_dev.
However you do it, use an environment variable to figure out whether you want live or dev, and set up an array of settings accordingly.
We also have sandbox and staging for intermediate development stages.
As for version control, use Git or Subversion.
Edit: Alternatively, within the vhost file you can set an environment variable as either live or dev, and have your application read from this accordingly. I'd suggest this approach :)
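A minimal PHP sketch of that approach; the variable names, the SetEnv line, and the connection details are illustrative, not the actual code from our platform:

    <?php
    // Set in the Apache vhost with e.g. "SetEnv APP_ENV live"; default to dev
    $env = getenv('APP_ENV') ?: 'dev';

    // One block of settings per environment, keyed db_dev / db_live
    $config = array(
        'db_live' => array('host' => 'db.example.com', 'user' => 'app',  'pass' => 'secret'),
        'db_dev'  => array('host' => 'localhost',      'user' => 'root', 'pass' => ''),
    );

    // Pick the connection settings for the current environment
    $db = $config['db_' . $env];
    $pdo = new PDO("mysql:host={$db['host']};dbname=myapp", $db['user'], $db['pass']);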
There are a number of ways of doing this. But this is a deceptively HUGE question you've asked.
Here is some good practice advice - go and research these items, then have a look at my approach.
Typically you use a process called version control, which allows you to create "versions" or snapshots of your system.
The commonly used "SVN" software is good, but the new (not really, any more) kid on the block is Git, and I personally recommend that.
You can use this system to push the codebase live in a controlled fashion. While pushing the files up is essentially similar to FTP, it allows you to put a specific version of your site live.
In environments where there are multiple developers, this is ideal: you can compare/test and work around each other, and version control tends to prevent errors between devs.
So - advice part 1: Look up and understand version control, then use it to release CODE to the live environment.
Part 2: I use database dumps and pull them back to my machine to work with.
If the live database needs updating, I can work locally and simply export, then re-import on the live system.
For example, on a recent Moodle project I worked on, refreshing the whole database took seconds... I could push a patch and a database update in a few minutes.
However: you should think about maintenance and scheduling... if the site is live and has ongoing data changes then you need to be careful with this. Consider adding a maintenance page.
Advice 2: go research SQL dump/export and importing.
I personally use phpMyAdmin to dump and re-import, as it's very convenient.
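The command-line equivalent of that dump/re-import cycle looks something like this (the database names and user are placeholders):

    # export the live database to a file
    mysqldump -u dbuser -p live_db > dump.sql

    # import the dump into the local (or live) database
    mysql -u dbuser -p local_db < dump.sql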
Advice 3: Working locally then pushing live is MUCH BETTER PRACTICE. You're starting down a much safer and better road than you're on!
Hope that helps... but bear in mind - this is a big subject, so you'll need to research a fair bit.
I need to know what my options are when deploying CodeIgniter/Laravel apps.
I develop locally all the time at home, and when I go to work I need some quick way to push all the changes to the server.
Application code needs to be updated, database schemas need to be migrated, and application servers must be restarted.
I do all of this manually, wasting a lot of time, and I need some automated way, kind of like Capistrano in the Rails environment.
What are my options here?
Update:
I got my own server machine, and everything needs to work in an intranet environment without an internet connection.
I've used the following:
Salt - http://www.saltstack.org/
Worked well, a bit fiddly to set up. Super fast deployment. Lots of control. Less learning overhead than Puppet & Chef, and it has some level of native MySQL tooling.
GitHub
Requires an internet connection to/from your machine - one where, at some level, the endpoint has write permissions to interactive scripts... Works, but makes me nervous. Pulls are better than pushes, and it's better than most other solutions.
Custom shell scripting
Yeah - this is the most common: just tar up the entire CI dir once it's been validated on staging and push it out using Salt... (a bare-bones sketch follows this list).
Scalextreme
We've been looking at this for a few months - the interface is from the 1990s, but it's got really nice functionality, including a system-independent script library that you can target at any machine.
Turnkey Linux
The hammer - this will migrate an entire system image from a desktop to EC2 in something like 5 minutes. Works great, and you can also move stuff between VM systems. In the end, I think that updating AMIs on EC2 is so easy that this might be one of the answers...
Nothing has truly been satisfactory, and DB schema changes are a huge pain. So much so that for client configs we're moving from MySQL to Cassandra, which is basically schemaless. The CI installer is interesting, but I'm not sure how it handles updates.
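For the custom shell scripting route mentioned above, a bare-bones sketch might look like this. The host, paths, and exclusions are assumptions, and it works over an intranet since only SSH access is needed:

    #!/bin/sh
    # package the validated app, leaving out logs and local-only config
    tar czf release.tar.gz --exclude='*.log' --exclude='application/config/database.php' .

    # copy the archive to the server and unpack it into the web root
    scp release.tar.gz deploy@yourserver:/tmp/
    ssh deploy@yourserver 'tar xzf /tmp/release.tar.gz -C /var/www/yourapp && rm /tmp/release.tar.gz'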
I recently came across this CodeIgniter Installer on GitHub. I've played around with it a few times and it works like a charm for me.
It's as simple as putting it in your root directory (alongside your system folder), generating a MySQL dump, and editing a few files. Full instructions are here
I hope it works for you as well as it did for me.
I found this Laravel Installer on GitHub, which might be useful. (I first came across this question after searching for a Laravel installer on Google, then searched GitHub for one.)
We are a small team developing PHP applications in a LAN. Both on Mac and PC.
Individual developers check out and edit source code on their own machines, on which Apache is running. Local testing is then done over localhost.
For the DB, the application connects to a common MySQL installation, on a dedicated machine in the LAN. This works quite well because we rarely make (destructive) changes to the DB schema. This means that all the individual applications running access the same test data.
But uploaded files remain a problem: they are only uploaded to the dev's local machine, although a reference to them is stored in the central DB. This means that the other team members may be shown a broken link for a user-uploaded image that physically exists only on one dev's local machine.
The ideal solution would be to have the entire persistence layer on a central machine. Any ideas on how best to achieve this?
Map a network folder, or use a service like Dropbox or similar. A local DB is nice to have, though, and doesn't take up too many resources.
Basically you want some sort of shared filesystem. There are lots of options: a samba share, an NFS-mounted directory, Dropbox (or a similar service), etc. I would suggest looking into the available options to see what suits your infrastructure best.
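For example, on the Linux/Mac side, mounting an NFS export from a central machine over the uploads directory might look like this (the hostname and paths are placeholders):

    # on each developer machine: mount the shared uploads directory
    sudo mount -t nfs fileserver.lan:/exports/uploads /var/www/myapp/uploads

    # to make it permanent, add a line like this to /etc/fstab:
    # fileserver.lan:/exports/uploads  /var/www/myapp/uploads  nfs  defaults  0  0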
I have dealt with this once, and what I did was use our NAS as the storage for everything. We developed our website on the NAS itself over FTP. It was like cloud development: only the IDE was on our side; every file we edited, every image we uploaded and so on was on the NAS. The website itself was also running on the NAS (since the NAS has the ability to act as a server and run MySQL).
The NAS was turned from network storage into an actual local server.
From how I understood your question, I assumed you only needed to share uploaded resources (like images) and not develop in the same app at once.
First I suggest that you DON'T develop on the production machine at all.
Now, here are 2 ways you can do it:
Modular Development:
You develop independently. You don't touch each other's code; you develop features separately from each other. That way, you won't be stepping on each other's toes. This also promotes "loose coupling", which in layman's terms means "when one feature breaks, the others won't".
You should check out this video on how you can break down your development into "modules". It is in JS, but the architecture can still apply.
Version Controlled Development
Break your development into 3 layers:
Production (aka Stable) is the code that is public. You don't develop or touch code here. You only publish the code when it has been tested thoroughly. Also, this is NOT the public server; this is just the public code. However, what lives here is an exact replica of the public site.
Testing (aka Beta) is where you test your developed code. This system is for testing purposes only. You don't touch the code here either; you are just here to find bugs on your own. It's your "Quality Assurance" layer. This layer is also where your code merges (discussed later).
Development (aka Alpha) is where you touch your code. Here, you share your code, test it, break it, try new features, and fix the bugs you found in Testing.
As you can see, you don't break your systems due to overwriting, broken links, etc.
Now, your development strategy: use a version control system like Git (distributed) or SVN (centralized) and create 3 branches according to the layers above. This example uses a distributed approach (I prefer it).
Assign a "maintainer/ring master" in your group who consolidates your work and publishes it to testing. What this maintainer does is to collect your "finished" developed code and puts it into his testing branch. anyone can then clone his testing branch to your testing branch to test your code. Whatever bug they/you find, you refine in the development and submit it to him again. only after that feature is quality assured, then the maintainer can publish it to the stable where he clones it to the public server.
After all that's done and when you have moved on, you just clone the stable branch to your development branch and you start anew. Now you have a fresh canvas to play with.Overwriting is handled by the version control system, and the maintainer. you need not worry about that.
as for resources, you would not want to bog down your local development system with arriving resources from the public server. version control systems also have "ignore lists" to prevent you from cloning some resources. clone only what's necessary. if you are developing a weather widget, you only need images for weather widget. you don't need images from the other widgets (unless neccessary)
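A rough sketch of that flow in Git, with the branch names following the layers above (master standing in for Stable); the commands are illustrative, not a full workflow:

    # create branches for the Testing and Development layers
    git branch testing
    git branch development

    # developer: work on development, then hand off to the maintainer
    git checkout development
    git commit -am "Add weather widget"
    git push origin development

    # maintainer: merge finished work into testing for QA...
    git checkout testing
    git merge development

    # ...and, once quality assured, into master (Stable)
    git checkout master
    git merge testing

    # keep user-uploaded resources out of the repo ("ignore list")
    echo "uploads/" >> .gitignore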
Is it considered best practice to develop your php website directly within the htdocs folder?
The advantages, obviously, are that you can make a quick edit, navigate to localhost and instantly view the result.
When developing ASP.NET applications in Visual Studio, we usually publish our changes to IIS from the "development folder" and don't usually develop directly within 'inetpub' itself. Is there something similar for PHP development, or is developing within htdocs just fine?
As long as the 'htdocs' folder is on a test server (or at least in a test folder, if you don't have a test server), it's generally considered fine. Like you say, it provides more instant feedback.
For ASP, there are better solutions, like you have already stated.
Our company has done it like this for 10 years. Nothing bad has happened.
As we can see, PHPEclipse, Zend Studio, and MyEclipse (for Java web apps) all use this kind of approach. After all, we need a web server to test and debug, and the folder can be under version control at the same time; the two don't conflict, and it's convenient.