Best way to patch my local web applications - PHP

Hi, can anyone please guide me on this? I have multiple Raspberry Pis, each running a local instance of PHP and hosting a local web application. Every time I need to patch my code (PHP, HTML, jQuery, or CSS) I have to connect to each RPi remotely via TeamViewer. This has become problematic: I already have 100 RPis installed in different areas, and patching them with updated code is taxing when I have to use TeamViewer for each one.
Now I am willing to take a different approach. My plan is to have the owners of each Raspberry Pi click a link that automatically downloads the files and, where needed, overwrites the existing ones, so that I no longer have to connect to each RPi myself.

One way would be to use Git and webhooks: you push your changes and have them automatically pulled onto each server. The downside is that you need to set up Git on each server and configure the webhook script on each of them, which is quite a task when you already have 100 devices in the field.
Another option would be to use Ansible, which lets you push changes from a master Raspberry Pi to all the others. Again, some setup is required on each device, although you could write a setup script, put it on GitHub, for example, and have the owners run it once. That way, the next time you need to make changes, the owners don't need to do anything.
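To make the webhook route concrete, here is a minimal sketch of what the pull endpoint on each Pi could look like in PHP, assuming each device keeps the application in a local Git working copy and the web server user is allowed to run git; the file name deploy.php, the path /var/www/html, and the token are made up for the example.

<?php
// deploy.php -- hypothetical pull endpoint on each Pi (a sketch, not a drop-in solution).
// Assumes the web app lives in a Git working copy and the PHP user may run git.
$secret = 'change-me';                                // per-device shared token (assumption)
if (!hash_equals($secret, $_GET['token'] ?? '')) {
    http_response_code(403);
    exit('Forbidden');
}
// Fetch the latest code; 2>&1 captures error output for the log.
$output = shell_exec('cd /var/www/html && git pull origin master 2>&1');
file_put_contents(__DIR__ . '/deploy.log', date('c') . "\n" . $output . "\n", FILE_APPEND);
echo "Update finished:\n" . $output;

The link you send to the Pi owners would then simply point at this endpoint (with the token), so clicking it pulls the new files and overwrites the old ones in place.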

Related

How to check if a Raspberry Pi is connected to the internet from a website using PHP?

I am trying to make a web-based home appliance automation project, taking reference from here.
I am using:
RPi 3 Model B
Website hosted on a free hosting service (000webhost)
PHP
Python
I want to implement the following additional things:
Before enabling the ON and OFF buttons, I want to check whether the Raspberry Pi is on, online, and running the Python script that checks for incoming commands; only then should the ON and OFF buttons be active.
Since the webpage and the RPi communicate by writing to and reading from a text file, it is possible that the Pi turns off (due to a power failure), causing the appliance to turn off while the file still contains the status 'ON'. I want to check every 2-3 seconds whether the Pi is online and working, so that I can display on the website that the appliance is currently working, and if the Pi is turned off I can display "There is some problem with the Pi, check it out."
I searched the web for this but could not find any PHP-based solution for continuously checking the status of my Pi.
One suggestion was to use cron jobs, but no free hosting service provides cron jobs that repeat every 2-3 seconds; the minimum interval is 10 minutes.
Is there any way to continuously check from the website whether the Pi is online?
One idea is that the Pi itself could report its online status by writing to another file, but how would I make my webpage check that file continuously, and is there a better method?
I would appreciate any further suggestions to make this project better.
NOTE: I can only upload files to the website; I have no way to run commands on the server.
Not sure I entirely follow the architecture, but you could look at the modified date of the file. Another thing that comes to mind is to have the Pi write a timestamp to the file instead of "ON". Using that timestamp, you will know when the file was last written; if it is more than X old, assume the Pi is off. But you will have to make sure the clocks are all reasonably in sync.
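A rough PHP sketch of the timestamp idea, assuming the Pi overwrites a file such as heartbeat.txt with the current Unix time every few seconds (the file name and the 10-second threshold are just for illustration); the web page can poll this script via AJAX every 2-3 seconds and show the warning when it returns 'offline'.

<?php
// status.php -- sketch of the timestamp check (file name and threshold are illustrative).
$heartbeatFile = __DIR__ . '/heartbeat.txt';   // the Pi keeps overwriting this with time()
$maxAge = 10;                                  // seconds of silence before assuming the Pi is off

$lastSeen = is_file($heartbeatFile) ? (int) trim(file_get_contents($heartbeatFile)) : 0;
// Alternatively, use filemtime($heartbeatFile) to look at the file's modified date instead.

echo (time() - $lastSeen <= $maxAge) ? 'online' : 'offline';

As the answer notes, comparing timestamps only works if the Pi's clock and the server's clock stay reasonably in sync; checking filemtime() on the server side sidesteps that, since only the server's clock is involved.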

Team Environment Setup and Remote Working

We are working on a project which is getting bigger and bigger.
Until now our workflow has looked like this:
I have a web server and I code in Vim directly over SSH.
Another programmer sends me new files and I add and integrate them (he has a copy of the project and a local server).
The designer sends me the design and I integrate that as well.
This is largely a waste of time, because every time they make a change I have to integrate it.
Now we have hosting which doesn't support SSH (so I cannot use Vim there anymore). How should we work on a project like this? How should I set up Vim to work on a remote project (I don't want to download and re-upload every file I want to change)?
You didn't mention your OS; that would certainly be helpful if you want a precise answer.
The first thing to do would be to find a better host. You can rent very decent dedicated servers for as little as 40 or 50 €, or even less. If your project is big and serious, 50, 100, or even 200 €/month is perfectly acceptable, and you can install or enable whatever you need. Depending on the size of your project, a VPS could be enough. Whatever the price, a web host without SSH access is worse than useless.
But you may not have any say in that area.
Since your server doesn't support SSH, a proper VCS is not an option. The only practical solutions I see are rather "old-school" but they work:
Solution A:
Download the whole site on your local machine with an FTP client.
Edit locally.
Test your changes with a local web server.
Upload the changed files when your tests are OK.
Solution B:
Connect to your server with an FTP client.
Use its "Edit Locally" feature to open the files in Vim.
Write your changes, the file is automatically updated on the server.
Solution C:
Use Vim's bundled netrw plugin: :e ftp://host/path/to/file. See :h netrw.
Note that the process will always be download -> edit -> save -> upload, whether you notice it or not. Depending on the solution you choose, the process can be horribly repetitive and inefficient or almost completely invisible.
But seriously: get another host, use a VCS, and test on a local server.
I recommend using version control software like Git, with hooks over SSH that automatically upload changes to your server.
You can use a version control system such as Git and do a git pull on the web server every time you have a stable version.
Your collaborators can push the new content, and you don't need to manage the files yourself.

Deploy Content to Multiple Servers (EC2)

I've been working on a cloud-based (AWS EC2) PHP web application, and I'm struggling with one issue when it comes to working with multiple servers (all behind an AWS Elastic Load Balancer). On one server, when I upload the latest files, they're instantly in production across the entire application. But this isn't true when using multiple servers: you have to upload files to each of them, every time you commit a change. This could work if you don't update anything very often, or if you only have one or two servers. But what if you update the system multiple times in one week, across ten servers?
What I'm looking for is a way to 'commit' changes from our dev or testing server and have them 'pushed' out to all of our production servers immediately. Ideally the update would be applied to only one server at a time (even though it only takes a second or two per server), so the ELB stops sending traffic to that server while its files are changing and no production traffic is disrupted.
What is the best way of doing this? One of my thoughts was to use SVN on the dev server, but that doesn't really 'push' to the servers. I'm looking for a process that takes just a few seconds to commit an update and then begins applying it to the servers. Also, for those of you familiar with AWS, what's the best way to update an AMI with the latest updates so the auto-scaler always launches new instances with the latest version of the software?
There have to be good ways of doing this; I can't really picture sites like Facebook, Google, Apple, Amazon, or Twitter updating hundreds or thousands of servers manually, one by one, whenever they make a change.
Thanks in advance for your help. I'm hoping we can find a solution to this problem; what must be at least 100 Google searches by my business partner and me over the last day have been mostly unsuccessful.
Alex
We use scalr.net to manage our web servers and load balancer instances, and it has worked pretty well so far. We have a server farm for each of our environments (two production farms, staging, and sandbox) and preconfigured roles for the web servers, so it's very easy to launch new instances and scale when needed. Each web server pulls code from GitHub when it boots up.
We haven't completed all the deployment changes we want to make, but basically here's how we deploy new versions into our production environment:
We use Phing to update the source code and run the deployment on each web server. We created a task that executes a git pull and applies database changes (the dbdeploy Phing task): http://www.phing.info/trac/
We wrote a shell script that executes Phing and added it to Scalr as a script. Scalr has a nice interface for managing scripts.
#!/bin/sh
cd /var/www
phing -f /var/www/build.xml -Denvironment=production deploy
Scalr has an option to execute scripts on all the instances in a specific farm, so for each release we just push to the master branch on GitHub and run the Scalr script.
We want to create a GitHub hook that deploys automatically whenever we push to the master branch. Scalr has an API that can execute scripts, so this should be possible (a sketch of such a hook follows below).
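A hedged sketch of what that GitHub hook receiver could look like in PHP, assuming a shared webhook secret is configured on the GitHub side and the shell script above is reachable at /usr/local/bin/deploy.sh (both are assumptions); in a Scalr setup you would call the Scalr API here instead of running the script locally.

<?php
// hook.php -- sketch of a GitHub push-webhook receiver (secret and paths are assumptions).
$secret  = 'webhook-secret';
$payload = file_get_contents('php://input');
$header  = $_SERVER['HTTP_X_HUB_SIGNATURE_256'] ?? '';

// GitHub signs the payload with HMAC-SHA256; reject anything that does not match.
$expected = 'sha256=' . hash_hmac('sha256', $payload, $secret);
if (!hash_equals($expected, $header)) {
    http_response_code(403);
    exit('Bad signature');
}

$event = json_decode($payload, true);
if (($event['ref'] ?? '') === 'refs/heads/master') {
    // Run the deployment in the background; with Scalr, trigger the script through its API instead.
    shell_exec('/usr/local/bin/deploy.sh > /tmp/deploy.log 2>&1 &');
}
echo 'ok';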
Have a good look at KwateeSDCM. It enables you to deploy files and software on any number of servers and, if needed, to customize server-specific parameters along the way. There's a post about deploying a web application on multiple Tomcat instances, but the tool is language-agnostic and will work just as well for PHP, as long as you have SSH enabled on your AWS servers.

Updater of system in PHP for customers

.. and sorry for my English ..
I have something like an IS written in PHP and I would like to build an update system for my customers. Here is my vision:
I upload a new version to an FTP server (or web server).
After clicking Update, the system should compare versions (done), back up the old scripts (done), and then update in an rsync-like way: delete deleted files, replace changed files, and add new files and folders.
For rsync I would have to open an SSH hole to my server, and I don't want to do that. I found zsync, but it is designed for single files, not for a whole folder tree.
Is there any easy way to do this? Some smart Linux utility, or a ready-made PHP script?
Thanks for answers!
rsync can run on its own port, without any SSH involved at all: https://help.ubuntu.com/community/rsync#Rsync Daemon
But this assumes you're comfortable allowing the world at large to access your program. Do you want to restrict it to just your customers?
You could also publish your source via Git; git can run as a daemon (http://www.kernel.org/pub/software/scm/git/docs/git-daemon.html) without requiring SSH. Again, this assumes you're comfortable with the world at large getting your program.
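If you prefer to stay HTTP-only rather than exposing an rsync or git daemon, here is a rough PHP sketch of the "update like rsync" idea from the question: the update server publishes a manifest of file hashes next to the new version, and the customer's installation downloads only what changed. The URL, manifest format, and file names are all made up for the example, and backups and deletions are left out.

<?php
// update.php -- sketch of an HTTP-based updater (URL and manifest format are hypothetical).
// The update server is assumed to publish manifest.json: { "relative/path.php": "<sha1>", ... }
$baseUrl = 'https://updates.example.com/myapp/1.2.0/';
$appRoot = __DIR__;

$manifest = json_decode(file_get_contents($baseUrl . 'manifest.json'), true);

foreach ($manifest as $relPath => $remoteHash) {
    $localPath = $appRoot . '/' . $relPath;
    $localHash = is_file($localPath) ? sha1_file($localPath) : null;

    if ($localHash !== $remoteHash) {
        // New or changed file: download it and overwrite the local copy.
        $dir = dirname($localPath);
        if (!is_dir($dir)) {
            mkdir($dir, 0755, true);
        }
        file_put_contents($localPath, file_get_contents($baseUrl . $relPath));
    }
}
// Files present locally but absent from the manifest were deleted upstream and could be removed here.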

What is a good solution for deploying a PHP/MySQL site via FTP?

I am currently working on a web application that uses PHP and MySQL, but I do not have shell access to the server (I'm already working on that problem...). Currently I have source control with Subversion on my local computer, along with a local database that I make all my changes to. Once I've tested all the updates locally, I deploy the site manually: I use FileZilla to upload the updated files, then dump my local database and import it on the deployment server.
Obviously, my current solution is nowhere near ideal. For one major thing, I need a way to avoid copying my .svn files... Does anyone know what the best solution for this particular setup would be? I've looked into Capistrano a bit, and Ant, but both of them look like they would be a problem given that I do not have shell access...
I'm using Weex to synchronize a server via FTP. Weex is basically a non-interactive FTP client that automatically uploads and deletes files/directories on the remote server. It can be configured not to upload certain paths (like SVN directories) and to keep certain remote paths (like log directories).
Unfortunately I have no solution at hand to synchronize MySQL databases as well...
Maybe you could log your database changes in "SQL patch scripts" (or use complete dumps), upload those with Weex and call a remote PHP script that executes the SQL patches afterwards.
I use rsync in production, but you could do this:
Add a config table to your site that records which DB delta level you are currently at.
During development, store each set of SQL changes in a single file (I use something like delta_X-up.sql). These stay in your SVN as well. For example, if you are at delta_5 and add a table between the current release and the new one, all the SQL needed goes into delta_6-up.sql.
When it comes time to build, export the repo instead of doing a checkout. This lets you leave behind all the SVN cruft, since you won't need it in production.
Use Weex to push those changes into production (this is where I would use rsync, but you don't have that option). Call a remote script that checks your config DB to see what delta level you are currently at, parses the directory with your delta_X-up.sql files, and sees whether there are any new ones. If there are, it reads them and runs the SQL inside (a rough sketch of such a script follows below).
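A hedged sketch of that remote script, assuming the config table is called config, stores the current level in a delta_level row, and the patch files sit in a sql/ directory next to the script (all of these names are illustrative):

<?php
// apply_deltas.php -- sketch of the delta runner described above (table, column, and path names are assumptions).
$pdo = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'password', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// The delta level the database is currently at, as recorded in the config table.
$current = (int) $pdo->query("SELECT value FROM config WHERE name = 'delta_level'")->fetchColumn();

// Apply any delta_<n>-up.sql files newer than the current level, in order.
$next = $current + 1;
while (is_file($file = __DIR__ . "/sql/delta_{$next}-up.sql")) {
    // Naive split on ';' -- assumes the patch files contain no semicolons inside string literals.
    foreach (array_filter(array_map('trim', explode(';', file_get_contents($file)))) as $statement) {
        $pdo->exec($statement);
    }
    $pdo->exec("UPDATE config SET value = {$next} WHERE name = 'delta_level'");
    $next++;
}

echo 'Database is now at delta level ' . ($next - 1) . "\n";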
You can do a Subversion export (rather than a checkout) to a different directory from your working copy; that removes all the .svn stuff for you.
