What's the best way to keep a backup of the code and DB? Is downloading it locally via HTTP a good option?
I fear it is a security risk, since an attacker might get access to the backup.
I'm looking into compressing the files and then encrypting the compressed archive.
But I don't know which encryption I should use, or whether a Linux CLI tool is available for password-protected encryption.
Thanks
Arshdeep
The community over at Hacker News raves about Tarsnap. As per the site:
Tarsnap is a secure online backup service for BSD, Linux, OS X, Solaris, Cygwin, and can probably be compiled on many other UNIX-like operating systems. The Tarsnap client code provides a flexible and powerful command-line interface which can be used directly or via shell scripts.
Tarsnap is not free, but it is extremely cheap.
If you're worried about transport, use SSH. I tend to use replication over an SSH tunnel to keep a MySQL database in sync, and a backup of the version control server (which is not the same as the deployment server) is copied with rsync over SSH. If you want to encrypt files locally you can use gpg. That of course won't work in tandem with database replication; in that case you'd have to dump or snapshot your database at regular intervals instead.
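For the compress-then-encrypt route the question asks about, a minimal sketch with tar and gpg's symmetric (passphrase-based) mode could look like the following. File names and the passphrase are placeholders; interactively you would normally let gpg prompt for the passphrase instead of passing it on the command line.

```shell
# Stand-in for the nightly dump (placeholder name and content).
echo "-- MySQL dump" > datefile.sql
tar czf backup.tar.gz datefile.sql

# Passphrase-based AES-256 encryption; --batch and --passphrase make it
# scriptable for cron (interactively, omit them and let gpg prompt).
gpg --batch --yes --pinentry-mode loopback --passphrase "change-me" \
    --symmetric --cipher-algo AES256 backup.tar.gz    # writes backup.tar.gz.gpg

# Round-trip check: decrypt and compare against the original.
gpg --batch --yes --pinentry-mode loopback --passphrase "change-me" \
    --decrypt backup.tar.gz.gpg > restored.tar.gz
cmp backup.tar.gz restored.tar.gz
```

Anyone with the passphrase can decrypt the archive, so treat it like any other server credential.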
Your question doesn't quite make sense as written.
If you download locally, you don't go over public networks, so interception is not an issue.
Unless you simply meant "download" — but then the question is: download what?
On the other hand, securing the upload (for the initial setup and for ongoing maintenance) is just as important.
Securing resources such as your code repository and database is critical, but if you have SSH access to your server you already have an encrypted tunnel established, and transferring files over that tunnel (scp) is quite secure. If you're paranoid (or required to), you can also restrict the SSH server to protocol version 2 only.
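Restricting the server to SSH protocol 2 is a one-line setting in the server config (the path may vary by distribution):

```
# /etc/ssh/sshd_config
Protocol 2
```

Note that recent OpenSSH releases have dropped protocol 1 support entirely, so this line only matters on older installations.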
Related
I'm using the ssh2 PHP extension to run commands from one server on another; briefly put, I use it as an API.
My question is: can the SSH request be intercepted, or is it at risk of being hacked?
If the SSH request can be intercepted, I want to know how.
Yes, of course it can be intercepted, but it can most likely only be sniffed, without the attacker gaining much of value.
Every connection can be intercepted if the channel itself is not secured.
However, it doesn't matter much, because an attacker would still need the private key in order to access whatever server you're connecting to through SSH.
You can find more information here:
If a MITM has your public key and you are SSH-ing through the MITM, what is the maximum attack it can perpetrate?
Can an ssh key login to a secure remote server be compromised when on a network run by a bad actor
The SSH extension will use a system library to manage the connection, so it should be as secure as the shell ssh command.
A more important security concern is that you are giving the user running PHP permission to log in to the remote server and operate there. This means that if your web app is compromised, the attacker will be able to gain access to the other server as well.
The whole point of SSH is to protect against eavesdroppers. SSH traffic is encrypted, and the key is known only to the sender and the recipient through the magic of Diffie-Hellman key exchange.
Certainly some algorithms are better than others. If your SSH server supports it, you'd be better off using ChaCha20 instead of, say, arcfour — though I honestly don't know whether libssh supports ChaCha20 either.
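You can check which ciphers your own OpenSSH build supports by querying the binary directly (on recent versions ChaCha20 shows up as chacha20-poly1305@openssh.com) — a small sketch, assuming the OpenSSH client is installed:

```shell
# List the ciphers this ssh client was built with.
ssh -Q cipher > ciphers.txt
cat ciphers.txt
```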
Welcome,
I've got a quick question about PHP development environments. Specifically, I would like to use a package (maybe XAMPP or WAMP) at home for PHP development. I am mostly worried about the security aspects of these two programs (or any programs you suggest). Would they be safe for home use while connected to the internet? Are there any security measures that can be used to disable remote access, so that only my own PC can access and control them? Additionally, what software would you suggest for PHP development (MySQL, Apache)?
Would they be safe for home use while connected to the internet? Are
there any security measures that can be used to disable remote access,
so that only my own PC can access and control them?
If you don't forward any ports and make sure to use a firewall on the computer (Windows Firewall should be enough in your case) in order to block all unwanted traffic coming to your PC, that shouldn't be too much of an issue.
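If you want belt and braces on top of the firewall, you can also bind Apache and MySQL to the loopback interface only, so nothing outside your PC can reach them at all. A sketch of the relevant config lines (file locations vary between XAMPP/WAMP installs):

```
# Apache (httpd.conf): accept connections from this machine only
Listen 127.0.0.1:80

# MySQL/MariaDB (my.ini or my.cnf):
[mysqld]
bind-address = 127.0.0.1
```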
Additionally, what software would you suggest for PHP
development (MySQL, Apache)?
This is too broad a question; generally, just use whatever you're comfortable with. If you don't know what that is yet, start with the default configuration. For instance, XAMPP uses MariaDB and Apache.
Looking for suggestions on the best way (or the feasibility) of implementing offsite backup of the data in my PHP app.
We have a PHP app that runs on the client's local site and dumps the MySQL data to a datefile.sql file each night. What's the best way to get this moved to an external server?
We host a server that we currently FTP files to manually each morning. How best can we automate this? Would we need to hard-code FTP credentials? And if we had multiple clients, how could we separate things out so that no hard-coded credentials are needed?
The ideal situation would be a MySQL instance running on the external server that the client's local server replicates data to on the fly (and back, if required). I'm not even sure that's possible?
Happy to try and explain further if needed.
Thanks,
Any ideas?
You could create a bash script on your server, called by cron at night, that uses rsync to fetch the SQL file from the clients' servers (if you have an SSH connection to them) and restores it on your own machine.
You can achieve this using cron: just create a cron job and schedule it to run when you need it. For the file-transfer hassle, you can use rsync (which also has ways to transfer only the changed data, among other things).
Also, I think MySQL has a built-in feature for replication and backups, but I'm not sure about that or how to configure it.
I have a website on an FTP server at BigRock.com. My issue is that whenever there is a deployment, I have to manually search for and find all the files that need to be changed. The project is getting larger and larger, and making these kinds of changes is taking a lot of my valuable time.
So is there any software/tool available for syncing with the FTP server and updating files based on changes made locally? I'm not sure about FileZilla Client, since I couldn't find many options in it. I'm pretty sure there must be some solution for this. My project is built with Zend Framework, Doctrine ORM, and many other libraries.
If you need a one-way synchronization from local files to server, you can check this free tool: http://sourceforge.net/p/phpfilesync/wiki/Home/
It could not be much easier to install or use.
Try Allway Sync.
It uses innovative synchronization algorithms to synchronize your data between desktop PCs, laptops, USB drives, remote FTP/SFTP servers, different online data storages and more. Encrypted archives are supported. Allway Sync combines bulletproof reliability with extremely easy-to-use interface.
URL: http://allwaysync.com/
I've tried it personally; it works fine for FTP and local file sync, and it's also free.
Working with Assembla, I found the FTP tool in it.
Here is the reference link,
http://blog.assembla.com/assemblablog/tabid/12618/bid/78103/Deploying-a-Web-site-is-easy-with-Assembla-s-FTP-tool-Video.aspx
It's easy to work with:
Add the FTP tool using the Tools section of your Admin tab.
Deploy code to any server with ftp or sftp installed.
Set the deploy frequency to manual (you push a button) or schedule automatic deployments for every commit, hourly, daily, or weekly.
Only changed files are deployed for fast and accurate deployments.
Easily roll back and deploy prior revisions.
Add multiple servers for staging and production releases or simply to deploy different repository directories to different locations.
I'm working on a website/web application that displays images every five minutes or so, kind of like a webcam. The images are uploaded to an SFTP server. How can I access them from the web? Does anyone have recommendations for what to use? Right now I'm looking at PHP, but I've checked out JavaScript and Ruby as well. Only the application needs to SSH to a predetermined place, not the users.
A friend suggested using rsync with passwordless SSH set up. Has anyone done this, or is it a bad idea?
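Passwordless (key-based) SSH is the standard approach for unattended jobs like this. A sketch of the one-time setup — the server name is a placeholder, and only the key generation actually runs locally:

```shell
# Generate a key pair with an empty passphrase for unattended use.
ssh-keygen -t ed25519 -N "" -q -f ./webcam_key

# One-time: install the public key on the SFTP server, e.g.
#   ssh-copy-id -i ./webcam_key.pub user@sftp.example.com

# After that, a cron job can pull new images without any prompt:
#   rsync -az -e "ssh -i ./webcam_key" user@sftp.example.com:/images/ ./images/
```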
If the application is the only thing that needs to SSH then you can rule out javascript immediately. It's predominately a client-side language in these environments.
You may want to look at the Net::SSH Ruby library, or I'm sure there's a PHP equivalent. I have used Net::SSH and it's fairly straightforward.
You need to write a server-side script that connects to the SFTP server and forwards the image to the client.
cURL has SFTP support.
PHP supports SFTP; you need to install the ssh2 extension:
http://www.php.net/manual/en/book.ssh2.php