Web Server Interrupt-Driven File Transfer - PHP

I have a webpage that currently takes an upload from a user and stores it in a directory (/upload). [Linux-based server]
Instead of storing the file on the server in that directory, I am looking for a way to transfer it onto a local machine. [Running Ubuntu 12.04]
Assuming I already have public/private keys set up, how might I go about doing this?
Current Ideas:
FTP transfer
rsync

Ideas:
1) Stop running anything on the server, and forward every byte to your local box. Just run ssh -N -R :8080:localhost:3000 remote.host.com. This will allow anyone to hit http://remote.host.com:8080 and get your port 3000. (If you do port 80, you'll need to SSH in as root.) Performance will be kinda bad, and it won't be that reliable. But it might be fine for real-time transfer where you're both online at once.
2) Use inotifywait to watch the upload dir on the server, and trigger rsync from the server to your local box; see the sketch at the end of this answer. (Requires exposing the SSH port of your box to the world.) If you sometimes delete files, use unison bidirectional file sync instead. (Although unison doesn't work on long filenames or with lots of files.)
3) Leave the system as-is, and just run rsync from cron on your local box. (Ok, not realtime.)
Of course, most people just use Dropbox or something similar.
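Here is a minimal sketch of idea 2, assuming the inotify-tools package is installed on the server and that mybox.example.com (a hypothetical name for your local machine) accepts key-based SSH from it:

#!/usr/bin/env bash
# Watch the upload dir and push each finished file to the local box.
WATCH_DIR=/upload
DEST=user@mybox.example.com:/home/user/incoming/
# close_write fires only when a writer closes the file, so files still
# being uploaded are not shipped half-written.
inotifywait -m -e close_write -e moved_to --format '%w%f' "$WATCH_DIR" |
while read -r file; do
    rsync -a "$file" "$DEST"
done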

Related

How to initialize a PHP websocket service in CPanel/External web host?

I have developed a web app which shows information in real time about certain actions carried out by different users. For this I use websockets, built in PHP; in a local environment (WAMP) it works fine, but I need it to also work on an external server (a web hosting service), which I only have access to through CPanel and FTP.
Locally I make the websocket work by executing the following command in Windows' CMD:
C:\wamp64\bin\php\php7.2.10\php.exe -q C:\wamp64\www\myapp\websocket_daemon.php
My question is: how can I achieve the same result in CPanel? Or maybe there is another way?
A shared hosting environment (i.e. Apache with a VirtualHost config, PHP, MySQL, and a CPanel interface) is not likely to support your websocket application.
For websockets to work, you need to either:
have a port dedicated to websocket in-bound connections; or
have an HTTP/HTTPS server that knows when to upgrade a connection and proxy-pass to your websocket application.
To run your own websocket service, you should think about using a Virtual Private Server service such as Amazon EC2 or a DigitalOcean VPS.
For that purpose you will need CLI (Command-Line Interface) access to the (Linux) server involved. Assuming that you have such access, running the WS service would look something like
./websocket_daemon.php
That command assumes you are in the appropriate folder. However, you need to resolve a few things before you get there:
Step 1: SSH support on your machine
You will need to ensure that your OS supports SSH. Your OS appears to be Windows, so you will need to install either PuTTY or Git Bash. Read about these technologies.
Step 2: Generate an SSH key
In your CPanel, you will need to generate SSH keys:
Click on Manage SSH Keys
Click on Generate a New Key
Use the settings you prefer in order to generate a key. Don't worry: you can remove the SSH keys at any time and recreate them if you realize that you prefer a different way to generate them.
Read more here: https://docs.cpanel.net/cpanel/security/ssh-access/
SSH keys come in pairs, that is, each consists of a private and a public key. You can share your public key with anyone, but never ever send your private key to anyone; it should stay on your computer and possibly be saved to backups. Read more about SSH keys here: https://sectigo.com/resource-library/what-is-an-ssh-key
Step 3: Ensure that your computer uses the SSH keys you have generated for CPanel
You will need to tell your OS where the SSH key pair is located. Luckily this is not a new problem; see an exhaustive discussion of this topic here: https://serverfault.com/questions/194567/how-do-i-tell-git-for-windows-where-to-find-my-private-rsa-key
Step 4: Test your SSH
Run the following command in your CLI that supports SSH:
ssh <username>@<server address>
If there is no error, then you have successfully tested SSH and you are almost ready to proceed further
Step 5: Upload your Websocket script
You can do this via FTP, as you already know, but you can also do it via SCP. scp not only makes use of your newly created SSH keys, it is also secure. Syntax:
scp file.txt remote_username@10.10.0.2:/remote/directory
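For example, uploading the daemon might look like this (the username, host, and target folder are placeholders for your own values):

scp websocket_daemon.php myuser@example.com:~/public_html/myapp/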
Step 6: SSH to the server
See Step #4.
Step 7: Navigate to your file's location
Step 8: Ensure that you have the rights to run it
See more here: https://alligator.io/workflow/command-line-basics-file-permissions/
Step 9: Execute the file
Run
./websocket_daemon.php
If this succeeded, then the job is basically done. You will need some script to run it upon startup and to manage it, but this is not strictly related to the question.
https://oracle-base.com/articles/linux/linux-scripts-running-in-the-background
https://smallbusiness.chron.com/run-command-startup-linux-27796.html
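As a minimal sketch of that last point, assuming the daemon lives in ~/public_html/myapp (a placeholder path) and that your host allows long-running processes, you could start it in the background and restart it after a reboot via cron:

nohup php ~/public_html/myapp/websocket_daemon.php > ~/websocket.log 2>&1 &
# To restart after reboots, run crontab -e and add:
@reboot php $HOME/public_html/myapp/websocket_daemon.php > $HOME/websocket.log 2>&1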
However, if the issue is not yet resolved, read further.
Step 10: Ensuring WS support on server
You will need to set up your own WS support. Since you have managed to do so locally on Windows, hopefully your know-how will carry over to the remote Linux machine as well. If not, read more here:
PHP Websocket server in Linux hosting

Eclipse auto-copy via SSH from one server to another upon save, PHP

I'm writing a PHP app, and our team has our servers set up so that I have to copy code onto a second server every time I edit a file. Naturally this is against the DRY concept, and simply a monotonous chore.
When I make a change to the file, and go to refresh the webpage to see the changes, both of these servers have to be updated, or else there is a chance that I will be seeing the old version on the server that hasn't been updated yet. So I'm copying/pasting/saving hundreds of times a day while I develop this app!
I'm using Eclipse Luna 4.4.1. In the Remote Systems Explorer, the system type is "SSH Only." This is the same for server 2.
I would like there to be some kind of functionality where each time I save sample.php on server 1, the same file sample.php will be automatically updated and saved with the same code on server 2. I don't want to have to do anything else other than just save the file on server 1; after doing that I'd like for the file on server 2 to be automatically updated.
Both servers are running nginx.
I've been reading a lot about this here on SE, but many of the question/answers are 5+ years old, and I wonder if there is an optimal way to do this today. I don't have much experience with Java, but I'm happy to do this any way that will work. Any plugin or other way would be great.
Take a look at running rsync over ssh, and put it on a cron schedule to run the update at whatever interval you desire.
Here is an example of an rsync process I run to copy database backups (the type/contents of the file are mostly irrelevant here):
rsync -ar -e "ssh -l <username>" <local_path_of_files_to_watch> hostname:<target_path_where_files_land>
This is what my command looks like to rsync databases from a remote system that monitors radio traffic and records decoded messages to a daily log file:
rsync -av --remove-source-files --progress --stats -e "ssh -l stratdb" 192.168.0.23:/var/log/stratux.sqlite.*.gz /Users/stratdb/sqlite-dbs/.
If you want this process to run without having to enter a password, you'll want to set up .ssh with shared keys to allow passwordless login from server 1 -> server 2 for this process (ideally as a user that has limited capabilities on both servers).
There are some continuous deployment tools you could use that would do builds and copies and the like (e.g. Jenkins) if you need a really robust and highly configurable solution... but if you just need to brute-force copy files from one server to another with a single easy command (ad hoc) or automagically, I'd investigate rsync.
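As a minimal sketch of the cron approach, assuming passwordless SSH is already set up as described above (the paths, user, and host are placeholders): on server 1, run crontab -e and add something like

* * * * * rsync -ar -e ssh /var/www/myapp/ deploy@server2:/var/www/myapp/

This pushes the working tree to server 2 every minute, so a file saved in Eclipse is at most a minute behind on server 2.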

Is it possible to determine the amount of free space on a remote FTP server without using scripts?

So, here is the setup I have to work with:
I have five servers total in different locations. One server is purely a web server for hosting static files. The other four servers are solely FTP servers, each containing files uploaded by users through PHP scripts.
What I want to do is be able to choose the server with the most free space available and send the next user-uploaded file to it. I've searched around, and there doesn't seem to be any way to do that with only FTP commands.
I found a question about Determining the Free Space of an FTP Server, which showed that it was possible to create and update a file periodically with a Linux shell script, but the servers I have are, and will stay, Windows machines.
My only solution would be to host web servers on the FTP servers with a simple index.php reporting the free space determined by disk_free_space(), but that seems a bit much for something so simple.
All that I'm looking for is a way to find out this information with FTP commands, or possibly a way to link the servers via a VPN somehow and use PHP to figure out the amount of free space, though I wouldn't know exactly how to do that, or even if it would work...
If you are using the IIS FTP server on the Windows machine, you can configure IIS to include free disk space in the LIST command response.
In the IIS manager, go to your FTP site, and select FTP Directory Browsing applet. There, in the Display following information in directory listing setting, check the Available bytes.
Then, the LIST FTP command response will look like:
226-Directory has 27,906,826,240 bytes of disk space available.
226 Transfer complete.
You can test this with the WinSCP FTP client, which can make use of this information: just go to the Space available tab of the Server and Protocol Information dialog.
(I'm the author of WinSCP)
Other FTP servers support other ways to retrieve free disk space.
See How to check free space in a FTP Server?
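If you want to see that reply from the command line rather than a GUI client, one hedged option (the host and credentials are placeholders) is curl's verbose FTP output, which echoes the control-channel replies:

curl -v --user myuser:mypass ftp://ftp.example.com/ -o /dev/null 2>&1 | grep '^< 226'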

Backing up uploaded document on Linux

So I have a PHP application running on a Linux machine which uses a MySQL database. I have managed to back up my MySQL database every day by adding an entry to the crontab. In my application, clients are able to upload documents, which are saved in a directory in the application folder, i.e. /myapp/uploaded_documents/. I am looking at backing up this directory.
My question is: how do I back up a directory to a certain remote location at a certain time every day? Is it possible to also password-protect this directory in my app folder?
Thank you
As mentioned in the previous answer, to back up periodically to a remote machine you can use rsync + ssh + crontab. Set up ssh to access the remote machine without a password by following (for the Ubuntu distro) https://help.ubuntu.com/community/SSH/OpenSSH/Keys, then add to the crontab an rsync job at the time and days you want (check man crontab to understand how to do this), telling rsync to back up over ssh to the remote machine. Something like 0 2 * * * rsync -ae ssh dir_to_bkp name@host:dir_where_bkp backs up "dir_to_bkp" each day at 02:00 am to the "host" machine, logging in as user "name" and using "dir_where_bkp" as the destination. The -e ssh option tells rsync to use ssh.
The best way is to use rsync, as you would (most likely) be uploading only changes.
http://linux.die.net/man/1/rsync
Additionally you can create incremental backup:
http://www.mikerubel.org/computers/rsync_snapshots/
So my suggested solution would be rsync + crontab
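As a minimal sketch of the incremental idea from the rsync_snapshots article (run from the backup machine; the user, host, and paths are placeholders), each day gets its own directory while unchanged files are hard-linked against the previous snapshot:

#!/usr/bin/env bash
TODAY=$(date +%F)
# Files identical to those in /backups/latest are hard-linked, not copied.
# On the very first run /backups/latest will not exist yet and rsync
# simply copies everything.
rsync -a --delete --link-dest=/backups/latest \
    name@host:/myapp/uploaded_documents/ "/backups/$TODAY/"
# Point "latest" at the snapshot we just made.
ln -sfn "/backups/$TODAY" /backups/latest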

What's the best way to transport a file to a different server?

I'm running a file host that's grown beyond the capacity of a single server, and I need to implement multiple-server storage of files. I'd like to do this as cheaply as possible, so those fancy mass-storage methods are out of the question. I simply want to move a file that's uploaded by the user to the "gateway" server, which hosts all the HTTP and MySQL, onto one of the media servers. It can either be done at the end of the user's request, or via cron every couple of minutes.
At this point the only method I'm truly familiar with is using the ftp_put PHP function and simply FTPing the file to a different server, but I've had problems with this method in the past, especially with larger files, and a lot of the files that will be getting transferred will be over 100 MB.
Can anyone suggest a good solution for this? Preferably I'm looking for a purely software solution... hopefully nothing more than a php/bash script.
One method that requires no programmatic setup and is quite secure is to mount a folder from the other server, and use PHP to save to that directory as you normally would.
You don't have to use sshfs; there are a bunch of other ways to achieve the same thing. I would use sshfs in this situation as it works across ssh, and is therefore secure and relatively easy to set up.
To achieve this (using sshfs):
Install sshfs
Mount an sshfs filesystem onto a folder accessible to PHP
Use a PHP script to store the files within the sshfs mount, and therefore on the other server (a sketch follows this list).
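A minimal sketch of those steps, assuming key-based SSH to media1.example.com (a hypothetical media server) already works:

sshfs media@media1.example.com:/storage/uploads /mnt/media1 -o reconnect,allow_other
# allow_other lets the web server user see the mount; it requires
# user_allow_other to be enabled in /etc/fuse.conf. PHP can then treat
# /mnt/media1 like any local directory, e.g.
# move_uploaded_file($_FILES['f']['tmp_name'], '/mnt/media1/' . $name);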
Another method is to set up an rsync command between the two using crontab.
You can write a bash script that runs from cron and uses a command-line tool like scp, sftp, or rsync.
Example:
[bash]# scp filename alice@galaxy.example.com:/home/alice
Copy "filename" into alice's home directory on the remote galaxy.example.com server
Use cron, a bash script, and rsync.
cron: "Cron is a time-based job scheduler in Unix-like computer operating systems. 'cron' is short for 'chronograph'."
bash: "Bash is a free software Unix shell written for the GNU Project."
rsync: "rsync is a software application for Unix systems which synchronizes files and directories from one location to another while minimizing data transfer using delta encoding when appropriate."
In summary, place the rsync command in a bash script and get cron to run it at regular intervals.
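As a minimal sketch (the paths and the media-server hostname are placeholders):

#!/bin/bash
# push_uploads.sh - push completed uploads from the gateway to a media server.
# --remove-source-files deletes each file from the gateway once transferred,
# which suits the case where the media server becomes the file's permanent home.
rsync -a --remove-source-files /var/www/uploads/ media@media1.example.com:/storage/uploads/

Then run crontab -e on the gateway and schedule it, e.g. every 5 minutes:

*/5 * * * * /usr/local/bin/push_uploads.sh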
