Our website currently backs up every night to a separate server we have, which is fine, but when we go to download the files the next day (usually around 36,000+ images) it takes a long time and affects the speeds of everyone else using our network. Ideally we would do this in the middle of the night, except there's no one here to do it.
The server that the backup is on is running cPanel, which appears to make it fairly simple to run a PHP file as a cron job.
I'm assuming the following; feel free to tell me if I'm wrong.
1) The server the backup is on runs cPanel. It appears that it shouldn't be too difficult to set up a PHP script to run as a cron job in the middle of the night.
2) We could deploy a PHP script that uses PHP's FTP functions, and have that cron job run it to transfer these files to another server.
3) We are running XAMPP on a Windows machine. It includes FileZilla Server, so I'm assuming it should be able to accept incoming FTP connections.
4) Overall, we could deploy a script on the backup server that runs every night and sends the files back to my local computer running XAMPP.
So that's what I'm guessing, but I'm getting stuck at the first hurdle. I've tried to create a script that runs on our local computer and sends a specified folder to the backup server when it executes, but all I seem to be able to find are scripts dealing with single files. Although I have some experience with PHP, I haven't touched the FTP functions before, and they are giving me some problems. I've tried the other examples here on Stack Overflow with no success :(
I'm just looking for the simplest form of a script that can upload a folder to a remote IP. Any help would be appreciated.
There is a fair amount of overhead involved in transferring a bunch of small files over FTP. I've seen such jobs take 5x as long, even over a local network. It is far easier to pack the files into something like a zip archive and send them as one large file.
You can use exec() to run zip from the command line (or whatever compression tool you prefer). After that, you can send it over FTP fairly quickly (you said you'd found methods for transferring a single file). For backup purposes, keeping the files zipped would probably make things easier to handle, but if you need them unzipped you can set up a job on the other machine to unpack the archive.
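As a rough illustration of that zip-then-FTP idea, here is a minimal sketch that could run from the cPanel cron job. The paths, IP address and credentials are placeholders, and it assumes the zip binary and PHP's FTP extension are available on the backup server:

<?php
// Hedged sketch: pack the image folder into one archive, then push it over FTP
// to the office machine running FileZilla Server (part of XAMPP).
$sourceDir = '/home/account/backups/images';                    // placeholder path
$archive   = '/home/account/backups/images-' . date('Ymd') . '.zip';

// One archive means the per-file FTP overhead is paid only once.
exec('zip -r -q ' . escapeshellarg($archive) . ' ' . escapeshellarg($sourceDir), $out, $rc);
if ($rc !== 0) {
    exit("zip failed with exit code $rc\n");
}

$conn = ftp_connect('192.0.2.10');                              // placeholder: your XAMPP machine's IP
if (!$conn || !ftp_login($conn, 'backupuser', 'secret')) {      // placeholder credentials
    exit("FTP connection/login failed\n");
}
ftp_pasv($conn, true);                                          // passive mode usually behaves better through NAT/firewalls

if (!ftp_put($conn, basename($archive), $archive, FTP_BINARY)) {
    exit("Upload failed\n");
}
ftp_close($conn);
echo "Backup archive uploaded\n";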
I'm currently developing a new website where a lot of files will be uploaded and downloaded.
When a file is uploaded to the server, ClamAV runs a virus scan on the tmp file before it is moved to the HTTP server. Everything works great, except that when I use clamscan it seems ClamAV needs to load the whole virus database before every scan, and this stresses my CPU to 50% for maybe 10-20 seconds.
This seems to be a big problem, because if two users upload files to my website at the same time it will probably be very slow.
So I installed the ClamAV daemon, because it runs in the background and already has the virus database loaded, so a lot of time and CPU power can be saved. But on to the problem...
When I use clamdscan (the ClamAV daemon) it can't access any of the tmp files uploaded with the PHP script. It only works when I use clamscan. This is probably because the daemon runs in the background as a different user and the tmp files have very strict permissions.
But how can I solve this? Can I change the permission on PHP's tmp upload files from 0600 to 0644? Is that safe? Or should I change the permissions for the ClamAV daemon?
I don't really know how to do this, so if someone knows and wants to share, I'd be very thankful.
Try passing --fdpass as an option to clamdscan.
I had the exact same issue and it worked for me.
--fdpass
Pass the file descriptor permissions to clamd. This is useful if clamd is running as a different user as it is faster than streaming the file to clamd. Only available if connected to clamd via local(unix) socket.
Source: https://linux.die.net/man/1/clamdscan
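For reference, a minimal sketch of how an upload handler might call clamdscan with --fdpass from PHP. The form field name, destination path and response handling are assumptions, not taken from the question:

<?php
// Hedged sketch: scan an uploaded tmp file with the ClamAV daemon before moving it.
// Assumes clamdscan is installed and clamd is reachable over its local (unix) socket.
$tmpFile = $_FILES['upload']['tmp_name'];                       // 'upload' is a placeholder field name

exec('clamdscan --fdpass --no-summary ' . escapeshellarg($tmpFile), $output, $exitCode);

// clamdscan exit codes: 0 = no virus found, 1 = virus found, 2 = an error occurred
if ($exitCode === 0) {
    move_uploaded_file($tmpFile, '/var/www/uploads/' . basename($_FILES['upload']['name']));
} elseif ($exitCode === 1) {
    unlink($tmpFile);
    http_response_code(400);
    echo 'Upload rejected: the file appears to be infected.';
} else {
    http_response_code(500);
    echo 'Virus scanner error.';
}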
When I ran my website on my old server, I would launch Transmit on my Mac (OS X 10.11.6), connect to my server, Control-Click-Open the remote PHP file, make the fix and save. The file got updated on the server in a second. That was great for running some php/mysql/google_service tests that I can't run locally.
Now I have just moved my project to an Amazon server, AWS. Every time I need to run a test (for example on the S3_Bucket, which I can't run locally), or modify a variable, or change a flag, I have to do it in my local php/html/java/css/apis project, zip it, upload it via the Elastic Beanstalk panel, wait about half a minute, then run it. I have found no way to edit a single file in an easy way (open, write, save) as I did before through Transmit. I can't carry on this way; it's wasting my time.
Do you know any better way to develop/test/run my project on AWS?
Have you considered using Docker?
https://aws.amazon.com/about-aws/whats-new/2015/04/aws-elastic-beanstalk-cli-supports-local-development-and-testing-for-docker-containers/
Or use MAMP?
https://www.mamp.info/en/
I have an app on Heroku, using PHP and PostgreSQL. Now I would like to create backups of my database regularly, put them in a folder on the server, or record their S3 URLs and download them.
I have been doing research on the topic. It seems that the best option is to use the pgbackups add-on, which I already have and can use from the local command line, like: heroku pgbackups:url --app=APP_NAME
I want to automate the process, let's say in a cron job. I see we have workers on Heroku, but I have never used them and this is still a development environment; a free plan does not have workers. Besides, my app doesn't really require background workers, and I don't want to buy worker dynos only for automatic database backups. Which way should I go?
If I can create PHP cron jobs on Heroku, then I need to know: how can I run Heroku commands in PHP? I tried exec and passthru, but neither of them seems to work on the Heroku server. On my localhost the above command (heroku pgbackups) works pretty well, provided the Heroku Toolbelt is installed on the local server.
For Ruby, they have https://github.com/heroku/ toolkit for server-side commands. But I had no luck in my search for a PHP branch...
The overall purpose is to create the DB backup, store it on the server, and download it. (Even though Heroku makes backups itself, we want to have them in our own hands :)
How can I make it happen?
Probably the best thing to do is to have a cron job on your backup server that runs heroku pgbackups:url to get a URL to the latest pgbackup, and then downloads it with curl. Something like this:
curl $(heroku pgbackups:url) > latest_backup
For more info about heroku pgbackups:url, see:
https://devcenter.heroku.com/articles/pgbackups#downloading-a-backup
Doing anything with worker dynos wouldn't really make sense, because that wouldn't help you get the backup onto your backup server unless you were downloading and re-uploading it somewhere. Just running a cron job on your backup server that downloads it once is a lot more straightforward.
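If you would rather keep that cron job in PHP (since the rest of your tooling is PHP), a rough sketch along the same lines is below. It assumes the script runs on the backup server, where the Heroku Toolbelt is installed and authenticated; APP_NAME and the destination path are placeholders:

<?php
// Hedged sketch: run on the backup server from cron, e.g. "php fetch_backup.php".
// Requires allow_url_fopen (or swap in curl) to download the dump over HTTPS.
$app = 'APP_NAME';                                              // placeholder

// Ask Heroku for a temporary URL to the latest backup.
$raw = shell_exec('heroku pgbackups:url --app=' . escapeshellarg($app));
$url = $raw === null ? '' : trim($raw);
if ($url === '') {
    fwrite(STDERR, "Could not get a backup URL from heroku pgbackups:url\n");
    exit(1);
}

// Stream the dump to a timestamped file so large backups aren't held in memory.
$dest = '/var/backups/heroku/' . $app . '-' . date('Ymd-His') . '.dump';
$in   = fopen($url, 'rb');
$out  = fopen($dest, 'wb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);
echo "Saved backup to $dest\n";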
Here is my situation:
I have multiple servers under a load balancer. My users are able to upload files to the server. These files need to be modified and sent to another server, so to do that I decided to use Gearman to manage queues of the PHP scripts that will do the work.
Here is the problem: what happens if one server goes down? Gearman executes the code to fetch a particular file, but that file was on the server that went down. How can I set up Gearman to wait for the server that went down, so it can finally get the right file? It might also happen that no server goes down, but since Gearman talks to all the servers it might execute the code on one machine while the file it's looking for is on another.
How can I get around this problem without having to isolate the Gearman servers from each other? I will be using Gearman for other things that don't have the same constraints.
I'm running a file host that's grown beyond the capacity of a single server, and I need to implement multi-server storage of files. I'd like to do this as cheaply as possible, so fancy mass-storage methods are out of the question. I simply want to move a file that's uploaded by the user to the "gateway" server, which hosts all the HTTP and MySQL, onto one of the media servers. It can either be done at the end of the user's request, or via cron every couple of minutes.
At this point the only method I'm truly familiar with is using the ftp_put PHP function and simply FTPing the file to a different server, but I've had problems with this method in the past, especially for larger files, and a lot of the files that will be getting transferred will be over 100 MB.
Can anyone suggest a good solution for this? Preferably I'm looking for a purely software solution, hopefully nothing more than a PHP/bash script.
One method that requires no programmatic setup, and is quite secure, is to mount a folder from the other server and use PHP to save to that directory like you normally would.
You don't have to use sshfs; there are a bunch of other ways to achieve the same thing. I would use sshfs in this situation because it works over SSH, and is therefore secure and relatively easy to set up.
To achieve this (using sshfs):
Install sshfs
Mount an sshfs share to a folder accessible to PHP
Use a PHP script to store the files within the sshfs mount, and therefore on the other server (see the sketch after this list)
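A minimal sketch of that last step, assuming the media server's storage directory has already been mounted on the gateway at /mnt/media (for example with "sshfs storage@media01:/var/files /mnt/media"); the paths, form field name and sentinel file are placeholders:

<?php
// Hedged sketch: save an upload straight into the sshfs mount. From PHP's point
// of view this is an ordinary local write; sshfs moves it to the media server.
$mountPoint = '/mnt/media';                                     // placeholder mount point

// A sentinel file created once on the media server; if it's missing, the mount is
// probably down and we don't want uploads silently landing on the gateway's disk.
if (!is_file($mountPoint . '/.mounted')) {
    http_response_code(503);
    exit('Storage mount unavailable');
}

if (!isset($_FILES['upload']) || $_FILES['upload']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('Upload failed');
}

// Give the file a unique name to avoid collisions between users.
$dest = $mountPoint . '/' . bin2hex(random_bytes(8)) . '-' . basename($_FILES['upload']['name']);
if (!move_uploaded_file($_FILES['upload']['tmp_name'], $dest)) {
    http_response_code(500);
    exit('Could not store file');
}
echo 'Stored as ' . htmlspecialchars($dest);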
Another method is to set up an rsync job between the two servers using crontab.
You can write a bash script that runs from cron and uses a command-line tool such as scp, sftp, or rsync.
Example:
[bash]# scp filename alice@galaxy.example.com:/home/alice
Copy "filename" into alice's home directory on the remote galaxy.example.com server
Use cron, a bash script, and rsync.
cron: "Cron is a time-based job scheduler in Unix-like computer operating systems; its name comes from the Greek word 'chronos', meaning time."
bash: "Bash is a free software Unix shell written for the GNU Project."
rsync: "rsync is a software application for Unix systems which synchronizes files and directories from one location to another while minimizing data transfer using delta encoding when appropriate."
In summary, place the rsync command in a bash script and get cron to run it at regular intervals.
See here for some examples: