Hosting on Android: Backing up by cloning to external SD card?

I'm hosting a small website on a fairly budget Android tablet. If it were to fail, I would essentially lose my files. To get around this, I'm thinking of using PHP's copy() to clone the main directory to the tablet's SD card on request.
Is this a good idea, or not? What are the risks? While this is happening, will PHP still continue to run other scripts?
The directory that will be cloned only contains the webpages, all of the images have been moved into another folder.
Here is how I am planning to accomplish it:
include("adminverify.php");
if(isset($_GET["websiteBackup"]) && $admin=true)
{
$sdcard = "/mnt/extsdcard";
$sourcefile = "/internal/www";
// Write to a file saying the backup has started
copy($sourcefile, $sdcard) or die("Could not be done");
// Write to a file saying the backup has finished
}
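Note that PHP's copy() only handles single files, not directories, so cloning the whole folder would need a recursive helper; a rough sketch using the paths from above, with error handling kept minimal:

function copy_dir($src, $dst) {
    // Create the destination directory if it doesn't exist yet
    if (!is_dir($dst) && !mkdir($dst, 0755, true)) {
        return false;
    }
    foreach (scandir($src) as $entry) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        $from = $src . '/' . $entry;
        $to = $dst . '/' . $entry;
        // Recurse into subdirectories, plain copy() for files
        $ok = is_dir($from) ? copy_dir($from, $to) : copy($from, $to);
        if (!$ok) {
            return false;
        }
    }
    return true;
}

copy_dir("/internal/www", "/mnt/extsdcard/www-backup") or die("Could not be done");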
Alternatives are welcome. I would simply go on the tablet myself and copy the files over, but the tablet is honestly too laggy.
If it turns out PHP is not able to function while doing the backup, I will simply have it change the directory name and modify the 404 page to say that the website is temporarily unavailable.
If this helps, I'm using lighttpd and PHP 5.5.15

Related

Getting images onto server from separate web host

I have GoDaddy shared webspace with FTP access that contains a folder with images. These images change every day.
I'm looking for advice on what I need to do to get these images onto my server in the workplace, maybe every hour or so. The server doesn't have IIS installed; is there any other way to do this?
Am I able to write a PHP script that can pull all the images onto the server using the IP or something?
I had the same issue as I had two images on a remote server which I needed to copy to my local server at a predefined time each day and this is the code I was able to come up with...
try {
    // @ silences copy()'s warning output; on failure copy() just returns false
    if (!@copy('url/to/source/image.ext', 'local/absolute/path/on/server/' . date("d-m-Y") . ".gif")) {
        $errors = error_get_last();
        throw new Exception($errors['type'] . " - " . $errors['message']);
    }
} catch (Exception $e) {
    // An error occurred downloading the requested image;
    // place exception handling code here
}
I then created a cron job which ran this file on a daily basis, but you would be able to run it as frequently as you need, based on how often the source image changes on the source server.
A few points to explain the code...
As the code runs in the background, I use the @ operator to silence any warnings the copy command raises, as they won't be viewable anyway. Instead, if copy fails it returns false, which triggers throwing a new exception carrying the error type and error message. That exception can then be handled in the catch block and actioned however you need, such as making a local log entry, sending an error email, whatever you need.
As the second parameter in the copy command, you will see that the path results in the filename being based on the current date. This name can be anything, but it needs to be unique in the destination for it to work. If you plan on overwriting the image each time the copy is done, you can hard-code the same name; but if you want to maintain a history, you will need to come up with a file naming scheme on your local server to ensure that files don't get overwritten each time the cron runs.
The first parameter of the copy statement has been written on the assumption that the source image file is named the same each time; if the name changes, you will need to identify how the naming is achieved, build that name in a variable, and insert it as the source filename.
This code does not alter the format of the source image file, so to ensure no corruption occurs and the image can still be shown after the copy, make sure the source image file and the local copy have the same file extension: if the source is a .gif file, the file extension in the second copy parameter must also be .gif.
I'll try to answer here instead of continuing the comment-spam ;)
You've got your webspace with FTP access; let's just call it webspace.
Then you've got your server at your workplace; let's just call it workplace.
Finally you need one server (which can also be the webspace, for example) where you are able to run PHP; let's call it php-server.
Step 1
At the workplace, set up an FTP server, for example FileZilla. Configure it so that you can connect to it: make an account and set it up so you can access the folder(s) where you want to save your images. Also make sure it is reachable from outside your workplace (firewall settings etc.).
Step 2
If you can run PHP scripts on your webspace, this is the easiest way: you can directly access the image files, establish a connection to your FTP server at the workplace, and upload the files.
If the PHP server is somewhere else, you have to establish a connection from the php-server to your webspace, download your files to the php-server, and then upload them to the FTP server at your workplace.
It will be a bit of work, as you can see, but it should be possible to do; a rough sketch of the direct variant is below.
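Something along these lines, run on the webspace; host, credentials, and paths are placeholders:

// Sketch: push local images to the workplace FTP server
$conn = ftp_connect('workplace.example.com');
if (!$conn || !ftp_login($conn, 'ftpuser', 'ftppass')) {
    die('Could not connect or log in');
}
ftp_pasv($conn, true); // passive mode usually gets along better with firewalls

foreach (glob('/path/to/webspace/images/*') as $file) {
    // Upload each image into the account's images folder on the workplace server
    ftp_put($conn, 'images/' . basename($file), $file, FTP_BINARY);
}
ftp_close($conn);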
Create a .bat file that uses the ftp command-line client to get the data you want from the server. Save that .bat file somewhere and create a Scheduled Task to run the script every hour.
You can even store the actual sequence of ftp commands in a separate file (e.g. ftpcmd.dat) and call it from the script:
ftp -n -s:ftpcmd.dat SERVERNAME.COM
Example ftp command file:
user MyUserName
Password
bin
prompt
lcd C:\Temp\
cd \your\path\images
mget *
quit
(mget saves into the current local directory, so set it with lcd first; prompt turns off the per-file confirmation.)
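For the hourly schedule, a scheduled task can be created from the command line; something along these lines, where the task name and paths are made up:

schtasks /Create /SC HOURLY /TN "FetchImages" /TR "C:\scripts\ftpfetch.bat"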

ftp_get() does not download, instead moves file to website root folder

I have a website developed under CakePHP. I'm trying to change the download system from Apache to FTP.
I am currently doing the FTP by hand in PHP (not using any plugins or libraries).
My code successfully connects to the FTP server etc. The problem I'm having is that when I call ftp_get($conn_id, $localFile, $remoteFile, FTP_BINARY), it executes successfully, but the file does not download; instead, it is moved to the /webroot/ folder, which serves, as the name suggests, as the root folder of the website (via .htaccess rules).
I mention the .htaccess because I suspect it may be what is causing the FTP to route the download, "moving" it to the root, but I'm not sure.
The following is my FTP download code:
$conn_id = ftp_connect($host);
$login_result = ftp_login($conn_id,$ftp_user,$ftp_pass);
if($login_result)
{
try
{
if(ftp_get($conn_id, $localFile, $remoteFile, FTP_BINARY))
{
$this->log("Downloaded...","debug");
}
else
{
$this->log("Couldn't download...","debug");
}
}
catch(Exception $e)
{
$this->log("Error: " . $e->getMessage(),"debug");
}
}
else
{
$this->log("Coudln't connect to FTP server...","debug");
}
ftp_close($conn_id);
Yes, I checked (printed out) the $conn_id and the $login_result and they are working.
What is inside the paths?
$remoteFile = "/downloads/rdticoiroe1432584529/YourMusic.zip";
$localFile = "Music.zip";
The code does not throw any errors. I also tried the fotografde/cakephp-ftp plugin for CakePHP and FTP, and it behaves the same way...
EDIT 1:
It's a music download site; right now we serve file downloads through Apache (which is very slow), and the idea is to move to FTP downloads. I'm trying to use the FTP protocol to download the file when the client requests it.
EDIT 2:
The whole idea of this question is me trying to move to a more optimized method of serving files to clients. I own a server with 100 Mbit transfer and downloads are pretty slow. I'm using Apache at the moment to serve files to clients who request them.
I completely misunderstood, thinking PHP's FTP functions could deliver files to the client's hard drive.
So, I'm looking for some guidance on what methods/protocols people use to serve files to clients who request a download. (This is a music download site.)
You have a fundamental misunderstanding here: the "remote file" in an FTP connection is something remote from you - a file on another server somewhere, for instance. In your case, I think the file you're trying to serve is "local" in this sense, but you want a way of getting it to a client quicker.
There is no way of pushing a file out to a client's hard drive (think of the security implications), you can only make it available. Right now, the client can request it using HTTP, and Apache will give them the content. If you installed an FTP server, they could connect with an FTP client and request it that way. Similarly, you could install a different web server, like nginx or lighttpd, which might be faster.
Nothing you do in PHP, however, will be faster than plain Apache: you still have Apache involved, but now you have your PHP code as well, which at best adds the time needed to execute your code before doing what Apache was going to do anyway.
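If the download has to pass through PHP anyway (to check permissions, say), the usual pattern is to send download headers and stream the file; a minimal sketch, with a made-up path:

$file = '/var/files/downloads/YourMusic.zip';

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));

// With mod_xsendfile installed you could instead emit
// header('X-Sendfile: ' . $file); and skip readfile(),
// letting Apache do the actual transfer after PHP's checks.
readfile($file);
exit;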

Upload file with sftp and php

I'm using the library phpseclib 0.3.7 to connect via SFTP to my web server. The user accounts on it have the following directory structure:
/dirhome/FirstUser/FirstUser/
/dirhome/SecondUser/SecondUser/
/dirhome/....../.....
Where:
dirhome:
own -> root:root
permission -> 700
FirstUser: (the outer directory, named after the user (example))
own -> root:root
permission -> 755
FirstUser: (the inner directory with the same name)
own -> FirstUser:mygroup
permission -> 700
The same structure applies for the second user and so on. With this setup, users can view/edit/create content only within their own directory.
I can easily connect in PHP using the library and view folders/files, but I do not know how to upload files from the user's local PC to the remote server, into their personal folder.
The library provides the method:
->put($remotepath, $localpath, NET_SFTP_LOCAL_FILE);
The problem is:
How can I open a dialog box to allow the user to select a file on their PC and upload it to their personal directory?
If I put: $localpath = "C:\\Users\\****\\Desktop\\test.txt"
I get:
C:\\Users\\****\\Desktop\\test.txt is not a valid file
But if I use a file that resides locally on the server, it works fine; that does not make sense here, though, because users cannot upload their own files that way. I have been trying for several days without success.
The download does not work either, for essentially the same reason.
PHP is a server-side scripting language. When your browser requests a PHP-generated site, PHP runs on the server and your browser only gets to see the result. For that reason, PHP obviously has no way to access client-side files: it has already run before anything even reaches the client computer.
I'm assuming the server with the PHP files on it and the SFTP server are different servers, otherwise your whole question doesn't make too much sense.
So, what you need to do here is a two-step approach: first, upload the files to the server that runs the PHP scripts the regular way, using an HTTP POST request. You can send the request to a PHP script that then uses SFTP to move the files to the other server.
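A minimal sketch of that relay script, using the same phpseclib 0.3.x API as in the question (hostname, credentials, and the form field name are assumptions):

include('Net/SFTP.php');

if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    $sftp = new Net_SFTP('sftp.example.com');
    if (!$sftp->login('FirstUser', 'password')) {
        exit('SFTP login failed');
    }
    // NET_SFTP_LOCAL_FILE makes put() read from a local path, here the
    // temporary file PHP created for the HTTP upload
    $remote = '/dirhome/FirstUser/FirstUser/' . basename($_FILES['userfile']['name']);
    $sftp->put($remote, $_FILES['userfile']['tmp_name'], NET_SFTP_LOCAL_FILE);
}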
For downloads (as you asked about in the comments) it works similarly: the browser requests a PHP script that fetches the file from the SFTP server and then sends it to the browser as the HTTP response. The browser will then display a regular file download dialog and the user can select where to store it.
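The download direction could be sketched like this (same placeholder credentials; in phpseclib 0.3.x, get() with one argument returns the file contents as a string):

include('Net/SFTP.php');

$sftp = new Net_SFTP('sftp.example.com');
if (!$sftp->login('FirstUser', 'password')) {
    exit('SFTP login failed');
}
$data = $sftp->get('/dirhome/FirstUser/FirstUser/test.txt');

// Hand the contents to the browser as a download
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="test.txt"');
echo $data;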
For uploading, you should consider using some kind of cron job or a background job started via PHP's exec() instead, as you will most likely either run into max execution timeouts or have to raise them far higher than you should if you upload using PHP, especially for large files. The alternative is to use a separate PHP configuration (depending on your PHP version you can use .htaccess files, .user.ini files, or different PHP-FPM pools for that) to increase the execution time for only the upload script.

Check the content of a file before upload

I have to check the content of a zip/rar file before uploading it to the server.
Let me explain the scenario.
There are 2 types of users in my web project:
1: Normal Registered user
2: Administrator of the Project
Any registered user can create pages on our project, and they can also create themes for pages.
Someone suggested a feature to upload a theme as a theme pack [compressed as a Zip/Rar file].
If the uploader is an administrator, that is fine; there are no further security constraints.
But I am worried about the case of normal registered users.
My problems are:
Assume that a registered user uploads a theme pack that contains some malicious files [including a PHP file] that may hurt the system.
I know that it is possible to check the contents after upload, but what will happen if the user executes the file before that?
Example: a user uploads a theme pack containing some PHP code and other large files. Our system first extracts the contents of the theme pack; assume the extraction of the large files takes some time while the smaller PHP file is extracted early, so the user can RUN the PHP file first.
That is my naive doubt; I honestly don't know the other angles.
Please help me figure out this problem.
Is it possible to upload the ZIP file in a secure manner?
You won't be able to check this client side unless, of course, you had some kind of plugin (for all browsers) that did the checking/uploading for you. You'll have to handle this on the server side.
Also, admins can upload viruses just as easily as non-admins. Some users don't even know their machine has more viruses than a shanty-town brothel.
EDIT: Also, how is the user going to execute their PHP file on your server before you've checked it, unless you run that PHP file? This sounds like a recipe for disaster anyway. All it takes is for something to slip through the cracks and a malicious user can destroy your site. Allowing normal people to upload executable scripts to your server is asking for serious trouble.
Unpack it into a directory that can't be reached through the web, check it, then move it back to the web folder where it belongs.
Assuming that you have your website in the directory /var/www/website, and user content goes to /var/www/website/user and is reachable through www.website.com/user/:
Create a temporary dir in /tmp, unpack there, check, then move to /var/www/website/user; see the sketch below.
If you don't have access to /tmp, you can create /var/www/website/tmp and prohibit access to it using your server settings.
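A minimal sketch of that flow, assuming the zip arrives in $_FILES['theme'] and a "no PHP files allowed" policy (both assumptions):

$tmpDir = sys_get_temp_dir() . '/theme_' . uniqid();
mkdir($tmpDir, 0700);

$zip = new ZipArchive();
if ($zip->open($_FILES['theme']['tmp_name']) !== true) {
    exit('Not a valid zip file');
}
$zip->extractTo($tmpDir); // nothing here is web-reachable yet
$zip->close();

// Walk the extracted tree and reject the pack if it contains PHP scripts
$ok = true;
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($tmpDir, FilesystemIterator::SKIP_DOTS)
);
foreach ($it as $file) {
    if (strtolower($file->getExtension()) === 'php') {
        $ok = false;
        break;
    }
}

if ($ok) {
    // rename() can fail across filesystems; copy-and-delete is the fallback
    rename($tmpDir, '/var/www/website/user/themes/' . uniqid());
}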
You can create a folder to receive the zip file and unzip it there, and disable PHP execution for that folder. That can solve your problem.
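For example, with Apache and mod_php, a one-line .htaccess in that folder would do it (a sketch; whether the directive is honoured depends on the server setup, e.g. AllowOverride):

php_flag engine off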

Help understanding the setup of a CDN?

Here are a few questions I cannot find answers to in search:
When adding CDN services to your website, do you still maintain/create local dynamic files on your origin server and point the CDN to that location, set an HTTP rule, and have the CDN pull files automatically if it isn't hosting them yet?
Let's say I have an avatar upload form on my origin server; after a cropping function, do I save the image to the local directory or to the CDN?
The other question I have: if you save files locally first and wait for the CDN to pull them, how do you code the page to know the difference? Do you use something like
// $filename = 'images/image.jpg';
function static_file($filename) {
    $cdnfilepath = 'http://cdndomain.com/';
    // Probe the CDN copy (requires allow_url_fopen; costs an HTTP request per call)
    $handle = @fopen($cdnfilepath . $filename, "r");
    if ($handle) {
        fclose($handle); // close the probe handle instead of leaking it
        return $cdnfilepath . $filename;
    }
    return $filename;
}
Or, do you just PUT every dynamically created file that you would like the CDN to host directly to the CDN?
If anyone knows a good tutorial on this, that would be helpful. Sorry if any of this has been covered, but I have been searching with no clear answers...
Sometimes there's no straightforward way of uploading directly to your CDN.
For example, with AWS you have to PUT the file, which means it still has to be uploaded to your server temporarily. What I do is upload the files to a temp directory, then have a cron script PUT the files onto AWS, so the upload process doesn't take any longer for the end user.
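That cron script could be sketched with the official aws/aws-sdk-php along these lines (bucket, region, and the queue directory are made up; credentials are assumed to come from the environment):

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

foreach (glob('/var/www/cdn-queue/*') as $path) {
    $s3->putObject([
        'Bucket'     => 'my-cdn-bucket',
        'Key'        => 'images/' . basename($path),
        'SourceFile' => $path,
    ]);
    unlink($path); // drop it from the queue once it's on S3
}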
