Help understanding the setup of a CDN? - php

Here are a few questions I can't find answers to through search:
When adding a CDN to your website, do you still create and maintain local dynamic files on your origin server, point the CDN at that location, set an HTTP rule, and have the CDN pull files automatically if it isn't hosting them yet?
Say I have an avatar upload form on my origin server: after a cropping function runs, do I save the image to the local directory or push it to the CDN?
My other question: if you save files locally first and wait for the CDN to pull them, how do you code the page to know the difference? Do you use something like
// $filename = 'images/image.jpg';
function static_file($filename) {
    $cdnfilepath = 'http://cdndomain.com/';
    // @ suppresses the warning fopen() emits when the remote file is missing
    if ($handle = @fopen($cdnfilepath . $filename, "r")) {
        fclose($handle);
        return $cdnfilepath . $filename;
    }
    return $filename;
}
Or, do you just PUT every dynamically created file that you would like the CDN to host directly to the CDN?
If anyone knows a good tutorial on this, that would be helpful. Sorry if any of this has been covered, but I have been searching with no clear answers...

Sometimes there's no straightforward way of uploading directly to your CDN.
For example, with AWS you have to PUT the file, which means it still has to be uploaded to your server temporarily. What I do is upload the files to a temp directory, then have a cron script PUT the files onto AWS, so the upload process doesn't take any longer for the end user.
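A minimal sketch of such a cron job, assuming the official AWS SDK for PHP is installed via Composer; the bucket name, region, and temp directory are placeholders for this example:

```php
<?php
// Sketch of a cron script that pushes queued uploads to S3.
// Assumes aws/aws-sdk-php is installed; credentials come from the
// environment or ~/.aws/credentials. Names below are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

$tempDir = '/var/www/uploads-pending';
foreach (glob($tempDir . '/*') as $file) {
    $s3->putObject([
        'Bucket'     => 'my-bucket',
        'Key'        => 'uploads/' . basename($file),
        'SourceFile' => $file,
    ]);
    unlink($file); // remove the local copy once it is on S3
}
```

Because the cron job runs out-of-band, the user's upload request returns as soon as the file lands in the temp directory.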


Serve static files from another server while keeping the process invisible

What I am trying to achieve is to serve files from an S3 bucket for multiple websites without showing the real path of the items. The files are stored in the S3 bucket like: bucket/website
Example:
domain: domain.com
static: static.domain.com
I do not want to create S3 buckets for every website, so I want to store all the files in a bucket folder and serve them from there with a script.
I currently got:
<?php
$path = "domain";
$file = "filename";
// Save a copy of the file
file_put_contents($file, fopen($path . $file, 'r'));
// Set the content type and output the contents
header('Content-Type: ' . mime_content_type($file));
readfile($file);
// Delete the file
unlink($file);
?>
but it's not that clean, since I save the file and then output it. Saving the file could cause a mess, because different websites might have files with the same name.
I'm also missing the .htaccess rewrites that would make it appear that you got the file from static.domain.com/file rather than from static.domain.com/script.
Any suggestions on how to realize this would be great! Many thanks in advance!
Configure Apache to proxy the request to S3 (this requires mod_rewrite and mod_proxy to be enabled). Something like:
RewriteRule /path/to/static/files/(.*) http://my.bucket.s3.amazon.com/$1 [P]
This means a request to your web server for /path/to/static/files/foo.jpg will cause your web server to fetch the file from my.bucket.s3.amazon.com/foo.jpg and serve it as if it came from your web server directly.
However, note that this somewhat negates the point of S3 for the purposes of offloading traffic. It just offloads the storage, which is fine, but your web server still needs to handle every single request, and in fact has some overhead in terms of bandwidth due to having to fetch it from another server. You should at the very least look into caching a copy on-server, and/or using a CDN in front of all this.
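Caching a copy on-server could be sketched like this; the bucket URL and cache directory are placeholder names, and the sanitisation shown is deliberately minimal:

```php
<?php
// Sketch: serve from a local cache, falling back to S3 on a miss.
// Bucket URL and cache directory are placeholders for this example.
$bucketUrl = 'http://my.bucket.s3.amazon.com/';
$cacheDir  = '/var/cache/s3proxy/';

// basename() is crude sanitisation against path traversal
$key   = isset($_GET['file']) ? basename($_GET['file']) : '';
$local = $cacheDir . sha1($key); // hashed name avoids collisions between sites

if (!is_file($local)) {
    // Cache miss: pull the object from S3 and keep a copy on disk
    file_put_contents($local, fopen($bucketUrl . $key, 'r'));
}

header('Content-Type: ' . mime_content_type($local));
readfile($local);
```

After the first request, repeat hits are served from local disk without touching S3; a real deployment would also expire stale cache entries.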

Hosting on Android: Backing up by cloning to external SD card?

I'm hosting a small website on a fairly budget Android tablet. If it were to fail, I would essentially lose my files. To get around this, I'm thinking of using PHP's copy() to clone the main directory to the tablet's SD card on request.
Is this a good idea, or not? What are the risks? While this is happening, will PHP still continue to run other scripts?
The directory that will be cloned only contains the webpages, all of the images have been moved into another folder.
Here is how I am planning to accomplish it:
include("adminverify.php");
if (isset($_GET["websiteBackup"]) && $admin === true) {
    $sdcard = "/mnt/extsdcard";
    $sourcefile = "/internal/www";
    // Write to a file saying the backup has started
    copy($sourcefile, $sdcard) or die("Could not be done");
    // Write to a file saying the backup has finished
}
Alternatives are welcome. I would simply copy the files over on the tablet myself, but the tablet is honestly too laggy.
If it turns out PHP cannot keep serving requests while doing the backup, I will simply have it change the directory name and modify the 404 page to say that the website is temporarily unavailable.
If this helps, I'm using lighttpd and PHP 5.5.15
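As a side note, PHP's copy() only handles single files, not directories, so cloning /internal/www needs a recursive helper. A minimal sketch (the usage paths are the ones from the question):

```php
<?php
// Recursively copy a directory tree; copy() alone cannot do this.
function copy_dir($src, $dst) {
    if (!is_dir($dst) && !mkdir($dst, 0755, true)) {
        return false;
    }
    foreach (scandir($src) as $entry) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        $from = $src . '/' . $entry;
        $to   = $dst . '/' . $entry;
        if (is_dir($from)) {
            if (!copy_dir($from, $to)) {
                return false;
            }
        } elseif (!copy($from, $to)) {
            return false;
        }
    }
    return true;
}

// Usage, with the paths from the question:
// copy_dir('/internal/www', '/mnt/extsdcard/www-backup') or die("Could not be done");
```

While the copy runs, the web server can still serve other requests; only the PHP worker doing the copy is tied up.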

What security steps should be taken for file uploads, since keeping them in the web root is said to be vulnerable? [closed]

Closed 7 years ago.
I have been reading about file uploads in various tutorials and sources, and found that uploading into the web root, or into any web-accessible folder, is a security issue; the advice is to keep the upload folder outside the root.
However, on a shared hosting server like GoDaddy, the user may not have access outside the root folder.
If nothing can really be done about that, how do open-source applications like WordPress, Joomla, and Drupal keep their uploads secure, and remain so confident about that security?
In short: what has to be taken care of to store data securely on the web when the only option is to keep the files within the root?
A few checklist items I know of for secure file uploads, when you are forced to keep the files within the publicly accessible area:
Check the uploaded file's size and type.
When storing files, rename them to random names and track the original filename in a database; md5 or sha1 works well for this.
Disable script execution with .htaccess.
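The "disable script execution with .htaccess" item can look something like the following sketch, placed in the upload directory itself; the exact directives depend on your Apache/PHP setup (php_flag requires mod_php):

```apache
# Stop PHP from executing anything in this directory
php_flag engine off

# Belt and braces: serve PHP-ish files as plain text instead of running them
AddType text/plain .php .php3 .phtml .pht

# Or, on Apache 2.4+, deny access to script files entirely:
# <FilesMatch "\.ph(p[0-9]?|tml)$">
#     Require all denied
# </FilesMatch>
```

With this in place, even if an attacker smuggles a .php file past the upload checks, requesting it will not execute it.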
Here is an example of serving the uploaded files:
// this is just an example to show how we can serve the files
$imgfile = $rsPhoto['photo']; // or a value from the database
list($width, $height, $type, $attr) = getimagesize($imgfile);
switch ($type) {
    case 1:
        $im = imagecreatefromgif($imgfile);
        header("Content-type: image/gif");
        imagegif($im);
        break;
    case 2:
        $im = imagecreatefromjpeg($imgfile);
        header("Content-type: image/jpeg");
        imagejpeg($im);
        break;
    case 3:
        $im = imagecreatefrompng($imgfile);
        header("Content-type: image/png");
        imagepng($im);
        break;
}
This is just an example; the point is not saving and retrieving a few image files. Data, as we all know, is a crucial element of any business's success, so when such critical and important data has to be handled, what options do we have to make things as secure as possible?
References:
Implementing Secure File Upload in PHP
EDIT 1:
Is it a good idea to permanently redirect the domain to a subfolder of your domain?
So that your root is /, but after redirection your effective root is /main-website/.
Then if I keep my upload folder at /upload/, I think it would effectively sit outside the web-accessible/public area...?
That is, my domain www.xyz.com points to /main-website/, and the upload folder is out of the scope of this domain.
Just a thought that came to my mind, so I'm putting it up.
Thanks.
I will assume that the uploaded files must be world-readable, and that a file can be of any type: images (jpg, png, ico), documents (pdf, docx, xls), etc.
The best solution here (generic: it applies not just to PHP projects but to web projects in general) is to add a layer of indirection: save the file under a custom, server-generated name, and use a database to store the original file name.
Okay, let's see it by example. Imagine we have these directories:
/ -> root; world-readable
/files -> where we will store your files; world-readable too
So, when I upload a file named "foo.png" to your site, you save it to the "/files" directory, but you must change its name to an auto-generated one[1]. Let's say "1234asd". You then write a new record to the database mapping the old file name ("foo.png") to the new auto-generated one. So... how can I download my file if I don't know the new name?
Well, you must create "/file.php" on your server, which accepts a GET parameter called "filename". Then I can request "/file.php?filename=foo.png" and your code will do the following:
Search the database for the file "foo.png". If it exists, get the real filename; if not, just return a 404 HTTP code.
Then read the file from the directory ("/files/1234asd") and return it to the user.
This way, any kind of file can be uploaded to your server (*.php files too) and you are secure, because the files are never accessed directly, only through your PHP code (file.php).
This offers additional advantages, like being able to check whether a user has permission to read the file he is requesting (by implementing some kind of simple authentication). And if the file is not found, instead of returning a bare 404 HTTP code (which is the correct thing to do in that case) you can return a custom error message like "oops! 404 - The file you requested is not available".
Note: this solution applies to files from 0 up to maybe 10-20 MB. If you are working with larger files, it is not optimal, and another approach should be taken.
[1] http://php.net/manual/en/function.uniqid.php
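A sketch of what that file.php lookup could look like; the DSN, table name ("uploads"), and column names are invented for this example:

```php
<?php
// file.php - map a public file name to its stored random name via the DB.
// The connection details, table, and columns are assumptions for this sketch.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$requested = isset($_GET['filename']) ? $_GET['filename'] : '';

// Prepared statement: the user-supplied name never touches the filesystem
$stmt = $pdo->prepare('SELECT stored_name, mime_type FROM uploads WHERE original_name = ?');
$stmt->execute([$requested]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if ($row === false) {
    http_response_code(404);
    exit('File not found');
}

header('Content-Type: ' . $row['mime_type']);
readfile('/files/' . $row['stored_name']);
```

Since only the auto-generated stored_name is ever used as a path, a request for "../../etc/passwd" simply misses in the database and gets a 404.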
Here's a link to (the Google-cached, text-only version of) an article that is useful for helping secure WordPress:
http://webcache.googleusercontent.com/search?q=cache:V5RddpaOH4IJ:gerry.ws/2008/10/152/setup-and-secure-your-wordpress-upload-directory.html&hl=en&gl=au&strip=1
(I've linked to the Google cache version because their site makes my Chrome/Firefox lock up; the text-only version doesn't.)
Basically, you put your uploads in a location that only the app can access (above or outside the web location) and then:
limit the MIME types of files that can be uploaded (and validate the files to make sure they don't contain known buffer overruns, or exploits like EXIF poisoning, embedded executables, etc.)
make sure you aren't allowing parent paths
make sure that your upload path calculation is run server side not through some sort of hidden form field etc
make sure the execute access of your server platform (e.g. php/apache) won't execute in that location
make sure that only the web server (e.g. apache) account has rights to write to the location
make sure your scripts validate the data being posted in the upload
see also: http://codex.wordpress.org/Hardening_WordPress
Besides the methods for checking a file while uploading it to your server (check extension/MIME type, limit upload size...), I have more tips below.
When I use shared hosting: turn off PHP execution in the upload folder; you can use .htaccess with php_flag engine off.
When I have a dedicated server: set up a storage host without any script (PHP) execution and upload files via FTP.
While you may not have access to any folder outside the root, on shared hosting you typically do have access to your home directory.
I have tested this on a server of mine: I can store files in /home/myuser, which may be outside the web root (typically located at /home/youruser/www).
You can also use /tmp, which is writable by everyone, but if you do, don't store sensitive data there, as it is readable by all users on the host.

Check if NFS share is up in PHP

I am working on a system that will store uploaded files. The metadata will go into a locally-accessible database, but the files themselves are going to be stored on a remote box via NFS so that PHP can interact with the server as if it was a directory.
I identified an issue that may occur if somebody attempts to upload a file when the NFS server is down or otherwise unavailable, which could cause the script to error out or hang. Obviously we want to avoid this scenario and handle it in a graceful manner, but we aren't sure how we can do this.
We are thinking of a) checking the server at page-display time and ghosting out the file upload portion of the form should the server be down, or b) checking the link before executing move_uploaded_file to store the uploaded document.
Is it possible to do this from within PHP, and if so, how?
Check out http://www.php.net/manual/en/function.stream-set-timeout.php
You could write a simple check that tries to write to the NFS share with a 2 second timeout. If it succeeds, proceed with the move_uploaded_file. If it fails, give the user a graceful error.
I don't know what your setup looks like... If you are mounting it, could you use is_writable?
if (!is_writable('/path/to/nfs/share/mount')) {
    die('NFS share is not writable!');
}
I'd try to actually write a small file at the NFS mount point; if that succeeds, you're online and can write the posted file.
If not, cache it on the web server's disk for a later (automatic) save.
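That probe-write idea can be wrapped in a small helper; a sketch, where the mount point in the usage comment is a placeholder:

```php
<?php
// Write a tiny probe file to the share and remove it again. For network
// mounts a real write is a more honest test than is_writable() alone.
function share_is_writable($dir) {
    $probe = rtrim($dir, '/') . '/.probe_' . uniqid();
    $ok = @file_put_contents($probe, 'x') !== false;
    if ($ok) {
        @unlink($probe);
    }
    return $ok;
}

// Usage (placeholder path):
// if (!share_is_writable('/mnt/nfs/uploads')) { /* queue locally instead */ }
```

Note that if the NFS server is hung rather than down, the write itself may block for a while, so this check is best combined with a mount-level timeout option.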
Check if you can opendir() the directory?
<?php
$dir = "/etc/php5/";

// Open a known directory, and proceed to read its contents
if (is_dir($dir)) {
    if ($dh = opendir($dir)) {
        // do your stuff
        closedir($dh);
    }
}
?>

Web Development: How can I allow a user to upload files directly to my CDN (Cachefly)?

I have a PHP web application that allows users to upload images to my web site. I'm doing this using a simple HTML <form enctype="multipart/form-data">.
However, instead of having those images uploaded to my web server, I would like to have them uploaded directly to my CDN (Cachefly, which is another server).
Is this possible... to have a web application let a user upload images directly to another server?
In case it helps, here's my PHP code:
$target_path = "/home/www/example.com/uploads/";
$target_path = $target_path . basename($_FILES['uploadedfile']['name']);
if (move_uploaded_file($_FILES['uploadedfile']['tmp_name'], $target_path)) {
    // file has been uploaded **LOCALLY**
    // HOWEVER, instead of it being uploaded locally, I would like the file
    // to be directly uploaded to the CDN ('other' server)
    ...
} else {
    // error: file did not get uploaded correctly
    ....
}
I think in the case of a CDN you will first have to receive the files on your server and then upload them to the CDN's 'bucket' using its API. I don't think you can upload directly to a CDN unless there is a way to map it as a directory on your server.
Moving/uploading a file to a service, or to a server you cannot access directly, is usually done using the provider's API.
Moving/uploading a file to a server 'owned' by yourself can be done using PHP's FTP extension (for more information: pear.php.net or pecl.php.net).
Moving/uploading a file to a server 'owned' by yourself that is one of many in a cluster is usually done by uploading the file temporarily to one server, after which a .sh, .bash, or similar script is called that triggers further transfer processes to the other servers.
I don't think it's possible to upload directly to another server, but I could be wrong. I had a similar problem, and I used PHP's FTP capabilities (http://us3.php.net/manual/en/book.ftp.php). I still used my server as a middle-man: I uploaded the files to my server, FTP-transferred them to the target server, and then deleted the files from my server.
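The FTP middle-man relay might look like the following sketch; the host, credentials, and paths are placeholders:

```php
<?php
// After move_uploaded_file() has landed the file locally, relay it by FTP
// and delete the local copy. Host and credentials below are placeholders.
function relay_via_ftp($localPath, $remotePath) {
    $conn = ftp_connect('ftp.cdn.example.com');
    if ($conn === false || !ftp_login($conn, 'username', 'password')) {
        return false;
    }
    ftp_pasv($conn, true); // passive mode usually plays nicer with firewalls
    $ok = ftp_put($conn, $remotePath, $localPath, FTP_BINARY);
    ftp_close($conn);
    if ($ok) {
        unlink($localPath); // done: remove the middle-man copy
    }
    return $ok;
}
```

This keeps the user's request fast if the relay runs after the response, e.g. from a queue or cron job, rather than inline during the upload.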
You could receive it on your web server and then transfer it to the CDN via a file share or FTP.
If the CDN is web-facing, you could redirect the request to that server and hand control back to your web server once the file is uploaded. It's probably better to do the file transfer in the back end, though, and keep the user connected to the web server.
Sure.
Somewhere in your code there is a "$target_directory" variable that needs to be set. It won't be called exactly that, but depending on how your function is set up it needs to be there somewhere. Just use an absolute path for the directory you want the files to land in. Also, make sure that directory is chmod'd to 777 so it can be written into.
Post your code and I can help more.
Yes, Amazon Web Services already allows you to upload to Amazon S3 directly from the user's browser:
Documentation: http://doc.s3.amazonaws.com/proposals/post.html
Additionally, that S3 bucket can be exposed via the Amazon CDN (or any other CDN that can point to a customer's origin server).
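A skeleton of such a browser-based POST form, per the linked documentation; the policy document and its signature must be generated server-side, so they and the bucket name are left as placeholders here:

```html
<!-- Direct browser-to-S3 upload form (sketch). The policy document and
     its signature must be computed server-side; placeholders shown. -->
<form action="https://my-bucket.s3.amazonaws.com/" method="post"
      enctype="multipart/form-data">
  <input type="hidden" name="key" value="uploads/${filename}">
  <input type="hidden" name="AWSAccessKeyId" value="YOUR_ACCESS_KEY">
  <input type="hidden" name="acl" value="private">
  <input type="hidden" name="policy" value="BASE64_ENCODED_POLICY">
  <input type="hidden" name="signature" value="SIGNATURE">
  <input type="file" name="file">
  <input type="submit" value="Upload to S3">
</form>
```

The signed policy constrains what the browser may upload (key prefix, size limits, content type), so the user never needs your AWS secret key.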
