Storing and retrieving user files on a different server / subdomain - PHP

I'm looking for some quick info about best practices for storing users' uploaded files on a different server or subdomain...
For example, photos on Facebook of course aren't at facebook.com/files/users/453485 etc...
but rather on photos.ak.fbcdn.net or whatever...
I'm wondering how, with PHP, I can upload to a different server whilst maintaining a MySQL connection to my original one... is it possible?

Facebook uses a content delivery network (CDN, hence fbcdn: Facebook Content Delivery Network) and probably uses web services to pass binary data (photos) from server to server.
Rackspace Cloud offers a similar service. Here is an example application using their PHP library to access their web service API: http://cloudfiles.rackspacecloud.com/index.php/Sample_PHP_Application
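For illustration, here's a rough sketch along the lines of that sample application, using the legacy php-cloudfiles bindings (the class and method names below are from older releases of that library and may have changed, so treat this as an outline rather than working code):

    <?php
    // Rough sketch using the legacy php-cloudfiles bindings; names taken
    // from older releases and may differ in current versions.
    require 'cloudfiles.php';

    $auth = new CF_Authentication('username', 'api_key');
    $auth->authenticate();

    $conn      = new CF_Connection($auth);
    $container = $conn->create_container('photos');

    // Upload a local file as an object in the container.
    $object = $container->create_object('users/453485/photo.jpg');
    $object->load_from_filename('/tmp/photo.jpg');

    // Publish the container over the CDN and build a public URL.
    $uri = $container->make_public();
    echo $uri . '/users/453485/photo.jpg';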

I'm going to make the assumption that you have multiple webservers, and want to be able to access the same set of files on each one. In that case, some sort of shared storage that each machine can access might be a good place to start.
Here are a couple options I've used:
Shared NFS Volume [ http://en.wikipedia.org/wiki/Network_File_System_(protocol) ]
MogileFS [ http://www.danga.com/mogilefs/ ]
Amazon S3 [ http://aws.amazon.com/s3/ ]
If you don't have control over the hardware or aren't able to install extra software, I'd suggest Amazon S3. There is an API that you can use to shuttle files back and forth. The only downside is that you don't get to use storage you may already have, and it will cost you some money.
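To give an idea of what that shuttling looks like, here's a minimal sketch with the AWS SDK for PHP (the bucket name, region, and paths are placeholders, not anything from the question):

    <?php
    // Minimal sketch: push an upload to S3, then pull it back from any
    // other web server. Bucket, region and paths are placeholders.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

    // Send the just-uploaded file to shared storage.
    $s3->putObject([
        'Bucket'     => 'my-user-files',
        'Key'        => 'users/453485/avatar.jpg',
        'SourceFile' => $_FILES['avatar']['tmp_name'],
    ]);

    // Any other web server can fetch it back the same way.
    $s3->getObject([
        'Bucket' => 'my-user-files',
        'Key'    => 'users/453485/avatar.jpg',
        'SaveAs' => '/tmp/avatar.jpg',
    ]);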
If you do have access to the hardware and software, MogileFS is somewhat like S3 in that you have an API to access the files, but it differs in that you get to use your existing storage, at no additional cost.
NFS is a typical place where people start, because it's the simplest way to get going. The downside is that you have to be able to configure the servers and set up an NFS volume for them to mount.
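The PHP side of the NFS approach stays trivial; a sketch, assuming every web server mounts the same export at /mnt/shared-uploads (the mount point and field names are made up):

    <?php
    // Sketch: every web server mounts the same NFS export, so a plain
    // move_uploaded_file() makes the file visible to all of them.
    $userId = 453485; // example user id
    $target = '/mnt/shared-uploads/users/' . $userId . '/' . basename($_FILES['photo']['name']);

    if (!is_dir(dirname($target))) {
        mkdir(dirname($target), 0755, true);
    }

    if (move_uploaded_file($_FILES['photo']['tmp_name'], $target)) {
        // Store only the relative path in MySQL; any server can resolve
        // it against the shared mount.
    }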
But if I were starting a high-volume photo hosting service, I'd use S3, and I'd put a CDN like Akamai in front of it.

Related

Image Storage Technique

We have six application servers (under LVS) which receive requests randomly.
Around 8 years back we used to store images in the database.
Pros: Can be accessed from all the application servers.
Cons: Slow.
Around 5 years back, we shifted to storing images as files on one of the six application servers, with nginx rules that make sure all image read/write requests go to that single server.
Pros: Fast.
Cons: All image read/write requests go to a single server.
Question: Is there a better approach to solve the following issues:
1. Can be accessed from all application servers.
2. Fast access.
Note: we move images to a common image server after some time.
We do not move them instantly because we don't want to rely on that server, and it would also increase user upload time.
You can leverage the power of a Content Delivery Network (CDN) and the storage buckets provided by services like AWS.
Upload all images to a single store, say an AWS S3 bucket (https://aws.amazon.com/s3/), which gives you all images in one central place, accessible from all application servers.
You can then link your S3 bucket with a CDN service like AWS's CloudFront, or one of the free services like Cloudflare.
https://aws.amazon.com/cloudfront/
Read more about how to use S3 with PHP here:
https://devcenter.heroku.com/articles/s3
http://docs.aws.amazon.com/AmazonS3/latest/dev/RetrieveObjSingleOpPHP.html
Read more about linking an S3 bucket to CloudFront here:
http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/MigrateS3ToCloudFront.html
So AWS's S3 will give you globally accessible images, and the CloudFront CDN will give you excellent speeds.
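Once the bucket is linked to a distribution, serving an image through the CDN is just a URL rewrite on the application side. A sketch, with a placeholder distribution domain (you'd get the real one from the CloudFront console):

    <?php
    // Sketch: build CDN URLs instead of S3 URLs once the bucket is behind
    // a CloudFront distribution. The domain below is a placeholder.
    const CDN_DOMAIN = 'd1234abcd.cloudfront.net';

    function imageUrl($objectKey)
    {
        // Same object key used when uploading to S3; CloudFront pulls it
        // from the bucket on the first request and serves it from edge
        // caches afterwards.
        return 'https://' . CDN_DOMAIN . '/' . ltrim($objectKey, '/');
    }

    echo imageUrl('users/42/photo.jpg');
    // => https://d1234abcd.cloudfront.net/users/42/photo.jpg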

Google Compute Engine - PHP upload files to bucket

I created a Google Cloud Platform account, deployed the LAMP server, and logged in through SFTP and uploaded my existing site.
My site is currently running on another server, so I also uploaded all my databases and everything.
On my current server with HostGator I upload files to the /home/username/public_html/uploads folder, but on Google you upload to buckets. I figured I would use their Cloud Storage instead.
I tried doing a PHP move_uploaded_file() to the bucket: gs://mybucket/uploads; however, that's not working.
What's the trick? The only documentation I can find is for App Engine, but I'm just using Compute Engine with LAMP installed.
Also, which would be better? Should I stick with saving the uploads to the LAMP server, or should I use buckets?
Thanks!
You could use the JSON API directly:
https://cloud.google.com/storage/docs/json_api/v1/how-tos/upload
There is also a PHP library for using the Google APIs in general:
https://developers.google.com/api-client-library/php/start/get_started
Then you could use that library to insert objects into your bucket with a call like this:
https://cloud.google.com/storage/docs/json_api/v1/objects/insert
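For completeness, a minimal sketch using Google's dedicated google/cloud-storage package (a newer wrapper around the same objects.insert call; project, bucket, and field names here are placeholders):

    <?php
    // Minimal sketch with the google/cloud-storage package. A plain
    // move_uploaded_file() to gs:// fails because no stream wrapper is
    // registered; this streams the upload into the bucket instead.
    require 'vendor/autoload.php';

    use Google\Cloud\Storage\StorageClient;

    $storage = new StorageClient(['projectId' => 'my-project']);
    $bucket  = $storage->bucket('mybucket');

    $bucket->upload(
        fopen($_FILES['file']['tmp_name'], 'r'),
        ['name' => 'uploads/' . basename($_FILES['file']['name'])]
    );

    // Alternatively, $storage->registerStreamWrapper(); makes gs://
    // paths usable with PHP's ordinary filesystem functions.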
If you're already running a LAMP stack, it's fine to serve files directly from the VM itself, using its local disk; you don't need to start using Google Cloud Storage for typical, simple use cases.
Google Cloud Storage, which is where the buckets terminology comes from, is useful for storing lots of files or serving a very large site, whether for processing via Hadoop or for serving a global audience. Because it involves more work on your part, it makes sense once your site is popular enough that serving data directly from the VM costs more than using Google Cloud Storage.
Google Cloud Storage also scales much further than disks on a VM (you can attach additional disks, but you're limited to 10TB of persistent disk per VM, and adding them is not automatic); whether that matters depends on how much space your use case needs.
Bottom line: if your site is working fine with the LAMP stack on your VM, it's fine to keep it. It's a good idea to keep track of your TCO, so keep an eye on the cost of your compute, storage, and network traffic, and see at which point it makes sense to move some of your assets to Google Cloud Storage.
You can use the Google Cloud Platform Pricing Calculator to estimate your costs, but also take a look at pricing for Google Compute Engine (which includes compute, storage, and networking) as well as Google Cloud Storage, and see how your particular use case works out.

Online file storage logic

I want to develop a website like a file manager, where users register and get a fixed amount of disk space, let's say 20MB.
Users can then upload their PDF, DOC, TXT, JPEG, etc. files up to their disk limit.
I can build this much using PHP.
Now below are my issues:
1) If users' files get corrupted, they can roll their folders back to a version from 2-3 days earlier.
Files must be secure and safe from viruses, as users are uploading their important documents.
Is there any 3rd-party storage server that provides such a facility?
2) Also, all files should be previewable in the browser.
I am using the Google Docs viewer. Is this a good and safe way to preview files in the browser?
But the Google links are accessible to everyone; I need to add restrictions so a file can be viewed only by its owner.
I know it's a major task, but I just need some sort of logic. Please share your thoughts.
Thanks.
Any cloud storage service can be used for this; you'll get HDD space. No storage server provides a full revision control system for this out of the box, though you can use Git or SVN yourself. But as the files are binary, you won't get the full benefit of those tools.
How files are previewed is up to you. If you use PHP, you build the site and at the backend use the API to interact with the storage service. Google Docs is not an option for this if you use PHP. Also note that Google links can be made private.
I suggest this:
Find a cloud storage service and use its storage from your server. Any will do.
Create the UI using PHP and control access using PHP too.
Manipulate files on your server directly, or on the 3rd-party storage server via its API.
Use a revision control system to track the changes, and use its API from the PHP end.
Some cloud storage services:
Amazon S3. It also supports versioning (see the sketch after this list).
Google Cloud Storage
Microsoft Azure
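As a sketch of how S3's versioning can cover the rollback requirement from point 1 (bucket name and key are placeholders; the calls are real AWS SDK for PHP operations, but verify the ordering of the returned versions before relying on it):

    <?php
    // Sketch: enable versioning once per bucket, then restore an older
    // revision by copying it over the current object.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

    // With versioning on, S3 keeps every overwrite as a separate version.
    $s3->putBucketVersioning([
        'Bucket'                  => 'user-files',
        'VersioningConfiguration' => ['Status' => 'Enabled'],
    ]);

    // List the revisions of a key (newest first for a given key).
    $result   = $s3->listObjectVersions([
        'Bucket' => 'user-files',
        'Prefix' => 'users/42/report.doc',
    ]);
    $versions = $result['Versions'];

    // Roll back by copying the previous revision over the current one.
    if (isset($versions[1])) {
        $s3->copyObject([
            'Bucket'     => 'user-files',
            'Key'        => 'users/42/report.doc',
            'CopySource' => 'user-files/users/42/report.doc?versionId=' . $versions[1]['VersionId'],
        ]);
    }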
Try Microsoft SkyDrive or Google Drive or Dropbox

How can I create an Amazon S3 clone on my host?

I am currently building a storage service, but I am very small and don't want to set up or pay for an Amazon S3 account. I already have my own hosting service which I want to use. However, I want to make it simple to switch to Amazon S3 if the need arises. Therefore, I would like to have essentially an S3 'clone' on my server, which I can simply redirect to Amazon's servers at a later time. Is there any package that can do this?
EDIT: I am on a shared server where I cannot install software, so is there a simple PHP page that can do this?
Nimbus allows for that. From the FAQ:
Cumulus is an open source implementation of the S3 REST API. Some features such as versioning and COPY are not yet implemented, but some additional features are added, such as file system usage quotas.
http://www.nimbusproject.org/doc/nimbus/faq/#what-is-cumulus
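One practical way to keep the "switch to Amazon later" option open is to write against the S3 API from day one and just point the client at your own host. A sketch with the AWS SDK for PHP, assuming an S3-compatible service such as Cumulus is running at a placeholder hostname:

    <?php
    // Sketch: the same S3 client code works against an S3-compatible
    // clone; moving to real S3 later just means dropping 'endpoint'.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'version'                 => 'latest',
        'region'                  => 'us-east-1',
        'endpoint'                => 'https://storage.example.com', // your own host
        'use_path_style_endpoint' => true, // most S3 clones need path-style URLs
        'credentials'             => [
            'key'    => 'LOCAL_KEY',
            'secret' => 'LOCAL_SECRET',
        ],
    ]);

    $s3->putObject([
        'Bucket'     => 'backups',
        'Key'        => 'clients/1/archive.zip',
        'SourceFile' => '/tmp/archive.zip',
    ]);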
http://www.ubuntu.com/cloud
You would need several host computers, but it works well.
I too run a backup service, but it's based on several 48TB RAID arrays.

Syncing data and images with a client in a web application

I'm writing a web application in PHP which needs to store images and image meta data. In future, the application may need to work offline on the client. A user might need to download all the images and data to his laptop before going to a remote area without internet access. Whilst at the remote location the user could add new images to the system and be able to compare them with his local copy of the image database. When returning to an area with internet access, the user would run a sync operation which would copy his new images to the server and retrieve any new ones.
I've looked at the new Web Storage / localStorage options in HTML5 (Web SQL Database seems to have been dropped) and I think this is going to be too limited, as there is only 5MB of space and one or two images could easily exceed that.
Is what I want to do actually possible/practical with a browser-based web application? Or should I be looking at writing a desktop/tablet application with local file storage capabilities for users without net access? Initially it does need to be a web application; I'm just trying to think ahead. Will I give myself more options in future by using something like CouchDB for the backend from the start? As I understand it, it comes with good syncing functionality.
Thanks,
I decided to use Titanium Desktop.
http://www.appcelerator.com/products/titanium-desktop-application-development/
