Google Compute Engine - PHP upload files to bucket

I created a Google Cloud Platform account, deployed the LAMP server, and logged in through SFTP and uploaded my existing site.
My site is currently running on another server, so I also uploaded all my databases and everything.
On my current HostGator server I upload files to the /home/username/public_html/uploads folder, but on Google you can upload to buckets, so I figured I would use their Cloud Storage instead.
I tried doing a PHP move_uploaded_file() to the bucket (gs://mybucket/uploads); however, that's not working.
What's the trick? The only documentation I can find is for App Engine, but I'm just using Compute Engine with LAMP installed.
Also, which would be better? Should I stick with saving the uploads to the LAMP server, or should I use buckets?
Thanks!

You could use the JSON API directly:
https://cloud.google.com/storage/docs/json_api/v1/how-tos/upload
There is also a PHP library for using the Google APIs in general:
https://developers.google.com/api-client-library/php/start/get_started
Then you could use that library to insert objects into your bucket with a call like this:
https://cloud.google.com/storage/docs/json_api/v1/objects/insert
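For example, here is a minimal sketch using that library (the google/apiclient Composer package). The bucket name mybucket comes from the question; the service-account key path is a placeholder:

```php
<?php
// A sketch using the Google APIs client library (composer require google/apiclient).
// "mybucket" comes from the question; the key file path is a placeholder.
require 'vendor/autoload.php';

$client = new Google_Client();
$client->setAuthConfig('/path/to/service-account.json'); // placeholder credentials
$client->addScope(Google_Service_Storage::DEVSTORAGE_READ_WRITE);

$storage = new Google_Service_Storage($client);

$object = new Google_Service_Storage_StorageObject();
$object->setName('uploads/' . basename($_FILES['file']['name']));

// objects->insert performs the JSON API insert call linked above.
$storage->objects->insert('mybucket', $object, [
    'data'       => file_get_contents($_FILES['file']['tmp_name']),
    'uploadType' => 'media',
    'name'       => $object->getName(),
]);
```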

If you're already running a LAMP stack, it's fine to serve files directly from the VM itself, using its local disk; you don't need to start using Google Cloud Storage for typical, simple use cases.
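For reference, the local-disk route is just the standard move_uploaded_file() flow from the question; a minimal sketch, with a placeholder uploads path:

```php
<?php
// The standard local-disk flow on the VM; the uploads directory is a
// placeholder and must be writable by the web server user (e.g. www-data).
$dest = '/var/www/html/uploads/' . basename($_FILES['file']['name']);

if (move_uploaded_file($_FILES['file']['tmp_name'], $dest)) {
    echo 'Stored at ' . $dest;
} else {
    http_response_code(500);
    echo 'Upload failed';
}
```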
Google Cloud Storage, which is where the bucket terminology comes from, is useful for storing lots of files or serving a very large site, whether for processing via Hadoop or for serving a global audience. Since it involves more work on your part, it makes sense once your site is popular enough that serving data directly from the VM costs more than using Google Cloud Storage.
Google Cloud Storage also scales much further than disks on a VM: you can attach additional disks, but you're limited to 10 TB of persistent disk per VM, and attaching them is not an automatic process. Whether that matters depends on how much space your use case needs.
Bottom line: if your site is working fine with the LAMP stack on your VM, it's fine to keep it. It's a good idea to keep track of your total cost of ownership (TCO): keep an eye on the cost of your compute, storage, and network traffic, and see at which point it may make sense to move some of your assets to Google Cloud Storage.
You can use the Google Cloud Platform Pricing Calculator to estimate your costs, but also take a look at pricing for Google Compute Engine (which includes compute, storage, and networking) as well as Google Cloud Storage, and see how your particular use case works out.

Related

Creation of large files in Google App Engine flexible using PHP

I have a project that needs to output hundreds of photos from one template file, compress them into a .zip file, and push the archive to the customer's browser. After that, the .zip file can be deleted.
Google App Engine (PHP) does not allow you to write files like you would in a standard web server.
How can this be accomplished with GAE flexible?
As you already know, App Engine flexible does not let you write files the way you would on a standard web server, even though it runs on VMs. The reason is that your app runs in one or more Docker containers, so you have no guarantee that you will find the file again on a later request.
An alternative is to change your workflow a bit and use Cloud Storage as an intermediary: send the photos directly to Cloud Storage, and users will be able to download them directly from Cloud Storage. There is a guide on how to achieve this from App Engine flex for PHP, and a sketch follows below.
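A minimal sketch of that workflow, assuming the google/cloud-storage Composer package and a placeholder bucket name. ZipArchive needs a real file path, so the archive is staged on the container's ephemeral disk and immediately streamed to the bucket (which is exactly why Cloud Storage is the intermediary here):

```php
<?php
// A sketch assuming the google/cloud-storage package and a placeholder bucket.
// ZipArchive needs a real file path, so the archive is staged on the
// container's ephemeral disk and immediately streamed to Cloud Storage.
require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

$storage = new StorageClient();
$storage->registerStreamWrapper(); // enables gs:// paths in PHP file functions

$zipPath = sys_get_temp_dir() . '/photos.zip';
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
foreach (glob(sys_get_temp_dir() . '/photos/*.jpg') as $photo) { // hypothetical source files
    $zip->addFile($photo, basename($photo));
}
$zip->close();

// Copy the finished archive into the bucket, then drop the local copy.
copy($zipPath, 'gs://my-bucket/downloads/photos.zip');
unlink($zipPath);
```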

Google Cloud Platform files location

I wanted to move a website from a shared server to Google Cloud, but I cannot wrap my head around it. Before giving up completely, I decided to ask this question:
I already completed the Hello World tutorial (https://cloud.google.com/php/getting-started/hello-world). But what if I want to update the index.html file? Where would I find it?
I was expecting to see it in one of the Storage buckets, but that's not the case... not even after setting up Kubernetes Engine.
If you decide to use Google App Engine Flexible (as the hello world sample app that you linked to does), you need to understand the additional layer of abstraction it places over your server(s). App Engine Flexible is designed to make things easier for you: you focus on your code on your local machine, where you modify and update it, and then with one command (gcloud app deploy) you instruct App Engine to do one of the following:
- start a VM (your server) and a Docker container with your app in it, if one is not already running
- if you are updating an existing app, update the code in the VM that is your server; if your app receives a lot of traffic, you may have more than one container and VM running, and all of them will get updated
Both things are presented schematically in the image in this section.
This way you can develop your app locally and not worry about actually getting inside the server with, e.g., SSH. Your code is there in those VM(s) and App Engine manages it for you (however, if you really need to, it is still possible to SSH into the VM in the App Engine flex environment).
If you have a static website, it can be hosted in Storage buckets, which is a different scenario. However, since you're using PHP, I assume it's more likely that your website is dynamic.

Online file storage logic

I want to develop a file-manager-style website, where users register and get a fixed amount of disk space, let's say 20 MB.
Users can then upload their PDF, DOC, TXT, JPEG, etc. files up to their disk limit.
I can develop this much using PHP.
Now below are my issues:
1) If a user's files get corrupted, they should be able to roll their folders back to a version from 2-3 days earlier.
Files must be secure and safe from viruses, as users are uploading their important documents.
Is there any third-party storage service that provides such a facility?
2) Also, all files should be previewable in the browser.
I am using the Google Docs viewer. Is it a good and safe way to preview files in the browser?
But Google links are accessible to everyone; I need to add some restrictions so that a file can be viewed only by its owner.
I know it's a major task, but I just need some sort of logic. Please share your thoughts.
Thanks.
Any cloud storage service can be used for this; you'll get disk space. There is no storage server that provides a revision control system for this out of the box. You can use Git or SVN for it, though as the files are binary you won't get the full benefit of those tools.
How files are previewed is up to you. If you use PHP, you build the site, and at the backend you use the API to interact with the storage service. Google Docs is not an option for this if you use PHP. Also note that Google links can be made private.
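For the owner-only restriction, a minimal sketch in plain PHP (the storage path layout and session key are placeholders): instead of linking to files directly, stream them through a gatekeeper script that checks ownership first:

```php
<?php
// preview.php - a gatekeeper sketch: files are streamed through this script
// instead of being linked directly, so only the owner can view them.
// The storage path layout and session key are placeholders.
session_start();

$file = basename($_GET['f'] ?? '');   // basename() blocks path traversal
$path = '/var/files/' . ($_SESSION['user_id'] ?? '') . '/' . $file;

if (empty($_SESSION['user_id']) || !is_file($path)) {
    http_response_code(403);
    exit('Not allowed');
}

header('Content-Type: ' . mime_content_type($path));
header('Content-Length: ' . filesize($path));
readfile($path);
```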
I suggest this:
- Find a cloud storage service and use that storage from your server. Any will do.
- Create the UI using PHP, and control access using PHP too.
- Manipulate files on your server directly, or on the third-party storage server via its API.
- Use a revision control system to track changes, and use its API from the PHP end.
Some cloud storage services:
- Amazon S3 (it also supports versioning; see the sketch after this list)
- Google Cloud Storage
- Microsoft Azure
Try Microsoft SkyDrive or Google Drive or Dropbox
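Here is a minimal sketch of the S3 versioning point above, assuming the aws/aws-sdk-php package; the bucket name, region, and object keys are placeholders. Once versioning is enabled, every overwrite keeps the old copy, which covers the 2-3 day rollback requirement:

```php
<?php
// A sketch of S3 versioning with aws/aws-sdk-php; bucket, region, and keys
// are placeholders. With versioning on, every overwrite keeps the old copy.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

// One-time setup: turn on versioning for the bucket.
$s3->putBucketVersioning([
    'Bucket'                  => 'user-files',
    'VersioningConfiguration' => ['Status' => 'Enabled'],
]);

// List the stored versions of one user's file (newest first) ...
$versions = $s3->listObjectVersions([
    'Bucket' => 'user-files',
    'Prefix' => 'user42/report.pdf',
]);

// ... and roll back by copying an older version over the current object.
$previous = $versions['Versions'][1]['VersionId'];
$s3->copyObject([
    'Bucket'     => 'user-files',
    'Key'        => 'user42/report.pdf',
    'CopySource' => 'user-files/user42/report.pdf?versionId=' . $previous,
]);
```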

How can I create an Amazon S3 clone on my host?

I am currently building a storage service, but I am very small and don't want to set up or pay for an Amazon S3 account. I already have my own hosting service, which I want to use. However, I want to make it simple to switch to Amazon S3 if the need arises. Therefore, I would like to have essentially an S3 'clone' on my server, which I can simply redirect to Amazon's servers at a later time. Is there any package that can do this?
EDIT: I am on a shared server where I cannot install software, so is there a simple PHP page that can do this?
Nimbus allows for that, via its Cumulus component. From the FAQ:
Cumulus is an open source implementation of the S3 REST API. Some features such as versioning and COPY are not yet implemented, but some additional features are added, such as file system usage quotas.
http://www.nimbusproject.org/doc/nimbus/faq/#what-is-cumulus
http://www.ubuntu.com/cloud
You would need several host computers, but it works well.
I too have a backup service, but it's based on several 48 TB RAID arrays.
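As for the "simple PHP page" from the edit: a full S3 clone isn't realistic on a shared host since real S3 signs every request, but a minimal single-page sketch that mimics the S3 path convention (PUT/GET/DELETE on /bucket/key) could look like this; the storage root is a placeholder and there is no authentication:

```php
<?php
// storage.php - a single-page sketch that mimics the S3 path convention
// (PUT/GET/DELETE on /bucket/key). No request signing or auth here, which
// real S3 requires; the storage root is a placeholder.
$root = __DIR__ . '/storage';
$key  = str_replace('..', '', trim($_SERVER['PATH_INFO'] ?? '', '/'));
$path = $root . '/' . $key;

switch ($_SERVER['REQUEST_METHOD']) {
    case 'PUT':    // upload: PUT /bucket/key with the file as the request body
        if (!is_dir(dirname($path))) {
            mkdir(dirname($path), 0775, true);
        }
        file_put_contents($path, file_get_contents('php://input'));
        http_response_code(200);
        break;
    case 'GET':    // download: GET /bucket/key
        if (!is_file($path)) {
            http_response_code(404);
            break;
        }
        header('Content-Type: application/octet-stream');
        readfile($path);
        break;
    case 'DELETE': // remove: DELETE /bucket/key
        if (is_file($path)) {
            unlink($path);
        }
        http_response_code(204);
        break;
    default:
        http_response_code(405);
}
```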

Storing and retrieving user files on a different server / sub domain

I'm looking for some quick info about best practices for storing users' uploaded files on different servers or subdomains...
For example, photos on Facebook of course aren't on facebook.com/files/users/453485 etc.,
but rather on photos.ak.fbcdn.net or whatever...
I'm wondering how, with PHP, I can upload to a different server while maintaining a MySQL connection to my original one... is it possible?
Facebook uses a content delivery network (CDN, hence fbcdn: Facebook Content Delivery Network) and probably uses web services to pass binary data (photos) from server to server.
Rackspace Cloud offers a similar service. Here is an example application of their PHP library to access their webservice api: http://cloudfiles.rackspacecloud.com/index.php/Sample_PHP_Application
I'm going to make the assumption that you have multiple webservers, and want to be able to access the same set of files on each one. In that case, some sort of shared storage that each machine can access might be a good place to start.
Here are a couple options I've used:
Shared NFS Volume [ http://en.wikipedia.org/wiki/Network_File_System_(protocol) ]
MogileFS [ http://www.danga.com/mogilefs/ ]
Amazon S3 [ http://aws.amazon.com/s3/ ]
If you don't have control over the hardware or aren't able to install extra software, I'd suggest Amazon S3. There is an API that you can use to shuttle files back and forth. The only downside is that you don't get to use storage you may already have, and it will cost you some money.
If you do have access to the hardware and software, MogileFS is somewhat like S3 in that you have an API to access the files, but differs in that you get to use your existing storage, and at no additional cost.
NFS is a typical place where people start, because it's the simplest way to get going. The downside is that you'll have to be able to configure servers and set up an NFS volume for them to mount.
But if I were starting a high-volume photo hosting service, I'd use S3, and I'd put a CDN like Akamai in front of it.
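To address the original PHP question directly: yes, it's possible, because the upload to the storage server and the MySQL connection are independent. A minimal sketch assuming the aws/aws-sdk-php package; the bucket, table, and credentials are placeholders:

```php
<?php
// A sketch showing that the S3 upload and the MySQL connection are
// independent (aws/aws-sdk-php; bucket, table, and credentials are placeholders).
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);
$db = new mysqli('localhost', 'user', 'pass', 'mysite'); // the original database

$userId = 453485; // e.g. the logged-in user's id
$key    = 'users/' . $userId . '/' . basename($_FILES['photo']['name']);

// Push the upload to S3 instead of move_uploaded_file() onto the local disk.
$s3->putObject([
    'Bucket'     => 'my-photo-bucket',
    'Key'        => $key,
    'SourceFile' => $_FILES['photo']['tmp_name'],
]);

// The MySQL connection is untouched: just record where the file now lives.
$stmt = $db->prepare('INSERT INTO photos (user_id, s3_key) VALUES (?, ?)');
$stmt->bind_param('is', $userId, $key);
$stmt->execute();
```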
