Creation of large files in Google App Engine flexible using PHP

I have a project that needs to output hundreds of photos from one template file, compress them into a .zip file, and push the archive to the customer's browser. After that, the .zip file can be deleted.
Google App Engine (PHP) does not allow you to write files like you would in a standard web server.
How can this be accomplished with GAE flexible?

As you already know, App Engine flexible does not let you reliably persist files on the local filesystem, even though it runs on a VM. The reason is that your app runs in multiple Docker containers, so you have no guarantee that a later request will hit the container where you wrote the file.
An alternative is to change your workflow a bit and use Cloud Storage as an intermediary. You can send the photos (or the finished .zip) directly to Cloud Storage, and the users will be able to download them directly from Cloud Storage. Here you have a guide on how to achieve this from App Engine flex for PHP.
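As a minimal sketch of that workflow (assuming the google/cloud-storage Composer package and a placeholder bucket name), you can assemble the .zip in the instance's temp directory, which is writable, upload it to Cloud Storage, and redirect the customer to a short-lived signed URL:

<?php
// Sketch only: "my-bucket" and $photoFiles are placeholders.
require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

// Build the archive in /tmp, which is writable on App Engine flexible.
$zipPath = tempnam(sys_get_temp_dir(), 'photos') . '.zip';
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE);
foreach ($photoFiles as $photo) {          // $photoFiles: your generated images
    $zip->addFile($photo, basename($photo));
}
$zip->close();

// Move the archive to Cloud Storage and drop the local copy.
$storage = new StorageClient();
$object = $storage->bucket('my-bucket')
    ->upload(fopen($zipPath, 'r'), ['name' => 'downloads/photos.zip']);
unlink($zipPath);

// Hand the customer a signed URL that expires, so the .zip stays private.
header('Location: ' . $object->signedUrl(new \DateTime('+15 minutes')));

The signed URL also takes care of the cleanup concern: you can set a lifecycle rule on the bucket to delete objects under downloads/ after a day.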

Related

PHP files on Google cloud storage without AppEngine

I'm trying to link PHP files on Google Cloud Storage, but they are served as plain text. I want to include the other file, but without App Engine; all the documentation on this is written exclusively for App Engine, and I can't even find whether bucket storage supports PHP at all, even though it seems like it should. Any clue how to link a PHP file correctly on Cloud Storage?
Google Cloud Storage is an object storage service: it stores your objects (files) and returns them when you ask for them. Static files like HTML, JavaScript, and CSS that render inside the browser can be placed on Cloud Storage to improve loading time. But a PHP file needs an execution engine to run its code; placing it on storage and requesting it will only get you the file's contents.

How to manipulate cloud server files using Laravel?

We have developed our application in Laravel, and now we are planning to move it to Amazon, where we have to separate the application logic from file storage. Basically, we want to move our whole application storage to a cloud service (Amazon S3) and the application logic to an Amazon EC2 server.
In our system we manipulate many storage files locally (resize images, merge images, make thumbnails from videos, etc.). We will not store any files on the application server once we migrate. So our concern is: how can we manipulate files on the cloud server?
Earlier, all files were on the application server, so file manipulation was easy to process. After migrating the whole storage to the cloud, how can we manipulate files that live on the cloud server while the manipulation logic resides on the application server?
Any response will be helpful.
Thanks in advance...
To manipulate an S3 file, I think you first need to download the file locally. Once you have the file locally, you can apply any operation to that particular file and delete the local copy afterwards.
Here are the documents on directly uploading from, or downloading to, a local file using Amazon S3:
https://aws.amazon.com/blogs/developer/transferring-files-to-and-from-amazon-s3/
https://docs.aws.amazon.com/aws-sdk-php/v3/guide/
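For illustration, a minimal sketch of that round trip with the Laravel Storage facade (assuming the s3 disk is configured in config/filesystems.php; the paths are placeholders):

<?php
use Illuminate\Support\Facades\Storage;

// 1. Pull the object down from S3 to a local temp file.
$tmp = tempnam(sys_get_temp_dir(), 'img');
file_put_contents($tmp, Storage::disk('s3')->get('uploads/photo.jpg'));

// 2. Manipulate it locally, e.g. a 200px-wide thumbnail with GD.
$thumb = imagescale(imagecreatefromjpeg($tmp), 200);
imagejpeg($thumb, $tmp);

// 3. Push the processed file back to S3 and delete the local copy.
Storage::disk('s3')->put('thumbnails/photo.jpg', file_get_contents($tmp));
unlink($tmp);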
Thanks

How can I upload a file to the Azure wwwroot folder

I use this code:
move_uploaded_file($file_tmp,$us_id.".jpg");
but after running this script there is no error, yet the file does not appear in the folder. How can I fix this?
Before this, I tested it on localhost and it worked.
You haven't specified what $file_tmp contains, but... in an Azure Web App, the root folder is at d:\home\site\wwwroot. And you'll find the %HOME% environment variable set to d:\home.
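So a likely fix is to build an absolute destination path from that variable instead of relying on the relative working directory; a small sketch (the uploads subfolder is a placeholder and must already exist):

// HOME is set to d:\home on Azure Web Apps.
$target = getenv('HOME') . '/site/wwwroot/uploads/' . $us_id . '.jpg';
if (!move_uploaded_file($file_tmp, $target)) {
    error_log("Upload failed, check the path and permissions: $target");
}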
Edited based on @David's comments and Kudu's Azure runtime environment.
In a Cloud environment, saving files to the current filesystem of your Web App is not advised. If you simply upload your site through FTP, you might not have issues, but if you rely on Continuous Integration or automated deployment scenarios, your site folder might change or have content overwritten.
That is why, for storage of files that need to be accessed in the future or need to be permanently saved, you might want to use something like Azure Blob Storage. Not only is it really cheap, but you can also put a CDN in front of it to improve file delivery.
Here is how to use it from PHP with the Azure SDK for PHP.
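As a rough sketch of that approach (assuming the microsoft/azure-storage-blob Composer package, a pre-created container, and a connection string in an environment variable, all placeholders):

<?php
require 'vendor/autoload.php';

use MicrosoftAzure\Storage\Blob\BlobRestProxy;

// Upload the received file to a blob container instead of wwwroot.
$client = BlobRestProxy::createBlobService(getenv('AZURE_STORAGE_CONNECTION_STRING'));
$client->createBlockBlob(
    'uploads',              // container name (must already exist)
    $us_id . '.jpg',        // blob name
    fopen($file_tmp, 'r')   // stream the uploaded temp file
);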
I tried the code from your question ("Azure window server can't upload file from php") in my test project, and it works perfectly on my side; even when I don't set $us_id, the picture is still uploaded to the uploadimg folder with the name .jpg.
I suspect your Web App's disk space may have reached the limit of your App Service plan.
Every App Service pricing tier has a disk space limit, which is shared by all the web apps in that App Service plan. You can check the metric on the dashboard page of your web app in the portal.
You can refer to https://azure.microsoft.com/en-us/pricing/details/app-service/ for details.

Google Compute Engine - PHP upload files to bucket

I created a Google Cloud Platform account, deployed the LAMP server, and logged in through SFTP and uploaded my existing site.
My site is currently running on another server, so I also uploaded all my databases and everything.
On my current server with HostGator I upload files to the /home/username/public_html/uploads folder, but on Google you can upload to buckets, so I figured I would use their Cloud Storage instead.
I tried doing a PHP move_uploaded_file() to the bucket (gs://mybucket/uploads); however, that's not working.
What's the trick? The only documentation I can find is for App Engine, but I'm just using Compute Engine with LAMP installed.
Also, which would be better? Should I stick with saving the uploads to the LAMP server, or should I use buckets?
Thanks!
You could use the JSON API directly:
https://cloud.google.com/storage/docs/json_api/v1/how-tos/upload
There is also a PHP library for using the Google APIs in general:
https://developers.google.com/api-client-library/php/start/get_started
Then you could use that library to insert objects into your bucket with a call like this:
https://cloud.google.com/storage/docs/json_api/v1/objects/insert
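As a rough illustration of the JSON API route, here is a simple (non-resumable) media upload via cURL; the bucket name and the way you obtain $accessToken (normally from a service account) are placeholders:

<?php
$bucket = 'mybucket';
$name   = 'uploads/' . basename($_FILES['photo']['name']);
$url    = "https://storage.googleapis.com/upload/storage/v1/b/$bucket/o"
        . '?uploadType=media&name=' . urlencode($name);

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => file_get_contents($_FILES['photo']['tmp_name']),
    CURLOPT_HTTPHEADER     => [
        "Authorization: Bearer $accessToken",  // obtained from a service account
        'Content-Type: image/jpeg',
    ],
    CURLOPT_RETURNTRANSFER => true,
]);
$response = curl_exec($ch);
curl_close($ch);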
If you're already running a LAMP stack, it's fine to serve files directly from the VM itself, using its local disk; you don't need to start using Google Cloud Storage for typical, simple use cases.
Google Cloud Storage, which is where the bucket terminology comes from, is useful for storing lots of files or serving a very large site, whether for processing via Hadoop or for serving a global audience. Because it involves more work on your part, it makes sense once your site is popular enough that serving data directly from the VM costs more than using Google Cloud Storage.
Google Cloud Storage also scales much further than disks on a VM (you can attach additional disks, but you're limited to 10TB of persistent disk per VM, and attaching them is not automatic); whether that matters depends on how much space your use case needs.
Bottom line: if your site is working fine with the LAMP stack on your VM, it's fine to keep it. It's a good idea to track your total cost of ownership (TCO): keep an eye on the cost of compute, storage, and network traffic, and see at which point it makes sense to move some of your assets to Google Cloud Storage.
You can use Google Cloud Platform Pricing Calculator to estimate your costs, but also take a look at pricing for Google Compute Engine (which includes compute, storage, and networking) as well as Google Cloud Storage, and see how your particular use case will work out.

Online file storage logic

I want to develop a website that works like a file manager, where users register and get a fixed amount of disk space, let's say 20MB.
Users can then upload their pdf, doc, txt, jpeg, etc. files up to their disk limit.
I can develop up to this point using PHP.
Now, below are my issues:
1) If users' files are corrupted, they should be able to roll their folders back 2-3 days.
Files must be secure and safe from viruses, as users are uploading their important documents.
Is there any 3rd-party storage server that provides such a facility?
2) Also, all files should be previewable in the browser.
I am using the Google Docs viewer. Is it a good and safe way to preview files in the browser?
But Google links are accessible to everyone; I need to add restrictions so that a file can be viewed only by its owner.
I know it's a major task, but I just need some sort of logic. Please share your thoughts.
Thanks.
Any cloud storage service can be used for this; you'll get the disk space. No storage service provides a full revision control system by itself (though S3's versioning, mentioned below, comes close). You can use git or svn for this, though as the files are binary you will not get the full benefit of those tools.
How files are previewed is up to you. If you use PHP, you build the site and, at the backend, use the API to interact with the storage service. The Google Docs viewer is not an option for this if you use PHP. Also note that Google links can be made private.
I suggest this:
Find a cloud storage service and use its storage from your server. Any will do.
Create the UI using PHP, and control the access using PHP too (see the sketch after this list).
Manipulate files on your server directly, or on the 3rd-party storage server via its API.
Use a revision control system to track the changes, and use its API from the PHP end.
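To make the owner-only restriction concrete, here is a sketch of the PHP access control: the file is never linked directly; instead a script checks the session and streams the content (getFileOwner and fetchFromStorage are hypothetical helpers for your database and storage API):

<?php
session_start();

$fileId = (int) $_GET['id'];
$owner  = getFileOwner($fileId);   // hypothetical DB lookup

if (!isset($_SESSION['user_id']) || $_SESSION['user_id'] !== $owner) {
    http_response_code(403);
    exit('Forbidden');
}

// Fetch from the storage backend and stream it to the owner.
header('Content-Type: application/octet-stream');
echo fetchFromStorage($fileId);    // hypothetical storage API call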
Some cloud storage services:
Amazon S3. It also supports Versioning (a rollback sketch follows this list).
Google Cloud Storage
Microsoft Azure
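For the rollback requirement, a sketch with S3 versioning (assuming the aws/aws-sdk-php package and a versioning-enabled bucket; names are placeholders). It finds the newest version older than the corruption window and copies it back over the current object:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

$versions = $s3->listObjectVersions([
    'Bucket' => 'my-bucket',
    'Prefix' => 'users/42/report.doc',
]);

// Versions come back newest first; take the first one older than 2 days.
foreach ($versions['Versions'] as $v) {
    if ($v['LastModified'] < new DateTime('-2 days')) {
        $s3->copyObject([
            'Bucket'     => 'my-bucket',
            'Key'        => 'users/42/report.doc',
            'CopySource' => "my-bucket/users/42/report.doc?versionId={$v['VersionId']}",
        ]);
        break;
    }
}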
Try Microsoft SkyDrive or Google Drive or Dropbox
