We have a zip file containing a large JSON file that we need to unzip. We currently fetch the zip in a Lambda using Laravel, copy and unzip it locally (in the Lambda), and then upload the JSON file back to S3 for processing.
However, the Lambda doesn't seem to cope with large files (around 180 MB) when moving the unzipped JSON back to S3. We could keep increasing the Lambda's resources (it's deployed via Laravel Vapor), but we're looking for an in-memory option that could perhaps unzip in a streaming fashion.
Is there such a solution for PHP?
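For reference, here is a minimal sketch of a lower-memory variant of that pipeline, using the plain AWS SDK for PHP rather than anything Vapor-specific (the bucket, key and entry names are placeholders): copy the zip to the Lambda's /tmp, open it with ZipArchive, and feed the entry's read stream into the SDK's multipart uploader so the uncompressed JSON never has to fit in memory.

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;

$s3 = new S3Client(['region' => 'eu-west-1', 'version' => 'latest']);

// 1. Copy the zip to the Lambda's /tmp (ephemeral disk, not PHP memory).
$localZip = '/tmp/archive.zip';
$s3->getObject([
    'Bucket' => 'my-bucket',
    'Key'    => 'incoming/archive.zip',
    'SaveAs' => $localZip,
]);

// 2. Open the archive and get a read stream for the JSON entry.
$zip = new ZipArchive();
if ($zip->open($localZip) !== true) {
    throw new RuntimeException('Unable to open zip');
}
$entryStream = $zip->getStream('data.json'); // decompressed lazily as it is read

// 3. Multipart-upload the stream; only one part (a few MB) is buffered at a time.
$uploader = new MultipartUploader($s3, $entryStream, [
    'bucket' => 'my-bucket',
    'key'    => 'processed/data.json',
]);
$uploader->upload();

$zip->close();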
Related
I want users to upload large files (like HD videos and big PDFs) and am using this line of code to upload them: Storage::disk('s3')->putFile('uploads', new File($request->file('file_upload')));
The problem is that even though I have a good internet connection, it takes a very long time to upload a file. What can I do to get faster file uploads?
There are actually two network calls involved in the process.
From the client side, the file gets uploaded to your server.
Via a server-to-server call, the file then gets uploaded from your server to S3.
The only way to reduce the delay is to upload the files directly from the client to S3 using the client-side SDKs, securely. With this, the files are stored straight into the S3 bucket.
Once the files are uploaded to S3 via the AWS S3 client-side SDKs, you can post the file's attributes along with its download URL to Laravel and save them to the database.
A plus point of this approach is that it lets you show the actual upload progress on the client side.
This can be done via the AWS Amplify library, which provides great integration with S3: https://docs.amplify.aws/start
The other options:
JS: https://softwareontheroad.com/aws-s3-secure-direct-upload/
Android: https://grokonez.com/android/uploaddownload-files-images-amazon-s3-android
iOS: https://aws.amazon.com/blogs/mobile/amazon-s3-transfer-utility-for-ios/
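One common way to implement the direct client-to-S3 upload on the Laravel side is a pre-signed PUT URL, so the browser never needs AWS credentials. This is only a sketch: the route, the 15-minute expiry and the key layout are assumptions, not part of the answer above.

use Aws\S3\S3Client;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Str;

Route::post('/uploads/sign', function (Request $request) {
    $s3 = new S3Client([
        'region'  => config('filesystems.disks.s3.region'),
        'version' => 'latest',
    ]);

    $key = 'uploads/' . Str::uuid() . '/' . $request->input('filename');

    $command = $s3->getCommand('PutObject', [
        'Bucket'      => config('filesystems.disks.s3.bucket'),
        'Key'         => $key,
        'ContentType' => $request->input('content_type'),
    ]);

    $url = (string) $s3->createPresignedRequest($command, '+15 minutes')->getUri();

    // The client PUTs the file to $url, then posts $key back so it can be stored in the DB.
    return response()->json(['url' => $url, 'key' => $key]);
});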
Please use this
$file_name = $request->file('name');
$disk = Storage::disk('s3');
// Passing a stream lets the SDK upload the file in chunks instead of loading it all into memory.
$disk->put($filePath, fopen($file_name, 'r'));
Instead of
Storage::disk('s3')->put($filePath, file_get_contents($file_name));
Also increase these PHP settings: post_max_size, upload_max_filesize, max_execution_time and memory_limit.
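For example, in php.ini (illustrative values only; size them to the largest upload you expect):

upload_max_filesize = 512M
post_max_size = 512M
memory_limit = 256M
max_execution_time = 300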
I want to transcode videos to 360p, 480p and 720p and then upload them to Amazon S3.
Currently we are using the PHP FFMPEG library.
I have successfully transcoded videos on my server, but I don't understand how to achieve the same on Amazon S3.
Do I need to upload the original video to S3 first, then fetch it, transcode it into the different formats and send those back to Amazon S3? Is that possible?
Or if there is any other way, please suggest it.
S3 is not a block file system, it is an object store. The difference is that you normally can't mount an S3 bucket like a standard Unix file system and work on files with fopen(), fwrite(), etc. Some tricks exist to treat S3 like any other file system, but I would suggest another option:
Transcode the video on a locally mounted file system (like AWS EFS, or local disk), then push (upload) the whole transcoded video to the S3 bucket. Of course, you can improve this process in many ways (remove temp files, do the work in parallel, use the Lambda service, or run tasks in containers...). You should avoid doing many uploads/downloads to or from S3, because that is time- and cost-consuming. Use local storage as much as possible, then push the resulting data to S3 once it is ready.
AWS also has a managed service for video transcoding: https://aws.amazon.com/en/elastictranscoder/
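A minimal sketch of that flow, assuming the PHP-FFMpeg package and the AWS SDK for PHP (paths, bucket name and resolution are placeholders; repeat the resize/save/upload steps for 480p and 720p):

use Aws\S3\S3Client;
use FFMpeg\FFMpeg;
use FFMpeg\Coordinate\Dimension;
use FFMpeg\Format\Video\X264;

// Transcode on local/EFS storage first (ffmpeg/ffprobe must be on the PATH).
$ffmpeg = FFMpeg::create();
$video  = $ffmpeg->open('/local/videos/original.mp4');
$video->filters()
      ->resize(new Dimension(640, 360))
      ->synchronize();
$video->save(new X264('aac'), '/local/videos/original-360p.mp4');

// Only the finished file touches S3: a single upload, no intermediate round trips.
$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);
$s3->putObject([
    'Bucket'     => 'my-video-bucket',
    'Key'        => 'transcoded/original-360p.mp4',
    'SourceFile' => '/local/videos/original-360p.mp4',
]);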
I want to download all blobs from a specified container as a zip file.
Is there any way to download them as a zip directly from Azure, without processing them on my server?
Currently I'm thinking of something like this:
file_put_contents("file_name", file_get_contents($blob_url));
I would store all the files on my server, then create a zip file of them, and then force the download.
Azure has no facility to generate a zip file from a bundle of blobs for you. Azure Storage is just... storage. You'll need to download each of your blobs via the PHP SDK (or directly via the API if you so choose).
And if you want the content zipped, you'll need to zip the content before storing it in blob storage.
Your code (in your question) won't work as-is, since file_get_contents() against the blob URL is not how you interact with blobs through the SDK. Rather, you'd do something like this:
// Download a single blob and write its content stream to a local file.
$getBlobResult = $blobClient->getBlob("<containername>", "<blobname>");
file_put_contents("<localfilename>", $getBlobResult->getContentStream());
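A slightly fuller sketch of the same idea, assuming the microsoft/azure-storage-blob SDK (the container and file names are placeholders): list the blobs, download each one, and add it to a local zip with ZipArchive before serving that zip to the user.

use MicrosoftAzure\Storage\Blob\BlobRestProxy;

$blobClient = BlobRestProxy::createBlobService(getenv('AZURE_STORAGE_CONNECTION_STRING'));

$zip = new ZipArchive();
$zip->open('/tmp/container-export.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);

foreach ($blobClient->listBlobs('<containername>')->getBlobs() as $blob) {
    $result = $blobClient->getBlob('<containername>', $blob->getName());
    // Each blob is buffered only long enough to be written into the archive.
    $zip->addFromString($blob->getName(), stream_get_contents($result->getContentStream()));
}

$zip->close();
// /tmp/container-export.zip can now be returned to the browser as a download.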
I'm using phpseclib to connect to an SFTP server from PHP.
I need to get zip files from the SFTP server. Those zip files contain XML and JPG files. I need to extract the data from the XML files and the stream from the JPG files, and then save everything to the database. I can't download the zip files because I don't have write permissions.
Can I get the stream content of the remote files (zip, XML and JPG)? Note that I'm using phpseclib.
You can get the stream content; however, for the ZIP you would need a library that operates on a stream of the ZIP format (I do not know which of the existing libraries can do that out of the box; the ones I know can't, perhaps pclzip).
Then you need to operate on in-memory streams from the ZIP for the XML and JPG files, which again requires stream-compatible libraries; most likely you will have to write them yourself in PHP, or at least invest a fair amount of time to make this work.
But yes, it's perfectly possible. And (hopefully) you can do it.
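If writing a temporary file on your own server is acceptable, a sketch along these lines works, assuming phpseclib 3 (host, credentials and entry names are placeholders): pull the zip into memory over SFTP, spill it to a temp file, then read the XML and JPG entries with ZipArchive.

use phpseclib3\Net\SFTP;

$sftp = new SFTP('sftp.example.com');
if (!$sftp->login('username', 'password')) {
    throw new RuntimeException('SFTP login failed');
}

// get() returns the file contents as a string when no local target is given.
$zipBytes = $sftp->get('/remote/archive.zip');

$tmpZip = tempnam(sys_get_temp_dir(), 'sftp_zip_');
file_put_contents($tmpZip, $zipBytes);

$zip = new ZipArchive();
$zip->open($tmpZip);

$xml  = simplexml_load_string($zip->getFromName('data.xml')); // parse the XML entry
$jpeg = $zip->getStream('image.jpg');                         // read stream for the JPG entry

// ... save the XML fields and the JPG contents to the database here ...

$zip->close();
unlink($tmpZip);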
I have a CodeIgniter web app that uploads many tiny files to Amazon S3 every hour, which is causing my S3 request charges to shoot up really fast. One way to overcome this would be to zip up the files, upload the zip file to S3, and then unzip it once it is on S3.
Can this be done using EC2? Or is there a better method to achieve this? Thank you!
EDIT: If I were to use EC2, would I use PHP to trigger the creation of an EC2 instance, upload the PHP script needed to unzip the zipped file, copy the uncompressed files to S3, and then destroy the EC2 instance?
If you have an EC2 machine in the same region, I would suggest you upload the file there zipped and then drop it into S3 from there unzipped. S3 cannot unzip it on its own, as it is all static.
There are no data transfer charges between EC2 and S3 in the same region, so EC2 can handle the unzipping and then write the files out to your S3 bucket without additional transfer charges.
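A sketch of the EC2 side, assuming the AWS SDK for PHP v3 (paths and bucket name are placeholders): extract the archive locally, then push the whole directory to the bucket in one go with the SDK's Transfer helper.

use Aws\S3\S3Client;
use Aws\S3\Transfer;

// Extract the uploaded archive on the instance's local disk.
$zip = new ZipArchive();
if ($zip->open('/data/incoming/batch.zip') === true) {
    $zip->extractTo('/data/unzipped/batch');
    $zip->close();
}

// Same-region EC2-to-S3 traffic carries no data transfer charge; only the PUT requests are billed.
$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);
$transfer = new Transfer($s3, '/data/unzipped/batch', 's3://my-bucket/unzipped/batch');
$transfer->transfer();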
You can write code in a Lambda to unzip a file in an S3 bucket; you just have to wire it up, and AWS Lambda will do this for you.
Reference:
https://github.com/carloscarcamo/aws-lambda-unzip-py/blob/master/unzip.py
https://github.com/mehmetboraezer/aws-lambda-unzip
S3 is just storage. Whatever file you upload is the file that is stored. You cannot upload a zip file and then extract it once it's in S3. If you wrote the application, the best advice I can give is to redesign how you store the files. S3 requests are pretty cheap... you must be making a lot of requests.
I have been using this service to unzip files full of thousands of tiny image files; each zip I upload is about 4 GB and costs around $1 to unzip using http://www.cloudzipinc.com/service/s3_unzip. Maybe that helps someone.
Having said that, you might find it easier to use Python with the Boto library. That will work far more efficiently than PHP.