Large file upload to Google Cloud Storage using PHP

I'm trying to upload large files (over 500 MB) from my server to Cloud Storage and I'm getting PHP timeouts. I've tried looking at the Google Client Library documentation and I've crawled through Stack Overflow, but I can't find anything that could help me. Also, is there any way of tracking the progress of the upload?
Here's the code I'm using at the moment:
use Google\Cloud\Core\Exception\GoogleException;

$options = [
    'resumable' => true,
    'chunkSize' => 524288 // 512 KiB; chunkSize must be a multiple of 262144 bytes
];
$uploader = $bucket->getResumableUploader(
    fopen('uploads/' . $name, 'r'),
    $options
);
try {
    $object = $uploader->upload();
} catch (GoogleException $ex) {
    // On failure, resume the upload from where it left off
    $resumeUri = $uploader->getResumeUri();
    $object = $uploader->resume($resumeUri);
}
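A few things are worth checking before blaming the library: the timeout is usually PHP's max_execution_time rather than the upload call itself, and a 512 KiB chunkSize means a 500 MB file needs roughly a thousand sequential requests. Below is a minimal sketch of a longer-running upload with larger chunks; the uploadProgressCallback option is an assumption to verify against your installed google/cloud-storage version, not something the question's code confirms:

use Google\Cloud\Core\Exception\GoogleException;

set_time_limit(0); // lift the per-request execution limit (or run this from the CLI / a queue worker)

$uploader = $bucket->getResumableUploader(
    fopen('uploads/' . $name, 'r'),
    [
        'name' => $name,
        'chunkSize' => 262144 * 16, // 4 MiB; must be a multiple of 262144 bytes
        // Assumption: recent google/cloud-storage releases accept this callback
        // and invoke it as chunks are committed -- check your library version.
        'uploadProgressCallback' => function ($bytesUploaded) {
            error_log("uploaded {$bytesUploaded} bytes");
        }
    ]
);

try {
    $object = $uploader->upload();
} catch (GoogleException $ex) {
    // Resume from the last committed byte instead of restarting
    $object = $uploader->resume($uploader->getResumeUri());
}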

Related

FFMPEG conversion (h.264) taking long time for short videos

I am trying to record video and upload it to AWS S3, with Vue.js as the front end and PHP Laravel as the backend. I was not using any conversion before saving to S3, and because of that a recording made on Android could not be played on an Apple device due to codec differences.
To overcome this, I am using FFmpeg to encode in X264() format so that the recording plays on both Apple and Android devices, regardless of which device it was recorded on.
A 1-minute video takes 6-7 minutes with FFmpeg. I thought maybe AWS S3 was taking time to save, but even after commenting out the "save to S3 bucket" code, saving to the temp public folder in PHP is still very slow.
Please check the code in case I am missing anything that would make the conversion quicker. If you have a solution, please answer with a reference link or a code snippet referring to my code below.
public function video_upload(Request $request)
{
    // Response declaration
    $response = array();
    $response_code = 200;
    $response['status'] = false;
    $response['data'] = [];

    // Validation
    // TODO: Specify mimes:mp4,webm,ogg etc.
    $validator = Validator::make(
        $request->all(), [
            'file' => 'required'
        ]
    );
    if ($validator->fails()) {
        $response['data']['validator'] = $validator->errors();
        return response()->json($response);
    }

    try {
        $file = $request->file('file');

        // Convert
        $ffmpeg = FFMpeg\FFMpeg::create();
        $video = $ffmpeg->open($file);
        $format = new X264();
        // End convert

        $file_name = str_replace(' ', '-', Hash::make(time()));
        $file_name = preg_replace('/[^A-Za-z0-9\-]/', '', $file_name) . '.mp4';
        $video->save($format, $file_name);

        $file_folder = 'uploads/video/';

        // Store the file to S3
        // $store = Storage::disk('s3')->put($file_folder.$file_name, file_get_contents($file));
        $store = Storage::disk('s3')->put($file_folder . $file_name, file_get_contents($file_name));
        if ($store) {
            // Replace the old file if it exists:
            // delete the converted file from the public folder
            $file = public_path($file_name);
            if (file_exists($file)) {
                unlink($file);
            }
            if (isset($request->old_file)) {
                if (Storage::disk('s3')->exists($file_folder . basename($request->old_file))) {
                    Storage::disk('s3')->delete($file_folder . basename($request->old_file));
                }
            }
        }

        $response['status'] = true;
        $response['data'] = '/s3/' . $file_folder . $file_name;
    } catch (\Exception $e) {
        $response['data']['message'] = $e->getMessage() . " line " . $e->getLine();
        $response_code = 400;
    }

    return response()->json($response, $response_code);
}
It's a blocking point for me. I cannot make users wait 5-6 minutes to upload a 1-minute video.
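Transcoding is CPU-bound, so it will rarely be fast enough to run inside the request. One common approach, sketched below rather than offered as a drop-in fix, is to save the raw upload immediately, return to the user, and transcode in a queued Laravel job, optionally with a faster x264 preset. The TranscodeVideo class, its paths, and the preset choice are illustrative assumptions, not part of the original code:

// app/Jobs/TranscodeVideo.php -- hypothetical job class (assumption, not from the question)
namespace App\Jobs;

use FFMpeg\FFMpeg;
use FFMpeg\Format\Video\X264;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Support\Facades\Storage;

class TranscodeVideo implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function __construct(
        private string $localPath, // raw upload already saved to local disk
        private string $s3Key      // destination key, e.g. uploads/video/foo.mp4
    ) {}

    public function handle(): void
    {
        $format = new X264();
        // Assumption: faster x264 presets (ultrafast .. veryslow) trade
        // output size for conversion speed.
        $format->setAdditionalParameters(['-preset', 'ultrafast']);

        $output = $this->localPath . '.converted.mp4';
        FFMpeg::create()->open($this->localPath)->save($format, $output);

        Storage::disk('s3')->put($this->s3Key, file_get_contents($output));

        // Clean up local copies once the S3 write succeeds
        @unlink($this->localPath);
        @unlink($output);
    }
}

The controller would then dispatch TranscodeVideo::dispatch($path, $key) and return immediately, with the client polling until the S3 object exists. If the conversion itself must also be faster, a faster preset as above or a lower output bitrate via $format->setKiloBitrate(...) are the usual levers.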

Microsoft OneDrive SDK run as cron

Hi, I am using this OneDrive SDK: https://github.com/krizalys/onedrive-php-sdk . I am using PHP. What I need is to create a cronjob that fetches my files from OneDrive and saves them to my local directory. It works fine following the tutorial in the SDK. However, the way that SDK works, it needs you to be redirected to the Microsoft account login page to be authenticated.
This requires a browser. My question: is it possible to do this by just running it in the backend, like a cron? I can't seem to find a way to do it using the SDK. With Google, they provide you a key to access the services without logging in every time. I am not sure about OneDrive.
$localPath = __DIR__ . '/uploads/';
$today = date('Y-m-d');
$folder = $client->getMyDrive()->getDriveItemByPath('/' . $today);
echo "<pre>";
try {
    $files = $folder->getChildren();
    $createdDirectory = $localPath . $today;
    // Check if the directory exists
    if (is_dir($createdDirectory)) {
        echo "\n Directory " . $createdDirectory . " already exists, creating a new one..";
        // Append a UUID to make a new directory name
        $uuid1 = Uuid::uuid1();
        $createdDirectory = $createdDirectory . $uuid1->toString();
        echo "\n" . $createdDirectory . " created..";
    }
    // Create the directory
    mkdir($createdDirectory);
    echo "\n" . count($files) . " found for " . $today;
    // Loop through the files inside the folder
    foreach ($files as $file) {
        $save = $file->download();
        // Write the file to the directory
        $fp = fopen($createdDirectory . '/' . $file->name, 'w');
        fwrite($fp, $save);
        echo("\n File " . $file->name . " saved!");
    }
} catch (Exception $e) {
    die("\n" . $e->getMessage());
}
die("\n Process Complete!");
My code in redirect.php looks something like the above.
It doesn't look like that SDK supports the Client Credentials OAuth grant. Without that, the answer is no. You may want to look at the official Microsoft Graph SDK for PHP, which supports this via the Guzzle HTTP client:
$guzzle = new \GuzzleHttp\Client();
$url = 'https://login.microsoftonline.com/' . $tenantId . '/oauth2/token?api-version=1.0';
$token = json_decode($guzzle->post($url, [
    'form_params' => [
        'client_id' => $clientId,
        'client_secret' => $clientSecret,
        'resource' => 'https://graph.microsoft.com/',
        'grant_type' => 'client_credentials',
    ],
])->getBody()->getContents());
$accessToken = $token->access_token;
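Once you have $accessToken, the Graph SDK can be used without any browser interaction. A brief sketch follows; the drive ID is a placeholder, and note that with client credentials you address a specific drive or user rather than /me:

use Microsoft\Graph\Graph;
use Microsoft\Graph\Model\DriveItem;

$graph = new Graph();
$graph->setAccessToken($accessToken);

// Fetch the root item of a drive ({drive-id} is a placeholder)
$root = $graph->createRequest('GET', '/drives/{drive-id}/root')
    ->setReturnType(DriveItem::class)
    ->execute();

echo $root->getName();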

Google Cloud Storage File Download via PHP

I'm totally new to Google Cloud Storage (I used Amazon S3 until now).
I want to set up my web application so that users can download files directly from Google Cloud Storage.
I've already tried it using the Google API PHP Client, but didn't end up with working code.
I've uploaded a test file named "test.jpg" to my bucket "test-bucket-46856", which I want to download via a signed URL (so that users only have time-limited access), but I have no idea how to get started.
Please help. Thanks.
//Edit:
Found the perfect solution. Here is the link for all others who are also searching for this solution:
https://gist.github.com/stetic/c97c94a10f9c6b591883
<?php
/*
 * PHP Example for Google Storage Up- and Download
 * with Google APIs Client Library for PHP:
 * https://github.com/google/google-api-php-client
 */
include("Google/Client.php");
include("Google/Service/Storage.php");

$serviceAccount = "1234-xyz#developer.gserviceaccount.com";
$key_file = "/path/to/keyfile.p12";
$bucket = "my_bucket";
$file_name = "test.txt";
$file_content = "01101010 01110101 01110011 01110100 00100000 01100001 00100000 01110100 01100101 01110011 01110100";

$auth = new Google_Auth_AssertionCredentials(
    $serviceAccount,
    array('https://www.googleapis.com/auth/devstorage.read_write'),
    file_get_contents($key_file)
);
$client = new Google_Client();
$client->setAssertionCredentials($auth);
$storageService = new Google_Service_Storage($client);

/***
 * Write file to Google Storage
 */
try {
    $postbody = array(
        'name' => $file_name,
        'data' => $file_content,
        'uploadType' => "media"
    );
    $gsso = new Google_Service_Storage_StorageObject();
    $gsso->setName($file_name);
    $result = $storageService->objects->insert($bucket, $gsso, $postbody);
    print_r($result);
} catch (Exception $e) {
    print $e->getMessage();
}

/***
 * Read file from Google Storage
 */
try {
    $object = $storageService->objects->get($bucket, $file_name);
    $request = new Google_Http_Request($object['mediaLink'], 'GET');
    $signed_request = $client->getAuth()->sign($request);
    $http_request = $client->getIo()->makeRequest($signed_request);
    echo $http_request->getResponseBody();
} catch (Exception $e) {
    print $e->getMessage();
}
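Note that the gist above predates the current google/cloud-storage library and signs individual requests rather than producing the time-limited signed URL the question asks about. With the modern library, a signed URL can be generated directly. A minimal sketch, assuming a service-account key file at a placeholder path:

require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

$storage = new StorageClient([
    'keyFilePath' => '/path/to/service-account.json' // placeholder path
]);

$object = $storage->bucket('test-bucket-46856')->object('test.jpg');

// URL valid for 15 minutes; users can download it without authenticating
$url = $object->signedUrl(new \DateTime('+15 minutes'));
echo $url;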

Laravel and AWS PHP SDK - Unable to delete a local file after it was uploaded to S3

I am trying to delete a file from a local directory right after I have uploaded it to AWS S3. When I run it on Vagrant I get the error "Text file busy", and when I run it on XAMPP I get "Permission denied". For some reason the AWS S3 putObject method is not releasing the file handle. I have tried unsetting the S3 object, but that didn't work.
Here is the code:
$tempName = public_path() . '/path/to/file';

// Initialize AWS
$s3 = AWS::createClient('s3');

// Upload image to AWS
try {
    $response = $s3->putObject(array(
        'Bucket' => 'zotamoda',
        'Key' => $productImage->image_folder . "/" . $productImage->image_name,
        'SourceFile' => $tempName,
        'ACL' => 'public-read',
    ));
} catch (S3Exception $e) {
    // The AWS error code
    echo $e->getAwsErrorCode() . "\n";
    // The bucket couldn't be created
    echo $e->getMessage() . "\n";
}

// Delete image from temporary location
unlink($tempName);
You could try:
Storage::disk('s3')->put($productImage->image_folder."/".$productImage->image_name, file_get_contents($tempName), 'public');
unlink($tempName);
or, assuming that $tempName is relative to your project root:
Storage::disk('local')->delete($tempName)
I think you should try calling:
gc_collect_cycles();
before deleting the file.
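Another option, sketched here under the assumption that a lingering file handle is indeed the cause, is to pass the SDK an explicitly opened stream via Body instead of SourceFile, so your code controls the handle. Whether the SDK or your code ends up closing the stream can vary by SDK version, hence the is_resource() guard:

// Open the file yourself so the handle can be released before unlink()
$handle = fopen($tempName, 'r');

$s3->putObject(array(
    'Bucket' => 'zotamoda',
    'Key'    => $productImage->image_folder . "/" . $productImage->image_name,
    'Body'   => $handle,
    'ACL'    => 'public-read',
));

// The SDK may have closed the stream already; guard before closing
if (is_resource($handle)) {
    fclose($handle);
}
unlink($tempName);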

Using PHP to upload to Amazon S3

I've spent the last few hours following tutorials for implementing file uploads to Amazon S3 using php. I uploaded the most recent version of Donovan Schönknecht's S3 class to my server (as S3.php) and I am trying to use the following code to test upload capability. I know this code will work because I've seen numerous examples in action.
<?php
require('S3.php');

$s3 = new S3('KEY', 'SECRET KEY');

// Insert into S3
$new_name = time() . '.txt';
S3::putObject(
    'upload-me.txt',
    'bucketName',
    $new_name,
    S3::ACL_PUBLIC_READ,
    array(),
    array(),
    S3::STORAGE_CLASS_RRS
);
?>
I get a 500 server error when I attempt to load this page. Additionally, every other reputable tutorial of this nature has given me the same 500 error.
I verified that my key and secret key are valid by connecting to S3 with Cyberduck.
Does anyone have a clue as to what I could be doing incorrectly?
Thanks,
Sean
As it turns out, I was missing the cURL extension for PHP, which was causing the issue, as the S3 class I was using requires cURL. All is working now.
You should also consider using the official AWS SDK for PHP. Examples for using S3 with the SDK can be found in their S3 user guide.
You can download the most recent version of the AWS PHP SDK by running the following Composer command:
composer require aws/aws-sdk-php
The further configuration to upload a file to Amazon S3 is as follows:
// Include the SDK using the Composer autoloader
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// Set Amazon S3 credentials
$client = S3Client::factory(
    array(
        'key'    => "your-key",
        'secret' => "your secret key"
    )
);

try {
    $client->putObject(array(
        'Bucket'       => 'your-bucket-name',
        'Key'          => 'your-filepath-in-bucket',
        'SourceFile'   => 'source-filename-with-path',
        'StorageClass' => 'REDUCED_REDUNDANCY'
    ));
} catch (S3Exception $e) {
    // Catch an S3-specific exception
    echo $e->getMessage();
}
Step-by-step details are here: Amazon S3 File Upload Using PHP
The following example worked for me:
<?php
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$client = S3Client::factory([
    'version' => 'latest',
    'region'  => 'us-west-1',
    'credentials' => [
        'key'    => "<secret-key>",
        'secret' => "<my-secret>"
    ]
]);

try {
    $client->putObject([
        'Bucket'     => '<my-bucket-name>',
        'Key'        => '<file-name>',
        'SourceFile' => '<file-path-on-server>', // like /var/www/vhosts/mysite/file.csv
        'ACL'        => 'public-read',
    ]);
} catch (S3Exception $e) {
    // Catch an S3-specific exception
    echo $e->getMessage();
}
Getting security credentials:
https://aws.amazon.com/blogs/security/wheres-my-secret-access-key/
https://console.aws.amazon.com/iam/home?#/security_credential
Getting the region code:
https://docs.aws.amazon.com/general/latest/gr/rande.html
Use this one to upload images using a form; it's working fine for me. You may try using it with your code:
$name = $_FILES['photo']['name'];
$size = $_FILES['photo']['size'];
$tmp  = $_FILES['photo']['tmp_name'];

// Upload process
// Bucket name
$bucket = 'bucket-name';
require_once('S3.php');

// AWS access info
$awsAccessKey = 'awsAccessKey';
$awsSecretKey = 'awsSecretKey';

// Instantiate the class
$s3 = new S3($awsAccessKey, $awsSecretKey);
$s3->putBucket($bucket, S3::ACL_PUBLIC_READ);

// Rename the image
$actual_image_name = time();

// Upload to S3
if ($s3->putObjectFile($tmp, $bucket, $actual_image_name, S3::ACL_PUBLIC_READ)) {
    $image = 'http://' . $bucket . '.s3.amazonaws.com/' . $actual_image_name;
} else {
    echo 'error uploading to S3 Amazon';
}
I never found an updated script with Amazon's latest SDK, so I made one myself. It works as a PHP command-line interpreter script. Give it a try:
https://github.com/arizawan/aiss3clientphp
I'm not familiar with the S3 API, but I used it as the storage backend with https://github.com/KnpLabs/Gaufrette. Gaufrette is a library that provides a pretty nice abstraction layer over S3 and other file services/systems.
Here is sample code to upload images to Amazon S3.
// Bucket name
$bucket = "BucketName";
if (!class_exists('S3')) require_once('S3.php');

// AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', 'ACCESS_KEY');
if (!defined('awsSecretKey')) define('awsSecretKey', 'ACCESS_Secret_KEY');

$s3 = new S3(awsAccessKey, awsSecretKey);
$s3->putBucket($bucket, S3::ACL_PUBLIC_READ);

if ($s3->putObjectFile($tmp, $bucket, $actual_image_name, S3::ACL_PUBLIC_READ)) {
    $message = "S3 Upload Successful.";
    $s3file = 'http://' . $bucket . '.s3.amazonaws.com/' . $actual_image_name;
    echo "<img src='$s3file'/>";
    echo 'S3 File URL: ' . $s3file;
} else {
    $message = "S3 Upload Fail.";
}
Below is the best solution. It uses multipart upload. Make sure to install the AWS SDK for PHP before using it.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

try {
    $s3Client = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1',
        'credentials' => [
            'key'    => '<your-access-key-id>',     // placeholder; never publish real credentials
            'secret' => '<your-secret-access-key>', // placeholder
        ],
    ]);

    // Use multipart upload
    $source = 'https://b8q9h6y2.stackpathcdn.com/wp-content/uploads/2016/08/banner-for-website-4.png';
    $uploader = new MultipartUploader($s3Client, $source, [
        'bucket' => 'videofilessandeep',
        'key'    => 'my-file.png',
        'ACL'    => 'public-read',
    ]);

    try {
        $result = $uploader->upload();
        echo "Upload complete: {$result['ObjectURL']}\n";
    } catch (MultipartUploadException $e) {
        echo $e->getMessage() . "\n";
    }
} catch (Exception $e) {
    print_r($e);
}
