FFMPEG conversion (h.264) taking long time for short videos - php

I am trying to record video and upload it to AWS S3, using Vue.js as the front end and PHP Laravel as the backend. I was not converting the file before saving it to S3, and as a result a recording made on Android could not be played on an Apple device because of codec differences.
To overcome this, I am using FFmpeg to encode the video as X264() so it plays on both Apple and Android devices, regardless of which device made the recording.
A 1-minute video takes 6-7 minutes with FFmpeg. I thought AWS S3 might be slow to save, so I commented out the "save to S3 bucket" code, but even saving to the temporary public folder in PHP is still very slow.
Please check the code below for anything I am missing that would make the conversion quicker. If you have a solution, please post an answer with a reference link or a code snippet based on my code.
public function video_upload(Request $request)
{
    // Response declaration
    $response = array();
    $response_code = 200;
    $response['status'] = false;
    $response['data'] = [];

    // Validation
    // TODO: Specify mimes:mp4,webm,ogg etc.
    $validator = Validator::make(
        $request->all(), [
            'file' => 'required'
        ]
    );
    if ($validator->fails()) {
        $response['data']['validator'] = $validator->errors();
        return response()->json($response);
    }

    try {
        $file = $request->file('file');

        // Convert
        $ffmpeg = FFMpeg\FFMpeg::create();
        $video = $ffmpeg->open($file);
        $format = new X264();
        // End convert

        $file_name = str_replace(' ', '-', Hash::make(time()));
        $file_name = preg_replace('/[^A-Za-z0-9\-]/', '', $file_name).'.mp4';
        $video->save($format, $file_name);
        $file_folder = 'uploads/video/';

        // Store the file to S3
        // $store = Storage::disk('s3')->put($file_folder.$file_name, file_get_contents($file));
        $store = Storage::disk('s3')->put($file_folder.$file_name, file_get_contents($file_name));
        if ($store) {
            // Replace the old file if it exists:
            // delete the file from the public folder
            $file = public_path($file_name);
            if (file_exists($file)) {
                unlink($file);
            }
            if (isset($request->old_file)) {
                if (Storage::disk('s3')->exists($file_folder.basename($request->old_file))) {
                    Storage::disk('s3')->delete($file_folder.basename($request->old_file));
                }
            }
        }

        $response['status'] = true;
        $response['data'] = '/s3/'.$file_folder.$file_name;
    } catch (\Exception $e) {
        $response['data']['message'] = $e->getMessage().' line '.$e->getLine();
        $response_code = 400;
    }

    return response()->json($response, $response_code);
}
This is a blocking point for me. I cannot make users wait 5-6 minutes to upload a 1-minute video.
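For what it's worth, the encode itself is usually the bottleneck here rather than the upload: x264's default preset favors compression over speed, and php-ffmpeg's default process timeout is easily exceeded by long encodes. A minimal sketch of the usual speed levers (faster preset, higher CRF, more threads), using the same library as the code above and hypothetical file names:

```php
<?php
// Sketch: speed up the H.264 encode from the question's code.
// Assumes the php-ffmpeg package is installed via Composer.
require 'vendor/autoload.php';

use FFMpeg\FFMpeg;
use FFMpeg\Format\Video\X264;

// Raise the process timeout and let ffmpeg use more CPU threads.
$ffmpeg = FFMpeg::create([
    'timeout'        => 3600,
    'ffmpeg.threads' => 4,
]);

$video  = $ffmpeg->open('recording.webm'); // hypothetical input file
$format = new X264('aac');                 // AAC audio for iOS compatibility

// "veryfast"/"ultrafast" presets trade output size for encode speed;
// CRF ~28 is usually acceptable for user recordings. "+faststart" moves
// the moov atom so playback can begin before the whole file downloads.
$format->setAdditionalParameters([
    '-preset', 'veryfast',
    '-crf', '28',
    '-movflags', '+faststart',
]);

$video->save($format, 'recording.mp4');
```

Treat the preset/CRF values as starting points, not a definitive configuration; also note that transcoding synchronously inside the request keeps the user waiting, so moving the encode into a queued Laravel job and notifying the client when it finishes is the usual pattern.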

Related

Uploading recording file stream to aws s3 in PHP is very slow

In my application I can record video and save it to an AWS S3 bucket, using Vue.js as the front end and Laravel PHP as the backend.
I am using FFmpeg to transcode the recording before uploading it to S3.
A 1-minute video takes 4 minutes, and
a 3-minute video takes 9 minutes (and the upload does not always succeed; sometimes it fails).
Below is the backend code.
public function video_upload(Request $request)
{
    // Response declaration
    $response = array();
    $response_code = 200;
    $response['status'] = false;
    $response['data'] = [];

    // Validation
    // TODO: Specify mimes:mp4,webm,ogg etc.
    $validator = Validator::make(
        $request->all(), [
            'file' => 'required',
        ]
    );
    if ($validator->fails()) {
        $response['data']['validator'] = $validator->errors();
        return response()->json($response);
    }

    try {
        $file = $request->file('file');

        // Convert
        $ffmpeg = FFMpeg\FFMpeg::create();
        $video = $ffmpeg->open($file);
        $format = new X264();
        $format->on('progress', function ($video, $format, $percentage) {
            echo "$percentage % transcoded";
        });
        $video->save($format, 'output.mp4');
        // End convert

        $file_name = str_replace('/', '', Hash::make(time())).'.mp4';
        $file_folder = 'uploads/video/';

        // Store the file to S3
        // $store = Storage::disk('s3')->put($file_folder.$file_name, file_get_contents($file));
        $store = Storage::disk('s3')->put($file_folder.$file_name, file_get_contents('output.mp4'));
        if ($store) {
            // Replace the old file if it exists:
            // delete the file from the public folder
            $file = public_path('output.mp4');
            if (file_exists($file)) {
                unlink($file);
            }
            if (isset($request->old_file)) {
                if (Storage::disk('s3')->exists($file_folder.basename($request->old_file))) {
                    Storage::disk('s3')->delete($file_folder.basename($request->old_file));
                }
            }
        }

        $response['status'] = true;
        $response['data'] = '/s3/'.$file_folder.$file_name;
    } catch (\Exception $e) {
        $response['data']['message'] = $e->getMessage().' line '.$e->getLine();
        $response_code = 400;
    }

    return response()->json($response, $response_code);
}
I was researching Transfer Acceleration and multipart upload, but my question is whether I configure this on the AWS side or in the backend.
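As a rough answer to that last question: the two features live in different places. Transfer Acceleration is enabled once per bucket on the AWS side (console or CLI) and then opted into from the client, whereas multipart upload happens entirely in the backend via the AWS SDK. A hedged sketch with the AWS SDK for PHP, using a hypothetical bucket name and region:

```php
<?php
// Sketch: multipart upload from the backend with the AWS SDK for PHP v3.
// Transfer Acceleration must already be enabled on the bucket for the
// 'use_accelerate_endpoint' flag to work.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

$s3 = new S3Client([
    'region'                  => 'us-east-1', // assumption: your bucket's region
    'version'                 => 'latest',
    'use_accelerate_endpoint' => true,        // route through the accelerated endpoint
]);

// Splits the file into parts and uploads them in parallel; parts of a
// failed upload can be retried without resending the whole file.
$uploader = new MultipartUploader($s3, '/tmp/output.mp4', [
    'bucket' => 'my-bucket',                  // hypothetical bucket name
    'key'    => 'uploads/video/output.mp4',
]);

try {
    $result = $uploader->upload();
    echo "Uploaded to {$result['ObjectURL']}\n";
} catch (MultipartUploadException $e) {
    echo $e->getMessage(), "\n";
}
```

This replaces the `Storage::disk('s3')->put(...)` call in the question's code only for the upload step; multipart generally helps reliability on larger files, while the intermittent failures described above may equally come from the transcode step timing out.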

Not getting images from s3 bucket in codeigniter

I am trying to get images from an S3 bucket, but it is not working. I have searched Google a lot but can't find any solution for this.
Simply put, I want to fetch images from the S3 bucket and view them on an HTML page.
Here is how I am doing it. I am using CodeIgniter. These images are private, which is why I am generating pre-signed URLs.
I hope you can help me.
if ($info['photo'] != '')
{
    $awsAccessKey = 'access_key'; // AWS account access key
    $awsSecretKey = 'secret_key'; // AWS account secret key
    $s3 = new S3($awsAccessKey, $awsSecretKey);
    try
    {
        $command = $s3->getObject('bajwa-staging', 'IMG20191013221832.jpg');
        // The period of availability
        $request = $s3->createPresignedRequest($command, '+20 minutes');
        // Get the pre-signed URL
        $signedUrl = (string) $request->getUri();
    }
    catch (Exception $e)
    {
        echo 'Message: ' . $e->getMessage();
        exit;
    }
}
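One likely problem with the snippet above is that it mixes two libraries: `createPresignedRequest()` belongs to `Aws\S3\S3Client` in the AWS SDK for PHP v3, not to the standalone `S3` class, and v3's `getObject()` executes the request immediately, whereas a pre-signed URL needs an *unsent* command from `getCommand()`. A hedged sketch of the v3 flow, reusing the bucket and key from the question:

```php
<?php
// Sketch: pre-signed GET URL for a private object, AWS SDK for PHP v3.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'region'      => 'us-east-1', // assumption: the bucket's region
    'version'     => 'latest',
    'credentials' => [
        'key'    => 'access_key',
        'secret' => 'secret_key',
    ],
]);

// Build (but do not send) a GetObject command for the private image.
$command = $s3->getCommand('GetObject', [
    'Bucket' => 'bajwa-staging',
    'Key'    => 'IMG20191013221832.jpg',
]);

// Sign it with a 20-minute validity window.
$request   = $s3->createPresignedRequest($command, '+20 minutes');
$signedUrl = (string) $request->getUri();

// Embed the signed URL directly in the page:
echo '<img src="' . htmlspecialchars($signedUrl) . '" alt="photo">';
```

The browser can then load the private object for 20 minutes without the bucket ever being public.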

Large file upload to google cloud storage using PHP

I'm trying to upload large files (over 500 MB) from my server to Cloud Storage, and I'm getting PHP timeouts. I've looked through the Google Client Library documentation and crawled through Stack Overflow, but I can't find anything that helps. Also, is there any way to track the progress of the upload?
Here's the code I'm using at the moment:
$options = [
    'resumable' => true,
    'chunkSize' => 524288
];
$uploader = $bucket->getResumableUploader(
    fopen('uploads/' . $name, 'r'),
    $options
);
try {
    $object = $uploader->upload();
} catch (GoogleException $ex) {
    $resumeUri = $uploader->getResumeUri();
    $object = $uploader->resume($resumeUri);
}
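Two things in that snippet are worth questioning. First, a 512 KB `chunkSize` means a 500 MB file needs roughly a thousand round trips, which is typically what exhausts `max_execution_time`; the chunk size must be a multiple of 262144 bytes, and larger multiples cut the request count. Second, the PHP timeout itself can be lifted for the upload script. A hedged sketch along those lines, with hypothetical project, bucket, and file names:

```php
<?php
// Sketch: resumable upload with a larger chunk size and no PHP time limit.
require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;
use Google\Cloud\Core\Exception\GoogleException;

set_time_limit(0); // lift the PHP execution timeout for this script

$storage = new StorageClient(['projectId' => 'my-project']); // hypothetical project
$bucket  = $storage->bucket('my-bucket');                    // hypothetical bucket

$uploader = $bucket->getResumableUploader(
    fopen('uploads/large-file.bin', 'r'),                    // hypothetical file
    [
        'name'      => 'large-file.bin',
        'chunkSize' => 8 * 262144,  // 2 MiB chunks (must be a multiple of 262144)
    ]
);

try {
    $object = $uploader->upload();
} catch (GoogleException $e) {
    // Resume from the last successfully committed chunk.
    $object = $uploader->resume($uploader->getResumeUri());
}
```

For progress, one low-tech option under these assumptions is to track how many bytes have been read from the file handle yourself (e.g. via `ftell()` on a stream you wrap) and report that against `filesize()`; the uploader itself does not expose a progress callback in this code path.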

How to get the image from bucket to preview it to the user

I am working with Google Cloud Storage, trying to crop and upload an image. I have uploaded the image and am fetching it back to crop it. I have used the following methods:
Method 1:
$options = ['gs_bucket_name' => $my_bucket];
$upload_url = CloudStorageTools::createUploadUrl('/upload/handler', $options);
using these docs. But with this I get a "class not found" error. I tried including the file, for example:
require_once APPPATH."google/appengine/api/cloud_storage/CloudStorageTools.php";
$options = ['size' => 400, 'crop' => true];
$image_file = "gs://my_bucket/shiva.jpg";
$cloud_tools = new CloudStorageTools;
$img = $cloud_tools->getImageServingUrl($image_file, $options);
but then I get "class not found" for
use google\appengine\CreateEncodedGoogleStorageKeyRequest;
and so on. I checked for CreateEncodedGoogleStorageKeyRequest under the appengine folder and found it missing. I don't know what's going on.
Method 2:
I tried uploading the file using the following code.
function upload_user_image($image_file, $bucket_name = '')
{
    $client = google_set_client();
    $storage = new Google_Service_Storage($client);
    $sfilename = $image_file['name']; // filename here
    $obj = new Google_Service_Storage_StorageObject();
    $obj->setName($sfilename);
    $obj->setBucket("my_bucket"); // bucket name here
    $filen = $image_file['path'];
    $mimetype = mime_content_type($filen);
    $chunkSizeBytes = 1 * 1024 * 1024;
    $client->setDefer(true);
    $status = false;
    $filetoupload = array('name' => $sfilename, 'uploadType' => 'resumable');
    $request = $storage->objects->insert("my_bucket", $obj, $filetoupload);
    $media = new Google_Http_MediaFileUpload($client, $request, $mimetype, null, true, $chunkSizeBytes);
    $media->setFileSize(filesize($filen));
    $handle = fopen($filen, "rb");
    while (!$status && !feof($handle)) {
        $chunk = fread($handle, $chunkSizeBytes);
        $status = $media->nextChunk($chunk);
    }
    $result = false;
    if ($status != false) {
        $result = $status;
    }
    fclose($handle);
    // Reset the client to execute requests immediately in the future.
    $client->setDefer(false);
    return true;
}
I succeeded in uploading the image using the code above, but I am now stuck on fetching the image and previewing it in HTML. (I want to crop the image and then upload it again.) I tried the following:
Method a:
$image = file_get_contents("https://storage.cloud.google.com/my_bucket/shiva.jpg");
echo $image;
using these docs. I get a login box in my HTML where I fill in my Google credentials and am redirected to the image, but I don't get an image preview in my HTML.
Method b:
I tried
https://www.googleapis.com/storage/v1/b/my_bucket/o/shiva.jpg
using these docs. But I get this output:
{
    "error": {
        "errors": [
            {
                "domain": "global",
                "reason": "required",
                "message": "Anonymous users does not have storage.objects.get access to object my_bucket/shiva.jpg.",
                "locationType": "header",
                "location": "Authorization"
            }
        ],
        "code": 401,
        "message": "Anonymous users does not have storage.objects.get access to object my_bucket/shiva.jpg."
    }
}
Method c:
I tried it using the following function:
function get_user_image($image_file)
{
    $instance = &get_instance();
    // $client = google_set_client();
    // $storage = new Google_Service_Storage($client);
    $sfilename = $image_file; // filename here
    $storage = new Google\Cloud\Storage\StorageClient(['projectId' => $instance->config->item('google_project_id')]);
    $bucket = $storage->bucket('my_bucket');
    $object = $bucket->object($sfilename);
    $stream = $object->downloadAsString();
    $im = imagecreatefromstring($stream);
    if ($im !== false) {
        header('Content-Type: image/png');
        imagepng($im);
        imagedestroy($im);
    } else {
        echo 'An error occurred.';
    }
}
using these docs
I have been stuck for the last three days. I need to display the image to the user in HTML. Can anyone please tell me what I am missing and show the proper way to accomplish this?
Since you are comfortable with these objects being anonymously visible, the easiest solution to display them as images on a website would be simply to mark them as publicly accessible and then to embed them in HTML like so:
<img src="https://storage.googleapis.com/BUCKET_NAME/imageName.jpeg" />

Laravel - Pass uploaded filename to new function

I'm using Laravel 5.3 and need to upload an XML file and then submit its contents to an API. The client wants this as two buttons/user actions: the user should first upload the file and then, with a second click, submit the contents.
The uploading works fine, and the XML reading and submitting to the API also work properly. I just can't get my upload controller to pass the filename to the submitting controller. There is no need to store the filename for future use; the processes follow each other: the user uploads one file and submits it, then uploads the next file and submits that.
Any help would be highly appreciated.
upload function:
public function handleUpload(Request $request)
{
    $file = $request->file('file');
    $allowedFileTypes = config('app.allowedFileTypes');
    $rules = [
        'file' => 'required|mimes:'.$allowedFileTypes
    ];
    $this->validate($request, $rules);
    $fileName = $file->getClientOriginalName();
    $destinationPath = config('app.fileDestinationPath').'/'.$fileName;
    $uploaded = Storage::put($destinationPath, file_get_contents($file->getRealPath()));
    if ($uploaded) {
        $file_Name = ($_FILES['file']['name']);
    }
    return redirect()->to('/upload');
}
submit function:
public function vendorInvoice()
{
    $fileName = $file_Name;
    $destinationPath = storage_path('app/uploads/');
    $xml = file_get_contents($destinationPath.$fileName);
    $uri = "some uri";
    try {
        $client = new Client();
        $request = new Request('POST', $uri, [
                'Authorization' => '$username',
                'ContractID' => '$id',
                'content-type' => 'application/xml'
            ],
            $xml);
        $response = $client->send($request);
    }
    catch (RequestException $re) {
        // Exception handling
        echo $re;
    }
}
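Note that `$file_Name` in `vendorInvoice()` is a local variable that is never set; a variable assigned in one controller action does not survive into a later request. Since the two clicks are separate HTTP requests, one common way to hand the filename across is the session. A hedged sketch, mirroring the question's method names:

```php
<?php
// Sketch: pass the uploaded filename between the two actions via the session.
// Assumes standard Laravel session middleware is active for both routes.

public function handleUpload(Request $request)
{
    // ... validation and Storage::put() as in the question ...
    $fileName = $request->file('file')->getClientOriginalName();

    // Remember the filename for the follow-up submit request.
    $request->session()->put('uploaded_file', $fileName);

    return redirect()->to('/upload');
}

public function vendorInvoice(Request $request)
{
    // Retrieve (and remove) the filename stored by handleUpload().
    $fileName = $request->session()->pull('uploaded_file');

    if (!$fileName) {
        return redirect()->to('/upload'); // nothing uploaded yet
    }

    $xml = file_get_contents(storage_path('app/uploads/') . $fileName);
    // ... build and send the API request as in the question ...
}
```

Using `pull()` rather than `get()` clears the value after the submit, which matches the "one upload, one submit" flow described above.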
