I'm using the https://github.com/PHP-FFMpeg/PHP-FFMpeg package to convert user video uploads to MP4 format; note that I'm developing with the Laravel framework.
But I'm getting an error and I'm not really sure what I'm doing wrong, because all it says is that the encoding failed. I attached the errors image below; you can see it at the bottom.
This is the function that handles user uploads:
public function upload()
{
    $this->putVideoIntoStorage();
    $this->ffmpeg->saveVideo($this->storageManager->getAbsolutePathOf($this->video->getClientOriginalName()));

    return $this->saveVideoIntoDatabase();
}
This is the function that handles saving and converting the video to MP4 using PHP-FFMpeg:
public function saveVideo(string $path)
{
    $ffmpeg = FFMpeg::create([
        'ffmpeg.binaries'  => config('services.ffmpeg.ffmpeg_path'),
        'ffprobe.binaries' => config('services.ffmpeg.ffprobe_path'),
        'timeout'          => 3600, // the timeout for the underlying process
        'ffmpeg.threads'   => 12,   // the number of threads that FFMpeg should use
    ]);

    $video = $ffmpeg->open($path);
    $video->save(new X264(), 'video.mp4');
}
This is the error I'm getting: Errors Image
I can provide more details if you need them, just ask. I would be really glad if someone could help me through this.
Here's the proper way to do it:
$format = new FFMpeg\Format\Video\X264();
$format->setAudioCodec("libmp3lame");
$video->save($format, '/path/to/new/file');
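Put together with the saveVideo() method from the question, that looks roughly like the sketch below. The config keys and paths are taken from the question; passing the codecs to the X264 constructor is equivalent to calling setAudioCodec(), and the output path is an assumption (the original hard-coded 'video.mp4'):

```php
use FFMpeg\FFMpeg;
use FFMpeg\Format\Video\X264;

public function saveVideo(string $path)
{
    $ffmpeg = FFMpeg::create([
        'ffmpeg.binaries'  => config('services.ffmpeg.ffmpeg_path'),
        'ffprobe.binaries' => config('services.ffmpeg.ffprobe_path'),
        'timeout'          => 3600,
        'ffmpeg.threads'   => 12,
    ]);

    // X264 defaults to the libfdk_aac audio codec, which many ffmpeg
    // builds are compiled without — a common cause of "Encoding failed".
    $format = new X264('libmp3lame', 'libx264');

    $video = $ffmpeg->open($path);
    $video->save($format, preg_replace('/\.[^.]+$/', '.mp4', $path));
}
```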
I'm encrypting my images and uploading them to S3 buckets.
I'm using the SoareCostin\FileVault library to encrypt/decrypt them:
https://github.com/soarecostin/file-vault
I want to get the image, or at least the base64 of the image, from
FileVault::streamDecrypt('image.png')
but I'm having problems creating an image from it.
This works perfectly:
return response()->stream(function () use ($s3path) {
    FileVault::disk('s3')->streamDecrypt($s3path);
}, 200, ["Content-Type" => 'image/png']);
However, I want to get the base64 of the image with this code:
$stream = FileVault::disk('s3')->streamDecrypt($s3path);
$data = file_get_contents($s3path);
return 'data:image/png;base64,' . base64_encode($data);
I hope someone can help me out. Thanks in advance.
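One thing the working example suggests is that streamDecrypt() writes the decrypted bytes to the output buffer rather than returning them (which is why it works inside response()->stream(), and why file_get_contents() on the S3 path gives you the still-encrypted file). If that is the case, the bytes can be captured with PHP output buffering. A runnable sketch, where the echo stands in for the FileVault::disk('s3')->streamDecrypt($s3path) call:

```php
<?php
// Capture whatever the callback echoes and wrap it in a data URI.
// The callback is a stand-in for FileVault's streamDecrypt(), which
// writes decrypted bytes to the output buffer instead of returning them.
function captureAsDataUri(callable $streamer, string $mime): string
{
    ob_start();
    $streamer();            // decrypted bytes land in the output buffer
    $data = ob_get_clean(); // grab the buffered bytes as a string

    return 'data:' . $mime . ';base64,' . base64_encode($data);
}

$uri = captureAsDataUri(function () {
    echo "fake-png-bytes";  // stand-in for the FileVault call
}, 'image/png');
```

If the images are large, the response()->stream() approach from the question is still the more memory-friendly option, since output buffering holds the whole decrypted file in memory.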
I'm using the https://github.com/vimeo/laravel package to upload my videos to Vimeo. But there was a problem with the file size, so I edited the PHP and nginx configuration to allow a request size of up to 500 MB... which isn't good (keep in mind this is my test server, not production). I'm wondering whether the package streams the file while uploading, or uses as much memory as the file size and uploads it all at once.
Here's my code:
public function UploadToVimeo(Request $request)
{
    $this->validate($request, [
        'class_id'   => 'required|exists:teacher_classes,class_id',
        'video_name' => 'required|mimes:mp4,mov,ogg,qt',
    ]);

    $file = $request->video_name;
    $result = Vimeo::upload($file);

    if ($result) {
        $str = str_replace('/videos/', '', $result);
        TeacherClass::where('class_id', $request->class_id)->update(['url' => 'https://vimeo.com'.$str]);
    }

    return back()->with('result', $str);
}
Can someone explain how the package works, or suggest a way to stream the file?
Thank you
I am trying to download a file that I stored on S3 to my local Laravel installation to manipulate it. Would appreciate some help.
I have the config data set up correctly, because I am able to upload without any trouble. I am saving files in S3 with the pattern "user->id / media->id.mp3" — note that I am not just dumping files in the bucket root, I am saving them in directories.
After successfully uploading the file to S3, I update the save path in my DB to "user->id / media->id.mp3", not some long public URL (is that wrong?).
When I later go back and try to download the file, I get a FileNotFoundException from S3. I'm doing this:
$audio = Storage::disk('s3')->get($media->location);
The weird thing is that the exception shows the resource it cannot fetch, but when I place that same URL in a browser it displays the file without any trouble at all. Why can't the filesystem get the file?
I have tried a "has" check before the "get", and the "has" check comes up false.
Do I need to save the full public URL in the database for this to work? I tried that and it didn't help. I feel like I am missing something very simple and it is driving me crazy!
A late answer, but important for others:
$s3_file = Storage::disk('s3')->get(request()->file);
$local = Storage::disk('public');
$local->put('file_name.tif', $s3_file);
$s3_file now holds the file's contents, which you can write back out with Laravel's put() method; you will find the file in the storage/app/public directory.
You can set Content-Type as desired and Content-Disposition to 'attachment', because your files are coming from S3 and you want them downloaded as an attachment:
$event_data = $this->ticket->where('user_id', $user_id)->first();
$data = $event_data->pdf;
$get_ticket = 'tickets/'.$data;
$file_name = "YOUR_DESIRED_NAME.pdf";

$headers = [
    'Content-Type'        => 'application/pdf',
    'Content-Disposition' => 'attachment; filename="'.$file_name.'"',
];

return \Response::make(Storage::disk('s3')->get($get_ticket), 200, $headers);
Say you have AWS S3 as your default storage,
and you want to download my_file.txt from S3 to my_laravel_project\storage\app\my_file.txt,
and you want to make it a one-liner:
Storage::disk('local')->put('my_file.txt', Storage::get('my_file.txt'));
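For large files, the same copy can be done with streams so the whole file never sits in memory. A sketch assuming the same default disks, using the Storage facade's readStream()/writeStream() methods:

```php
use Illuminate\Support\Facades\Storage;

// Stream the S3 object straight onto the local disk instead of
// buffering the entire file in memory first (as get()/put() would).
$stream = Storage::disk('s3')->readStream('my_file.txt');
Storage::disk('local')->writeStream('my_file.txt', $stream);

if (is_resource($stream)) {
    fclose($stream);
}
```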
I'm working on a Symfony app with videos. I need to stream those videos and then display them in an HTML5 player.
I used this blog post: http://ailoo.net/2013/03/stream-a-file-with-streamedresponse-in-symfony/ and I think it's working fine.
$response = new StreamedResponse(
    function () use ($file) {
        readfile($file);
    }, 200, array('Content-Type' => 'video/' . $formatVideo)
);

return $this->render('SQMovieBundle:Movie:displayMovie.html.twig', array('test' => $response));
But it's the first time I've worked with things like buffers, and I read that I have some configuration to do with Apache (I'm using MAMP on macOS).
I also have no idea how to display my video in Twig.
Feel free to give me pointers.
There's a much cleaner way of streaming a file in Symfony: simply return an instance of the BinaryFileResponse class:
return new \Symfony\Component\HttpFoundation\BinaryFileResponse($pathToTheFile);
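A slightly fuller sketch, reusing the $pathToTheFile and $formatVideo variables from the question's context. BinaryFileResponse also handles HTTP Range requests, which an HTML5 video player needs for seeking:

```php
use Symfony\Component\HttpFoundation\BinaryFileResponse;
use Symfony\Component\HttpFoundation\ResponseHeaderBag;

// Serve the video file directly; Range support comes for free.
$response = new BinaryFileResponse($pathToTheFile);
$response->headers->set('Content-Type', 'video/' . $formatVideo);
$response->setContentDisposition(ResponseHeaderBag::DISPOSITION_INLINE);

return $response;
```

In the Twig template, the player then just points at the route that returns this response (e.g. `<video src="{{ path('movie_stream', {id: movie.id}) }}" controls></video>`, where the route name is an assumption for illustration).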
I want to upload a big video file to my AWS S3 bucket. After many hours, I finally managed to configure my php.ini and nginx.conf files so they allowed bigger files.
But then I got a "Fatal Error: Allowed Memory Size of XXXXXXXXXX Bytes Exhausted". After some time I found out larger files should be uploaded with streams, using fopen(), fwrite(), and fclose().
Since I'm using Laravel 5, the filesystem should take care of much of this. Except that I can't get it to work.
My current ResourceController#store looks like this:
public function store(ResourceRequest $request)
{
    /* Prepare data */
    $resource = new Resource();
    $key = 'resource-'.$resource->id;
    $bucket = env('AWS_BUCKET');
    $filePath = $request->file('resource')->getRealPath();

    /* Open & write stream */
    $stream = fopen($filePath, 'w');
    Storage::writeStream($key, $stream, ['public']);

    /* Store entry in DB */
    $resource->title = $request->title;
    $resource->save();

    /* Success message */
    session()->flash('message', $request->title.' uploaded!');

    return redirect()->route('resource-index');
}
But now I get this long error:
CouldNotCreateChecksumException in SignatureV4.php line 148:
A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Stream\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.
So I am currently completely lost. I can't figure out if I'm even on the right track. Here are the resources I'm trying to make sense of:
AWS SDK guide for PHP: Stream Wrappers
AWS SDK introduction on stream wrappers
Flysystem original API on stream wrappers
And just to confuse me even more, there seems to be another way to upload large files other than streams: the so-called "multipart" upload. I actually thought that was what the streams were all about...
What is the difference?
I had the same problem and came up with this solution.
Instead of using
Storage::put('file.jpg', $contents);
which of course ran into an "out of memory" error, I used this method:
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

// ...

public function uploadToS3($fromPath, $toPath)
{
    $disk = Storage::disk('s3');
    $uploader = new MultipartUploader($disk->getDriver()->getAdapter()->getClient(), $fromPath, [
        'bucket' => Config::get('filesystems.disks.s3.bucket'),
        'key'    => $toPath,
    ]);

    try {
        $result = $uploader->upload();
        echo "Upload complete";
    } catch (MultipartUploadException $e) {
        echo $e->getMessage();
    }
}
Tested with Laravel 5.1
Here are the official AWS PHP SDK docs:
http://docs.aws.amazon.com/aws-sdk-php/v3/guide/service/s3-multipart-upload.html
The streaming part applies to downloads. For uploads you need to know the content size; for large files, multipart uploads are the way to go.