Will uploading my video through the Vimeo package affect my server? - php

I'm using the https://github.com/vimeo/laravel package to upload my videos to Vimeo. But there's a little problem with the file size, so I edited the PHP and nginx configuration to allow a request size of up to 500 MB... which isn't great (keep in mind this is my test server, not production). I'm wondering whether the package streams the file while uploading, or whether it uses as much memory as the file size and uploads it all at once.
Here's my code:
public function UploadToVimeo(Request $request)
{
    $this->validate($request, [
        'class_id'   => 'required|exists:teacher_classes,class_id',
        'video_name' => 'required|mimes:mp4,mov,ogg,qt',
    ]);

    $file = $request->video_name;
    $result = Vimeo::upload($file);

    if ($result) {
        $str = str_replace('/videos/', '', $result);
        TeacherClass::where('class_id', $request->class_id)
            ->update(['url' => 'https://vimeo.com' . $str]);
    }

    return back()->with('result', $str);
}
Can someone explain how the package works, or suggest a way to stream the file?
Thank you
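For what it's worth, one low-risk change is to hand the client the uploaded temp file's path on disk rather than the UploadedFile object; the underlying vimeo/vimeo client uploads from a file path (reportedly in chunks via the resumable tus protocol), so memory use should stay well below the file size. A minimal sketch of the relevant lines, under that assumption:

// Resolve the temp file's path on disk and pass that to the client,
// so nothing forces the whole video into PHP's memory at once.
$path = $request->file('video_name')->getRealPath();
$result = Vimeo::upload($path);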

Related

Encoding failed during converting and saving video using php-ffmpeg in laravel

I'm using the https://github.com/PHP-FFMpeg/PHP-FFMpeg package to convert user video uploads to mp4 format; note that I'm using the Laravel framework.
But I'm getting an error and I'm not really sure what I'm doing wrong, because all it says is that encoding failed. I attached an image of the errors; you can see it at the bottom.
This is my function which handles user uploads:
public function upload()
{
    $this->putVideoIntoStorage();

    $this->ffmpeg->saveVideo(
        $this->storageManager->getAbsolutePathOf($this->video->getClientOriginalName())
    );

    return $this->saveVideoIntoDatabase();
}
This is the function which handles saving and converting the video to mp4 using php-ffmpeg:
public function saveVideo(String $path)
{
    $ffmpeg = FFMpeg::create([
        'ffmpeg.binaries'  => config('services.ffmpeg.ffmpeg_path'),
        'ffprobe.binaries' => config('services.ffmpeg.ffprobe_path'),
        'timeout'          => 3600, // The timeout for the underlying process
        'ffmpeg.threads'   => 12,   // The number of threads that FFMpeg should use
    ]);

    $video = $ffmpeg->open($path);
    $video->save(new X264(), 'video.mp4');
}
This is the error I'm getting: (error screenshot attached)
I can provide more details if you need them, just ask. I would be really glad if someone could help me through this.
Here's the proper way to do it:
$format = new FFMpeg\Format\Video\X264();
$format->setAudioCodec("libmp3lame");
$video->save($format, '/path/to/new/file');
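Folded into the saveVideo() method from the question, the fix would look roughly like this (a sketch; only the format object changes):

public function saveVideo(String $path)
{
    $ffmpeg = FFMpeg::create([
        'ffmpeg.binaries'  => config('services.ffmpeg.ffmpeg_path'),
        'ffprobe.binaries' => config('services.ffmpeg.ffprobe_path'),
        'timeout'          => 3600,
        'ffmpeg.threads'   => 12,
    ]);

    $video = $ffmpeg->open($path);

    // In some php-ffmpeg versions X264 defaults to the libfdk_aac audio
    // codec, which many ffmpeg builds are compiled without; forcing MP3
    // (or AAC) audio is a common fix for a bare "Encoding failed" error.
    $format = new FFMpeg\Format\Video\X264();
    $format->setAudioCodec('libmp3lame');

    $video->save($format, 'video.mp4');
}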

Laravel Download from S3 To Local

I am trying to download a file that I stored on S3 to my local Laravel installation so I can manipulate it. I would appreciate some help.
I have the config data set up correctly, because I am able to upload without any trouble. I am saving files in S3 with the pattern "user->id / media->id.mp3" --> note that I am not just dumping files at the bucket root; I am saving them in directories.
After successfully uploading the file to S3, I store that same path, "user->id / media->id.mp3", in my DB rather than some long public URL (is that wrong?).
When I later go back and try to download the file, I get a FileNotFoundException from S3. I'm doing this:
$audio = Storage::disk('s3')->get($media->location);
The weird thing is that the exception shows the resource it cannot fetch, but when I place that same URL in a browser it displays the file without any trouble at all. Why can't the filesystem get the file?
I have tried to do a "has" check before the "get" and the has check comes up false.
Do I need to save the full public URL in the database for this to work? I tried that and it didn't help. I feel like I am missing something very simple and it is making me crazy!!
Late answer, but important for others:
// Read the file from S3...
$s3_file = Storage::disk('s3')->get(request()->file);

// ...and write it to the local "public" disk.
$local = Storage::disk('public');
$local->put("./file_name.tif", $s3_file);
$s3_file contains the file contents; you can write them to a local file using Laravel's put method, and you will find the file in the storage/app/public directory.
You can set the Content-Type as desired and the Content-Disposition to 'attachment', because the files are coming from S3 and you want them downloaded as an attachment.
$event_data = $this->ticket->where('user_id', $user_id)->first();
$data = $event_data->pdf;
$get_ticket = 'tickets/' . $data;
$file_name = "YOUR_DESIRED_NAME.pdf";

$headers = [
    'Content-Type'        => 'application/pdf',
    'Content-Disposition' => 'attachment; filename="' . $file_name . '"',
];

return \Response::make(Storage::disk('s3')->get($get_ticket), 200, $headers);
Say you have AWS S3 as your default storage,
and you want to download my_file.txt from S3 to my_laravel_project\storage\app\my_file.txt,
and you want to make it a one-liner:
Storage::disk('local')->put('my_file.txt', Storage::get('my_file.txt'));
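For larger files, a streaming variant of the same idea keeps memory use low. On recent Laravel versions the Storage facade exposes Flysystem's readStream/writeStream, so a sketch along these lines should copy the object in chunks rather than holding it in memory as a string:

// Stream the object from S3 and write it to the local disk in chunks.
$stream = Storage::disk('s3')->readStream('my_file.txt');
Storage::disk('local')->writeStream('my_file.txt', $stream);

if (is_resource($stream)) {
    fclose($stream);
}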

How do I upload big (video) files in streams to AWS S3 with Laravel 5 and filesystem?

I want to upload a big video file to my AWS S3 bucket. After a good many hours, I finally managed to configure my php.ini and nginx.conf files so they allow bigger files.
But then I got a "Fatal Error: Allowed Memory Size of XXXXXXXXXX Bytes Exhausted". After some time I found out that larger files should be uploaded as streams using fopen(), fwrite(), and fclose().
Since I'm using Laravel 5, the filesystem takes care of much of this. Except that I can't get it to work.
My current ResourceController#store looks like this:
public function store(ResourceRequest $request)
{
    /* Prepare data */
    $resource = new Resource();
    $key = 'resource-' . $resource->id;
    $bucket = env('AWS_BUCKET');
    $filePath = $request->file('resource')->getRealPath();

    /* Open & write stream */
    $stream = fopen($filePath, 'w');
    Storage::writeStream($key, $stream, ['public']);

    /* Store entry in DB */
    $resource->title = $request->title;
    $resource->save();

    /* Success message */
    session()->flash('message', $request->title . ' uploadet!');

    return redirect()->route('resource-index');
}
But now I get this long error:
CouldNotCreateChecksumException in SignatureV4.php line 148:
A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Stream\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.
So I am currently completely lost. I can't figure out if I'm even on the right track. Here are the resources I am trying to make sense of:
AWS SDK guide for PHP: Stream Wrappers
AWS SDK introduction on stream wrappers
Flysystem original API on stream wrappers
And just to confuse me even more, there seems to be another way to upload large files other than streams: the so-called "multipart" upload. I actually thought that was what the streams were all about...
What is the difference?
I had the same problem and came up with this solution.
Instead of using
Storage::put('file.jpg', $contents);
which of course ran into an "out of memory" error, I used this method:
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

// ...

public function uploadToS3($fromPath, $toPath)
{
    $disk = Storage::disk('s3');

    $uploader = new MultipartUploader($disk->getDriver()->getAdapter()->getClient(), $fromPath, [
        'bucket' => Config::get('filesystems.disks.s3.bucket'),
        'key'    => $toPath,
    ]);

    try {
        $result = $uploader->upload();
        echo "Upload complete";
    } catch (MultipartUploadException $e) {
        echo $e->getMessage();
    }
}
Tested with Laravel 5.1
Here are the official AWS PHP SDK docs:
http://docs.aws.amazon.com/aws-sdk-php/v3/guide/service/s3-multipart-upload.html
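For completeness, calling it from a controller could look roughly like this (a usage sketch; the 'resource' input name and the generated key are placeholders, not part of the original answer):

public function store(Request $request)
{
    // Upload straight from the temporary file on disk; the
    // MultipartUploader reads it in parts, so memory use stays low.
    $fromPath = $request->file('resource')->getRealPath();
    $toPath   = 'resource-' . uniqid() . '.mp4';

    $this->uploadToS3($fromPath, $toPath);

    return redirect()->route('resource-index');
}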
The streaming part applies to downloads.
For uploads you need to know the content size, and for large files multipart uploads are the way to go.
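As a side note, newer Laravel/Flysystem versions also let you pass a PHP stream resource to put(), which streams the upload without going through the SDK directly; a sketch, assuming the same 'resource' file input as in the question:

// Pass a read-only stream instead of the file contents, so the file
// is not buffered in PHP's memory before the upload.
$stream = fopen($request->file('resource')->getRealPath(), 'r');
Storage::disk('s3')->put($key, $stream);

if (is_resource($stream)) {
    fclose($stream);
}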

uploading image file in dropbox using dropbox api in php

How can I upload image files to Dropbox with the Jimscode PHP Dropbox component for CodeIgniter? It uses the add() method for uploading files. It successfully uploads all other file types (e.g. pdf) to Dropbox, but when I upload image files it uploads empty files: they are 0 bytes and the images cannot be previewed in Dropbox. My code:
public function add($dbpath, $filepath, array $params = array(), $root = self::DEFAULT_ROOT)
{
    $dbpath = str_replace(' ', '%20', $dbpath);
    $filename = rawurlencode($filepath);
    $parstr = empty($params) ? '' : '&' . http_build_query($params);

    $uri = "/files_put/{$root}/{$dbpath}?file={$filename}{$parstr}";
    //$uri = reduce_double_slashes("/files/{$root}/{$dbpath}?file={$filename}{$parstr}");

    $specialhost = 'api-content.dropbox.com';

    return $this->_post_request($uri, array('file' => '#' . $filepath), $specialhost);
}
Is there an API method I can use directly with curl and PHP?
I'm not sure what version you are using, but the reduce_double_slashes line isn't normally commented out. That wouldn't cause some file types to upload and not others, though. Does the failure happen with a single image, or does it fail with every image?
You can also set the DEBUG constant in the library to true and have it write debug information to your error log. That might help you figure out where the issue is happening.
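To answer the curl question: the v1 files_put endpoint used above has since been retired, but with the current v2 API an image upload can be done from plain PHP with curl. A rough sketch, assuming $token holds a valid access token and $filepath a local image path:

$ch = curl_init('https://content.dropboxapi.com/2/files/upload');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => array(
        'Authorization: Bearer ' . $token,
        // The target path and upload behaviour go in this JSON header.
        'Dropbox-API-Arg: ' . json_encode(array(
            'path'       => '/uploads/' . basename($filepath),
            'mode'       => 'add',
            'autorename' => true,
        )),
        'Content-Type: application/octet-stream',
    ),
    // The raw file bytes are the request body.
    CURLOPT_POSTFIELDS     => file_get_contents($filepath),
));
$response = curl_exec($ch);
curl_close($ch);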

How do you upload an image to Amazon S3 using wideimage

I am using the Amazon SDK for PHP and wideimage. I am resizing an image with wideimage and trying to then upload that resized image to Amazon S3.
$resized = $image->resize($width, $height);

// upload
$response = $s3->create_object($myBucket, $newFilename, array(
    'fileUpload' => $resized, // this does not work
));
Does anyone know the proper way to do this?
You can use a stream wrapper together with WideImage's saveToFile method. There are many stream wrappers for S3; this is one example: https://github.com/jakajancar/S3StreamWrapper.
You don't need to save the image to disk and then upload it from there.
When you resize the image, you have to convert it to a string. You can do that with the WideImage class.
Example:
$image = WideImage::load($_FILES["file"]['tmp_name']);
$resized = $image->resize(1024);
$data = $resized->asString('jpg');
And then when you're uploading on Amazon, you have to use the param 'body' instead of 'fileUpload'.
Example:
$response = $s3->create_object($myBucket, $newFilename, array(
    'body' => $data,
));
I hope that helps.
I would like to point out a few things that might help someone make a choice.
First of all, I think you are better off with what you are already doing: resize the image on your own server first, then move it to Amazon. Even if there were some way to resize and upload the image on the fly in a single step, the script would perform slowly, because it would have to resize the image and save it to a far-away destination. That would be a minor cost for a few images, but it becomes a problem with heavy resizing, even on high-speed bandwidth, and PHP cannot release the resources used for image resizing until the target image has been completely saved.
Second, if you are using a CDN (Content Delivery Network), keep in mind that a CDN uses a PULL SERVER technique: we do not push static content to the CDN server; instead, when a user/client asks for static content, the CDN first checks its own servers and, if the content is not found, asks our main server for it.
Amazon S3 is not a true CDN. S3 was designed for content storage; the correct Amazon service for content delivery is Amazon CloudFront. Saving files ourselves to a storage server or CDN is what is called PUSH SERVER.
A thorough article can be read at http://www.binarymoon.co.uk/2010/11/timthumb-cdn-amazon-s3-good/. It is actually about TimThumb, but it is worth reading.
I ended up saving the file to the server and then uploading it from there. If there is a better way, please let me know.
