How to download an .m4v video from a given URL with Laravel? I have a URL, for example rtmp://123456.r.cdnsun.net/_definst_/mp4:123456/h264/123456.m4v ... Can you recommend a package for Laravel which could make the process easier?
It depends on what exactly you're trying to achieve. For example, if you just want to copy a file from a remote server, you could use copy(). Note, though, that PHP's built-in stream wrappers cover http:// and https:// but not rtmp://, so an RTMP URL like yours would need a dedicated tool (e.g. an rtmpdump-style client) rather than a plain copy:
$remoteFile = 'rtmp://123456.r.cdnsun.net/_definst_/mp4:123456/h264/123456.m4v';
$localFile = storage_path('movies/123456.m4v'); // include a filename, not just a directory
copy($remoteFile, $localFile);
You can use file_put_contents to download a video from a URL:
$video_url = 'https://example.com/videos/video_name.mp4'; // the remote file URL (hypothetical)
$video_name = "video_name.mp4";
$video_path = public_path('videos').'/'.$video_name; // path where you want to store the video
file_put_contents($video_path, fopen($video_url, 'r'));
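If you'd rather stay inside Laravel's own filesystem layer, the Storage facade accepts a stream. A minimal sketch, assuming an HTTP(S) URL (the URL and disk name here are placeholders):
use Illuminate\Support\Facades\Storage;
$url = 'https://example.com/videos/123456.m4v'; // hypothetical HTTP URL
// Stream the remote file onto the configured "local" disk.
Storage::disk('local')->put('videos/123456.m4v', fopen($url, 'r'));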
Related
I want to download a proxy list with this API:
https://api.proxyscrape.com?request=getproxies&proxytype=http&timeout=5000&country=US&anonymity=elite&ssl=yes
How can I do it in PHP with cURL?
When you open this URL, it will automatically download a text file with proxies.
I want to download it and place it in the current directory.
The easiest way would be using the file_get_contents() function built into PHP.
So to download and save the file, you would do something like this:
$proxies = fopen("proxies.txt", "w");
fwrite($proxies, file_get_contents('https://api.proxyscrape.com?request=getproxies&proxytype=http&timeout=5000&country=US&anonymity=elite&ssl=yes'));
fclose($proxies);
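Since the question asks for cURL specifically, here's an equivalent sketch; CURLOPT_FILE makes cURL write the response body straight into the open file handle:
$fp = fopen('proxies.txt', 'w'); // destination in the current directory
$ch = curl_init('https://api.proxyscrape.com?request=getproxies&proxytype=http&timeout=5000&country=US&anonymity=elite&ssl=yes');
curl_setopt($ch, CURLOPT_FILE, $fp);            // write the body directly to the file
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow any redirects
curl_exec($ch);
curl_close($ch);
fclose($fp);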
I'm using the AWS PHP SDK to save images on S3. The files are saved privately. I'm then showing the image thumbnails using the S3 file URL in my web application, but since the files are private, the images are displayed as corrupt.
When the user clicks the name of a file, a modal opens to show the file at a larger size, but the file is displayed as corrupt there as well, due to the same issue.
Now, I know there are two ways to make this work: 1. Make the files public. 2. Generate pre-signed URLs for the files. But I cannot go with either option due to the requirements of my project.
My question is: is there any third way to resolve this issue?
I'd highly advise against this, but you could create a script on your own server that pulls the image via the API, caches it, and serves it. You can then restrict access however you like without making the images public.
Example pass through script:
$headers = get_headers($realpath); // $realpath being wherever the file really is
foreach ($headers as $header) {
    header($header); // forward the original response headers to the client
}
$filename = basename($realpath); // derive a filename from the path
// These lines if it's a download you want to force
// header('Content-Description: File Transfer');
// header("Content-Disposition: attachment; filename={$filename}");
$file = fopen($realpath, 'r');
fpassthru($file); // stream the file straight to the client
fclose($file);
exit;
This will barely "touch the sides" and shouldn't delay the appearance of your files too much, but it's still going to take some resources and bandwidth.
You will need to access the files through a script on your server. That script will do some kind of authentication to make sure the request is valid and you want the requester to see the file. It will then fetch the file from S3 using an IAM profile that can access the private files, and output the file to the browser.
Instead of requesting the file from S3, request it from
http://www.yourdomain.com/fetchimages.php?key=8498439834
Then here is some pseudocode for fetchimages.php:
<?php
// if authorized to get this image
$key = $_GET['key'];
// validate that the key is in the proper format
// get the S3 URL from a database based on $key
// connect to S3 securely and read the file from S3
// output the file
?>
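To make that concrete, here's a minimal sketch using the official AWS SDK for PHP; the bucket name, region, and the key-to-object mapping are all assumptions you'd replace with your own:
<?php
require 'vendor/autoload.php';
use Aws\S3\S3Client;
$key = $_GET['key'] ?? '';
if (!preg_match('/^\d+$/', $key)) { // validate the key format
    http_response_code(400);
    exit;
}
// Hypothetical lookup: map the request key to an S3 object key,
// e.g. via a database query.
$objectKey = 'images/' . $key . '.jpg';
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // assumed region
]);
$result = $s3->getObject([
    'Bucket' => 'my-private-bucket', // assumed bucket name
    'Key'    => $objectKey,
]);
header('Content-Type: ' . $result['ContentType']);
echo $result['Body'];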
As far as I know, you could try to make your S3 bucket a "web server" like this, but then you would probably have to make the files public. If you have some kind of logic to restrict access, you could create a bucket policy.
I have this URL
www1.intranet.com/reportingtool.asp?settings=var&export=ok
There I can download a report. The file name of the report includes a timestamp, e.g. 123981098298.xls, and varies every time I download it.
I want a script that does the following:
<?php
//Download the File
//rename it to **report.xls**
//save it to a specified place
?>
I don't have any idea after searching Stack Overflow and Googling on this topic :(
Is this generally possible?
The simplest scenario
You can download the report with file_get_contents:
$report = file_get_contents('http://www1.intranet.com/reportingtool.asp?...');
And save it locally (on the machine where PHP runs) with file_put_contents:
file_put_contents('/some/path/report.xls', $report);
More options
If downloading requires control over the HTTP request (e.g. because you need to use cookies or HTTP authentication), then it has to be done through cURL, which enables full customization of the request.
If the report is large, it could be streamed directly to the destination instead of doing read/store/write in three steps (for example, using fopen/fread/fwrite).
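For example, a streaming sketch along those lines (assuming the report URL needs no cookies or authentication):
<?php
$src  = fopen('http://www1.intranet.com/reportingtool.asp?settings=var&export=ok', 'r');
$dest = fopen('/some/path/report.xls', 'w');
// Copy in chunks so the whole report never sits in memory at once.
while (!feof($src)) {
    fwrite($dest, fread($src, 8192));
}
fclose($src);
fclose($dest);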
This may not work depending on your security settings, but it's a simple example:
<?php
$file = file_get_contents('http://www1.intranet.com/reportingtool.asp?settings=var&export=ok');
file_put_contents('/path/to/your/location/report.xls', $file);
See file_get_contents and file_put_contents.
There is a file located on a server. Let's call it "Movie".
My site's users need to download this movie, but the movie can only be downloaded by my website's IP.
Is there a way to download Movie to my server and from there to the user?
I can do that with file_put_contents, but I need the client to download the file WHILE my server downloads it.
Thanks!
You can make use of the standard wrappers PHP offers and combine them with stream_copy_to_stream:
Example / Demo:
<?php
$url  = 'http://example.com/some.url';
$src  = fopen($url, 'r');            // open the remote file for reading
$dest = fopen('php://output', 'w');  // write straight to the response body
$bytesCopied = stream_copy_to_stream($src, $dest);
Related (couldn't find a better duplicate so far):
Remotely download a file from an external link to my server - download stops prematurely
Take a look at the readfile function.
http://php.net/manual/en/function.readfile.php
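A minimal sketch, assuming allow_url_fopen is enabled (the URL and filename are placeholders); readfile fetches the remote file and echoes it to the client in one call:
<?php
header('Content-Type: video/mp4');
header('Content-Disposition: attachment; filename="movie.mp4"');
readfile('http://example.com/movie.mp4'); // reads the remote file and writes it to the output buffer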
I want to be able to upload an image, or just paste the URL of an image, in order to upload it as a profile picture for my website's users.
The point is, I don't want to store the URL; I want to have a copy of that image on my server, because if the external image is ever lost, I don't want to lose it either...
I believe Facebook and Tumblr etc. do so... What's the PHP script or best practice to do that?
Thanks!
You can get the contents (bytes) of the image using PHP's file_get_contents function (http://php.net/manual/en/function.file-get-contents.php):
$contents = file_get_contents('http://www.google.com/images/logos/ps_logo2.png');
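Then write those bytes out to keep your own copy (the destination path here is an assumption):
file_put_contents('/path/to/avatars/ps_logo2.png', $contents); // store a local copy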
You can use the cURL library as well... Here's an example of a downloadImageFromUrl function which downloads an image from a URL and saves it to a local location:
function downloadImageFromUrl($imageLinkURL, $saveLocationPath) {
    $channel = curl_init();
    curl_setopt($channel, CURLOPT_URL, $imageLinkURL);
    curl_setopt($channel, CURLOPT_POST, 0);           // plain GET request
    curl_setopt($channel, CURLOPT_RETURNTRANSFER, 1); // return the bytes instead of echoing them
    $fileBytes = curl_exec($channel);
    curl_close($channel);
    $fileWriter = fopen($saveLocationPath, 'w');
    fwrite($fileWriter, $fileBytes);
    fclose($fileWriter);
}
You can use this as follows:
downloadImageFromUrl("http://www.google.com/images/logos/ps_logo2.png", "/tmp/ps_logo2.png")
You can also keep the image's original name by parsing it out of the URL.
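For example, a small sketch with parse_url and basename:
$url  = 'http://www.google.com/images/logos/ps_logo2.png';
$name = basename(parse_url($url, PHP_URL_PATH)); // "ps_logo2.png"
downloadImageFromUrl($url, '/tmp/' . $name);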
I think this is what you are looking for: Copy Image from Remote Server Over HTTP
You can use a function called imagecreatefromjpeg or similar. It takes a URL or path to the image and creates a new image resource from it. Have a look at http://php.net/manual/en/function.imagecreatefromjpeg.php
There are different functions for the different extensions, though (if you prefer going this route). You may need to check the image extension from the URL and use the appropriate function, I suppose.
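A sketch of that GD route, assuming the remote file really is a JPEG; note that this decodes and re-encodes the image, so it's lossier than a straight byte copy:
$image = imagecreatefromjpeg('http://example.com/photo.jpg'); // hypothetical URL
imagejpeg($image, '/path/to/local/photo.jpg');                // re-encode and save locally
imagedestroy($image);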
Handling uploads is covered in this documentation, and if users paste a URL, I'd recommend using file_get_contents to save a copy of the image to your server; then you can simply store the path to that local image rather than the external URL.