Check if a URL offers a downloadable file? - php

I am currently working on some affiliate feeds, most of which are offered as raw .csv data. I am using file_get_contents() together with fputcsv() to generate .csv files.
Unfortunately, one of my affiliate URLs instantly downloads a .csv file when you visit it in the browser. That file needs no further work since it is a perfect .csv file as is.
Since I just put my URLs in an array, I need to detect when a URL is offered as a download link. How can I check for this so I can skip all my default .csv logic and not mess that file up?
I don't know what to search for, since I don't know what exactly happens when a file downloads straight away instead of showing raw CSV data. Hopefully somebody can help me out.

You can check whether a file is downloadable (i.e. the URL responds successfully) using cURL:
function checkDownloadable($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    // HEAD-style request: we only need the response status, not the body.
    curl_setopt($ch, CURLOPT_NOBODY, true);
    // Make curl_exec() fail on HTTP error codes (>= 400).
    curl_setopt($ch, CURLOPT_FAILONERROR, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

    $result = curl_exec($ch);
    curl_close($ch);

    return $result !== false;
}
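Note that the check above only tells you whether the URL responds at all; it does not tell you whether the server forces a download. What makes the browser download a file instead of rendering it is usually a Content-Disposition: attachment response header, so you can inspect the headers for it. A minimal sketch of that idea (checkIsAttachment is just an illustrative name, not a library function):
function checkIsAttachment($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request: headers only
    curl_setopt($ch, CURLOPT_HEADER, true);          // include headers in the returned string
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);

    $headers = curl_exec($ch);
    curl_close($ch);

    if ($headers === false) {
        return false;
    }

    // "Content-Disposition: attachment" is what triggers the browser download.
    return stripos($headers, 'content-disposition: attachment') !== false;
}
If this returns true for one of your affiliate URLs, you can save the response body as-is and skip your fputcsv() post-processing.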

Related

Save mp3 file from a download link on your hosting space using PHP

I am fetching data from the API of a service provider (say, http://serviceprovider.com).
One of the several parameters returned is an MP3 download link (example: http://serviceprovider.com/storage/read?uid=475b68f2-a31b-40f8-8dfc-5af791a4d5fa_1_r.mp3&ip=255.255.255.255&dir=recording).
When I open this download link in my browser, it saves the file to my local PC.
Now my problem:
I want to save this MP3 file in a folder on my hosting space, from where I can play it using the jPlayer audio player.
I have tried file_get_contents(), but nothing happened.
Thanks in advance.
Edit:
After reading Ali's answer I tried the following code, but it is still not working fully.
// Open a file, to which contents should be written to.
$fp = fopen("downloadk.mp3", "w");
$url = 'http://serviceprovider.com/storage/read?uid=475b68f2-a31b-40f8-8dfc-5af791a4d5fa_1_r.mp3&ip=255.255.255.255&dir=recording';
$handle = curl_init($url);
// Tell cURL to write contents to the file.
curl_setopt($handle, CURLOPT_FILE, $fp);
curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
curl_setopt($handle, CURLOPT_HEADER, false);
// Do the request.
$data = curl_exec($handle);
// Clean up.
curl_close($handle);
fwrite($fp, $data);
fclose($fp);
This created the download.mp3 file on my server, but with 0 bytes, i.e. empty.
The URL used here is an example download link, not an MP3 file that can be played directly in a modern browser.
The file_get_contents() function is primarily meant for reading local files (it can only fetch URLs when allow_url_fopen is enabled). What you have is a URL, and in order to fetch its contents you need to make an HTTP request from your script. PHP comes with the cURL extension, which provides a stable library of functions for making HTTP requests:
http://php.net/manual/en/book.curl.php
Using curl to download your file could be done like this:
// Open a file, to which contents should be written to.
$downloadFile = fopen("download.mp3", "w");
$url = "http://serviceprovider.com/storage/read?uid=475b68f2-a31b-40f8-8dfc-5af791a4d5fa_1_r.mp3&ip=255.255.255.255&dir=recording";
$handle = curl_init($url);
// Tell cURL to write contents to the file.
curl_setopt($handle, CURLOPT_FILE, $downloadFile);
// Follow redirects.
curl_setopt($handle, CURLOPT_FOLLOWLOCATION, true);
// Do the request.
curl_exec($handle);
// Clean up.
curl_close($handle);
fclose($downloadFile);
You should probably add some error checking.
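As a hedged sketch of what that error checking might look like (checking the return value of curl_exec() and the HTTP status code; $url is the same download link as above):
$downloadFile = fopen("download.mp3", "w");
$handle = curl_init($url);
curl_setopt($handle, CURLOPT_FILE, $downloadFile);
curl_setopt($handle, CURLOPT_FOLLOWLOCATION, true);

if (curl_exec($handle) === false) {
    // The request itself failed (DNS error, timeout, connection refused, ...).
    echo "cURL error: " . curl_error($handle);
} elseif (curl_getinfo($handle, CURLINFO_HTTP_CODE) >= 400) {
    // The server answered, but with an error status, so the file is not valid.
    echo "HTTP error " . curl_getinfo($handle, CURLINFO_HTTP_CODE);
}

curl_close($handle);
fclose($downloadFile);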

php curl return downloaded content to user while file is still downloading

I want the server to start downloading a big file, but output the file content to the user while it is still downloading. I tried this code:
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_TIMEOUT, 155000);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$response = curl_exec($ch); // get curl response
echo $response;
But this code takes a long time. I want to use curl instead of readfile.
See this answer: Manipulate a string that is 30 million characters long
Modifying the MyStream class there should be enough to let you just echo the results to the browser. Assuming the browser is already downloading the file, it should just keep downloading it.
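Another option, if you would rather stay with cURL alone, is the CURLOPT_WRITEFUNCTION callback, which hands you each chunk of the response body as it arrives so you can echo and flush it to the client immediately. A minimal sketch (assuming the appropriate Content-Type/Content-Disposition headers have already been sent to the user):
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
// Called repeatedly with each chunk of the body as cURL receives it.
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) {
    echo $chunk;            // pass the chunk straight through to the client
    flush();                // push it out instead of buffering the whole file
    return strlen($chunk);  // tell cURL how many bytes were handled
});
curl_exec($ch);
curl_close($ch);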

How to download all images from a website?

I want to download multiple images from the following website:
www.bbc.co.uk
I want to do it using PHP cURL; can someone point me in the right direction?
It would be nice to download all the images in one shot, but if someone can help me download even just one or a handful, that would be great!
Edit: it would be a good idea to show what I have tried:
<?php
$image_url = "www.bbc.co.uk";
$ch = curl_init();
$timeout = 0;
curl_setopt ($ch, CURLOPT_URL, $image_url);
curl_setopt ($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
// Getting binary data
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
$image = curl_exec($ch);
curl_close($ch);
// output to browser
header("Content-type: image/jpeg");
print $image;
?>
For some reason it is not working. I should note that I am an absolute amateur at PHP and programming in general.
The above code you pasted isn't doing what you think it is.
$image = curl_exec($ch);
The $image variable doesn't contain any image; it actually contains the entire HTML of that webpage as a string.
If you replace
// output to browser
header("Content-type: image/jpeg");
print $image;
with:
var_dump($image);
You will see the HTML of the page instead of an image.
Try to find the actual champion image sources in that markup and parse them accordingly.
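As a hedged sketch of that approach (assuming the page uses plain <img> tags with absolute URLs; relative paths and lazy-loaded images would need extra handling):
// Fetch the page HTML (note: include the scheme in the URL).
$ch = curl_init("http://www.bbc.co.uk");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);

// Parse out every <img> tag.
$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings caused by real-world markup

foreach ($doc->getElementsByTagName('img') as $img) {
    $src = $img->getAttribute('src');
    // Keep the sketch simple: skip empty or relative sources.
    if ($src === '' || !preg_match('#^https?://#i', $src)) {
        continue;
    }

    // Download each image and save it under its own file name.
    $imgCh = curl_init($src);
    curl_setopt($imgCh, CURLOPT_RETURNTRANSFER, true);
    $data = curl_exec($imgCh);
    curl_close($imgCh);

    if ($data !== false) {
        file_put_contents(basename(parse_url($src, PHP_URL_PATH)), $data);
    }
}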
You just need the whole /Air/assets/images/champions folder right?
Nothing easier: in Firefox, with a download plug-in like "Download Them All", open the FTP folder (Eambo's link) where the champions' pictures are located, right-click and select the plug-in.
It will list all the files; select them all, or just the ones you need, and start the download.
Also, if you own that game you can take a look in this path:
\League of Legends\rads\projects\lol_air_client\releases\<newest-version>\deploy\assets\images\champions
Since you will probably use that in your university, CHECK THIS and I hope you will find a solution.

Grab frame without downloading whole file?

Is this possible using php + ffmpeg?
ffmpeg-php has the ability to grab frames from movie files and return them as images that can be manipulated using PHP's built-in image functions. This is great for automatically creating thumbnails for movie files.
I just don't want to download the whole file before doing so.
So let's say I want to grab a frame at 10% into the movie.
First, let's get the size of the remote file:
$ch = curl_init();
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_URL, $url); //specify the url
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
$head = curl_exec($ch);
$size = curl_getinfo($ch,CURLINFO_CONTENT_LENGTH_DOWNLOAD);
Then it's quite easy to download only the first 10% of the .flv or .mov file using cURL.
But will the frame-grab trick using ffmpeg-php even work, given that the partial file is effectively truncated/corrupted?
Any other ideas?
Yes, I believe this will work. For video files, as long as you have the start of the file, processing like this should be possible. (If you only had, for example, a chunk from the middle of the file, it probably wouldn't work.)
On the command line I downloaded the first part of an .FLV file with Curl, then grabbed frames using ffmpeg and it worked correctly. Doing the same in PHP should work as well.
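A hedged sketch of the partial download step, using CURLOPT_RANGE to request only the first 10% of the file (this assumes the server honours HTTP range requests and that $url and $size come from the snippet in the question); the frame grab would then run against partial.flv:
// Ask the server for only the first 10% of the file via an HTTP Range header.
$bytes = (int) floor($size * 0.10);

$fp = fopen("partial.flv", "w");
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RANGE, "0-" . ($bytes - 1)); // e.g. "0-1048575"
curl_exec($ch);
curl_close($ch);
fclose($fp);

// Then grab a frame from the truncated file, e.g. with the ffmpeg CLI:
//   ffmpeg -i partial.flv -ss 5 -frames:v 1 thumb.jpg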

Using cURL to save external files to my Server

I have a website that shows open-source movies and videos.
I have saved the URLs in MySQL and linked both the videos and the images to the content server.
But users are complaining that the website is slow, since the images are fetched from an external server, and most of the time Internet Explorer does not even display them.
I just learnt about cURL and would like to save the images as well as the videos to my own server and provide a mirror of the original website.
I found the command-line curl -O syntax in many places for doing this, but I don't know how to do the equivalent inside my PHP script.
In short:
I already have my form for saving URLs in MySQL. I want it to also save the file to a directory on my web server and store the file path in another MySQL column.
Any sort of help is welcome.
Thanks in advance.
$local_file = "/tmp/filename.flv";                  // the file where we save the download
$remote_file = "http://www.test.com/filename.flv";  // the file we are downloading

$fp = fopen($local_file, 'w+');
$ch = curl_init($remote_file);
curl_setopt($ch, CURLOPT_TIMEOUT, 50);
curl_setopt($ch, CURLOPT_FILE, $fp);          // write the response straight into the file
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_ENCODING, "");       // accept any supported Content-Encoding
curl_exec($ch);
curl_close($ch);
fclose($fp);
I've decided to update this answer almost 7 years later.
For those who have allow_url_fopen enabled (which lets copy() read remote URLs), you can simply use:
copy("http://www.test.com/filename.flv", "/some/local/path/filename.flv");
