Some time ago this code was working fine: I could download a file into a directory using copy(). It has stopped working; it no longer downloads anything and always creates a 0-byte file.
The code I'm using:
$video_url = 'https://api.zoom.us/rec/download/tJN4d7v5_Ts3HtzD4QSDVqJwW9XoJvms0nUbq_cPnRzhUCMAN1alZrVAN-AD8vw4clXzSccEqqZtfZw_';
$local_file = getcwd() ."/tmp/tmp_file.mp4";
copy($video_url, $local_file);
I have tried various ways to download and save but nothing helps.
Your $video_url returns a 302 HTTP response. Try this:
$src = 'https://api.zoom.us/rec/download/tJN4d7v5_Ts3HtzD4QSDVqJwW9XoJvms0nUbq_cPnRzhUCMAN1alZrVAN-AD8vw4clXzSccEqqZtfZw_';
$fileName = 'tmp_file.mp4';
$dest = getcwd() . DIRECTORY_SEPARATOR . $fileName;

$downloadLink = null;
$ch = curl_init($src);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // keep the response out of the output buffer
curl_exec($ch);
if (!curl_errno($ch)) {
    $info = curl_getinfo($ch);
    $downloadLink = $info['redirect_url']; // target of the 302 Location header
}
curl_close($ch);

if ($downloadLink) {
    copy($downloadLink, $dest);
}
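Alternatively, cURL can follow the 302 itself via CURLOPT_FOLLOWLOCATION, which avoids reading redirect_url manually and streams the body straight to disk. A minimal sketch using the same $src and $dest:

```php
<?php
$src  = 'https://api.zoom.us/rec/download/tJN4d7v5_Ts3HtzD4QSDVqJwW9XoJvms0nUbq_cPnRzhUCMAN1alZrVAN-AD8vw4clXzSccEqqZtfZw_';
$dest = getcwd() . DIRECTORY_SEPARATOR . 'tmp_file.mp4';

$fp = fopen($dest, 'wb');                        // stream the download straight to disk
$ch = curl_init($src);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow the 302 automatically
curl_setopt($ch, CURLOPT_MAXREDIRS, 5);
curl_setopt($ch, CURLOPT_FILE, $fp);             // write the response body to $fp
if (!curl_exec($ch)) {
    echo 'Download failed: ' . curl_error($ch);
}
curl_close($ch);
fclose($fp);
```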
You have to pass the access token in order to download the video recordings.
Here is the line you will update to pass the access token:
curl_setopt($ch, CURLOPT_URL, $downloadUrl . '?access_token=' . $accessToken);
Only the host is allowed to record and download the meetings; by passing the access token you allow the recordings to be downloaded for every user.
The access token is the token you generate from your JWT app.
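As a sketch of how that fits together (the URL and token here are placeholders, not real values):

```php
<?php
// Placeholder values: substitute your recording URL and the token from your JWT app
$downloadUrl = 'https://api.zoom.us/rec/download/XXXX';
$accessToken = 'YOUR_JWT_TOKEN';

// Append the token as a query parameter before handing the URL to cURL
$urlWithToken = $downloadUrl . '?access_token=' . urlencode($accessToken);

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $urlWithToken);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // the download URL redirects
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch);
curl_close($ch);
```

urlencode() matters if the token contains characters such as + or /.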
My link (URL) is different and does not work with the usual methods, I think because the site is rendered with JS or is an .aspx page. If you test my link in your browser you will see the download start, but it doesn't work in PHP. I have tested all the methods I know (fetch, curl, file, get, put), but none of them works.
I have a similar URL here: 'http://www.tsetmc.com/tsev2/data/ClientTypeAll.aspx?h=0&r=0'
I can open it in the browser and it downloads a CSV file. I need to do the same in PHP and save the CSV file on the server.
Here is what I have tried so far:
<?php
// fopen() requires a mode argument; in any case the next line overwrites $file
$file = fopen('http://www.tsetmc.com/tsev2/data/ClientTypeAll.aspx?h=0&r=0', 'r');
$file = file_get_contents('http://www.tsetmc.com/tsev2/data/ClientTypeAll.aspx?h=0&r=0');
file_put_contents('ClientTypeAll.csv', $file);
?>
I do not want the contents as a string; I want the CSV file from my link saved as a file. If you test my link in your browser, the download starts on your PC.
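One thing worth trying (a sketch, not guaranteed to work for every site): some servers reject requests that don't look like they come from a browser, so send a browser-like User-Agent with cURL and save the returned body:

```php
<?php
$url  = 'http://www.tsetmc.com/tsev2/data/ClientTypeAll.aspx?h=0&r=0';
$dest = 'ClientTypeAll.csv';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
// Pretend to be a regular browser; some .aspx endpoints reject PHP's default agent
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)');
$data = curl_exec($ch);

if ($data !== false && $data !== '') {
    file_put_contents($dest, $data); // save the CSV on the server
} else {
    echo 'Download failed: ' . curl_error($ch);
}
curl_close($ch);
```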
I ran this code with a remote PDF file:
<?php
$url = 'https://example.com/file.pdf';
$dir_name = 'storage-x'; // the directory to save into
if (!file_exists($dir_name)) {
mkdir($dir_name, 0777, true);
}
$path = $dir_name.'/'.rand().'.pdf';
$file = file_get_contents($url);
file_put_contents($path, $file);
?>
Please follow the steps below:
1. Get the file from the URL.
2. Set the directory, check that it exists, and update the directory access permissions if needed.
3. Set the new file path, name, and directory.
4. Save the file.
Please check the example below, which works for me.
<?php
$file = file_get_contents('https://example.com/file.pdf');
$dirName = 'storage-pdf';
if (!file_exists($dirName)) {
mkdir($dirName, 0777, true);
}
$newFilePath = $dirName.'/'.rand().'.pdf';
file_put_contents($newFilePath, $file);
?>
I have a subdomain sub.domain.com. The subdomain points to the directory root/sub under my web server's root.
I also have PDFs in another directory on the server, root/pdf.
How can I check whether a specific PDF exists, and if it does, copy it to a temp dir of the subdomain?
If I call a PHP script sub/check.php and try to check a PDF that exists:
$filename = "http://www.domain.com/pdf/1.pdf";
if (file_exists($filename)) {
    echo "exists";
} else {
    echo "not exists";
}
It always shows: not exists.
If I take the URL and put it in a browser, the PDF is shown.
How can my PHP script in the /sub folder access files in root or root/pdf?
bye jogi
The file_exists() function does not work that way: it does not accept remote URLs. It is used to check for files on the local file system.
Check the manual here.
Make use of cURL to accomplish this:
<?php
$ch = curl_init("https://www.google.co.in/images/srpr/logo4w.png"); // pass your PDF URL here
curl_setopt($ch, CURLOPT_NOBODY, true);            // we only need the status code, not the body
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_exec($ch);
$retcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($retcode == 200) {
    echo "exists";
} else {
    echo "not exists";
}
?>
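A shorter alternative, assuming allow_url_fopen is enabled in php.ini, is PHP's built-in get_headers(), which issues a request and returns the response headers:

```php
<?php
$url = "http://www.domain.com/pdf/1.pdf"; // the URL from the question

// The first element of the returned array is the status line, e.g. "HTTP/1.1 200 OK"
$headers = @get_headers($url);
if ($headers !== false && strpos($headers[0], '200') !== false) {
    echo "exists";
} else {
    echo "not exists";
}
```

Note that this still makes an HTTP request over the network.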
file_exists() looks for a file on the local machine, but what you are passing it is a URL.
Since your script is in the sub folder while the PDFs live in root/pdf, change
$filename = "http://www.domain.com/pdf/1.pdf";
into
$filename = dirname(__FILE__) . "/../pdf/1.pdf"; // go up one level from sub/ into pdf/
I am able to save images from a website using curl like so:
//$fullpath = "/images/".basename($img);
$fullpath = basename($img);
$ch = curl_init($img);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
$rawData = curl_exec($ch);
curl_close($ch);
if (file_exists($fullpath)) {
    unlink($fullpath);
}
$fp = fopen($fullpath, 'w+');
fwrite($fp, $rawData);
fclose($fp);
However, this only saves the image in the same folder as the PHP file that runs the save function. I'd like to save the images to a specific folder. I've tried using $fullpath = "/images/".basename($img); (the commented-out first line of my function), but this results in an error:
failed to open stream: No such file or directory
So my question is, how can I save the file on a specific folder in my project?
Another question I have is: how can I change the filename of the image I save in my folder? For example, I'd like to add the prefix siteimg_ to the image's filename. How do I implement this?
Update: I managed to solve the path problem after playing around with the code a bit more. Instead of using $fullpath = "/images/".basename($img);, I added a variable right before fopen and used it in the fopen call like so:
$path = "./images/";
$fp = fopen($path.$fullpath, 'w+');
Strangely, that worked. So now I'm down to one problem: renaming the file. Any suggestions?
File paths in PHP are server paths, and I doubt you have an /images folder at the root of your server.
Try constructing a path relative to the current PHP file, e.g., assuming there is an images folder in the same directory as your PHP script:
$path = __DIR__ . '/images/' . basename($img);
Also, you could try this much simpler approach:
$dest = __DIR__ . '/images/' . basename($img);
copy($img, $dest);
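For the renaming part of the question above (the siteimg_ prefix), you can simply prepend the prefix when building the destination path. A small sketch with a hypothetical image URL:

```php
<?php
$img = 'http://example.com/photos/picture.jpg'; // hypothetical source image URL

// basename() extracts "picture.jpg"; prepend the desired prefix to rename the copy
$dest = __DIR__ . '/images/' . 'siteimg_' . basename($img);

echo basename($dest); // prints "siteimg_picture.jpg"
// copy($img, $dest); // then download as before
```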
I have many .csv files on a website, and I want to download them automatically. On localhost there is a download folder; the files should be downloaded and saved there automatically, without any mouse or keyboard interaction, and then imported automatically from that folder.
The actual requirement: the website publishes a new .csv file every day, and every day the software should automatically download it and import it into a MySQL database, all without manual interaction.
How can I solve this? Please describe it in detail; I would really appreciate a solution to this problem.
Use the following PHP page in your localhost server:
csvsaver.php:
<?php
$saveLocation = 'files';
$urls = array(
    'http://yourwebsite/folder/hello.csv',
    'http://yourwebsite/csv1.csv'
);

$now = date('Y-m-d_H-i');
$dir = $saveLocation . '/' . $now;
if (!is_dir($dir)) mkdir($dir, 0777, true); // create the dated folder once, recursively

for ($i = 0; $i < count($urls); $i++) {
    $url  = $urls[$i];
    $file = $i . '_' . substr($url, strrpos($url, '/') + 1); // e.g. "0_hello.csv"
    $path = $dir . '/' . $file;

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $data = curl_exec($ch);
    curl_close($ch);

    file_put_contents($path, $data);
}
?>
If for example you put that in 'c:/wamp/www/csvSaver/csvsaver.php', you will also need to create a directory 'files' in it: c:/wamp/www/csvSaver/files/
When you run the above page, it will for example save the two files into:
c:/wamp/www/csvSaver/files/2013-02-27_09-45/0_hello.csv
c:/wamp/www/csvSaver/files/2013-02-27_09-45/1_csv1.csv
All you need to do then is set something up to run the PHP page daily. On Linux, set up a cron job; on Windows, set up a scheduled task that runs a .bat file. If that looks too complicated, just open the PHP page manually each day.
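For the import step, one approach (a sketch: the table name, columns, and connection details are assumptions, adjust them to your schema) is a second PHP script that reads each saved CSV with fgetcsv() and inserts the rows through PDO:

```php
<?php
// Hypothetical connection details: replace DSN, user, and password with your own
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$csv  = 'files/2013-02-27_09-45/0_hello.csv';  // a file saved by csvsaver.php
$stmt = $pdo->prepare('INSERT INTO my_table (col1, col2) VALUES (?, ?)');

if (($fh = fopen($csv, 'r')) !== false) {
    while (($row = fgetcsv($fh)) !== false) {
        $stmt->execute([$row[0], $row[1]]);    // one insert per CSV row
    }
    fclose($fh);
}
```

On Linux, a crontab line such as 0 9 * * * php /path/to/csvsaver.php runs the downloader every day at 09:00; the import script can be chained the same way.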
I put my files on my VPS, and users can download all of them directly, but I want to hide the actual file paths and generate time-limited download links. I googled it and found some solutions, but most of them were for files on the same server, and some required code on the VPS side; I can't run any PHP code on my VPS because it doesn't support PHP.
I also tried a script that worked well, but the generated link wasn't resumable and didn't show the file size until the download finished. How can I solve these problems?
You could use the mod_auth_token (http://code.google.com/p/mod-auth-token/) Apache module, if you are running Apache as your web frontend.
This is how you can handle the PHP side of the token generation process:
<?php
// Settings to generate the URI
$secret = "secret string"; // Same as AuthTokenSecret
$protectedPath = "/downloads/"; // Same as AuthTokenPrefix
$ipLimitation = false; // Same as AuthTokenLimitByIp
$hexTime = dechex(time()); // Time in Hexadecimal
//$hexTime = dechex(time()+120); // Link available after 2 minutes
$fileName = "/file_to_protect.txt"; // The file to access
// Let's generate the token depending if we set AuthTokenLimitByIp
if ($ipLimitation) {
    $token = md5($secret . $fileName . $hexTime . $_SERVER['REMOTE_ADDR']);
} else {
    $token = md5($secret . $fileName . $hexTime);
}

// We build the URL
$url = $protectedPath . $token . "/" . $hexTime . $fileName;
echo $url;
?>
If you can't make changes to the actual download links, they will stay available for download until they are deleted from the server.
Of course you can write a script that generates an encrypted download URL based on the system time, but once the user calls it within the valid time window, the script hands them the decrypted URL.