Resolve relative URLs of YouTube using PHP

This question has been asked before, but none of the answers worked for me.
I use the following code to directly copy a file from a remote server to my server:
<?php
set_time_limit(0); // unlimited max execution time
$remote_file_url = $_GET['url'];
// basename() already includes the extension when there is one,
// so there is no need to append the pathinfo() extension again
$name = basename($remote_file_url);
$local_file = 'download/' . $name;
$copy = copy($remote_file_url, $local_file);
if (!$copy) {
    echo "Doh! Failed to copy $remote_file_url...\n";
} else {
    echo "WOOT! Successfully copied $remote_file_url...\n";
}
?>
It works well, but it doesn't copy the files I get from YouTube. I use the 1-Click Youtube Video Downloader extension for Firefox, which gives me a direct link to YouTube videos. I can use these direct links in a browser and in Internet Download Manager as well.
For example the direct url of
https://www.youtube.com/watch?v=xPXrJwQ5lqQ
is
https://r6---sn-ab5l6nzy.googlevideo.com/videoplayback?ipbits=0&requiressl=yes&sparams=dur,ei,expire,id,initcwndbps,ip,ipbits,ipbypass,itag,lmt,mime,mip,mm,mn,ms,mv,pl,ratebypass,requiressl,source&ei=3DNOWfq4CImGc9rxvcgO&signature=3D188D073D872381433A45462E84928383D10D02.4E0AF7D777E76AA19A576D42983A81F4E62EF84D&lmt=1472135086539955&mime=video%2Fmp4&ratebypass=yes&id=o-ABaoUEn3pBt5SLXdWXlrzCdteMLfLPizrRTPoakDoLSX&expire=1498318908&source=youtube&dur=119.211&itag=22&pl=20&ip=162.217.31.128&key=cms1&redirect_counter=1&req_id=ce038b9993a9a3ee&cms_redirect=yes&ipbypass=yes&mip=159.203.89.210&mm=31&mn=sn-ab5l6nzy&ms=au&mt=1498297234&mv=m
The problem is my code can't copy this file to my server. I would like to know if there is any way to resolve such URLs.
The error is
failed to open stream: HTTP request failed! HTTP/1.1 403 Forbidden in /home/...
Thanks in advance.

Well, I have no idea why that happened. (Could it be expired? I hope not.) I just managed to try another link for the above video (copy the link using right click) in your code as the $remote_file_url, and it worked as expected.
How did I get that link?
I've used the underlying library, YouTube-Downloader, which the 1-Click Youtube Video Downloader extension is built on; this way you have more control over the process. Host its files on your web server, then simply run the index.php, and when you use it you'll get something like a page of direct download links.
Then you can automate this last part to suit your needs.
That doesn't mean that all videos can be downloaded smoothly with this method: videos with signature issues, or very recently uploaded ones, may fail. Here's the list of issues of YouTube-Downloader.
For that there is a fix that is somewhat involved: youtube-dl-php. It is based on a sound principle: there is a very good command-line utility for downloading YouTube videos called youtube-dl; here is the download page.
Basically, you'll just call it from PHP. Note that youtube-dl must be installed and on the path for the following to work.
After you install Composer, go to your web project folder and run composer require norkunas/youtube-dl-php, as explained on the GitHub page.
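For reference, a minimal usage sketch based on the library's README at the time (the API may have changed in later versions; the download path and URL are placeholders):
require 'vendor/autoload.php';
use YoutubeDl\YoutubeDl;

// Sketch of the v1-era norkunas/youtube-dl-php API; check the README
// of the version you installed, as the interface has evolved since.
$dl = new YoutubeDl([
    'continue' => true,
    'format'   => 'best',
]);
$dl->setDownloadPath(__DIR__ . '/download'); // placeholder path

try {
    $video = $dl->download('https://www.youtube.com/watch?v=xPXrJwQ5lqQ');
    echo $video->getTitle();   // metadata parsed from youtube-dl's output
    $file = $video->getFile(); // \SplFileInfo pointing at the downloaded file
} catch (\Exception $e) {
    echo 'Download failed: ' . $e->getMessage();
}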
When running its example, I got an error:
proc_open() 267 CreateProcess failed
I've never dealt with Symfony before, and I found it was enough to play with YoutubeDl.php: redefine the $arguments passed to createProcess, comment out many of the less useful configuration options to get rid of that error, and give it more time to run with
ini_set('max_execution_time', 300);
And yikes, it was downloaded.
You don't have to follow this unless you can't figure out a better way. It is just meant to give you an idea of where the problem lies, if you haven't figured it out, and if you have that problem in the first place.
private function createProcess(array $arguments = [])
{
    array_unshift($arguments, $this->binPath ?: 'youtube-dl');
    $process = new Process("youtube-dl https://www.youtube.com/watch?v=nDMwW41AlSI");
    /*
    $process->setEnv(['LANG' => 'en_US.UTF-8']);
    $process->setTimeout($this->timeout);
    $process->setOptions($this->processOptions);
    if ($this->moveWithPhp) {
        $cwd = sys_get_temp_dir();
    } else {
        $cwd = $this->downloadPath ?: sys_get_temp_dir();
    }
    $process->setWorkingDirectory($cwd);
    */
    return $process;
}
Or you can just write your own code that calls youtube-dl, good luck!
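For instance, a minimal sketch that shells out to the binary directly (assuming youtube-dl is installed and on the server's PATH; the output template is youtube-dl's own %(title)s syntax):
$url = 'https://www.youtube.com/watch?v=xPXrJwQ5lqQ';
$cmd = 'youtube-dl -o ' . escapeshellarg('download/%(title)s.%(ext)s')
     . ' ' . escapeshellarg($url); // escapeshellarg() guards against shell injection
exec($cmd . ' 2>&1', $output, $status);
echo $status === 0
    ? "Downloaded.\n"
    : "youtube-dl failed:\n" . implode("\n", $output);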

Related

Counting the pages in a PDF using PHP

I am writing a test to count the number of pages in PDFs stored in a folder on a server. I have been able to get it to work on my local machine, but I am unable to get it to work on a remote file.
This is my code that works on local files:
require_once 'C:\..\application\libraries\fpdi\fpdf.php';
require_once 'C:\..\application\libraries\fpdi\fpdi.php';
$pathToFile = 'C:\Users\..\Desktop\filename.pdf';
$pdf = new FPDI();
$pageCount = $pdf->setSourceFile($pathToFile);
echo $pageCount;
But if I change the $pathToFile to a link on a remote server, I get an error message.
I tried this:
$pdfname = 'http://../filename.pdf';
$pdftext = file_get_contents($pdfname);
$num = preg_match_all('/\/Page\W/', $pdftext, $dummy);
echo 'Num: ' . $num;
But again, when I use a local file, it works fine, but the remote file gives me an error (failed to open stream: HTTP request failed! HTTP/1.1 403 Forbidden).
From searching online, it seems like this is a common error. I've seen code that uses curl, but it makes no sense to me and I can't get it to work either. I saw code to use pdfinfo, but the link in that post goes to another site.
I don't want to have to download anything, so using something like Imagick is not an option either.
All I'm looking for is a simple page number from a file on a remote server. Any help would be much appreciated.
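One hedged workaround, assuming the 403 comes from the host rejecting PHP's default stream wrapper (which sends no User-Agent): fetch the PDF with cURL and reuse the same regex. The URL below is a placeholder.
// Sketch: fetch the remote PDF with cURL, sending a browser-like
// User-Agent, then count /Page objects as in the question.
$ch = curl_init('http://example.com/filename.pdf');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_USERAGENT      => 'Mozilla/5.0',
]);
$pdftext = curl_exec($ch);
curl_close($ch);

$num = preg_match_all('/\/Page\W/', $pdftext, $dummy);
echo 'Num: ' . $num;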

Use file_get_contents to download multiple files at the same time in PHP?

UPDATE: THIS IS AN S3 BUCKET QUESTION (SEE ANSWER)
I am working with some code that reads files from an S3 bucket, using file_get_contents to download one file at a time.
Start:
file_get_contents('s3://file1.json')
Wait until it finishes, then start the next download:
file_get_contents('s3://file2.json')
Instead, I want them all to start at once to save time, like this:
Start both at the same time:
file_get_contents('s3://file1.json')
file_get_contents('s3://file2.json')
Then wait for both to finish.
I have seen multi curl requests, but nothing on this topic for file_get_contents. Is it possible?
EDIT: Currently the code I am looking at uses s3:// which doesn't seem to work with curl. This is a way of getting to Amazon's S3 bucket.
EDIT2: Sample of current code :
function get_json_file( $filename = false ){
    if (!$filename) return false;
    // builds s3://somefile.on.amazon.com/file.json
    $path = $this->get_json_filename( $filename );
    if (!$filename || !file_exists($path)) {
        return false;
    } elseif (file_exists($path)) {
        $data = file_get_contents($path);
    } else {
        $data = false;
    }
    return ( empty( $data ) ? false : json_decode( $data, true ) );
}
ANSWER: S3 SECURED, REQUIRES SPECIAL URI
Thank you guys for responding in the comments. I was WAY off base with this earlier today when I asked this question.
This is the deal: the version of PHP we are using does not allow for threading. Hence, to request multiple URLs you need to use the multi curl option. The s3:// paths worked through an S3 wrapper class file that I had missed before; hence the weird naming.
ALSO, IMPORTANT: if you don't care about protecting the data on S3, you can just make it public and NOT have this issue. In my case the data needs to be somewhat protected, so that requires a bunch of things in the URI.
You can use Amazon's S3 class to generate a secure link from the URI and the bucket name on S3. This returns the proper URL to use with your bucket. The S3 class can be downloaded manually or installed via Composer (in Laravel, for example). It requires that you create a user with an access key in the AWS console.
$bucketName = "my-amazon-bucket";
$uri = "/somefile.txt";
$secure_url = $s3->getObjectUrl($bucketName, $uri, '+5 minutes');
This will generate a valid URL to access the file on S3 which can be used with curl etc..
https://domain.s3.amazonaws.com/bucket/filename.zip?AWSAccessKeyId=myaccesskey&Expires=1305311393&Signature=mysignature
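To then fetch several of those presigned URLs at once, a hedged sketch with curl_multi (the $urls array stands in for getObjectUrl() results):
// Sketch: download two presigned S3 URLs in parallel with curl_multi.
$urls = [$secure_url1, $secure_url2]; // placeholders for getObjectUrl() results
$mh = curl_multi_init();
$handles = [];
foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}
// drive all transfers until every handle has finished
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);
$results = [];
foreach ($handles as $i => $ch) {
    $results[$i] = curl_multi_getcontent($ch); // the downloaded body
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);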

Failed to Send File to another computer on the same network using PHP

We're making a food-list site with an Admin who has the privilege to "Add Food" and "Edit Food", which includes uploading images. But all our attempts to send the food image to the computer we're going to use as a server have failed. It'd be weird to copy the image over by hand every time we edit or add something.
Our site just needs to send a file from our computer to another computer's "xampp\htdocs\sitepage\images" folder. We're on the same network. Here's our code.
$my_file = 'C:\Users\ASUS\Downloads\MyFile.jpg';
/* New file name and path for this file */
$destination_file = '192.168.1.105/sitepage/images/INeedSomeRest.jpg';
/* Copy the file from source to server */
$copy = copy( $my_file, $destination_file );
/* Add notice for success/failure */
if( !$copy ) {
    echo "It didn't work.";
} else {
    echo "WOOT! Successfully copied the file.";
}
It always fails, and the browser shows an error like:
" failed to open stream: No such file or directory in ~our php file~ "
Is there something we're doing wrong? That being said, is there another way to send a file to another computer via PHP?
A response will be greatly appreciated. Thank you in advance.
It seems I took the "copy" method of PHP all wrong: it can only pull a file from another server to your machine, not push one the other way around. To the people who have the same problem as me: read up on file transfer protocols. =) I appreciate all the replies. Thank you.
1) Try to use $destination_file = 'http://192.168.1.105/sitepage/images/INeedSomeRest.jpg'; (note the http://)
2) Make sure that you CAN write to that resource (enough rights, shared properly, etc.)
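As the asker concluded, copy() can't push a file onto another machine by itself. If the target computer runs an FTP server, a hedged sketch with PHP's FTP extension would do the transfer (host, credentials, and remote path here are assumptions):
// Sketch: upload over FTP instead of copy(). Credentials are placeholders.
$conn = ftp_connect('192.168.1.105');
if ($conn && ftp_login($conn, 'username', 'password')) {
    ftp_pasv($conn, true); // passive mode is friendlier to firewalls
    $ok = ftp_put(
        $conn,
        'sitepage/images/INeedSomeRest.jpg',  // remote path under the FTP root
        'C:\Users\ASUS\Downloads\MyFile.jpg', // local source file
        FTP_BINARY
    );
    echo $ok ? "Uploaded." : "Upload failed.";
    ftp_close($conn);
} else {
    echo "Could not connect or log in.";
}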

How can I change the home directory of some code?

I found some PHP online (a one-page file manager with no permission system) that I find really awesome; it suits my current needs. However, I'm having some issues changing the working (default) directory.
I got the script from a GitHub project that is no longer maintained. It is a single-file PHP file manager with no permissions, no databases, etc. I already have a user accounts system and would like to change the working directory based on an existing database variable, but I can't seem to find a way to change the directory.
Currently the script is uploaded to /home/advenacm/public_html/my/ (the file is /home/advenacm/public_html/my/files.php). From what I can tell, the PHP uses a cookie to determine the working directory, but I can't find a way to set a custom directory. I want to use '/home/advenacm/public_html/my/'.$userdomain;, which would resolve to something like /home/advenacm/public_html/my/userdomain.com/.
What I would like to do is set the default (or "home") directory so that the file manager cannot access the root directory, only a specified subfolder.
Something like directory = "/home/advenaio/public_html/directory/" is the best way to explain it. I've tried a number of approaches, but nothing seems to work.
I've taken the liberty of uploading my code to Pastebin with PHP syntax highlighting. Here is the snippet of PHP that I believe is choosing the working directory (lines 19-29):
$tmp = realpath($_REQUEST['file']);
if ($tmp === false)
    err(404, 'File or Directory Not Found');
if (substr($tmp, 0, strlen(__DIR__)) !== __DIR__)
    err(403, "Forbidden");
if (!$_COOKIE['_sfm_xsrf'])
    setcookie('_sfm_xsrf', bin2hex(openssl_random_pseudo_bytes(16)));
if ($_POST) {
    if ($_COOKIE['_sfm_xsrf'] !== $_POST['xsrf'] || !$_POST['xsrf'])
        err(403, "XSRF Failure");
}
I appreciate any help anyone can offer me and would like to thank anyone in advance for even taking the time to look at my question.
Have you tried the chdir() function?
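For illustration, a minimal sketch, assuming $userdomain comes from your accounts system (the value here is hypothetical):
$userdomain = 'userdomain.com'; // hypothetical value from the database
chdir('/home/advenacm/public_html/my/' . $userdomain); // relative paths now resolve here
echo getcwd(); // /home/advenacm/public_html/my/userdomain.com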
later edit
Updating my answer based on your edited question.
The main problem is line 30:
$file = $_REQUEST['file'] ?: '.';
That needs to be a full, real path to the file, and it has to be compared against your user's 'home'.
And you should use the same path for the checks at line 19.
So you can replace lines 19-30 with:
$user_home = __DIR__ . "/{$userdomain}";
$file = $_REQUEST['file'] ?: $user_home; // you might have to prepend $userdomain to $_REQUEST['file']; the format isn't visible from the HTML
$file = realpath($file); // realpath of $file, not $_REQUEST['file'], so the $user_home default is honoured
if ($file === false) {
    err(404, 'File or Directory Not Found');
}
if (strpos($file, $user_home) !== 0) {
    err(403, "Forbidden");
}
if (!$_COOKIE['_sfm_xsrf']) {
    setcookie('_sfm_xsrf', bin2hex(openssl_random_pseudo_bytes(16)));
}
if ($_POST) {
    if ($_COOKIE['_sfm_xsrf'] !== $_POST['xsrf'] || !$_POST['xsrf'])
        err(403, "XSRF Failure");
}
Although this might solve your question, I think the entire script is a poorly written solution.

Download file to server using API (it triggers prompt)

I want to store some data retrieved using an API on my server. Specifically, these are .mp3 files of (free) learning tracks. I'm running into a problem, though. The mp3 link returned from the request isn't a direct link to an .mp3 file; instead it makes an ADDITIONAL API call which would normally prompt you to download the mp3 file.
file_put_contents doesn't seem to like that: the mp3 file ends up empty.
Here's the code:
$id = $_POST['cid'];
$title = $_POST['title'];
if (!file_exists("tags/".$id."_".$title)) {
    mkdir("tags/".$id."_".$title);
} else {
    echo "Dir already exists";
}
file_put_contents("tags/{$id}_{$title}/all.mp3", fopen($_POST['all'], 'r'));
And here is an example of the second API I mentioned earlier:
http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts
Is there some way to bypass this intermediate step? If there's no way to access the direct URL of the mp3, is there a way to redirect the file download prompt to my server?
Thank you in advance for your help!
EDIT
Here is the current snippet. I should be echoing something, correct?
$handle = fopen("http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts", 'rb');
$contents = stream_get_contents($handle);
echo $contents;
Because this echoes nothing.
SOLUTION
Ok, I guess file_get_contents is supposed to handle redirects just fine, but that wasn't happening here. So I found this function: https://stackoverflow.com/a/4102293/2723783 to return the final redirect of the API. I plugged that URL into file_get_contents and voila!
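A hedged sketch of that approach using cURL instead (CURLINFO_EFFECTIVE_URL reports the final URL after redirects; the request URL is the example from the question, and $id/$title are as above):
$ch = curl_init('http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true, // follow the intermediate redirect(s)
]);
$data = curl_exec($ch);
$finalUrl = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL); // the resolved mp3 URL
curl_close($ch);
file_put_contents("tags/{$id}_{$title}/all.mp3", $data);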
You seem to be just opening the file handler and not getting the contents using fread() or another similar function:
http://www.php.net/manual/en/function.fread.php
$handle = fopen($_POST['all'], 'rb');
file_put_contents("tags/{$id}_{$title}/all.mp3", stream_get_contents($handle));
