I want to upload an external image to my remote host via PHP. That is, I want code which copies http://www.exaple.com/123.jpg to http://www.mysite.com/image.jpg.
I can't use the exec function, because it has been disabled by the web host.
Best regards
Google is your friend:
<?php exec("wget http://www.exaple.com/123.jpg"); exec("mv 123.jpg image.jpg")?>
(this code is to be executed on the server www.mysite.com)
You can run a PHP script to fetch the file and save it on your site.
<?php
$data = file_get_contents('http://www.exaple.com/123.jpg');
if ($data !== false) {
    file_put_contents('image.jpg', $data); // write the downloaded bytes to image.jpg
} else {
    // error in fetching the file
}
file_get_contents is binary-safe. You can find more about it on the PHP website: http://php.net/manual/en/function.file-get-contents.php
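If allow_url_fopen is also disabled on the host, the same idea can be sketched with cURL (assuming the cURL extension is available; the URLs are the ones from the question):
<?php
// Sketch: copy a remote image using cURL instead of file_get_contents.
$ch = curl_init('http://www.exaple.com/123.jpg');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects, just in case
$data = curl_exec($ch);
curl_close($ch);

if ($data !== false) {
    file_put_contents('image.jpg', $data);
}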
This question has been asked before, but none of the answers worked for me.
I use the following code to copy a file directly from a remote server to my server:
<?php
set_time_limit(0); // unlimited max execution time
$remote_file_url = $_GET['url'];
$ext  = pathinfo($remote_file_url, PATHINFO_EXTENSION);
$name = basename($remote_file_url); // already contains the extension, if there is one
if ($ext) {
    $local_file = 'download/'.$name;
} else {
    $local_file = 'download/'.$name.'.mp4'; // no extension in the URL: assume mp4
}
$copy = copy($remote_file_url, $local_file);
if (!$copy) {
    echo "Doh! Failed to copy $remote_file_url...\n";
} else {
    echo "WOOT! Copied $remote_file_url successfully...\n";
}
?>
It works well, but it doesn't copy the files I get from YouTube. I use the 1-Click Youtube Video Downloader extension for Firefox, which gives me direct links to YouTube videos. I can use these direct links in a browser, and in Internet Download Manager as well.
For example the direct url of
https://www.youtube.com/watch?v=xPXrJwQ5lqQ
is
https://r6---sn-ab5l6nzy.googlevideo.com/videoplayback?ipbits=0&requiressl=yes&sparams=dur,ei,expire,id,initcwndbps,ip,ipbits,ipbypass,itag,lmt,mime,mip,mm,mn,ms,mv,pl,ratebypass,requiressl,source&ei=3DNOWfq4CImGc9rxvcgO&signature=3D188D073D872381433A45462E84928383D10D02.4E0AF7D777E76AA19A576D42983A81F4E62EF84D&lmt=1472135086539955&mime=video%2Fmp4&ratebypass=yes&id=o-ABaoUEn3pBt5SLXdWXlrzCdteMLfLPizrRTPoakDoLSX&expire=1498318908&source=youtube&dur=119.211&itag=22&pl=20&ip=162.217.31.128&key=cms1&redirect_counter=1&req_id=ce038b9993a9a3ee&cms_redirect=yes&ipbypass=yes&mip=159.203.89.210&mm=31&mn=sn-ab5l6nzy&ms=au&mt=1498297234&mv=m
The problem is that my code can't copy this file to my server. I would like to know if there is any way to resolve such URLs.
The error is
failed to open stream: HTTP request failed! HTTP/1.1 403 Forbidden in /home/...
Thanks in advance.
Well, I have no idea why that happened. (Could the link have expired? I hope not.) I just tried another link for the above video (copied using right-click) in your code as the $remote_file_url, and it worked as expected.
How did I get that link?
I used YouTube-Downloader, the library underlying the 1-Click Youtube Video Downloader extension (the extension uses it internally); this way you have more control over the process. After hosting its files on your web server, simply run index.php, and when you use it you'll get something like a page of direct download links for the video.
Then you can automate this last part to suit your needs.
That doesn't mean all videos can be downloaded smoothly with this method, because some videos have signature issues or were only recently uploaded; here's the list of issues of Youtube-Downloader.
For that there is a fix that is somewhat involved: youtube-dl-php. It is based on a sound principle: there is a very good command-line utility for downloading YouTube videos called youtube-dl; here is the download page.
Basically, you'll just call it from PHP. Note that the youtube-dl binary must be installed and on the PATH for the following to work.
After you install Composer, go to your web project folder and run composer require norkunas/youtube-dl-php, as explained on the GitHub page.
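For reference, basic usage looks roughly like this; this is a sketch based on the library's README at the time, so the exact method names may differ between versions, and the download path is a placeholder:
<?php
require 'vendor/autoload.php';

use YoutubeDl\YoutubeDl;

// Sketch of the wrapper's v1-era API (method names may have changed since).
$dl = new YoutubeDl([
    'format' => 'mp4', // assumption: request an mp4 variant from youtube-dl
]);
$dl->setDownloadPath('/var/www/download'); // hypothetical target directory

$video = $dl->download('https://www.youtube.com/watch?v=xPXrJwQ5lqQ');
echo $video->getTitle();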
When running the library's own example, I got an error:
proc_open() 267 CreateProcess failed
I've never dealt with Symfony before, and I found it particularly interesting to play with YoutubeDl.php: I redefined the $arguments passed to createProcess, commented out many of the less useful configuration options to get rid of that error, and gave it more time to run with
ini_set('max_execution_time', 300);
And yikes, it was downloaded.
You don't have to follow this unless you can't figure out a better way. It is just supposed to give you an idea of where the problem lies, if you haven't figured it out already, and if you have that problem in the first place.
private function createProcess(array $arguments = [])
{
    array_unshift($arguments, $this->binPath ?: 'youtube-dl');
    // Hack: hard-code the command instead of building it from $arguments.
    $process = new Process("youtube-dl https://www.youtube.com/watch?v=nDMwW41AlSI");
    /*$process->setEnv(['LANG' => 'en_US.UTF-8']);
    $process->setTimeout($this->timeout);
    $process->setOptions($this->processOptions);
    if ($this->moveWithPhp) {
        $cwd = sys_get_temp_dir();
    } else {
        $cwd = $this->downloadPath ?: sys_get_temp_dir();
    }
    $process->setWorkingDirectory($cwd);*/
    return $process;
}
Or you can just write your own code that calls youtube-dl, good luck!
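For example, a minimal sketch that shells out to youtube-dl directly (this assumes the binary is installed and that shell_exec is enabled on the host; the output path is a placeholder):
<?php
// Sketch: call the youtube-dl CLI directly from PHP.
$url    = 'https://www.youtube.com/watch?v=xPXrJwQ5lqQ';
$target = '/var/www/download/%(title)s.%(ext)s'; // youtube-dl output template

$cmd = 'youtube-dl -f mp4 -o ' . escapeshellarg($target) . ' ' . escapeshellarg($url);
echo shell_exec($cmd . ' 2>&1'); // capture stderr too, to see any errors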
I want to store some data retrieved using an API on my server. Specifically, these are .mp3 files of (free) learning tracks. I'm running into a problem, though. The mp3 link returned from the request doesn't point to a plain .mp3 file; instead it triggers an ADDITIONAL API call which would normally prompt you to download the mp3 file.
file_put_contents doesn't seem to like that. The mp3 file is empty.
Here's the code:
$id    = $_POST['cid'];
$title = $_POST['title'];
if (!file_exists("tags/".$id."_".$title)) {
    mkdir("tags/".$id."_".$title);
} else {
    echo "Dir already exists";
}
file_put_contents("tags/{$id}_{$title}/all.mp3", fopen($_POST['all'], 'r'));
And here is an example of the second API I mentioned earlier:
http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts
Is there some way to bypass this intermediate step? If there's no way to access the direct URL of the mp3, is there a way to redirect the file download prompt to my server?
Thank you in advance for your help!
EDIT
Here is the current snippet. I should be echoing something, correct?
$handle = fopen("http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts", 'rb');
$contents = stream_get_contents($handle);
echo $contents;
Because this echoes nothing.
SOLUTION
OK, I guess file_get_contents is supposed to handle redirects just fine, but this wasn't happening. So I found this function: https://stackoverflow.com/a/4102293/2723783 to return the final redirect of the API. I plugged that URL into file_get_contents and voila!
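For reference, the same idea can be sketched with cURL, which follows the redirect chain and can report the effective (final) URL; this is an illustration, not the exact function from that answer:
<?php
// Sketch: let cURL follow the API's redirects and grab the final payload.
$ch = curl_init('http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
$data = curl_exec($ch);
$finalUrl = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL); // where the chain ended
curl_close($ch);

file_put_contents('all.mp3', $data); // $data already holds the final response body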
You seem to be just opening the file handle without reading its contents using fread() or a similar function:
http://www.php.net/manual/en/function.fread.php
$handle = fopen($_POST['all'], 'rb');
file_put_contents("tags/{$id}_{$title}/all.mp3", stream_get_contents($handle));
I've got two servers. One for the files and one for the website.
I figured out how to upload the files to that server but now I need to show the thumbnails on the website.
Is there a way to go through the folder /files on the file server and display a list of those files on the website using PHP?
I've searched for a while now but can't find the answer.
I tried using scandir([URL]) but that didn't work.
I'm embarrassed to say this but I found my answer at another post:
PHP directory list from remote server
function get_text($filename) {
    $content = ''; // initialize, so .= below doesn't append to an undefined variable
    $fp_load = fopen("$filename", "rb");
    if ($fp_load) {
        while (!feof($fp_load)) {
            $content .= fgets($fp_load, 8192);
        }
        fclose($fp_load);
        return $content;
    }
}

$matches = array();
preg_match_all("/(a href\=\")([^\?\"]*)(\")/i", get_text('http://www.xxxxx.com/my/cool/remote/dir'), $matches);
foreach ($matches[2] as $match) {
    echo $match . '<br>';
}
scandir will not work on any server but your own. If you want to do such a thing while keeping the files on a separate server, your best bet is a PHP file on the website and a PHP file on the file server: the file-server script prints the file data to the screen, and the website script reads that data in. Example:
Webserver:
<?php
$filedata = file_get_contents("url to file handler php");
?>
Fileserver:
<?php
echo "info you want webserver to read";
?>
This can also be customized to your needs with POST and GET requests.
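A sketch of that pattern for a directory listing, assuming a files.php endpoint on the file server (the endpoint name and the /files path are illustrative):
Fileserver (files.php):
<?php
// Print the /files listing as JSON, skipping . and ..
echo json_encode(array_values(array_diff(scandir('/files'), array('.', '..'))));
?>
Webserver:
<?php
// Fetch and decode the listing, then point thumbnails at the file server.
$files = json_decode(file_get_contents('http://files.example.com/files.php'), true);
foreach ($files as $file) {
    echo '<img src="http://files.example.com/files/' . rawurlencode($file) . '" width="100">';
}
?>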
I used the following method:
I created a script which goes through all the files on the file server.
$fileList = glob($dir."*.*");
This is only possible if the script is actually on the file server; it would be rather strange to go through files on another server without having access to it. There is a way to do this without having access (read my other answer), but it is very slow and not very handy.
I know I said that I didn't have access, but I had. I just wanted to know all the possibilities.
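For completeness, a sketch of that file-server-side script (the directory path is illustrative):
<?php
// Sketch: list every file in the directory and print its name.
$dir = '/files/';
$fileList = glob($dir . '*.*');
foreach ($fileList as $file) {
    echo basename($file) . '<br>';
}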
I have a small piece of PHP code that creates a QR code from a URL and stores it on a web server to use later:
function generate($content, $format = "png")
{
$encoded = urlencode($content);
$url = "http://www.esponce.com/api/v3/generate?content=$encoded&format=$format";
return $url;
}
$qr = generate("http://www.mywebsite.com");
$data = file_get_contents($qr);
$saved = file_put_contents('qr/my_qr.png', $data);
When I execute the script on my local computer, the QR code is stored correctly in the /qr folder. When I upload the script to the webserver and execute it, the file my_qr.png is created and stored in the /qr folder, BUT the file is empty (0kb).
I thought it was caused by the permissions of the /qr folder, but setting the permissions to 777 made no difference. Can anyone explain to me how I can solve this?
Other solutions to create and store QR codes are welcome too.
Many thanks.
Steven
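An empty file usually means file_get_contents() returned false or an empty string, often because allow_url_fopen is disabled on the web server. A hedged sketch of a cURL-based fallback ($qr is the URL built by generate() in the question):
<?php
// Sketch: fetch the QR image with cURL, which does not depend on allow_url_fopen.
$ch = curl_init($qr);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$data = curl_exec($ch);
if ($data === false || $data === '') {
    echo 'Download failed: ' . curl_error($ch);
} else {
    file_put_contents('qr/my_qr.png', $data);
}
curl_close($ch);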
I have a PHP script that uploads files to an FTP server from a location with a very low-bandwidth, low-reliability connection. I am currently using the FTP functions of PHP.
Sometimes the connection drops in the middle of a transfer. Is there a way to later resume the upload? How can this be done?
Edit:
People have misunderstood this as happening from a browser. However, that is not the case: it is a PHP CLI script. So it is about a local file being uploaded to FTP, with no user interaction.
Try getting the remote file's size and then issuing the APPE FTP command with the difference. This will append to the file. See http://webmasterworld.com/forum88/4703.htm for an example. If you want real control over this, I recommend using PHP's cURL functions.
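A sketch of that cURL approach (host, credentials, and paths are placeholders; it assumes a partial copy of the file may already be on the server):
<?php
$local  = '/path/to/local.bin';
$remote = 'ftp://user:pass@ftp.example.com/remote.bin';

// Ask for the remote size without downloading the body (cURL issues SIZE).
$ch = curl_init($remote);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_exec($ch);
$uploaded = (int) curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
curl_close($ch);

// Seek past the part that already arrived and append the rest (APPE).
$fp = fopen($local, 'rb');
fseek($fp, max(0, $uploaded));

$ch = curl_init($remote);
curl_setopt($ch, CURLOPT_UPLOAD, true);
curl_setopt($ch, CURLOPT_FTPAPPEND, true); // append instead of overwrite
curl_setopt($ch, CURLOPT_INFILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);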
I think all the previous answers are wrong. PHP manages the resume itself: for that, FTP_AUTOSEEK must be activated, and you have to pass FTP_AUTORESUME during upload. The code should be something like this:
$ftp_connect = ftp_connect($host, $port);
ftp_set_option($ftp_connect, FTP_AUTOSEEK, true); // let PHP seek to the resume offset
$stream = fopen($local_file, 'rb');               // binary-safe read mode
$upload = ftp_fput($ftp_connect, $file_on_ftp, $stream, FTP_BINARY, FTP_AUTORESUME);
fclose($stream);
ftp_put and ftp_fput have a $startpos parameter. You should be able to do what you need using this parameter (it starts the transfer from the given offset, i.e. the size of the file already on your server). However, I have never used it, so I don't know about its reliability. You should try.
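A sketch of that idea (untested, as the answer says; variable names are placeholders): read the size already on the server with ftp_size(), seek the local stream to that offset, and pass it as $startpos:
<?php
// Sketch: resume an upload by starting at the size already present remotely.
$startpos = ftp_size($ftp_connect, $file_on_ftp); // -1 if the remote file doesn't exist
if ($startpos < 0) {
    $startpos = 0;
}
$stream = fopen($local_file, 'rb');
fseek($stream, $startpos); // skip the part the server already has
$ok = ftp_fput($ftp_connect, $file_on_ftp, $stream, FTP_BINARY, $startpos);
fclose($stream);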
EDIT:
Another Example: by cballou
Look here. A very simple and good example.
You could do something like this:
<?php
$host   = 'your.host.name';
$user   = 'user';
$passwd = 'yourpasswd';

$ftp_stream = ftp_connect($host);
//EDIT
ftp_set_option($ftp_stream, FTP_AUTOSEEK, 1);

if (ftp_login($ftp_stream, $user, $passwd)) {
    //activate passive mode
    //ftp_pasv($ftp_stream, TRUE);
    $ret = ftp_nb_put($ftp_stream, $remotefile, $localfile, FTP_BINARY, FTP_AUTORESUME);
    while (FTP_MOREDATA == $ret) {
        // continue transfer
        $ret = ftp_nb_continue($ftp_stream);
    }
    if (FTP_FINISHED !== $ret) {
        echo 'Failure occurred on transfer ...';
    }
} else {
    print "FAILURE while login ...";
}
//free
ftp_close($ftp_stream);
?>
Do you mean that you are going to let the user resume the upload from your web interface?
Does this suit you?
http://www.webmasterworld.com/php/3931706.htm
Or just use FileZilla, which can do everything for you (I mean the uploading process): it will keep the data, and after you reopen it you will be able to continue the process.
For automation and tracking, we simply use git and capistrano.
There is also an option for PHP; try googling it.