Fixing permissions for writing cUrl output to a text file - php

I am making a web application that grabs one page from the internet using cUrl and updates another page accordingly. I have been doing this by saving the HTML from cUrl and then parsing it on the other page. The issue is: I can't figure out what permissions I should use for the text file. I don't have it saved in my public_html folder, since I don't want any of the website's users to be able to see it. I only want them to be able to see the way it's parsed on the site.
Here is the cUrl code:
$perfidlist = "" ;
$sourcefile = "../templates/textfilefromsite.txt";
$trackerfile = "../templates/trackerfile.txt";
//CURL REQUEST 1 OF 2
$ch = curl_init("http://www.website.com");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
<more cUrl options omitted>
ob_start();
$curl2 = curl_exec($ch);
ob_end_clean();
curl_close($ch);
//WRITING FILES
$out = fopen($sourcefile, "r");
$oldfiletext = fread($out, filesize($sourcefile));
fclose($out);
$runcode = 1 ;
And the part where I save the text file:
/* only writing a file if the site has changed */
if (strcmp($oldfiletext, $curl2) !== 0)
{
    $out = fopen($sourcefile, "w");
    fwrite($out, $curl2);
    fclose($out);
    $tracker = fopen($trackerfile, "a+");
    fwrite($tracker, date('Y/m/d H:i:s')."\n");
    fclose($tracker);
    $runcode = 1;
}
I am receiving an error at that last '$out = fopen($sourcefile, "w");' part that says:
Warning: fopen(../templates/textfilefromsite.txt): failed to open stream: Permission denied in /usr/share/nginx/templates/homedir.php on line 72
Any ideas?

The issue was with file/folder permissions. I ended up changing the permissions for the file to '666' meaning '-rw-rw-rw-' and it worked.
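For anyone hitting the same thing, the equivalent from PHP looks like the sketch below (it only works if the script runs as the files' owner, e.g. from the CLI; running chmod 666 from a shell does the same). Note that 0666 makes the files world-writable; owning them as the web server user (often www-data or nginx) and using 0644 is the tighter option.
// Sketch: make the two files readable and writable by everyone (0666 = rw-rw-rw-).
// This only succeeds if the script runs as the files' owner, e.g. from the command line.
chmod('../templates/textfilefromsite.txt', 0666);
chmod('../templates/trackerfile.txt', 0666);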

Related

copy, file_get_contents, file_put_contents doesn't get full content in new file

I'm trying to download a file that exists on different server and move it to my new server.
I've tried
$file = file_get_contents("https://exampleurl.com/file/download.txt");
file_put_contents("C:/directory/to/report/data.csv", $file);
as well as:
$remote_file_url = "https://exampleurl.com/file/download.txt";
$local_file = "C:/directory/to/report/data.csv";
$copy = copy( $remote_file_url, $local_file );
But the file never completes; it cuts off towards the end of the file.
When I download the file directly from the URL, it's complete every time.
I'm looking for a way to make sure the file is downloaded completely.
Use a cURL request instead. file_get_contents() generally times out, which causes this sort of error, and only partial content gets loaded. Try this:
$ch = curl_init("https://exampleurl.com/file/download.txt"); // create a cURL session
$timeout = 300; // time limit in seconds; depends on the operations on the remote server, currently 5 minutes
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response as a string instead of printing it
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout); // timeout for establishing the connection
curl_setopt($ch, CURLOPT_TIMEOUT, $timeout); // timeout for the whole transfer
$content = curl_exec($ch); // execute the request
curl_close($ch);
if ($content !== false)
{
    $localfile = fopen("path to local file", "w"); // create or open the local file
    if (fwrite($localfile, $content)) // write the data received
        echo "success";
    else
        echo "unable to write file";
    fclose($localfile); // close the file
}
else // an error occurred while executing the request on the remote server
    echo "Request Failed";

Unable to fread output of a remote php file

I am using the output of a PHP file on a remote server to show content on my own website. I do not have access to modify files on the remote server.
The remote PHP file outputs JavaScript like this:
document.write('<p>some text</p>');
If I enter the url in a browser I get the correct output. E.g:
https://www.remote_server.com/files/the.php?param1=12
I can show the output of the remote file on my website like this:
<script type="text/javascript" src="https://www.remote_server.com/files/the.php?param1=12"></script>
But I would like to filter the output a bit before showing it.
Therefore I implemented a php file with this code:
function getRemoteOutput(){
$file = fopen("https://www.remote_server.com/files/the.php?param1=12","r");
$output = fread($file,1024);
fclose($file);
return $output;
}
When I call this function fopen() returns a valid handle, but fread() returns an empty string.
I have tried using file_get_contents() instead, but get the same result.
Is what I am trying to do possible?
Is it possible for the remote server to allow me to read the file via the browser, but block access from a php file?
Your variable $output is only holding the first 1024 bytes of the URL... (headers maybe?).
You will need to add a while-not-end-of-file loop to concatenate the entire remote file.
PHP reference: feof
You can learn a lot more in the PHP description for the fread function.
PHP reference: fread.
<?php
echo getRemoteOutput();
function getRemoteOutput(){
    $file = fopen("http://php.net/manual/en/function.fread.php", "r");
    $output = "";
    while (!feof($file)){ // while not the End Of File
        $output .= fread($file, 1024); // reads 1024 bytes at a time and appends to the variable as a string
    }
    fclose($file); // close the handle before returning, otherwise this line is never reached
    return $output;
}
?>
In regards to your questions:
Is what I am trying to do possible?
Yes this is possible.
Is it possible for the remote server to allow me to read the file via
the browser, but block access from a php file?
I doubt it.
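That said, if a server does differentiate, it is usually by inspecting request headers such as User-Agent or Referer. Purely as an experiment (a sketch; whether it helps depends entirely on how the remote server filters requests), you could send a browser-like User-Agent with cURL:
// Sketch only: fetch the remote script while presenting a browser-like User-Agent.
$ch = curl_init("https://www.remote_server.com/files/the.php?param1=12");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (X11; Linux x86_64) Gecko/20100101 Firefox/115.0');
$output = curl_exec($ch);
curl_close($ch);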
I contacted the support team for the site I was trying to connect to. They told me that they do prevent access from php files.
So that seems to be the reason for my problems, and apparently I just cannot do what I tried to do.
For what it's worth, here is the code I used to test the various methods to read file output:
<?php
//$remotefile = 'http://www.xencomsoftware.net/configurator/tracker/ip.php';
$remotefile = "http://php.net/manual/en/function.fread.php";

function getList1(){
    global $remotefile;
    $output = file_get_contents($remotefile);
    return htmlentities($output);
}

function getList2(){
    global $remotefile;
    $file = fopen($remotefile, "r");
    $output = "";
    while (!feof($file)){ // while not the End Of File
        $output .= fread($file, 1024); // reads 1024 bytes at a time and appends to the variable as a string
    }
    fclose($file);
    return htmlentities($output);
}

function getList3(){
    global $remotefile;
    $ch = curl_init(); // create curl resource
    curl_setopt($ch, CURLOPT_URL, $remotefile); // set url
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the transfer as a string
    $output = curl_exec($ch); // $output contains the output string
    curl_close($ch); // close curl resource to free up system resources
    return htmlentities($output);
}

function getList4(){
    global $remotefile;
    $r = new HttpRequest($remotefile, HttpRequest::METH_GET);
    try {
        $r->send();
        if ($r->getResponseCode() == 200) {
            $output = $r->getResponseBody();
        }
    } catch (Exception $e) {
        echo 'Caught exception: ', $e->getMessage(), "\n";
    }
    return htmlentities($output);
}

function dumpList($ix, $list){
    $len = strlen($list);
    echo "<p><b>--- getList$ix() ---</b></p>";
    echo "<div>Length: $len</div>";
    for ($i = 0; $i < 10; $i++) {
        echo "$i: $list[$i] <br>";
    }
    // echo "<p>$list</p>";
}

dumpList(1, getList1()); // doesn't work! You cannot include/require a remote file.
dumpList(2, getList2());
dumpList(3, getList3());
dumpList(4, getList4());
?>

Downloading File from a URL using PHP script

Hi, I want to download some 250 files from a URL which are in a sequence. I am almost done with it! The only problem is the structure of my URL:
http://lee.kias.re.kr/~newton/sann/out/201409//SEQUENCE1.prsa
where the id is in a sequence, but the file name "SEQUENCE1.prsa" has the format "SEQUENCE?.prsa".
Is there any way I can specify this file format in my code? There are also other files in the folder, but only one with the ".prsa" extension.
Code:
<?php
// Source URL pattern
//$sourceURLOriginal = "http://www.somewebsite.com/document{x}.pdf";
$sourceURLOriginal = " http://lee.kias.re.kr/~newton/sann/out/201409/{x}/**SEQUENCE?.prsa**";
// Destination folder
$destinationFolder = "C:\\Users\\hp\\Downloads\\SOP\\ppi\\RSAdata";
// Destination file name pattern
$destinationFileNameOriginal = "doc{x}.txt";
// Start number
$start = 7043;
// End number
$end = 7045;
$n=1;
// From start to end
for ($i = $start; $i <= $end; $i++) {
    // Replace source URL parameter with number
    $sourceURL = str_replace("{x}", $i, $sourceURLOriginal);
    // Destination file name
    $destinationFile = $destinationFolder . "\\" .
        str_replace("{x}", $i, $destinationFileNameOriginal);
    // Read from URL, write to file
    file_put_contents($destinationFile,
        file_get_contents($sourceURL)
    );
    // Output progress
    echo "File #$i complete\n";
}
?>
It's working if I directly specify the URL!
Error:
Warning: file_get_contents( http://lee.kias.re.kr/~newton/sann/out/201409/7043/SEQUENCE?.prsa): failed to open stream: Invalid argument in C:\xampp\htdocs\SOP\download.php on line 37
File #7043 complete
It's making the files, but they are empty!
If there is a way I can download the whole folder (named with the id in sequence), that can also work! But how do we download a whole folder into a folder?
It may be that the file_get_contents() function is not working on your server.
Try this code :
function url_get_contents($Url) {
    if (!function_exists('curl_init')){
        die('CURL is not installed!');
    }
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $Url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $output = curl_exec($ch);
    curl_close($ch);
    return $output;
}
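To plug it into the loop from your original script (a sketch; url_get_contents() is the helper above and the other variables come straight from your question), you would swap it in for the inner file_get_contents() call:
// Inside the for ($i = $start; $i <= $end; $i++) loop from the question:
$sourceURL = str_replace("{x}", $i, $sourceURLOriginal);
$destinationFile = $destinationFolder . "\\" . str_replace("{x}", $i, $destinationFileNameOriginal);
file_put_contents($destinationFile, url_get_contents($sourceURL)); // fetch via cURL instead of file_get_contents()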
Here you go.
I didn't test the whole file_get_contents/file_put_contents part, but since you say it's adding the files (albeit blank), I assume it still works here...
Everything else works fine. I left a var_dump() in so you can see what the return looks like.
I did what I suggested in my comment. Open the folder, parse the file list, grab the filename you need.
Also, I don't know if you read my original comments, but $sourceURLOriginal has an extra space at the beginning, which might have been giving you an issue.
<?php
$start = 7043;
$end = 7045;
$sourceURLOriginal = "http://lee.kias.re.kr/~newton/sann/out/201409/";
$destinationFolder = 'C:\Users\hp\Downloads\SOP\ppi\RSAdata';
for ($i = $start; $i <= $end; $i++) {
    $contents = file_get_contents($sourceURLOriginal.$i);
    preg_match_all("|href=[\"'](.*?)[\"']|", $contents, $hrefs);
    $file_list = array();
    if (empty($hrefs[1])) continue;
    unset($hrefs[1][0], $hrefs[1][1], $hrefs[1][2], $hrefs[1][3], $hrefs[1][4]);
    $file_list = array_values($hrefs[1]);
    var_dump($file_list);
    foreach ($file_list as $index => $file) {
        if (strpos($file, 'prsa') !== false) {
            $needed_file = $index;
            break;
        }
    }
    file_put_contents($destinationFolder.'\doc'.$i.'.txt',
        file_get_contents($sourceURLOriginal.$i.'/'.$file_list[$needed_file])
    );
}

Deezer API and file_put_contents

I've got this script for saving album art from Deezer to my server. The album art URL is alright, you can try it yourself. It does make a file, but it's not the image I would like to see; it's a corrupted file. I am guessing it has something to do with the 301 redirect they serve when you visit the original link you get from the API, but I don't know how to solve that problem if that's the case.
<?php
// Deezer
$query = 'https://api.deezer.com/2.0/search?q=madonna';
$file = file_get_contents($query);
$parsedFile = json_decode($file);
$albumart = $parsedFile->data[0]->artist->picture;
$artist = $parsedFile->data[0]->artist->name;
$dir = dirname(__FILE__).'/albumarts/'.$artist.'.jpg';
file_put_contents($dir, $albumart);
?>
Two issues:
1) $albumart contains a URL (in your case http://api.deezer.com/2.0/artist/290/image). You need to do file_get_contents() on that URL.
<?php
// Deezer
$query = 'https://api.deezer.com/2.0/search?q=madonna';
$file = file_get_contents($query);
$parsedFile = json_decode($file);
$albumart = $parsedFile->data[0]->artist->picture;
$artist = $parsedFile->data[0]->artist->name;
$dir = dirname(__FILE__).'/albumarts/'.$artist.'.jpg';
file_put_contents($dir, file_get_contents($albumart)); // << Changed this line
?>
2) The redirect may be a problem (as you suggest). To get around that, use curl functions.
// Get file using curl.
// NOTE: you can add other options, read the manual
$ch = curl_init($albumart);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow the 301 redirect to the actual image
$data = curl_exec($ch);
curl_close($ch);
// Save output
file_put_contents($dir, $data);
Note: you should use cURL for fetching content from external URLs as a matter of principle. It's safer and you have better control. Some hosts also block accessing external URLs using file_get_contents() anyway.
Why not get the headers for the file (the headers contain the redirect)?
$headerdata = get_headers($albumart);
echo($headerdata[4]); // show the redirect (for testing)
$actualloc = str_replace("Location: ", "", $headerdata[4]); // strip the 'Location: ' header prefix
file_put_contents($dir, file_get_contents($actualloc)); // fetch the real image URL and save it
I think it's the 4th record in the header; if not, check it with a print_r($headerdata);
This will return the proper URL of the image file.

PHP: Download a file from web to local machine

I have searched the web for 2 days and can not find the answer.
I am trying to create a routine which displays the files on a site I control, and allows the user to download a selected file to a local drive.
I am using the code below. When I uncomment the echo statements, it displays the correct source and destination directories, the correct file size and the echo after the fclose displays TRUE.
When I echo the source file ($data), it displays the correct content.
The $FileName variable contains the correct filename, which is either .doc/.docx or .pdf. I have tested both and neither saves anything into the destination directory, or anywhere else on my machine.
The source path ($path) is behind a login, but I am already logged in.
Any thoughts on why this is failing to write the file?
Thanks,
Hank
Code:
$path = "https://.../Reports/ReportDetails/$FileName";
/* echo "Downloading: $path"; */
$data = file_get_contents($path); /* echo "$data"; */
$dest = "C:\MyScans\\".$FileName; /* echo "<br />$dest"; */
$fp = fopen($dest,'wb');
if ( $fp === FALSE ) echo "<br />Error in fopen";
$result = fwrite($fp,$data);
if ( $result === FALSE ) echo "<br />Can not write to $dest";
/* else echo "<br />$result bytes written"; */
$result = fclose($fp); /* echo "<br />Close: $result"; */
I think (!) that you're a bit confused.
You mentioned
allows the user to download a selected file to a local drive.
But the path "C:\MyScans\\".$FileName is the path on the web server, not the path on the user's own computer.
After you do whatever to retrieve the desired file from the remote website:
Create a file from it and redirect the user to the file by using header('Location: /path/to/file.txt');
Insert the following header:
header('Content-disposition: attachment; filename=path/to/file.txt');
It forces the user's browser to download the file, and that's probably what you want to do.
Note: I have used the extension txt, but you can use any extension.
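Putting those pieces together, a minimal sketch (the /path/to/file.txt location is hypothetical; substitute wherever you saved the fetched file on the server):
<?php
// Sketch: stream a file already saved on the server to the visitor as a download.
$serverPath = '/path/to/file.txt'; // hypothetical path on the web server
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($serverPath) . '"');
header('Content-Length: ' . filesize($serverPath));
readfile($serverPath); // send the file contents to the browser
exit;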
You can use PHP cURL:
<?php
$url = 'http://www.example.com/a-large-file.zip';
$path = '/path/to/a-large-file.zip';
$fp = fopen($path, 'wb'); // open the local file for binary-safe writing
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp); // stream the response straight into the file
$data = curl_exec($ch);
curl_close($ch);
fclose($fp);
