fopen alternative for FPDF (PHP)

I am using FPDF to extract info from a PNG file. Unfortunately, the server has fopen disabled. Can anyone recommend a good way of getting around this? Any help would be much appreciated. Thanks in advance!
function _parsepng($file)
{
    // Extract info from a PNG file
    $f = fopen($file, 'rb');
    if(!$f)
        $this->Error('Can\'t open image file: '.$file);
    $info = $this->_parsepngstream($f, $file);
    fclose($f);
    return $info;
}

You can try cURL, but usually if your hosting company disables one, they also disable the other.

I really don't know whether this will work :-)
but you could try fsockopen:
http://www.php.net/manual/en/function.fsockopen.php
file:// can be used as the protocol.
Hope this helps.
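In case it helps, here is a minimal sketch of fetching the image over a raw socket with fsockopen. Note it speaks HTTP/1.0 by hand rather than using the file:// wrapper, and the host and path (example.com, /image.png) are placeholders, not from the original question:
<?php
// Sketch only: fetch a remote PNG over a raw socket (example.com and /image.png are placeholders)
$fp = fsockopen('example.com', 80, $errno, $errstr, 30);
if (!$fp)
    die("Connection failed: $errstr ($errno)");

// Send a bare HTTP/1.0 request; Connection: close makes the server end the stream when done
fwrite($fp, "GET /image.png HTTP/1.0\r\nHost: example.com\r\nConnection: close\r\n\r\n");

$response = '';
while (!feof($fp))
    $response .= fread($fp, 8192);
fclose($fp);

// Strip the HTTP headers; the PNG bytes follow the blank line
$body = substr($response, strpos($response, "\r\n\r\n") + 4);
file_put_contents('/tmp/myfile.png', $body);
?>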

Ended up using a cURL workaround. Thanks everyone for the input!
function _parsepng($file)
{
    // Download the remote file to a local temp copy with cURL
    $ch = curl_init($file);
    $fp = fopen('/tmp/myfile.png', 'wb');
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_exec($ch);
    curl_close($ch);
    fclose($fp); // flush the download before reopening it for reading
    // Extract info from the local PNG copy
    $f = fopen('/tmp/myfile.png', 'rb');
    if(!$f)
        $this->Error('Can\'t open image file: '.$file);
    $info = $this->_parsepngstream($f, $file);
    fclose($f);
    return $info;
}

Physical memory warning

<?php
file_put_contents("10gb.zip", fopen("http://website.website/10GB.zip", 'r'));
echo "File Downloaded!";
I am using this code to download files from a URL to my server. But when I run it, my hosting server's memory usage goes into the red! -_- and my download gets stuck at 3.79 GB.
Is there any limit on downloading big files? I want to download more than 50 GB with 5 processes! Is it possible?
I would go for streaming when dealing with large files rather than copying them directly.
Based on the example provided here: http://php.net/manual/en/function.stream-copy-to-stream.php
you can try:
<?php
function pipe_streams($in, $out)
{
    // Copy in 8 KB chunks so only one chunk sits in memory at a time
    while (!feof($in))
        fwrite($out, fread($in, 8192));
}

// pipe_streams() expects open stream resources, not path strings
$in  = fopen("http://website.website/10GB.zip", 'rb');
$out = fopen("10gb.zip", 'wb');
pipe_streams($in, $out);
fclose($in);
fclose($out);
?>
or use cURL (http://php.net/manual/en/book.curl.php):
<?php
$url  = "http://website.website/10GB.zip";
$path = "10gb.zip";

$fp = fopen($path, 'wb');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp); // stream the response straight to disk
$data = curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
Check https://www.sitepoint.com/performant-reading-big-files-php/ for more streaming options.
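For reference, here is a minimal sketch using stream_copy_to_stream directly (the function the manual page above documents); the URL and filename are the placeholders from the question:
<?php
// Sketch: stream_copy_to_stream copies in internal chunks,
// so the whole file never sits in memory at once
$in  = fopen("http://website.website/10GB.zip", 'rb');
$out = fopen("10gb.zip", 'wb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);
?>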

Get image with file_get_contents returns a Not Found error

I have an image on another server, but when I fetch it with the file_get_contents() function it returns a
Not Found error.
I generate the image like this:
file_put_contents(destination_path, file_get_contents(another_server_path));
Please help me; is there another way to get this image?
Try this.
There is a problem with URL special characters: you have to re-encode the special characters in the URL's basename.
$imgfile = 'http://www.lagrolla.com.au/image/m fr 137 group.jpg';
$destinationPath = '/path/to/folder/';
$filename = basename($imgfile);
// Re-encode the basename so the spaces become %20
$imgfile = str_replace($filename, '', $imgfile) . rawurlencode($filename);
copy($imgfile, $destinationPath . $filename);
Another way to copy a file from another server is to use cURL:
$ch = curl_init('http://www.lagrolla.com.au/image/data/m%20fr%20137%20group.jpg');
$destinationPath = '/path/to/folder/filenameWithNoSpaces.jpg';
$fp = fopen($destinationPath, 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
Note: it is bad practice to save images with spaces in the file name, so you should save the file under a proper name.
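To follow up on that note, here is a small sketch of sanitizing a downloaded file's name before saving it; the URL is the one from the question and the replacement pattern is just one reasonable choice:
<?php
// Sketch: replace anything outside a safe character set with underscores
$filename = basename('http://www.lagrolla.com.au/image/m fr 137 group.jpg');
$safe = preg_replace('/[^A-Za-z0-9._-]+/', '_', $filename); // "m_fr_137_group.jpg"
echo $safe;
?>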

PHP function file_get_contents() gets only a few KB of a remote file

Hello, I want to download a remote zip which is about 8 MB. I wrote this simple script:
set_time_limit(0);
$zip = file_get_contents('http://web.tld/folder/download/getfile.do?filename=file.zip&_lang=Lang');
file_put_contents('zip_files/file.zip',$zip);
It works, but the stored file is not 8 MB, only 52 KB.
It's the same if I use:
set_time_limit(0);
$url = 'http://web.tld/folder/download/getfile.do?filename=file.zip&_lang=Lang';
$path = 'zip_files/file.zip';
/* get and save remote data without exceeding php memory limit */
$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
$data = curl_exec($ch);
curl_close($ch);
fclose($fp);
So maybe I have to use some stream option?! Thank you.
PS: I tried the Snoopy library (http://sourceforge.net/projects/snoopy/) and it's the same, only 52 KB :P
include "libs/Snoopy-2.0/Snoopy.class.php";
$snoopy = new Snoopy;
$snoopy->submit($url);
print $snoopy->results;
Look inside the saved file (use any text editor); you will probably see it isn't a zip at all, just an error page for a wrong URL or something similar.
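To check that programmatically, here is a minimal sketch using the same URL from the question; curl_getinfo reports the HTTP status and content type, and a real zip starts with the "PK" signature (the redirect-following option is an assumption about why the download comes back short):
<?php
$url = 'http://web.tld/folder/download/getfile.do?filename=file.zip&_lang=Lang';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects to the real file
$data = curl_exec($ch);

// Inspect what the server actually sent back
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
$type   = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
curl_close($ch);

echo "HTTP $status, $type, " . strlen($data) . " bytes\n";

// A real zip file begins with the "PK" signature bytes
if (substr($data, 0, 2) === 'PK')
    file_put_contents('zip_files/file.zip', $data);
else
    echo substr($data, 0, 500); // probably an HTML error page
?>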

How to get a file from xooplate.com using PHP

I see that http://xooplate.com/templates/download/13693 returns a file.
I have been using
$ch = curl_init("http://xooplate.com/templates/download/13693");
$fp = fopen("example_homepage.zip", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
function get_include_contents($filename) {
    if (is_file($filename)) {
        ob_start();
        include $filename;
        $contents = ob_get_contents();
        ob_end_clean();
        return $contents;
    }
    return false;
}
$string = get_include_contents('http://xooplate.com/templates/download/13693');
but it's not working. I was hoping for some help; thanks, everyone.
Updated answer:
Hey buddy, I checked that site. They attach an onClick jQuery handler to the download button (widget.js, line 27) with domain-checked POST security. I tried to trigger the onClick event, but because of the domain-check security function the server only allows http://www.stumbleupon.com and http://xooplate.com to post the data via AJAX. So it isn't possible using cURL or PHP DOM. Sorry. :) Great security.
The solution below is not the answer!
Hi, I found a partial solution using the PHP Simple HTML DOM library. Just download it and use the following code:
<?php
include("simple_html_dom.php");
$html = file_get_html('http://xooplate.com/templates/details/13693-creative-circular-skills-ui-infographic-psd');
foreach($html->find('a[class=btn_green nturl]') as $element)
    echo $element->href . '<br>';
?>

Are there any open-source PHP web proxies ready to use?

I need a PHP web proxy that reads HTML, shows it to the user, and rewrites all the links so that when the user clicks the next link the proxy handles the request again; just like this code, but additionally it should rewrite all the links.
<?php
// Set your return content type
header('Content-type: text/html');
// Website url to open
$daurl = 'http://www.yahoo.com';
// Get that website's content
$handle = fopen($daurl, "r");
// If there is something, read and return
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        echo $buffer;
    }
    fclose($handle);
}
?>
I hope I have explained this well. This question is about not reinventing the wheel.
One additional question: will this kind of proxy handle content like Flash?
For an open source solution, check out PHProxy. I've used it in the past and it seemed to work quite well from what I can remember.
It will sort of work, but you need to rewrite every relative path to an absolute one, and I think cookies won't work in this case. Use cURL for these operations...
function curl($url){
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $output = curl_exec($ch);
    curl_close($ch); // must run before return; the original returned first, so the handle never closed
    return $output;
}
$url = "http://www.yahoo.com";
echo curl($url);
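As for the link-rewriting step mentioned above, here is a rough sketch using DOMDocument. The proxy.php?url=... scheme and the rewrite_links helper are assumptions for illustration, and the relative-to-absolute join is naive (it doesn't resolve ../ segments):
<?php
// Hypothetical sketch: rewrite every href/src in fetched HTML so it
// points back through the proxy (proxy.php?url=... is an assumed scheme)
function rewrite_links($html, $baseUrl)
{
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // suppress warnings from real-world markup

    foreach (['a' => 'href', 'img' => 'src'] as $tag => $attr) {
        foreach ($doc->getElementsByTagName($tag) as $node) {
            $link = $node->getAttribute($attr);
            if ($link === '') continue;
            // Turn relative paths into absolute ones first (naive join)
            if (!preg_match('#^https?://#i', $link))
                $link = rtrim($baseUrl, '/') . '/' . ltrim($link, '/');
            $node->setAttribute($attr, 'proxy.php?url=' . urlencode($link));
        }
    }
    return $doc->saveHTML();
}

echo rewrite_links(curl("http://www.yahoo.com"), "http://www.yahoo.com");
?>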
