I'm currently running into some issues resizing images using GD.
Everything works fine until I want to resize an animated GIF, which delivers only the first frame on a black background.
I've tried using getimagesize(), but that only gives me dimensions, with nothing to distinguish an animated GIF from any other GIF.
Actual resizing is not required for animated GIFs; just being able to skip them would be enough for our purposes.
Any clues?
PS: I don't have access to ImageMagick.
Kind regards,
Kris
While searching for a solution to the same problem, I noticed that php.net has a follow-up to the code Davide and Kris are referring to, which, according to its author, is less memory-intensive and possibly less disk-intensive.
I'll replicate it here, because it may be of interest.
source: http://www.php.net/manual/en/function.imagecreatefromgif.php#88005
function is_ani($filename) {
    if (!($fh = @fopen($filename, 'rb')))
        return false;
    $count = 0;
    // An animated GIF contains multiple "frames", with each frame having a
    // header made up of:
    //  * a static 4-byte sequence (\x00\x21\xF9\x04)
    //  * 4 variable bytes
    //  * a static 2-byte sequence (\x00\x2C)
    // We read through the file until we reach the end of the file, or we've
    // found at least 2 frame headers.
    while (!feof($fh) && $count < 2) {
        $chunk = fread($fh, 1024 * 100); // read 100kb at a time
        $count += preg_match_all('#\x00\x21\xF9\x04.{4}\x00[\x2C\x21]#s', $chunk, $matches);
    }
    fclose($fh);
    return $count > 1;
}
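For example, a minimal usage sketch (the file path is hypothetical):
// Skip animated GIFs instead of resizing them
if (is_ani('uploads/example.gif')) {
    // skip it: GD would flatten it to its first frame on a black background
} else {
    // not animated: safe to hand to the usual GD resize code
}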
There is a brief snippet of code on the PHP manual page for the imagecreatefromgif() function that should be what you need:
imagecreatefromgif comment #59787 by ZeBadger
Here's the working function:
/**
* Thanks to ZeBadger for original example, and Davide Gualano for pointing me to it
* Original at http://it.php.net/manual/en/function.imagecreatefromgif.php#59787
**/
function is_animated_gif( $filename )
{
    $raw = file_get_contents( $filename );

    $offset = 0;
    $frames = 0;
    while ($frames < 2)
    {
        $where1 = strpos($raw, "\x00\x21\xF9\x04", $offset);
        if ( $where1 === false )
        {
            break;
        }
        else
        {
            $offset = $where1 + 1;
            $where2 = strpos( $raw, "\x00\x2C", $offset );
            if ( $where2 === false )
            {
                break;
            }
            else
            {
                if ( $where1 + 8 == $where2 )
                {
                    $frames++;
                }
                $offset = $where2 + 1;
            }
        }
    }

    return $frames > 1;
}
This is an improvement of the current top-voted answer, but I don't have enough reputation to comment yet. The problem with that answer is that it reads the file in 100KB chunks, and the end-of-frame marker might be split between two chunks. A fix for that is to prepend the last 20 bytes of the previous chunk to the next one:
<?php
function is_ani($filename) {
    if (!($fh = @fopen($filename, 'rb')))
        return false;
    $count = 0;
    // An animated GIF contains multiple "frames", with each frame having a
    // header made up of:
    //  * a static 4-byte sequence (\x00\x21\xF9\x04)
    //  * 4 variable bytes
    //  * a static 2-byte sequence (\x00\x2C) (some variants may use \x00\x21 ?)
    // We read through the file until we reach the end of the file, or we've
    // found at least 2 frame headers.
    $chunk = false;
    while (!feof($fh) && $count < 2) {
        // Prepend the last 20 bytes of the previous chunk, to make sure the
        // searched pattern is not split across a chunk boundary.
        $chunk = ($chunk ? substr($chunk, -20) : "") . fread($fh, 1024 * 100); // read 100kb at a time
        $count += preg_match_all('#\x00\x21\xF9\x04.{4}\x00(\x2C|\x21)#s', $chunk, $matches);
    }
    fclose($fh);
    return $count > 1;
}
Reading the whole file with file_get_contents() may take too much memory if the given file is large. I've refactored the function given previously so that it reads just enough bytes to check the frames and returns as soon as it finds at least 2 of them.
<?php
/**
 * Detects an animated GIF from the given file pointer resource or filename.
 *
 * @param resource|string $file File pointer resource or filename
 * @return bool
 */
function is_animated_gif($file)
{
    $fp = null;

    if (is_string($file)) {
        $fp = fopen($file, "rb");
    } else {
        $fp = $file;

        /* Make sure that we are at the beginning of the file */
        fseek($fp, 0);
    }

    if (fread($fp, 3) !== "GIF") {
        fclose($fp);
        return false;
    }

    $frames = 0;
    while (!feof($fp) && $frames < 2) {
        if (fread($fp, 1) === "\x00") {
            /* Some animated GIFs do not contain the graphic control extension (starts with 21 f9) */
            if (fread($fp, 1) === "\x2c" || fread($fp, 2) === "\x21\xf9") {
                $frames++;
            }
        }
    }

    fclose($fp);
    return $frames > 1;
}
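For example, it accepts either form (a sketch; the path is hypothetical). Note that the function closes the handle it is given:
$fp = fopen('uploads/example.gif', 'rb');
var_dump(is_animated_gif($fp)); // rewinds $fp, counts frames, closes the handle
var_dump(is_animated_gif('uploads/example.gif')); // or simply pass a filename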
An animated GIF must have the following string (it is the NETSCAPE2.0 application extension):
"\x21\xFF\x0B\x4E\x45\x54\x53\x43\x41\x50\x45\x32\x2E\x30"
I've tested a few animated GIFs, and the string seems to be at position 781 of the file (found with file_get_contents() and strpos()).
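A minimal sketch of that check. Since the offset of the extension varies between files, searching for the substring is safer than testing a fixed position; and because the extension (which marks looping) is technically optional, treat this as a heuristic rather than a definitive test:
function has_netscape_loop($filename) {
    $raw = file_get_contents($filename);
    // \x21\xFF\x0B introduces an 11-byte application extension block;
    // "NETSCAPE2.0" identifies the looping extension used by animated GIFs
    return strpos($raw, "\x21\xFF\x0B" . "NETSCAPE2.0") !== false;
}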
Related
I am using the following code to save an image from a URL, but sometimes the image URL is bad and there is no image there, or there is an issue with the image and it saves a zero-size file.
<?php
file_put_contents("/var/www/html/images/" . $character . ".jpg",
    file_get_contents($image));
I need to find a way to stop this from happening, as saving zero-size files creates a problem.
I have tried this, but it still seems to happen:
$filesize = file_put_contents("/var/www/html/images/" . $character . ".jpg",
    file_get_contents($image));
if (($filesize < 10) || ($filesize == "")) {
    echo "Error";
}
Could anyone recommend a more reliable way to do this?
The Imagick package has methods for doing this:
Imagick::getImageGeometry() returns the width and height of an image, or throws an exception.
function isValidImage($filename)
{
    if (!file_exists($filename)) return false;
    if (filesize($filename) == 0) return false;
    $image = new Imagick($filename);
    $img = $image->getImageGeometry();
    return ($img['width'] > 0 && $img['height'] > 0);
}
EDIT: I have updated my answer with more checks
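For example, tied back to the download code from the question ($image and $character are the question's variables; the try/catch is a sketch, needed because the Imagick constructor throws on unreadable files):
$path = "/var/www/html/images/" . $character . ".jpg";
$data = file_get_contents($image);
// Only write when the fetch actually returned data
if ($data !== false && strlen($data) > 0) {
    file_put_contents($path, $data);
    try {
        if (!isValidImage($path)) {
            unlink($path); // zero width or height
        }
    } catch (ImagickException $e) {
        unlink($path); // Imagick could not parse the file at all
    }
}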
I have tried to get an image size by URL. I used the get_headers() function to get the image size. Here is an example:
function checkImageSize($imageUrl) {
    if (!empty($imageUrl)) {
        $file_headers = @get_headers($imageUrl, 1); // returns all the header values
        // For the image size we use the Content-Length header.
        $sizeInKB = round($file_headers['Content-Length'] / 1024, 2);
        return $sizeInKB;
    } else {
        return 0;
    }
}
$imageSize = checkImageSize($imageUrl);
if ($imageSize <= $conditionalSize) {
    // upload code
} else {
    // error msg
}
I have a 40MB zip file at a web address, say http://info/data/bigfile.zip, that I would like to download to my local server. What is currently the best way to download a zip file of that size using PHP or header requests, such that it won't time out at 8MB or give me a 500 error? Right now, I keep getting timed out.
In order to download a large file via PHP, try something like this (source: http://teddy.fr/2007/11/28/how-serve-big-files-through-php/):
<?php
define('CHUNK_SIZE', 1024*1024); // size (in bytes) of each chunk

// Read a file and display its content chunk by chunk
function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered, like readfile() does
    }
    return $status;
}

// Here goes your code for checking that the user is logged in
// ...
// ...

$filename = 'path/to/your/file'; // path to your file
$mimetype = 'mime/type';
header('Content-Type: ' . $mimetype);
readfile_chunked($filename);
?>
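If the goal is a download dialog rather than inline display, two more headers are commonly sent before calling readfile_chunked() (a sketch, assuming $filename is a local file so filesize() works):
header('Content-Length: ' . filesize($filename));
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');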
**Second solution**
Copy the file one small chunk at a time.
/**
 * Copy a remote file over HTTP one small chunk at a time.
 *
 * @param string $infile  The full URL to the remote file
 * @param string $outfile The path where to save the file
 */
function copyfile_chunked($infile, $outfile) {
    $chunksize = 10 * (1024 * 1024); // 10 Megs

    /**
     * parse_url breaks apart a URL into its parts, i.e. host, path,
     * query string, etc.
     */
    $parts = parse_url($infile);
    $i_handle = fsockopen($parts['host'], 80, $errno, $errstr, 5);
    $o_handle = fopen($outfile, 'wb');

    if ($i_handle == false || $o_handle == false) {
        return false;
    }

    if (!empty($parts['query'])) {
        $parts['path'] .= '?' . $parts['query'];
    }

    /**
     * Send the request to the server for the file
     */
    $request = "GET {$parts['path']} HTTP/1.1\r\n";
    $request .= "Host: {$parts['host']}\r\n";
    $request .= "User-Agent: Mozilla/5.0\r\n";
    $request .= "Keep-Alive: 115\r\n";
    $request .= "Connection: keep-alive\r\n\r\n";
    fwrite($i_handle, $request);

    /**
     * Now read the headers from the remote server. We'll need
     * to get the content length.
     */
    $headers = array();
    while (!feof($i_handle)) {
        $line = fgets($i_handle);
        if ($line == "\r\n") break;
        $headers[] = $line;
    }

    /**
     * Look for the Content-Length header and get the size
     * of the remote file.
     */
    $length = 0;
    foreach ($headers as $header) {
        if (stripos($header, 'Content-Length:') === 0) {
            $length = (int)str_replace('Content-Length: ', '', $header);
            break;
        }
    }

    /**
     * Start reading in the remote file, and writing it to the
     * local file one chunk at a time.
     */
    $cnt = 0;
    while (!feof($i_handle)) {
        $buf = fread($i_handle, $chunksize);
        $bytes = fwrite($o_handle, $buf);
        if ($bytes == false) {
            return false;
        }
        $cnt += $bytes;

        /**
         * We're done reading when we've reached the content length
         */
        if ($cnt >= $length) break;
    }

    fclose($i_handle);
    fclose($o_handle);
    return $cnt;
}
Adjust the $chunksize variable to your needs. This has only been mildly tested. It could easily break for a number of reasons.
Usage:
copyfile_chunked('http://somesite.com/somefile.jpg', '/local/path/somefile.jpg');
Not a lot of detail is given, but it sounds like the php.ini defaults are restricting the server's ability to transfer large files through the PHP web interface in a timely manner, namely the post_max_size, upload_max_filesize, and max_execution_time settings.
Try jumping into your php.ini file, upping those limits, restarting Apache, and retrying the file transfer.
HTH
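For example (a php.ini excerpt; the values are illustrative, not recommendations):
; php.ini - raise the limits, then restart Apache
post_max_size = 64M
upload_max_filesize = 64M
max_execution_time = 300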
I have a task to implement resumable uploads in Yii. I have implemented the upload action below, but have never used Resumable before:
public function actionUpload()
{
    $model = new User;
    if (isset($_POST['User'])) {
        $model->attributes = $_POST['User'];
        $model->image = CUploadedFile::getInstance($model, 'image');
        if ($model->save()) {
            $model->image->saveAs('upload/' . $model->image->name);
            $this->redirect(array('view', 'id' => $model->uUserID));
        }
    }
    $this->render('upload', array('model' => $model));
}
The task is to split the file into small chunks. For example, one file can be 1 GB, and I am trying to send that file through a REST service.
See Sample server implementation in PHP
I'll copy-paste the essential part of the code provided on that page here:
/**
 * Check if all the parts exist, and
 * gather all the parts of the file together
 * @param string $temp_dir  the temporary directory holding all the parts of the file
 * @param string $fileName  the original file name
 * @param string $chunkSize each chunk size (in bytes)
 * @param string $totalSize original file size (in bytes)
 */
function createFileFromChunks($temp_dir, $fileName, $chunkSize, $totalSize) {
    // count all the parts of this file
    $total_files = 0;
    foreach (scandir($temp_dir) as $file) {
        if (stripos($file, $fileName) !== false) {
            $total_files++;
        }
    }

    // check that all the parts are present
    // (the size of the last part is between $chunkSize and 2 * $chunkSize)
    if ($total_files * $chunkSize >= ($totalSize - $chunkSize + 1)) {
        // create the final destination file
        if (($fp = fopen('temp/' . $fileName, 'w')) !== false) {
            for ($i = 1; $i <= $total_files; $i++) {
                fwrite($fp, file_get_contents($temp_dir . '/' . $fileName . '.part' . $i));
                _log('writing chunk ' . $i);
            }
            fclose($fp);
        } else {
            _log('cannot create the destination file');
            return false;
        }

        // rename the temporary directory (to avoid access from other
        // concurrent chunk uploads) and then delete it
        if (rename($temp_dir, $temp_dir . '_UNUSED')) {
            rrmdir($temp_dir . '_UNUSED');
        } else {
            rrmdir($temp_dir);
        }
    }
}
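For context, a sketch of how the function might be invoked from the upload action. The resumable* POST field names below are the defaults Resumable.js sends with each chunk (an assumption; check your client configuration):
// Hypothetical glue code, run after the last chunk has been stored
$temp_dir = 'temp/' . $_POST['resumableIdentifier']; // unique per file
createFileFromChunks(
    $temp_dir,
    $_POST['resumableFilename'],
    (int) $_POST['resumableChunkSize'],
    (int) $_POST['resumableTotalSize']
);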
I'm getting a 30-second timeout error because the code keeps checking whether the file is over 5MB even when it's under. The code is designed to reject files over 5MB, but I need it to also stop executing when the file is under 5MB. Is there a way to check the file transfer chunk to see if it's empty? I'm currently using this example by DaveRandom:
PHP Stop Remote File Download if it Exceeds 5mb
Code by DaveRandom:
$url = 'http://www.spacetelescope.org/static/archives/images/large/heic0601a.jpg';
$file = '../temp/test.jpg';
$limit = 5 * 1024 * 1024; // 5MB

if (!$rfp = fopen($url, 'r')) {
    // error, could not open remote file
}
if (!$lfp = fopen($file, 'w')) {
    // error, could not open local file
}

// Check the content-length for exceeding the limit
foreach ($http_response_header as $header) {
    if (preg_match('/^\s*content-length\s*:\s*(\d+)\s*$/', $header, $matches)) {
        if ($matches[1] > $limit) {
            // error, file too large
        }
    }
}

$downloaded = 0;
while ($downloaded < $limit) {
    $chunk = fread($rfp, 8192);
    fwrite($lfp, $chunk);
    $downloaded += strlen($chunk);
}

if ($downloaded > $limit) {
    // error, file too large
    unlink($file); // delete local data
} else {
    // success
}
You should check if you have reached the end of the file:
while (!feof($rfp) && $downloaded < $limit) {
    $chunk = fread($rfp, 8192);
    fwrite($lfp, $chunk);
    $downloaded += strlen($chunk);
}
How can I get a web image's size in KB in PHP?
getimagesize() only gets the width and height, and filesize() causes a warning:
$imgsize = filesize("http://static.adzerk.net/Advertisers/2564.jpg");
echo $imgsize;
Warning: filesize() [function.filesize]: stat failed for http://static.adzerk.net/Advertisers/2564.jpg
Is there any other way to get a web image's size in KB?
Short of doing a complete HTTP request, there is no easy way:
$img = get_headers("http://static.adzerk.net/Advertisers/2564.jpg", 1);
print $img["Content-Length"];
However, you can likely use cURL to send a lighter HEAD request instead.
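A sketch of that approach (CURLINFO_CONTENT_LENGTH_DOWNLOAD returns -1 when the server does not report a length):
$ch = curl_init("http://static.adzerk.net/Advertisers/2564.jpg");
curl_setopt($ch, CURLOPT_NOBODY, true);         // send HEAD instead of GET
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo the response
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
curl_exec($ch);
$bytes = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
curl_close($ch);
if ($bytes > 0) {
    echo round($bytes / 1024, 2) . " KB";
}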
<?php
$file_size = filesize($_SERVER['DOCUMENT_ROOT']."/Advertisers/2564.jpg"); // Get file size in bytes
$file_size = $file_size / 1024; // Get file size in KB
echo $file_size; // Echo file size
?>
Not sure about using filesize() for remote files, but there are good snippets on php.net about using cURL:
http://www.php.net/manual/en/function.filesize.php#92462
That sounds like a permissions issue because filesize() should work just fine.
Here is an example:
php > echo filesize("./9832712.jpg");
1433719
Make sure the permissions are set correctly on the image and that the path is also correct. You will need to apply some math to convert from bytes to KB but after doing that you should be in good shape!
Here is a good link regarding filesize()
You cannot use filesize() to retrieve remote file information; the file must first be downloaded, or its size determined by another method.
Using cURL is a good method:
Tutorial
You can also use this function:
<?php
$filesize = file_get_size($dir . '/' . $ff);
$filesize = $filesize / 1024; // to convert to KB
echo $filesize;

function file_get_size($file) {
    // open file
    $fh = fopen($file, "r");
    // declare some variables
    $size = "0";
    $char = "";
    // set file pointer to 0; I'm a little bit paranoid, you can remove this
    fseek($fh, 0, SEEK_SET);
    // set multiplicator to zero
    $count = 0;

    while (true) {
        // jump 1 MB forward in file
        fseek($fh, 1048576, SEEK_CUR);
        // check if we actually left the file
        if (($char = fgetc($fh)) !== false) {
            // if not, go on
            $count++;
        } else {
            // else jump back where we were before leaving and exit loop
            fseek($fh, -1048576, SEEK_CUR);
            break;
        }
    }

    // we could make $count jumps, so the file is at least $count * 1.000001 MB large
    // 1048577 because we jump 1 MB and fgetc goes 1 B forward too
    $size = bcmul("1048577", $count);

    // now count the last few bytes; they're always less than 1048576 so it's quite fast
    $fine = 0;
    while (false !== ($char = fgetc($fh))) {
        $fine++;
    }

    // and add them
    $size = bcadd($size, $fine);

    fclose($fh);
    return $size;
}
?>
You can get the file size by using the get_headers() function. Use the code below:
$image = get_headers($url, 1);
$bytes = $image["Content-Length"];
$mb = $bytes / (1024 * 1024);
echo number_format($mb, 2) . " MB";