Check if image exists in a big loop - PHP

I'm doing a check like this:
// ADDITIONAL PRODUCT IMAGE
$add1 = @get_headers("https://{image_server_link}/DIx.jpg_RB2000,2000,255,255,255,127/-/article/" . $product_number . "_1.jpg");
if(strpos($add1[0], "404") === FALSE) {
// Image found, so let's print it out
}
It's inside a CSV generator, and because the CSV contains 2000+ products, the get_headers() calls slow generation down from 3-4 MB/s to 10-15 KB/s. As our CSV is 6.9 MB, this will take ages.
Is there any faster way to check whether an additional image exists?
By the way, the images are hosted on a separate image server with Varnish caching, so it's not possible to use an is_file() check or something similar.

You could make a PHP script on the other side that gives you a list of available images, or that checks the availability of multiple images at once (a small web service).
Edit:
Or just use the index page of the image directory.
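A minimal sketch of what such a batch check could look like on the image server itself, where the files are local and is_file() is cheap (the script name, base path and URL layout are assumptions, not something taken from your setup):
<?php
// check_images.php -- hypothetical endpoint living on the image server.
// Call it as: https://{image_server_link}/check_images.php?ids=1001,1002,1003
$base = '/var/www/images/article/';                   // assumed local image path
$ids  = isset($_GET['ids']) ? explode(',', $_GET['ids']) : array();

$result = array();
foreach ($ids as $id) {
    $id = preg_replace('/[^0-9A-Za-z_-]/', '', $id);  // basic sanitising
    if ($id === '') {
        continue;
    }
    // true if the additional image "<product>_1.jpg" exists
    $result[$id] = is_file($base . $id . '_1.jpg');
}

header('Content-Type: application/json');
echo json_encode($result);
The CSV generator could then ask for, say, 200 product numbers per request instead of doing 2000+ individual get_headers() calls.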

Related

PHP file_get_contents repetitive usage to get images from XML

The website I'm developing shows products on sale, and a lot of them are updated and imported from XML (imported/updated twice a day).
The problem is with getting product images. Some of them do not display even if a URL is provided.
Some XML files contain anywhere from tens of items to hundreds.
The code loops through each item, gets all the data and then the image.
Example code regarding getting the image :
if (@file_get_contents($i->image)) {
    $name = uniqid(rand(), true) . '.jpg';
    $img = $name;
    $url = $i->image;
    $destinationPath = public_path() . '/img/upload/Items/' . $name;
    @file_put_contents($destinationPath, @file_get_contents($url));
} else {
    // do something else if there is no image
}
Is my approach wrong here?
Can the remote server not keep up with the file requests? Should there be a pause before each request?
If items are updated twice daily, should I not bother saving images locally and just give the img src the external URL from the XML?
Is it better to use cURL()? (See the sketch below.)
If the approach is acceptable, what improvements does it need?
Thank you.
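Regarding the cURL question: a minimal sketch of how a cURL-based download could look, fetching each image once instead of calling file_get_contents() twice (the timeout value and the helper name are assumptions):
<?php
// Download one remote image with cURL and save it only if the request succeeded.
function downloadImage($url, $destinationPath)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);            // assumed timeout, tune as needed
    $data = curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($data === false || $code !== 200) {
        return false;                                 // no image; the caller decides what to do
    }
    return file_put_contents($destinationPath, $data) !== false;
}

// Usage inside the existing loop:
// $name = uniqid(rand(), true) . '.jpg';
// if (!downloadImage($i->image, public_path() . '/img/upload/Items/' . $name)) {
//     // do something else if there is no image
// }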

Right way of watermarking & storing & displaying images in PHP

I'm building a web-based system which will host loads and loads of high-res images, and they will be available for sale. Of course I will never display the high-res image; when browsing, people will only see a low-resolution, watermarked image. Currently the workflow is as follows:
A PHP script handles the high-res image upload. When an image is uploaded, it's automatically resized to a low-res image and a thumbnail, and both files are saved on the server (no watermark is added).
When people are browsing, the page displays the thumbnail of the image; on click, it enlarges and displays the low-res image with the watermark. At the moment I apply the watermark on the fly whenever the low-res image is opened.
My question is, what is the correct way:
1) Should I save a second copy of the low-res image (and thumbnail) only when it's accessed for the first time? I mean, if somebody accesses the image, I add the watermark on the fly, display the image and store it on the server. The next time the same image is accessed, if a watermarked copy exists I just display that copy, otherwise I apply the watermark on the fly. (In case watermark.png changes, I just delete the watermarked images and they will be recreated as they are accessed.)
2) Should I keep applying watermarks on the fly like I'm doing now?
My biggest question is how big the difference is between a PHP file_exists() check and adding a watermark to an image, with something like:
$image = new Imagick();
$image->readImage($workfolder.$event . DIRECTORY_SEPARATOR . $cat . DIRECTORY_SEPARATOR .$mit);
$watermark = new Imagick();
$watermark->readImage($workfolder.$event . DIRECTORY_SEPARATOR . "hires" . DIRECTORY_SEPARATOR ."WATERMARK.PNG");
$image->compositeImage($watermark, imagick::COMPOSITE_OVER, 0, 0);
All low-res images are 1024x1024 JPGs with a quality setting of 45% and all unnecessary filters removed, so the file size of a low-res image is about 40-80 KB.
It is somewhat related to this question, just the scale and the scenarios are a bit different.
I'm on a dedicated server (Xeon E3-1245v2 CPU, 32 GB RAM, 2 TB storage); the site does not have big traffic overall, but it has HUGE spikes from time to time. When images are released we get a few thousand hits per hour, with people browsing through the images, downloading, purchasing, etc. So while for normal usage I'm sure that generating on the fly is the right approach, I'm a bit worried about the spike periods.
I need to mention that I'm using the ImageMagick library for image processing, not GD.
Thanks for your input.
UPDATE
None of the answers was a full, complete solution, but that is fine since I never looked for one. It was a hard decision which one to accept and whom to award the bounty.
@Ambroise-Maupate's solution is good, but it still relies on PHP to do the job.
@Hugo Delsing proposes using the web server to serve cached files, lowering the number of calls to the PHP script, which means fewer resources used; on the other hand, it's not really storage friendly.
I will use a merged solution of the two answers, relying on a cron job to remove the garbage.
Thanks for the directions.
Personally I would create a static/cookieless subdomain, in a CDN kind of way, to handle these kinds of images. The main reasons are:
Images are only created once
Only accessed images are created
Once created, an image is served from cache and is a lot faster.
The first step would be to create a website on a subdomain that points to an empty folder. Use the settings for IIS/Apache or whatever to disable sessions for this new website. Also set some long caching headers on the site, because the content shouldn't change.
The second step would be to create an .htaccess file containing the following.
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*) /create.php?path=$1 [L]
This will make sure that if somebody accesses an existing image, it is served directly without PHP interfering. Every request for a non-existing file will be handled by the create.php script, which is the next thing you should add.
<?php
function NotFound()
{
    if (!headers_sent()) {
        $protocol = (isset($_SERVER['SERVER_PROTOCOL']) ? $_SERVER['SERVER_PROTOCOL'] : 'HTTP/1.0');
        header($protocol . ' 404 Not Found');
        echo '<h1>Not Found</h1>';
    }
    exit;
}

$p = isset($_GET['path']) ? $_GET['path'] : '';

// has path
if (strlen($p) <= 1)
    NotFound();

$clean  = explode('?', $p);
$clean  = explode('#', $clean[0]);
$params = explode('/', substr($clean[0], 1)); // drop first /

// I use a check for two, because I don't allow images in the root folder
// I also use the path to determine how it should look
// EG: thumb/125/90/imagecode.jpg
if (count($params) < 2)
    NotFound();

$type = $params[0];

// I use the type to handle different methods. For this example I only used the full-sized image
// You could use the same to handle thumbnails or cropped/watermarked versions
switch ($type) {
    //case "crop":  if (Crop($params))  return; else break;
    //case "thumb": if (Thumb($params)) return; else break;
    case "image": if (Image($params)) return; else break;
}
NotFound();
?>
<?php
/*
Just some example to show how you could create a response.
Since you already know how to create thumbs, I'm not going into details.
$params looks like:
Array
(
    [0] => image
    [1] => imagecode.JPG
)
*/
function Image($params) {
    $tmp = explode('.', $params[1]);
    if (count($tmp) != 2)
        return false;

    $code = $tmp[0];

    // WARNING!! SQL INJECTION
    // USE PROPER DB METHODS TO GET REALPATH, THIS IS JUST AN EXAMPLE
    $query = "SELECT realpath FROM images WHERE Code='" . $code . "'";
    // exec query here to $row
    $realpath = $row['realpath'];

    $f = file_get_contents($realpath);
    if ($f === false || strlen($f) <= 0)
        return false;

    // create the folder structure
    @mkdir($params[0]);
    // if you had more folders, continue creating the structure
    //@mkdir($params[0] . '/' . $params[1]);

    // store the image, so a second request won't hit this script
    file_put_contents($params[0] . '/' . $params[1], $f);
    // you could directly optimize the image for the web to make it even better
    //optimizeImage($params[0] . '/' . $params[1]);

    // now serve the file to the browser, because even the first request needs to show the image
    $finfo = finfo_open(FILEINFO_MIME_TYPE);
    header('Content-Type: ' . finfo_file($finfo, $params[0] . '/' . $params[1]));
    echo $f;
    return true;
}
?>
I would suggest you create watermarked images on the fly and cache them at the same time, as everybody suggested.
Then you could create a garbage-collector PHP script that is executed every day (using cron). This script will browse your cache folder and read every image's access time. This can be done using the fileatime() PHP function. Then, when a cached watermarked image has not been accessed within 24 or 48 hours, just delete it.
With this method, you can handle spike periods, as images are cached at the first request. AND you will save HDD space, as your garbage-collector script will delete unused images for you.
This method will only work if your server partition has atime updates enabled.
See http://php.net/manual/en/function.fileatime.php
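A rough sketch of such a daily garbage-collector script (the cache folder path and the 48-hour threshold are assumptions):
<?php
// Hypothetical cache cleaner, run daily from cron, e.g.:
//   0 3 * * * php /var/www/scripts/clean_wm_cache.php
$cacheDir = '/var/www/cache/watermarked';   // assumed cache folder
$maxAge   = 48 * 3600;                      // 48 hours; pick whatever fits

foreach (glob($cacheDir . '/*.jpg') as $file) {
    // fileatime() only reflects reads if the partition has atime updates enabled
    if (time() - fileatime($file) > $maxAge) {
        unlink($file);
    }
}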
For most scenarios, lazily applying the watermark would probably make the most sense (generate the watermarked image on the fly when requested, then cache the result). However, if you have big spikes in demand you are creating a mechanism to DoS yourself, so in that case create the watermarked version on upload.
Considering your HDD storage capacity and spikes:
I would only create a watermarked image if it is viewed (so yes, on the fly). That way you don't use too much space on a bunch of files that may or may not ever be viewed.
I would not watermark thumbnails; I would rather add an overlay filter that fakes a watermark and protects them from being saved. That filter would apply to all thumbnails without creating a second image.
This way all your thumbnails are watermarked (faked, with another element on top).
Then, if one of these thumbnails is viewed, it generates a watermarked image (only once), since after it's generated you load the new watermarked image.
This would be the most efficient way to deal with your HDD storage and spikes.
The other option would be to upgrade your hosting. GoDaddy offers unlimited storage and bandwidth for about $50 a year.

How do I only show pictures in PHP once they're done uploading?

I'm uploading an image using WinSCP, which gets sent to a web server using PHP.
Then the site refreshes automatically every X seconds and notices that a new image file is present, and displays it to the user.
However, during the loading time, it shows the following image, which doesn't look very nice:
http://s10.postimg.org/fwz0ok16h/imageupload.png
How can I ensure it only shows the image after it is completely uploaded and ready?
Is there maybe some sort of "pre-loader" or fade effect you could use in PHP that really only shows the picture once it's done? It needs to be PHP because Javascript can't find out the exact image name. Here's how I currently display the image:
foreach ($files as $key => $value) {
    if ($count >= 1)
        break;
    echo '<th><img id="image" height="250" width="250" src="files/' . $key . '"><br />' . $value . '</th>';
    $count++;
}
Open the image using:
$png = @imagecreatefrompng('stamp.png');
// Or:
$jpg = @imagecreatefromjpeg('photo.jpeg');
If it doesn't fail, the image is complete. If it fails, you either don't have the GD library installed and enabled, or the image is corrupt/incomplete.
But the best way is to check whether the image upload is complete using JavaScript event handlers (like onreadystate attached to the upload) and only refresh the page when that event is triggered.
Further links:
jQuery: Check if image exists
Taking control of image loading
Probably the best one: Is there any way to have PHP detect a corrupted image?
Before outputting a file, check its filemtime. If that modification time is less than a few seconds ago, you can assume that it's still being uploaded and skip over it.
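For example, a small sketch of that check (the files/ directory and the 5-second window are assumptions):
<?php
clearstatcache();                                // make sure filemtime() results are fresh
foreach (glob('files/*.jpg') as $file) {         // assumed upload directory
    // Skip files modified within the last few seconds;
    // they are probably still being written to by the upload.
    if (time() - filemtime($file) < 5) {
        continue;
    }
    echo '<img height="250" width="250" src="' . htmlspecialchars($file) . '">';
}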

Remote uploading MULTIPLE images

Okay, I have a question guys. I want to remote upload (copy an image from a site to my server) MULTIPLE images by putting links into a TEXTAREA and hitting submit. I just don't know how to make this possible with multiple images.
I am able to do it with a single image using the copy() function, but not for multiple entries in a TEXTAREA.
I also want to limit the remote uploading feature to 30 remote links, and one image should not exceed 10 MB, but I don't know where to start. I heard cURL is able to do this, and I also heard that file_get_contents() with file_put_contents() can do a similar thing, but I still cannot figure out how to do it myself.
Help anyone? :)
You can use the same procedure as you do now with a single image, but do it in a loop.
$lines = explode("\n", $_POST['textarea']);
if (count($lines) > 30) {
    die('Too many files');
}
foreach ($lines as $line) {
    $srcfile = trim($line);
    //copy $srcfile here
    //check size of the file with filesize()
}
You need to parse the URLs out of the textarea. You could do this on the PHP side with a regular expression.
You could then examine the parsed URLs and array_slice() the first 30, or error out if there are more than 30.
You'd then need to copy the files from the remote server. You could inspect the Content-Length header to ensure the file is under 10 MB. You could fetch just the headers using HEAD instead of GET.
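A sketch of that HEAD-based size check with cURL (the helper name is made up for illustration; the 10 MB limit comes from the question):
<?php
function remoteFileSize($url)
{
    // Issue a HEAD request and read the Content-Length header, if the server sends one.
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD instead of GET
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_exec($ch);
    $length = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
    curl_close($ch);
    return $length;                                  // -1 if the server did not report it
}

$size = remoteFileSize('http://example.com/image.jpg');
if ($size > 0 && $size <= 10 * 1024 * 1024) {
    // small enough, go ahead and copy() it
}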
I am not familiar with PHP but I suggest the following:
Solving the multiple files upload issue:
Split the content of the textarea by the line breaks.
Then iterate over the lines to get each image.
Keep the size of each file in a variable. But how do you get the size?
You can do an exec (system) call to find out the file size (this requires a full image download, but it's the most convenient way), or you can make use of the Content-Length header value; if the content length is more than 10 MB, skip it and move on to the next item.
How to download the image?
Use file_put_contents(), but make sure the content is handled as binary so the image data is preserved.

What is the "conventional" way of handling uploaded image names?

So I'm making a website with image upload functionality, and I'm storing the image name in the database. I took a screenshot on my Mac and wanted to upload this photo: "Screen shot 2011-02-18 at 6.52.20 PM.png". Well, that's not a nice name to store in MySQL! How do people usually rename photos so that each uploaded photo has a unique name? Also, how would I make sure I keep the file extension at the end when renaming the photo?
I would drop the extension; otherwise Apache (or equivalent) will run a1e99398da6cf1faa3f9a196382f1fadc7bb32fb7.php if requested (which may contain malicious PHP). I would also store the upload above the docroot.
If you need to make an image stored above the docroot accessible, you can store a safe copy that has been run through image functions, or serve it from some PHP with, for example, header('Content-Type: image/jpeg') and readfile() (not include, because I can embed PHP in a GIF file).
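A minimal sketch of that serve-through-PHP route (the storage directory and the id parameter are assumptions):
<?php
// image.php?id=<unique-name> -- hypothetical; streams a file stored above the docroot
$dir  = '/var/uploads/';                                   // assumed location above the docroot
$id   = basename(isset($_GET['id']) ? $_GET['id'] : '');   // basename() blocks ../ tricks
$path = $dir . $id;

if ($id === '' || !is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}

header('Content-Type: image/jpeg');                 // or look the real type up in your database
header('Content-Length: ' . filesize($path));
readfile($path);                                    // readfile, not include, so embedded PHP never runs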
Also, pathinfo($path, PATHINFO_EXTENSION) is the best way to get an extension.
Ensure you have stored a reference to this file, with the original filename and other metadata, in a database.
function getUniqueName($originalFilename) {
    return sha1(microtime() . $_SERVER['REMOTE_ADDR'] . $originalFilename);
}
The only way this can generate a duplicate is if one user with the same IP uploads the same filename more than once within a microsecond.
Alternatively, you could just use the basename($_FILES['upload']['tmp_name']) that PHP assigns when you upload an image. I would say it should be unique.
Hash the image name. Could be md5, sha1 or even a unix timestamp.
Here is an (untested) example with a random number (10 to 99):
<?php
function generate_unique_name($file_name)
{
    // split() is deprecated (removed in PHP 7) and treats "." as a regex; explode() does the job
    $splitted = explode(".", $file_name);
    return time() . rand(10, 99) . "." . $splitted[count($splitted) - 1];
}
?>
You could use an image table like:
id: int
filename: varchar
hash: varchar
format: enum('jpeg', 'png')
The hash can be something like sha1_file($uploaded_file) and used to make sure duplicate images aren't uploaded. (So you could have multiple entries in the image table with the same hash, if you wanted.) The id is useful so you can have integer foreign key links back to the image table.
Next store the images in either:
/image/$id.$format
or
/image/$hash.$format
The second format via the hash would make sure you don't duplicate image data. If you are dealing with lots of images, you may want to do something like:
/image/a/b/c/abcdef12345.jpg
where you use multiple layers of folders to store the images. Many file systems get slowed down with too many files in a single directory.
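A small sketch of how such a fanned-out path could be built from the hash (the folder depth and names are assumptions):
<?php
// Build a path like image/a/b/c/abcdef12345.jpg from the content hash,
// so no single directory ends up holding too many files.
$tmpFile = $_FILES['upload']['tmp_name'];
$hash    = sha1_file($tmpFile);                 // e.g. "abcdef12345..."
$format  = 'jpg';                               // determined elsewhere, assumed here

$dir = 'image/' . $hash[0] . '/' . $hash[1] . '/' . $hash[2];
if (!is_dir($dir)) {
    mkdir($dir, 0755, true);                    // create the nested folders in one go
}

$path = $dir . '/' . $hash . '.' . $format;
move_uploaded_file($tmpFile, $path);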
Now you can link to those files directly, or set up a URL like:
/image/$id/$filename
For example:
/image/12347/foo.jpg
The foo.jpg comes from whatever the user uploaded. It is actually ignored because you look up via the id. However, it makes the image have a nice name if the person chooses to download it. (You may optionally validate that the image filename matches after you look up the id.)
The above path can be translated to image.php via Apache's MultiViews or mod_rewrite. Then you can use readfile() or X-SendFile (better performance, but not always available) to send the file to the user.
Note that if you don't have X-SendFile and don't want to process things through PHP, you could use a RewriteRule to convert /image/$hash/foo.jpg into /image/a/b/c/$hash.jpg.
