Loading time of GD and web parsing - PHP

I created a PHP file in which a map is drawn with GD based on data obtained from another site. The problem is that the PHP run time makes the page load very slowly.
The question is: is there any way to have this PHP code execute only once a day? Or any way to have the web server run it automatically?

You need to cache your map image and load it from a file if it already exists. Regenerate it once a day. This skeletal code outlines how that can be accomplished: the first time the page loads after the image has become more than a day old, it will be regenerated and saved to a file.
// If the file is missing or older than 1 day, create a new one
if (!file_exists("imagecache.jpg") || filemtime("imagecache.jpg") < time() - 86400) {
    // Generate your new image and write it to a file
    // (assuming $im is an image resource from GD)
    imagejpeg($im, "imagecache.jpg");
}

There are many ways to do this. They all start with a PHP script that creates a static graphic file using GD and saves it somewhere on disk. This is what you will show to users.
Once you're generating that file, your two easiest choices are:
Point your users to the static file, and invoke the php periodically using cron or something similar.
Point your users to a PHP script. Have your PHP script check the graphic file's timestamp and if it's older than a certain age, regenerate it and then send that to the user. Thus you have some PHP overhead but it's less than generating the graphic every time.
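The second option can be sketched roughly as follows. This is a minimal sketch, assuming the map is drawn by a `generate_map()` helper you already have and cached as `imagecache.jpg` (both names are placeholders):

```php
<?php
// Front script: regenerate the cached map only when it is missing or stale,
// then serve the cached copy.

const CACHE_FILE = 'imagecache.jpg';
const MAX_AGE    = 86400; // one day, in seconds

// True if the cached copy is missing or older than $maxAge seconds.
function cache_is_stale(string $file, int $maxAge, int $now): bool
{
    return !file_exists($file) || filemtime($file) < $now - $maxAge;
}

if (PHP_SAPI !== 'cli') {               // only when requested over the web
    if (cache_is_stale(CACHE_FILE, MAX_AGE, time())) {
        $im = generate_map();           // your existing GD drawing code
        imagejpeg($im, CACHE_FILE);
        imagedestroy($im);
    }
    header('Content-Type: image/jpeg');
    readfile(CACHE_FILE);
}
```

Most requests then pay only a `filemtime()` check plus a `readfile()`, which is far cheaper than redrawing the map with GD each time.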

Create a cron job that runs once a day (preferably during a light traffic time) to do your heavy lifting, save or cache the result (for example, using APC or memcached, or even just overwriting the currently-used image with the new one), and display that result to your users.

Run large php scripts without them timing out?

I'm trying to mass compress images with GD on my site. It works just fine when I compress a small folder with 20 images, but I have around 70k images, and when I run the script I get a timeout and a 500 error message. This is the code:
$di = new RecursiveDirectoryIterator('./image/data/');
$iter = new RecursiveIteratorIterator($di);
$regexIter = new RegexIterator($iter, '/^.+\.jpg$/i', RecursiveRegexIterator::GET_MATCH);
foreach ($regexIter as $fileInfo) {
    $img = imagecreatefromjpeg($fileInfo[0]);
    imagejpeg($img, $fileInfo[0], 75);
    imagedestroy($img); // free memory before loading the next image
}
Now I already searched for this topic and found out that I can use:
set_time_limit();
So I decided to add
set_time_limit(100000);
but this is not working; I still get the timeout message and no images are compressed.
Do you have any suggestions on how I could do this efficiently because typing in every folder would take me weeks.
The better way to do big jobs is to split them into smaller parts.
For example, move the processed pictures into another directory and stop the script after 100 pictures.
Then you just restart the same script a few times and all pictures are done.
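The batching idea above can be sketched like this. The directory names, batch size, and the convention of moving finished files into a "done" directory are all assumptions to adapt:

```php
<?php
// Compress at most $batchSize JPEGs per run, moving each finished file into
// $doneDir so the next run picks up where this one left off.

function compress_batch(string $srcDir, string $doneDir, int $batchSize, int $quality = 75): int
{
    $count = 0;
    foreach (glob($srcDir . '/*.jpg') as $path) {
        if ($count >= $batchSize) {
            break;                      // stop early; the next run continues
        }
        $img = imagecreatefromjpeg($path);
        imagejpeg($img, $path, $quality);
        imagedestroy($img);             // free memory before the next image
        rename($path, $doneDir . '/' . basename($path));
        $count++;
    }
    return $count;                      // 0 means nothing was left to do
}

// Re-run this (from cron, or by reloading the page) until it returns 0.
```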
To answer your question: increasing the timeout is something you should ask your hosting provider about. Of course, they may refuse to do it.
A good idea is to convert your script to run from the command line. The processing is faster and the timeout is usually much, much higher. But then again, it requires you to have command-line access on the server.
Last and preferred option is to transform your script into "chaining". Since most of the time will be spent doing the actual image conversion, this is what I would do:
get a list of all images with their full path; save in session or in a temporary table
start processing each image from the list, deleting it after it's been done
for every image, check how much time has passed since the start of the script; if it's getting close to the timeout, redirect to the same script with an additional "offset" parameter
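The chaining steps above can be sketched as follows. This is a rough sketch, assuming the list of image paths was saved beforehand to a plain text file (`images.txt`, one path per line) rather than a session or temporary table:

```php
<?php
// Process images until we get close to the PHP time limit, then redirect to
// the same script so a fresh request continues with the remaining files.

const TIME_BUDGET = 25;   // seconds; keep safely below max_execution_time

function time_is_up(float $start, float $now, int $budget = TIME_BUDGET): bool
{
    return ($now - $start) >= $budget;
}

$start = microtime(true);
$list  = is_file('images.txt') ? file('images.txt', FILE_IGNORE_NEW_LINES) : [];

while ($list && !time_is_up($start, microtime(true))) {
    $path = array_shift($list);
    $img  = imagecreatefromjpeg($path);
    imagejpeg($img, $path, 75);
    imagedestroy($img);
    file_put_contents('images.txt', implode("\n", $list)); // persist progress
}

if ($list) {    // time ran out: chain into a new request to continue
    header('Location: ' . $_SERVER['PHP_SELF']);
    exit;
}
```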

About PHP Thumbnails - Storing them or Generating them on the Fly

I am making a web application that needs to show 3 types of thumbnails to a user. Now I might end up with a lot of thumbnail files on the server for a lot of users.
This makes me wonder: is generating thumbnails on the fly a better option than storing them?
Speed vs Storage vs Logic - Which one to go for?
Does anyone here ever faced such a dilemma - let me know!
I am using CodeIgniter and its inbuilt Image Library for generating thumbnails.
I would go with: generate when needed, store afterwards.
Link to the image using a URL like /img/42/400x300.jpg. Through rewrite rules, you can fire up a PHP script should the image not exist. That script can then generate the requested image in the requested size and store it in the public web folder, where the web server can serve it directly the next time.
That gives you the best of both worlds: the image is not generated until needed, it is only generated once and it even makes it very flexible to work with different image sizes on the fly.
If you're worried about storage space, you can add a regular clean-up job which removes old images, or perhaps analyses your access log files and removes images which were not accessed for some time.
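A rough sketch of that rewrite-based approach follows. It assumes an Apache rule along these lines sends requests for not-yet-existing files under `/img/` to the script, and the `/img/<id>/<w>x<h>.jpg` URL scheme and `originals` directory are placeholders:

```php
<?php
// .htaccess (assumed):
//   RewriteCond %{REQUEST_FILENAME} !-f
//   RewriteRule ^img/ thumb.php [L]

// Extract image id and requested size from a URL like /img/42/400x300.jpg.
function parse_thumb_url(string $uri): ?array
{
    if (preg_match('#^/img/(\d+)/(\d+)x(\d+)\.jpg$#', $uri, $m)) {
        return ['id' => (int) $m[1], 'w' => (int) $m[2], 'h' => (int) $m[3]];
    }
    return null;    // not a thumbnail URL we recognise
}

if (PHP_SAPI !== 'cli') {
    $req = parse_thumb_url($_SERVER['REQUEST_URI']);
    if ($req === null) {
        http_response_code(404);
        exit;
    }
    $src   = imagecreatefromjpeg("originals/{$req['id']}.jpg");
    $thumb = imagescale($src, $req['w'], $req['h']);

    // Save into the public web folder so the web server serves it directly
    // next time, without touching PHP at all.
    @mkdir("img/{$req['id']}", 0755, true);
    imagejpeg($thumb, "img/{$req['id']}/{$req['w']}x{$req['h']}.jpg");

    header('Content-Type: image/jpeg');
    imagejpeg($thumb);
}
```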
My comment as an answer: (why not :)
My personal thoughts on this: if you're anticipating a lot of users, go with storage, as the load of creating dynamic thumbnails for every one of these users on every page load is going to hurt the server. Maybe create each thumbnail dynamically the first time it's ever viewed and then store it.
You may also take advantage of browser caching to save load and bandwidth. (marginal but every little helps)

Using PHP to update file after a new copy is uploaded

So I'm trying to see if something like this is possible WITHOUT using database.
A file is uploaded to the server /files/file1.html
PHP is tracking the upload time by checking last update time in database
If the file (file1.html) has been updated since the last DB time, PHP makes changes; Otherwise, no changes are made
Basically, for a text simulation game (basketball), it outputs HTML files for rosters/stats/standings/etc. and I'd like to be able to insert each team's Logo at the top (which the outputted files don't do). Obviously, it would need to be done often as the outputted files are uploaded to the server daily. I don't want to have to go through each team's roster manually inserting images at the top.
Don't have an example as the league hasn't started.
I've been thinking of just creating a button on the league's website (not created yet) that when pushed would update the pages, but I'm hoping to have PHP do it by itself.
Yes, you could simply let PHP check the file's modification date (the time the file was created or changed on the server, not when the picture itself was taken). Check http://php.net/manual/en/function.filemtime.php and you should be done within 30 minutes ;)
sexy quick & dirty unproven code:
$filename = 'somefile.txt';
// Store the timestamp of the last run in a small text file (replaces the DB)
$last_run = (int) @file_get_contents('lastrun.txt');
if (filemtime($filename) > $last_run) {
    // The file was re-uploaded since our last run: make your changes here
    file_put_contents('lastrun.txt', time()); // remember this run
}

How to output a large image to the browser using PHP?

I have a very large image generated on the fly with PHP and outputted to the browser. (it's 5000px wide and 1000-2000px tall. It's a plot of the daily user activity on my site).
The problem is that nowadays the plot is too big and the PHP script gives memory-exhausted errors (though the generated PNG itself is quite small), so I can't get the image.
Is there way to output this large image in multiple parts somehow using GD in PNG format?
(ps: the host where I run the site uses safe mode, so I can't modify the configuration and I think they're using the default PHP installation.)
EDIT1: It's an admin script. No users see it except me.
EDIT2: An example image can be seen here: http://users.atw.hu/calmarius/trash/wtfb2/x.png
(I also have the option to group the tracks by IP address.)
Every user+IP pair has its own 24-hour track on the plot, and every green mark denotes user activity. As you can see, this image can be output track by track; there is no need to generate and output the whole thing at once.
This website will be an online strategy game and I want to use this graph in the future to make detecting multiaccounts easier. (Users who are trying to get advantage by registering multiple accounts over those ones who only have 1.) But this is a different problem.
I'm using a PHP script because I'm too lazy to export the request log from the database, download it, and feed the data to a program that would make the plot for me. ;)
Set the memory limit to unlimited before processing the image.
ini_set('memory_limit', '-1');
It'd help to say how you're generating the image (GD library, ImageMagick) and how you're outputting it. Are you saving the file to a directory and then using readfile() to output it? If so, an fopen/fread/echo combination is about 50%-60% faster than readfile() for sending files to the browser. Are you using gzip compression? What's the time limit on PHP execution? What's the exact error message you're getting?
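The fopen/fread/echo pattern mentioned above can be sketched like this. It assumes the plot has already been generated and saved (the name `plot.png` is a placeholder); the point is to stream it in small chunks rather than loading the whole file into memory:

```php
<?php
// Stream a file to the browser in fixed-size chunks; returns bytes written.
function stream_file(string $path, int $chunkSize = 8192): int
{
    $sent = 0;
    $fp = fopen($path, 'rb');
    while (!feof($fp)) {
        $chunk = fread($fp, $chunkSize);
        echo $chunk;
        $sent += strlen($chunk);
    }
    fclose($fp);
    return $sent;
}

if (PHP_SAPI !== 'cli') {
    header('Content-Type: image/png');
    header('Content-Length: ' . filesize('plot.png'));
    stream_file('plot.png');
}
```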

I want to create multiple thumbnails using GD library in php, which is better creating on the fly or creating physical one?

I want to create multiple thumbnails using the GD library in PHP, and I already have a script to do this. The question is what is better for me: creating thumbnails on the fly, or creating a physical file on my server each time I want a thumb? And why?
Please consider time consumption, storage capacity, and other disadvantages of both.
When you create the thumbnail depends on a couple of factors (that I'll get into) but you should never discard the output of something like this (unless you'll never use it again) as it's a really expensive operation.
Anyway your two main choices for "when to generate the thumbnail" are:
When it's first requested. This is common and it means that you don't generate thumbnails that are never used but it does mean if you have a page full of first-time-thumbnails that the server might become overwhelmed with PHP processes generating the thumbnails.
I had a similar issue with Sorl+Django where I was generating 100+ thumbnails per request for the first few requests after uploading and it basically made the entire server hang for 20 minutes. Not good.
Generate all required thumbnails when you upload. Because the upload already takes a long time, this spreads out the processing quite a lot. You can also pull it out-of-process (i.e. use another script to process uploads, perhaps not even in PHP).
The obvious downside is you're using up disk space that you otherwise might not need to use up... But unless you're talking about hundreds of thousands of thumbnails, a small percentage of unused ones probably won't break the bank.
Of course, if disk space is an issue, there might be an argument for pushing the thumbnail up to a CDN at the same time as you process it.
One note: when you save thumbnails, it's fairly common that you'll want to resize them at some point down the line, or perhaps want several size variants. I find it really useful to make the filenames very specific, so if the original image is image.jpg, the 200x200 version is image-200x200.jpg.
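That naming convention can be captured in a tiny helper (the function name is just an example):

```php
<?php
// Derive a size-specific thumbnail name from the original file name.
function thumb_name(string $original, int $w, int $h): string
{
    $dot  = strrpos($original, '.');
    $base = substr($original, 0, $dot);
    $ext  = substr($original, $dot);
    return "{$base}-{$w}x{$h}{$ext}";
}

// thumb_name('image.jpg', 200, 200) gives 'image-200x200.jpg'
```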
Neither/both - don't generate the thumbnails till you need them - but keep the files you generate.
That way you'll minimise the amount of work needed and have a self-repairing system
C.
GD is really resource-heavy, so you should look at whether you can use ImageMagick instead (which also has a clearer syntax).
You definitely will be better off caching the created thumbnail after the first run (regardless of if you run GD or ImageMagick) and serve them from the cache. If you are worried about storage, clear out old files from the cache now and then.
Always cache (= write out to disk) the results of GD operations. They are too expensive both regarding processor time and memory to be done on the fly every time. This becomes increasingly true the more visitors/hits you have.
