Counting how many times an image is served from the server? - php

I want to count the number of times an image is served from our server. I have some images on a website and want to count how many times they are shown on web pages (served from the server to my website, and when hotlinked). Is there any way to accomplish this? I know PHP, so if there is a way to do it in PHP that would be really helpful.
Advice please.
Thank you.

Can't you look at your server logs for that?

If you want something beyond parsing server logs, you'd have to set up a database to manage the list of images and the number of times they're accessed. Serve the images through a PHP script which increments the DB value with each request. You could use a flat-file system too, but I prefer the DB solution.
You wouldn't need to worry about the source of your image if you implement .htaccess and Apache's mod_rewrite. You could serve URLs like this:
http://mysite.com/images/001.jpg
Which would be understood on the server as:
http://mysite.com/images.php?id=001
Thus providing a basis for database actions and scripted logic.
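A minimal .htaccess sketch of that rewrite (assuming mod_rewrite is enabled; the pattern is illustrative):
RewriteEngine On
# internally map /images/001.jpg to /images.php?id=001
RewriteRule ^images/(\d+)\.jpg$ images.php?id=$1 [L]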

You can use Microsoft's LogParser to query your server logs using a query something like this:
c:\Program Files\Log Parser 2.2> logparser "select cs-uri-stem, count(*) as Hits from C:\Your\Log\File\Path\ex091002.log where cs-uri-stem like 'imagefilename.jpg' or cs-uri-stem like 'anotherimage.jpg' group by cs-uri-stem order by Hits DESC" -i:w3c
You can even have it output to a text file or a graph (requires Excel, I believe) if you need something to display on a page. You'll probably have to change the query if you're using Apache logs rather than IIS; I'm not sure.

You should be able to gather this information using the log files and an analytics package. If you are running IIS, a really good one to look into (and free for one domain) is SmarterTools' SmarterStats. www.smartertools.com

The answers recommending looking in the logs are right. But if for some reason that's not acceptable, it's not hard to configure a php script to handle this.
1) Create a rewrite rule (using mod_rewrite) to transparently rewrite requests for your image to go to a PHP file instead, with the image's name as a parameter.
2) Your PHP script can log the request, then send out the appropriate MIME type for the image and dump the real file to the output buffer (this shouldn't be affected by your rewrite rule as long as you load from the file system rather than using a URL stream).
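A minimal sketch of such a script, assuming PDO with a MySQL table images(name, hits); all names are illustrative:
<?php
// count_image.php - increment the hit counter, then serve the image
$name = basename($_GET['img']);                 // strip any path components
$path = __DIR__ . '/images/' . $name;
if (!is_file($path)) { header('HTTP/1.0 404 Not Found'); exit; }
$db = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$db->prepare('UPDATE images SET hits = hits + 1 WHERE name = ?')->execute([$name]);
header('Content-Type: image/jpeg');             // adjust for other image types
header('Content-Length: ' . filesize($path));
readfile($path);                                // dump the real file to the output buffer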

Related

How to use PHP to get the website bandwidth

I want to get the website bandwidth with PHP, i.e. information like this:
aaa.example.com 200g/m
bbb.example.com 150g/m
I don't think PHP has any standard tools to provide this data. You will probably have to set up custom Apache logs that record the subdomain and response size for every request, then extract that data and present it with a statistical tool like Munin. Actually, maybe someone has already made a Munin plugin that does just that; try googling for it.
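A minimal sketch of such a custom Apache log, using the stock mod_log_config directives (%v is the virtual host, %B the response body size in bytes):
LogFormat "%v %B" vhost_bytes
CustomLog logs/bandwidth.log vhost_bytes
Summing the second column per virtual host then gives figures like the ones above.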

Can I embed a server-side action inside a file (mp3s, images etc)?

I'm looking for a way to send a user a regular file (an mp3 or a picture) while keeping count of how many times the file was accessed, without going through an HTML/PHP page.
For example, the user will point his browser to bla.com/file.mp3 and start downloading it, while a server-side script will do something like saving data to a database.
Any idea where should I get started?
Thanks!
You will need to go through a PHP script. What you could do is rewrite the extensions you want to track, preferably at the folder level, to a PHP script which then does the bookkeeping you need and serves the file to the user.
For Example:
If you want to track the /downloads/ folder, you would create a rewrite on your web server that sends all, or just specific, extensions to a PHP file; we'll call it proxy.php for this example.
An example URI would be proxy.php?file=file.mp3. The proxy.php script sanitizes the file parameter, checks whether the user has permission to download (if applicable), checks that the file exists, serves the file to the client, and performs any operations needed on the backend, like database updates.
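A minimal sketch of such a proxy.php, assuming a downloads(filename, hits) table; all names are illustrative:
<?php
// proxy.php - log the download, then stream the file
$file = basename($_GET['file']);               // sanitize: no directory traversal
$path = __DIR__ . '/downloads/' . $file;
if (!is_file($path)) { header('HTTP/1.0 404 Not Found'); exit; }
$db = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$db->prepare('UPDATE downloads SET hits = hits + 1 WHERE filename = ?')->execute([$file]);
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));
readfile($path);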
Do you mean that you don't want your users to be presented with a specific page that interrupts their flow? If so, you can still use a PHP page via the following steps. (I'm not up to date with PHP, so this is pseudo-code, but you'll get the idea.)
Provide links to your file as (for example) http://example.com/trackedDownloader.php?id=someUniqueIdentifer
In the trackedDownloader.php file, determine the real location on the server that corresponds to the unique id (e.g. 12345 could map to /uploadedFiles/AnExample.mp3).
Set an appropriate content type header in your output.
Log the request to your database.
Return the contents of the file directly as page output.
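A rough sketch of those steps; the id-to-path map is illustrative and would normally live in a database:
<?php
// trackedDownloader.php?id=12345
$files = ['12345' => '/uploadedFiles/AnExample.mp3'];  // unique id => real path
if (!isset($files[$_GET['id']])) { header('HTTP/1.0 404 Not Found'); exit; }
$path = $files[$_GET['id']];
header('Content-Type: audio/mpeg');                    // appropriate content type
header('Content-Length: ' . filesize($path));
// log the request to your database here
readfile($path);                                       // file contents as the page output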
You would otherwise need to scan log files. Regardless, you would most likely want to store the counters in a database.
There is a great solution for serving static files via PHP: mod_xsendfile.
https://tn123.org/mod_xsendfile/
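With mod_xsendfile installed, the PHP script only does the bookkeeping and emits a header; Apache streams the file itself. A minimal sketch (the path is illustrative):
<?php
// serve.php - count the hit, let Apache send the bytes
// ...increment your counter here...
header('Content-Type: audio/mpeg');
header('X-Sendfile: /var/www/files/file.mp3');  // absolute path on the server
exit;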

dynamic web page automatic snapshot

I have a dynamic web site (PHP/MySQL/Ajax on a Linux server). I need to automatically take a snapshot of each web page periodically (if I can find a way to take the snapshot, I can use cron) and save the image to the database (I also know how to do that; my only problem is the snapshot!).
I can't do it manually, so I need a script which takes the snapshot for me without displaying the web page, i.e. directly from the .php files.
How is it possible?
Thanks!
http://browsershots.org/ may work for you; they have an API.
You can use the GD functions imagegrabscreen() or imagegrabwindow() to take a screenshot.
Note that they're only available on Windows at the moment.
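A minimal sketch, with the Windows-only caveat above (imagegrabscreen() captures the whole desktop, so the page must already be rendered in a browser on that machine):
<?php
$im = imagegrabscreen();          // capture the current screen (Windows + GD only)
imagepng($im, 'snapshot.png');    // save the capture as a PNG
imagedestroy($im);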
This might answer your question. I have seen it done with PHP and Flash but wasn't privy to the inner workings; if the link doesn't help, you could research that route.

Restricting access to images on a website

I'm putting together a portfolio website which includes a number of images, some of which I don't want to be viewable by the general public. I imagine that I'll email someone a user name and password, with which they can log in to view my work.
I've seen various solutions to the "hide-an-image" problem online, including the following, which uses PHP's readfile. I've also seen another that uses .htaccess.
Use php's readfile() or redirect to display a image file?
I'm not crazy about the readfile solution, as it seems slow to load the images, and I'd like to be able to use Cabel Sasser's FancyZoom, which needs unfettered access to the image (his library wants a link to the full-sized image), so that rules out .htaccess.
To recap what I'm trying to do:
1) Provide a site where I give users the ability to authenticate themselves as someone I'd like to have looking at my images.
2) Restrict random web users from being able to see those images.
3) Use FancyZoom to blow up thumbnails.
I don't care what technology this ends up using -- Javascript, PHP, etc. -- whatever's cleanest and easiest.
By the way, I'm a Java Developer, not a web developer, so I'm probably not thinking about the problem correctly.
Instead of providing a link to an image, provide a link to a CGI script which will automatically provide the proper header and the content of the image.
For example:
image.php?sample.jpg
You can then make sure they are already authenticated (e.g. pass a session id) as part of the link.
This would be part of the header, and then your image data can follow:
header('Content-Type: image/jpeg');
Edit: If it has to be fast, you can write this in C/C++ instead of php.
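A minimal sketch of that image.php, assuming a session flag is set at login; the names and paths are illustrative:
<?php
// image.php?f=sample.jpg - only serve images to authenticated users
session_start();
if (empty($_SESSION['authenticated'])) { header('HTTP/1.0 403 Forbidden'); exit; }
$file = basename($_GET['f']);                   // no directory traversal
$path = '/var/www/private_images/' . $file;     // stored outside the public docroot
if (!is_file($path)) { header('HTTP/1.0 404 Not Found'); exit; }
header('Content-Type: image/jpeg');
readfile($path);
FancyZoom can then be pointed at image.php?f=sample.jpg as if it were the full-sized image.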
Using .htaccess should be the safest/simplest method, as it's built-in functionality of the web server itself.
I do not know if it fits your needs, but I solved a similar problem (giving pictures to a restricted group of people) by using TinyWebGallery, which is a small gallery application without a database.
You can allow access to different directories via password, and you can upload pictures directly into the filesystem, as TinyWebGallery will check for new dirs/pics on the fly. It will generate thumbnails and give users the possibility to rate/comment on pictures (you can disable this).
This is not the smallest tool; however, I think it is far easier to set up than Apache directives, and it looks better than naked images.
If you're using Nginx, you could use the Secure Link module.

PHP: I want to create a page that extracts images from a forum thread, doable? codeigniter?

You have a forum (vBulletin) with a bunch of images. How easy would it be to have a page that visits a thread, steps through each page, and forwards the images to the user (via Ajax or whatever)? I'm not asking about filtering (that's easy, of course).
Doable in a day? :)
I have a site that uses CodeIgniter as well; would it be even simpler using it?
Assuming this is to be carried out on the server, cURL + regexps are your friends... and yes, doable in a day.
There are also some open-source HTML parsers that might make this cleaner.
It depends on where your scraping script runs.
If it runs on the same server as the forum software, you might want to access the database directly and check for image links there. I'm not familiar with vbulletin, but probably it offers a plugin api that allows for high level database access. That would simplify querying all posts in a thread.
If, however, your script runs on a different machine (or, in other words, is unrelated to the forum software), it would have to act as a http client. It could fetch all pages of a thread (either automatically by searching for a NEXT link in a page or manually by having all pages specified as parameters) and search the html source code for image tags (<img .../>).
Then a regular expression could be used to extract the image urls. Finally, the script could use these image urls to construct another page displaying all these images, or it could download them and create a package.
In the second case the script actually acts as a "spider", so it should respect things like robots.txt or meta tags.
When doing this, make sure to rate-limit your fetching. You don't want to overload the forum server by requesting many pages per second. Simplest way to do this is probably just to sleep for X seconds between each fetch.
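A minimal sketch of the fetch-and-extract step, assuming cURL and a simple regexp (an HTML parser would be more robust; the URL is illustrative):
<?php
// fetch one page of the thread
$ch = curl_init('http://forum.example.com/showthread.php?t=123&page=1');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);
// extract image URLs from <img src="..."> tags
preg_match_all('/<img[^>]+src=["\']([^"\']+)["\']/i', $html, $matches);
$imageUrls = $matches[1];
sleep(2);  // rate-limit before fetching the next page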
Yes, doable in a day.
Since you already have a working CI setup, I would use it.
I would use the following approach:
1) Make a model in CI capable of:
logging in to vbulletin (images are often added as attachments and you need to be logged in before you can download them). Use something like snoopy.
collecting the URL of the "last page" button using preg_match(), parsing the URL with parse_url() and parse_str(), and generating links from page 1 to the last page
collecting html from all generated links. Still using snoopy.
finding all images in html using preg_match_all()
downloading all images. Still using snoopy.
moving the downloaded image from a tmp directory into another directory, renaming it imagename_01, imagename_02, etc. if the same image name already exists (see the sketch after this list)
saving the image name and precise byte size in a DB table, so you can avoid downloading the same image more than once
2) Make a method in a controller that collects all images
3) Set up a cronjob that collects images at regular intervals. wget -O /tmp/useless.html http://localhost/imageminer/collect should do nicely.
4) Write the code that outputs pretty HTML for the end user, using the DB table to get the images.
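A rough sketch of the move-and-rename step from the list above (paths and names are illustrative):
<?php
// move $tmpPath into $destDir, appending _01, _02, ... on name collisions
function moveUnique($tmpPath, $destDir) {
    $name = pathinfo($tmpPath, PATHINFO_FILENAME);
    $ext  = pathinfo($tmpPath, PATHINFO_EXTENSION);
    $dest = "$destDir/$name.$ext";
    for ($i = 1; file_exists($dest); $i++) {
        $dest = sprintf('%s/%s_%02d.%s', $destDir, $name, $i, $ext);
    }
    rename($tmpPath, $dest);   // atomic when source and target share a filesystem
    return $dest;
}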
