I have a dynamic web site (PHP/MySQL/Ajax on a Linux server), and I need to automatically take a snapshot (screenshot) of each web page periodically (once I find a way to take the snapshot, I can use cron) and save the image to the database (I also know how to do that; my only problem is the screenshot!).
I can't do it manually, so I need a script that takes the screenshot for me, without displaying the web page, i.e. directly from the .php files.
How can this be done?
Thanks!
http://browsershots.org/ may work for you; they have an API.
You can use the GD functions imagegrabscreen() or imagegrabwindow() to take a screenshot.
Note that they're only available on Windows at the moment.
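A rough sketch of how imagegrabwindow() is typically used: driving Internet Explorer through the COM extension, so this only works on a Windows box (the URL and output filename are placeholders):

    <?php
    // Windows-only sketch: drive Internet Explorer via COM and grab its window.
    $browser = new COM('InternetExplorer.Application');
    $handle  = $browser->HWND;
    $browser->Visible = true;
    $browser->Navigate('http://example.com/page.php');   // the page to capture
    while ($browser->Busy) {
        com_message_pump(4000);                           // wait for the page to finish loading
    }
    $img = imagegrabwindow($handle);                      // capture the browser window
    $browser->Quit();
    imagepng($img, 'snapshot.png');
    imagedestroy($img);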
Looks like this might answer your question. I have seen it done with PHP and Flash, but I wasn't privy to the inner workings; if the link doesn't help, you could research that route.
I want to get each website's bandwidth usage with PHP, i.e. information like this:
aaa.example.com 200g/m
bbb.example.com 150g/m
I don't think PHP has any standard tools to provide this data. You will probably have to set up custom Apache logs that record the subdomain and response size for every request, then extract that data and present it with a statistical tool like Munin. Someone may already have written a Munin plugin that does exactly that, so it's worth a Google search.
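A sketch of that approach; the log format, file path, and field layout below are assumptions, not anything Apache provides by default:

    <?php
    // Sum bytes per (sub)domain from a hypothetical custom Apache log.
    // Assumes something like:  LogFormat "%v %B" vhost_bytes  plus a matching CustomLog.
    $totals = array();
    foreach (file('/var/log/apache2/vhost_bytes.log', FILE_IGNORE_NEW_LINES) as $line) {
        list($host, $bytes) = explode(' ', $line, 2);
        if (!isset($totals[$host])) {
            $totals[$host] = 0;
        }
        $totals[$host] += (int) $bytes;
    }
    foreach ($totals as $host => $bytes) {
        printf("%s %.2f GB\n", $host, $bytes / 1073741824);
    }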
This is a very broad question, so I'm just looking for the best way of doing this.
I want to periodically monitor certain pages on my website.
I am looking to write a PHP script that will load the page as if it were being loaded in a browser, meaning it loads all CSS, JavaScript, images, videos, etc.
I just want to get the load time of these pages and then email the results to myself from a cron job. For this I was going to use microtime() and PHPMailer.
Does anyone know of a script to load a complete page, or have any suggestions on how to go about this?
Thanks.
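For what it's worth, a minimal sketch of the microtime() idea using cURL, which only times the raw HTML response (no CSS, JS, or images); the URL and address are placeholders, and mail() stands in for PHPMailer:

    <?php
    // Rough sketch: time how long the raw HTML takes to fetch, then mail the result.
    $url   = 'http://example.com/page-to-monitor.php';
    $start = microtime(true);
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $html    = curl_exec($ch);
    $elapsed = microtime(true) - $start;
    curl_close($ch);
    mail('you@example.com', 'Page load time', sprintf('%s took %.3f seconds', $url, $elapsed));

Run from cron, this measures only what PHP itself can fetch, which is exactly the limitation raised below.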
What if the page has dynamic content? You would also need to execute all the JavaScript and fetch all the CSS and images to get the final load time. I believe that is impossible using only PHP.
A PHP script run from the same server that hosts your site will give you abnormally low readings, since it is essentially loading on the first hop. What you really want to do is run the script from various servers outside your own. There are also limitations on what PHP can see, e.g. JS and jQuery.
The simplest approach is to check from your home PC using JMeter. You set your home browser to use it as a proxy and browse to whichever pages you want; JMeter will record statistics, and when you are happy you can save them.
This avoids the problems of handling JS and jQuery through a script.
This could get very complicated. You'd basically have to parse the HTML, and there are tons of edge cases, like JS loading additional resources. I would recommend using something like the Network tab of Chrome's dev tools instead.
Hi,
I download a large number of files for data mining. I have been using PHP for this, but I am finding it too slow. Also, I only need a small part of each web page. I want to achieve two things:
1) cURL should be able to utilize all of my download bandwidth.
2) Is there a way to download only the part of the web page where my data resides?
I am not confined to PHP. If cURL works better from the terminal, I would use that.
Yes, you can download only a part of the page by using the CURLOPT_RANGE option, and you can also provide a write callback function that simply returns an error when you've received "enough" data and you want to stop and move on.
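A rough sketch of both ideas; note that CURLOPT_RANGE only helps if the server honours Range requests, and the 64 KB cut-off below is arbitrary:

    <?php
    // Fetch at most the first 64 KB of a page, aborting the transfer once enough has arrived.
    $limit    = 65536;
    $received = 0;
    $body     = '';
    $ch = curl_init('http://example.com/large-page.html');
    curl_setopt($ch, CURLOPT_RANGE, '0-65535');   // honoured only if the server supports ranges
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) use (&$received, &$body, $limit) {
        $body     .= $chunk;
        $received += strlen($chunk);
        // Returning anything other than the chunk length makes cURL abort the transfer.
        return ($received < $limit) ? strlen($chunk) : -1;
    });
    curl_exec($ch);   // returns false if we aborted early; $body still holds the data received
    curl_close($ch);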
Are you downloading HTML? Your comment leads me to believe that you are. If that's the case, simply load the HTML with Simple HTML DOM and get only the part that you want. Although I find it hard to believe that grabbing just the HTML is slowing you down; are you downloading any files or media as well?
Link : http://simplehtmldom.sourceforge.net/
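For instance, with that library (the selector below is just an example; adjust it to wherever your data actually lives):

    <?php
    // Requires the Simple HTML DOM library from the link above.
    include 'simple_html_dom.php';
    $html = file_get_html('http://example.com/page.html');
    // Grab only the element that contains the data you care about.
    $data = $html->find('div#data', 0);
    echo $data ? $data->plaintext : 'not found';
    $html->clear();   // free memory held by the parser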
Generally there is no way to download only part of a page: unless the server supports HTTP Range requests, when you request a URL the response is what it is.
Utilize more of your bandwidth by using cURL's ability to make multiple connections at once.
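A minimal sketch of that with PHP's curl_multi functions (the URLs are placeholders, and in practice you would cap the number of simultaneous handles):

    <?php
    // Download several pages in parallel with curl_multi.
    $urls = array('http://example.com/a.html', 'http://example.com/b.html', 'http://example.com/c.html');
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);        // wait for activity instead of busy-looping
    } while ($running > 0);
    $pages = array();
    foreach ($handles as $url => $ch) {
        $pages[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);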
I have to make an image of a dynamic page, i.e. a page that changes every 5 minutes.
I want to make images of that changing page so that I can keep a record of it in the form of images.
How can I do that using PHP?
I have no idea how to approach this, and a little elaboration in the answers would be highly appreciated!
Two steps:
1: Create a script that captures the current data in image form.
If you provide more information about what you mean by "create an image of dynamic data", I can probably point you to some resources you can use. For now, just have a look at the GD library (see the sketch after the links below).
2: Set up a job that runs the script every 5 minutes
This can be done via cron. I would suggest investigating whether you can run the script when the data changes, instead of at fixed intervals.
http://www.devarticles.com/c/a/PHP/Generating-Images-on-the-Fly-With-PHP/
http://www.thesitewizard.com/php/create-image.shtml
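As a rough illustration of step 1, a minimal sketch that renders some current data into a PNG with GD (the data source and output path are placeholders):

    <?php
    // capture.php - hypothetical sketch: render the current data as a PNG using GD.
    $data = 'Value at ' . date('Y-m-d H:i') . ': 42';   // stand-in for your real dynamic data
    $img  = imagecreatetruecolor(400, 50);
    $bg   = imagecolorallocate($img, 255, 255, 255);
    $fg   = imagecolorallocate($img, 0, 0, 0);
    imagefill($img, 0, 0, $bg);
    imagestring($img, 4, 10, 15, $data, $fg);
    imagepng($img, '/var/www/snapshots/' . time() . '.png');   // or store the blob in the database
    imagedestroy($img);

For step 2, a crontab entry such as */5 * * * * php /path/to/capture.php would run it every 5 minutes.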
Getting a screenshot of a web page isn't an easy task.
You can choose one of the online services that do that for you and you can download the images from there.
Otherwise, I have found solutions using WebKit and Python, but you will need full access to your Linux server in order to install the necessary packages; then you will be able to call that script from PHP and get your screenshots.
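For example, assuming a command-line WebKit screenshot tool is installed (wkhtmltoimage is used below, but the Python/WebKit script mentioned above would be called the same way; all paths are placeholders), the PHP side might look like this:

    <?php
    // Hypothetical sketch: call an external WebKit-based screenshot tool from PHP.
    $url = 'http://example.com/page.php';
    $out = '/var/www/snapshots/page-' . time() . '.png';
    $cmd = 'wkhtmltoimage ' . escapeshellarg($url) . ' ' . escapeshellarg($out);
    exec($cmd, $output, $status);
    if ($status !== 0) {
        // handle the failure (log it, retry, etc.)
    }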
I want to count the number of times an image is served from our server. I have some images on a website and want to count how often they are shown on web pages (served from the server to my own site, and also when hotlinked). Is there any way to accomplish this? I know PHP, so if there is a way to do it in PHP that would be really helpful.
Advice please.
Thank you.
Can't you look at your server logs for that?
If you want something beyond parsing server logs, you'd have to set up a database to track the list of images and the number of times each is accessed. Serve the images through a .php script that increments the database value with each request. You could use a flat-file system too, but I prefer the database solution.
You wouldn't need to worry about the source of your image if you use .htaccess and Apache's mod_rewrite. You could serve URLs like this:
http://mysite.com/images/001.jpg
Which would be understood on the server as:
http://mysite.com/images.php?id=001
Thus providing a basis for database actions and scripted logic.
You can use Microsoft's LogParser to query your server logs using a query something like this:
    c:\Program Files\Log Parser 2.2> logparser "select cs-uri-stem, count(*) as Hits from C:\Your\Log\File\Path\ex091002.log where cs-uri-stem like '%imagefilename.jpg' or cs-uri-stem like '%anotherimage.jpg' group by cs-uri-stem order by Hits DESC" -i:w3c
You can even have it output to a text file or a graph (requires Excel, I believe) if you need something to display on a page. You'll probably have to change the query if you're using Apache logs, not sure.
You should be able to gather this information using the log files and an analytics package. If you are running IIS, a really good one to look into (and free for one domain) is SmarterTools' SmarterStats: www.smartertools.com
The answers recommending looking at the logs are right. But if for some reason that's not acceptable, it's not hard to set up a PHP script to handle this.
1) Create a rewrite rule (using mod_rewrite) that transparently redirects requests for your images to a PHP file instead, with the image's name as a parameter.
2) Your PHP script can log the request, send the appropriate MIME type for the image, and dump the real file to the output buffer (this won't be affected by your rewrite rule as long as you read the file from the file system rather than through a URL stream).
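A rough sketch of both steps; the rewrite rule, storage directory, database credentials, and table are all hypothetical:

    <?php
    // images.php - hypothetical sketch. Assumes an .htaccess rule along the lines of:
    //   RewriteEngine On
    //   RewriteRule ^images/(.+\.(?:jpe?g|png|gif))$ images.php?file=$1 [L]
    $file = basename(isset($_GET['file']) ? $_GET['file'] : '');   // strip any path components
    $path = __DIR__ . '/real_images/' . $file;                     // hypothetical storage directory
    if ($file === '' || !is_file($path)) {
        header('HTTP/1.0 404 Not Found');
        exit;
    }
    // 1) Log the request (hypothetical table: image_hits(filename PRIMARY KEY, hits)).
    $db = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
    $stmt = $db->prepare('INSERT INTO image_hits (filename, hits) VALUES (?, 1)
                          ON DUPLICATE KEY UPDATE hits = hits + 1');
    $stmt->execute(array($file));
    // 2) Send the right MIME type and dump the real file from the file system.
    header('Content-Type: ' . mime_content_type($path));
    header('Content-Length: ' . filesize($path));
    readfile($path);

Hotlinked requests get counted too, since the rewrite applies regardless of which page embeds the image.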