how to create an image of dynamic data in php

I have to make an image of a dynamic page, i.e. a page that changes every 5 minutes.
I want to capture an image of that page each time it changes so that I can keep its records saved as images.
How can I do that using PHP?
I have no idea where to start, so a little elaboration in the answers would be highly appreciated!

Two steps:
1: Create a script that captures the current data in image form.
If you provide more information about what you mean when you say "create an image of dynamic data", I can probably point you to some resources you can use. For now, just have a look at the GD library (a minimal sketch follows below).
2: Set up a job that runs the script every 5 minutes
This can be done via Cron. I would suggest investigating if you can run the script when the data changes, instead of at specific intervals.
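As a starting point for step 1, here is a minimal GD sketch; the data source and the output path are placeholders you would replace with your own.

<?php
// Minimal GD sketch: render a line of dynamic data onto a PNG.
// $data is a placeholder; it would come from your own source (database, API, ...).
$data = 'Value at ' . date('Y-m-d H:i') . ': 42';

$img = imagecreatetruecolor(400, 50);            // blank 400x50 canvas
$bg  = imagecolorallocate($img, 255, 255, 255);  // white background
$fg  = imagecolorallocate($img, 0, 0, 0);        // black text
imagefilledrectangle($img, 0, 0, 399, 49, $bg);
imagestring($img, 5, 10, 15, $data, $fg);        // built-in GD font #5

// Save a timestamped snapshot so every run is kept as a record.
imagepng($img, '/path/to/snapshots/data_' . date('Ymd_His') . '.png');
imagedestroy($img);

For step 2, a crontab entry along the lines of */5 * * * * php /path/to/capture.php would run the script every 5 minutes; adjust the paths to match your setup.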

http://www.devarticles.com/c/a/PHP/Generating-Images-on-the-Fly-With-PHP/
http://www.thesitewizard.com/php/create-image.shtml

Getting a screenshot of a web page isn't an easy task.
You can use one of the online services that do this for you and download the images from there.
Otherwise, I have found a solution using WebKit and Python, but you will need full access to your Linux server in order to install the necessary packages; then you will be able to call that script from PHP and get your screenshots.
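A minimal sketch of the PHP side of that approach, assuming a command-line capture tool is installed on the server (wkhtmltoimage is used here purely as an example; the WebKit/Python script mentioned above would be invoked the same way):

<?php
// Sketch: call an external screenshot tool from PHP.
// Assumes wkhtmltoimage (or a similar CLI capture utility) is installed.
$url  = 'http://example.com/dynamic-page.php';
$file = '/path/to/snapshots/page_' . date('Ymd_His') . '.png';

shell_exec('wkhtmltoimage ' . escapeshellarg($url) . ' ' . escapeshellarg($file));

if (file_exists($file)) {
    echo "Snapshot saved to $file";
} else {
    echo "Capture failed; check that the tool is installed and on the PATH";
}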

Related

How to call tabula-java from another program written in php?

Situation
I have a website written in PHP.
In PHP, I can extract the text inside a PDF file uploaded to the same website, and so on.
I found the tabula-java github repo.
So what's the issue?
I have tried the Mac app for Tabula. I noticed that I needed to highlight a certain section of the PDF before the table data could be converted.
However, that's not what I want to accomplish. I want to run Tabula in the background and on demand: when my website receives a file upload and certain conditions are satisfied, I want to call Tabula as a service somehow, feed it the unstructured data, and get back the tabulated data.
How do I go about doing this?
One way is to wrap the tabula-extractor command-line tool and return the results to your application.
For example, in R, the tabulizer package works this way.
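A sketch of that wrapping idea in PHP, assuming the tabula-java jar has been downloaded to the server (the jar path below is illustrative, and you should check the tabula-java README for the exact flags your version supports):

<?php
// Sketch: shell out to tabula-java and capture the CSV it prints to stdout.
// The jar and PDF paths are assumptions; adjust them to your setup.
$jar = '/opt/tabula/tabula.jar';
$pdf = '/uploads/report.pdf';

// --pages all / --format CSV / --guess let tabula pick the table areas itself,
// so no manual highlighting is needed.
$cmd = 'java -jar ' . escapeshellarg($jar)
     . ' --pages all --format CSV --guess ' . escapeshellarg($pdf);

$csv  = shell_exec($cmd);
$rows = array_map('str_getcsv', explode("\n", trim($csv)));
print_r($rows);   // tabulated data, ready for the rest of the application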

TCPDF takes 10 minutes to generate a 40 page pdf file

I have a report-generation PHP program which used to work fine. I use two third-party libraries in it: the Google image chart library (which returns an image when I supply values in the URL) and TCPDF (for PDF generation). I am using mysql, not mysqli, for the queries. There are lots of queries and loops in the page.
It used to take less than 3 minutes to generate the report. I use an AJAX call to generate the report, which returns a completed message once the file generation is done. The program saves the PDF file in a folder, and I have a link with the same name to download the file.
Recently, when I checked, it was not generating properly.
The error was that TCPDF was unable to get the image. This was because the Google chart library was not returning the image properly. When I access the chart URL in a browser, it gives me the image without any issue, but if I put it in an image src inside a PHP file, it does not show. So I decided to save the file to a folder using file_get_contents()/file_put_contents() and link it in the image src. That part now works correctly and I can see the image.
But now the problem is that it takes a long time to generate the report, even in the local environment. I tried generating the report without printing the charts, but even then it is slow. At one point it was around 25 minutes, and now it is close to 10 minutes for a 40-page PDF file.
I really don't know why it is taking so much time. All of this worked fine before. The only thing that changed was the Google image chart library, but even without it (I commented that part out and checked) it is still slow.
How do I speed this up? Is there any way to check which part of the program is slow?
I tried Xdebug, but its output file is more than 400 MB and Webgrind is not able to process it.
Please help.
Your next step is to troubleshoot performance.
Is TCPDF doing a lot of work you don't need done? Presumably you've seen the tips from TCPDF's author on increasing performance, and put them into practice. http://www.tcpdf.org/performances.php
Are some of your MySQL queries inefficient? Obtain an interactive connection to your MySQL server, using phpMyAdmin or the mysql command-line client. While your PDF-creation process is running, repeatedly issue this command:
SHOW FULL PROCESSLIST
It presents an INFO column showing the active MySQL query for each connection, along with the time each query has been running (in seconds). If you have queries that run for a long time, you might consider using MySQL's EXPLAIN command to analyze them. Often adding an appropriate index to a MySQL table can dramatically speed things up.
Is the machine running your PDF program short on RAM? Use a performance monitor like *nix top or Windows perfmon to take a look.
Is your 40-page report, simply put, a huge job to create? If so, you might consider switching to a faster report-generation program than PHP + TCPDF.
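To see from inside the PHP program which part is slow, a crude but effective sketch is to wrap each query in a timer; this assumes the old mysql_* API, since that is what the report uses.

<?php
// Crude profiling sketch: time every query and log the slow ones.
function timed_query($sql) {
    $start  = microtime(true);
    $result = mysql_query($sql);
    $ms     = (microtime(true) - $start) * 1000;
    if ($ms > 200) {                  // log anything slower than 200 ms
        error_log(sprintf('%0.1f ms: %s', $ms, $sql));
    }
    return $result;
}

Run the report once with this in place, then feed the slowest statements from the log to EXPLAIN; that usually points straight at a missing index.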
Sorted out.
The issue was with the database: one of the tables has more than 120,000 records in it. I deleted the irrelevant records. It is not a permanent solution, but now the same report generates in 2.1 minutes.
I can't do the same thing on my production server, so I would love your input on how to optimize the database.
Thank you.

Server-side dynamic Flash export to video

I have spent some time hunting around for a solution without success, so I am hoping someone here can at least point me in the right direction.
The specific project flow is this:
user visits a Facebook app
user uploads a number of photos and chooses optional filters
user can preview a video which showcases their photos (the video has animation and audio)
user can then choose to download this video for their device/PC
Some givens:
server side is PHP on Linux/Apache
preview video is Flash
output format is variable (WMV/AVI/MP4)
I have found a couple of solutions, but none seem to match this exact flow. I want the whole process to be automated/scripted, so I need a component that can sit on the server, accept commands from PHP, and be able to handle dynamic Flash input and export to the chosen format. My client has a generous budget to buy software to do this.
If anyone can suggest a good software solution or indeed another method to achieve the same goal I would be eternally grateful...
Thank you!
I won't give the exact details of how to do this yet, but a brief outline of what you could do:
User uploads video (or it's already on the server? I'm not sure)
You use exec() or shell_exec() to run ffmpeg, which will convert the Flash file to another output format, depending on the user's choice.
What I would recommend is writing a bash script or something that takes a command line parameter of the output choice (and input file of course), converts the file, and returns the location of the new file. You can use the output of exec() or shell_exec() in PHP to return the location to the user and allow them to download the file.
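A minimal sketch of the PHP side, assuming ffmpeg is installed and the preview exists as a video file (e.g. FLV) rather than an interactive SWF; the file names and the format choice are placeholders:

<?php
// Sketch: convert the generated preview to the format the user chose.
// Assumes ffmpeg is installed; paths and the format are placeholders.
$input  = '/videos/preview_12345.flv';
$choice = 'mp4';                                   // from the user's selection
$output = '/videos/export_12345.' . $choice;

exec('ffmpeg -y -i ' . escapeshellarg($input) . ' ' . escapeshellarg($output) . ' 2>&1',
     $log, $status);

if ($status === 0 && file_exists($output)) {
    echo $output;                                  // hand this path back for download
} else {
    error_log(implode("\n", $log));                // ffmpeg's output for debugging
}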
If you would like more details on a certain aspect, please comment, but I'm not sure if you've already looked into this method, so for now it's just a suggestion.

dynamic web page automatic snapshot

I have a dynamic web site (PHP/MySQL/Ajax on a Linux server). I need to automatically take a snapshot of each web page periodically (if I can find a way to take the snapshot, I can use cron) and save the image to the database (I also know how to do that; my only problem is the snapshot itself!).
I can't do it manually, so I need a script that takes the snapshot for me without displaying the web page, i.e. directly from the .php files.
How is it possible?
Thanks!
http://browsershots.org/ may work for you; they have an API.
You can use the GD functions imagegrabscreen() or imagegrabwindow() to take a screenshot.
Note that they're only available on Windows at the moment.
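If you do happen to be running PHP on Windows with GD enabled, the usage is as simple as this sketch:

<?php
// Windows-only sketch: grab the whole screen with GD and save it as a PNG.
$img = imagegrabscreen();
if ($img !== false) {
    imagepng($img, 'C:\\snapshots\\screen_' . date('Ymd_His') . '.png');
    imagedestroy($img);
}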
Looks like this might answer your question. I have seen it done with PHP and Flash but wasn't privy to the inner workings; if the link doesn't help, you could research that route.

PHP: I want to create a page that extracts images from a forum thread, doable? codeigniter?

You have a forum (vBulletin) that has a bunch of images. How easy would it be to have a page that visits a thread, steps through each page, and forwards the images to the user (via AJAX or whatever)? I'm not asking about filtering (that's easy, of course).
Doable in a day? :)
I have a site that uses CodeIgniter as well; would it be even simpler using that?
Assuming this is to be carried out on the server, cURL + regexp are your friends. And yes, it's doable in a day.
There are also some open-source HTML parsers that might make this cleaner.
It depends on where your scraping script runs.
If it runs on the same server as the forum software, you might want to access the database directly and check for image links there. I'm not familiar with vbulletin, but probably it offers a plugin api that allows for high level database access. That would simplify querying all posts in a thread.
If, however, your script runs on a different machine (or, in other words, is unrelated to the forum software), it would have to act as a http client. It could fetch all pages of a thread (either automatically by searching for a NEXT link in a page or manually by having all pages specified as parameters) and search the html source code for image tags (<img .../>).
Then a regular expression could be used to extract the image urls. Finally, the script could use these image urls to construct another page displaying all these images, or it could download them and create a package.
In the second case the script actually acts as a "spider", so it should respect things like robots.txt or meta tags.
When doing this, make sure to rate-limit your fetching. You don't want to overload the forum server by requesting many pages per second. Simplest way to do this is probably just to sleep for X seconds between each fetch.
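A minimal sketch of that HTTP-client approach, fetching one thread page with cURL and pulling the image URLs out with a regular expression (the thread URL is a placeholder; loop it over each page of the thread):

<?php
// Sketch: fetch one page of a thread and extract the <img> sources.
$url = 'http://forum.example.com/showthread.php?t=12345&page=1';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'image-collector/0.1');
$html = curl_exec($ch);
curl_close($ch);

preg_match_all('/<img[^>]+src=["\']([^"\']+)["\']/i', $html, $matches);
$imageUrls = $matches[1];
print_r($imageUrls);

sleep(2);   // be polite: pause between page fetches so you do not hammer the forum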
Yes, doable in a day.
Since you already have a working CI setup, I would use it.
I would use the following approach:
1) Make a model in CI capable of:
logging in to vBulletin (images are often added as attachments and you need to be logged in before you can download them). Use something like Snoopy.
collecting the URL of the "last page" button using preg_match(), parsing that URL with parse_url() and parse_str(), and generating links from page 1 to the last page (a sketch of this step follows at the end of this answer)
collecting the HTML from all generated links, still using Snoopy
finding all images in the HTML using preg_match_all()
downloading all images, still using Snoopy
moving each downloaded image from a tmp directory into another directory, renaming it imagename_01, imagename_02, etc. if the same image name already exists
saving the image name and exact byte size in a DB table, so you can avoid downloading the same image more than once
2) Make a method in a controller that collects all images
3) Set up a cron job that collects images at regular intervals. wget -O /tmp/useless.html http://localhost/imageminer/collect should do nicely
4) Write the code that outputs pretty HTML for the end user, using the DB table to get the images.
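For the link-generation part of step 1, the sketch below shows the parse_url()/parse_str() idea; the URL shape (showthread.php?t=...&page=...) is an assumption about how the vBulletin links look, so adjust the parameter names to whatever preg_match() actually finds.

<?php
// Sketch: turn the "last page" link into the full list of page URLs.
// Assumes vBulletin-style links like showthread.php?t=12345&page=17.
$lastPageUrl = 'http://forum.example.com/showthread.php?t=12345&page=17';

$parts = parse_url($lastPageUrl);          // scheme, host, path, query, ...
parse_str($parts['query'], $query);        // ['t' => '12345', 'page' => '17']
$lastPage = (int) $query['page'];

$pageUrls = array();
for ($p = 1; $p <= $lastPage; $p++) {
    $query['page'] = $p;
    $pageUrls[] = $parts['scheme'] . '://' . $parts['host'] . $parts['path']
                . '?' . http_build_query($query);
}
print_r($pageUrls);                        // feed these to Snoopy one page at a time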
