I have a very large image generated on the fly with PHP and output to the browser (it's 5000px wide and 1000-2000px tall; it's a plot of the daily user activity on my site).
The problem is that the plot has become too big, so the PHP script dies with memory exhaustion errors (though the generated PNG itself is quite small) and I can't get the image at all.
Is there a way to output this large image in multiple parts somehow, using GD, in PNG format?
(PS: the host where I run the site uses safe mode, so I can't modify the configuration, and I think they're using the default PHP installation.)
EDIT1: It's an admin script. No users see it except me.
EDIT2: An example image can be seen here: http://users.atw.hu/calmarius/trash/wtfb2/x.png
(I also have the option to group the tracks by IP address.)
Every user+IP pair has its own 24-hour track on the plot, and every green mark denotes a user activity. As you can see, this image can be output track by track (see the sketch below); there is no need to generate and output the whole thing at once.
This website will be an online strategy game, and I want to use this graph in the future to make detecting multi-accounts easier (users who try to gain an advantage by registering multiple accounts over those who only have one). But that is a different problem.
I'm using a PHP script because I'm too lazy to export the request log from the database, download it, and feed the data to a program that would make the plot for me. ;)
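To make the track-by-track idea concrete, here is roughly what I have in mind; this is only a sketch, and get_track_activity() is a made-up helper standing in for a query against the request log:

<?php
// track.php?i=N - renders the Nth user+IP track as its own small PNG strip.
$i = (int)$_GET['i'];

$im = imagecreate(5000, 20);                 // one 5000x20 strip, not the full plot
$bg = imagecolorallocate($im, 0, 0, 0);      // first allocated color = background
$green = imagecolorallocate($im, 0, 255, 0);

// get_track_activity() is hypothetical: returns second-of-day offsets for track $i.
foreach (get_track_activity($i) as $secondOfDay) {
    $x = (int)($secondOfDay / 86400 * 5000); // map 24 hours onto 5000 px
    imageline($im, $x, 0, $x, 19, $green);
}

header('Content-Type: image/png');
imagepng($im);
imagedestroy($im);

The admin page would then emit one <img src="track.php?i=..."> per track, so no single request ever holds the full 5000x2000 canvas in memory.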
Set the memory limit to unlimited before processing the image.
ini_set('memory_limit', '-1');
It'd help to say how you're generating the image (GD library, ImageMagick) and how you're outputting it. Are you saving the file to a directory and then using readfile() to output it? If so, an fopen/fread/echo combination is about 50-60% faster than readfile() for sending files to the browser. Are you using gzip compression? What's the time limit on PHP execution? What's the exact error message you're getting?
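For reference, the fopen/fread/echo pattern mentioned above looks roughly like this (the path is a placeholder); unlike slurping the file into one string, it streams the PNG in small chunks:

$path = '/path/to/plot.png';                  // placeholder path
header('Content-Type: image/png');
header('Content-Length: ' . filesize($path));
$fp = fopen($path, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192);                    // 8 KB chunks keep memory use flat
    flush();
}
fclose($fp);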
Related
I have a report-generation PHP program which used to work fine. I use two third-party libraries in it: the Google image chart library (it returns an image if I supply values in the URL) and TCPDF (for PDF generation). I am using mysql, not mysqli, for queries, and there are lots of queries and loops in the page.
It used to take less than 3 minutes to generate the report. I use an AJAX call to generate it, which shows a completed message once the file generation is done. The program saves the PDF file in a folder, and I have a link with the same name to download the file.
Recently I noticed it's not generating properly.
The error was TCPDF being unable to get the image. This happened because the Google chart library stopped returning the image properly: when I access the chart URL in a browser it gives me the image without any issue, but if I put it in an image src inside a PHP file, it doesn't show. So I decided to save the chart to a folder using file_get_contents/file_put_contents and link to that in the image src. This part now works correctly; I can see the image.
But now the problem is that it takes a long time to generate the report, even in a local environment. I tried generating the report without the chart printing, but even then it's slow. At one point it was around 25 minutes, and now it's close to 10 minutes to generate a 40-page PDF file.
I really don't know why it's taking so much time. All of this worked fine before. The only thing that changed was the Google image chart part, but even with that commented out it still takes this long.
How do I speed this up? Is there any way to check which part of the program is slow?
I tried Xdebug, but its output file is more than 400 MB and webgrind is not able to process it.
Please help.
Your next step is to troubleshoot performance.
Is TCPDF doing a lot of work you don't need done? Presumably you've seen the tips from TCPDF's author on increasing performance and put them into practice: http://www.tcpdf.org/performances.php
Are some of your MySQL queries inefficient? Open an interactive connection to your MySQL server using phpMyAdmin or a command-line client. While your PDF-creation process is running, repeatedly issue this command:
SHOW FULL PROCESSLIST
It presents an INFO column showing the active MySQL query for each connection, along with each query's elapsed time. If you have queries that run for many hundreds of milliseconds, you might consider using MySQL's EXPLAIN command to analyze them. Often, adding an appropriate index to a MySQL table can dramatically speed things up.
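For example, if the process list keeps showing the same slow query, run it through EXPLAIN and index the filtered column; the table and column names below are made up for illustration:

EXPLAIN SELECT * FROM report_rows WHERE customer_id = 42;
-- if the "key" column in the EXPLAIN output is NULL, the query scans the whole table
ALTER TABLE report_rows ADD INDEX idx_customer (customer_id);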
Is the machine running your PDF program short on RAM? Use a performance monitor like *nix top or Windows perfmon to take a look.
Is your 40-page report, simply put, a huge job to create? If so, you might consider switching to a faster report-generation program than PHP + TCPDF.
Sorted out.
The issue was with the database: one of the tables had more than 120,000 records in it. I deleted the irrelevant records; this is not a permanent solution, but now it generates the same report in 2.1 minutes.
I can't do the same thing on my production server, though. I would love to get your input on how to optimize the database.
Thank You
Hello folks of SO!
We're trying to write some very small and simple PHP code to generate a variation of a video, always using the same source file.
The script would have to make a small pixel mark on a random or specific frame of the video file, and the result would have to be streamed in real time.
Here's some pseudo code to explain my idea:
$frame = $_GET['frame'];
$videofile = 'video.avi';
make_random_red_pixel_mark($videofile, $frame);
Does anyone know if this is possible using FFmpeg? Also, it is extremely important for us to execute this procedure as fast as possible.
A solution that would imply reprocessing the whole video, won't be useful for our purposes. It should be something like a closed caption, or a quick image / overlay filter that could be applied without an entire video reprocessing. As well, we can't put the overlay using Javascript nor any HTML approach, since the actual manipulation has to be on the video file itself.
The quality and framerate of the original video should be kept intact. Perhaps there is some other PHP module, or software that could be executed from PHP using exec()?
Any recommendation?
Thanks in advance!!
Chris C. Russo
More information:
1) It's possible for us to apply this procedure to any frame we want, so we could use a keyframe in order to avoid decoding and re-encoding an entire GOP.
2) As previously stated, the video stream would have to flow in real time.
This is a hard problem. The FFmpeg overlay video filter requires re-encoding.
When you change ALMOST anything in a video, you will be dealing with re-encoding of the video. This can be an expensive process, depending on the video and on how much of a hurry you are in (if you want real time, you are in a hurry).
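For comparison, the brute-force version is a one-liner from PHP using FFmpeg's drawbox filter with an enable expression (file names here are placeholders), but it re-encodes every frame of the video, which is exactly what you want to avoid:

$frame = (int)$_GET['frame'];                 // frame number to mark
$in  = escapeshellarg('video.avi');
$out = escapeshellarg('marked.avi');
// drawbox paints a 2x2 red square on frame $frame only; -c:a copy keeps the audio untouched.
$cmd = "ffmpeg -i $in -vf \"drawbox=x=10:y=10:w=2:h=2:color=red:enable='eq(n,$frame)'\" -c:a copy $out";
exec($cmd, $output, $status);                 // full video re-encode: slow, not real time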
A possible solution for this would be something like this:
Open the INPUT video.
Create the OUTPUT video.
Loop over the packets of the INPUT video until you find the frame you want.
Reading the flags of the video packets (the AVPacket structure), you can identify the Group of Pictures (GOP) this frame belongs to.
You will have to RE-ENCODE only the frames that belong to this group of pictures. Because a GOP always starts with a keyframe, you will be able to do that.
When done, go on reading the packets of the INPUT and writing them to the OUTPUT (transmux).
The process of reading a packet from the source and writing it to the destination is called transmuxing, and it is very cheap for live streaming; it's basically a plain copy of bytes. No big deal.
"The hard part here is that you will have to manage a POOL of packets until you identify the GOP where your frame is located. Why? Because you will read all packets AND STORE them in a pool (without decode the packets). When you identify it's a GOP, you will write these packets to your OUTPUT and go on to the next GOP. So you will always have the GOP in memory to be flushed (all packets together). When you identify the target frame you wanna modify. I will have to DECODE THE FRAMES from the beginning of the GOP to the end, modify the frame you want and then REENCODE this GOP! Well very hard!"
For arbitrary videos, this process above may result in a visible difference of quality of encoding in the GOP you reencoded. :-(
If you don't know how to open a video, read the packets, write the packets, and so on, you will have to learn the basics of FFmpeg.
In order to do that, I suggest you study these examples:
Demuxing: http://ffmpeg.org/doxygen/trunk/doc_2examples_2demuxing_8c-example.html
Muxing: http://ffmpeg.org/doxygen/trunk/doc_2examples_2muxing_8c-example.html
These examples will teach you how to open the video, identify the audio/video streams, and loop over the packets, as well as how to decode and re-encode.
It's a hard job. The examples are in C; you can decide to make a plugin for PHP or use a PHP wrapper for FFmpeg.
ANOTHER SOLUTION: if you have the flexibility to choose the frame, try to re-encode only keyframes. Because keyframes are complete "bitmaps", you don't need to deal with GOPs; you will decode and re-encode only one frame.
I am working with a large number of pages (letters) that are the same except for the address and a few other minor details. I believe what slows the PDF creation down the most is the logo image that I include on every page (even though it is fairly small).
I'm hoping to speed up the process some more by caching the logo, i.e. by loading the file once and storing it in a variable and have TCPDF use that instead of loading the image every time. TCPDF can load a "PHP image data stream", and the example given is this:
$imgdata = base64_decode('iVBORw0KGgoAAAANSUhEUgAAABwAAAASCAMAAAB/2U7WAAAABlBMVEUAAAD///+l2Z/dAAAASUlEQVR4XqWQUQoAIAxC2/0vXZDrEX4IJTRkb7lobNUStXsB0jIXIAMSsQnWlsV+wULF4Avk9fLq2r8a5HSE35Q3eO2XP1A1wQkZSgETvDtKdQAAAABJRU5ErkJggg==');
$pdf->Image('@'.$imgdata);
However, I have no idea how to create an image stream like this from a file.
My logo is a small (4 kB) PNG file. If I use readfile($file) and send that to $pdf->Image with the '@' in front, it errors out with something about the cache folder, which is already set to chmod 777 (it's a test server; I'll work out proper permissions on the live server). I believe I also tried base64_encode, which didn't work either.
Any thoughts on how to do this?
PS: I already noticed that the more pages I put into one PDF, the slower it gets, so I'll find a good middle ground (probably 200-250 pages per file instead of the current 500).
Thanks!
Posted the same question in the TCPDF forum on sourceforge (sourceforge forum post), and the author of TCPDF answered.
He said that images are cached internally, however if the images need processing, he suggests using the XObject() template system (see example 62 on TCPDF site).
It took me a while to get it working (still not sure why it didn't work for me at first), but once I had it looking exactly like my original version using Image(), I ran a few tests with about 3,000 entries divided into PDF files of 500 pages each.
There was no speed gain at all between XObject() and Image(), and XObject() actually appeared to make the resulting files just a tiny bit larger (2.5kB in a 1.2MB file).
While this doesn't directly answer my original question (how to create a PHP data stream that can be used directly in TCPDF via Image('@'.$image)), it tells me what I really needed to know: the image is already cached, and caching via XObject() does not provide any advantage in my situation.
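For anyone landing here later: the detail I was missing is that readfile() echoes the file to the output and returns a byte count, while file_get_contents() returns the raw bytes as a string, which is what the data-stream prefix expects. So the loading step reduces to this (the logo path is mine):

$imgdata = file_get_contents('logo.png');   // raw PNG bytes as a string
$pdf->Image('@' . $imgdata);                // readfile() would echo instead of returning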
Some background: I'm creating a gallery that dynamically shows thumbnails of all the pictures in a server directory (it caches the thumbnails, don't worry). When a user clicks a thumbnail, a loading GIF is displayed until the image is ready, and then the image is shown. The actual pictures are very large and might take a considerable amount of time to download to a user's computer.
What I would like to do is show the percentage of the picture that has been downloaded while the loading GIF is playing.
I realize there are other questions like this, and from what research I've done so far, I also realize this might not be able to be accomplished without some server-side tricks.
From what I have come across so far, I've gathered (and I could be wrong, so please correct me if I am) that the client-side code knows how many bytes have been received, but not how large the file is.
So is there a possible configuration using some php/javascript tricks, so that the client side javascript can load an image from a web-server directory and be able to calculate downloaded percentage?
Possibly the PHP code sending an extra header to the client with the file size or something? Or even opening a second request to the web server just for the file size? And how could you get the number of bytes downloaded so far?
You can use XMLHttpRequest2 to load the data and hook onto its progress events. The loaded data is turned into base64 and put into a data URI, and once loading has finished you can assign the constructed URI as the image's new source.
More info can be found here: http://blogs.adobe.com/webplatform/2012/01/13/html5-image-progress-events/
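On the PHP side, the main thing is to send a Content-Length header; without it the progress events fire with lengthComputable set to false and no total. A minimal sketch of the serving script (the parameter handling is simplified):

$path = 'images/' . basename($_GET['img']);   // basename() blocks directory traversal
header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path)); // lets the client compute a percentage
readfile($path);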
I'm sure this has been asked before, but as I can't seem to find a good answer, here I am, asking... again. :)
Is there any way, using only a mixture of HTML, JavaScript/AJAX, and PHP, to report the actual progress of a file upload?
In reply to anyone suggesting SWFUpload or similar:
I know all about it. Been down that road. I'm looking for a 100% pure solution (and yes, I know I probably won't get it).
Monitoring your file uploads with PHP/JavaScript requires the PECL extension:
uploadprogress
A good example of the code needed to display the progress to your users is:
Uber Uploader
If I'm not mistaken it uses jQuery to communicate with PHP.
You could also write it yourself; it's not that complex:
Add a hidden element as the first element of the upload form, named UPLOAD_IDENTIFIER.
Poll a PHP script that calls uploadprogress_get_info(UPLOAD_IDENTIFIER).
It returns an array containing the following:
time_start - The time that the upload began (unix timestamp),
time_last - The time that the progress info was last updated,
speed_average - Average speed in bytes per second,
speed_last - Last measured speed in bytes per second,
bytes_uploaded - Number of bytes uploaded so far,
bytes_total - The value of the Content-Length header sent by the browser,
files_uploaded - Number of files uploaded so far,
est_sec - Estimated number of seconds remaining.
Let PHP return the info to Javascript and you should have plenty of information.
Depending on the audience, you will likely not use all the info available.
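A minimal polling endpoint could look like the sketch below, assuming the extension is loaded and the client passes the same value it put into the hidden UPLOAD_IDENTIFIER field:

$id = $_GET['id'];                           // value of the UPLOAD_IDENTIFIER field
$info = uploadprogress_get_info($id);
if ($info && $info['bytes_total'] > 0) {
    $pct = round($info['bytes_uploaded'] / $info['bytes_total'] * 100);
} else {
    $pct = 0;                                // not started yet, or already finished
}
header('Content-Type: application/json');
echo json_encode(array('percent' => $pct, 'eta' => $info ? $info['est_sec'] : null));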
If you have APC installed (and by this point, you really should; it'll be standard in PHP6), it has an option to enable upload tracking.
There's some documentation, and Rasmus has written a code sample that uses YUI.
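The shape of that approach, roughly (this assumes apc.rfc1867 = 1 in php.ini and the default 'upload_' key prefix): the form gets a hidden APC_UPLOAD_PROGRESS field before the file field, and a second script polls APC:

// poll.php - returns the upload status stored by APC's rfc1867 hook
$key = 'upload_' . $_GET['id'];          // 'upload_' is the default apc.rfc1867_prefix
$status = apc_fetch($key);               // array with 'current' and 'total' byte counts
header('Content-Type: application/json');
echo json_encode($status ? $status : array());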
If you're able to add PECL packages into your PHP, there is the uploadprogress package.
The simplest way would be to just use SWFUpload, though.
Is there any way, using only a mixture of HTML, JavaScript/AJAX, and PHP, to report the actual progress of a file upload?
I don't know of any way to monitor plain HTML (multipart/form-data) file uploads in webserver-loaded PHP.
You need to have access to the progress of the multipart/form-data parser as the data comes in, but this looks impossible because the ways of accessing the HTTP request body from PHP ($HTTP_RAW_POST_DATA and php://input) are documented as being “not available with enctype="multipart/form-data"”.
You could do a script-assisted file upload in Firefox using an upload field's FileList to grab the contents of a file to submit in a segmented or non-multipart way. Still a bunch of work to parse though.
(You could even run a PHP script as a standalone server on another port just for receiving file uploads, using your own HTTP-handling code. But that's a huge amount of work for relatively little gain.)
I'd recommend you give FancyUpload a try; it's a really cool solution for a progress bar, and it's not necessarily tied to PHP. Also check out the other tools at digitarald.de.
cheers
IMHO, this is a problem that web browsers should solve. We have progress meters for downloads, so why not for uploads as well?
Take a look at this for example:
http://www.fireuploader.com/