So I just created a script to resize a whole batch of images. Is there any way to have output appear while it's running through the loop?
Basically I have around 400 photos in a photo db table. The script gathers a list of all these photos, then loops through each one and resizes it 3 times (large, medium, and small versions).
Right now on each loop I echo that image's results, but I don't see any of it until EVERYTHING is done. So 10 minutes later, I get all the output at once. I added set_time_limit(0); to make sure it doesn't time out.
**EDIT** It looks like every so often the script actually flushes output to the browser, maybe every 30 seconds?
You can use flush() or ob_flush() to tell PHP to send what it has so far back to the client after you do an echo().
BUT - you never really have complete control over it; the web server does, so the web server may not cooperate depending on how it's configured. For example, if you have the server doing gzip rather than using PHP's gzip features, the web server may still buffer the output.
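A minimal sketch of how that could look in the resize loop, assuming the photo rows are already in a $photos array and resize_photo() stands in for your actual resize code:

<?php
set_time_limit(0);
// Drop any PHP-level output buffers so each echo can actually leave the script.
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true);

foreach ($photos as $photo) {            // $photos assumed to come from your db query
    resize_photo($photo, 'large');       // hypothetical helper wrapping your resize code
    resize_photo($photo, 'medium');
    resize_photo($photo, 'small');
    echo 'Resized ' . $photo['filename'] . "<br>\n";
    flush();                             // push the output towards the client now
}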
Much like a log file is written by a PHP script via fwrite($fp, ---HTML---),
I need to save an HTML DIV as a PNG file on the server.
The client browser only starts the PHP script,
but without any further client interaction the PNG file should be saved on the server.
Is there a way to do this?
All the posts (thousands of them) I have been reading are about html2canvas,
which (as I understand it) operates client-side.
I know that rendering the HTML (the HTML div) is normally done by the browser, i.e. client-side.
But is there a way to do it in PHP on the server side?
Reason:
Until now the procedure has been:
print the div via the browser on paper twice,
one copy for the customer,
one copy to scan in again, save it on the server as a picture, and throw it in the wastepaper basket.
More than 500 times a day ...
For security reasons it needs to be a saved picture on the server.
I'm working on an app that creates QR codes and renders them onto multiple graphics for a user.
The Problem:
I wrote a script to import users from a CSV. I need to create over 100 users (each going through the process above). Right now each new user takes roughly 1 minute to finish processing, and then the script spits out all my error/success messages at once.
My Question:
Rather than the browser slowly loading the result view (it currently stays on a white page until everything is complete) while my script is processing, is there a reasonably easy way to display live progress and errors as they happen? Something like a progress bar updated as each user is created or fails. I'm guessing it will require AJAX?
When dealing with websites, remember the golden rule.
PHP MUST DIE.
Noobs assume this is people rubbishing PHP. It isn't. It's the HTTP request cycle.
Request In > PHP > Response Out > PHP process dies.
This is only the case when dealing with web servers and browsers, not CLI PHP. But the point is that you may end up getting Apache timeouts if your script takes as long as you say.
One solution could be to set up a cron job that checks for a file and, if it finds one, processes it, dumping the current line number into a text file that your browser could check, which means you could fetch the progress:
<?php
if (file_exists('/some/csv/to/process.csv')) {
    // figure out which row to work on from the progress file
    $line = (int) @file_get_contents('/path/to/progress.txt');
    $rows = array_map('str_getcsv', file('/some/csv/to/process.csv'));
    if (isset($rows[$line])) {
        // process the row; create_user() stands in for your existing import logic
        create_user($rows[$line]);
        // update the progress file with the next line number
        file_put_contents('/path/to/progress.txt', $line + 1);
    }
}
Meanwhile, you could set up a script that does this:
<?php
$progress = file_get_contents('/path/to/progress.txt');
header('Content-Type: application/json');
echo json_encode(['progress' => $progress]);
And then get the progress using AJAX inside a setInterval function:
setInterval(function () {
    $.get('/path/to/progress/json/page', function (data) {
        console.log(data); // e.g. update a progress bar element here
    });
}, 2000); // poll every couple of seconds
Just an idea, may or may not suit you but give it a try!
I have a report-generation PHP program which used to work fine before. I use two third-party libraries in the program: the Google image chart library (it returns an image if I supply values in the URL) and TCPDF (for PDF generation). I am using mysql, not mysqli, for the queries. There are lots of queries and loops in the page.
It used to take less than 3 minutes to generate the report. I use an AJAX call to generate the report, which shows a completed message once the file generation is done. The program saves the PDF file in a folder, and I have a link with the same name to download the file.
Recently when I checked, it was no longer generating properly.
The error was TCPDF being unable to get the image. This was because the Google chart library was not returning the image properly. When I access the chart URL in a browser it gives me the image without any issue, but if I put it in an image src inside a PHP file, it does not show. So I decided to save the file to a folder using functions like file_get_contents/file_put_contents and link to it in the image src. This part now works correctly and I can see the image.
But now the problem is that it takes a long time to generate the report, even in a local environment. I tried generating the report without printing the charts, but even then it takes a long time. At one point it was around 25 minutes, and now it is close to 10 minutes to generate a 40-page PDF file.
I really don't know why it is taking so much time. All of this was working fine before, and now it isn't. The only thing that changed was the Google image chart library, but even without it (I commented that part out and checked) it still takes this long.
How do I speed this up? Is there any way to check which part of the program is slow?
I tried Xdebug, but its output file is more than 400 MB and Webgrind is not able to process it.
Please help.
Your next step is to troubleshoot performance.
Is TCPDF doing a lot of work you don't need done? Presumably you've seen the tips from TCPDF's author on increasing performance, and put them into practice. http://www.tcpdf.org/performances.php
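As a rough sketch of the kind of settings those tips cover (the include path and the concrete settings are assumptions; which calls actually help depends on your document, so treat these as examples rather than a fix):

<?php
require_once 'tcpdf/tcpdf.php';

$pdf = new TCPDF('P', 'mm', 'A4', true, 'UTF-8');
$pdf->setPrintHeader(false);        // skip header/footer rendering if the report doesn't need them
$pdf->setPrintFooter(false);
$pdf->SetFont('helvetica', '', 10); // core fonts avoid font embedding/subsetting work
$pdf->setFontSubsetting(false);
$pdf->SetCompression(false);        // trades a bigger file for less CPU time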
Are some of your MySQL queries inefficient? Open an interactive connection to your MySQL server using phpMyAdmin or the mysql command-line client. While your PDF-creation process is running, repeatedly issue this command:
SHOW FULL PROCESSLIST
It presents an INFO column showing the active MySQL query for each connection, along with each query's elapsed time. If you have queries that run for hundreds of milliseconds or longer, you might consider using MySQL's EXPLAIN command to analyze them. Often adding an appropriate index to a MySQL table can dramatically speed things up.
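If it is hard to catch the slow queries interactively, a timing wrapper around mysql_query() (matching the old mysql extension you are using) can log them from inside the report script; timed_query(), the 0.1-second threshold, and the error_log destination are just illustrative choices:

<?php
// Wrap mysql_query() so any query slower than ~100 ms gets logged with its SQL text.
function timed_query($sql)
{
    $start = microtime(true);
    $result = mysql_query($sql);
    $elapsed = microtime(true) - $start;
    if ($elapsed > 0.1) {
        error_log(sprintf('Slow query (%.3f s): %s', $elapsed, $sql));
    }
    return $result;
}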
Is the machine running your PDF program short on RAM? Use a performance monitor like *nix top or Windows perfmon to take a look.
Is your 40-page report, simply put, a huge job to create? If so, you might consider switching to a faster report-generation program than PHP + TCPDF.
Sorted out.
The issue is with the database; one of the tables has more than 120,000 records in it. I deleted the irrelevant records. It's not a permanent solution, but now it generates the same report in 2.1 minutes.
I can't do the same thing on my production server, though. I would love to get your input on how to optimize the database.
Thank You
I have written code to receive images from an iPhone on a PHP server, and I need to resize these images and move them into 4 folders.
Only then is the JSON response returned to the iPhone, and that takes too much time.
Requirement:
I want to move a file to the folder "folder1" and then return the JSON response.
The resizing should be done from "folder1" after the JSON response has been sent.
How do I run this resizing process in the background?
Here is my code:
http://pastebin.com/qAcT1yi9
You could always send your php script to run in the background with a Linux command.
Example:
// Using backticks to execute the Linux command, but there are
// other alternatives (exec(), shell_exec(), proc_open(), ...).
// Redirecting output is what lets PHP return without waiting for the script.
`php runScriptInBackground.php > /dev/null 2>&1 &`;
echo 'resize job started';
First send/upload the images and send a response back, without doing the resize operation.
Then, if the upload was successful, let the browser issue another request and do the resizing. When this succeeds, send the message ‘resizing successful’ back.
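A minimal sketch of that two-request flow, assuming two hypothetical endpoints (upload.php and resize.php) and a resize_into_folders() helper standing in for your existing resize code:

<?php
// upload.php - first request: just store the original and answer immediately.
$name = basename($_FILES['image']['name']);
move_uploaded_file($_FILES['image']['tmp_name'], '/path/to/folder1/' . $name);
header('Content-Type: application/json');
echo json_encode(array('status' => 'uploaded', 'file' => $name));

<?php
// resize.php - second request, issued by the client once the JSON above arrives.
$name = basename($_GET['file']);
resize_into_folders('/path/to/folder1/' . $name);   // hypothetical wrapper around your resize code
header('Content-Type: application/json');
echo json_encode(array('status' => 'resizing successful'));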
A common solution to this problem is to show a loading/processing message when a specific event fires. The action then continues to run in the background while the message is displayed, and the result page is finally shown when it is done.
Although the user has to wait, I prefer this over displaying a result message when the actual result is not yet known. Unfortunately I'm not sure how this is done in iPhone development.
If you're building in Objective-C, you could just make a copy, resize it there on the device, and send the resized image to your PHP. You could then display a spinner and return the JSON result to the user, and if there is an error the user will still have the resized image to try again with... Another thought I had was to use push notifications. I don't know what that code would look like, but it's something to consider.
You need some async JavaScript or an iframe in your page posting the image to your server and providing feedback to the user.
This means that the 'main' page would not change, but some visual information can be provided to the user.
You can display an animated GIF loader or use JS setInterval to give the user the feeling that things are moving forward while waiting for the server to respond.
If the processing is split into more than one part, after each step the server could respond with an HTML page and a redirect: this would even work in an IFRAME without JS.
Each 'page' would perform one more step, but if the user closes the browser before everything is done you would end up with an unfinished task.
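A rough sketch of that redirect-per-step idea; step.php, the ?n= parameter, and the per-step helper functions are all invented for illustration:

<?php
// step.php?n=1 - perform one step, then let the page refresh itself to the next step.
$step = isset($_GET['n']) ? (int) $_GET['n'] : 1;

switch ($step) {
    case 1: save_original_upload(); break;   // hypothetical helpers standing in
    case 2: resize_to_large();      break;   // for your existing code
    case 3: resize_to_medium();     break;
    case 4: resize_to_small();      break;
    default:
        echo 'All done.';
        exit;
}

$next = $step + 1;
echo "<html><head><meta http-equiv='refresh' content='0;url=step.php?n={$next}'></head>";
echo "<body>Step {$step} finished...</body></html>";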
A DB, real background processing, and client side JS polling are a more robust alternative.
A full answer would be quite long and require way more detail about your setup (Apache with CGI PHP or mod_php? are you using an MVC framework, or are you writing a page-oriented website?).
If I had to write a full answer I would forget PHP and use Python and Celery http://celeryproject.org/ ;-)
PS.
I just found out that a few related questions already existed:
PHP Background Processes
Asynchronous shell exec in PHP
You can really do it in two stages: first send the files and save them on the server; then, when the user requests one, generate the necessary resized versions.
That shifts the cost from the uploader to the first request for each resized image.
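A hedged sketch of that on-first-request resizing with GD; thumb.php, the folder paths, and the 200px target width are assumptions:

<?php
// thumb.php?file=photo.jpg - create the resized copy on first request, then serve it.
$name   = basename($_GET['file']);            // basename() guards against path traversal
$source = '/path/to/folder1/' . $name;
$cached = '/path/to/small/' . $name;

if (!file_exists($cached) && file_exists($source)) {
    list($w, $h) = getimagesize($source);
    $newW = 200;                              // assumed target width
    $newH = (int) round($h * $newW / $w);
    $src  = imagecreatefromjpeg($source);
    $dst  = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
    imagejpeg($dst, $cached, 85);
}

if (!file_exists($cached)) {
    http_response_code(404);
    exit;
}
header('Content-Type: image/jpeg');
readfile($cached);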
I am fixing up a PHP script to receive a base64-encoded image and store it in a database.
Here is my problem: the script takes a very long time, sometimes up to 5 minutes, to respond when Flash calls it (via a POST request).
(I am testing it with a very small image.)
If I remove the base64-encoded data from the request it loads fast, and if I call the script from the browser with no data it is also fast.
I tried removing all PHP code from the script file so that no PHP is run, and with the data it is still slow.
My guess is that this is somehow server-related, but I have no clue what it could be, other than that it must happen before PHP is run.
And I have no .htaccess file on the site.
"if i remove the byte64 encoded data from the request it loads fast"
So you know exactly where you issue is at. Test the queries, if they're not optimizable, store your images the normal way.
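For reference, a minimal sketch of "the normal way", storing the decoded image as a file and keeping only its path in the database; the credentials, table, and POST field name are invented for the example:

<?php
// Decode the base64 payload once, write it to disk, and store only the path in MySQL.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // assumed credentials
$data = base64_decode($_POST['image']);                             // assumed POST field name
$path = '/path/to/uploads/' . uniqid('img_') . '.jpg';
file_put_contents($path, $data);

// Hypothetical table: images(id, path)
$stmt = $pdo->prepare('INSERT INTO images (path) VALUES (?)');
$stmt->execute(array($path));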
I ended up rewriting the script as a normal upload and storing the images as files. Storing images in the DB is bad mojo, but I was trying to avoid having to rewrite the entire script at the time :)
But thanks for your time!