Many images on one page, some not loading - PHP

I am making a Warehouse management system.
The orders come in as a CSV in the morning, which my script then processes.
It places a PHP-generated barcode at the top of each order. The sample CSV I am using has around 100 unique orders on it, so when I load the page that will then print the orders, the server gets 100+ requests at once and (I'm guessing) some of the images time out.
When I view source and open the link to one of the images that didn't load, the image appears fine, which leads me to think I need to somehow disable the timeout behaviour in the browser.
My only other idea is to load the barcodes through javascript.
Any suggestions?

I think what enygma may be getting at is the limited processing time PHP scripts have; by default they are cut off after 30 seconds. Generating all of those images at one time might run over, causing your script to be killed on the server and stop sending data. Your idea of loading them in JavaScript is probably your best bet, as long as you only do a few at a time or do them serially.
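If the 30-second default is what's killing the script, you can check and raise the limit per request. A minimal sketch (note that raising the limit treats the symptom rather than the underlying cause):
echo ini_get('max_execution_time'); // usually 30 on a default install
set_time_limit(120); // allow this particular script up to 120 seconds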

If you start a session in PHP, the session is locked and cannot be accessed by another PHP script until it is released.
Since you are generating the images with PHP, that's quite likely the cause of what you're seeing.
There are other questions that go into more detail about how PHP sessions work, but most likely this is the direct cause of some of your images not being received: the requests sit in a single, serial queue and are processed in turn, because each script acquires the session lock and doesn't release it until it has finished. The requests at the end of the queue hit a time limit one way or another and return nothing.
Therefore, ensure that you call:
session_write_close();
as soon as you can in every script that needs access to the session, so that it doesn't block all other PHP requests. Better still, don't use the session at all for the image requests (e.g. if you're using the session for authorization, include a hash in the URL and compare against that instead).
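A minimal sketch of an image script that releases the lock early (barcode.php and render_barcode() are hypothetical names):
// barcode.php - hypothetical image endpoint
session_start(); // read whatever auth data is needed
$authorized = isset($_SESSION['user_id']); // session key is illustrative
session_write_close(); // release the lock before the slow image work
if (!$authorized) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
header('Content-Type: image/png');
render_barcode($_GET['code']); // placeholder for your barcode generator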

Related

Pages hang when another page from same site is loading

Seeing some strange behavior in my application. Any time I have a long-running script waiting for a response in the browser, any other page in the application will spin/hang after a click (even for very simple/static pages) until the original page finishes loading. This is a PHP 5.3-based application using native PHP sessions on Apache 2.2.x.
This happens in multiple browsers in all our dev, qa, and production instances. I'm not sure where to start looking. Any advice?
Get all the data you need from $_SESSION, and call session_write_close() before you do something that takes a long time. While a PHP script is active with a session, it holds a lock on that session (after all, it may write data to it that the next request needs).
If you need to write something to the session after you've done your long job, you can just call session_start() once again (provided you've not generated any output yet), write to it, and then let the script end; you can even repeat this cycle a few times.
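As a sketch, the whole pattern looks like this (do_long_running_work() is a placeholder):
session_start(); // acquires the session lock
$userId = $_SESSION['user_id']; // grab what you need up front (key is illustrative)
session_write_close(); // release the lock so other requests can proceed
do_long_running_work($userId); // the slow part; other pages now load normally
session_start(); // briefly re-acquire the lock (no output sent yet)
$_SESSION['last_run'] = time(); // write the result back
session_write_close();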
Also, don't call session_start() if you don't need it; I see you mention 'static' pages. If a page does not need session data, avoid the overhead and the locks it creates.

PHP page process separation

If a PHP page is running some long process, such as sleep() or a while loop, that makes it take a while to load, does it affect other requests for the same page?
I noticed that when I try to open the same page with a different, short process, it also takes very long to load; to be clear, it doesn't load before the first one (the long process) does.
Is that expected, or is something wrong with my code, and how do I prevent it?
I think it has something to do with caching; I don't want to mess things up before getting a tip or an answer.
PHP handles each request in its own process: each time you access the page, a process starts, runs, and finishes.
Each process won't affect the others.
I noticed that when I try to open the same page with a different, short process, [...] it doesn't load before the first one (the long process) does
The most common reasons:
Your scripts use PHP sessions "as is", which rely on file locking. The locking mechanism ensures that only one script at a time can edit a given user's session data, but it also means that if two requests from the same user arrive simultaneously and both rely on the session, the second script will not start before the first has finished. (Different users have different session files, so they can't collide.)
The browser detects that the page is taking long and intentionally delays subsequent requests in the background; I believe this is something Google Chrome does by default.
Both cases are relatively safe, however, because the delay only appears when the same user is trying to load several pages simultaneously, which is unusual; different users will not see delays regardless of how long the actual page takes to load.
More on the topic in this excellent SO answer.

Push an XML file update from server to all client browsers?

I'm wondering if it is possible to push an xml file update from server to all client browsers?
Basically my proposed situation is that my server holds an XML file; when a user loads a page that uses said XML file, they can request to change it, and if the change is allowed (determined client-side by the page) then the XML file is updated on the server. I'm fine up to this point (well, I have plenty of reading to get me to this point). Then I want all connected pages to refresh every element that relies on the XML file, without refreshing the whole page.
In other words, all the pages using the file should update their copy of the data whenever the copy on the server is newer than theirs. Is this possible via a server push, or do I have to constantly poll the server to compare files? (That seems sloppy to me.) And if it is possible, what's the best way to go about it?
Thanks for any points in the right direction.
Because a webpage is stateless you cannot push data to it; you need to poll the server for updates. Think about a small AJAX script that polls about every 5 minutes; when the content has been updated, the script calls something to update the page. You will need a fair amount of AJAX to do this; use jQuery or similar to accomplish it.
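For the server side of that poll, a minimal PHP sketch (check_xml.php and the file path are hypothetical) could simply report the XML file's modification time and let the client compare it to the value it last saw:
// check_xml.php - hypothetical polling endpoint
clearstatcache(); // filemtime() results are cached per request
$file = 'data.xml'; // illustrative path to the shared XML file
header('Content-Type: application/json');
echo json_encode(array(
    'mtime' => file_exists($file) ? filemtime($file) : 0,
));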
You may try an AJAX call to a PHP script like:
set_time_limit(3600); // one hour, or as long as your session timeout is
// Keep looping to prevent PHP from stopping the script.
while (true)
{
    sleep(5); // 5 seconds between checks
    // ... do the XML updates and echo them to the client ...
    ob_flush(); // flush PHP's output buffer first...
    flush();    // ...then ask the web server to send it
    if (connection_aborted()) {
        break; // stop once the client disconnects
    }
}
The connection will stay open, and every 5 seconds the XML will be pulled and the client updated.
If you don't want to spend a lot of time and resources pulling the data when it hasn't changed, you may use APC, memcache or any other server-side store for a variable that tells you whether the XML was changed.
if (apc_fetch('xml_updated') == 1)
{
    // do the xml pull
}
You may want to test what happens, in terms of resources, if you poll the data every second. In my opinion it's best to use a longer delay.
Hope it helps!

PHP - Display status of loop

I have a PHP script something like:
for ($i = 0; $i < 500; ++$i) {
    // Do some operation with file number $i (500 files in total)
}
The thing is, the script works and displays the end results, but the operation takes a while, and watching a blank screen can be frustrating. I was wondering if there is some way I can continuously update the page at the client's end, detailing which file is currently being worked on. That is, can I display and continuously update the current value of $i?
The Solution
Thanks everyone! The output buffering is working as suggested. However, David has offered valuable insight, and I am considering that approach as well.
You can buffer and control the output from the PHP script.
However, you may want to consider the scalability of this design. In general, heavy processes shouldn't be done online. Your particular case may be an edge in that the wait is acceptable, but consider something like this as an alternative for an improved user experience:
The user kicks off a process. This can be as simple as setting a flag on a record in the database or inserting some "to be processed" records into the database.
The user is immediately directed to a page indicating that the process has been queued.
An offline process (either kicked off by the PHP script on the server or scheduled to run regularly) checks the data and does the heavy processing.
In the meantime, the user can refresh the page (manually, by navigating elsewhere and coming back to check, or even use an AJAX polling mechanism to update the page) to check the status of the processing. In this case, it sounds like you'd have several hundred records in a database table queued for processing. As each one finishes, it can be flagged as done. The page can just check how many are left, which one is current, etc. from the data.
When the processing is completed, the page shows the result.
In general this is a better user experience because it doesn't force the user to wait. The user can navigate around the site and check back on progress as desired. Additionally, this approach scales better. If your heavy processing is done directly on the page, what happens when you have many users or the data processing load increases? Will the page start to time out? Will users have to wait longer? By making the process happen outside of the scope of the website you can offload it to better hardware if needed, ensure that records are processed in serial/parallel as business rules demand (avoid race conditions), save processing for off-peak hours, etc.
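A sketch of the status check under this design (PDO, the table, and the column names are all assumptions):
// status.php - hypothetical AJAX endpoint for the queued-processing design
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // illustrative DSN
$stmt = $pdo->query(
    "SELECT SUM(status = 'done') AS done, COUNT(*) AS total
     FROM processing_queue" // hypothetical queue table
);
$row = $stmt->fetch(PDO::FETCH_ASSOC);
header('Content-Type: application/json');
echo json_encode($row); // e.g. {"done":"142","total":"500"}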
Check out PHP's Output Buffering.
Try to use:
flush();
http://php.net/manual/ru/function.flush.php
Try the flush() function. Calling this function forces PHP to send whatever output it has so far to the client, instead of waiting for the script to end.
However, some web servers will only send the output once the entire page is done being built, so calling flush() would have no effect in that case.
Also, browsers themselves buffer input, so you may run into problems there. For example, certain versions of IE won't start displaying the page until 256 bytes have been received.
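Applied to the loop in the question, a minimal sketch looks like this:
// End any active output buffers so each echo reaches the browser immediately.
while (ob_get_level() > 0) {
    ob_end_flush();
}
for ($i = 0; $i < 500; ++$i) {
    // ... process file number $i ...
    echo "Processing file $i of 500<br>\n";
    flush(); // push the status line to the client right away
}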

PHP display progress messages on the fly

I am working on a tool in PHP that processes a lot of data and takes a while to finish. I would like to keep the user updated on what is going on and which task is currently being processed.
What is in your opinion the best way to do it? I've got some ideas but can't decide for the most effective one:
The old way: execute a small part of the script and display a page to the user with a Meta Redirect or a JavaScript timer to send a request to continue the script (like /script.php?step=2).
Sending AJAX requests constantly to read a server file that PHP keeps updating through fwrite().
Same as above but PHP updates a field in the database instead of saving a file.
Does any of those sound good? Any ideas?
Thanks!
Rather than writing to a static file you fetch with AJAX, or to an extra database field, why not have another PHP script that simply returns a completion percentage for the specified task? Your page can then update the progress via a very lightweight AJAX request to that PHP script.
As for implementing this "progress" script, I could offer more advice if I had more insight as to what you mean by "processes a lot of data". If you are writing to a file, your "progress" script could simply check the file size and return the percentage complete. For more complex tasks, you might assign benchmarks to particular processes and return an estimated percentage complete based on which process has completed last or is currently running.
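For the file-size approach, a sketch (the file name and the expected final size are assumptions):
// progress.php - hypothetical: report percent complete from the output file's size
clearstatcache(); // filesize() results are cached per request
$expected = 1048576; // expected final size in bytes (assumed to be known)
$actual = file_exists('output.dat') ? filesize('output.dat') : 0;
echo min(100, (int) round($actual / $expected * 100));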
UPDATE
This is one suggested method to "check the progress" of an active script that is simply waiting for a response from a request. I have a data-mining application that uses a similar method.
In the script that makes the request you're waiting for (the script you want to check the progress of), you can store a progress variable for the process, either in a file or in a database. I use a database, as I have hundreds of processes running at any time which all need to track their progress, and I have another script that lets me monitor them. When the process begins, set this variable to 1. You can then select an arbitrary number of 'checkpoints' the script will pass and calculate the percentage from the current checkpoint.
For a large request, however, you might be more interested in the approximate percentage of the request that has completed. One possible solution is to know the size of the returned content and set your status variable according to the percentage received at any moment; i.e. if you receive the request data in a loop, you can update the status on each iteration. If you are downloading to a flat file, you could poll the size of the file instead. The same could be done, less accurately, with time rather than file size, if you know roughly how long the request should take and compare that against the script's current execution time.
Obviously neither of these is a perfect solution, but I hope they give you some insight into your options.
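A sketch of the checkpoint idea using a flat file (the file name and the stage functions are illustrative):
// Worker side: record a checkpoint as each stage completes.
function set_progress($percent) {
    file_put_contents('progress.txt', $percent); // illustrative path
}
set_progress(1); // the process has started
fetch_remote_data(); // placeholder for the long-running request
set_progress(50);
parse_and_store(); // placeholder for the post-processing stage
set_progress(100); // done
// Status side: the AJAX-polled script just echoes the file back.
// echo (int) file_get_contents('progress.txt');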
I suggest using the AJAX method, but not with a file or a database. You could probably use session values or something like that; that way you don't have to create a connection or open a file to do anything.
In the past, I've just written messages out to the page and used flush() to flush the output buffer. Very simple, but it may not work correctly on every web server or with every web browser (as they may do their own internal buffering).
Personally, I like your second option the best. Should be reliable and fairly simple to implement.
I like option 2 - using AJAX to read a status file that PHP writes to periodically. This opens up a lot of different presentation options. If you write a JSON object to the file, you can easily parse it and display things like a progress bar, status messages, etc...
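For instance, the writing side might look like this (the file name and fields are illustrative):
// Inside the processing loop: write the current status as JSON.
file_put_contents('status.json', json_encode(array(
    'current' => $i,
    'total' => 500,
    'message' => "Processing file $i",
)));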
A 'dirty' but quick-and-easy approach is to just echo out the status as the script runs along. So long as you don't have output buffering on, the browser will render the HTML as it receives it from the server (I know WordPress uses this technique for its auto-upgrade).
But yes, a 'better' approach would be AJAX, though I wouldn't say there's anything wrong with 'breaking it up' using redirects.
Why not incorporate 1 & 2, where AJAX sends a request to script.php?step=1, checks response, writes to the browser, then goes back for more at script.php?step=2 and so on?
If you can do away with IE, then use Server-Sent Events; it's the ideal solution.
