In one app, I have an administrative backend written in PHP, which allows me to browse internal data structures and change settings.
One controller queries a DB backend, checks the results against another reference DB using SOAP, and returns a list of missing values. It takes a few seconds to complete.
The PHP code doesn't send an HTTP Location header, nor does the client side kick off any JavaScript.
If I submit a request, the controller starts, finishes, and automatically (!) starts again. The second run also terminates normally and emits a report. The behaviour is deterministic: it happens every time I call the script.
After hours of debugging, I finally made the PHP code email me a report that includes the emitted HTML code. Sure enough, I receive two emails, one per run. The restart of the script happens instantly.
I know this isn't much to go on, but might this be a Firefox bug?
Other browsers run the PHP script once and only once.
Update 2012-01-09
The problem persists. Firefox still reloads the page; no other browser does.
Nothing I tried, in particular tracing the HTTP requests and replies, showed anything unusual.
You could use the "Live HTTP Headers" Firefox plugin to monitor what's going on. The page is obviously loaded twice; this will help you determine how you end up with something like this.
You might also want to use a proxy to see how the traffic differs across browsers (I use Fiddler, http://www.fiddler2.com/; it's a Microsoft project but works with all browsers).
Same problem here. I think I know what's causing it: I echo some debugging output before the DOCTYPE/html tag, and maybe Firefox decides something is wrong with the page, so it reloads it once more.
That sounds crazy, but if I don't echo anything before the HTML, it works fine.
Are you sending the correct character set headers? If Firefox thinks it started decoding the page with the wrong character set, it will reload the page to fix it.
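For reference, a charset can be declared explicitly from PHP before any output is sent. A minimal sketch (adjust the charset to whatever your pages actually use):

```php
<?php
// Declare the charset explicitly so the browser never has to guess.
// header() must be called before any output is emitted.
header('Content-Type: text/html; charset=utf-8');

echo '<!DOCTYPE html><html><head>';
// Repeating it in a meta tag is a common belt-and-braces measure.
echo '<meta charset="utf-8">';
echo '<title>Report</title></head><body>...</body></html>';
```

If debugging output is echoed before these headers go out, PHP falls back to its default charset, which is one way a browser can end up guessing wrong.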
Related
I've got the following problem at hand:
I have users on two separate pages, but page input is saved to the same text file. While one user is editing, the other can't. I keep track of this with sessions, writing the changes and whose turn it is to edit into a file.
It works fine so far; the output in the end is quite similar to a chat. However, right now users have to manually refresh their page to reload the file. What I'd like to do is have the page redirect when the file's timestamp changes (to indicate that the last user has saved their edits and it's another user's turn). I've looked into JavaScript short polling a little, but then found the PHP filemtime function, and it looks much easier to use. Well, here's what I've got:
$file = "msks/" . $session['user']['kampfnr'] . ".txt";
$oldtimestamp = filemtime($file);
while (true) {
    $waittimer = 3;
    sleep($waittimer);
    clearstatcache();  // filemtime() results are cached per request
    $newtimestamp = filemtime($file);
    if ($newtimestamp > $oldtimestamp) {
        addnav("", "kampf_ms.php?op=akt");
        redirect("kampf_ms.php?op=akt");
    }
}
In theory, while the user sees the output "it's ...'s turn to edit the file.", this should loop in the background, checking whether the file has been updated and, if so, redirecting the user.
In practice this heavily affects server performance (I'm on shared hosting) until it breaks with a "memory exceeded" error message.
Is something wrong with the code? Or is it generally a bad idea to use a while loop in this case?
Thanks in advance!
PHP should only be used to generate web content: the client makes a request to the server, the server runs the required script and returns the response to the client.
Once the page is built and sent to the client, the connection is closed; even if something changes on the server afterwards, the client is not informed.
So with an infinite loop, not only may the client wait an unbounded time for a response, but the server may also be heavily impacted by the load. It is indeed a really bad idea :)
PHP can't be used for bidirectional communication: it is only invoked to build the pages the client requests, so it can't do anything "in the background" (not directly; you can spawn an external script, but that can't notify the client either).
More generally, PHP over "regular" HTTP is a poor fit for bidirectional communication because of the client/server architecture: the server is passive and only answers client requests.
I suggest using the WebSocket protocol to build a chat-style application:
http://socket.io/
https://en.wikipedia.org/wiki/WebSocket
But for that, you need an "active" server solution, such as node.js or Ruby (depending on what your server supports).
The other way, if you want to stay in PHP, is to have the client make an Ajax request every 10 seconds or so to a PHP script that checks the file and sends a message back if it has been updated. But that kind of polling causes a heavy performance loss, so forget it immediately.
I'm considering doing capability/feature tests with JS and sending the results back to the server so it knows what it can and cannot send to the client.
It's the same idea as modernizr server - https://github.com/jamesgpearce/modernizr-server
The problem I'm running into is communicating the JavaScript results back to the server on the initial page load. I'm not sure what options are available, but I know there aren't many. I've tested setting a cookie and instantly reloading the page so PHP can read the results, but I'm concerned from an SEO standpoint. It seems like instantly reloading a page will have some adverse effects, and I'm particularly worried if the refresh happens on a page that has a form. Once the cookie is set and the user goes to another page, it works fine; the hard part is serving the content on the initial page load based on the capability tests. I've had a few thoughts, like using JS to build the markup on the initial page load and letting PHP read the cookie on subsequent loads, but I'm just not sure what the best solution might be.
I'm just at a loss as to what other options there are. I don't know which direction I should be looking in or if there is any direction at all. I don't know if AJAX would be able to help with this or not. I feel like I'm close, but figured maybe if I asked the community someone might have a good solution.
Thanks!
modernizr-server uses the method you described: it sets a cookie and then reloads the page. (Actually, it doesn't output any content; the only thing on the initial page load is the JavaScript itself, and it has an exit; statement if the cookie it's looking for isn't found.) Anyone who has JavaScript off can probably expect a blank page with it.
It looks like you have a couple of options (non-exclusive; there are more):
1. Set a cookie and then reload.
2. Set a cookie, and then use AJAX to fetch the initial page's content. (As you mentioned.)
3. Set an expected baseline of support (perhaps no JavaScript support whatsoever) and serve that on your initial page load. If they have JavaScript on, you can either reload, or use AJAX to tell your server what the browser supports and then reload chunks (or all) of the initial page.
4. Serve no-JavaScript content to just search engines, and use option 1 or 2 for everyone else.
Option three here is the most work-intensive, but probably the most inclusive. (Edit: options three and four will make sure search engines see your content.)
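A minimal sketch of the server side of options one and two; the cookie name and the capability test inside the emitted JavaScript are made up for illustration:

```php
<?php
// Hypothetical sketch: read the capability cookie if present,
// otherwise emit the JS test, set the cookie, and reload once.
if (isset($_COOKIE['capabilities'])) {
    $caps = json_decode($_COOKIE['capabilities'], true);
    // ... tailor the page output to $caps here ...
    echo $caps['touch'] ? 'touch layout' : 'desktop layout';
} else {
    // First visit: no content output, just the test + reload
    // (this is the behaviour the answer describes for modernizr-server).
    echo '<script>';
    echo 'document.cookie = "capabilities=" + encodeURIComponent(';
    echo 'JSON.stringify({touch: "ontouchstart" in window})) + "; path=/";';
    echo 'location.reload();';
    echo '</script>';
    exit;
}
```

Clients with JavaScript off never get past the else branch, which is why option three (a no-JS baseline) is the more inclusive route.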
I'm working on a simple PHP application, using CouchDB and PHP-on-Couch to access some views, and it's working great. My next step is to introduce Ajax to update the frontend with data from the database.
I understand you can use the _changes feed to detect any changes made to the database easily enough. So it's a matter of index.html monitoring for changes (through long polling) and calling loadView.php to update the page content.
Firstly, I hope the above is the correct method of going about it...
Secondly, when browsing to index.html, the page never seems to finish loading (the page-load bar never completes). When a change is made, Firebug shows the results as expected, but not any subsequent changes. At that point, the page seems to have stopped its endless loading.
So far, I'm using jQuery to make the Ajax call...
$.getJSON('http://localhost:5984/db?callback=?', function(db) {
    console.log(db.update_seq);
    $.getJSON('http://localhost:5984/db/_changes?since=' + db.update_seq + '&feed=continuous&callback=?', function(changes) {
        console.log(changes);
    });
});
Any ideas what could be happening here?
I believe the answer is simple enough.
A longpoll query is AJAX, guaranteed to respond only once, like fetching HTML or an image. It may take a little while to respond while it waits for a change; or it may reply immediately if changes have already happened.
A continuous query is COMET. It will never "finish" the HTTP reply, it will keep the connection open forever (except for errors, crashes, etc). Every time a change happens, zoom, Couch sends it to you.
So in other words, try changing feed=continuous to feed=longpoll and see if that solves it.
For background, I suggest the CouchDB Definitive Guide on change notifications and of course the excellent Couchbase Single Server changes API documentation.
I'm running a PHP script that does a huge MySQL query plus some crunching on the results. Because of this, the script takes a long time to execute and may appear to the user to be not working.
Is there a way to provide feedback to the user that the script is running?
Perhaps there's a way to print to the browser on each loop iteration, indicating which record it's on; a kind of "live output buffer" or something?
Try using flush(). http://us3.php.net/flush
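A minimal sketch of that approach; $rows and process_row() are placeholders for your actual query results and per-record work, and note that gzip compression or a proxy between PHP and the browser can still delay the output:

```php
<?php
// Stream progress to the browser while crunching records.
header('Content-Type: text/html; charset=utf-8');

// Disable PHP's own output buffering so each flush() reaches the browser.
while (ob_get_level() > 0) {
    ob_end_flush();
}

$total = count($rows);                 // $rows: your query results (placeholder)
foreach ($rows as $i => $row) {
    process_row($row);                 // placeholder for the crunching
    if ($i % 100 === 0) {
        printf("Processed %d of %d records...<br>\n", $i, $total);
        flush();                       // push buffered output to the client now
    }
}
echo "Done.";
```

Flushing every N records rather than every record keeps the progress output from becoming its own bottleneck.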
You could also have a main page that uses JavaScript/jQuery to request the work page. Then JavaScript could show a nice little loader box telling you that the page is still doing stuff!
Do the request in an iframe. That way the user sees a page while the results are still being loaded in the frame. Ajax would work as well.
I am looking for a way to start a function on form submit without leaving the browser window waiting for the result.
Example:
The user fills in the form and presses submit; the data from the form goes via JavaScript to the database, and a PHP function that will take several seconds starts. But I don't want the user to be left waiting for the end of that function. I would like to be able to take them to another page and leave the function doing its thing server side.
Any thoughts?
Thanks
Thanks for all the replies...
I got the Ajax part. But I cannot call Ajax and have the browser move to another page at the same time.
This is what I wanted.
-User fills form and submits
-Result from the form passed to database
-long annoying process function starts
-user carries on visiting the rest of the site, independent of the status of the "long annoying process function"
By the way and before someone suggests it. No, it cannot be done by cron job
Use AJAX to call the PHP script, and at the top of the script turn on ignore_user_abort:
ignore_user_abort(true);
That way, if they navigate away from the page, the script will continue running in the background. You can also use
set_time_limit(0);
to remove the execution time limit, which is useful if you know your script will take a while to complete.
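Putting the two together, a hypothetical endpoint might look like this; do_long_running_work() is a placeholder, and note that on some server setups the browser will still hold the connection open until the script ends unless you also close it explicitly (e.g. with fastcgi_finish_request() under PHP-FPM):

```php
<?php
// worker.php -- called via AJAX; keeps running after the user navigates away.
ignore_user_abort(true);  // don't abort when the client disconnects
set_time_limit(0);        // remove the max execution time limit

// Acknowledge the request right away so the AJAX call can return.
echo "started";
flush();

do_long_running_work();   // placeholder for the slow job
```

The AJAX caller fires this request and immediately lets the user navigate on; whether the browser keeps the connection open no longer matters to the script.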
The most common method is:
exec("$COMMAND > /dev/null 2>&1 &");
Ah, OK. Well, you're essentially asking whether PHP supports threading, and the general answer is no... however...
there are some tricks you can use to mimic this behaviour, one of which is highlighted above and involves forking out to a separate process on the server. This can be achieved in a number of ways, including the exec() method. You may also want to look here:
PHP threading
I have also seen people try to force a flush of the output buffer halfway through the script, attempting to force the response back to the client. I don't know how successful this approach is, but maybe someone else will have some information on that one.
This is exactly what AJAX (shorthand for Asynchronous JavaScript + XML) is for:
AJAX Information
It allows you to write client-side code that sends asynchronous requests to your server, such that the user's browser is not interrupted by an entire page request.
There is a lot of information about AJAX out there on the web, so take a deep breath and get googling!
Sounds like you want to use some of the features AJAX (Asynchronous JavaScript and XML) has to offer.
Basically, you would have a page with content. When a user clicks a button, JavaScript would POST data to the server and begin processing. Simultaneously, that JavaScript might load a page from the server and then display it (e.g., load data, then replace the contents of a DIV with that new page).
This kind of thing is the premise behind AJAX, which you see everywhere when you have a web page doing multiple things simultaneously.
Worth noting: this doesn't mean the script is running "in the background on the server." Your web browser is still maintaining a connection with the web server, which means the code is running in the "background" on the client's side. And by "background" we really mean "processing the HTTP request in parallel with other HTTP requests to give the feel of a 'background' running process."