persistent php connection? - php

Is it possible to have a PHP script pause at a certain point, wait for an AJAX response, then continue executing?
If not, what is the most appropriate way to store the current state of a script (all defined variables etc.) until it is invoked again?

Making it wait for an AJAX response which might never come sounds like a very bad plan (apart from being impossible, given that HTTP is a stateless protocol).
What's wrong with using the session to persist user data (you can store objects in the session)? It won't serialize the whole script state automagically, though.
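A minimal sketch of persisting an object in the session across requests (the class and key names here are made up for illustration):

```php
<?php
// Hypothetical state object -- any serializable class works.
class CartState {
    public $items = [];
}

session_start();

if (!isset($_SESSION['cart'])) {
    $_SESSION['cart'] = new CartState();
}

$_SESSION['cart']->items[] = 'widget';

// The object is serialized into the session file at the end of the
// request; the class definition must be loaded again before
// session_start() on the next request so it can be unserialized.
```

Note that this persists only what you explicitly put in `$_SESSION`, not the script's whole variable scope.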

Not how php works, or ajax really either. Each request is independent. You can use a session.

I think you are looking for Comet - a form of reverse Ajax which holds the connection open indefinitely - although I don't think you'd have a PHP script just wait for it...

If you want your PHP script to receive a response that you would normally access via AJAX, just use the cURL functions. cURL will wait for the response to arrive.
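Something like the following sketch - `curl_exec()` blocks until the response arrives or the timeout hits (the URL is a placeholder):

```php
<?php
// Call another script server-side instead of via browser-side Ajax.
$ch = curl_init('http://example.com/api/endpoint.php'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);    // give up connecting after 5 s
curl_setopt($ch, CURLOPT_TIMEOUT, 5);           // give up entirely after 5 s

$response = curl_exec($ch);                     // blocks until done
if ($response === false) {
    error_log('Request failed: ' . curl_error($ch));
}
curl_close($ch);
```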

Related

Ajax -PHP in use case

Is there an "in use" case for a specific PHP file in AJAX?
For example: on my website there are numerous AJAX scripts running simultaneously. A PHP script is already in use by one AJAX script. What does that mean for the other AJAX scripts? Do they connect to a new instance of the PHP script, or what? Or am I completely wrong?
As @mudasobwa said, PHP is (most of the time) stateless. It's not really PHP that is stateless; it is HTTP which is a stateless protocol.
That means every request is completely separate from every other, and they don't know anything about the requests before or after. That's why there are things like SESSION, POST and GET to send or share information across multiple PHP (HTTP) requests.
Also note that this is the reason you need to do things like authentication, database connection and querying data on every request.
As already mentioned in the comments, each request is separate.
However, if you are using sessions in the PHP files requested by AJAX, and one request takes some time to process, your second AJAX request might hang if it tries to access the same session. This is because PHP locks the session file until it knows that the locking thread is done with it.
What you can do (if you are using sessions) is close the session file by calling session_write_close() as soon as you know you no longer need to update the session.
http://php.net/session_write_close
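A minimal sketch of that pattern (the `user_id` key is a placeholder):

```php
<?php
// Releasing the session lock early so parallel Ajax requests that
// share this session are not serialized behind this slow script.
session_start();

// Read whatever we need while the session is open.
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;

// Done writing to the session: release the lock now.
session_write_close();

// Any long-running work below no longer blocks other requests
// that use the same session.
```

Writes to `$_SESSION` after `session_write_close()` are not saved, so do all session updates before the call.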

How to print http response first and do the heavy db operations later?

I'm implementing a mobile API. One of the requests processes JSON data and afterwards returns a predefined message (independent of the calculation) back to the device. I'm using Kohana 3.
How do I return the HTTP response first and do the calculation afterwards?
What do you think about using a message queue and a separate program that does the processing and DB operations?
One option would be to use gearman. There is a Kohana gearman module made by one of the Kohana devs.
Maybe flush() can help; it sends the buffer (and the headers as well). But flush() doesn't guarantee that the headers are sent, because a web server (like Apache) sits between PHP and the browser.
Not sure I understand your question, but if you want to buffer output you can use ob_start() and ob_get_clean()
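For reference, `ob_start()` / `ob_get_clean()` capture output instead of sending it, which is the opposite of flushing early:

```php
<?php
// Capture output in a buffer rather than sending it to the client.
ob_start();
echo "partial content";
$buffered = ob_get_clean(); // returns the buffer contents and discards them

// $buffered now holds "partial content"; nothing was actually sent.
```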
I think you might be looking for something like
ignore_user_abort(true)
http://www.php.net/manual/en/function.ignore-user-abort.php
After making the call you can send your response back to the browser and finish wrapping up your calculations/logging after the connection is closed and the client is off doing something else.
This enables you to do some quick post-processing without hanging up the client or having to use an external process to handle your tasks.
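A sketch of the "respond first, work later" pattern; whether the client really sees the response early depends on the SAPI and web server (on PHP-FPM, `fastcgi_finish_request()` is the more reliable way to do this). `do_heavy_work()` is a placeholder:

```php
<?php
// Placeholder for the slow part (logging, calculations, ...).
function do_heavy_work() {
    // ...
}

ignore_user_abort(true);   // keep running even if the client disconnects

ob_start();
echo json_encode(['status' => 'accepted']);
header('Content-Type: application/json');
header('Connection: close');
header('Content-Length: ' . ob_get_length()); // lets the client stop reading here
ob_end_flush();
flush();                   // push the full response out to the client

// The client has its answer; now do the slow work.
do_heavy_work();
```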

ajax multi-threaded

Is it possible to achieve true multi-threading with Ajax? If so, how? Please give me some related information, websites or books.
It depends on what you mean by "multithreaded".
Javascript code is distinctly single-threaded. No Javascript code will interrupt any other Javascript code currently executing on the same page. An AJAX (XHR) request will trigger the browser to do something and (typically) call a callback when it completes.
On the server, each AJAX request is a separate HTTP request. Each of these will execute on its own thread. Depending on the Web server config, they may not even execute on the same machine. But each PHP script instance is entirely separate, even if they are running the same script. There is no shared state per se.
Now browsers typically cap the number of simultaneous AJAX requests a page can make on a per-host basis; that number is typically 2. I believe you can change it, but since the majority of people will have the default value, you have to assume it will be 2. Requests beyond that will queue until an existing request completes. This can lead to annoying workarounds like creating multiple host names: req1.example.com, req2.example.com, etc.
The one exception is sessions but they aren't multithreaded. Starting a session will block all other scripts attempting to start the exact same session (based on the cookie). This is one reason why you need to minimize the amount of time a session is opened for. Arguably you could use a database or something like memcache to kludge inter-script communication but it's not really what PHP is about.
PHP is best used for simple request processing. A request is received. It is processed and a response is returned. That response could be HTML, XML, text, JSON or whatever. The request could be an HTTP request from the browser or an AJAX request.
Each of these request-response cycles should, where possible, be treated as separate entities.
Another technique used is long-polling. An HTTP request is sent to the server and may not return for a long time. This is used for Web-based chat and other "server push" type scenarios. Sometimes partial responses will be flushed without ending the request.
The last option (on Unix/Linux at least) is that PHP can spawn processes but that doesn't seem to be what you're referring to.
So what is it exactly you're trying to do?
You can't actually multi-thread, but what a lot of larger websites do is flush the output for a page and then use Ajax to load additional components on the fly, so that the user sees content even while the browser is still requesting new information. It's a good technique to know but, like everything else, you need to be careful how you use it.

PHP and AJAX, sending new PHP info to page for AJAX to receive, is this possible?

I've been searching for how to do this, but my searches aren't turning up anything about what I'm trying to do, so maybe I'm not searching with the right terms, or it isn't possible. I figured I would ask here for help. This is what I am trying to do:
I have PHP scripts that are called asynchronously: a script is called and just runs, and the calling PHP doesn't wait for a response, so it can go on to do other stuff / free things up so another async PHP process can be run.
I would still like to get a result back from these "zombie" scripts, or whatever you want to call them. However, the only way I can think of that I know will work is to have the zombie script save its final output to a database, and then have my AJAX UI make periodic requests to the database to check whether the needed value exists where it is supposed to. That would let it pick up the output from the zombie PHP script.
I am thinking it would be better if the zombie script could somehow push its result to the AJAX UI, which would intercept it and use the received data as needed (such as displaying it in a DIV for the user to see). Basically, I'm wondering whether PHP can force this kind of thing, rather than involving a database and making AJAX do repeated requests to check for a specific value.
Thanks for any advice
No, a background script has no way to influence the client's front-end because it has no connection to it.
Starting a background script, having the script write status data into a shared space - be it a database or a memcache or a similar solution - and polling the status through Ajax is usually indeed the best way to go.
One alternative may be Comet. It's a technique where a connection is kept open over a long time, and updated actively from the server side (instead of frequent client-side Ajax polling). I have no practical experience with this but I imagine it most probably needs server side tweaking to be doable in PHP - it's not the best platform for long-running stuff. See this question for some approaches.
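A minimal sketch of the shared-space polling approach described above, using a file as the shared space (a database row or memcache key works the same way; the paths, the `job` parameter, and the JSON shape are placeholders):

```php
<?php
// status.php -- the endpoint the client-side Ajax polls periodically.
// The background "zombie" script writes its result to this shared
// file when it finishes, e.g. {"done":true,"result":"..."}.
$job = isset($_GET['job']) ? $_GET['job'] : 'none';
$statusFile = '/tmp/job-' . basename($job) . '.json'; // basename() blocks path traversal

header('Content-Type: application/json');
if (is_file($statusFile)) {
    readfile($statusFile);                 // job finished: return its output
} else {
    echo json_encode(['done' => false]);   // not ready yet: poll again later
}
```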

AJAX return data before execution has finished

I have a page that I am performing an AJAX request on. The purpose of the page is to return the headers of an e-mail, which I have working fine. The problem is that this is called for each e-mail in a mailbox, which means it will be called once per mail in the box. This is a problem because the imap_open function takes about a second to execute, and that cost is paid on every call. Is there a way to make an AJAX call that returns the information as it becomes available and keeps executing, to prevent multiple calls to a function with a slow execution time?
Cheers,
Gazler.
There are technologies out there that allow you to configure your server and Javascript to allow for essentially "reverse AJAX" (look on Google/Wikipedia for "comet" or "reverse AJAX"). However, it's not incredibly simple and for what you're doing, it's probably not worth all of the work that goes into setting that up.
It sounds like you have a very common problem which is essentially you're firing off a number of AJAX requests and they each need to do a bit of work that realistically only one of them needs to do once and then you'd be good.
I don't work in PHP, but if it's possible to persist the return value of imap_open, or whatever its side effects are, across requests, then you should try to do that and then just reuse that saved resource.
Some pseudocode:
if (!persisted_resource) {
    persisted_resource = imap_open(...);
}
use(persisted_resource);
where persisted_resource should be some variable stored in session scope, application scope, or whatever PHP has available that's longer-lived than a request. (Caveat: in PHP a connection resource can't actually be serialized into the session, so in practice you would cache the fetched data rather than the connection itself.)
Then you can either have each request check this variable, so only one request will have to call imap_open, or you could initialize it while you're loading the page. Hopefully that's helpful.
Batch your results. Between loading all emails and loading a single email at a time, you could batch the email headers and send them back. Tweak this number until you find a good fit between responsiveness and content.
The PHP script would receive a range request in this case such as
emailHeaders.php?start=25&end=50
Javascript will maintain state and request data in chunks until all data is loaded. Or you could do some fancy stuff such as create client-side policies on when to request data and what data to request.
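The server side of that batched variant could look like the sketch below: open the mailbox once per request and return a range of headers as JSON (the mailbox string and credentials are placeholders, and the `imap` extension is required):

```php
<?php
// emailHeaders.php?start=25&end=50 -- one imap_open per batch, not per mail.
$start = isset($_GET['start']) ? max(1, (int)$_GET['start']) : 1;
$end   = isset($_GET['end'])   ? (int)$_GET['end']           : $start + 24;

$headers = [];
if (function_exists('imap_open')) {          // requires the imap extension
    $mbox = imap_open('{imap.example.com:993/imap/ssl}INBOX', 'user', 'pass');
    $end  = min($end, imap_num_msg($mbox));  // don't run past the last message
    for ($i = $start; $i <= $end; $i++) {
        $h = imap_headerinfo($mbox, $i);
        $headers[] = [
            'subject' => isset($h->subject) ? $h->subject : '',
            'from'    => isset($h->fromaddress) ? $h->fromaddress : '',
        ];
    }
    imap_close($mbox);
}

header('Content-Type: application/json');
echo json_encode($headers);
```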
The browser is another bottleneck as most browsers only allow 2 outgoing connections at any given time.
It sounds as though you need to process as many e-mails as have been received with each call. At that point, you can return data for all of them together and parse it out on the client side. However, that process cannot go on forever, and the server cannot initiate the return of additional data after the http request has been responded to, so you will have to make subsequent calls to process more e-mails later.
The server-side PHP script can be configured to send the output as soon as its generated. You basically need to disable all functionality that can cause buffering, such as output_buffering, output_handler, HTTP compression, intermediate proxies...
The difficult part is that you'd need a JavaScript library that can handle partial input. That is to say, you need access to the downloaded data as soon as it's received. I believe it's technically possible, but some popular libraries like jQuery only let you read the data once the transfer is complete.
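The server side of such a streamed response might look like this sketch: unwind any buffering layers and flush each chunk as soon as it's ready. Whether the client actually sees partial data still depends on the web server and any proxies in between, and `output_buffering` itself is a per-directory setting that must be disabled in php.ini or .htaccess:

```php
<?php
// Disable what can be disabled at runtime and drop existing buffers.
ini_set('zlib.output_compression', 'off');
while (ob_get_level() > 0) {
    ob_end_flush();
}

// Pretend each loop iteration produces one email header.
for ($i = 1; $i <= 5; $i++) {
    echo "header $i\n";
    flush();            // push this chunk toward the client now
    usleep(100000);     // stand-in for the slow imap work
}
```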
