This is a relatively simple question to which I haven't been able to find the answer (or my search skills are just rubbish). Say I send $_POST content through jQuery/AJAX; the content is handled by PHP and a result is returned, finishing the function.
How long does this $_POST content still "live"? Is it "destroyed" as soon as the AJAX / PHP function is done?
Or does it remain in the system for a while?
HTTP is a stateless protocol, so the web server does not need to keep any data across multiple requests. When your web server passes multiple requests to PHP (usually PHP-FPM), it treats every request separately.
So if you send an HTTP POST request (which has a Content-Type of application/x-www-form-urlencoded or multipart/form-data) to PHP, the data will be loaded into $_POST before your script is executed. You can access it while your script is running, but afterwards all variables are deleted from memory, including $_POST.
If you want to use data across multiple requests, you need to persist it somehow, for example using sessions, files, databases, shared memory, or APCu.
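As a minimal sketch of the session approach (assuming a hypothetical handler script, call it handler.php, and a submitted field named username):

```php
<?php
// handler.php — sketch: persist submitted data beyond this request.
// $_POST only exists for the current request; copy what you need
// into the session before the script ends.
session_start();

if (isset($_POST['username'])) {
    $_SESSION['username'] = $_POST['username'];
}

// On a later, separate request, $_POST is empty again,
// but the session still holds the value:
echo $_SESSION['username'] ?? 'no data yet';
```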
Related
I have a web page that will have a dynamic number of input fields depending on the user's preference. I am using AJAX to send the data. I'm thinking of sending the data one field at a time via individual POST requests to a single php page. That page will look at the variable that has been set and respond appropriately. If I have a single javascript function that runs a for loop, sending the POST requests as it runs through the input fields, do I get one php session for each POST, or do I get only a single one that begins, runs the script, and ends?
Be sure that you clearly understand what "a session" is, and how it relates to HTTP and therefore also to AJAX.
Ordinarily, you would bundle up all the data and send it in one AJAX request, although you can certainly do things any way you like. Remember that AJAX requests are likely to be processed in parallel on the server, and that they might be handled in an unpredictable sequence. Therefore, it is common practice that everything that logically "goes together" is sent together, in one AJAX round-trip.
Your "session," meanwhile, would be established once, and then referenced (implicitly ...) in all AJAX requests, as well as in any other HTTP or HTTPS activity which occurs.
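On the server side, a single bundled round-trip might be handled like this (a sketch, assuming the client POSTs one JSON body containing all related fields):

```php
<?php
// Sketch: one AJAX request carrying everything that belongs together.
// Assumes the client POSTs a JSON body such as:
//   {"name": "Alice", "items": ["a", "b", "c"]}
session_start();   // the session is referenced implicitly via its cookie

$payload = json_decode(file_get_contents('php://input'), true);

if ($payload === null) {
    http_response_code(400);
    exit('invalid JSON');
}

// All related fields arrive in a single round-trip, so there is
// no ordering problem between parallel requests.
foreach ($payload['items'] as $item) {
    // process $item ...
}
```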
Is there an "in use" state for a specific PHP file called via AJAX?
For example: on my website, numerous AJAX scripts run simultaneously, and the same PHP script is already in use by one AJAX call. What does that mean for the other AJAX calls? Do they connect to a new instance of the PHP script? Or am I completely wrong?
As @mudasobwa said, PHP is (most of the time) stateless. It's not really PHP that is stateless; it is HTTP which is a stateless protocol.
So that means every request is completely separate from the others, and they don't know anything about the requests before or after them. That's why there are mechanisms like SESSION, POST and GET to send or share information across multiple PHP (HTTP) requests.
Also note that this is why you need to handle things like authentication, database connections and data queries on every request.
As already mentioned in the comments, each request is separate.
However, if you are using sessions in the PHP files requested by AJAX, and one request takes some time to process, a second AJAX request may hang if it tries to access the same session. This is because PHP locks the session file until it knows that the locking thread is done with it.
What you can do (if you are using sessions) is close the session file by calling session_write_close() once you know you no longer need to update the session.
http://php.net/session_write_close
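A sketch of the pattern (the sleep() stands in for whatever slow processing the request does):

```php
<?php
session_start();

// Read (or write) whatever session data this request needs ...
$userId = $_SESSION['user_id'] ?? null;

// ... then release the session lock so that parallel AJAX requests
// from the same browser are not serialized behind this one.
session_write_close();

// Note: writes to $_SESSION after this point are not saved.
// Long-running work below no longer blocks other requests.
sleep(5);   // stands in for slow processing
echo "done for user $userId";
```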
I'm creating a specific application in PHP that receives data from clients and processes it.
Sometimes the data sent by a client can be significantly large or of unknown length.
So I want to ask: is it possible to make PHP start processing data before the client finishes uploading?
Here is what I'm looking at:
From the PHP docs, it is possible to use the PUT method and receive content via php://input:
http://www.php.net/manual/en/features.file-upload.put-method.php
I'm planning on using PUT method with chunked transfer encoding.
But I just tried sending a PUT request to phpinfo.php (you know what's inside)
and intentionally did not send the last chunk (the zero-length chunk).
PHP doesn't respond unless I complete the request.
(That's not what I expect from the PUT method; it should have some kind of advantage.
If it's unable to start PHP early, I'd rather use an ordinary POST.)
And according to this topic
Can I reply to a HTTP Request Before I've Finished Receiving it?
I don't think an early response from PHP would break HTTP.
(Don't worry about the client; I wrote it myself.)
Note: Breaking the data into many requests is not an option, because in order to process the data there are memory/objects that can't be shared among multiple PHP instances (even with a memory table in MySQL).
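For reference, reading the body incrementally looks like the sketch below. Whether data actually arrives before the upload finishes depends on the SAPI: as described above, most setups (e.g. PHP-FPM behind nginx) buffer the full body before the script even starts, so this only streams when the server is configured not to buffer.

```php
<?php
// Sketch: read the raw request body in chunks instead of
// slurping it all at once with file_get_contents().
$in = fopen('php://input', 'rb');

while (!feof($in)) {
    $chunk = fread($in, 8192);
    if ($chunk === false || $chunk === '') {
        break;
    }
    // process $chunk as it becomes available ...
}

fclose($in);
```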
Is it possible to achieve true multi-threading with Ajax? If so, how? Please give me some related information, websites or books.
It depends on what you mean by "multithreaded".
Javascript code is distinctly singlethreaded. No Javascript code will interrupt any other Javascript code currently executing on the same page. An AJAX (XHR) request will trigger the browser to do something and (typically) call a callback when it completes.
On the server, each AJAX request is a separate HTTP request. Each of these will execute on its own thread. Depending on the Web server configuration, they may not even execute on the same machine. But each PHP script instance is entirely separate, even when running the same script. There is no shared state per se.
Now browsers typically cap the number of simultaneous AJAX requests a page can make on a per-host basis. Historically this limit was 2 (the default suggested by HTTP/1.1); modern browsers typically allow around 6. You can sometimes change it, but since the majority of people will have the default value, you have to assume the default. Requests beyond the limit will queue until an existing request completes. This can lead to annoying workarounds like creating multiple host names: req1.example.com, req2.example.com, etc.
The one exception is sessions but they aren't multithreaded. Starting a session will block all other scripts attempting to start the exact same session (based on the cookie). This is one reason why you need to minimize the amount of time a session is opened for. Arguably you could use a database or something like memcache to kludge inter-script communication but it's not really what PHP is about.
PHP is best used for simple request processing. A request is received. It is processed and a response is returned. That response could be HTML, XML, text, JSON or whatever. The request could be an HTTP request from the browser or an AJAX request.
Each of these request-response cycles should, where possible, be treated as separate entities.
Another technique used is long-polling. An HTTP request is sent to the server and may not return for a long time. This is used for Web-based chat and other "server push" type scenarios. Sometimes partial responses will be flushed without ending the request.
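A minimal long-polling handler might look like this (check_for_update() is a hypothetical function, e.g. a database poll; the 25-second deadline is an assumed value chosen to stay under typical proxy timeouts):

```php
<?php
// Sketch of long-polling: hold the request open until there is
// something to report, or a timeout expires.
set_time_limit(0);
$deadline = time() + 25;

while (time() < $deadline) {
    $update = check_for_update();   // hypothetical: poll a DB, file, etc.
    if ($update !== null) {
        echo json_encode($update);
        exit;                       // client processes it and re-polls
    }
    usleep(500000);                 // wait 0.5 s before checking again
}

http_response_code(204);            // nothing happened; client re-polls
```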
The last option (on Unix/Linux at least) is that PHP can spawn processes but that doesn't seem to be what you're referring to.
So what is it exactly you're trying to do?
You can't actually multi-thread, but what a lot of larger websites do is flush the output for a page and then use AJAX to load additional components on the fly, so that the user sees content even while the browser is still requesting new information. It's a good technique to know but, like everything else, you need to be careful how you use it.
I have a page that performs an AJAX request. The purpose of the page is to return the headers of an e-mail, which I have working fine. The problem is that this is called for each e-mail in a mailbox, so it runs once per message. This is a problem because the imap_open function takes about a second to execute, and it runs on every call. Is there a way to make an AJAX call that returns the information as it becomes available and keeps executing, to avoid repeated calls to a function with a slow execution time?
Cheers,
Gazler.
There are technologies out there that allow you to configure your server and Javascript to allow for essentially "reverse AJAX" (look on Google/Wikipedia for "comet" or "reverse AJAX"). However, it's not incredibly simple and for what you're doing, it's probably not worth all of the work that goes into setting that up.
It sounds like you have a very common problem which is essentially you're firing off a number of AJAX requests and they each need to do a bit of work that realistically only one of them needs to do once and then you'd be good.
I don't work in PHP, but if it's possible to persist the return value of imap_open (or whatever its side effects are) across requests, then you should try to do that and then just reuse that saved resource.
Some pseudocode:
if (!isset($persisted_resource)) {
    $persisted_resource = imap_open($mailbox, $user, $password);
}
// ... reuse $persisted_resource for the rest of the work ...
where persisted_resource should be some variable stored in session scope, application scope or whatever PHP has available that's longer lived than a request.
Then you can either have each request check this variable so only one request will have to call imap_open or you could initialize it while you're loading the page. Hopefully that's helpful.
Batch your results. Rather than loading all emails at once or a single email at a time, you could batch the email headers and send them back together. Tweak the batch size until you find a good fit between responsiveness and content.
The PHP script would receive a range request in this case such as
emailHeaders.php?start=25&end=50
Javascript will maintain state and request data in chunks until all data is loaded. Or you could do some fancy stuff such as create client-side policies on when to request data and what data to request.
The browser is another bottleneck, as browsers limit the number of simultaneous connections per host (historically 2, around 6 in modern browsers).
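The range-request idea above could be sketched like this on the server side (get_header() is a hypothetical per-message fetch, and the mailbox credentials are placeholders):

```php
<?php
// emailHeaders.php — sketch of the batched endpoint for
// requests like emailHeaders.php?start=25&end=50
$start = max(0, (int) ($_GET['start'] ?? 0));
$end   = max($start, (int) ($_GET['end'] ?? $start + 25));

$user = 'user';    // placeholder credentials
$pass = 'secret';
$mbox = imap_open('{imap.example.com:993/imap/ssl}INBOX', $user, $pass);

// One imap_open() now serves the whole batch instead of one call
// per message.
$headers = [];
for ($i = $start; $i <= $end; $i++) {
    $headers[] = get_header($mbox, $i);   // hypothetical per-message fetch
}

imap_close($mbox);

header('Content-Type: application/json');
echo json_encode($headers);
```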
It sounds as though you need to process as many e-mails as have been received with each call. At that point, you can return data for all of them together and parse it out on the client side. However, that process cannot go on forever, and the server cannot initiate the return of additional data after the HTTP request has been responded to, so you will have to make subsequent calls to process more e-mails later.
The server-side PHP script can be configured to send the output as soon as it's generated. You basically need to disable all functionality that can cause buffering, such as output_buffering, output_handler, HTTP compression, and intermediate proxies.
The difficult part is that your JavaScript library needs to be able to handle partial input. That is to say, you need access to the downloaded data as soon as it's received. I believe it's technically possible, but some popular libraries like jQuery only let you read the data once the transfer is complete.
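The server side of this can be sketched as follows (assuming no intermediate proxy or compression re-buffers the output, and that the client can consume partial responses):

```php
<?php
// Sketch: push partial output to the client as it is generated.
@ini_set('zlib.output_compression', 'off');
while (ob_get_level() > 0) {
    ob_end_flush();           // drop PHP's own output buffers
}
ob_implicit_flush(true);

foreach (range(1, 10) as $i) {
    echo "chunk $i\n";        // each line reaches the client
    flush();                  // as soon as it is flushed
    sleep(1);                 // stands in for real work
}
```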