Apache not processing multiple requests to the same PHP script in parallel - php

I created a PHP script to send a file, but before sending the file it needs to check some conditions; one condition is a maximum access limit: only 5 file downloads can run at a time.
I wrote the PHP, but Apache processes one request at a time, not all of them at the same time.
E.g. if I make 3 requests and put sleep(3) in the PHP file, the first request takes 3 seconds, the second 6 seconds, and the third 9 seconds.
I don't understand much about PHP and Apache.
Can anyone help me?

If you use sessions, the session is locked per request, so the second, third, etc. request must wait until the first one finishes.
If you expect another request while a long process is running with the same session, you should call
session_write_close()
http://php.net/manual/en/function.session-write-close.php
explicitly. But only if you don't want to write to the session later in the process.
Edit:
If you want to reopen the session later, you can call
session_start() http://hu.php.net/manual/en/function.session-start.php
(before any output).
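Putting that together, a minimal sketch of the pattern (assuming a hypothetical download.php; the session keys here are made up for illustration):
<?php
// download.php - sketch only: read from the session, release the lock, then do the slow work.
session_start();

// Read whatever you need from the session while the lock is held.
$user = isset($_SESSION['user']) ? $_SESSION['user'] : null;

// Release the session lock so other requests from the same browser can run in parallel.
session_write_close();

// The slow part (checking the 5-downloads limit, sending the file, sleep(3), ...).
sleep(3);

// If you need to write to the session afterwards, reopen it (before any output).
session_start();
$_SESSION['last_download'] = time();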

Related

Stopping requests server-side

I've got a name lookup box that operates by your typical ajax requests. Here's the general flow of the Javascript that fires every time a letter is pressed:
If ajax request already open, then abort it.
If timeout already created, destroy it.
Set new timeout to run the following in half a second:
Send string to 'nameLookup.php' via ajax
Wait for response
Display results
The issue is that nameLookup.php is very resource heavy. In some cases up to 10,000 names are being pulled from an SQL database, decrypted, and compared against the string. Under normal circumstances requests can take anywhere from 5 to 60 seconds to return.
I completely understand that when you abort a request on the client side the server is still working on things and sends back the result. That it's just the client side that knows to ignore the response. But the server is getting so hung up on working on all of these requests.
So if you do:
Request 1
Abort Request 1
Request 2
Abort Request 2
Request 3
Wait for response from Request 3
The server is either not even working on Request 3 until it's finished with 1 and 2... or it's just so hung up on working on Request 1 and 2 that Request 3 is taking an extra long amount of time.
I need to know how to tell the server to stop working on Request 1 and 2 so I can free up resources for it to work on Request 3.
I'm using Javascript & jQuery on the client side. PHP/Apache and SQL on the server side.
Store a boolean value in the DB in a table, or in the session.
Have your resource-intensive script periodically check that value to see whether it should continue or not. If the DB says to stop, your script cancels itself (by calling return; in the current function, for example), as sketched after the limitations below.
When you want to cancel, instead of calling abort();, make an AJAX request that sets that value to false.
The next time the script checks that value, it will see that it has to stop.
Potential limitations:
1. Your script does not have a way of periodically checking the DB.
2. Based on how often the script checks the DB, it might take a few seconds to effectively kill the script.
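A rough sketch of that flag-checking pattern (the job_flags table, its keep_running column, the jobId parameter, and the DB credentials are all made up for illustration; the "cancel" AJAX call would set keep_running = 0 for the job):
<?php
// nameLookup.php - sketch only.
session_start();
session_write_close();                    // don't hold the session lock during the long loop

$pdo   = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$jobId = $_GET['jobId'];
$names = $pdo->query('SELECT id, encrypted_name FROM people')->fetchAll();

foreach ($names as $i => $row) {
    // ... decrypt $row['encrypted_name'] and compare it with the search string ...

    if ($i % 100 === 0) {                 // check the flag every 100 rows
        $stmt = $pdo->prepare('SELECT keep_running FROM job_flags WHERE job_id = ?');
        $stmt->execute(array($jobId));
        if (!$stmt->fetchColumn()) {
            exit;                         // the client cancelled; stop the expensive work
        }
    }
}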
I think there is something missing from this question. What are the triggers for doing the requests? You might be trying to solve the wrong problem.
Let me elaborate on that. If your lookup box is actually doing autocompletion of some kind and is doing a new search every time the user presses a key, then you are going to have the issue you describe.
The solution in that case is not killing all the processes. The solution lies in not starting them. So, you might make some decisions, like not trying to search if there is only one character to search with - let's say we require three. We might then say we want to wait until we can be reasonably sure the user has finished typing before sending off the request. Let's say we wait 1 second.
Now, someone looking for all the Pauls in your list of names will send off one search when they type 'pau' and pause for 1 second, instead of three searches for 'p', then 'pa', then 'pau'... so no need to kill anything.
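On the server side, a tiny guard in nameLookup.php (a sketch; it assumes the search string arrives as a q parameter) can also refuse to start the heavy lookup for very short strings:
<?php
// Refuse to run the expensive lookup for strings shorter than 3 characters.
$q = isset($_GET['q']) ? trim($_GET['q']) : '';
if (strlen($q) < 3) {
    header('Content-Type: application/json');
    echo json_encode(array());   // empty result set; nothing expensive was run
    exit;
}
// ... otherwise run the real lookup ...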
I've come up with an awesome solution that I've tested and it's working beautifully. It's just a few lines of PHP code to put in whatever files are being resource intensive.
This solution utilizes the Process Identifier (PID) from the server. We can use two PHP functions: posix_getpid() to get the current PID and posix_kill() to kill another PID. This also assumes that you have already called session_start() somewhere else.
Here's the code:
//if any existing PIDs to kill, go through each
if ($_SESSION['pid']) {
    foreach ($_SESSION['pid'] as $i => $pid) {
        //if posix_kill returns true, unset this PID from the session so we don't waste time killing it again
        if (posix_kill($pid, 0)) {
            unset($_SESSION['pid'][$i]);
        }
    }
}
//now that all others are killed, we can store the current PID in the session
$_SESSION['pid'][] = posix_getpid();
//close the session now, otherwise the PID we just added won't actually be saved to the session until the process ends
session_write_close();
A couple of things to note:
posix_kill takes two arguments. The first is the PID, and the second is supposed to be one of the signal constants from this list. Nothing there made any sense to me; other people seemed to have success just using 0, and when I use 0 it returns true. So whatever works!
calling session_write_close() before the resource intensive things start happening is crucial. Otherwise the new PID that has been saved to the session won't ACTUALLY be saved to the session until all of the page's processing is done. Which means the next process won't know to cancel the one that's still going on and taking forever.

How should I make a long PHP request via AJAX, periodically check for status updates, and close the script if the request cancels?

Part of the PHP web app I'm developing needs to do the following:
Make an AJAX request to a PHP script, which could potentially take from one second to one hour, and display the output on the page when finished.
Periodically update a loading bar on the web page, defined by a status variable in the long running PHP script.
Allow the long running PHP script to detect if the AJAX request is cancelled, so it can shut down properly and in a timely fashion.
My current solution:
client.php: Creates an AJAX request to request.php, followed by one request per second to status.php until the initial request is complete. Generates and passes along a unique identifier (uid) in case multiple instances of the app are running.
request.php: Each time progress is made, saves the current progress percentage to $_SESSION["progressBar"][uid]. (It must run session_start() and session_write_close() each time.) When finished, returns the data that client.php needs.
status.php: Runs session_start(), returns $_SESSION["progressBar"][uid], and runs session_write_close().
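For illustration, a rough sketch of the progress-writing side of request.php under those assumptions (the loop and the item count are placeholders):
<?php
// request.php - sketch only: report progress through the session without holding the lock.
$uid   = $_GET['uid'];
$items = range(1, 100);                   // stand-in for the real units of work

foreach ($items as $i => $item) {
    // ... do one chunk of the real work here ...

    // Briefly reopen the session, record progress, and release the lock again
    // so status.php can read it while this request keeps running.
    session_start();
    $_SESSION['progressBar'][$uid] = (int) (100 * ($i + 1) / count($items));
    session_write_close();
}

echo json_encode(array('done' => true));  // the data client.php is waiting for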
Where it falls short:
My solution fulfills my first two requirements. For the third, I would like to use connection_aborted() in request.php to know if the request is cancelled. BUT, the docs say:
PHP will not detect that the user has aborted the connection until an attempt is made to send information to the client. Simply using an echo statement does not guarantee that information is sent, see flush().
I could simply give meaningless output, but PHP must send a cookie every time I call session_start(). I want to use the same session, BUT the docs say:
When using session cookies, specifying an id for session_id() will always send a new cookie when session_start() is called, regardless of if the current session id is identical to the one being set.
My ideas for solutions, none of which I'm happy with:
A status database, or writing to temp files, or a task management system. This just seems more complicated than what I need!
A custom session handler. This is basically the same as the above solution.
Stream both progress data and result data in one request. This solves everything, but I would essentially be re-implementing AJAX. That can't be right.
Please tell me I'm missing something! Why doesn't PHP know immediately when a connection terminates? Why must PHP resend the cookie, even when it is exactly the same? An answer to any of these questions will be a big help!
My sincere thanks.
Why not set a second session variable, consisting of the unique request identifier and an access timestamp, from status.php?
If the client is closed, it stops requesting updates from status.php and the session variable stops being updated, which triggers a clean shutdown in request.php if the variable hasn't been updated within a certain amount of time.
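A sketch of that idea (the lastSeen key and the 10-second timeout are arbitrary choices for illustration):
<?php
// status.php - besides returning progress, record when the client last polled.
session_start();
$uid = $_GET['uid'];
$_SESSION['lastSeen'][$uid] = time();
$progress = isset($_SESSION['progressBar'][$uid]) ? $_SESSION['progressBar'][$uid] : 0;
session_write_close();
echo json_encode(array('progress' => $progress));

Then, inside request.php's work loop:
// Give up cleanly if the client has stopped polling status.php.
session_start();
$lastSeen = isset($_SESSION['lastSeen'][$uid]) ? $_SESSION['lastSeen'][$uid] : time();
session_write_close();
if (time() - $lastSeen > 10) {
    exit;   // no poll for 10 seconds: assume the AJAX request was cancelled
}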

Can't have several threads per session

I am building a web app and have implemented long-polling (and a command queue in my DB) so my server can send commands to my client asynchronously, etc. The commands are encoded as JSON and sent over AJAX calls in the client-to-server direction, and via long-polling in the server-to-client direction.
Everything was working just fine, until I included my "Authentication module" in the ajax.php file. This module wraps the session stuff and calls session_start().
The problem is that my long-polling routine can wait up to 21 seconds before coming back to the client. During this time, the server won't run anything else from the same session; it is instead executed right after the long-polling AJAX call returns.
I understand there's probably a restriction of only 1 thread per session at a time, and that the requests are queued up.
Now here's the question: what is the best way to address this? Is there a setting to allow several threads per session (3 would be fine in my case)? Or should I just tell the client what its session ID is (I have a sessions table in my DB to track which user is connected to which session(s))? The client could then send it along with any AJAX calls so the authentication module could be bypassed.
With the latter option, I'm afraid it opens up a bunch of security problems because of possible session spoofing. I would need to send a "random string" to each session to make sure you can't spoof too easily, but even then, it's not perfect...
Thanks for your answers :)
Nicolas Gauthier
It's a well known issue/fact that PHP locks session files for the duration of their usage in order to prevent race conditions.
If you take a look at the PHP source code, (ext/session/mod_files.c) you can see that the ps_files_open function locks the session file, and ps_files_close unlocks it.
If you call session_start() right at the beginning of your long-running script and do not explicitly close the session file, it will stay locked until the script terminates, at which point PHP releases all file locks during script shutdown.
While you are not using the session, you should call session_write_close to flush the session data to disk, and release the lock so that your other "threads" can read the data.
I'm sure you can imagine what would happen if the file was not locked.
T1: Open Session
T2: Open Session
...
T2: Write Data
T1: Write Data
The data written by thread 2 will be completely overwritten by thread 1, and at the same time any data that thread 1 wanted to write out was not available to thread 2.
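Applied to the long-polling endpoint from the question, a minimal sketch might look like this (the command_queue table, its columns, and the DB credentials are assumptions, not something from the original code):
<?php
// poll.php - sketch only: authenticate, then release the session lock before waiting.
session_start();
$userId = $_SESSION['userId'];      // read everything you need from the session first
session_write_close();              // other requests from this client can now run

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$deadline = time() + 21;            // long-poll for up to 21 seconds
do {
    $stmt = $pdo->prepare('SELECT id, command FROM command_queue WHERE user_id = ? AND sent = 0');
    $stmt->execute(array($userId));
    $commands = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if ($commands) {
        echo json_encode($commands);
        exit;
    }
    sleep(1);
} while (time() < $deadline);

echo json_encode(array());          // nothing arrived; the client polls again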

Multiple simultaneous running ajax requests in ExtJS 4

Problem
I have a long-running import job which I start with an AJAX request; it could take some minutes until the request is finished. While this first AJAX request is running, I want to check the server to see how far along the import is; this second request will be made every 2 seconds or so.
When I use the Ext.Ajax method the requests seem to be chained - the first AJAX request (import) runs until it is finished, and only then is the second one (import update) fired.
I saw that Ext.Ajax is a singleton, so maybe that's the reason. So I tried to create my own Connection objects with Ext.create('Ext.data.Connection'), but it doesn't work.
My current request chain is:
first request - start
first request - end
second request - start
second request - end
But it should be:
first request - start
second request - start
second request - end
...maybe more second requests
first request - end
Question
The browser should be able to handle multiple requests, so there must be a limitation inside ExtJS, but I didn't find it?
Update 2011-10-16
Answer
The problem wasn't ExtJS - sorry! It was PHP: my first script works with the session, and the second script tried to access the session as well. And because PHP sessions are file based, the session file was locked by the first request's script, and the second request's script had to wait until the first released the session lock.
I solved this with this little piece of code, which I added to my import process (the first script) after every x rows:
$id = session_id();
session_write_close();
sleep(1);
session_id($id);
session_start();
So it pauses and reloads the session, and the other script is able to hook in and get the session information.
Singleton or non-singleton doesn't even change the way Ext.Ajax works. I think this could be due to the coding (did you wait for the calls to finish?)
AFAIK, I have never had this problem before when making multiple calls. The only thing that can hog the calls is the server (PHP), which isn't processing them in parallel here and causes delays, generating a pattern like this:
Call 1 - start
Call 2 - start
Call 1 get processed in the server and Call 2 get queued up
Call 1 - finished
Call 2 get processed in server
Call 2 - finished
It could be disastrous if Call 1 requires more time to process than Call 2.
EDIT:
I have written this little demo just for you to get a feel for how it works. Check it out :) It took me half an hour, lol!

php, multithreading and other doubts

Morning,
I have some doubts about the way PHP works. I can't find the answer anywhere in books, so I thought I'd hit the Stack ;)
so here it goes:
Let's assume we have one single server with PHP + Apache installed. Here are my beliefs:
1 - PHP can handle only one request at a time. It doesn't matter whether Apache can handle more than one thread at a time, because ultimately the invoked PHP interpreter is single-threaded.
2 - From belief 1 it follows that if the server receives 4 calls at the very same time, these calls are queued up and executed one at a time. Whoever makes the last request gets the response last.
3 - From 1 and 2 it follows that if I cron-call a URL corresponding to a script that does some heavy-lifting/time-consuming stuff, I slow down the server until the moment the script returns.
What's true? What's false?
cheers
My crystal ball suggests that you are using PHP sessions and have simultaneous requests (either iframes or AJAX) getting queued. The problem is that the default session handler uses files and session_start() locks the data file. You should read your session data quickly and then call session_write_close() to release the file.
I see no reason why PHP would not be able to handle multiple requests at the same time. That said, it may be semi-true for handling the requests of a single client, depending on the type of script.
Many scripts use sessions. When session_start() is called, the session is opened and locked. When execution of the script ends, the session is closed and unlocked (this can also be done manually). When there are multiple requests for the same session, the first request opens and locks the session, and the second request has to wait until the session is unlocked. This might give the impression that multiple PHP scripts cannot be executed at the same time, but that's (partly) true only for requests that use the same session (in other words, requests from the same browser). Requests from two clients (browsers) may be processed in parallel as long as they don't use resources (files, DB tables etc.) that are being locked/unlocked in other requests.
