I'm trying to use AJAX to make multiple simultaneous requests to a PHP script, but it only seems to handle one at a time: I cannot make the next call until the previous one is finished. What do I have to do to make them all run at the same time? I'm using Apache (XAMPP) on Windows. I've also tested this on my Unix server, and the same thing happens there.
In theory, there is nothing preventing one PHP script from being executed several times in parallel -- else, a lot of websites would have big problems ;-)
So there is probably some locking mechanism in your situation that prevents this...
If your script uses sessions, and those are file-based (which is the default), the sessions could cause that kind of problem: with the default session handler, it is not possible for several scripts to access the same session data (i.e. the data that corresponds to a given user) at the same time. That is to prevent one script from overwriting the data of another, and it should probably not be disabled.
So, if your script uses sessions: would it be OK for you to stop using them?
If not, you should try to close the session as soon as you no longer need it, to release the lock on the file used to store it.
Here's a quote from the manual page of session_write_close about that:
Session data is usually stored after your script terminated without the need to call session_write_close(), but as session data is locked to prevent concurrent writes only one script may operate on a session at any time. When using framesets together with sessions you will experience the frames loading one by one due to this locking. You can reduce the time needed to load all the frames by ending the session as soon as all changes to session variables are done.
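A minimal sketch of that advice, assuming a hypothetical long-running script; the variable names and the sleep() stand-in for the slow work are illustrative:

```php
<?php
// Release the session lock before doing slow work so that other
// requests from the same user/session are not blocked.
session_start();

// Copy whatever session data this script needs into local variables.
$userId = $_SESSION['user_id'] ?? null;

// Make any session changes *before* closing, then release the lock.
$_SESSION['last_seen'] = time();
session_write_close();

// From here on, other requests for the same session proceed freely.
sleep(1); // stand-in for the long-running work
```

After session_write_close(), $_SESSION is still readable, but changes are no longer persisted automatically.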
Related
On our PHP website we have noticed that if I request pages in two separate tabs, the second tab never starts loading until the first tab finishes its PHP processing (we use the LiteSpeed webserver). It's a cloud webhosting that should handle a significant number of visitors. We observed the same behavior on our previous hosting with Apache.
Could the issue be in webserver configuration? Or could it be something in our application blocking the processing?
If your application uses PHP sessions (intentionally or not), this will generally be the behavior.
When an application calls session_start(), it generates a session for you as a user and takes a file lock on that session until the script finishes executing; then the lock is released and the next script waiting for the same session can proceed.
If you're not going to write any data into the session, it's beneficial to call session_write_close() early on, since this frees the lock and lets other requests (your other tab) be processed.
The locking happens because you obviously don't want two processes writing to the same file at once, possibly overwriting the data one of them wrote to the session.
It could, of course, also be something else in your application causing this, but you'd have to actually debug the code to figure it out. You could also try strace on the command line to gather more information.
PHP session locking is explained in depth in https://ma.ttias.be/php-session-locking-prevent-sessions-blocking-in-requests/ - it simply goes into a lot more detail than I did above.
I've already found Own SuperGlobal Variable in PHP?
It says there is no way to define or use a "server variable" that would be the same across different users/sessions. But a few years have passed since then, so maybe something has changed? Is such a thing possible now?
Basically, I want to run a PHP script triggered by a user event, but no more often than, e.g., once per 30 minutes. So my first thought was to keep a server-side variable (visible to all sessions/clients), and when a user triggers the event (e.g. opens some page), check whether the script has run in the last 30 minutes; if it has, just open the page, and if not, purge outdated things before opening the page.
Yes, I understand I can do a workaround with MySQL etc... but why do a workaround if PHP supported what I need?
Thank you in advance!
A regular PHP variable cannot be persisted beyond the current execution of the script. Once a script ends, all its variables are gone. That's the basic nature of PHP; in fact it's the basic nature of all computer programs, it's just that PHP scripts typically terminate very quickly. To persist data beyond a single script invocation, you have several options for where to store that data:
a database
a file
shared memory (though this may be purged at any time)
a memory cache like memcached (basically an in-memory database server)
run a persistent PHP script and communicate with it via sockets (basically a memcache)
don't terminate your script, write it as a daemon
None of this has changed recently.
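For the "run at most once per 30 minutes" use case, the file option from the list above is often enough: store the last run time as the mtime of a lock file shared by all sessions. A minimal sketch; the file path, interval, and the purge routine are all illustrative, and the check-then-touch sequence has a small race window (use flock() if you need a strict guarantee):

```php
<?php
// Returns true if the task may run now (i.e. the lock file is absent
// or older than the interval), and records the current run by touching
// the file. Shared across all sessions because it is server-side state.
function shouldRun(string $lockFile, int $intervalSeconds): bool
{
    clearstatcache(true, $lockFile);
    if (file_exists($lockFile) && time() - filemtime($lockFile) < $intervalSeconds) {
        return false; // ran recently, skip
    }
    touch($lockFile); // record this run's timestamp
    return true;
}

$lockFile = sys_get_temp_dir() . '/purge.lock';
if (shouldRun($lockFile, 30 * 60)) {
    // purgeOutdatedThings();  // hypothetical cleanup routine
}
```

This avoids a database entirely, at the cost of the lock file living on one server; on a multi-server setup you would fall back to the database or memcached options above.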
For some reason, only one PHP page of my website can be loaded at a time in the browser.
For example, if I open "PageA.php" then open "PageB.php", "PageB.php" does not return any result (shows loading..) until "PageA.php" finishes loading.
I am using Apache Httpd on CentOS. Can someone help me please?
Thanks!!
You're probably using file-based PHP sessions, which by default lock the session file when a particular script instance is using the session. This will cause any other hits on scripts that use the same session to be locked out until the first script is completed.
If you need to have long-running scripts and/or allow parallel usage of the pages, you'll have to explicitly release the session lock in the scripts with session_write_close().
If you do so, $_SESSION will still be available for reading and writing, but changes will no longer be auto-saved when the script exits. You can, however, call session_start() again later (assuming no output has been sent) to "resume" the session and re-enable the auto-save behavior.
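A sketch of that close-then-resume pattern, with illustrative variable names and a trivial stand-in for the long-running work:

```php
<?php
// Open the session, copy what we need, and release the lock early.
session_start();
$jobId = $_SESSION['job_id'] ?? 0;
session_write_close(); // lock released; other pages can now load

$result = $jobId + 1; // stand-in for the long-running work

// Resume the session to save the result. This only works if no
// output has been sent yet (or output buffering is enabled).
session_start();
$_SESSION['job_result'] = $result;
session_write_close();
```

Note that the second session_start() briefly re-acquires the lock, so keep the resumed section short.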
I have an AJAX script that takes about 10 minutes to do its thing. I would like to be able to tell the user 'Hey listen, the task is being completed, we'll let you know how it turns out', but it won't let the user run any other scripts on the server until that one completes (I believe this is a consequence of PHP being single threaded, but I'm not sure). Is there a way to assign that AJAX script to a separate PHP or Apache process, so that the user can continue to click around in the application without having to wait for the task to finish?
You can use a database or files to implement a lock mechanism that prevents the task from running multiple times simultaneously. Then you just need to spawn the PHP process using the command nohup (no hang up); for more details see this article: https://nsaunders.wordpress.com/2007/01/12/running-a-background-process-in-php/ or this question: nohup: run PHP process in background.
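A sketch of the spawning part, assuming Unix-like hosting; the PHP binary path and worker.php are hypothetical names, and buildBackgroundCommand() just assembles the shell command so the arguments are safely escaped:

```php
<?php
// Build a shell command that runs a PHP script in the background,
// detached from the web request (nohup + output redirection + "&").
function buildBackgroundCommand(string $phpBinary, string $script): string
{
    return sprintf(
        'nohup %s %s > /dev/null 2>&1 &',
        escapeshellarg($phpBinary),
        escapeshellarg($script)
    );
}

$cmd = buildBackgroundCommand('/usr/bin/php', '/var/www/worker.php');
// exec($cmd); // fire and forget: returns immediately, worker keeps running
```

The redirection to /dev/null matters: without it, exec() can block waiting for the child's output.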
I searched for hours; in the end, the solution was very easy for me using Cron Jobs. In cPanel you can go to Advanced -> Cron Jobs and schedule a task there using a PHP script on the command line.
An example command that executes a PHP script:
/usr/bin/wget http://www.example.com/miscript.php
or better:
php /home/USER/public_html/miscript.php
Are you using PHP sessions? If so, then a likely cause is that the long-running script keeps the session locked until it finishes. Any other request trying to access that same session will have to wait until the first one is done (usually it'll exceed request timeouts).
To fix that you'll need session_write_close():
Session data is usually stored after your script terminated without the need to call session_write_close(), but as session data is locked to prevent concurrent writes only one script may operate on a session at any time. When using framesets together with sessions you will experience the frames loading one by one due to this locking. You can reduce the time needed to load all the frames by ending the session as soon as all changes to session variables are done.
So simply call that function right around where you tell the user "hey, you've gotta wait". If you need (read) access to session variables later in that script, consider copying them into local variables, then close the session immediately afterwards before moving on to whatever's taking a long time. If you need write access, you could try re-running session_start() at the end, but if the session is currently locked elsewhere it'll hit the same blocking problem. You could work around that by, e.g., storing something in the database from the background script and fetching it from the regular user session.
On executing two very simple AJAX POST requests (successively), the Apache server always seems to respond in the same order as they were requested, although the second request takes significantly less time to process than the first.
The time it takes the server to process Request1 is 30 seconds.
The time it takes the server to process Request2 is 10 seconds.
var deferred1 = dojo.xhrPost(xhrArgs1);
var deferred2 = dojo.xhrPost(xhrArgs2);
I expect Apache to achieve some "parallelization" on my dual core machine, which is obviously not happening.
When I execute the requests at the same time in separate browsers, it works OK: Request2 is returned first.
Facts:
httpd.conf has: ThreadsPerChild 50, MaxRequestsPerChild 50
PHP version : 5.2.5
Apache's access log states that both client requests are received at about the same time, which is as expected.
The PHP code on the server side is something as simple as sleep(30)/sleep(10).
Any idea about why I don't get the "parallelization" when run from the same browser?
Thanks
When your two requests are sent from the same browser, they both share the same session.
When sessions are stored in files (that's the default), a locking mechanism is used to ensure that two scripts do not use the same session at the same time; allowing that could result in the session data of the first script being overwritten by the second one.
That's why your second script doesn't start before the first one is finished: it's waiting for the lock (created by the first script) on the session data to be released.
For more information, take a look at the manual page of session_write_close(), which might be a solution to your problem: close the session before the sleep() (quoting):
Session data is usually stored after your script terminated without the need to call session_write_close(), but as session data is locked to prevent concurrent writes only one script may operate on a session at any time. When using framesets together with sessions you will experience the frames loading one by one due to this locking. You can reduce the time needed to load all the frames by ending the session as soon as all changes to session variables are done.
Browsers typically have a limit of two connections to the same site (although you may increase that limit in some browsers). Some browsers will keep one connection for downloading things like images etc. and another connection for XHR. Which means that your two XHR calls actually go out on the same connection, one after the other.
Your browser will return immediately after each XHR call because they are async, but internally it may just batch up the requests.
When you run on two different browsers, obviously they each have the two connections, so the two XHR requests go out in different connections. No problem here.
Now it depends on the browser. If the browser allows you to occupy both connections with XHR calls, then you can get up to two requests running simultaneously. Then it will be up to the server which one to do first.
At any rate, if you try with three (or any number greater than 2) XHR requests simultaneously, you will not get more than 2 executed on the server at the same time on such browsers.