I was reading the CodeIgniter 3 documentation on using sessions with high-traffic AJAX, and it recommends using session_write_close().
My application has functions that write and read directly from $_SESSION, as in the example below.
function setSession($index, $value) {
    $_SESSION[$index] = $value;
}

function getSession($index) {
    if (isset($_SESSION[$index])) {
        return $_SESSION[$index];
    } else {
        return FALSE;
    }
}
From what I understand from the documentation, these two functions should always have session_start() at the beginning and session_write_close() at the end, correct?
That way they avoid locking the session, and other requests complete faster because the session is unlocked.
Another thing I found strange: CodeIgniter already triggers session_start() by default at the beginning of execution, without any intervention from my code.
What will happen to this default session_start() if my functions call session_start() every time something is fetched from the session?
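If I understand it correctly, the wrappers would end up looking something like this (just a sketch of my understanding, assuming PHP's default file-based session handler):

function setSession($index, $value) {
    session_start();       // reacquire the session lock
    $_SESSION[$index] = $value;
    session_write_close(); // release the lock immediately
}

function getSession($index) {
    session_start();       // reacquire the session lock
    $value = isset($_SESSION[$index]) ? $_SESSION[$index] : FALSE;
    session_write_close(); // release the lock immediately
    return $value;
}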
Related
I'm trying to stop the execution of a PHP script at runtime, triggered by the user.
My planned approach is to make an AJAX call to the server, setting a session variable which the first, already-running script checks every now and then to determine whether the process should stop or continue running. Hope that makes sense :)
I already found some examples of people doing the same thing here, or with files instead of sessions here.
Now my problem is that I would like to use the session for my approach instead of the solution with temporary files, and the mentioned session approach doesn't seem to work for me in Laravel 9.
The result I'm looking for:
start first PHP script (runs maybe 30 seconds)
make AJAX call to server & set session variable stop_execution = true
the first PHP script, which is still running, detects the change to stop_execution === true & stops execution.
The behaviour I get:
start first PHP script (runs maybe 30 seconds)
make AJAX call to server & set session variable stop_execution = true
the first PHP script, which is still running, doesn't detect the change to stop_execution === true & runs until it finishes by itself.
the next time I run the first PHP script, it immediately detects that stop_execution === true & stops execution.
My thought on why this is happening is that the session variables don't get refreshed inside the first script after being checked for the first time. Maybe there is a way to force-pull all new changes to the session variables while the first script is running? Has anybody had the same issue with Laravel? It seems like this works with session variables when not using Laravel, so I think it has something to do with how Laravel handles sessions.
I would appreciate any advice 😊
Thanks a lot!
Code:
First script executed at the beginning
private function longRunningFunction()
{
    // check session variable every iteration and exit the while loop if === true
    while ($condition === true && ! session('cancel_computation')) {
        // do stuff ...
    }
    // reset session variable to false after finishing
    session(['cancel_computation' => false]);
    return $result;
}
Second script executed on AJAX call
public function cancelComputationFunction()
{
    // set session variable to be true
    session(['cancel_computation' => true]);
}
I would not advise you to use sessions for this.
They are initialized when the script starts, and I have never seen anybody re-fetch them within the same script.
Nor am I able to find such functionality when researching online.
What you could do, though, is utilize the Cache facade.
It is very well suited for what you want and is very lightweight, no matter which driver you choose to use under the hood in Laravel.
So instead of:
public function cancelComputationFunction()
{
    // set session variable to be true
    session(['cancel_computation' => true]);
}
You could do something like this:
public function cancelComputationFunction()
{
    // set cache variable to be true
    Cache::put('cancel_computation_' . session()->getId(), true);
}
And likewise inside the long-running part of the script.
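For the reading side, here is a sketch adapted from your longRunningFunction (Cache::get accepts a default value and Cache::forget removes the entry; both are standard methods on Laravel's Cache facade):

// assumes: use Illuminate\Support\Facades\Cache;
private function longRunningFunction()
{
    $cacheKey = 'cancel_computation_' . session()->getId();

    // check the cache flag every iteration and exit the loop once it is set
    // ($condition and $result are carried over from your original sketch)
    while ($condition === true && ! Cache::get($cacheKey, false)) {
        // do stuff ...
    }

    // clear the flag after finishing
    Cache::forget($cacheKey);

    return $result;
}

Unlike the session, the cache is re-read on every Cache::get call, so the change made by the AJAX request becomes visible while the loop is still running.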
I'm building a chat function using Zend Framework.
In JavaScript, I use AJAX to request http://mydomain.com/chat/pull, whose pullAction looks like this:
public function pullAction() {
    while (true) {
        try {
            $chat = Eezy_Chat::getNewMessage();
            if ($chat) {
                $chat->printMessage();
                break;
            }
            sleep(1); // sleep 1 second between each loop
        } catch (Zend_Db_Adapter_Exception $ex) {
            if ($ex->getCode() == 2006) { // reconnect db if timeout
                $dbAdapter = Zend_Db_Table::getDefaultAdapter();
                $dbAdapter->closeConnection();
                $dbAdapter->getConnection();
            }
        }
    }
}
This action keeps running until another user sends a message.
But while this request is running, I cannot go to any other page on my site. All of them wait for http://mydomain.com/chat/pull to finish its execution.
I have searched for a solution all over Google but still haven't found one.
Thanks for your help.
This sounds like session locking.
When you use sessions stored on the file system, PHP will lock the session file on each request and only release it when that request is through. While the file is locked, any other request wanting to access that file will hang and wait.
Since your chat script loops forever, checking for new messages, the session file stays locked forever too, preventing the same user from accessing other sections of the site that require session access.
A solution is to load all the session data required to fulfill the request into memory, then call Zend_Session::writeClose as soon as possible to release the lock.
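In your pullAction, that could look something like this (a sketch; the 'chat' namespace and userName key are hypothetical placeholders for whatever session data you actually need):

public function pullAction() {
    // read whatever you need from the session into local variables first
    $session  = new Zend_Session_Namespace('chat'); // hypothetical namespace
    $userName = $session->userName;                 // hypothetical key

    // release the session lock so this user's other requests can proceed
    Zend_Session::writeClose();

    // the polling loop can now run as long as needed without blocking anything
    while (true) {
        // ... same loop as before ...
    }
}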
I have a JavaScript function which calls a PHP function through AJAX.
The PHP function has a set_time_limit(0) for its purposes.
Is there any way to stop that function when I want, for example with an HTML button event?
Let me explain the situation better:
I have a PHP file which uses the stream_copy_to_stream($src, $dest) PHP function to retrieve a stream on my local network. The function has to run until I decide: I can stop it at the end of the stream or whenever I want. So I can use a button to start and a button to stop. The problem is the new instance created by the AJAX call; I can't work on it because it is not the instance that is recording, it is a different one. I tried MireSVK's suggestion but it didn't work!
It depends on the function. If it is a while loop checking for a certain condition each iteration, then you could add a condition that is modifiable from outside the script (e.g. make it check for a file, and create/delete that file as required).
It looks like a bad idea, however. Why do you want to do it?
var running = true;

function doSomething(){
    //do something........
}

setInterval(function(){ if (running) { doSomething(); } }, 2000); // this runs doSomething every 2 seconds
On button click, simply set running = false;
Your code looks like:
set_time_limit(0);
while (true) { // infinite loop
    doSomething(); // your code
}
Let's upgrade it
set_time_limit(0);
session_start();
$_SESSION['do_a_loop'] = true;
session_write_close(); // release the session lock so stopit.php can write

function should_i_stop_loop(){
    @session_start(); // re-open the session to read a fresh copy of its data
    if ($_SESSION['do_a_loop'] == false) {
        // let's stop the loop
        exit();
    }
    session_write_close(); // release the lock again
}
while (true) {
    doSomething();
    should_i_stop_loop(); // your new function
}
Create a new file, stopit.php:
session_start();
$_SESSION['do_a_loop'] = false;
All you have to do now is make a request to the stopit.php file (with AJAX or something).
Edit the code according to your needs; that is the point. This is one of many solutions.
Sorry for my English
Sadly this isn't possible (sort of).
Each time you make an AJAX call to a PHP script, the script spawns a new instance of itself. Thus anything you send to it will be sent to a new operation, not the operation you previously started.
There are a number of workarounds.
Use readyState 3 in AJAX to create a non-closing connection to the PHP script; however, that isn't supported cross-browser and probably won't work in IE (not sure about IE 10).
Look into socket programming in PHP, which allows you to create a script with one instance that you can connect to multiple times.
Have PHP check a third party, i.e. have one script running in a loop checking a file or a database, then connect to another script to modify that file or database. The original script can be remotely controlled by what you write to the file/database (see the sketch after this list).
Try another programming language (this is a silly option, but I'm a fan of Node). Node.js does this sort of thing very, very easily.
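For the third option, a minimal file-based sketch might look like this (the flag-file path and the job_id request parameter are hypothetical; the client would send the same job_id with both requests):

// long-running script: check a per-job flag file on every iteration
$stopFile = sys_get_temp_dir() . '/stop_' . basename($_GET['job_id']) . '.flag';

set_time_limit(0);
while (true) {
    doSomething(); // the actual long-running work
    if (file_exists($stopFile)) {
        unlink($stopFile); // clean up the flag file
        exit;              // abort the long-running operation
    }
}

// stop script, called via a second AJAX request: create the flag file
touch(sys_get_temp_dir() . '/stop_' . basename($_GET['job_id']) . '.flag');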
Is there a way to prevent a code block or a function within a script from running more than once, even if I re-execute (or reload) the PHP file?
I mean, can I restrict someone from executing a PHP script more than once? I can't seem to find a way to do this.
Yes, you can use a $_SESSION variable to determine whether the code has been executed. The session variable will stay set until the user closes their browser. If you want to extend it further than that, you can set a cookie. Please see the following links for more details.
Session Variables
Cookies
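For the cookie route, a minimal sketch (the cookie name and the 30-day lifetime are arbitrary choices; note that setcookie must be called before any output is sent):

if (!isset($_COOKIE['completed'])) {
    // ... run the one-time code here ...

    // remember across browser sessions for 30 days
    setcookie('completed', '1', time() + 60 * 60 * 24 * 30, '/');
}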
If you are using sessions, then you can set a flag in the user's session array after the code has executed:
function doSomething(){
    if (empty($_SESSION['completed'])){
        //Do stuff here if it has not been executed.
    }
    $_SESSION['completed'] = TRUE;
}
You should also check the session variable to see if the task has been executed previously. This assumes that the user accepts a session cookie.
I have an app that does that.
What we did was create a table in the database called version and store a version number in there. When the script is run, it compares the version number in the database with the one in the PHP script, performs whatever it needs to "upgrade" to the new version, and then updates the version number in the database.
Of course, if the version table does not exist, the code creates it and marks it as storing version zero.
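A rough sketch of that idea (the table name, column, and PDO credentials are all placeholders):

define('SCRIPT_VERSION', 2); // bump this when the script changes

$pdo = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'pass');
$pdo->exec('CREATE TABLE IF NOT EXISTS version (number INT NOT NULL)');

$row = $pdo->query('SELECT number FROM version')->fetchColumn();
$current = ($row === false) ? 0 : (int) $row; // no row yet means version zero

if ($current < SCRIPT_VERSION) {
    // ... perform the one-time upgrade work here ...

    // record the new version so this block never runs again
    $pdo->exec('DELETE FROM version');
    $pdo->exec('INSERT INTO version (number) VALUES (' . SCRIPT_VERSION . ')');
}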
Just put a counter in the function. If the counter is greater than 0, then don't do anything. The counter variable should be static so it is "remembered" across multiple calls.
function sample() {
    static $call_counter = 0;
    if ($call_counter > 0) {
        return;
    }
    ...
    $call_counter++;
}
As for making sure a file is only executed once, just use "include_once()" instead of "include()".
I have several time consuming database queries to run. Each has been built to be triggered from an option chosen on a web page. I thought I was being quite cunning by firing off the work via several AJAX requests.
I presumed that multiple requests would be split over multiple processes/threads, meaning the work would be completed relatively quickly for the user.
However, the requests seem to be processed serially, so the user feels no speed benefit.
Worse still, AJAX requests to update the page also wait in line, so they fail to respond until the previous requests have all completed.
I have read that this may be caused by the PHP sessions being locked.
What is the usual approach for this kind of issue?
Is there a way to force AJAX requests to work asynchronously?
Can I stop PHP from locking the sessions?
Should I use a separate process via cron to fire off background work?
Thanks!
NB This project has been built using the symfony framework.
AJAX uses jQuery
// Get the content
$.get('/ajax/itemInformation/slug/'+slug, function(data) {
    $('#modal-more-information').html(data);
});
If you are using sessions at all during any of the given AJAX requests, they will effectively execute serially, in order of request. This is due to locking of the session data file at the operating system level. The key to getting those requests to be asynchronous is to close (or never start) the session as quickly as possible.
You can use session_write_close (docs) to close the session as soon as possible. I like to use a couple of helper functions for this; the set_session_var function below will open the session, write the var, then close the session, in and out as quickly as possible. When the page loads, you can call session_start to get the $_SESSION variable populated, then immediately call session_write_close. From then on out, only use the set function below to write.
The get function is completely optional, since you could simply refer to the $_SESSION global, but I like to use this because it provides for a default value and I can have one less ternary in the main body of the code.
function get_session_var($key=false, $default=null) {
    if ($key == false || strlen($key) < 1)
        return false;
    if (isset($_SESSION[$key]))
        $ret = $_SESSION[$key];
    else
        $ret = $default;
    return $ret;
}
function set_session_var($key=false, $value=null) {
    if ($key == false || strlen($key) < 1)
        return false;
    session_start();
    if ($value === null)
        unset($_SESSION[$key]);
    else
        $_SESSION[$key] = $value;
    session_write_close();
}
Be aware that there is a whole new set of considerations once the AJAX requests are truly asynchronous. Now you have to watch out for race conditions (be wary of one request setting a variable that can impact another request), because with the sessions closed, one request's changes to $_SESSION will not be visible to another request until it rebuilds the values. You can help avoid this by "rebuilding" the $_SESSION variable immediately before a critical use:
function rebuild_session() {
    session_start();
    session_write_close();
}
... but this is still susceptible to a race condition.