Prevent PHP script from being flooded

I want to prevent my script from being flooded: if a user hits F5, the script executes on every refresh.
I want to prevent this and allow one script execution per 2 seconds. Is there any solution for that?

You can use Memcache to do this.
Simple demo script:
$memcache = new Memcache();
$memcache->connect('localhost', 11211);
$runtime = $memcache->get('floodControl');
if ((time() - $runtime) < 2) {
    die("Die! Die! Die!");
} else {
    echo "Welcome";
    $memcache->set("floodControl", time());
}
This is just sample code. There are other things to consider as well, such as:
A. Better IP address detection (proxies, Tor)
B. The current action
C. Maximum executions per minute, etc.
D. Banning the user after repeated flooding
EDIT 1 - Improved Version
Usage
$flood = new FloodDetection();
$flood->check();
echo "Welcome";
Class
class FloodDetection {
    const HOST = "localhost";
    const PORT = 11211;
    private $memcache;
    private $ipAddress;
    private $timeLimitUser = array(
        "DEFAULT" => 2,
        "CHAT" => 3,
        "LOGIN" => 4
    );
    private $timeLimitProcess = array(
        "DEFAULT" => 0.1,
        "CHAT" => 1.5,
        "LOGIN" => 0.1
    );

    function __construct() {
        $this->memcache = new Memcache();
        $this->memcache->connect(self::HOST, self::PORT);
    }

    function addUserlimit($key, $time) {
        $this->timeLimitUser[$key] = $time;
    }

    function addProcesslimit($key, $time) {
        $this->timeLimitProcess[$key] = $time;
    }

    public function quickIP() {
        // Note: these headers are client-controlled and easy to spoof
        return (empty($_SERVER['HTTP_CLIENT_IP'])
            ? (empty($_SERVER['HTTP_X_FORWARDED_FOR'])
                ? $_SERVER['REMOTE_ADDR']
                : $_SERVER['HTTP_X_FORWARDED_FOR'])
            : $_SERVER['HTTP_CLIENT_IP']);
    }

    public function check($action = "DEFAULT") {
        $ip = $this->quickIP();
        $ipKey = "flood" . $action . sha1($ip);

        $runtime = $this->memcache->get('floodControl');
        $iptime = $this->memcache->get($ipKey);

        $limitUser = isset($this->timeLimitUser[$action]) ? $this->timeLimitUser[$action] : $this->timeLimitUser['DEFAULT'];
        $limitProcess = isset($this->timeLimitProcess[$action]) ? $this->timeLimitProcess[$action] : $this->timeLimitProcess['DEFAULT'];

        // Limit per user/IP
        if ((microtime(true) - $iptime) < $limitUser) {
            print("Die! Die! Die! $ip");
            exit();
        }

        // Limit all requests
        if ((microtime(true) - $runtime) < $limitProcess) {
            print("All of you Die! Die! Die! $ip");
            exit();
        }

        $this->memcache->set("floodControl", microtime(true));
        $this->memcache->set($ipKey, microtime(true));
    }
}
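Note that check() does a get() and then a set(), which is not atomic: two simultaneous requests can both pass the test before either writes. A minimal sketch of an atomic variant using Memcache::add(), which only succeeds when the key does not exist yet (the key name and the 2-second window are illustrative assumptions):
$memcache = new Memcache();
$memcache->connect('localhost', 11211);
$key = 'floodControl_' . sha1($_SERVER['REMOTE_ADDR']);
// add() fails if the key already exists; the entry expires after 2 seconds
if (!$memcache->add($key, 1, 0, 2)) {
    die('Too many requests');
}
echo 'Welcome';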

Store the last execution time of your script in a database or a file.
Read from that file/database and compare it to the current time.
If the difference is under 2 seconds, terminate the script.
Else, continue normally.
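A minimal sketch of the database variant, assuming PDO and a single-row table script_lock(last_run INT) (both the DSN and the table are illustrative assumptions):
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
// read the last run time recorded by the previous request
$last = (int) $pdo->query('SELECT last_run FROM script_lock')->fetchColumn();
if (time() - $last < 2) {
    exit('Try again in a moment.');
}
// record this run and continue normally
$pdo->prepare('UPDATE script_lock SET last_run = ?')->execute(array(time()));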

You can either use cookies (which can be disabled, so not a very good idea), or you can store the user's IP address in the database: if there are more than X tries from the same IP address, don't execute the code. It's just an if/else statement; you will need a table with the IP address, the time of the request, and the number of tries.
If you do not want to use a database, you can use the following code:
$file = "file.txt";
$now  = time();
$last = (int) @file_get_contents($file); // 0 if the file does not exist yet
if ($now - $last > 60) {
    // your code here
    file_put_contents($file, $now, LOCK_EX); // record this run
} else {
    echo "Try again later";
}
But in this case it won't be per visitor but rather global for all of them (so if user A comes and executes the script, user B won't be able to execute it until 60 seconds have passed).

The best way would be to store the time server-side; if you leave the information on the client side, it is easy to bypass.
I would, for example, save the timestamp in a table that is updated and checked on every run to guard against spamming of your script. That also makes the tolerance easy to adjust.

Use either APC or Memcache to store this information; writing to a database or reading from a file is, I believe, more time/resource consuming.
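For illustration, a minimal sketch of the APC route, assuming the APCu extension is available (the key and the 2-second window are arbitrary):
$key = 'floodControl_' . sha1($_SERVER['REMOTE_ADDR']);
// apcu_add() is atomic: it fails if the key already exists,
// and the third argument expires the entry after 2 seconds
if (!apcu_add($key, 1, 2)) {
    http_response_code(429);
    exit('Too many requests');
}
// ... continue normally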

Related

PHP file_get_contents check source online or not before do the execution

I'm using PHP file_get_contents to read text file data.
Assume I have two IP addresses, one online and one offline:
192.168.180.181 - Online
192.168.180.182 - Offline
And the PHP
$fileAccept = file_get_contents("\\\\192.168.180.181\\Reports\\".$dModel['MODEL_NAME'].$source."\\Accept\\Accept_".$dDtl['MODEL_CODE']."_".$dateCode."_".$dDtl['TS_CODE'].".txt");
Since 192.168.180.182 is offline, when I ran the code the page just kept loading.
My question: how can I prevent this? Maybe I first need to check whether the IP is alive, and continue to the next step only if it is.
Maybe something like this:
if (IP IS OFFLINE)
{
    echo "do not do anything";
}
else
{
    echo "do something";
}
You can try something like this:
$scc = stream_context_create(array('http' =>
    array(
        'timeout' => 120, // 120 seconds
    )
));
$url = "http://192.168.180.181/....";
$handle = file_get_contents($url, false, $scc);
You can create two handles and check which one is OK with an if statement; of course you can change the timeout to suit your needs.
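For example, a minimal sketch of that check, trying the second host when the first fails (the URLs keep the placeholders from above; file_get_contents() returns false on failure and @ suppresses the warning):
$scc = stream_context_create(array('http' => array('timeout' => 5)));
$data = @file_get_contents('http://192.168.180.181/....', false, $scc);
if ($data === false) {
    // the first host is unreachable or timed out, try the other one
    $data = @file_get_contents('http://192.168.180.182/....', false, $scc);
}
if ($data === false) {
    echo "do not do anything";
} else {
    echo "do something";
}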
Update:
If you are accessing the file locally, have a look at the stream_set_timeout() function and its documentation.
This solution is based on pinging the IP you need to check:
class IPChecker {
    public static function isIPOnline($ip) {
        switch (self::currentOS()) {
            case "windows":
                $arg = "n"; // Windows: -n <count>
                break;
            case "linux":
                $arg = "c"; // Linux: -c <count>
                break;
            default:
                throw new \Exception('unknown OS');
        }
        $result = "";
        $output = [];
        // to debug errors add 2>&1 to the command to fill $output
        // https://stackoverflow.com/questions/16665041/php-why-isnt-exec-returning-output
        exec("ping -$arg 2 " . escapeshellarg($ip), $output, $result);
        // an exit code of 0 means no errors such as "Destination Host Unreachable"
        if ($result === 0) return true;
        return false;
    }

    public static function currentOS() {
        if (strpos(strtolower(PHP_OS), "win") !== false) return 'windows';
        elseif (strpos(strtolower(PHP_OS), "linux") !== false) return 'linux';
        // TODO: extend other OSs here
        else return 'unknown';
    }
}
Usage example:
var_dump( IPChecker::isIPOnline("192.168.180.181") ); // should output bool(true)
var_dump( IPChecker::isIPOnline("192.168.180.182") ); // should output bool(false)
I have used these answers (1, 2) in my answer.

PHP Server-side rate limit on AJAX callbacks?

I'm using PHP in conjunction with AJAX for a lot of functions on my site. I've implemented a sort of rate-limiting system on the client side with JavaScript: basically, disabling a button for 1 second after it's clicked.
Although this works fine for the more innocent cases, I feel like I need something on the server side to limit requests as well.
Basically, I want users to have a maximum number of AJAX calls they can make per second. In fact, one per second seems reasonable.
One way would be to somehow log every request to my AJAX callback and read from that table before a new request is made. But this would immensely increase the workload on my server and database.
Are there any alternative methods of doing this?
PHP
function comment_upvote_callback() {
    // Some sort of rate limit??
    // $_POST data validation
    // Add upvote to database
    // Return success or error
}
add_action( 'wp_ajax_comment_upvote_callback', 'comment_upvote_callback' );
jQuery
var isClicked = false;
$('#comments').on('click', '.comment-upvote', function(event) {
    event.preventDefault();
    // Ignore clicks while a request is still in flight
    // (the original checked the flag once at bind time, which never re-runs)
    if (isClicked) {
        return;
    }
    isClicked = true;
    // Data to be sent
    var data = {
        'action': 'comment_upvote_callback',
        'security': nonce,
        'comment_id': comment_id
    };
    // Send data
    jQuery.post(ajaxurl, data, function(response) {
        // Callback
        // Client-side rate limit: accept the next click after 1 second
        setTimeout(function() {
            isClicked = false;
        }, 1000);
    });
});
As already said above, you should be rate limiting on the server side.
I have written code to implement this. You can copy it into a file and simply include it at the top of your server-side script. It accepts at most 3 hits per 5 seconds; you can change the rate according to your needs.
session_start();
const cap = 3;
$stamp_init = date("Y-m-d H:i:s");
if (!isset($_SESSION['FIRST_REQUEST_TIME'])) {
    $_SESSION['FIRST_REQUEST_TIME'] = $stamp_init;
}
$first_request_time = $_SESSION['FIRST_REQUEST_TIME'];
// the window ends 5 seconds after the first request
$stamp_expire = date("Y-m-d H:i:s", strtotime($first_request_time) + 5);
if (!isset($_SESSION['REQ_COUNT'])) {
    $_SESSION['REQ_COUNT'] = 0;
}
$req_count = $_SESSION['REQ_COUNT'];
$req_count++;
if ($stamp_init > $stamp_expire) { // window expired: start a new one
    $req_count = 1;
    $first_request_time = $stamp_init;
}
$_SESSION['REQ_COUNT'] = $req_count;
$_SESSION['FIRST_REQUEST_TIME'] = $first_request_time;
header('X-RateLimit-Limit: ' . cap);
header('X-RateLimit-Remaining: ' . (cap - $req_count));
if ($req_count > cap) { // too many requests
    http_response_code(429);
    exit();
}
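Side note: because this approach uses sessions, PHP's default file-based session handling locks the session file for the whole request, so concurrent AJAX calls from the same user are serialized. You can release the lock as soon as the counters have been written:
// release the session lock so parallel AJAX requests are not forced to queue
session_write_close();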

Configuring Laravel for Reverse Proxy

I have to upload a website that uses Laravel.
The server I must use sits behind a reverse proxy, and when I upload the files I developed on my computer, I get a DNS error.
I do not have access to the server configuration; I can only upload/download files on the website's server partition.
I searched for a solution, but everything I found was only loosely related to this question.
This is not Laravel-specific, but I hope it can help you!
Here is some code I wrote in CakePHP 2.x, because I had issues with a reverse proxy too.
The Cache class is the one from CakePHP; it's very simple (60-second expiration, automatically serialized data).
The logError function is a simple logging function.
The code is the following:
// Constants
define('REVERSE_PROXY_ADDR' , 'r-proxy.internal-domain.com');
define('REVERSE_PROXY_CACHE', 'r-proxy');

// Cache Config
Cache::config(REVERSE_PROXY_CACHE, array(
    'engine'    => 'File',
    'prefix'    => REVERSE_PROXY_CACHE . '_',
    'path'      => CACHE . REVERSE_PROXY_CACHE . '/',
    'serialize' => true,
    'duration'  => 60,
));

function clientIp()
{
    // Originally from CakePHP 2.X
    // ------------------------------
    if ( isset($_SERVER['HTTP_CLIENT_IP']) )
    {
        $ipaddr = $_SERVER['HTTP_CLIENT_IP'];
    }
    else
    {
        $ipaddr = $_SERVER['REMOTE_ADDR'];
    }

    if ( isset($_SERVER['HTTP_CLIENTADDRESS']) )
    {
        $tmpipaddr = $_SERVER['HTTP_CLIENTADDRESS'];
        if ( !empty( $tmpipaddr ) )
        {
            $ipaddr = preg_replace('/(?:,.*)/', '', $tmpipaddr);
        }
    }
    $ip = trim($ipaddr);
    // ------------------------------

    // Reverse proxy stuff
    if ( defined('REVERSE_PROXY_ADDR') && defined('REVERSE_PROXY_CACHE') )
    {
        $xfor = preg_replace('/(?:,.*)/', '', $_SERVER['HTTP_X_FORWARDED_FOR']);
        $list = Cache::read('iplist', REVERSE_PROXY_CACHE);
        if ( $list === false )
        {
            $list = gethostbynamel(REVERSE_PROXY_ADDR);
            Cache::write('iplist', $list, REVERSE_PROXY_CACHE);
        }

        // empty or unreadable
        if ( empty( $list ) )
        {
            logError('Unable to gather reverse proxy ip list, or empty list');
            logError('Type : ' . gettype($list));
            logError('IP : ' . $ip . ' - X-FORWARDED-FOR : ' . $xfor);
            return $ip;
        }

        // not an array ?!?!
        if ( !is_array($list) )
        {
            logError('Given list was not an array');
            logError('Type : ' . gettype($list));
            logError($list);
            return $ip;
        }

        // if in the list, trust the forwarded-for header
        if ( in_array($ip, $list, true) )
        {
            return $xfor;
        }
    }
    return $ip;
}
Then you just have to call the clientIp() function.
If you have a static IP address for your reverse proxy, you can just set it manually in the code; that's not the best practice, but you won't need the cache, and it will simplify the code a lot.
If you use a dynamic reverse proxy, you will have to resolve it by hostname, like this (which is what I did in the posted code):
gethostbynamel('reverse-proxy-addr') to get a list of possible reverse proxy IPs
OR
gethostbyname('reverse-proxy-addr') to get a single IP for the reverse proxy
Either way, you just have to check that REMOTE_ADDR is in the list of IPs known to belong to the reverse proxy.
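For example, a minimal sketch of the static variant (the proxy IPs are placeholders, purely an assumption for illustration):
// trust X-Forwarded-For only when the request really comes from a known proxy
$trustedProxies = array('10.0.0.5', '10.0.0.6'); // placeholder proxy IPs
$ip = $_SERVER['REMOTE_ADDR'];
if (in_array($ip, $trustedProxies, true) && !empty($_SERVER['HTTP_X_FORWARDED_FOR'])) {
    // take the first (client) address from the forwarded chain
    $parts = explode(',', $_SERVER['HTTP_X_FORWARDED_FOR']);
    $ip = trim($parts[0]);
}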
Hope it helps!

Show captcha when unexpected navigation detected to prevent traffic abuse [closed]

I noticed that some users overload my website by downloading multiple files (for example, 500 files at the same time) and opening many pages in a short period. I want to show a captcha when such unexpected navigation is detected.
I know how to implement the captcha; what I can't figure out is the best approach for detecting traffic abuse in PHP.
A common approach is to use something like memcached to store the requests on a minute basis. I have open-sourced a small class that achieves this: php-ratelimiter.
If you are interested in a more thorough explanation of why the requests need to be stored on a minute basis, check this post.
So to sum it up, your code could end up looking like this:
if (!verifyCaptcha()) {
    $rateLimiter = new RateLimiter(new Memcache(), $_SERVER["REMOTE_ADDR"]);
    try {
        $rateLimiter->limitRequestsInMinutes(100, 5);
    } catch (RateExceededException $e) {
        displayCaptcha();
        exit;
    }
}
Actually, the code works on a per-minute basis, but you can quite easily adapt it to a per-30-seconds basis:
private function getKeys($halfminutes) {
    $keys = array();
    $now = time();
    for ($time = $now - $halfminutes * 30; $time <= $now; $time += 30) {
        $keys[] = $this->prefix . date("dHis", $time);
    }
    return $keys;
}
Introduction
A similar question has been answered before (Prevent PHP script from being flooded), but that approach might not be sufficient, for these reasons:
It uses $_SERVER["REMOTE_ADDR"], and some shared connections have the same public IP address
There are many Firefox add-ons that let users route each request through a different proxy
Multiple Requests != Multiple Downloads
Preventing multiple requests is totally different from preventing multiple downloads. Why?
Let's imagine a 10MB file that takes 1 minute to download. If you limit users to, say, 100 requests per minute, you are still allowing each user to download
10MB * 100 per minute
To fix this issue you can look at Download - max connections per user?.
Multiple Requests
Back to page access: you can use SimpleFlood, which extends Memcache, to limit users per second. It uses cookies to work around the shared-connection issue and attempts to get the real IP address.
$flood = new SimpleFlood();
$flood->addserver("127.0.0.1"); // add memcache server
$flood->setLimit(2); // expect 1 request every 2 sec
try {
    $flood->check();
} catch ( Exception $e ) {
    sleep(2); // feel like wasting their time
    // display the captcha
    // write a message to the log
    printf("%s => %s %s", date("Y-m-d g:i:s"), $e->getMessage(), $e->getFile());
}
Please note that SimpleFlood::setLimit() accepts floats, so you can have:
$flood->setLimit(0.1); // expect 1 request every 0.1 sec
Class Used
class SimpleFlood extends \Memcache {
    private $ip;
    private $key;
    private $penalty = 0;
    private $limit = 100;
    private $mins = 1;
    private $salt = "I like TO dance A #### Lot";

    function check() {
        $this->parseValues();
        $runtime = floatval($this->get($this->key));
        $diff = microtime(true) - $runtime;
        if ($diff < $this->limit) {
            throw new Exception("Limit Exceeded By : $this->ip");
        }
        $this->set($this->key, microtime(true));
    }

    public function setLimit($limit) {
        $this->limit = $limit;
    }

    private function parseValues() {
        $this->ip = $this->getIPAddress();
        if (!$this->ip) {
            throw new Exception("Where the hell is the ip address");
        }
        if (isset($_COOKIE["xf"])) {
            $cookie = json_decode($_COOKIE["xf"]);
            if ($this->ip != $cookie->ip) {
                unset($_COOKIE["xf"]);
                setcookie("xf", null, time() - 3600);
                throw new Exception("Last IP did not match");
            }
            if ($cookie->hash != sha1($cookie->key . $this->salt)) {
                unset($_COOKIE["xf"]);
                setcookie("xf", null, time() - 3600);
                throw new Exception("Nice Faking cookie");
            }
            if (strpos($cookie->key, "floodIP") === 0) {
                // switch from an IP-based key to a random per-client key
                // (random_bytes() here replaces the removed mcrypt_create_iv())
                $cookie->key = "floodRand" . bin2hex(random_bytes(50));
            }
            $this->key = $cookie->key;
        } else {
            $this->key = "floodIP" . sha1($this->ip);
            $cookie = (object) array(
                "key" => $this->key,
                "ip" => $this->ip
            );
        }
        $cookie->hash = sha1($this->key . $this->salt);
        $cookie = json_encode($cookie);
        setcookie("xf", $cookie, time() + 3600); // expire in 1hr
    }

    private function getIPAddress() {
        foreach (array(
            'HTTP_CLIENT_IP',
            'HTTP_X_FORWARDED_FOR',
            'HTTP_X_FORWARDED',
            'HTTP_X_CLUSTER_CLIENT_IP',
            'HTTP_FORWARDED_FOR',
            'HTTP_FORWARDED',
            'REMOTE_ADDR'
        ) as $key) {
            if (array_key_exists($key, $_SERVER) === true) {
                foreach (explode(',', $_SERVER[$key]) as $ip) {
                    if (filter_var($ip, FILTER_VALIDATE_IP) !== false) {
                        return $ip;
                    }
                }
            }
        }
        return false;
    }
}
Conclusion
This is a basic proof of concept, and additional layers can be added to it, such as:
Set different limits for different URLs
Add support for penalties, where you block a user for a certain number of minutes or hours
Detection and different limits for Tor connections
etc.
I think you can use sessions in this case.
Initialize a session to store a timestamp (use microtime for better results), then take the timestamp on the next page view. The difference can be used to analyze the frequency of page visits, and a captcha can be shown when it is too high.
You can also run a counter on the pages being visited and use a 2D array to store the page and timestamp. If the number of pages visited rises suddenly, you can check the timestamp differences.
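A minimal sketch of that session approach; the 2-second window, the 3-strike threshold, and the showCaptcha() helper are illustrative assumptions:
session_start();
$now = microtime(true);
if (isset($_SESSION['last_hit']) && ($now - $_SESSION['last_hit']) < 2.0) {
    // page revisited within 2 seconds: count a strike
    $_SESSION['strikes'] = isset($_SESSION['strikes']) ? $_SESSION['strikes'] + 1 : 1;
    if ($_SESSION['strikes'] >= 3) {
        showCaptcha(); // hypothetical helper that renders the captcha
        exit;
    }
} else {
    $_SESSION['strikes'] = 0;
}
$_SESSION['last_hit'] = $now;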

session_start() takes VERY LONG TIME

My site works very slowly, and I had no idea why. It is based on Zend Application; I have built dozens of such sites, so I'm sure MY code is OK.
I installed xdebug on the server and tried to profile it, and guess what? php::session_start() took 48.675 seconds. Forty-eight and a half seconds! It's unbelievable! What could be the reason for this? It's a common operation; why would it execute SO long? How do I fix this behaviour, and which configs do I edit? I searched Google but found no good answer (almost everywhere there's the same question, but no answer). Thanks in advance!
session_start (with sessions stored in files) is blocking in PHP, so this issue will appear if you try to start several server sessions for the same browser session (AJAX or multiple browser tabs/windows). Each session_start will wait until the other sessions have been closed.
See here: http://konrness.com/php5/how-to-prevent-blocking-php-requests/
Try changing from files to database storage of sessions.
My guess would be the garbage-collection routine, which runs inside the native session_start() function. Maybe you've done something that keeps many old session files around, like changing the max lifetime? Or maybe you've decided it would be a good idea to store them in a database, but forgot to create a suitable index? The native GC routine stat()'s every single session file to check for expiration, which is time consuming when a lot of files have built up.
edit: to help you with debugging only, disable garbage collection by temporarily setting session.gc_probability:
session.gc_probability = 0
Make sure the setting sticks; I don't know what the Zend framework might be doing here.
P.S. It's difficult to suggest a fix without knowing the cause. My answer is meant to guide you towards identifying the cause.
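If you cannot edit php.ini, the same thing can be done per request while debugging (remove it again afterwards):
// disable session garbage collection for this request only
ini_set('session.gc_probability', 0);
session_start();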
I have had this problem and am surprised that nobody has posted this specific response. It may not be it but it is worth checking.
PHP LOCKS THE SESSION FILE while a page is processing, so that the page can have exclusive access to it. Think about it: the sess_184c9aciqoc file is not a database, so two calls in the same session can't access it simultaneously. So if you have a lot of AJAX calls, you can get a "traffic jam". Once you start doing advanced scripting this is a gotcha to beware of. By the way, here is a function to store an array of timestamps. I used it to figure out that session_start was the culprit:
//time function for benchmarking
if( !function_exists('gmicrotime')){
    function gmicrotime($n=''){
        #version 1.1, 2007-05-09
        //store array of all calls
        global $mT;
        list($usec, $sec) = explode(' ', microtime());
        if(!isset($mT['_base_'])) $mT['_base_'] = $sec;
        $t = round((float)$usec + (float)(substr($sec, -4)), 6);
        $mT['all'][] = $t;
        if($n){
            if(isset($mT['indexed'][$n])){
                //store repeated calls with same index. If in a loop, add a $i if needed
                if(is_array($mT['indexed'][$n])){
                    $mT['indexed'][$n][] = $t;
                }else{
                    $mT['indexed'][$n] = array($mT['indexed'][$n], $t);
                }
            }else $mT['indexed'][$n] = $t;
        }
        //return elapsed since last call (in the local array)
        $u = $mT['all'];
        if(count($u) > 1){
            $mT['_total_'] = $u[count($u)-1] - $u[0];
            return round(1000 * ($u[count($u)-1] - $u[count($u)-2]), 6);
        }
    }
    gmicrotime('pageStart');
}
Then I call it as follows:
gmicrotime('beforeSessionStart');
session_start();
gmicrotime('afterSessionStart');
do_something_slow();
gmicrotime('afterSlowProcess');
//etc..
echo '<pre>';
print_r($mT);
Hope this is helpful!
Another possibility is that you have set a large memory_limit in php.ini.
I did that for uploading huge MySQL dumps into phpMyAdmin, and load time spiked; perhaps (as said above) a lot of session files had piled up now that PHP had room to spare. The default is 128M, I think; I had quadrupled it.
One way to avoid this problem is to ask PHP to store sessions in a database table instead of files.
Firstly, I will give you a few links as the real credits for this solution:
http://www.tonymarston.net/php-mysql/session-handler.html
http://shiflett.org/articles/storing-sessions-in-a-database
http://culttt.com/2013/02/04/how-to-save-php-sessions-to-a-database/
Then a code implementation I derived from these readings (note that it uses the legacy mysql_* API, which was removed in PHP 7; the same handler structure works with mysqli or PDO):
<?php
class TLB_Sessions_in_Database
{
    private $debug;
    private $dbc;

    function __construct()
    {
        $this->debug = false;
        session_set_save_handler(
            array($this, '_open'),
            array($this, '_close'),
            array($this, '_read'),
            array($this, '_write'),
            array($this, '_destroy'),
            array($this, '_clean')
        );
    }

    function _open()
    {
        if( $this->debug ) echo '_open:'.PHP_EOL;
        if( ($this->dbc = mysql_connect(DB_HOST, DB_USER, DB_PASSWORD)) !== false )
        {
            $select_db   = mysql_select_db(DB_NAME, $this->dbc);
            $set_charset = mysql_set_charset(DB_CHARSET, $this->dbc);
            if( $this->debug ) echo '- return: '.(( $select_db && $set_charset ) ? 'true' : 'false').PHP_EOL;
            return( $select_db && $set_charset );
        }
        else
        {
            if( $this->debug ) echo '- error: '.mysql_error($this->dbc).PHP_EOL;
        }
        return( false );
    }

    function _close()
    {
        if( $this->debug ) echo '_close:'.PHP_EOL;
        return( mysql_close($this->dbc) );
    }

    function _read($session_id)
    {
        if( $this->debug ) echo '_read:'.PHP_EOL;
        $session_id = mysql_real_escape_string($session_id);
        $sql = "SELECT `session_data` FROM `".DB_NAME."`.`php_sessions` WHERE `session_id` = '".$session_id."'";
        if( $this->debug ) echo '- query: '.$sql.PHP_EOL;
        if( ($result = mysql_query($sql, $this->dbc)) !== false )
        {
            if( !in_array(mysql_num_rows($result), array(0, false), true) )
            {
                $record = mysql_fetch_assoc($result);
                return( $record['session_data'] );
            }
        }
        else
        {
            if( $this->debug ) echo '- error: '.mysql_error($this->dbc).PHP_EOL;
        }
        return( '' );
    }

    function _write($session_id, $session_data)
    {
        if( $this->debug ) echo '_write:'.PHP_EOL;
        $session_id   = mysql_real_escape_string($session_id);
        $session_data = mysql_real_escape_string($session_data);
        //$sql = "REPLACE INTO `php_sessions` (`session_id`, `last_updated`, `session_data`) VALUES ('".$session_id."', '".time()."', '".$session_data."')";
        $sql = "INSERT INTO `".DB_NAME."`.`php_sessions` (`session_id`, `date_created`, `session_data`) VALUES ('".$session_id."', NOW(), '".$session_data."') ON DUPLICATE KEY UPDATE `last_updated` = NOW(), `session_data` = '".$session_data."'";
        if( ($result = mysql_query($sql, $this->dbc)) === false )
        {
            if( $this->debug ) echo '- error: '.mysql_error($this->dbc).PHP_EOL;
        }
        return( $result );
    }

    function _destroy($session_id)
    {
        if( $this->debug ) echo '_destroy:'.PHP_EOL;
        $session_id = mysql_real_escape_string($session_id);
        $sql = "DELETE FROM `".DB_NAME."`.`php_sessions` WHERE `session_id` = '".$session_id."'";
        if( ($result = mysql_query($sql, $this->dbc)) === false )
        {
            if( $this->debug ) echo '- error: '.mysql_error($this->dbc).PHP_EOL;
        }
        return( $result );
    }

    function _clean($max)
    {
        if( $this->debug ) echo '_clean:'.PHP_EOL;
        $sql = 'DELETE FROM `'.DB_NAME.'`.`php_sessions` WHERE `last_updated` < DATE_SUB(NOW(), INTERVAL '.$max.' SECOND)';
        if( ($result = mysql_query($sql, $this->dbc)) === false )
        {
            if( $this->debug ) echo '- error: '.mysql_error($this->dbc).PHP_EOL;
        }
        return( $result );
    }
}
new TLB_Sessions_in_Database();
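Note that registering the handler does not itself start the session; you still need to call session_start() afterwards:
session_start(); // must run after the save handler has been registered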
END.
If you have multiple concurrent AJAX calls on the same page, this situation may be the cause of your problem.
In my case it was incorrect memcache server settings in /etc/php.d/memcached.ini
Here is information on memcache properties, and here is how to set up session storage in memcache.
I just had this issue: session_start was taking about 5 seconds.
My issue was that I had declared some variables above it.
I moved session_start to the top, and it now takes a few milliseconds.
My page opens concurrent sessions within many <img src="download_image.php"> tags, where download_image.php runs session_start() and then downloads the image.
Inserting a session_write_close() in download_image.php fixed my problem:
session_start();
session_write_close();
download_image();
I tried memcached and session_start(['read_and_close' => true]), but only session_write_close() worked for me.
