How does session_start lock in PHP?

Originally, I just wanted to verify that session_start locks the session, so I created the PHP file below. Basically, if the page-view count is even, the page sleeps for 10 seconds; if it is odd, it doesn't. session_start is used to obtain the page-view count from $_SESSION.
I tried to access the page in two tabs of one browser. It is not surprising that the first tab takes 10 seconds, since I explicitly make it sleep. The second tab should not sleep, but it should be blocked by session_start. That works as expected.
To my surprise, though, the output of the second page shows that session_start takes almost no time. In fact, the whole page seems to take no time to execute, yet it still takes 10 seconds to show up in the browser.
obtained lock
Cost time: 0.00016689300537109
Start 1269739162.1997
End 1269739162.1998
allover time elapsed : 0.00032305717468262
The page views: 101
Does PHP pull session_start out of the page and execute it before the other PHP statements?
This is the code.
<?php
function float_time()
{
    list($usec, $sec) = explode(' ', microtime());
    return (float)$sec + (float)$usec;
}

$allover_start_time = float_time();
$start_time = float_time();
session_start();
echo "obtained lock<br/>";
$end_time = float_time();
$elapsed_time = $end_time - $start_time;
echo "Cost time: $elapsed_time <br>";
echo "Start $start_time<br/>";
echo "End $end_time<br/>";
ob_flush();
flush();

if (isset($_SESSION['views']))
{
    $_SESSION['views'] += 1;
}
else
{
    $_SESSION['views'] = 0;
}

if ($_SESSION['views'] % 2 == 0)
{
    echo "sleep 10 seconds<br/>";
    sleep(10);
}

$allover_end_time = float_time();
echo "allover time elapsed : " . ($allover_end_time - $allover_start_time) . "<br/>";
echo "The page views: " . $_SESSION['views'];
?>

That seems to be a Firefox-related "issue": if you request the same URL in two tabs/windows, the second request waits until the first request is finished (it could also be an add-on that blocks the second request; I haven't tested that).
Take, e.g.:
<?php // test.php
$start = microtime(true);
echo "<pre>start: $start</pre>";
sleep(5);
$end = microtime(true);
echo '<pre>', $start, "\n", $end, "\n", $end-$start, '</pre>';
I called it twice and the output was
start: 1269742677.6094
1269742677.6094
1269742682.609
4.9995958805084
and
start: 1269742682.6563
1269742682.6563
1269742687.6557
4.9994258880615
Note that there's already a 5-second gap between the start times.
When called as http://localhost/test.php and http://localhost/test.php?a=b instead of the exact same URL twice, this does not happen.
Neither IE8 nor Chrome shows that behavior.

Yes, this could be because of session_start() blocking other requests in the same session (with the file-based handler). I was able to verify the issue in Firefox (4.x) and Chrome (10.x) on Windows XP / PHP 5.2 with the default session handler (files). I am not sure whether the issue is reproducible with non-file session handlers.
obtained lock
**Cost time: 9.90100598335**
Start 1303227658.67
End 1303227668.57
sleep 10 seconds
allover time elapsed : 19.9027831554
The page views: 4
This is a very interesting issue, and the Firefox tab locking described in the answer above would have kept it from being detected.
http://www.php.net/manual/en/function.session-start.php#101452
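The standard way to keep one slow page from serializing every other request in the same session is session_write_close(), which writes the session data and releases the lock before the slow work starts. A minimal sketch of the test page above restructured that way (PHP 5.2-compatible; the counter is only read during the slow part):

<?php
session_start(); // acquires the per-session lock
$_SESSION['views'] = isset($_SESSION['views']) ? $_SESSION['views'] + 1 : 0;
$views = $_SESSION['views'];
session_write_close(); // writes the data and releases the lock; other tabs proceed at once

if ($views % 2 == 0) {
    echo "sleep 10 seconds<br/>";
    sleep(10); // the session lock is no longer held while sleeping
}
echo "The page views: " . $views;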

Since PHP does not have a container, how do two calls to the same session get serialized? Who does this? How do the two processes talk? Is the PHP module always active, only spawning threads after doing the session check? In that case the PHP module is indeed behaving like a container that, in this case, provides session management to this extent.
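For the default files handler, no PHP-side coordinator is needed: serialization is delegated to the operating system. Each request opens the session file and takes an exclusive advisory lock on it, held until session_write_close() or script shutdown, so the second request simply blocks inside the OS lock call. A rough sketch of the mechanism (conceptual, not PHP's actual source; $session_id is a placeholder for the ID from the client's cookie):

<?php
$save_path  = session_save_path() ? session_save_path() : '/tmp';
$session_id = 'abc123'; // placeholder
$fp = fopen($save_path . '/sess_' . $session_id, 'c+');

flock($fp, LOCK_EX);              // a second request for the same session blocks right here
$data = stream_get_contents($fp); // read the serialized session, then the script runs

// ... the script executes while the lock is held ...

ftruncate($fp, 0);                // on session_write_close()/shutdown: write back and unlock
rewind($fp);
fwrite($fp, $data);
flock($fp, LOCK_UN);
fclose($fp);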

Related

Creating a scheduled task without a cron job

A scheduled task needs to be created, but it's not possible to use a cron job (there is a warning from the hosting provider that "running the cron job more than once within a 45-minute period is an infraction of their rules and could result in shutting down the account").
A PHP script (which inserts data from a txt file into a MySQL database) should be executed every minute, i.e. this link should be called: http://www.myserver.com/ImportCumulusFile.php?type=dayfile&key=letmein&table=Dayfile&file=./data/Jan10log.txt
Is there any other way?
There are multiple ways of doing repetitive jobs. Some of the ways I can think of right away:
Using an external site such as https://www.setcronjob.com/ to fire off your URL at set intervals.
Using a meta refresh. You'd have to open the page and leave it running.
JavaScript/Ajax refresh. Similar to the above example.
Setting up a cron job. Most shared hosts do provide a way to set up cron jobs; have a look at the cPanel of your hosting.
If you have shell access, you could execute a PHP script via the shell. Something like this would be an endless loop that sleeps 60 seconds, executes your script, and repeats until the end of time:
while (true) {
    sleep(60);
    // script here
    // end your script
}
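If you go that route, you would typically start the script detached, e.g. with nohup php loop.php & on a Unix-like host (loop.php is just a placeholder name), so that it keeps running after you close the shell.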
Or you could do a "poor man's cron" with Ajax or a meta refresh. I've done it before: basically, you just place a redirect with either JavaScript or HTML's meta refresh at the beginning of your script, access the script from your browser, and leave the tab open. It'll refresh every 60 seconds, just like a cron job.
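A minimal sketch of the meta-refresh variant (the actual job is elided; the 60-second interval matches the requirement above):

<?php
// poor man's cron (sketch): this page reloads itself every 60 seconds
// for as long as the browser tab stays open
echo '<meta http-equiv="refresh" content="60">';

// ... do the import work here ...
echo 'Last run: ' . date('H:i:s');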
Yet another alternative to a cron job would be a bash script such as:
#!/bin/bash
while :
do
    sleep 60
    wget http://127.0.0.1/path/to/cronjob.php -O Temp --delete-after
done
All this being said, you will probably get caught by the host and terminated anyway.
So your best solution: go and sign up for a $5-10 a month VPS, and say goodbye to shared hosting and hello to running your own little server.
If you do this, you can even stop using crappy PHP and use Facebook's HHVM instead, and enjoy its awesome performance.
There's a free service at http://cron-job.org that lets you set up a nice little alternative.
Option A
An easy way to realize it would be to create a file or database entry containing the next execution time of each of your PHP scripts:
<?php
// crons.php
return [
    'executebackup.php'  => 1507979485,
    'sendnewsletter.php' => 1507999485,
];
?>
And on every request made by your visitors, you check the current time, and if it is past the stored time, you include the script:
<?php
// cronpixel.php
$crons = @include 'cache/crons.php';
foreach ($crons as $script => $time) {
    if ($time < time()) {
        // create lock to avoid race conditions
        $lock = 'cache/' . md5($script) . '.lock';
        if (file_exists($lock) || !mkdir($lock)) {
            continue;
        }
        // start your php script
        include($script);
        // now update crons.php
        $crons[$script] += 86400; // tomorrow
        file_put_contents('cache/crons.php', '<?php return ' . var_export($crons, true) . '; ?' . '>');
        // finally delete lock
        rmdir($lock);
    }
}

header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");

// image data
$im  = imagecreate(1, 1);
$blk = imagecolorallocate($im, 0, 0, 0);
imagecolortransparent($im, $blk);

// image output
header("Content-type: image/gif");
imagegif($im);

// free memory
imagedestroy($im);
?>
Note: It will rarely be called at the exact second, because you do not know when a visitor will open your page (it may be 2 seconds later). So rather than setting the next run by adding 86400 seconds, which lets the schedule drift, compute the next day's time with mktime.
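For example, pinning a daily job to 03:00 (the hour is an assumption for illustration) avoids the drift:

// pin the next run to 03:00:00 tomorrow instead of drifting by +86400
$crons[$script] = mktime(3, 0, 0, date('n'), date('j') + 1);

mktime normalizes the overflowing day for you, so this also works on the last day of a month.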
Option B
This is a little project I realized in the past that is similar to #r3wt's idea, but it covers race conditions and runs at exact times, like a cron job in a scheduler would, without hitting max_execution_time. And it works most of the time without needing to be resurrected (as Option A does through visitors).
Explanation:
The script writes a lock file (to avoid race conditions) for the 15th, 30th, 45th and 60th second of each minute:
// cron monitoring
foreach ($allowed_crons as $cron_second) {
    $cron_filename = 'cache/' . $cron_second . '_crnsec_lock';
    // start missing cron requests
    if (!file_exists($cron_filename)) {
        cron_request($cron_second);
    }
    // restart interrupted cron requests
    else if (filemtime($cron_filename) + 90 < time()) {
        rmdir($cron_filename);
        cron_request($cron_second);
    }
}
Every time a lock file is missing, the script creates it and uses sleep() to reach the exact second:
if (file_exists($cron_filename) || !mkdir($cron_filename)) {
    return;
}

// add one minute if necessary
$date      = new DateTime();
$cron_date = new DateTime();
$cron_date->setTime($cron_date->format('H'), $cron_date->format('i'), $sec);
$diff = $date->diff($cron_date);
if ($diff->invert && $diff->s > 0) {
    $cron_date->setTime($cron_date->format('H'), $cron_date->format('i') + 1, $sec);
}
$diff = $date->diff($cron_date);

// we use sleep() as time_sleep_until() starts one second too early (https://bugs.php.net/bug.php?id=69044)
sleep($diff->s);
After waking up again, it sends a request to itself through fopen():
// note: filter_input returns the unchanged SERVER var (http://php.net/manual/de/function.filter-input.php#99124)
// note: filter_var is insecure (http://www.d-mueller.de/blog/why-url-validation-with-filter_var-might-not-be-a-good-idea/)
$url = 'http' . isSecure() . '://' . filter_input(INPUT_SERVER, 'HTTP_HOST', FILTER_SANITIZE_URL) . htmlspecialchars($request_uri, ENT_QUOTES, 'UTF-8');
$context = stream_context_create(array(
    'http' => array(
        'timeout' => 1.0
    )
));

// note: returns "failed to open stream: HTTP request failed!" because timeout < time_sleep_until
if ($fp = @fopen($url, 'r', false, $context)) {
    fclose($fp);
}
rmdir($cron_filename);
In this way it calls itself indefinitely, and you are able to define different starting times:
if (isset($_GET['cron_second'])) {
    if ($cron_second === 0 && !(date('i') % 15)) {
        mycron('every 15 minutes');
    }
    if ($cron_second === 0 && !(date('i') % 60)) {
        mycron('every hour');
    }
}
Note: It produces 5760 requests per day (4 per minute). Not much, but a real cron job uses far fewer resources. If your max_execution_time is high enough, you could change it to call itself only once per minute (1440 requests/day).
I understand that this question is a bit old, but I stumbled on it a week ago with this very question, and the best and most secure option we found was using a web service.
Our context:
We have our system on both shared hosting and private clouds.
We need a script to be activated once a month (there are plans to create more schedules and to allow users to create some predetermined actions).
Our system provides access to many clients, so when anyone uses the system it calls the web service via Ajax and doesn't care about the response (after all, everything is logged in our database and must run without user interaction).
What we've done is:
1 - An Ajax call is made upon access to any major screen.
2 - The web service reads a schedule table in our database and calls whatever needs calling.
3 - To avoid many stacked web-service calls, we check the datetime and require an interval of 10 minutes before actually performing any action (a sketch of steps 2-3 follows below).
That's also a way to distribute the load, and the schedules don't affect the system's user interaction.
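A rough sketch of what such an endpoint could look like; the table and column names (schedule, script, last_run) and the connection details are assumptions for illustration, not the poster's actual schema:

<?php
// cron.php - hit fire-and-forget via Ajax
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

foreach ($pdo->query('SELECT id, script, last_run FROM schedule') as $row) {
    // step 3: skip anything that already ran within the last 10 minutes
    if (time() - strtotime($row['last_run']) < 600) {
        continue;
    }
    // claim the job first so stacked calls don't re-run it
    $upd = $pdo->prepare('UPDATE schedule SET last_run = NOW() WHERE id = ?');
    $upd->execute(array($row['id']));
    include $row['script']; // step 2: call whatever needs calling
}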

PHP slow if requested with AJAX?

I made a simple webpage with an empty form tag. The tag is filled with the response of an AJAX request, which asks a PHP script for data. The script returns its execution time. Now there is something really odd. If I type the address in by hand, the script tells me:
<!-- Duration: 0.8 milliseconds (~1242 pages per second) -->
But if I use the built-in network request logger of Chrome (to watch what has been loaded), I get this:
<!-- Duration: 52.7 milliseconds (~19 pages per second) -->
Any ideas why it is roughly 65 times slower (52.7 / 0.8)?
I repeat: same script, same parameters, identical response (except the duration, of course), same server, different request types: AJAX versus the browser address line.
<?php
class AbstractModule
{
    private $starttime;

    final function __construct(/* ... */)
    {
        // for measuring creation time
        $this->starttime = microtime(true);
    }

    public final function return_duration()
    {
        $duration = (microtime(true) - $this->starttime) * 1000;
        return "\n<!-- Duration: " . number_format($duration, 1, '.', '') . " milliseconds (~" . number_format(1000 / $duration, 0, '.', '') . " pages per second) -->";
    }
}

$demo = new AbstractModule();
// doing very much :)
echo $demo->return_duration();
?>
Thanks.
Do you use sessions? The difference might be that the AJAX request restarts a session each time because you don't send any cookies along.
Otherwise, I suggest you break out a debugger and track down the culprit.
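One way to test that hypothesis: log, for each request, whether the session cookie arrived and which session ID was used. If the AJAX requests show cookie=no and a fresh ID every time, the session really is being recreated (a sketch; it assumes you can read PHP's error log on this host):

<?php
session_start();
// did the client send the session cookie, and which session did we end up in?
error_log(sprintf(
    'uri=%s cookie=%s sid=%s',
    $_SERVER['REQUEST_URI'],
    isset($_COOKIE[session_name()]) ? 'yes' : 'no',
    session_id()
));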

PHP parent/child timing communication

I am working on an app which will allow me to log in to a remote telnet server and monitor statistics. The problem is that the report has a minimum refresh rate of 10 seconds, and the refresh rate varies slightly depending on server load (the report itself has this refresh rate, not the server). I need the front end of this system to refresh more often than every 10 seconds (5 seconds minimum). I have somewhat been able to accomplish this by forking, but eventually the timings synchronize (a gradual process which I assume is due to remote server load).
Child code (I have removed quite a bit of code relating to how the app logs in and accesses the report, but it is basically key presses through 7 menus to get to the page which refreshes; I can include it if needed):
// 1 - Telnet handshaking and initial screen load
$sHandshaking = '<IAC><WILL><OPT_BINARY><IAC><DO><OPT_BINARY><IAC><WILL><OPT_TSPEED><IAC><SB><OPT_TSPEED><IS>38400,38400<IAC><SE>';

// 2-4 - Keypresses to get to report (login, menus, etc. - removed)

// Loop and cache
while (1) {
    // 4 - View report
    $oVT100->listen();
    $reference = $temp;
    $screen = $oVT100->getScreenFull();
    if (empty($screen)) {
        Failed:
        echo "FAILED";
        file_put_contents($outFile, array('html' => "<div class=\"header\"><font color='red'>Why are things always breaking?!</font></div>"));
        goto restartIt; // if the screen does not contain a valid report, assume logout and start at top
    } else {
        $screen = parseReport($oVT100->getScreenFull());
        $temp = json_decode($screen);
        // check old report file; if different, save, else sleep a bit
        $currentFile = file_get_contents($outFile);
        if ($screen !== $currentFile) {
            file_put_contents($outFile, $screen);
            sleep(5);
        } else {
            sleep(1);
            //usleep(500000);
        }
    }
}
As you can see from the child code above, it logs in and then loops over the report indefinitely, writing the screen out to a cache file (if it differs from the existing file). As a dirty workaround for the refresh issue, I wrote a parent that forks:
$pids = array();
for ($i = 0; $i < 2; $i++) {
    sleep(5);
    $pids[$i] = pcntl_fork();
    if (!$pids[$i]) {
        require_once(dirname(__FILE__) . '/checkQueue.php');
        exit();
    }
}
for ($i = 0; $i < 2; $i++) {
    pcntl_waitpid($pids[$i], $status, WUNTRACED);
}
I immediately noticed the timing offset and had to spawn three children instead of two to stay under 5 seconds; the timing then worked for a few days. Eventually, though, all the children were updating the file at the same time and I had to restart the parent. What would be the best way to monitor the child processes so that I maintain a refresh interval of less than five seconds?
[EDIT]
This is not a running log; it is a file which contains current call statistics for a call center. The data in the cache file must be no more than 5 seconds old at any time, but eventually all of the children sync up and write to the file at nearly the same time. The issue isn't really with the local file; it is the inconsistent remote server response times, which eventually lead to the running children fetching their reports at the same time.
I'm hoping there's a better solution, but here's how I solved it (I feel kind of stupid for not having thought of this in the beginning):
$currentFile = file_get_contents($outFile);
if ($screen !== $currentFile) {
    if (time() - filemtime($outFile) > 2) {
        file_put_contents($outFile, $screen);
        sleep(5);
    } else {
        echo "Failed, sleeping\r";
        sleep(2);
    }
} else {
    sleep(1);
    //usleep(500000);
}
Basically, I check the write time of the cache file, and if it was written less than 2 seconds ago, I sleep and re-execute the loop. I was somewhat hoping there would be a solution contained within the parent app, but I guess this works.
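A more robust variant of the same idea, sketched under the assumption that the children share only the cache file, is to let the filesystem arbitrate with an exclusive non-blocking lock, so two children can never pass the 2-second check at the same instant:

// sketch: serialize writers via flock instead of a bare filemtime() check
$fp = fopen($outFile, 'c'); // open for writing, create if missing, don't truncate
if ($fp && flock($fp, LOCK_EX | LOCK_NB)) {
    clearstatcache(); // make sure filemtime() is fresh
    if (time() - filemtime($outFile) > 2) {
        file_put_contents($outFile, $screen);
    }
    flock($fp, LOCK_UN);
}
if ($fp) {
    fclose($fp);
}
sleep(5);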

PHP won't let page update automatically every day/6 hours

A friend built a ranking system on his site, and I am trying to host it on mine via WordPress and GoDaddy. It updates for him, but when I load it on my site, it works for 6 hours; as soon as the reload is supposed to occur, it errors out and I get a 500 timeout error.
His page is at: http://www.jeremynoeljohnson.com/yakezieclub
My page is currently at http://sweatingthebigstuff.com/yakezieclub, but when you add ?reload=1 it will give the error.
Any idea why this might be happening? Any settings that I might need to change?
I can give you all the code, but which part? The index.php file? I'm not sure which part is messing up; I literally uploaded the same code as his.
Here's the reload part:
$cachefile    = "rankings.html";
$daycachefile = "rankings_history.xml";
$cachetime    = (60 * 60) * 6;  // every 6 hours, the cache refreshes
$daycachetime = (60 * 60) * 24; // every 24 hours, the history will be written to - or whenever the page is requested after 24 hours have passed
$writenewdata = false;

if (!empty($_GET['reload'])) {
    if ($_GET['reload'] == 1) {
        $cachetime = 1;
    }
}
if (!empty($_GET['reloadhistory'])) {
    if ($_GET['reloadhistory'] == 1) {
        $daycachetime = 1;
        $cachetime = 1;
    }
}

if (file_exists($daycachefile) && (time() - $daycachetime < filemtime($daycachefile))) {
    // Do nothing
} else {
    $writenewdata = true;
    $cachetime = 1;
}

// Serve from the cache if it is younger than $cachetime
if (file_exists($cachefile) && (time() - $cachetime < filemtime($cachefile))) {
    include($cachefile);
    echo "<!-- Cached " . date('jS F Y H:i', filemtime($cachefile)) . " -->";
    exit;
}

ob_start(); // start the output buffer
?>
Any particular reason you're starting output buffering after the cache file is included? If the cache file is just raw HTML, it will already have been dumped into the output stream, followed by the cache-date comment, before you start buffering.
Is it possible that something in the script (or in another script) is placing a lock on the output file, such that the reload-checking portion hangs while waiting for the lock to clear?
It seems really slow, which makes me think you're doing something very intensive or something with high latency. If your web server hits its internal timeout value, you'll get a 500 error. Optimize your code for better speed, or increase your server's timeout, to fix this problem.
If you post your server platform, I can let you know what you can do to increase the timeout.
Hope this helps.
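For the PHP side of that timeout, the usual knobs look like this (a sketch; many shared hosts, possibly including GoDaddy, cap or ignore these settings):

<?php
// allow this long-running refresh up to 5 minutes
set_time_limit(300);
// equivalent ini route; also settable in php.ini or via .htaccess with mod_php
ini_set('max_execution_time', '300');

Note that the web server's own timeout (e.g. Apache's Timeout directive) can still cut the request off first.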
Two leads:
First, check that your file system returns the actual date/time when creating a new file. On some systems, file modification times are localized to a timezone; on others they aren't (GMT instead).
Secondly, be careful using filemtime. PHP uses a cache to avoid repeated disk access for file metadata, which is great, but I never found out how this cache is managed internally or what its lifetime is. I recommend calling clearstatcache every time you run the update script, as sketched below.
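Concretely, the cache guard from the question would gain one line; clearstatcache() discards PHP's cached stat results so the next filemtime() call hits the disk:

clearstatcache(); // drop cached stat() data so filemtime() is current
if (file_exists($cachefile) && (time() - $cachetime < filemtime($cachefile))) {
    include($cachefile);
    echo "<!-- Cached " . date('jS F Y H:i', filemtime($cachefile)) . " -->";
    exit;
}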

Log/Graph PHP execution time

Are there any tools available to log the page load time for a PHP site?
Mainly I'm looking for something that lets me see trends in load times over time. I was considering dumping them into a file using error_log(), but I don't know what I could use to parse that and display graphs.
You can record the microtime at the start of execution, hold on to that variable until the end, check the time again, subtract the two, and there you have your execution time. Output buffering will be required to make this work in most cases, unless a particular thing always runs last (like a footer()).
function microtime_float() {
    list($usec, $sec) = explode(" ", microtime());
    return ((float)$usec + (float)$sec);
}

// at the start:
$time_start = microtime_float();

// at the end:
$time_end = microtime_float();
$time = round($time_end - $time_start, 4);
echo "Last uncached content render took $time seconds";
Use the Firebug extension for Firefox; it has a Net panel that shows you load times.
If you want to do load testing, Apache comes with a utility called ApacheBench; try ab --help in a console window near you.
See PEAR Benchmark. It allows you to add benchmarks to your code. You can have it dump an HTML table on your pages, or you can loop through the data and write it to a log file.
