A friend built a ranking system on his site and I am trying to host it on mine via WordPress and GoDaddy. It updates fine for him, but when I load it on my site it works for 6 hours; as soon as the reload is supposed to occur, it fails with a 500 timeout error.
His page is at: http://www.jeremynoeljohnson.com/yakezieclub
My page is currently at http://sweatingthebigstuff.com/yakezieclub but when you add ?reload=1 it will give the error.
Any idea why this might be happening? Any settings that I might need to change?
I can give you all the code, but which part? The index.php file? I'm not sure which part is messing up. I literally uploaded the same code as him.
Here's the reload part:
$cachefile = "rankings.html";
$daycachefile = "rankings_history.xml";
$cachetime = (60 * 60) * 6; // every 6 hours, the cache refreshes
$daycachetime = (60 * 60) * 24; // every 24 hours, the history will be written to - or whenever the page is requested after 24 hours have passed
$writenewdata = false;
if (!empty($_GET['reload']))
{
    if ($_GET['reload'] == 1)
    {
        $cachetime = 1;
    }
}
if (!empty($_GET['reloadhistory']))
{
    if ($_GET['reloadhistory'] == 1)
    {
        $daycachetime = 1;
        $cachetime = 1;
    }
}
if (file_exists($daycachefile) && (time() - $daycachetime < filemtime($daycachefile)))
{
    // Do nothing
}
else
{
    $writenewdata = true;
    $cachetime = 1;
}
// Serve from the cache if it is younger than $cachetime
if (file_exists($cachefile) && (time() - $cachetime < filemtime($cachefile)))
{
    include($cachefile);
    echo "<!-- Cached " . date('jS F Y H:i', filemtime($cachefile)) . " -->";
    exit;
}
ob_start(); // start the output buffer
?>
Any particular reason you're starting output buffering after the cache file is included? If the cache file is just raw HTML, it'll have been dumped into the output stream already, followed by the cache date comment, before you start buffering.
Is it possible that something in the script (or another script) is slapping a lock on the output file, such that the reload-checking portion hangs while waiting for the lock to clear?
It seems really slow, which makes me think you're doing something very intensive or something with high latency. If your web server hits its internal timeout value, you'll get a 500 error. Optimize your code for better speed or increase your server's timeout to fix this problem.
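For example, if your host allows per-script overrides (many shared hosts don't), something like this at the top of index.php is a common first attempt:
<?php
// Raise the execution limit for this request only; whether these calls
// take effect depends on the host's configuration.
set_time_limit(300);
ini_set('max_execution_time', '300');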
If you post your server platform, I can let you know what you can do to increase the timeout.
Hope this helps.
Two leads:
First, check that your file system returns the actual date/time when creating a new file. On some systems, file modification times are localized to a timezone; on others they aren't (GMT instead).
Secondly, be careful with filemtime. PHP uses a stat cache to avoid hitting the disk repeatedly, which is great, but I never found out how this cache is managed internally or what its lifetime is. I recommend calling clearstatcache every time you run the update script.
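A minimal sketch of that, reusing the cache check from the question:
<?php
// Drop PHP's cached stat info first, so filemtime() reflects the file's
// real modification time rather than a stale cached value.
clearstatcache();
if (file_exists($cachefile) && (time() - $cachetime < filemtime($cachefile))) {
    include $cachefile;
    exit;
}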
A scheduled task needs to be created, but it's not possible to use a cron job (there is a warning from the hosting provider that "running the cron job more than once within a 45-minute period is an infraction of their rules and could result in shutting down the account").
A PHP script (which inserts data from a txt file into a MySQL database) should be executed every minute, i.e. this link should be called: http://www.myserver.com/ImportCumulusFile.php?type=dayfile&key=letmein&table=Dayfile&file=./data/Jan10log.txt
Is there any other way?
There are multiple ways of doing repetitive jobs. Some of the ways that I can think of right away are:
Using: https://www.setcronjob.com/
Use an external site like this to fire off your URL at set intervals.
Using meta refresh. More here. You'd have to open the page and leave it running.
JavaScript/Ajax refresh. Similar to the above example.
Setting up a cron job. Most shared hosts do provide a way to set up cron jobs. Have a look at the cPanel of your hosting.
If you have shell access, you could execute a PHP script via the shell. Something like this would be an endless loop that sleeps 60 seconds, executes your code, collects garbage, and repeats until the end of time:
while (true) {
    sleep(60);
    // script here
    // end your script
}
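If you start it with something like "nohup php cronloop.php &" (the script name here is hypothetical), it will keep running after you close the shell.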
Or you could do a "poor man's cron" with Ajax or meta refresh. I've done it before. Basically, you just place a redirect with either JavaScript or HTML's meta refresh at the beginning of your script. Access the script from your browser and just leave it open; it'll refresh every 60 seconds, just like a cron job.
Yet another alternative to a cron job would be a bash script such as:
#!/bin/bash
while :
do
sleep 60
wget http://127.0.0.1/path/to/cronjob.php -O Temp --delete-after
done
All this being said, you'll probably get caught by the host and be terminated anyway.
So your best solution:
Go and sign up for a 5-10 dollar a month VPS, and say goodbye to shared hosting and hello to running your own little server.
If you do this, you can even stop using crappy PHP and use Facebook's HHVM instead, and enjoy its awesome performance.
There's a free service at
http://cron-job.org
that lets you set up a nice little alternative.
Option A
An easy way to realize it would be to create a file or database entry containing the execution times of your PHP scripts:
<?php
// crons.php
return [
'executebackup.php' => 1507979485,
'sendnewsletter.php' => 1507999485
];
?>
And on every request made by your visitors, you check the current time; if it is past the stored time, you include your PHP script:
<?php
// cronpixel.php
$crons = @include 'cache/crons.php';
foreach ($crons as $script => $time) {
if ($time < time()) {
// create lock to avoid race conditions
$lock = 'cache/' . md5($script) . '.lock';
if (file_exists($lock) || !mkdir($lock)) {
continue;
}
// start your php script
include($script);
// now update crons.php
$crons[ $script ] += 86400; // tomorrow
file_put_contents('cache/crons.php', '<?php return ' . var_export($crons, true) . '; ?' . '>');
// finally delete lock
rmdir($lock);
}
}
header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");
// image data
$im = imagecreate(1, 1);
$blk = imagecolorallocate($im, 0, 0, 0);
imagecolortransparent($im, $blk);
// image output
header("Content-type: image/gif");
imagegif($im);
// free memory
imagedestroy($im);
?>
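You would then trigger it from your pages, for example by embedding <img src="cronpixel.php" width="1" height="1" alt=""> in a common footer template, so every visitor's request drives the scheduler.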
Note: It will rarely be called on the exact second, because you do not know when a visitor will open your page (maybe 2 seconds later). So it makes sense not to set the time for the next day by adding 86400 seconds; use mktime instead.
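For example (a sketch; the 03:00 run time is an arbitrary choice, not something from the code above):
<?php
// Pin the next run to a fixed time tomorrow instead of "now + 86400",
// so late pixel hits don't make the schedule drift.
$crons[$script] = mktime(3, 0, 0, (int) date('n'), (int) date('j') + 1);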
Option B
This is a little project I realized in the past. It is similar to #r3wt's idea, but it covers race conditions and runs at exact times, like a cron job in a scheduler would, without hitting max_execution_time. And most of the time it works without needing to be resurrected (as is done through visitors in Option A).
Explanation:
The script writes a lock file (to avoid race conditions) for the 15th, 30th, 45th and 60th second of a minute:
// cron monitoring
foreach ($allowed_crons as $cron_second) {
$cron_filename = 'cache/' . $cron_second . '_crnsec_lock';
// start missing cron requests
if (!file_exists($cron_filename)) {
cron_request($cron_second);
}
// restart interrupted cron requests
else if (filemtime($cron_filename) + 90 < time()) {
rmdir($cron_filename);
cron_request($cron_second);
}
}
Every time a lock file is missing, the script creates it and uses sleep() to reach the exact second:
if (file_exists($cron_filename) || !mkdir($cron_filename)) {
return;
}
// add one minute if necessary
$date = new DateTime();
$cron_date = new DateTime();
$cron_date->setTime($cron_date->format('H'), $cron_date->format('i'), $sec);
$diff = $date->diff($cron_date);
if ($diff->invert && $diff->s > 0) {
$cron_date->setTime($cron_date->format('H'), $cron_date->format('i') + 1, $sec);
}
$diff = $date->diff($cron_date);
// we use sleep() as time_sleep_until() starts one second too early (https://bugs.php.net/bug.php?id=69044)
sleep($diff->s);
After waking up again, it sends a request to itself through fopen():
// note: filter_input returns the unchanged SERVER var (http://php.net/manual/de/function.filter-input.php#99124)
// note: filter_var is unsecure (http://www.d-mueller.de/blog/why-url-validation-with-filter_var-might-not-be-a-good-idea/)
$url = 'http' . isSecure() . '://' . filter_input(INPUT_SERVER, 'HTTP_HOST', FILTER_SANITIZE_URL) . htmlspecialchars($request_uri, ENT_QUOTES, 'UTF-8');
$context = stream_context_create(array(
'http' => array(
'timeout' => 1.0
)
));
// note: returns "failed to open stream: HTTP request failed!" because timeout < time_sleep_until
if ($fp = #fopen($url, 'r', false, $context)) {
fclose($fp);
}
rmdir($cron_filename);
In that way it calls itself indefinitely, and you are able to define different starting times:
if (isset($_GET['cron_second'])) {
    $cron_second = (int) $_GET['cron_second']; // derive the value from the GET parameter
    if ($cron_second === 0 && !(date('i') % 15)) {
        mycron('every 15 minutes');
    }
    if ($cron_second === 0 && !(date('i') % 60)) {
        mycron('every hour');
    }
}
Note: It produces 5760 requests per day (4 per minute). Not much, but a cron job uses far fewer resources. If your max_execution_time is high enough, you could change it to call itself only once per minute (1440 requests/day).
I understand that this question is a bit old, but I stumbled on it a week ago with this very question, and the best and most secure option we found was using a web service.
Our context:
We have our system on both shared hosting and private clouds.
We need a script to be activated once a month (there are plans to create more schedules and to allow users to create some predetermined actions).
Our system provides access to many clients, so when anyone uses the system it calls a web service via Ajax and doesn't care about the response (after all, everything is logged in our database and must run without user interaction).
What we've done is:
1 - An Ajax call is made upon access to any major screen.
2 - The web service reads a schedule table in our database and calls whatever needs calling.
3 - To avoid many stacked web service calls, we check the datetime and require a 10-minute interval before actually performing any actions.
That's also a way to distribute the load, and the schedules don't affect the users' interaction with the system.
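A rough sketch of steps 2 and 3 (the table and column names here are made up for illustration, not taken from our actual system):
<?php
// webservice.php - called fire-and-forget via Ajax from any major screen.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
// Only consider entries whose last run is at least 10 minutes old.
$due = $pdo->query(
    "SELECT id, script, last_run FROM schedule
     WHERE last_run < NOW() - INTERVAL 10 MINUTE"
);
foreach ($due as $row) {
    // Claim the row first, so concurrent Ajax calls don't double-run it.
    $claim = $pdo->prepare(
        "UPDATE schedule SET last_run = NOW() WHERE id = ? AND last_run = ?"
    );
    $claim->execute(array($row['id'], $row['last_run']));
    if ($claim->rowCount() === 1) {
        include $row['script']; // run and log; the Ajax caller ignores the response
    }
}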
I am working on an app which will allow me to log in to a remote telnet server and monitor statistics. The problem is that the telnet server has a minimum refresh rate of 10 seconds, and the refresh rate varies slightly depending on server load (the report itself has this refresh rate, not the server). I need the front end of this system to refresh more often than every 10 seconds (5 seconds minimum). I have somewhat been able to accomplish this by forking, but eventually the timings synchronize (a gradual process which I am assuming is due to remote server load).
Child code (I have removed quite a bit of code relating to how the app logs in and accesses the report; it is basically key-presses through 7 menus to get to the page which refreshes. I can include it if needed):
// 1 - Telnet handshaking and initial screen load
$sHandshaking = '<IAC><WILL><OPT_BINARY><IAC><DO><OPT_BINARY><IAC><WILL><OPT_TSPEED><IAC><SB><OPT_TSPEED><IS>38400,38400<IAC><SE>';
// 2-4 - Keypresses to get to report (login, menus, etc. - removed)
// Loop and cache
while (1) {
    // 4 - View report
    $oVT100->listen();
    $reference = $temp;
    $screen = $oVT100->getScreenFull();
    if (empty($screen)) {
        Failed:
        echo "FAILED";
        file_put_contents($outFile, array('html' => "<div class=\"header\"><font color='red'>Why are things always breaking?!</font></div>"));
        goto restartIt; // If screen does not contain a valid report, assume logout and start at top
    } else {
        $screen = parseReport($oVT100->getScreenFull());
        $temp = json_decode($screen);
        // Check the old report file; if different, save, else sleep a bit
        $currentFile = file_get_contents($outFile);
        if ($screen !== $currentFile) {
            file_put_contents($outFile, $screen);
            sleep(5);
        } else {
            sleep(1);
            //usleep(500000);
        }
    }
}
As you can see from the child's code above, it logs in and then loops over the report indefinitely, writing the screen out to a cache file (if it differs from the existing file). As a dirty workaround to the refresh issue, I wrote a parent to fork:
$pids = array();
for ($i = 0; $i < 2; $i++) {
    sleep(5);
    $pids[$i] = pcntl_fork();
    if (!$pids[$i]) {
        require_once(dirname(__FILE__) . '/checkQueue.php');
        exit();
    }
}
for ($i = 0; $i < 2; $i++) {
    pcntl_waitpid($pids[$i], $status, WUNTRACED);
}
I immediately noticed the timing offset and had to spawn three children instead of two to stay under 5 seconds, but the timing worked for a few days. Eventually, all the children were updating the file at the same time and I had to restart the parent. What would be the best solution for monitoring the child processes so that I maintain a refresh interval of less than five seconds?
[EDIT]
This is not a running log; it is a file which contains current call statistics for a call center. The data in the cache file needs to be no more than 5 seconds old at any time, but eventually all of the children sync up and write to the file at nearly the same time. The issue isn't really with the local file; it is the inconsistent remote server response times, which eventually lead to the running child processes getting their reports at the same time.
I'm hoping there's a better solution, but here's how I solved it - I feel kind of stupid for not having thought of this in the beginning:
$currentFile = file_get_contents($outFile);
if ($screen !== $currentFile) {
    if (time() - filemtime($outFile) > 2) {
        file_put_contents($outFile, $screen);
        sleep(5);
    } else {
        echo "Failed, sleeping\r";
        sleep(2);
    }
} else {
    sleep(1);
    //usleep(500000);
}
Basically, I check the write time of the cache file, and if it was less than 2 seconds ago, I sleep and re-execute the loop. I was somewhat hoping there would be a solution contained within the parent app, but I guess this works...
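For what it's worth, an exclusive lock would remove the race entirely instead of just narrowing the window. A minimal sketch, reusing $outFile and $screen from the code above:
<?php
// Only one child can hold the lock at a time; the others skip and retry.
$fp = fopen($outFile, 'c');
if ($fp !== false && flock($fp, LOCK_EX | LOCK_NB)) {
    ftruncate($fp, 0);
    fwrite($fp, $screen);
    fflush($fp);
    flock($fp, LOCK_UN);
    fclose($fp);
    sleep(5);
} else {
    if ($fp !== false) {
        fclose($fp);
    }
    sleep(1); // another child is writing; back off briefly
}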
I am looking for a way to check the user's connection speed. It is supposed to be saved as a cookie, and JavaScript files as well as CSS files will be adapted if the speed is slow.
The possibility for testing speed I have at the moment is the following:
$kb = 512;
flush();
echo "<!--";
$time = explode(" ", microtime());
for ($x = 0; $x < $kb; $x++) {
    echo str_pad('', 512, '.');
    flush();
}
$time_end = explode(" ", microtime());
echo "-->";
$start = $time[0] + $time[1];
$finish = $time_end[0] + $time_end[1];
$deltat = $finish - $start;
return round($kb / $deltat, 3);
While it works, I do not like having to put so many characters into my output; also, once I have echoed all this, I cannot save the result in a cookie because output has already been sent.
Could one do something like this in a different file or something? Do you have any solution?
Thanks in advance.
Do you have any solution?
My solution is to not bother with the speed test at all. Here's why:
You stated that the reason for the test is to determine which JS/CSS files to send. You have to keep in mind that browsers will cache these files after the first download (as long as they haven't been modified). So in effect, you are sending 256K of test data to determine whether you should send, say, an additional 512K?
Just send the data and it will be cached. Unless you have MBs of JS/CSS (in which case you need a site redesign, not a speed test), the download time will be doable. Speed tests should be reserved for things such as streaming video and the like.
The only idea I can come up with is a redirect:
Measure the user's speed.
Redirect to the index.
While this isn't a nice solution, it only needs to measure the user's speed once, so I think it's excusable.
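A sketch of that redirect idea (the file names are hypothetical, and flush() only measures how fast PHP can hand data to the network stack, so treat the number as approximate):
<?php
// speedtest.php - measure once, then bounce to the real page.
$kb = 256;
$start = microtime(true);
echo '<!--';
for ($i = 0; $i < $kb; $i++) {
    echo str_pad('', 1024, '.'); // 1 kB per iteration
    flush();
}
echo '-->';
$kbps = round($kb / (microtime(true) - $start), 3);
// setcookie() is unavailable once output has been flushed, so pass the
// result along and let index.php store it in the cookie.
echo '<script>window.location = "index.php?speed=' . $kbps . '";</script>';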
How about using JavaScript to time how long it takes to load a page, then using JavaScript to set the cookie?
microtime in JavaScript: http://phpjs.org/functions/microtime:472
Using jQuery
<head>
<!-- include jquery & other html snipped -->
<script>
function microtime (get_as_float) {
// http://kevin.vanzonneveld.net
// + original by: Paulo Freitas
// * example 1: timeStamp = microtime(true);
// * results 1: timeStamp > 1000000000 && timeStamp < 2000000000
var now = new Date().getTime() / 1000;
var s = parseInt(now, 10);
return (get_as_float) ? now : (Math.round((now - s) * 1000) / 1000) + ' ' + s;
}
function setCookie(c_name, value, expiredays) {
var exdate=new Date();
exdate.setDate(exdate.getDate()+expiredays);
document.cookie=c_name+ "=" +escape(value)+
((expiredays==null) ? "" : ";expires="+exdate.toUTCString());
}
start = microtime(true);
$(window).load(function () {
// everything finished loading
end = microtime(true);
diff = end - start;
// save in a cookie for the next 30 days
setCookie('my_speed_test_cookie', diff, 30);
});
</script>
</head>
<body>
<p>some page to test how long it loads</p>
<img src="some_image_file.png">
</body>
Some pitfalls:
- The page would need to start loading first, and jQuery would need to be loaded (or you can rework the above code to avoid jQuery).
- Testing speed on ASCII/Latin data may not give the best result, because the characters may get compressed. Besides high-level gzip compression, some modems/lines (if not all) have basic compression that is able to detect repeating characters and tell the other end that the next 500 are a repeat of ' '. I guess it would be best to use binary data that has already been compressed.
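Along the lines of that second pitfall, here is a sketch of serving a payload that compression can't shrink (requires PHP 7+ for random_bytes; the file name is made up):
<?php
// speedpayload.php - incompressible test data wrapped in an HTML comment.
// base64 keeps the random bytes safe to embed, and random data defeats
// both gzip and modem-level run-length compression.
$raw = random_bytes(384 * 1024); // about 512 kB once base64-encoded
echo '<!--' . base64_encode($raw) . '-->';
flush();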
The problem here is that you can't really solve this nicely, and probably not in pure PHP. The approach you've taken will make the user download (512 x 512) = 262,144 bytes of useless data, which is much bigger than most complete pages. If the user is on a slow connection, they may assume your site is down before the speed test is over (at 10 kB/sec, it'd take half a minute before anything interesting shows up on screen!).
You could make an AJAX request for a file of a known size and time how long that takes. The problem here is that the page needs to be already loaded for that to work, so it'd only work for subsequent pages.
You could make a "loading" page (like you see on GMail when accessing it from a slow connection) that preloads the data, with a link to the low-bandwidth version (or maybe a redirect if the loading is taking too long).
Or you could save the "start" time in the cookie and make an AJAX request when the page is done loading - that would give you the actual loading time of your page; if that's, say, over 10 seconds, you may want to switch to the low-bandwidth version.
None of these, however, will get you the speed on the very first access; and sending a big empty page up front is not a very good first impression either.
You visit the first page (maybe 100 kB with all external files); a session is immediately started with
$_SESSION["start_time"] = time();
When the page has finished loading (jQuery window load or something), you send a request again with the time. You compute the speed (PageSize / (jQueryRequestTime - $_SESSION["start_time"])) and set another session variable; the next link the user clicks can then include the custom CSS/JS approved for that speed.
Of course this is not perfect :)
After you've determined the user's speed, send JavaScript to the browser to set the cookie, and then do a refresh or redirect in cases where the speed is below what you'd like.
The only thing I can think of would be to subscribe to a service which offers an IP-to-connection-speed lookup. These services work by building a database of IP addresses and cataloging their registered intended use. They're not always accurate, but they do provide a starting point. Look up the user's IP address against one of these and see what it returns.
IP2Location.com provides such a database, beginning with their DB13 product.
Of course, if your goal is a mobile version of the site, user-agent sniffing is a better solution.
Originally, I just wanted to verify that session_start locks the session. So I created a PHP file as below. Basically, if the page view count is even, the page sleeps for 10 seconds; if it is odd, it doesn't. And session_start is used to obtain the page view count from $_SESSION.
I tried to access the page in two tabs of one browser. It is not surprising that the first tab takes 10 seconds, since I explicitly let it sleep. The second tab should not sleep, but it should be blocked by session_start. That works as expected.
To my surprise, the output of the second page shows that session_start takes almost no time. Actually, the whole page seems to take no time to load. But the page does take 10 seconds to show in the browser.
obtained lock
Cost time: 0.00016689300537109
Start 1269739162.1997
End 1269739162.1998
allover time elapsed : 0.00032305717468262
The page views: 101
Does PHP extract session_start out of the PHP page and execute it before the other PHP statements?
This is the code.
<?php
function float_time()
{
    list($usec, $sec) = explode(' ', microtime());
    return (float)$sec + (float)$usec;
}
$allover_start_time = float_time();
$start_time = float_time();
session_start();
echo "obtained lock<br/>";
$end_time = float_time();
$elapsed_time = $end_time - $start_time;
echo "Cost time: $elapsed_time <br>";
echo "Start $start_time<br/>";
echo "End $end_time<br/>";
ob_flush();
flush();
if (isset($_SESSION['views']))
{
    $_SESSION['views'] += 1;
}
else
{
    $_SESSION['views'] = 0;
}
if ($_SESSION['views'] % 2 == 0)
{
    echo "sleep 10 seconds<br/>";
    sleep(10);
}
$allover_end_time = float_time();
echo "allover time elapsed : " . ($allover_end_time - $allover_start_time) . "<br/>";
echo "The page views: " . $_SESSION['views'];
?>
That seems to be a Firefox-related "issue". If you request the same URL in two tabs/windows, the second request waits until the first request is finished (it could also be an add-on that blocks the second request; I haven't tested that).
Take, for example:
<?php // test.php
$start = microtime(true);
echo "<pre>start: $start</pre>";
sleep(5);
$end = microtime(true);
echo '<pre>', $start, "\n", $end, "\n", $end-$start, '</pre>';
I called it twice and the output was
start: 1269742677.6094
1269742677.6094
1269742682.609
4.9995958805084
and
start: 1269742682.6563
1269742682.6563
1269742687.6557
4.9994258880615
Note that there's already a 5 second gap between the start times.
When called as http://localhost/test.php and http://localhost/test.php?a=b instead of the exact same url twice this does not happen.
Both IE8 and Chrome do not show that behavior.
Yes, this could be because of session_start() blocking other requests in the same session (file-based). I was able to verify the issue in Firefox (4.x) and Chrome (10.x) on Windows XP/PHP 5.2 with the default session handler (files). I am not sure if this issue is reproducible with non-file session handlers.
obtained lock
**Cost time: 9.90100598335**
Start 1303227658.67
End 1303227668.57
sleep 10 seconds
allover time elapsed : 19.9027831554
The page views: 4
This is a very interesting issue, and the Firefox tab locking described in the answer above would have masked it from being detected.
http://www.php.net/manual/en/function.session-start.php#101452
Since PHP does not have a container, how do two calls to the same session get serialized? Who does this? How do the two processes talk? Is the PHP module always active, spawning threads only after doing the session check? In that case the PHP module is indeed behaving like a container that is, in this case, providing session management to this extent.
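For anyone who wants to see the lock directly, here is a minimal demo (assuming the default file-based session handler); open it in two browser tabs and watch the second one wait:
<?php
// lockdemo.php - session_start() takes an exclusive lock on the session file.
session_start();
echo "lock acquired: " . microtime(true) . "\n";
sleep(5); // hold the lock; a second tab blocks inside session_start()
session_write_close(); // release it; the second request can now proceed
echo "lock released: " . microtime(true) . "\n";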
So, on my arcade, howlingdoggames.com, I have a points system that gives you a point every time you visit a page with a game on it. To reduce abuse of this, I would like to add some sort of delay, so the point is only awarded after 45 seconds. Here's what I've tried:
if ($_SESSION['lastgame'] != $gameid) {
    sleep(45);
    $points = $points + $game_points;
    $_SESSION['lastgame'] = $gameid;
}
But this just seems to halt my whole website for 45 seconds, because this is in index.php, along with a lot of other code for my site.
Is there any way I can isolate that bit of code, so that only the statement
$points = $points + $game_points;
waits for 45 seconds?
There is (mostly) no multithreading in PHP. You can sort of do this with forked processes on Unix systems, but that's irrelevant, because multithreading isn't really what you're after. You just want simple logic like this:
$now = time();
session_start();
$last = $_SESSION['lastvisit'] ?? null;
if ($last === null || $now - $last > 45) {
    $points = $_SESSION['points'] ?? 0;
    $_SESSION['points'] = $points + 10;
    $_SESSION['lastvisit'] = $now;
}
Basically, only give the points if the time since you last gave points is greater than 45 seconds.
It is the session that blocks your script, not "there is no multithreading in PHP".
Calling session_write_close() before sleep() will stop it from blocking your whole script, but that may not fit your problem.
In that case you would have to save the bonus using JavaScript's setTimeout and AJAX.
From a comment on sleep() at php.net:
http://www.php.net/manual/en/function.sleep.php#96592
Notice that sleep() delays execution for the current session, not just the script. Consider the following sample, where two computers invoke the same script from a browser, which doesn't do anything but sleep.
No, not directly. You need to take a different approach, like remembering the timestamp of the last visit and only adding points if a sufficient amount of time has passed since then.
There is no multithreading in PHP, so sleep() is always going to block your whole script.
The way you should solve this is to record the time of the last game, and only award points if the current time is more than 45 seconds later than that.
<?php
session_start();
if (!isset($_SESSION['last_game_time'])
|| (time() - $_SESSION['last_game_time']) > 45) {
// code to award points here
$_SESSION['last_game_time'] = time();
}
Bear in mind that users could still abuse this if they disable cookies (they will then have no session data). So if that really worries you, check that they have cookies enabled before allowing them to use the feature (there are probably several questions that cover this).
Instead of blocking the script, save the current time in the session, don't add the points, and let the page render. Then, on later page views, if you see that the saved time in the session is older than 45 seconds, add the points, store them wherever you need, and clear the time.
You cannot, but you can move this logic to JavaScript and save the bonus using AJAX.
The problem is session_start();
Try:
if ($_SESSION['lastgame'] != $gameid) {
    session_write_close(); // release the session lock so other requests can proceed
    sleep(45);
    session_start();       // reopen the session to record the award
    $points = $points + $game_points;
    $_SESSION['lastgame'] = $gameid;
}