We serve several high-load sites, and so far we have come up with this code for caching remote banners, with support for some basic local banner rotation. It should be fairly easy to understand; what I would like to hear from you are possible improvements or suggestions on this code.
Here it is...
$cachetime = 6 * 60 * 60; // 6 hours
$bannercache = $_SERVER['DOCUMENT_ROOT']."/banner-".$bpos.".txt";
// Serve from the cache if it is younger than $cachetime
if (file_exists($bannercache) && (time() - $cachetime < filemtime($bannercache))) {
    // cache is still fresh, don't update from remote
} else {
    // cache is stale, update from remote
    $bannercachecontent = @file_get_contents('http://ADSERVER.com/showad.php?category='.$adcat.'&dimensions='.$dimensions);
    if ($bannercachecontent === FALSE) {
        // on error, just touch the file to bump its mtime, so the remote
        // is not pulled again right away in case of remote MySQL overload
        $fb = @fopen($bannercache, 'a+');
        fwrite($fb, "\n<!-- Changed date on ".date('jS F Y H:i')."-->\n");
        fclose($fb);
    } else {
        // on success, save the new local file
        $fb = @fopen($bannercache, 'w');
        fwrite($fb, $bannercachecontent);
        fwrite($fb, "\n<!-- Cached ".date('jS F Y H:i')."-->\n");
        fclose($fb);
    }
}
$fhm = file_get_contents($bannercache);
$fhmpos = strpos($fhm, '-----#####-----'); // check if it needs to be exploded for rotation
if ($fhmpos === false) {
    echo $fhm;
} else {
    $fhmpicks = explode("-----#####-----", $fhm);
    // drop the empty segments the separator leaves behind
    foreach ($fhmpicks as $fhmkey => $fhmvalue) {
        if (trim($fhmvalue) == '') {
            unset($fhmpicks[$fhmkey]);
        }
    }
    $fhmpick = array_rand($fhmpicks);
    echo $fhmpicks[$fhmpick]; // show only one banner
}
Don't let your clients update your banners. There will always be one user whose request has to do the update, and that isn't necessary.
Instead: let your clients always load the local image, and run a background process (cron, with a manual override if you want to pull the image NOW) that fetches the right image(s).
The code would be fairly simple. Your client-facing page just reads the local file:
$yourImage = $_SERVER['DOCUMENT_ROOT']."/banner-".$bpos.".txt";
and your update script can just cURL the right image, as in the sketch below.
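A minimal sketch of such an update script, assuming the same ADSERVER.com endpoint and banner file as above ($adcat, $dimensions and $bpos are placeholders to fill from your own config); run it from cron, e.g. 0 */6 * * * php /path/to/update-banner.php:
<?php
// update-banner.php - refresh the local banner cache from the ad server.
// DOCUMENT_ROOT is not set under the CLI, so the web root is hard-coded here.
$docRoot = '/var/www/html'; // assumption: adjust to your web root
$adcat = 'news';            // hypothetical category
$dimensions = '468x60';     // hypothetical dimensions
$bpos = 1;                  // banner position
$cacheFile = $docRoot."/banner-".$bpos.".txt";

$ch = curl_init('http://ADSERVER.com/showad.php?category='.urlencode($adcat).'&dimensions='.urlencode($dimensions));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$content = curl_exec($ch);
curl_close($ch);

// Only overwrite the cache on a successful, non-empty fetch;
// on failure the clients simply keep serving the previous banner.
if ($content !== false && trim($content) !== '') {
    file_put_contents($cacheFile, $content, LOCK_EX);
}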
I am opening two windows in one session, and they should behave differently. As the windows open at the same time, they use the same options and interfere with each other. Therefore I need to check in the code how each one was opened.
If it was opened with something like
window.open(href, target='pdf');
I would want to check this
if (tab.target == "pdf")
{
    $check_target = 1;
}
else
{
    $check_target = 2;
}
now "tab.target" does not exist - can I achieve this somehow?
Thanks!
Max
I would add something to the URL, like this:
window.open(href+'?mytarget=pdf', target='pdf');
Then in your php code:
if ($_GET['mytarget'] == "pdf")
{
    $check_target = 1;
}
else
{
    $check_target = 2;
}
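One small hardening, as an aside: if a window is ever opened without the query string, $_GET['mytarget'] is unset and PHP raises a notice, so it may be worth guarding the read:
if (isset($_GET['mytarget']) && $_GET['mytarget'] == "pdf")
{
    $check_target = 1;
}
else
{
    $check_target = 2;
}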
I have made a script that checks a server's availability.
The site was down and I was awaiting a fix (I was on call for a client and waiting for a ticket from the provider). To limit calls, I used sleep():
$client = new \GuzzleHttp\Client();
$available = false;
date_default_timezone_set('doesntMatter');
//The server was more likely to respond after 5 AM, hence the decrease in interval
$hours = array( //Minutes between calls based on current hour
0=>30,
1=>30,
2=>30,
3=>30,
4=>20,
5=>20,
6=>10,
7=>10,
8=>10
);
$lastResponse = null;
while (!$available) {
    $time = time();
    $hour = date('G', $time);
    echo "\n Current hour ".$hour;
    try {
        $crawler = $client->request('GET', 'www.someSiteToCheck.com');
        $available = true; // when the server returns a status code of 200, $available is set to TRUE
    } catch (\GuzzleHttp\Exception\ServerException $e) {}
    if (!$available) {
        $secondsToSleep = $hours[$hour]*60;
        echo "\n Sleeping for ".$secondsToSleep;
        sleep($hours[$hour]*$secondsToSleep); // sleep until the next request
    } else {
        exec('start ringtone.mp3'); // blast my stereo to wake me up
    }
}
Problem:
When I started the script, it went into an 1800-second sleep and froze; it didn't re-execute anything.
Given:
I tested my script with a sleep of 160 (for example) and it made multiple calls
Checked my power settings so that the machine wouldn't go in stand-by
Checked error logs
(Even if obvious) I ran in CLI
Checked sleep() documentation for issues but nothing
Couldn't find anything related
I think you have an error in your logic.
For example:
When it's 5 AM,
then $secondsToSleep is 20*60 = 1200 sec.
When you call the sleep function, you multiply it by 20 again:
sleep($hours[$hour]*$secondsToSleep); => sleep(20*1200) => 24000 sec => 6.66... hours
If you simply update your sleep parameter the result should be as expected.
if (!$available) {
    $secondsToSleep = $hours[$hour]*60;
    echo "\n Sleeping for ".$secondsToSleep;
    sleep($secondsToSleep); // sleep until the next request
}
I have an IP camera which writes /video/feed/1.jpg (mounted on a RAM drive) at approximately 5 fps. Sometimes this can drop below 1 fps if the connection is poor.
I'm trying to update the image in a browser every 500 ms, but I have two goals:
I don't want the server to output the same image if it has not been updated by the camera.
I don't want to output a partial image if the camera has not finished writing it yet.
I tried to achieve this by creating an MD5 hash of the image and storing it in the session; if on the next browser request the MD5 is unchanged, the server loops until it differs. The server also loops until two consecutive reads produce the same MD5, so I can be sure the camera has finished writing the image.
The process works as expected, but the CPU usage goes through the roof, so I'm looking for suggestions on improvements.
test.php
<?php
session_start();
$imgFile = '/video/feed/1.jpg';
$lastImg = $_SESSION['image'];
$imgMd5 = 0;
do {
    sleep(.2);
    $img = file_get_contents($imgFile);
    $lastMd5 = $imgMd5;
    $imgMd5 = md5($img);
    if ($lastMd5 != $imgMd5) {
        continue; // file changed between reads: still being written
    }
    if ($imgMd5 != $lastImg) {
        break; // stable and different from the image served last time
    }
} while (0 == 0);
header("Content-type: image/jpeg");
$_SESSION['image'] = md5($img);
echo $img;
exit;
?>
JS
<script>
img = new Image
function f() {
img.src = "test.php?rnd=" + Date.now();
img.onload = function() {
feed.src = img.src;
setTimeout(function() {
f();
}, 500);
};
img.onerror= function(){
setTimeout(function() {
f();
}, 500);
};
}
f();
</script>
What I really needed was usleep(200000). sleep(.2) did not work as I expected: sleep() takes an integer number of seconds, so the 0.2 is truncated to 0 and the loop never pauses, which is where the CPU time was going.
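For reference, the two calls behave very differently:
sleep(.2);      // the float is coerced to int 0: no pause, the loop spins at full speed
usleep(200000); // takes microseconds: pauses for the intended 200 ms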
Instead of using an MD5 hash to check whether the file changed, use the last modified time, which should be less resource-intensive than computing an MD5 hash each time (though for some reason it still uses a lot of CPU on my Windows machine, so I'm not sure). You can try my code:
<?php
session_start();
$path = "/video/feed/1.jpg";
if (isset($_SESSION["lastmodified"])) { // if there's no previous timestamp, skip the wait and serve the image straight away
    while ($_SESSION["lastmodified"] == filemtime($path)) {
        sleep(1); // sleep while the timestamp is the same as before
        clearstatcache(); // PHP caches file information such as the last modified timestamp, so we need to clear it each time or we'll run into an infinite loop
    }
}
$_SESSION["lastmodified"] = filemtime($path); // store the new timestamp
header("Content-type: image/jpeg");
echo file_get_contents($path);
?>
Edit: you can also just let your web server handle all of this by serving the RAM disk directory directly, setting appropriate Cache-Control headers, and using a meta refresh tag to reload the page every second. When the page reloads, the web server will send a new image only if one exists (and if not, it will just return "not modified" and the browser will show the last image from its cache). A sketch of this follows.
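A minimal sketch of that idea, assuming the RAM disk is exposed at /video/feed/ (a feed.php that only emits the refresh page; the server handles conditional requests for the image itself):
<?php
// feed.php - let the browser and web server do the work:
// the meta refresh reloads the page every second, and the server
// answers "304 Not Modified" for 1.jpg unless the camera wrote a new frame.
header("Cache-Control: no-cache");
?>
<!DOCTYPE html>
<html>
<head><meta http-equiv="refresh" content="1"></head>
<body><img src="/video/feed/1.jpg"></body>
</html>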
I noticed that some users are overloading my website by downloading many files (for example, 500 files at the same time) and opening many pages in a short period. I want to show a captcha when such unexpected navigation is detected.
I know how to implement a captcha, but I can't figure out the best approach to detect this kind of traffic abuse in PHP.
A common approach is to use something like memcached to store the requests on a minute basis. I have open-sourced a small class that achieves this: php-ratelimiter.
If you are interested in a more thorough explanation of why the requests need to be stored on a minute basis, check this post.
So to sum it up, your code could end up looking like this:
if (!verifyCaptcha()) {
    $rateLimiter = new RateLimiter(new Memcache(), $_SERVER["REMOTE_ADDR"]);
    try {
        $rateLimiter->limitRequestsInMinutes(100, 5);
    } catch (RateExceededException $e) {
        displayCaptcha();
        exit;
    }
}
Actually, the code works on a per-minute basis, but you can quite easily adapt it to work per 30 seconds:
private function getKeys($halfminutes) {
    $keys = array();
    $now = time();
    for ($time = $now - $halfminutes * 30; $time <= $now; $time += 30) {
        $keys[] = $this->prefix . date("dHis", $time);
    }
    return $keys;
}
Introduction
A similar question has been answered before: Prevent PHP script from being flooded. But it might not be sufficient, for these reasons:
It uses $_SERVER["REMOTE_ADDR"], and some shared connections have the same public IP address
There are many Firefox add-ons that allow users to use a different proxy for each request
Multiple Request != Multiple Download
Preventing multiple requests is totally different from preventing multiple downloads. Why?
Let's imagine a 10 MB file that takes 1 min to download. If you limit users to, say, 100 requests per minute, you are still giving each user access to download
10MB * 100 per min
To fix this issue you can look at Download - max connections per user?.
Multiple Request
Back to page access: you can use SimpleFlood (the class shown below), which extends Memcache, to limit users per second. It uses cookies to resolve the shared-connection issue and attempts to get the real IP address.
$flood = new SimpleFlood();
$flood->addserver("127.0.0.1"); // add memcache server
$flood->setLimit(2); // expect 1 request every 2 sec
try {
    $flood->check();
} catch ( Exception $e ) {
    sleep(2); // feel like wasting time
    // Display captcha
    // Write message to log
    printf("%s => %s %s", date("Y-m-d g:i:s"), $e->getMessage(), $e->getFile());
}
Please note that SimpleFlood::setLimit(float $float) accepts floats, so you can have
$flood->setLimit(0.1); // expect 1 request every 0.1 sec
Class Used
class SimpleFlood extends \Memcache {
    private $ip;
    private $key;
    private $penalty = 0;
    private $limit = 100;
    private $mins = 1;
    private $salt = "I like TO dance A #### Lot";

    function check() {
        $this->parseValues();
        $runtime = floatval($this->get($this->key));
        $diff = microtime(true) - $runtime;
        if ($diff < $this->limit) {
            throw new Exception("Limit Exceeded By : $this->ip");
        }
        $this->set($this->key, microtime(true));
    }

    public function setLimit($limit) {
        $this->limit = $limit;
    }

    private function parseValues() {
        $this->ip = $this->getIPAddress();
        if (! $this->ip) {
            throw new Exception("Where the hell is the ip address");
        }
        if (isset($_COOKIE["xf"])) {
            $cookie = json_decode($_COOKIE["xf"]);
            if ($this->ip != $cookie->ip) {
                unset($_COOKIE["xf"]);
                setcookie("xf", null, time() - 3600);
                throw new Exception("Last IP did not match");
            }
            if ($cookie->hash != sha1($cookie->key . $this->salt)) {
                unset($_COOKIE["xf"]);
                setcookie("xf", null, time() - 3600);
                throw new Exception("Nice Faking cookie");
            }
            if (strpos($cookie->key, "floodIP") === 0) {
                // once the cookie is trusted, swap the IP-based key for a random one
                $cookie->key = "floodRand" . bin2hex(mcrypt_create_iv(50, MCRYPT_DEV_URANDOM));
            }
            $this->key = $cookie->key;
        } else {
            $this->key = "floodIP" . sha1($this->ip);
            $cookie = (object) array(
                "key" => $this->key,
                "ip" => $this->ip
            );
        }
        $cookie->hash = sha1($this->key . $this->salt);
        $cookie = json_encode($cookie);
        setcookie("xf", $cookie, time() + 3600); // expires in 1 hr
    }

    private function getIPAddress() {
        // check the proxy headers first, then fall back to REMOTE_ADDR
        foreach ( array(
            'HTTP_CLIENT_IP',
            'HTTP_X_FORWARDED_FOR',
            'HTTP_X_FORWARDED',
            'HTTP_X_CLUSTER_CLIENT_IP',
            'HTTP_FORWARDED_FOR',
            'HTTP_FORWARDED',
            'REMOTE_ADDR'
        ) as $key ) {
            if (array_key_exists($key, $_SERVER) === true) {
                foreach ( explode(',', $_SERVER[$key]) as $ip ) {
                    if (filter_var($ip, FILTER_VALIDATE_IP) !== false) {
                        return $ip;
                    }
                }
            }
        }
        return false;
    }
}
Conclusion
This is a basic prove of concept and additional layers can be added to it such as
Set different limit for differences URLS
Add support for penalties where you block user for certain number of Mins or hours
Detection and Different Limit for Tor connections
etc
I think you can use sessions in this case.
Initialize a session to store a timestamp (use microtime() for better results), then take a timestamp on each new page. The difference can be used to analyze the frequency of page visits, and a captcha can be shown when it gets too high.
You can also run a counter of pages visited and use a 2-D array to store the page and timestamp. If the number of pages visited rises suddenly, you can check the timestamp differences. A minimal sketch of the idea follows.
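Something like this, where the thresholds and the displayCaptcha() helper (borrowed from the answer above) are placeholders to tune:
<?php
session_start();

$now = microtime(true);
$last = isset($_SESSION['last_request']) ? $_SESSION['last_request'] : 0.0;
$_SESSION['last_request'] = $now;

// Count consecutive requests that arrive less than 2 seconds apart.
if ($now - $last < 2.0) {
    $_SESSION['fast_hits'] = isset($_SESSION['fast_hits']) ? $_SESSION['fast_hits'] + 1 : 1;
} else {
    $_SESSION['fast_hits'] = 0; // a normal pace resets the counter
}

// After 10 rapid-fire requests in a row, demand a captcha.
if ($_SESSION['fast_hits'] > 10) {
    displayCaptcha();
    exit;
}
?>
Keep in mind this relies on the session cookie, so a client that discards cookies gets a fresh session on every request; it is a first line of defense, not a replacement for the memcached approaches above.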
I'm trying to write a small PHP script that uses a session to store a folder structure of images. Every time the page gets called, it reads the next image from the session list and serves it with the matching content type. When I call my script, I sometimes get not the next image in the list but the one after that. When I write to an output file to log every page request, I see that there was more than one request. But when I look at my Firebug timeline I don't see more than one request, and there is no JavaScript running. If I show the image as part of a normal HTML page, everything works great. So what is going on here?
Would be nice if somebody could help me with this...
<?php
include("readDir.class.php");
define("IMAGE_SOURCE_PATH", "img");
session_start();

// Initialize new session context
try {
    if (!isset($_SESSION['id']))
        initSessionContext();
} catch (Exception $e) {
    exit();
}

$fotos = $_SESSION['fotos'];

// Handle wrap-around
try {
    if ($_SESSION['id'] >= count($fotos))
        initSessionContext();
} catch (Exception $e) {
    exit();
}

$foto = $fotos[$_SESSION['id']];
if (strcasecmp($_SERVER['REQUEST_METHOD'], "get") == 0)
    $_SESSION['id'] += 1;

// Error in session context: return nothing
if (empty($foto))
    exit();

switch (readDir::extension($foto)) {
    case "png":
        header('Content-Type: image/png');
        break;
    case "jpg": // fall through to jpeg case
    case "jpeg":
        header('Content-Type: image/jpeg');
        break;
}

$fp = fopen("test.txt", "a");
fwrite($fp, $foto."\r\n");
fclose($fp);

header("Cache-Control: no-cache, must-revalidate"); // HTTP/1.1
readfile(IMAGE_SOURCE_PATH."/".$foto);
//echo $foto."<br>";
//echo '<img src="'.IMAGE_SOURCE_PATH."/".$foto.'" />';

//--------------- F U N C T I O N S -------------------------------

function initSessionContext()
{
    $_SESSION['id'] = 0;
    $_SESSION['fotos'] = getNewData(IMAGE_SOURCE_PATH);
}

function getNewData($path)
{
    $extensions = array("jpg", "png", "jpeg"); // get data out of the file system
    $fotos = array();
    $source = new readDir($path);
    if (!$source->check())
        throw new Exception('Could not find source for given path');
    $fotos = $source->readFilesWithextension($extensions);
    if (!sort($fotos, SORT_STRING))
        throw new Exception('Could not sort foto list in natural order');
    return $fotos;
}
?>
So if I understand correctly, you're returning the next image in the list each time the image is requested?
It seems likely to me that the browser is requesting the image twice: Once as a HEAD request, and the second time to get the content. This is commonly used to find out things like the Content-Length header before blindly downloading.
I would suggest making sure that strcasecmp($_SERVER['REQUEST_METHOD'],"get") == 0 before modifying the session, as in the sketch below.
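A small sketch of that guard (the posted script already gates the increment on GET; this additionally answers a HEAD probe with headers only, before any session bookkeeping):
if (strcasecmp($_SERVER['REQUEST_METHOD'], "head") == 0) {
    // headers only: do not advance $_SESSION['id'] and do not emit a body
    header('Content-Type: image/jpeg');
    exit();
}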
That didn't solve the duplicate-request problem, unfortunately. I now use a time difference to distinguish between requests. Not nice, but it works out of the box...