PHP site problems

My code in file.php:
<?php
if(!isset($_GET['filename']) OR $_GET['filename'] == NULL) {
    print("Error!");
    exit();
}
$_GET['filename'] = htmlentities($_GET['filename'], ENT_QUOTES, "utf-8");
session_start();
include_once("/var/www/html/get.php");
for($i = 0; $i < $GLOBALS['files']['ile']; $i++) {
    if($GLOBALS['files'][$i]['name'] == $_GET['filename']) {
        if($GLOBALS['files'][$i]['priv'] == NULL OR $GLOBALS['files'][$i]['owner'] == $_SESSION['id'] OR (isset($_SESSION['privs']) AND in_array($GLOBALS['files'][$i]['priv'], $_SESSION['privs']))) {
            if(file_exists($GLOBALS['files'][$i]['loc'])) {
                header("Content-length: ".filesize($GLOBALS['files'][$i]['loc']));
                header("Content-type: ".mime_content_type($GLOBALS['files'][$i]['loc']));
                readfile($GLOBALS['files'][$i]['loc']);
            } else {
                print("Can't find that file!");
            }
        }
    }
}
?>
The get.php file loads (from a database) information about the files I want to make accessible through the script above.
$_SESSION['privs'] // an array that holds privileges, e.g. site.priv.mess
$GLOBALS['files'] // holds info about all files the user can load, e.g. $GLOBALS['files'][0]['name'] is the name of the first file in the array
$GLOBALS['files'][0]['loc'] // holds the location of the first file
$GLOBALS['files']['ile'] // holds sizeof($GLOBALS['files'])
With pictures this works well, but if I try to load a larger file, e.g. a video that weighs 300 MB, the file loads and all looks good; but if I reload the site, it won't work anymore...
I tried deleting my cookies in the browser (to change my session ID) and that works... But what can I do to make it work better?
EDIT: On Firefox all looks good, it only freezes on Chrome :(
EDIT2: Closing the session with session_write_close(); before reading the file fixed my curse :P Thanks y'all
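(Background: session_start() locks the session file until the script ends, so while the long readfile() is running, every other request carrying the same session ID blocks on that lock. session_write_close() releases it early. A minimal sketch of where the call goes, with $loc standing in for $GLOBALS['files'][$i]['loc']:)
if(file_exists($loc)) {
    header("Content-length: ".filesize($loc));
    header("Content-type: ".mime_content_type($loc));
    session_write_close(); // release the session lock before streaming the file
    readfile($loc);
}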

Check your php.ini or use phpinfo() to see your values for upload_max_filesize, post_max_size and memory_limit. Maybe also check the max execution time of the PHP script.
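A quick way to dump those limits without hunting through php.ini (a small sketch using ini_get()):
<?php
foreach(array('upload_max_filesize', 'post_max_size', 'memory_limit', 'max_execution_time') as $key) {
    echo $key, ' = ', ini_get($key), "\n";
}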

Related

How to remove the included file in PHP

I"m loading a file in start of PHP. I have declared some activity and assign them to buttons which are aligned left and right side of HTML document. On click of each button will included some Other PHP file but I want to remove the previous loaded on startup file before including any other file.
Here is my code
<?php
// php file included on startup
include("dashboard.php");
$activity = $_REQUEST['activity'];
if($activity) {
    if($activity == 'addMember'){
        include("addMember.php");
        remove("dashboard.php"); // this is what I want to achieve
    }
    if($activity == 'dashboard'){
    }
    if($activity == 'issueBooks'){
        include("issueBooks.php");
    }
    if($activity == 'returnBooks'){
        include("returnBooks.php");
    }
}
?>
I tried with
if($activity == 'addMember'){
    include("addMember.php");
    remove("dashboard.php");
}
But since I"m very new to PHP, it didn't work as expected.
Any Help anyone.
That isn't how include works. When you include a file, the contents of that file are executed at that point in the code. There's no straightforward way to undo that.
It looks like you want the dashboard activity to be the default view if no activity has been selected yet. To do that, you can include it only when there is no activity value in the request, or when the dashboard activity has been specifically requested:
if (empty($_REQUEST['activity']) || $_REQUEST['activity'] == 'dashboard') {
    include 'dashboard.php';
}
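Putting it together, the whole dispatcher could look something like this (a sketch reusing the file names from the question):
<?php
// Default to the dashboard when no activity was requested.
$activity = isset($_REQUEST['activity']) ? $_REQUEST['activity'] : 'dashboard';
switch ($activity) {
    case 'addMember':
        include 'addMember.php';
        break;
    case 'issueBooks':
        include 'issueBooks.php';
        break;
    case 'returnBooks':
        include 'returnBooks.php';
        break;
    default:
        include 'dashboard.php'; // only loaded when nothing else matched
        break;
}
?>
This way dashboard.php is never included in the first place when another activity is selected, so there is nothing to "remove".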

Continuously updating php page

I have created a PHP page to display an image from a set of five images. The image is chosen based on data read from a text file. Another application continuously updates that text file, so my PHP page needs to re-read the file whenever it is updated and display the image based on that data. I created an infinite loop to read the data, but when I try to access the PHP page from a browser, it never loads because of the infinite loop.
$myfile1 = fopen("readfile.txt", "r") or die("Unable to open file!");
$readJsonData = fread($myfile1, filesize("readfile.txt"));
fclose($myfile1);
$obj = json_decode($readJsonData);
$urllogo = 'tatalogo.png';
if(($obj->{"FrontLeft"}) == "TRUE") {
    $url = 'images/FrontLeft.png';
} else if(($obj->{"FrontRight"}) == "TRUE") {
    $url = 'images/FrontRight.png';
} else if(($obj->{"RearLeft"}) == "TRUE") {
    $url = 'images/RearLeft.png';
} else if(($obj->{"RearRight"}) == "TRUE") {
    $url = 'images/RearRight.png';
} else {
    $url = 'images/Normal.png';
}
// infinite loop
while(1)
{
    //reading from the file and refreshing the page.
}
In PHP, set a header like this to refresh the page:
header("Refresh: 5;url='pagename.php'");
In the HTML head tag:
<html>
<head>
<meta http-equiv="refresh" content="5;URL='pagename.php'">
</head>
</html>
<?php
Your php script here..
?>
Using JavaScript:
<script type="text/javascript">
window.setInterval(function(){
    reload_page();
}, 5000); // 5000 is in milliseconds, so this fires every 5 seconds
function reload_page()
{
    window.location = 'pagename.php';
}
</script>
The most reasonable way to do it would be to use the client side to refresh the page.
Get rid of all that infinite loop stuff on the PHP side. PHP will only output the image as it stands at the moment it was generated.
On the client side you could do something as simple as:
<META http-equiv="refresh" content="5;"> to force a refresh every 5 seconds.
If you only want to update when the file is updated you have to get more advanced. You could do an ajax call that checks if the file has changed and if so it refreshes. Websockets would be another option.
You could possibly do some nasty hack on the PHP side to make it work, using ob_flush, sleep and header within a loop that checks whether the file has changed, but this will cause you to lose sleep once you realize what you've done. As pointed out below, it would never work anyway: headers cannot be resent once output has started.
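For the AJAX route, the server side can be as small as an endpoint that reports the data file's last modification time; a sketch (check.php and the JSON shape are made up here):
<?php
// check.php: report when readfile.txt last changed, so a polling
// client can decide whether to reload the page.
header('Content-Type: application/json');
echo json_encode(array('mtime' => filemtime('readfile.txt')));
?>
The client remembers the last mtime it saw and reloads the page when the value changes.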

Make checks for submit buttons

I encountered some problems. I want this script to:
Open the test.txt file.
Check if the user has added any text to the txt file.
If the user has added text, delete the existing line and replace it with the new one from $_POST.
If the user has not, add $_POST to test.txt.
Problem:
When I spam the submit button, the .txt gets messed up. Does anyone know how to add checks so it doesn't mess up?
Please don't suggest MySQL, I need this in a .txt file.
Thanks.
function cutline($filename, $line_no = -1) {
    $strip_return = FALSE;
    $data = file($filename);
    $pipe = fopen($filename, 'w');
    $size = count($data);
    if($line_no == -1) $skip = $size - 1;
    else $skip = $line_no - 1;
    for($line = 0; $line < $size; $line++)
        if($line != $skip)
            fputs($pipe, $data[$line]);
        else
            $strip_return = TRUE;
    return $strip_return;
}
if ($userid == 1) {
    if(!isset($_POST['submit'])){
?>
<center><form action="" method="POST">
    <b>HWID</b>
    <input type="text" name="HWID" />
    <input type="submit" value="Add HWID" name="submit">
</form>
</center>
<?php
    }else{
        $userid = 1;
        $userid = "user=" . $userid;
        $file = "test.txt";
        $lines = file($file);
        $count = 1;
        foreach ($lines as $e) {
            if(strpos($e, $userid) !== FALSE){
                cutline($file, $count);
            }
            ++$count; // count every line, not only the matches
        }
        $fh = fopen($file, 'a') or die("can't open file");
        $stringData = $userid . $_POST['HWID'] . "\n";
        fwrite($fh, $stringData);
        fclose($fh);
    }
}else{
    echo "You're not logged in";
}
?>
I am not 100% sure how the text file is messing up, but I guess locking won't help here, as locks are released when the script finishes (or is reloaded).
It looks like you just "kill" your cutline while it's in progress, so the remaining lines never get written. One way to fix this could be to build the new content of the file in a temporary variable and call fwrite only once. (I am not 100% sure this will work.)
Another possibility is to write the results of cutline into a temporary file and then replace the old file with the new one when the cutline method is done. This can happen inside the method.
Either way, the existing file will not be touched if the script gets killed in an unsafe state. But you can still lose the new input from the user if he manages to reload the page right after the call to cutline and before you add the new input in this line:
fwrite($fh, $stringData);
I think this is really hard to trigger, as that operation is quite fast.
EDIT:
Don't forget to test the script with multiple users at the same time, if that is a valid use case. If two or more people edit the same file at the same time, it will mess up as well. So you might end up with some locking after all, but that alone will not solve the problem described here.
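For reference, a sketch of the temporary-file variant described above (cutline_safe is a made-up name; rename() replaces the target in one step on POSIX filesystems, so a killed script leaves the original file intact):
function cutline_safe($filename, $line_no = -1) {
    $data = file($filename);
    $skip = ($line_no == -1) ? count($data) - 1 : $line_no - 1;
    $tmp = $filename . '.tmp';
    $pipe = fopen($tmp, 'w');
    foreach ($data as $line => $text) {
        if ($line != $skip)
            fwrite($pipe, $text);
    }
    fclose($pipe);
    // The original file is only replaced once the new copy is complete.
    return rename($tmp, $filename);
}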

php, read file code problem

I was using this piece of PHP code for a site.
Now it's old and I recently had a few attacks: the script was abused to include a file from someplace else and send spam, which obviously turns my script into a spam sender.
For the content:
$htm = ".htm";
$pid = "$details$htm";
function show_details($pid)
{
    if (!preg_match("/http/", $pid)) {
        require($pid);
    } else {
        die;
    }
}
and for the title, desc, keywords etc.:
$txt = ".txt";
$title = "$details$txt";
function show_title($title)
{
    if (!preg_match("/http/", $title)) {
        if (file_exists($title)) {
            require($title);
        } else {
            die;
        }
    }
}
and a display.php file with
print '
<!-- CONTENT -->
';
show_details("$pid");
print '
With this code I was able to call any content via "/display.php?details=mycontentpage"
mycontentpage.htm
mycontentpage.txt
.............
Now this code has to be re-coded, but I cannot change the structure, as the site is just too big.
So I guess I just have to stick to this.
Can anyone help? Any comments?
To make scripts like this more secure, you have to ensure register_globals is set to off. This means you'll have to add a line like:
php_flag register_globals off
...to .htaccess. Then, declare all your user variables the first time you use them, like:
$details = $_GET['details'];
...which assigns the data from the details parameter in the URI to the PHP variable $details.
I can very much see how your attackers were able to get in via your code with register_globals set to on: they'd merely need to create a .htm file with PHP code in it that reassigns other variables, have your script include it, and voilà.
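Beyond that, it helps to sanitize $details before it ever reaches require. A sketch, assuming the .htm content files live next to the script (basename() strips directory components, so neither ../ traversal nor a URL can reach require()):
$details = isset($_GET['details']) ? basename($_GET['details']) : '';
$pid = $details . ".htm";
if ($details != '' && file_exists($pid)) {
    require($pid);
} else {
    die;
}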
For more info, see:
http://us2.php.net/manual/en/security.globals.php
Hope this helps!

Quick and easy flood protection?

I have a site where a user submits a message via AJAX to a file called like.php. In this file the user's message is inserted into a database, and a link is then sent back to the user. In my JavaScript code I disable the text box the user types into while the AJAX request is in flight.
The only problem is that a malicious user can just constantly send POST requests to like.php and flood my database. So I would like to implement simple flood protection.
I don't really want the hassle of another database table logging users' IPs and such, as if they are flooding my site there will be a lot of database reads/writes slowing it down. I thought about using sessions: have a session that contains a timestamp that gets checked every time they send data to like.php; if enough time has passed since that timestamp, let them add data to the database, otherwise send an error and block them. If they are allowed to enter something into the database, update their session with a new timestamp.
What do you think? Would this be the best way to go about it, or are there easier alternatives?
Thanks for any help. :)
A session is the easiest way to do this, and has the least overhead as well. You can store two bits of data in the session: the timestamp of the last post and the IP the post is coming from. Here is how you check legitimacy (the post is rejected while the minimum interval has not yet elapsed):
define('MININTERVAL', 10); // minimum seconds between posts
session_start();
if(isset($_SESSION['last_post']) && $_SESSION['last_post'] + MININTERVAL > time()) die('too early');
$_SESSION['last_post'] = time();
$_SESSION['ip'] = $_SERVER['REMOTE_ADDR'];
// store the message
Use a token. You generate the token and add it to the page originating the request. In like.php you verify that the request contains a valid token, which means it comes from your page instead of an external one POSTing directly.
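A minimal sketch of the token approach (assumes PHP >= 7.0 for random_bytes() and >= 5.6 for hash_equals(); the field name 'token' is made up):
<?php
session_start();
// When rendering the page that hosts the form:
$_SESSION['token'] = bin2hex(random_bytes(16));
// ...and embed it in the form as a hidden input named "token".

// In like.php, before touching the database:
if (!isset($_POST['token'], $_SESSION['token'])
        || !hash_equals($_SESSION['token'], $_POST['token'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('invalid token');
}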
You don't need to go through the whole record file. Instead:
<?php
define("FLOODPOOL", ".");
define("FLOODPOOL_LIMIT", 30);
define("FLOODPOOL_DURATION", 60 * 60 * 24);
define("FLOODPOOL_AUTOCLEAN", true);

// Record and check flood.
// Return true for a hit.
function floodpool_check($id){
    $fp = fopen(FLOODPOOL . DIRECTORY_SEPARATOR . 'fp_' . basename($id), 'a+');
    fwrite($fp, pack('L', time()));
    if(fseek($fp, -4 * FLOODPOOL_LIMIT, SEEK_END) === -1) {
        fclose($fp); // fewer than FLOODPOOL_LIMIT records yet
        return false;
    }
    $stamps = unpack('L', fread($fp, 4)); // oldest of the last FLOODPOOL_LIMIT records
    $time = reset($stamps);
    fclose($fp);
    if(time() - $time < FLOODPOOL_DURATION) {
        if(FLOODPOOL_AUTOCLEAN){
            #floodpool_clean();
        }
        return true;
    }
    return false;
}

// Clean the pool.
function floodpool_clean(){
    $handle = opendir(FLOODPOOL);
    while(false !== ($entry = readdir($handle))){
        $filename = FLOODPOOL . DIRECTORY_SEPARATOR . $entry;
        if(time() - filectime($filename) > FLOODPOOL_DURATION && substr($entry, 0, 3) === 'fp_'){
            unlink($filename);
        }
    }
    closedir($handle);
}
Usage example:
if(floodpool_check($_SERVER['REMOTE_ADDR'])){
    header("HTTP/1.1 429 Too Many Requests");
    exit("Hit some *");
}
Another way to do this is to write a hidden form input into the page (that calls like.php) using jQuery. A bot won't be running JavaScript, so for it your hidden form field won't exist.
Check for the hidden field (assign it a value and a name), and only if it exists, hit the database with the request.
Another way: code a hidden element into the page (<input style='display:none;' name='nospam' value='' />). A bot will auto-fill every field in the form, so you just check whether this field is populated; a user can't see it, so you know it's a bot if you've got content there.
Set the style (display:none;) using jQuery though; again, a bot won't run the jQuery, so it will think this is a legit form input.
You may want to put a "this page requires JavaScript to run" notice somewhere for the user. Some alternative suggestions; after all, you said "simple" ;)
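The server-side half of that honeypot is tiny (a sketch; the field name 'nospam' is taken from the suggestion above):
<?php
// In like.php: humans never see the 'nospam' field, so any value
// in it means a bot auto-filled the form.
if (!empty($_POST['nospam'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
// ...otherwise proceed to store the message.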
Well, I made a script to handle this for core requests only (no session requests or other requests that don't call the core). If you have a look around Google you'll find scripts/classes that will kill your server with high load every time: the fact that many of them use SESSIONs, and maybe also SQL/a database, turns the flood protection itself into a server killer. Also, SESSIONs need a cookie (or a GET SID), so SESSIONs are easy to manipulate to get a new session ID.
My function is text-based and does simple handling. The bad thing is that you may have to use a cron job to delete old IP files from time to time. Compared to other scripts it's about 10x faster (and safer than sessions).
I don't know if it's really useful at all. ;)
You may like to lower the rpm value or/and the 200 requests. My setting bans a bot doing interval requests every <= 6 seconds.
<?php
function ht_request_limiter() {
if (!isset($_SERVER['REMOTE_ADDR'])) { return; } // Maybe its impossible, however we check it first
if (empty($_SERVER['REMOTE_ADDR'])) { return; } // Maybe its impossible, however we check it first
$path = '/your/path/ipsec/'; // I use a function to validate a path first and return if false...
$path = $path.$_SERVER['REMOTE_ADDR'].'.txt'; // Real file path (filename = <ip>.txt)
$now = time(); // Current timestamp
if (!file_exists($path)) { // If first request or new request after 1 hour / 24 hour ban, new file with <timestamp>|<counter>
if ($handle = fopen($path, 'w+')) {
if (fwrite($handle, $now.'|0')) { chmod($path, 0700); } // Chmod to prevent access via web
fclose($handle);
}
}
else if (($content = file_get_contents($path)) !== false) { // Load existing file
$content = explode('|',$content); // Create paraset [0] -> timestamp [1] -> counter
$diff = (int)$now-(int)$content[0]; // Time difference in seconds from first request to now
if ($content[1] == 'ban') { // If [1] = ban we check if it was less than 24 hours and die if so
if ($diff>86400) { unlink($path); } // 24 hours in seconds.. if more delete ip file
else {
header("HTTP/1.1 503 Service Unavailable");
exit("Your IP is banned for 24 hours, because of too many requests.");
}
}
else if ($diff>3600) { unlink($path); } // If first request was more than 1 hour, new ip file
else {
$current = ((int)$content[1])+1; // Counter + 1
if ($current>200) { // We check rpm (request per minute) after 200 request to get a good ~value
$rpm = ($current/($diff/60));
if ($rpm>10) { // If there was more than 10 rpm -> ban (if you have a request all 5 secs. you will be banned after ~17 minutes)
if ($handle = fopen($path, 'w+')) {
fwrite($handle, $content[0].'|ban');
fclose($handle);
// Maybe you like to log the ip once -> die after next request
}
return;
}
}
if ($handle = fopen($path, 'w+')) { // else write counter
fwrite($handle, $content[0].'|'.$current .'');
fclose($handle);
}
}
}
}
Edit: I tested the request time with microtime, simulating 10,000 users. I asked Google and tested (as an example) http://technitip.net/simple-php-flood-protection-class
So I don't know what is supposed to be simple there? You have about 3 SQL requests per hit, like:
$this->user_in_db($ip);
$this->user_flooding($ip);
$this->remove_old_users();
It may offer more functions, but every legit user burns server time for nothing. ;)
If you want to stop flooding of a search page, you can try it this way:
$flood_protection_interval = 2;
session_start();
if(!isset($_SESSION['counter'])) $_SESSION['counter'] = 0; // avoid a notice on first visit
if(
    isset($_SESSION['ip']) &&
    $_SESSION['counter'] > 10 &&
    $_SESSION['last_post'] + $flood_protection_interval > time()
){
    // $_SESSION['counter'] = 0; // Use this if you want to reset the counter
    die("<pre>\n\n\n\t<b>FLOOD PROTECTION</b>");
}
$_SESSION['counter']++;
$_SESSION['last_post'] = time();
$_SESSION['ip'] = $_SERVER['REMOTE_ADDR'];
So if your visitor searches 10 times within e.g. 2 seconds, he will be stopped!
I thought about using sessions, like have a session that contains a timestamp that gets checked every time they send data to like.php
This won't stop bots, as they can receive and send the same cookies that users do.
You should really have users log in to such a system; the access seems worth protecting. You could also consider limiting posts per minute per IP, but multiple bots could still send many spam messages.
If you don't want to implement a login, many sites use a CAPTCHA to cut down on such attempts.
http://www.phpcaptcha.org/
