I want to store some counters and increment them as desired.
These counters are not related to any particular client, so I can't use sessions or cookies.
I tried $GLOBALS, but it's not what I want.
I want something like the following: say I have three PHP files, each doing some counter manipulation.
init.php
$_GLOBAL_VARIABLE['cntr1'] = 0;
file1.php
$_GLOBAL_VARIABLE['cntr1'] = $_GLOBAL_VARIABLE['cntr1'] + 7;
file2.php
$_GLOBAL_VARIABLE['cntr1'] = $_GLOBAL_VARIABLE['cntr1'] + ($_GLOBAL_VARIABLE['cntr1'] * 0.90);
file3.php
echo $_GLOBAL_VARIABLE['cntr1'];
All three files (except init.php) will be called randomly, with no relation to each other, and init.php will be called once.
I don't want to use database transactions because the counter manipulation is very frequent, and file I/O has the same drawback. I am looking for some way to store my data on the server for as long as it is up and running, somewhat like global classes and variables in C#.
If you want a globally accessible value stored on the server without using a database, cookies or sessions, then memcache could be a solution for you. It is a daemon which allows you to store data and use it across different connection requests. If you have frequent visits you will have to handle concurrency somehow within your application.
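A minimal sketch of that idea, assuming the memcached PECL extension and a daemon running on localhost:11211 (the key name cntr1 is taken from the question); increment() is atomic, which covers the concurrency concern for plain additions:
<?php
// Sketch only: counters kept in memcached, shared across requests.
$m = new Memcached();
$m->addServer('127.0.0.1', 11211);

// init.php: create the counter only if it does not exist yet
$m->add('cntr1', 0);

// file1.php: atomic increment, safe even with concurrent requests
$m->increment('cntr1', 7);

// file3.php: read the current value
echo $m->get('cntr1');
?>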
I think this will work:
global $cntr1;
If not, you can create one .inc.php file and include it in all pages.
This should resolve your issue.
I'm sure this is NOT what you wanted, but I've used file read/write calls to store my globals in a file on the drive, which can be read from and written to with updated values. This lets you store MANY global variables as integers; I've modified my globals code to work as an iterator, counting up or down by whatever value you pass.
It's a simple, quick class I made to handle the request:
<?php
class my_global
{
    protected $name;
    protected $value;
    static protected $path = './globals/';

    public function __construct()
    {
        // Make sure the storage directory exists
        if (!is_dir(self::$path)) {
            mkdir(self::$path);
        }
    }

    // Add $value (positive or negative) to the named counter and return the result
    public function change($name, $value)
    {
        $current = $this->get($name);
        $this->set($name, $current + $value);
        return $current + $value;
    }

    protected function set($name, $value)
    {
        $this->name = $name;
        $this->value = $value;
        $this->write();
    }

    // Read the current value from disk, defaulting to 0 for a new counter
    protected function get($name)
    {
        if (file_exists(self::$path . $name)) {
            $myFile = self::$path . $name;
            $fh = fopen($myFile, 'r');
            $value = fread($fh, filesize($myFile));
            fclose($fh);
        } else {
            $value = 0;
        }
        $this->name = $name;
        $this->value = $value;
        return $value;
    }

    protected function write()
    {
        $myFile = self::$path . $this->name;
        $fh = fopen($myFile, 'w') or die("can't open file");
        fwrite($fh, $this->value);
        fclose($fh);
    }
}

$my_global = new my_global();
?>
You can then just call the $my_global->change() method to increase or decrease a counter:
<?php
echo $my_global->change('new_global',5).'<br>';
echo $my_global->change('anotherglobal',-2).'<br>';
echo $my_global->change('forme',7).'<br>';
?>
This is more food for thought than anything, but it could be tweaked to work as you need it.
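One caveat worth flagging: since several scripts may hit the same counter at once, the read-modify-write inside change() can race. A rough variant of the same idea guarded with flock(), assuming the same ./globals/ directory layout as above:
<?php
// Sketch: serialize access to a counter file with an exclusive lock.
function change_locked($name, $delta)
{
    $path = './globals/' . $name;
    $fh = fopen($path, 'c+');      // create if missing, don't truncate
    if ($fh === false) {
        return false;
    }
    flock($fh, LOCK_EX);           // one writer at a time
    $current = (float) stream_get_contents($fh);
    $new = $current + $delta;
    ftruncate($fh, 0);
    rewind($fh);
    fwrite($fh, (string) $new);
    flock($fh, LOCK_UN);
    fclose($fh);
    return $new;
}
?>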
I'm setting a session value in the run() method of a Thread class, but I can't access the session from outside. I also create a file with fopen in the run() method, but the file isn't created either.
For example, I'm using the following code:
session_start();

class Async extends Thread
{
    public function run()
    {
        $fp = fopen('test.txt', 'w');
        fwrite($fp, '1');
        fclose($fp);
        $_SESSION['test'] = 'test';
    }
}

foreach ($tests as $test)
{
    $workers[$i] = new Async();
    $workers[$i]->start();
}

echo $_SESSION['test'];
Updating a SESSION, or any other variable, from multiple threads is not safe!
What you want to do is dangerous: you can easily lose data, because the session updates are not synchronized between the different threads.
The solution is to update your code like this:
<?php
session_start();

class Async extends Thread
{
    private $_session = NULL;

    public function __construct($session)
    {
        $this->_session = $session;
    }

    public function run()
    {
        // imagine if N threads want to open the same file with 'write' mode?
        $fp = fopen(Thread::getCurrentThreadId() . '_test.txt', 'w');
        fwrite($fp, '1');
        fclose($fp);
        $this->_session['test'] = 'test';
    }

    public function getSession()
    {
        return $this->_session;
    }
}

foreach ($tests as $test)
{
    $workers[$i] = new Async($_SESSION);
    $workers[$i]->start();
    // to synchronize thread operations: wait until the launched thread has terminated
    $workers[$i]->join();
    $_SESSION = $workers[$i]->getSession();
}

echo $_SESSION['test'];
Notes:
While doing some tests I found an issue when trying to update an array inside a thread, so I've opened a new question on SO: http://stackoverflow.com/q/32476271/4098311
I'm not sure that `$_SESSION` is visible inside a thread, so I've passed it as an argument to the constructor.
There are two possibilities:
1) The run() function is not called due to some error.
2) As you said that fopen does not create the file, it is possible that due to some error the file is not created and code execution stops before $_SESSION['test'] is defined.
I'm working on a monitor for an infrastructure composed of a lot of computers. I'm developing it in PHP and I want to ping the whole infrastructure in the quickest way.
For this I use multi-threading via pthreads. I followed some tutorials and ended up with one class (extended from Thread) and a caller in another script.
The class:
class Ping extends Thread {
    public $id;
    public $name;

    public function __construct($id, $name) {
        $this->id = $id;
        $this->name = $name;
    }

    public function run() {
        $ping = exec("ping -n 1 -w 80 " . $this->name);
        $h = fopen("ping.json", 'w');
        if (preg_match("#perte 100#", $ping)) {
            fwrite($h, 'd');
        } else {
            fwrite($h, 'c');
        }
        fclose($h);
    }
}
The caller:
$p = array();
foreach ($array_computer as $comp) {
    array_push($p, new Ping(array_search($comp, $array_computer), $comp->{'name'}));
}
foreach ($p as $p_t) {
    $p_t->start(PTHREADS_INHERIT_ALL);
}
So I have two problems:
1. When I try to echo $id or $name, nothing is displayed.
2. I can't open 'ping.json' because of "failed to open stream: Permission denied".
If I replace
$p_t->start(PTHREADS_INHERIT_ALL);
with
$p_t->run();
the call works, but I lose the benefit of multithreading :P
Could it be that the variable $p_t is not actually an instance of Ping?
foreach ($p as $p_t)
Also try checking for the instance, i.e.:
if ($p_t instanceof Ping) {
    $p_t->start();
}
Some questions/remarks to help you along:
You are writing something to a file based on a computer name, but does this name have a value? Otherwise the output file will be empty and you won't get a result.
Within the foreach, echo the parameters you want to store in $p so you know you are actually putting something in there.
Where does the start() method come from? Is $p an array of objects? If so, where is the method declared?
$comp->{'name'} looks odd to me; why not use $comp->name if it's an object, or $comp['name'] if it's an array?
You are not storing any resulting values in your public variables; therefore, when an input value is blank, it will stay blank (see the sketch after this list).
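On the last point, a rough sketch (class and property names are illustrative, not from the original code) of collecting the result in the thread object itself rather than in a shared file, then reading it after join():
<?php
// Sketch: store the outcome in a member of the Thread object.
class PingCheck extends Thread
{
    public $name;
    public $status = 'unknown';

    public function __construct($name)
    {
        $this->name = $name;
    }

    public function run()
    {
        $out = exec("ping -n 1 -w 80 " . $this->name);
        // 'd' = down, 'c' = connected, mirroring the original script
        $this->status = preg_match("#perte 100#", $out) ? 'd' : 'c';
    }
}

$t = new PingCheck('host1');
$t->start(PTHREADS_INHERIT_ALL);
$t->join();
echo $t->name . ' => ' . $t->status;
?>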
Thanks for the answers.
I found another solution: storing the result data in a database inside the run() function.
I think some conflicts appeared when several threads wanted to write the JSON file at the same time, and locking the file with flock() wasn't efficient.
I put the Ping class and the caller in the same file and throttled the thread execution like this:
$i = 0;
foreach ($p as $p_t) {
    while ($i > 15) {}
    $i++;
    $p_t->start();
    if ($p_t->join()) $i--;
}
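As an alternative to the busy-wait, if your pthreads build ships the Pool class, a Pool caps how many pings run at once without spinning the CPU. This is only a sketch: PingTask is illustrative, mirroring the run() logic of the Ping class but writing its result to a member instead of a shared file, and it assumes the same $array_computer list as above:
<?php
// Sketch: limit concurrency with a pthreads Pool instead of a busy-wait.
class PingTask extends Threaded
{
    public $name;
    public $status;

    public function __construct($name)
    {
        $this->name = $name;
    }

    public function run()
    {
        $out = exec("ping -n 1 -w 80 " . $this->name);
        $this->status = preg_match("#perte 100#", $out) ? 'd' : 'c';
    }
}

$pool = new Pool(15);              // at most 15 tasks run at the same time
$tasks = array();
foreach ($array_computer as $comp) {
    $task = new PingTask($comp->name);
    $tasks[] = $task;
    $pool->submit($task);
}
$pool->shutdown();                 // wait for all submitted tasks to finish

foreach ($tasks as $task) {
    echo $task->name . ' => ' . $task->status . PHP_EOL;
}
?>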
Three days of banging my head against a wall.
I developed a PHP script to import big text files and populate a MySQL database. It works perfectly up to about 2 million records, but I need to import around 10 million rows divided across different files.
My application scans the files in a folder, gets the file extension (I have 4 import procedures for 4 different extensions) and calls the relevant import function.
I have a structure made of these classes:
CLASS SUBJECT1 {

    __DESTRUCT() { $this->childObject = null; }

    IMPORT_SUBJECT1() {
        //fopen($file);
        //ob_start();
        //PDO::BeginTransaction();
        //WHILE (FILE) {
        //    PREPARED STATEMENT
        //    FILE READING
        //    GET FILE LINE
        //    EXECUTE INSERT
        //} END WHILE
        //PDO::Commit();
        //ob_clean(); or ob_flush();
        //fclose($file);
        //clearstatcache();
    }
}

CLASS SUBJECT2 { same as SUBJECT1 }
CLASS SUBJECT3 { same as SUBJECT1 }
CLASS SUBJECT4 { same as SUBJECT1 }
and the main class that launches the procedure:
CLASS MAIN {
    switch($ext) {
        case "ext1":
            $SUBJECT1 = new SUBJECT1();
            IMPORT_SUBJECT1();
            unset($SUBJECT1);
            $SUBJECT1 = null;
            break;
        case "ext2": //SAME AS CASE ext1 WITH IMPORT_SUBJECT2();
        case "ext3": //SAME AS CASE ext1 WITH IMPORT_SUBJECT3();
        case "ext4": //SAME AS CASE ext1 WITH IMPORT_SUBJECT4();
    }
}
It works perfectly with some adjustment of the MySQL log file buffers (ib_logfile0 and ib_logfile1 are set to 512 MB).
The problem is that every time a procedure terminates, PHP does not free the memory. I'm sure the destructor is called (I put an echo inside the __destruct method) and the object is no longer accessible (var_dump says it is NULL). I tried so many ways to free memory, but now I'm at a dead point.
I also checked
gc_collect_cycles()
at many different points in the code and it always reports 0 cycles, so no objects reference each other.
I even tried removing the class structure and calling all the code sequentially, but I always get this error:
Fatal error: Out of memory (allocated 511180800) (tried to allocate 576 bytes) in C:\php\index.php on line 219 (line 219 is the execute of a prepared statement on the 13th file).
The memory usage looks like this:
php script: 52 MB
end of first file import: 110 MB
destructor and unset calls: 110 MB
new procedure call: 110 MB
end of second file import: 250 MB
destructor and unset calls: 250 MB
new procedure call: 250 MB
So, as you can see, even after unsetting the objects the memory is not freed.
I tried setting the PHP memory limit to 1024M, but it grows really fast and crashes after 20 files.
Any advice?
Many thanks!
EDIT 1:
posting code:
class SUBJECT1 {

    public function __destruct()
    {
        echo 'destroying subject1 <br/>';
    }

    public function import_subject1($file, $par1, $par2)
    {
        global $pdo;

        $aux = new AUX();
        $log = new LOG();

        // ---------------- FILES ----------------
        $input_file = fopen($file, "r");

        // ---------------- PREPARED STATEMENTS ----------------
        $PS_insert_data1 = $pdo->prepare("INSERT INTO table (ID,PAR1,PAR2,PARN) VALUES (?,?,?,?) ON DUPLICATE KEY UPDATE ID = VALUES(ID), PAR1 = VALUES(PAR1), PAR2 = VALUES(PAR2), PAR3 = VALUES(PAR3), PARN = VALUES(PARN)");
        $PS_insert_data2 = $pdo->prepare("INSERT INTO table (ID,PAR1,PAR2,PARN) VALUES (?,?,?,?) ON DUPLICATE KEY UPDATE ID = VALUES(ID), PAR1 = VALUES(PAR1), PAR2 = VALUES(PAR2), PAR3 = VALUES(PAR3), PARN = VALUES(PARN)");

        // ---------------- IMPORT ----------------
        if ($input_file) {
            ob_start();
            $pdo->beginTransaction();
            while (($line = fgets($input_file)) !== false) {
                $line = utf8_encode($line);
                $array_line = explode("|", $line);
                // set null values where needed
                $array_line = $aux->null_value($array_line);
                if (sizeof($array_line) > 32) {
                    if (!empty($array_line[25])) {
                        $PS_insert_data1->execute(array($array_line[0], $array_line[1], $array_line[2], $array_line[5]));
                    }
                    $PS_insert_data2->execute(array($array_line[10], $array_line[11], $array_line[12], $array_line[15]));
                }
            }
            $pdo->commit();
            flush();
            ob_clean();
            fclose($input_file);
            clearstatcache();
        }
    }
}
I do this iteratively for all files in my folder; the other procedures follow the same concept.
I still see the memory increase, and now it crashes with a white page response :-\
Personally, I would go about it slightly differently. These are the steps I would take:
Open a PDO connection, set PDO in Exception mode
Get a list of files that I want to read
Create a class that can utilize PDO and the list of files and perform insertions
Prepare the statement ONCE, utilize it many times
Chunk PDO transaction commits to 50 (configurable) inserts - this means that every 50th time I call $stmt->execute(), I issue a commit - which utilizes the HDD better thus making it faster
Read each file line by line
Parse the line and check if it's valid
If yes, add to MySQL, if not - report an error
Now, I've created 2 classes and an example of how I'd go about it. I only tested up to the reading part, since I don't know your DB structure or what AUX() does.
class ImportFiles
{
    protected $pdo;
    protected $stmt;
    protected $transaction = false;
    protected $trx_flush_count = 50; // Commit the transaction at every 50 iterations

    public function __construct(PDO $pdo = null)
    {
        $this->pdo = $pdo;

        $this->stmt = $this->pdo->prepare("INSERT INTO table
                                            (ID,PAR1,PAR2,PARN)
                                            VALUES
                                            (?,?,?,?)
                                            ON DUPLICATE KEY UPDATE ID = VALUES(ID), PAR1 = VALUES(PAR1), PAR2 = VALUES(PAR2), PAR3 = VALUES(PAR3), PARN = VALUES(PARN)");
    }

    public function import($file)
    {
        if($this->isReadable($file))
        {
            $file = new FileParser($file);

            $this->insert($file);
        }
        else
        {
            printf("\nSpecified file is not readable: %s", $file);
        }
    }

    protected function isReadable($file)
    {
        return (is_file($file) && is_readable($file));
    }

    protected function insert(FileParser $file)
    {
        // Open the first chunked transaction before inserting
        $this->pdo->beginTransaction();

        while($file->read())
        {
            //printf("\nLine %d, value: %s", $file->getLineCount(), $file->getLine());

            $this->insertRecord($file);

            $this->flush($file);
        }

        $this->flush(null);
    }

    // Untested method, no idea whether it does its job or not - might fail
    protected function flush(FileParser $file = null)
    {
        if(!is_null($file))
        {
            // Commit (and restart) the transaction every $trx_flush_count lines
            if(!($file->getLineCount() % $this->trx_flush_count) && $this->pdo->inTransaction())
            {
                $this->pdo->commit();

                $this->pdo->beginTransaction();
            }
        }
        else
        {
            // End of file: commit whatever is left
            if($this->pdo->inTransaction())
            {
                $this->pdo->commit();
            }
        }
    }

    protected function insertRecord(FileParser $file)
    {
        $check_value = $file->getParsedLine(25);

        if(!empty($check_value))
        {
            $values = [
                $file->getParsedLine(0),
                $file->getParsedLine(1),
                $file->getParsedLine(2),
                $file->getParsedLine(5)
            ];
        }
        else
        {
            $values = [
                $file->getParsedLine(10),
                $file->getParsedLine(11),
                $file->getParsedLine(12),
                $file->getParsedLine(15)
            ];
        }

        $this->stmt->execute($values);
    }
}
class FileParser
{
    protected $fh;
    protected $lineCount = 0;
    protected $line = null;
    protected $aux;

    public function __construct($file)
    {
        $this->fh = fopen($file, 'r');
    }

    public function read()
    {
        $this->line = fgets($this->fh);

        if($this->line !== false) $this->lineCount++;

        return $this->line;
    }

    public function getLineCount()
    {
        return $this->lineCount;
    }

    public function getLine()
    {
        return $this->line;
    }

    public function getParsedLine($index = null)
    {
        $line = $this->line;

        if(!is_null($line))
        {
            $line = utf8_encode($line);
            $array_line = explode("|", $line);

            // set null values where needed
            $aux = $this->getAUX();
            $array_line = $aux->null_value($array_line);

            if(sizeof($array_line) > 32)
            {
                return is_null($index) ? $array_line : (isset($array_line[$index]) ? $array_line[$index] : null);
            }
            else
            {
                throw new \Exception(sprintf("Invalid array size, expected > 32 got: %s", sizeof($array_line)));
            }
        }
        else
        {
            return [];
        }
    }

    protected function getAUX()
    {
        if(is_null($this->aux))
        {
            $this->aux = new AUX();
        }

        return $this->aux;
    }
}
Usage:
$dsn = 'mysql:dbname=testdb;host=127.0.0.1';
$user = 'dbuser';
$password = 'dbpass';

try
{
    $pdo = new PDO($dsn, $user, $password);
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $import = new ImportFiles($pdo);

    $files = ['/usr/local/file1.txt', '/usr/local/file2.txt'];

    foreach($files as $file)
    {
        $import->import($file);
    }
}
catch (Exception $e)
{
    printf("\nError: %s", $e->getMessage());
    printf("\nFile: %s", $e->getFile());
    printf("\nLine: %s", $e->getLine());
}
SOLVED:
I took this approach; maybe it is useful for someone with a similar problem.
I opened the task manager and watched the memory usage of the Apache and MySQL processes in these cases:
Reading and processing the files without calling the MySQL procedures (memory usage was fine)
Reading, processing and inserting into the DB only the files of one extension at a time (all .ext1, then all .ext2, ...)
Debugging the procedure with the big memory increase, isolating the functions one by one until I found the problematic one
Then I found the problem and solved it.
The problem was that I called a function passing the prepared statement as a parameter. I thought that, once prepared, it was just a "static" object to call. What happens is that if you pass the same prepared statement into a function, the memory grows exponentially.
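For reference, a rough sketch of the pattern that avoided the problem: prepare once and execute in the same scope, instead of handing the PDOStatement to other functions (it assumes $pdo and an open $input_file handle as in the code above; table and column names are placeholders):
// Sketch: keep the prepared statement in one scope and reuse it there.
$stmt = $pdo->prepare("INSERT INTO table (ID, PAR1, PAR2, PARN) VALUES (?, ?, ?, ?)");

$pdo->beginTransaction();
while (($line = fgets($input_file)) !== false) {
    $row = explode("|", $line);
    $stmt->execute(array($row[0], $row[1], $row[2], $row[5]));
}
$pdo->commit();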
Hope this helps someone.
Bye!
I'm new to OOP terminology. I am trying to create a class that makes a hit counter.
I tried the code below, but it just creates a counter.txt file with the value 1 inside. I don't know why it's not incrementing.
class LOGFILE {
    public function READ($FileName) {
        $handle = fopen($FileName, 'r');
        $fread = file_get_contents($FileName);
        return $fread;
        fclose($handle);
    }
    public function WRITE($FileName, $FileData) {
        $handle = fopen($FileName, 'w');
        $FileData = $fread + 1;
        fwrite($handle, $FileData);
        fclose($handle);
    }
}
$logfile = new LOGFILE();
$logfile -> WRITE("counter.txt",$FileData);
echo $logfile -> READ("counter.txt");
The reason is that $fread is a local variable in both the READ and WRITE methods. You need to make it a private property of your class:
class LOGFILE {
    private $fread;

    public function READ($FileName) {
        $this->fread = file_get_contents($FileName);
        return $this->fread;
    }

    public function WRITE($FileName) {
        $this->READ($FileName);
        $handle = fopen($FileName, 'w');
        $FileData = $this->fread + 1;
        fwrite($handle, $FileData);
        fclose($handle);
    }
}
$logfile = new LOGFILE();
$logfile -> WRITE("counter.txt");
echo $logfile -> READ("counter.txt");
Note: I have removed fopen and fclose because file_get_contents does not need them. In WRITE you could use file_put_contents. I also removed the unused parameter $FileData. It's always good practice to create variables, methods and classes only when they are needed.
Also take a look at best practices for how to name your classes, variables, methods and so on.
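For illustration only, the same counter with nothing but file_get_contents/file_put_contents (keeping the original class and method names):
<?php
// Sketch: the whole counter with the file_* convenience functions.
class LOGFILE {
    public function READ($FileName) {
        return is_file($FileName) ? (int) file_get_contents($FileName) : 0;
    }

    public function WRITE($FileName) {
        file_put_contents($FileName, $this->READ($FileName) + 1);
    }
}
?>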
Let's start going over the corrected code and see what was missing:
<?php
class LOGFILE {
    public function READ($FileName) {
        $handle = fopen($FileName, 'r');
        $fread = fgets($handle, 8192);
        fclose($handle);
        return $fread;
    }
    public function WRITE($FileName, $FileData) {
        $counter = $this->READ($FileName);
        $handle = fopen($FileName, 'w');
        fwrite($handle, $FileData + $counter);
        fclose($handle);
    }
}

$logfile = new LOGFILE();
$FileData = 1;
$logfile -> WRITE("counter.txt",$FileData);
echo $logfile -> READ("counter.txt")."\n";
$logfile -> WRITE("counter.txt",$FileData);
echo $logfile -> READ("counter.txt")."\n";
?>
use of fgets instead of file_get_contents in READ (you can choose to use file_get_contents, but I'd rather stay consistent with the other function that uses fopen)
use of READ inside the WRITE function (the principle of code reuse)
opening the file with write permissions in WRITE: 'w'
initializing $FileData = 1;
no need to hold a private member: $fread
most important: do not write statements after return (like you did in READ) - statements written after return will never be executed!
This solution was tested successfully.
OOP should be used where it's needed. You need a simple thing here, so there's no need for OOP.
<?php
function addValue($file = 'counter.txt', $amount = 1) {
    if (false == is_file($file)) {
        return false;
    }
    $initial = file_get_contents($file);
    return @file_put_contents($file, $initial + $amount);
}
addValue();
?>
Test your OOP knowledge on something complex, like a shopping cart or some other concept.
EDIT // so, if you need a simple example that looks complex, here you go :)
<?php
class log {
    public $file = '';
    private $amount = 0;

    public function __construct( $file ) {
        $this->file = $file;
        $this->amount = 1;
    }

    public function makeAdd() {
        $initial = file_get_contents($this->file);
        return @file_put_contents($this->file, $initial + $this->amount);
    }

    function __call($f, $args) {
        switch( $f ) {
            case 'add':
                if(isset($args[0]) && !empty($args[0])) {
                    $this->amount = (int)$args[0];
                }
                if( $this->amount == 0 ) {
                    throw new Exception('Not a valid amount.');
                }
                return $this->makeAdd();
                break;
        }
    }
}

try {
    // create log
    $L = new log('count.txt');
    // this will add 2
    var_dump($L->add(2));
    // this will also add 2
    var_dump($L->add());
    // until you rewrite the amount
    var_dump($L->add(1));
    // final result -> 5
} catch(Exception $e) {
    die($e->getMessage());
}
?>
Good luck!
Use UpperCamelCase for class names: LogFile, not LOGFILE. When you have a variable and the most interesting thing about it is that it's expected to hold a reference to something that is_a LogFile, you should name it logFile.
Use lowerCamelCase for functions: read and write, not READ and WRITE.
No spaces around the arrow operator.
Code after a return statement in a method can never be reached, so delete it.
read() does not use the handle returned by fopen, so don't call fopen.
The temp variable $fread doesn't help us understand the code, so we can lose it.
read is a slightly unconventional name. If we rename the function to getCount it will be more obvious what it does.
You said you wanted to make a hit counter. So rename the class from LogFile to HitCounter, and the variable to hitCounter.
The $FileData parameter to write doesn't get used because the variable is re-assigned inside the function. We can lose it.
The write method is supposed to add one to the number in the file. "Write" doesn't really express that. Rename it to increment.
Use a blank line between functions. The procedural code at the end should generally be in a separate file, but here we can just add a couple of extra lines. Delete the blanks between the last three lines of code.
Don't repeat yourself - we shouldn't have to mention 'counter.txt' more than once. OOP is all about combining data structures and behaviour into classes, so make a private class variable to hold the filename, and pass it in via a constructor.
$fread doesn't exist in the scope of increment, so we can't use it. This won't work. Replace it with a call to getCount().
Swap the first two lines of increment, so we're not doing two concurrent accesses to the same file, although we might be running inside a server that's running our script twice and still doing two concurrent accesses.
Rename the variable $FileData to $count, since that's what it is.
Replace the fopen, fwrite, fclose sequence with file_put_contents, since that does the same thing and is more succinct.
We don't need a closing ?> tag, since our PHP code continues to the end of the file.
That leaves us with:
<?php
class HitCounter {
    private $fileName;

    public function __construct($fileName){
        $this->fileName = $fileName;
    }

    public function getCount() {
        return file_get_contents($this->fileName);
    }

    public function increment() {
        $count = $this->getCount() + 1;
        file_put_contents($this->fileName, $count);
    }
}

$hitCounter = new HitCounter("counter.txt");
$hitCounter->increment();
echo $hitCounter->getCount();
You can create a static counter and increment it each time (instead of creating a file):
<?php
class CountClass {
    public static $counter = 0;

    function __construct() {
        self::$counter++;
    }
}

new CountClass();
new CountClass();

echo CountClass::$counter;
?>
I'm trying to create a PHP script that won't run if it's already running. Here's the code I'm using:
<?php
class Test {
    private $tmpfile;

    public function action_run() {
        $this->die_if_running();
        $this->run();
    }

    private function die_if_running() {
        $this->tmpfile = @fopen('.refresher2.pid', "w");
        $locked = @flock($this->tmpfile, LOCK_EX|LOCK_NB);
        if (! $locked) {
            @fclose($this->tmpfile);
            die("Running 2");
        }
    }

    private function run() {
        echo "NOT RUNNING";
        sleep(100);
    }
}

$test = new Test();
$test->action_run();
The problem is that when I run this from the console, it works great. But when I run it from the browser, many instances can run simultaneously. This is on Windows 7, XAMPP, PHP 5.3.2. I guess the OS thinks it's the same process and thus the locking fails. Is there a cross-platform way to write a PHP script of this kind?
Not really anything too promising. You can't use flock for that like this.
You could use system() to start another (PHP) process that does the locking for you. But there are drawbacks:
You need to do inter-process communication. Think about a way to tell the other program when to release the lock, etc. You can use stdin for messaging and use 3 constants or something. In this case it's still rather simple.
It's bad for performance because you keep creating processes, which is expensive.
Another way would be to start another program that runs all the time. You connect to it using some means of IPC (probably just a TCP channel, because it's cross-platform) and let this program manage the file access. That program could be a PHP script in an endless loop as well, but it will probably be simpler to code this in Java or another language that has multithreading support.
Another way would be to leverage existing resources: create a dummy database table for locks, create an entry for the file and then do table-row locking.
Another way would be not to use files at all, but a database.
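A rough sketch of the database-lock idea, using MySQL's named GET_LOCK()/RELEASE_LOCK() through PDO rather than a dedicated lock table (DSN and credentials are placeholders):
<?php
// Sketch: cross-platform mutual exclusion via a MySQL named lock.
// GET_LOCK returns 1 when acquired, 0 on timeout; the lock is released
// automatically if the connection dies.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=testdb', 'dbuser', 'dbpass');

$got = $pdo->query("SELECT GET_LOCK('refresher2', 0)")->fetchColumn();
if (!$got) {
    die("Running 2");   // another instance already holds the lock
}

// ... do the actual work here ...

$pdo->query("SELECT RELEASE_LOCK('refresher2')");
?>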
I had a similar problem a while ago.
I needed a counter where the number returned was unique.
I used a lock file, and only if this instance was able to create the lock file was it allowed to read the file with the current number.
Instead of counting up, perhaps you can use the same trick to decide whether the script is allowed to run.
The trick is to try a few times (like 5) with a small wait/sleep in between.
function GetNextNumber()
{
    $lockFile = "lockFile.txt";
    $myFile = "counter.txt";   // the file holding the current number (name assumed; not defined in the original snippet)
    $O_nextNumber = null;

    $lfh = @fopen($lockFile, "x");
    if (!$lfh)
    {
        $lockOkay = false;
        $count = 0;
        $countMax = 5;

        // Try once every second for 5 seconds
        while (!$lockOkay && $count < $countMax)
        {
            $lfh = @fopen($lockFile, "x");
            if ($lfh)
            {
                $lockOkay = true;
            }
            else
            {
                $count++;
                sleep(1);
            }
        }
    }

    if ($lfh)
    {
        $fh = fopen($myFile, 'r+') or die("Too many users. ");
        flock($fh, LOCK_EX);
        $O_nextNumber = fread($fh, 15);
        $O_nextNumber = $O_nextNumber + 1;
        rewind($fh);
        fwrite($fh, $O_nextNumber);
        flock($fh, LOCK_UN);
        fclose($fh);
        fclose($lfh);
        unlink($lockFile);   // delete the lock file
    }

    return $O_nextNumber;
}
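A hypothetical call site for the function above; with the changes noted in the code, null means the lock could not be acquired within the retries:
$next = GetNextNumber();
if ($next !== null) {
    echo "Next number: " . $next;
} else {
    echo "Could not get a number, try again later.";
}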