How do you delete an entry from a .html file? - php

I have a file, online.html. The file is supposed to store the names of all the current users who are online on the site. The document might look something like this:
<p>python-b5</p>
<p>Other user</p>
They are added with code like this:
$fp = fopen("online.html", 'a'); // open for appending
fwrite($fp, "<p>" . $_SESSION['name'] . "</p>\n"); // the newline keeps one entry per line, as in the sample above
fclose($fp);
When the user logs out, I would like to delete the entry in online.html for their user. However, I can't figure out how to accomplish it. How would I delete the entry?

Using an HTML file as a database will cause serious problems. There are two big ones:
1. Making sure that only one process updates the file at any given time. If more than one process updates the file at the same time, the file will get corrupted, or it won't record all the updates.
2. Finding the right line to delete. You need a way to unambiguously identify the line to delete.
Assuming you have some sort of user id, you could put that into the HTML:
<!-- BEGIN -->
<p data-id="1">Promethus</p>
<p data-id="2">Orpheus</p>
<p data-id="99">Tantalus</p>
<p data-id="11895">Marcus</p>
<!-- END -->
Then delete the matching line:
// WARNING! This function can ONLY be called by one process at a time.
function deleteLine($file, $id)
{
    $tmpfile = $file . '.tmp';
    $fp = fopen($file, 'r');
    $fpout = fopen($tmpfile, 'w');
    while ( $line = fgets($fp, 1000) ) {
        // only copy lines that do not contain the id we are searching for
        if ( ! preg_match("/data-id=\"$id\"/", $line) ) {
            fwrite($fpout, $line); // fgets() keeps the trailing newline, so don't add another
        }
    }
    fclose($fp);
    fclose($fpout);
    rename($tmpfile, $file);
}
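A hypothetical call when the user with id 99 logs out would then be:
deleteLine('online.html', 99);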

Why do you want to write the online status of users to a static file?
Normally you provide some kind of last_activity timestamp in your users table, which can be updated on every request.
Then you can say every user whose last_activity timestamp is within the last 5-10 minutes is treated as online - otherwise they're offline.
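A minimal sketch of that approach, assuming a users table with a last_activity DATETIME column, a PDO handle in $pdo, and the user's id in $_SESSION['user_id'] (all hypothetical names):
// On every authenticated request, refresh the timestamp
$stmt = $pdo->prepare('UPDATE users SET last_activity = NOW() WHERE id = :id');
$stmt->execute([':id' => $_SESSION['user_id']]);
// To render the "who is online" list: anyone active within the last 10 minutes
$online = $pdo->query(
    'SELECT name FROM users WHERE last_activity > NOW() - INTERVAL 10 MINUTE'
)->fetchAll(PDO::FETCH_COLUMN);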
If you really want to keep the static-file structure, you can use the DOMDocument PHP extension to parse the DOM of the static file and find the entry you want to remove.
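A sketch of that DOMDocument route, reusing the data-id markup from the answer above ($id being the id of the user who logged out):
function removeOnlineEntry($file, $id)
{
    $doc = new DOMDocument();
    @$doc->loadHTMLFile($file); // tolerates fragments, wrapping them in <html><body>
    $xpath = new DOMXPath($doc);
    foreach ($xpath->query('//p[@data-id="' . (int) $id . '"]') as $node) {
        $node->parentNode->removeChild($node);
    }
    file_put_contents($file, $doc->saveHTML()); // note: saveHTML() re-adds the wrapper tags
}
This has the same concurrency caveat as deleteLine() above: only one process may rewrite the file at a time.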

Related

Big CSV exportation blocks user session with PHP

I'm trying to export a lot of data through a CSV export. The amount of data is really big, around 100,000 records and counting.
My client usually uses two tabs to browse and check several things at the same time, so a requirement is that while the export is being made, he can continue browsing the system.
The issue is that while the CSV is being generated on the server, the session is blocked: you cannot load another page until the generation is completed.
This is what I'm doing:
Open the file
Loop through the data (one query per cycle; each cycle queries 5000 records - I cannot change this, because of certain limitations)
Write the data into the file
Free memory
Close the file
Set headers to begin the download
During the entire process, it's not possible to navigate the site in another tab.
The block of code:
$temp = 1;
$first = true;
$fileName = 'csv_data_' . date("Y-m-d") . '-' . time() . '.csv';
$filePath = CSV_EXPORT_PATH . $fileName;
// create CSV file
$fp = fopen($filePath, 'a');
// get data
for ($i = 1; $i <= $temp; $i++) {
    // get lines
    $data = $oPB->getData(ROWS_PER_CYCLE, $i); // ROWS_PER_CYCLE = 5000
    // if nothing came back, exit
    if (empty($data)) {
        break;
    }
    // write the data that will be exported into the file
    fwrite($fp, $export->arrayToCsv($data, '', '', $first));
    // recompute the number of cycles
    $temp = ceil($data[0]->foundRows / ROWS_PER_CYCLE); // foundRows is always the same value, it doesn't change per query
    $first = false; // hide header for next rows
    // free memory
    unset($data);
}
// close file
fclose($fp);
/**
 * Begin Download
 */
$export->csvDownload($filePath); // set headers
Some considerations:
The count is done in the same query, and it does not enter an infinite loop; it works as expected. The count is contained in $data[0]->foundRows and avoids an unnecessary extra query to count all the available records.
There are several memory limitations due to environment settings that I cannot change.
Does anyone know how I can improve this, or have any other solution?
Thanks for reading.
I'm replying only because it may be helpful to someone else. A colleague came up with a solution for this problem.
Call the function session_write_close() before
$temp = 1;
Doing this, you end the current session and store the session data, so I am able to download the file and continue navigating in other tabs.
I hope it helps someone.
Some considerations about this solution:
You must not need session data after calling session_write_close()
The export script is in another file. For example: home.php calls export.php through a link.
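A minimal sketch of where the call goes inside export.php ($_SESSION['user_id'] is a hypothetical key):
session_start();
$userId = $_SESSION['user_id']; // copy anything you still need out of the session first
session_write_close();          // releases the session lock; other tabs can now load pages
$temp = 1;
// ... the export loop from the question runs here ...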

php remove duplicate if same mobile phone number is in .csv file

I have a separate PHP script file that saves a number of CSV values to a file via an HTML contact form.
I would like a maximum of 2 duplicate rows based on mobile phone entries in the CSV file;
any more and I would want the current record deleted.
I am using $_GET (not $_POST) to record all entries, and then save them to file.
I'm just having issues with deleting duplicates if the mobile number is already TWICE in the file.
Any help would be greatly appreciated.
**ADDED MORE CODE AND COMMENT BELOW**
I have edited the code, but I am still running into trouble with removing duplicates, let alone checking for 2 dupes first. I will do the sanitizing and better code 'after' I have something working (help!).
Thanks again for your help :)
<?php
$filename = "input.csv";
$csv_output = "\n";
$title = $_GET['title'];
$fname = $_GET['fname'];
$sname = $_GET['sname'];
$notes = $_GET['notes'];
$mobile = $_GET['mobile'];
$string = "$title,$fname,$sname,$mobile,$notes,$csv_output";
$file = fopen($filename, "c+");
// see details on the 'c+' mode here http://us3.php.net/manual/en/function.fopen.php - it creates the file if it does not exist, and the '+' allows reading as well as writing (needed for the fgets() scan below).
// Now acquire an exclusive lock via flock() - http://us3.php.net/manual/en/function.flock.php
flock($file, LOCK_EX); // this will block till some other reader/writer has released the lock.
$stat = fstat($file);
if ($stat['size'] == 0) {
    // file created for the first time
    fwrite($file, "Title,First Name,Last Name,MobileNumber,Notes\n$string");
    flock($file, LOCK_UN);
    fclose($file);
    return;
}
// File not empty - scan through line by line via fgets(), and detect duplicates
// If a duplicate is detected, just flock($file, LOCK_UN), close the file and return - no need to fwrite the line.
while (($buffer = fgets($file, 2188)) !== false) {
    if (stripos($buffer, ",$mobile,") === false) {
        $mobile .= $buffer;
    } else {
        flock($file, LOCK_UN);
        fclose($file);
        return;
    }
}
?>
Are you running this on a Linux/Unix system? If so, the way you have accessed the file will lead to race conditions and possible corruption of the file.
You need to ensure that writes to the file happen in a serialized manner when multiple processes attempt to write to the same file.
Since you don't want to explore other alternatives like a DB (even key-value file-based DBs), a pseudo-code approach is:
$file = fopen($filename, "c+"); // see details on the 'c+' mode here http://us3.php.net/manual/en/function.fopen.php - it creates the file if it does not exist, and the '+' allows reading as well as writing (needed for the fgets() scan below).
// Now acquire an exclusive lock via flock() - http://us3.php.net/manual/en/function.flock.php
flock($file, LOCK_EX); // this will block till some other reader/writer has released the lock.
$stat = fstat($file);
if ($stat['size'] == 0) {
    // file created for the first time
    fwrite($file, "Title,First Name,Last Name,MobileNumber,Notes\n$string");
    flock($file, LOCK_UN);
    fclose($file);
    return;
}
// File not empty - scan through line by line via fgets(), and detect duplicates
// If a duplicate is detected, just flock($file, LOCK_UN), close the file and return - no need to fwrite the line.
// Otherwise fwrite() the line
// ...
flock($file, LOCK_UN);
fclose($file);
You can fill in the details in the middle part - hope you got the gist of it.
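A sketch of that middle part, under two assumptions: the file was opened with 'c+' as above (plain 'c' is write-only, so the fgets() scan would fail), and the mobile number is the fourth CSV column, matching the header row written above:
$count = 0;
while (($buffer = fgets($file)) !== false) {
    $fields = str_getcsv($buffer);
    if (isset($fields[3]) && $fields[3] === $mobile) {
        $count++;
    }
}
if ($count < 2) {
    fwrite($file, $string); // fewer than two existing rows with this number: append the new one
}
flock($file, LOCK_UN);
fclose($file);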
You could potentially make it more 'scalable' by initially grabbing a read lock (this allows multiple readers to run concurrently; only the writer blocks). Once the read portion is done, you release the lock, and if a write needs to be done (i.e. no duplicates detected), you grab a writer lock, etc.
Clearly this is not an ideal solution, but if your file contents are going to be small, it may suffice.
Stating the obvious: you would need to do better error handling around all file-based operations.
A tangential point: you should also sanitize the data from $_GET before it reaches the core logic, to catch invalid inputs.
Hope this helps.

PHP making blank file when I try to increment my variable?

I recently made a PHP program to count how many users are online using a program I created. It works like this:
Client sends a register request to the server (registerMember.php)
Server checks if the user is already online
If the user isn't already online, increment a number in a file
Write the person's username to a text file
My problem is that a lot of people register at the same time, and that creates a blank file.
$file_handle = fopen("registeredMembers", "r");
while (!feof($file_handle)) {
    $line = fgets($file_handle);
    $line++;
    $fp = fopen('registeredMembers', 'w');
    fwrite($fp, $line);
    fclose($fp);
}
This is happening because two requests arrive at the same time: before one request writes the incremented value, another request has already read the file while it was blank. Is there a better way I could be doing this? Any help would be greatly appreciated!
Try locking the file with flock() after you open it (see the example on the manual page).
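A minimal sketch of that approach, assuming the counter file holds a single number:
$fp = fopen('registeredMembers', 'c+'); // create if missing, don't truncate
if (flock($fp, LOCK_EX)) {              // blocks until we hold the exclusive lock
    $count = (int) fgets($fp);          // an empty file reads as 0
    ftruncate($fp, 0);                  // clear the old value
    rewind($fp);
    fwrite($fp, (string) ($count + 1)); // write the incremented value
    fflush($fp);                        // flush before releasing the lock
    flock($fp, LOCK_UN);
}
fclose($fp);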

How to download the updated version of a server file in IE (PHP)?

Internet Explorer is giving me a serious headache...
What I want to do is create a button upon clicking which a .csv file gets downloaded to the client. This .csv file includes information stored in one of the result tables I am producing on the page.
Before I create this button, I call an internal function to create the .csv file based on the currently displayed table. I create this .csv file on the server. I'll include this function here just in case, but I don't think it is of any help. Like I said, I create this file before I create the button.
/*
 * Creates a .csv file including all the data stored in $records
 * @table_headers - array storing the table headers corresponding to $records
 * @records - two-dimensional array storing the records that will be written into the .csv file
 * @filename - string storing the name of the .csv file created by this function
 */
public function export_table_csv( $table_headers, $records, $filename )
{
    // Open the $filename and store the handle to it in $csv_file
    $csv_file = fopen( "temp/" . $filename, "w" );
    // Write the $table_headers into $csv_file as a first row
    fputcsv( $csv_file, $table_headers );
    // Iterate through $records and write each record as one comma-separated line to $csv_file
    foreach ( $records as $row ) {
        fputcsv( $csv_file, $row );
    }
    // Close $csv_file
    fclose( $csv_file );
} // end export_table_csv()
I have that working fine. I have the 'Export' button, and in its onClick() event I use a one-liner:
window.open( 'temp/' . $export_filename );
Now, it works as intended in all browsers except IE. The file still gets downloaded, but when I then perform some filtering on the table I am displaying (the page gets reloaded whenever new filters are applied) and press the 'Export' button again, IE somehow downloads an old version of the .csv file with the old filters applied, not the current ones, even though this .csv file is rewritten every time new filters are applied and the page reloads.
It is as if the .csv file I am exporting is stored in IE's cache or something... It is really annoying, as the export works fine in all other browsers... Chrome and FF always download the latest version of the file from the server; IE updates the file seemingly at random, sometimes only after I submit the page with different filters a few times...
I didn't include too many lines of my code, as I rather think I am simply missing some kind of meta tag or something, rather than having a logical bug in the lines I have already written.
I am really confused by this, and annoyed to say the least... I am really starting to dislike IE now...
I appreciate any suggestions on this matter.
You could use a 'cache buster' to prevent IE from caching the resource.
If you add a GET parameter (with a value that changes every time you load the page) to the URL, IE (or rather: any browser) will think it's a different file to fetch. So, inside the PHP string that builds the onClick handler, do something like this:
window.open( 'temp/" . $export_filename . "?cachebuster=" . uniqid('', true) . "' );
If the value needs to change every time you click (not just on page load), append the random part in JavaScript instead:
window.open( 'temp/" . $export_filename . "?cachebuster=' + Math.random() );
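Put together, the PHP that emits the button might look like this (a sketch; $export_filename is the question's variable, the button markup is hypothetical):
echo '<button onclick="window.open(\'temp/' . $export_filename
    . '?cachebuster=\' + Math.random());">Export</button>';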

How to show random content every 15 minutes - php

OK, so I have a .txt file with a bunch of URLs. I have a script that gets one of the lines randomly, and I include this in another page.
However, I want the URL to change every 15 minutes, so I'm guessing I'm going to need a cron job, but I'm not sure how to put it all into place.
I found that if you include the file, it still gives random output, so I'm guessing if I run the cron and the include file it's going to get messy.
So what I'm thinking is: I have a script that randomly selects a URL from my initial text file, then saves it to another .txt file, and I include that file on the final page.
I just found this, which is sort of in the right direction:
Include php code within echo from a random text
I'm not the best at writing PHP (though I can understand it perfectly), so all help is appreciated!
So what I'm thinking is I have a script that randomly selects a url from my initial text file then it saves it to another .txt file and I include that file on the final page.
That's pretty much what I would do.
To re-generate that file, though, you don't necessarily need a cron.
You could use the following idea:
If the file has been modified less than 15 minutes ago (which you can find out using filemtime() and comparing it with time()),
then use what's in the file;
else
re-generate the file, randomly choosing one URL from the big file,
and use the newly generated file.
This way there's no need for a cron: the first user who arrives more than 15 minutes after the previous modification of the file will re-generate it, with a new URL.
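A minimal sketch of that idea, assuming urls.txt holds one URL per line and output.txt caches the current pick:
$cache = 'output.txt';
if (!file_exists($cache) || time() - filemtime($cache) > 15 * 60) {
    // older than 15 minutes (or missing): pick a new random URL
    $urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    file_put_contents($cache, $urls[array_rand($urls)]);
}
echo file_get_contents($cache); // the URL shown to everyone for this 15-minute window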
Alright, so I sort of solved my own question:
<?php
// load the file that contains the URLs
$adfile = "urls.txt";
$ads = array();
// one URL per line
$fh = fopen($adfile, "r");
while (!feof($fh)) {
    $line = fgets($fh, 10240);
    $line = trim($line);
    if ($line != "") {
        $ads[] = $line;
    }
}
fclose($fh);
// randomly pick a URL
$num = count($ads);
$idx = rand(0, $num - 1);
$f = fopen("output.txt", "w");
fwrite($f, $ads[$idx]);
fclose($f);
?>
However, is there any way I can delete the chosen line once it has been picked?
